Recently Added Posts


Inverse Kinematics

March 31, 2024

Hi All,

It has been a while since we last talked about Inverse Kinematics! I have been working on a new quadruped and find myself making these calculations again. This time, I had ChatGPT to both help and hinder my progress - but that is a different topic!! :dash2:

I have a version of the code that is somewhat working. For every bug I found, I created another. Below I give an explanation (from my blog), pictures, and code. Please take a look and give me some feedback. Much appreciated!

- Did I miss some simplification?

- Is there an easier way to do things?

- Anything to be careful with?

 

For Mojo5, I've chosen to use two MG995 servos, arranged in a stack. One servo is responsible for the 'hip' motion, and the other controls the 'knee.' The crux of IK in this setup is to map a target position in the coordinate plane to specific angles for these servos. To streamline the calculations, I've made a series of deliberate design decisions. The leg segments, L1 and L2, are equal in length, each measuring 70mm. The hip servo sits at the origin of the coordinate system. The knee mechanism is somewhat intricate, primarily because the servo controlling it is not mounted directly on the leg. I've also introduced a 'yaw' movement about the z-axis, although for the time being the IK calculations focus solely on movements in the x and y axes.
 
When it comes to calculating the necessary angles through IK, the approach is to visualize a triangle formed by the leg segments. Given that L1 and L2 are of equal length, this triangle is always isosceles; the detail may seem minor at first, but it is what makes the later trigonometry simplify so nicely. The distance (D) from the hip origin to the target point (x, y) follows from the Pythagorean theorem, D = sqrt(x^2 + y^2), and is the key to assessing whether the target is reachable. To ensure reachability, D must not exceed the sum of the lengths of the two leg segments, or in other words, D <= 2*L.
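As a quick numerical check (example values of my own, not from the original post): with L = 70mm, the maximum reach is 2*L = 140mm. A target at (x, y) = (50, -100) gives D = sqrt(50^2 + 100^2) = sqrt(12500) ≈ 111.8mm, which is within reach, while a target at (50, -140) would give D ≈ 148.7mm and would have to be clamped or rejected.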
 
Calculating the Hip Angle (Theta1)
 
To determine the hip angle, one must sum two distinct angles. The first is the angle between the horizontal axis and line D running from the origin to the target point (x, y). It can be calculated with the arctangent function, normally atan2(y, x). In my application, however, I use atan2(-y, x): I negate y because my y values will consistently fall below zero. An alternative approach would be to take the absolute value of the arctangent result to ensure a positive angle.
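Using the same example target (50, -100) from above, this first angle is theta = atan2(-(-100), 50) = atan2(100, 50) ≈ 63.4° above the horizontal.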
 
Following this, it's necessary to find the interior angle between line D and leg segment L1 within our conceptualized triangle. This angle, designated as alpha, can be determined through the law of cosines. In a simplified form, the calculation of alpha is expressed as acos(D / (2*L)). By adding alpha to the previously calculated angle, we derive the hip angle. However, there's a twist due to the servo's counterclockwise incrementation: the actual Theta1 is the supplement of the sum of alpha and our initial angle, mathematically expressed as Theta1 = 180 - (alpha + theta).
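To spell the law-of-cosines step out (my own working, using the notation above): alpha sits between sides L1 and D, so the side opposite it is L2, and the law of cosines gives L2^2 = L1^2 + D^2 - 2*L1*D*cos(alpha). With L1 = L2 = L, the two L^2 terms cancel, leaving D^2 = 2*L*D*cos(alpha), hence cos(alpha) = D / (2*L) and alpha = acos(D / (2*L)). Continuing the numerical example, alpha = acos(111.8 / 140) ≈ 37°, so Theta1 = 180 - (63.4 + 37) ≈ 79.6°.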
 
Mojo5_leg-v3-IK-alpha.png
 
Calculating the Knee Angle (Theta2)
 
To calculate the knee angle, our first step involves identifying the interior angle between the two legs, L1 and L2, which we'll refer to as beta. Once again, the law of cosines proves invaluable for this calculation. While the deeper mathematical proofs are better left to academia, the simplified formula to compute beta is given by acos((2*L^2 - D^2) / (2*L^2)). This equation allows us to calculate beta, which represents the angle between the leg segments in our model.
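The same manipulation yields the beta formula (again, just my own spelling-out of the step): beta sits between L1 and L2, so the side opposite it is D, and the law of cosines gives D^2 = L1^2 + L2^2 - 2*L1*L2*cos(beta). With L1 = L2 = L this becomes D^2 = 2*L^2 - 2*L^2*cos(beta), hence cos(beta) = (2*L^2 - D^2) / (2*L^2). With the numbers from the running example, beta = acos((9800 - 12500) / 9800) = acos(-0.276) ≈ 106°.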
 
However, to translate this angle into a form usable by the servo mechanism, additional adjustments are necessary due to the servo being linked to the leg segments via cams. We must take the supplementary angle to beta. This supplementary angle, once processed through the cam system, achieves the effect of pulling the leg segments into the correct position but in a reversed direction. Consequently, we must employ the complement of this supplementary angle to align with the actual geometry and movement direction required by the servo mechanism. This raises an interesting question: could the calculation have been simplified to just beta minus 90 degrees?
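Carrying the running example through the formula used in the code below, beta ≈ 106° gives Theta2 = 180 - 106/2 + ADJUST_B = 127° plus whatever trim ADJUST_B holds (these numbers are mine, just to make the steps concrete).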
 
Mojo5_leg-v3-IK-beta.png
 
Code:
 
// Function to calculate and return the servo angles given a target (x, y)
void calcIK(float x, float y, float &theta1, float &theta2) {
  float D = sqrt(x*x + y*y); // Distance from hip to target point (x horizontal, y vertical and always negative)
  // Ensure the target is within the robot's physical reach
  if (D > 2 * L) {
    // If the target is beyond the maximum reach, clamp D to the edge of the reachable range
    D = 2 * L;
  }

  float theta = atan2(-y, x); // Angle to target from horizontal; y is negated to get a positive angle
  float alpha = acos(D / (2*L)); // Interior angle between line D and leg segment L1

  // For the inverted (counterclockwise) hip servo, use the supplementary angle
  theta1 = 180 - ((theta + alpha) * (180.0 / M_PI)); // Convert to degrees and adjust theta1 for the inverted servo

  // Beta & Theta2 calculation, knee independent of hip
  float beta = acos((2*L*L - D*D) / (2*L*L)); // Interior angle between leg segments L1 and L2
  // Theta2: subtract from 180 to invert the angle (supplement); bisect beta for independence; ADJUST_B trims for servo rotation
  theta2 = (180 - (beta * (180.0 / M_PI) / 2)) + ADJUST_B;

  Serial.printf("D: %.2f, Theta: %.2f, Alpha: %.2f, Theta1: %.2f, Beta: %.2f, Theta2: %.2f\n",
          D, theta * (180.0 / M_PI), alpha * (180.0 / M_PI), theta1, beta * (180.0 / M_PI), theta2);
}
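
For completeness, here is a minimal sketch of how calcIK might be wired up in a test sketch. This is my own illustration, not part of the original code: the baud rate and the target point are arbitrary, L is the 70mm segment length given above, and the ADJUST_B value is a placeholder since the post does not state it.

#include <math.h>

const float L = 70.0;        // Leg segment length in mm (L1 = L2 = L, from the post)
const float ADJUST_B = 0.0;  // Knee servo trim; placeholder value, not given in the post

void setup() {
  Serial.begin(115200);      // Arbitrary baud rate for the debug output
  float theta1, theta2;
  calcIK(50.0, -100.0, theta1, theta2);  // Example target: 50mm forward, 100mm below the hip
  // theta1 and theta2 (in degrees) would then be written to the hip and knee servos
}

void loop() {}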

 



[Mars Attacks!] Participating in the French Robot Cup...

December 31, 2023

(French version here)

 

Hello everyone,

 

This year, I am participating again in the Coupe de France de Robotique, but this time I'm starting with a new team composed of pat92fr, hdumcke (both of whom I met during the TRR), and myself.

Pat chose the name Mars Attacks! in reference to the movie of the same name, because the theme of the 2024 edition of the Coupe de France de Robotique, revealed in September 2023, is "Farming Mars."

 

It's the first participation for pat92fr and hdumcke (yes, I roped them into this, but I promise they are willing participants and not being held hostage :) ). Since it's their first time, we are starting a bit from scratch, but since we are not beginners in robotics, we're progressing quite well. As they are both in the Paris area and I'm in the south, near Biarritz, most of our sessions are conducted remotely, which can be a challenge, but we're managing quite well.

 

To provide a brief update on where we stand at the moment:

 

Mechanically:

 

We are working on a mecanum base with a lidar positioned quite low so that it hits the table edges, which we use for localization (based on my experience with vigibot). Counter-axes have been added at the wheel level, inside the protective fairing, to take some load off the motor gearboxes. The front motor assembly is mounted on a pivot linkage to keep all four wheels in perfect contact with the ground.

 

Some photos:

 

Assembled base from the top:

base mecanum 1.jpg

Assembled base from the front:

base mecanum 2.jpg

Close-up of the front axle pivot without the wheels:

base mecanum 3.jpg

Top view of the motor support integrating bearings and counter-axis in the wheel fairing:

base mecanum 4.jpg

Front view of the motor support integrating bearings and counter-axis in the wheel fairing:

base mecanum 5.jpg

 

For this mecanum base, we are using:

 

Several 3D-printed parts (PETG) (the files will be shared), and plenty of screws and fasteners

 

Software-wise:
 
We have implemented a visualization tool with a view "from above" that allows, among other things, controlling the robot and logging data. This visualization tool, developed in Java with Processing, can connect directly to the robot during tests, or to other simulation tools that have also been developed.
 
Here's what the tool looks like at the moment:
outil.png
 
This tool will be shared, with various examples, on the visualization tool's Git: (basic examples have already been shared, the idea being to provide building blocks so that everyone can build the tool that suits them starting from the examples)


Robot Code:
 
Regarding the robot code, for now, we have mainly focused on robot control and localization on the table. The system runs on an STM32, and data is sent to the visualizer via WiFi using an ESP32. Currently, we have a working system, although not perfect yet, connected to our visualizer. During tests, we get results like in this video: 
In this video, the robot is moving on a table, and what we see is the data sent by the robot. It displays where it thinks it is on the table, and in red, we see the lidar points. 
Around 1 minute and 50 seconds in, the robot touches an obstacle and loses its localization. We still need to work on detecting opponents; that is in progress, but the results are already quite interesting.
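
To make the data path concrete, here is a rough sketch (mine, not the team's code) of what the ESP32 side of such a setup could look like, assuming the STM32 streams newline-delimited text frames over a UART and the ESP32 simply forwards them to the visualizer over UDP. The WiFi credentials, IP address, port, and frame format are all invented for the example.

#include <WiFi.h>
#include <WiFiUdp.h>

const char* WIFI_SSID = "robot-net";        // hypothetical network name
const char* WIFI_PASS = "your-password";    // hypothetical password
const IPAddress VISU_IP(192, 168, 1, 50);   // hypothetical visualizer address
const uint16_t VISU_PORT = 4210;            // hypothetical UDP port

WiFiUDP udp;

void setup() {
  Serial2.begin(115200);                    // UART link from the STM32 (default RX2/TX2 pins)
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) {
    delay(100);                             // Wait for the WiFi connection
  }
}

void loop() {
  if (Serial2.available()) {
    // One telemetry frame per line, e.g. a pose estimate or a batch of lidar points
    String line = Serial2.readStringUntil('\n');
    udp.beginPacket(VISU_IP, VISU_PORT);
    udp.write((const uint8_t*)line.c_str(), line.length());
    udp.endPacket();
  }
}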


We plan to publish a lot more, but we need to make progress first, so stay tuned for more soon! :)



Recruiting neuroatypical profiles (autism, DYS, ADHD...

April 26, 2024

Hello everyone,

 

Our social enterprise, which specializes in the inclusion of neuroatypical profiles in the workplace, is pleased to inform you that we are launching our new inclusion program for neuroatypical people (autism, dys, ADHD) alongside our partner ACC, in order to fill 2 permanent (CDI) positions in Bordeaux (33) in the fields of:

 

An engineering degree or a master's is not required to apply: a two- or three-year post-secondary qualification (bac+2/bac+3) with a practical sense of physics is enough. If you know neuroatypical adults who are looking for work and are interested in modeling physical phenomena and optimizing models, or in the rigorous, critical validation and analysis of physical tests, please let them know.

 

Candidates will not be assessed on their ability to sell themselves: our approach uses no CV screening and no job interviews, and takes into account the needs and particularities related to their neuroatypicality. To do this, we offer candidates a supervised immersion at our partner client's site, during which we apply our Danish inclusive recruitment method (among other things, we use robots during this recruitment, which is also why we wanted to inform the Robot Maker community). When they take up their position, we systematically support the recruited candidates for several months and raise awareness among ACC's teams. In addition, candidates may receive training (see the job offers for more details on this).

 

Anyone interested can apply right away by checking our neurodiversity-friendly job offers and applying online on our website. Even if they have not completed their degree or their career path has taken a "zig-zag" course, we will consider their application. If they have any doubts or would like to know more about the program, they can ask us all their questions during the information webinars on April 4, 18, and 30.

 

The application deadline is May 3, 2024.

 

Please note that all applications must be sent to Specialisterne France. ACC will not receive them directly.

 

We are of course available to provide you with more details about the program.

 

We look forward to meeting and talking with you,

The Specialisterne team
