Tech
Latam-GPT: The Free, Open Source, and Collaborative AI of Latin America
Latam-GPT is a new large language model being developed in and for Latin America. The project, led by the nonprofit Chilean National Center for Artificial Intelligence (CENIA), aims to help the region achieve technological independence by developing an open source AI model trained on Latin American languages and contexts.
“This work cannot be undertaken by just one group or one country in Latin America: It is a challenge that requires everyone’s participation,” says Álvaro Soto, director of CENIA, in an interview with WIRED en Español. “Latam-GPT is a project that seeks to create an open, free, and, above all, collaborative AI model. We’ve been working for two years with a very bottom-up process, bringing together citizens from different countries who want to collaborate. Recently, it has also seen some more top-down initiatives, with governments taking an interest and beginning to participate in the project.”
The project stands out for its collaborative spirit. “We’re not looking to compete with OpenAI, DeepSeek, or Google. We want a model specific to Latin America and the Caribbean, aware of the cultural requirements and challenges that this entails, such as understanding different dialects, the region’s history, and unique cultural aspects,” explains Soto.
Thanks to 33 strategic partnerships with institutions in Latin America and the Caribbean, the project has gathered a corpus of data exceeding eight terabytes of text, the equivalent of millions of books. This information base has enabled the development of a language model with 50 billion parameters, a scale that makes it comparable to GPT-3.5 and gives it a medium-to-high capacity for complex tasks such as reasoning, translation, and making associations.
Latam-GPT is being trained on a regional database that compiles information from 20 Latin American countries and Spain, totaling 2,645,500 documents. The distribution shows a significant concentration in the region’s largest countries: Brazil leads with 685,000 documents, followed by Mexico with 385,000, Spain with 325,000, Colombia with 220,000, and Argentina with 210,000. The numbers reflect the size of these markets, their digital development, and the availability of structured content.
“Initially, we’ll launch a language model. We expect its performance in general tasks to be close to that of large commercial models, but with superior performance in topics specific to Latin America. The idea is that, if we ask it about topics relevant to our region, its knowledge will be much deeper,” Soto explains.
The first model is the starting point for developing a family of more advanced technologies in the future, including ones with image and video, and for scaling up to larger models. “As this is an open project, we want other institutions to be able to use it. A group in Colombia could adapt it for the school education system or one in Brazil could adapt it for the health sector. The idea is to open the door for different organizations to generate specific models for particular areas like agriculture, culture, and others,” explains the CENIA director.
Tech
BMW Is Betting Big on the New iX3. The Good News Is It’s Superb
BMW’s first car on its new EV platform has finally arrived. But will a big range, thumping charging tech, and a new driving brain that aims to deliver the ultimate ride be enough to beat China?
Tech
MIT engineers design an aerial microrobot that can fly as fast as a bumblebee
In the future, tiny flying robots could be deployed to aid in the search for survivors trapped beneath the rubble after a devastating earthquake. Like real insects, these robots could flit through tight spaces larger robots can’t reach, while simultaneously dodging stationary obstacles and pieces of falling rubble.
So far, aerial microrobots have only been able to fly slowly along smooth trajectories, far from the swift, agile flight of real insects — until now.
MIT researchers have demonstrated aerial microrobots that can fly with speed and agility comparable to their biological counterparts. A collaborative team designed a new AI-based controller for the robotic bug that enabled it to follow gymnastic flight paths, such as executing continuous body flips.
With a two-part control scheme that combines high performance with computational efficiency, the robot’s speed and acceleration increased by about 450 percent and 250 percent, respectively, compared to the researchers’ best previous demonstrations.
The speedy robot was agile enough to complete 10 consecutive somersaults in 11 seconds, even when wind disturbances threatened to push it off course.
“We want to be able to use these robots in scenarios that more traditional quadcopter robots would have trouble flying into, but that insects could navigate. Now, with our bioinspired control framework, the flight performance of our robot is comparable to insects in terms of speed, acceleration, and the pitching angle. This is quite an exciting step toward that future goal,” says Kevin Chen, an associate professor in the Department of Electrical Engineering and Computer Science (EECS), head of the Soft and Micro Robotics Laboratory within the Research Laboratory of Electronics (RLE), and co-senior author of a paper on the robot.
Chen is joined on the paper by co-lead authors Yi-Hsuan Hsiao, an MIT graduate student in EECS; Andrea Tagliabue PhD ’24; and Owen Matteson, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro); as well as EECS graduate student Suhan Kim; Tong Zhao MEng ’23; and co-senior author Jonathan P. How, the Ford Professor of Engineering in the Department of Aeronautics and Astronautics and a principal investigator in the Laboratory for Information and Decision Systems (LIDS). The research appears today in Science Advances.
An AI controller
Chen’s group has been building robotic insects for more than five years.
They recently developed a more durable version of their tiny robot, a microcassette-sized device that weighs less than a paperclip. The new version uses larger flapping wings that enable more agile movements, powered by a set of squishy artificial muscles that flap the wings at an extremely fast rate.
But the controller — the “brain” of the robot that determines its position and tells it where to fly — was hand-tuned by a human, limiting the robot’s performance.
For the robot to fly quickly and aggressively like a real insect, it needed a more robust controller that could account for uncertainty and handle complex optimizations.
On its own, though, such a controller would be too computationally intensive to run in real time, especially given the complicated aerodynamics of the lightweight robot.
To overcome this challenge, Chen’s group joined forces with How’s team and, together, they crafted a two-step, AI-driven control scheme that provides the robustness necessary for complex, rapid maneuvers, and the computational efficiency needed for real-time deployment.
“The hardware advances pushed the controller so there was more we could do on the software side, but at the same time, as the controller developed, there was more they could do with the hardware. As Kevin’s team demonstrates new capabilities, we demonstrate that we can utilize them,” How says.
For the first step, the team built what is known as a model-predictive controller. This type of powerful controller uses a dynamic, mathematical model to predict the behavior of the robot and plan the optimal series of actions to safely follow a trajectory.
While computationally intensive, it can plan challenging maneuvers like aerial somersaults, rapid turns, and aggressive body tilting. This high-performance planner is also designed to consider constraints on the force and torque the robot could apply, which is essential for avoiding collisions.
For instance, to perform multiple flips in a row, the robot would need to decelerate in such a way that its initial conditions are exactly right for doing the flip again.
“If small errors creep in, and you try to repeat that flip 10 times with those small errors, the robot will just crash. We need to have robust flight control,” How says.
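To make the planning step concrete, here is a minimal sketch of model-predictive control for trajectory following, in the same spirit as the planner described above. It is not the paper’s controller: a 1D point mass stands in for the flapping-wing dynamics, and the time step, mass, force limit, and function names are all assumptions chosen for illustration.

```python
import numpy as np
from scipy.optimize import minimize

DT = 0.02        # control time step in seconds (assumed)
HORIZON = 15     # how many steps the planner looks ahead (assumed)
MASS = 7.5e-4    # hypothetical ~750 mg vehicle mass in kg
F_MAX = 2e-2     # force limit in newtons -- the kind of actuation constraint MPC respects

def rollout(state, forces):
    """Simulate the stand-in point-mass model forward under a force sequence."""
    pos, vel = state
    positions = []
    for f in forces:
        vel = vel + (f / MASS) * DT
        pos = pos + vel * DT
        positions.append(pos)
    return np.array(positions)

def mpc_step(state, reference):
    """Plan HORIZON forces that track `reference`, then apply only the first."""
    def cost(forces):
        predicted = rollout(state, forces)
        tracking = np.sum((predicted - reference) ** 2)   # stay on the trajectory
        effort = 1e-4 * np.sum(forces ** 2)               # don't waste actuation
        return tracking + effort

    bounds = [(-F_MAX, F_MAX)] * HORIZON                  # hard force limits
    sol = minimize(cost, np.zeros(HORIZON), bounds=bounds, method="L-BFGS-B")
    return sol.x[0]                                       # receding-horizon action

# Track a short sinusoidal reference from rest.
pos, vel = 0.0, 0.0
for k in range(50):
    reference = 0.05 * np.sin(0.1 * (k + np.arange(1, HORIZON + 1)))  # meters
    f = mpc_step((pos, vel), reference)
    vel += (f / MASS) * DT      # apply the first planned force for one step
    pos += vel * DT
```

Because the whole optimization is re-solved at every step, this style of planner is accurate and constraint-aware but expensive, which is exactly why the team distills it into a faster policy, as described next.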
They use this expert planner to train a “policy,” a deep-learning model that controls the robot in real time, through a process called imitation learning. The policy is the robot’s decision-making engine, telling the robot where and how to fly.
Essentially, the imitation-learning process compresses the powerful controller into a computationally efficient AI model that can run very fast.
The key was having a smart way to create just enough training data, which would teach the policy everything it needs to know for aggressive maneuvers.
“The robust training method is the secret sauce of this technique,” How explains.
The AI-driven policy takes robot positions as inputs and outputs control commands in real time, such as thrust force and torques.
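As a hedged illustration of that distillation step, the sketch below regresses a small neural-network policy onto state-action pairs produced by an expert planner, which is the essence of imitation learning. The network size, the input and output dimensions, the amount of data, and the `expert_action` stub are all assumptions for illustration, not the team’s actual architecture or training recipe.

```python
import torch
import torch.nn as nn

def expert_action(states):
    """Hypothetical stand-in for the expensive expert planner (e.g., an MPC)."""
    return -0.5 * states[..., :4]                  # pretend: 4 command outputs

# Policy: 12-D state (position, attitude, velocities) -> 4 commands (thrust + 3 torques)
policy = nn.Sequential(
    nn.Linear(12, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 4),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Imitation learning: make the cheap policy reproduce the expert's decisions.
states = torch.randn(10_000, 12)                   # would come from expert rollouts
actions = expert_action(states)
for epoch in range(200):
    loss = nn.functional.mse_loss(policy(states), actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At flight time the distilled policy is just a few small matrix multiplies,
# cheap enough to run at the control rate.
command = policy(torch.randn(12))                  # thrust force and torques
```

The hard part, as the researchers note, is generating the right training data so the cheap policy inherits the expert’s robustness to aggressive maneuvers rather than just its average behavior.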
Insect-like performance
In their experiments, this two-step approach enabled the insect-scale robot to fly 447 percent faster while exhibiting a 255 percent increase in acceleration. The robot completed 10 somersaults in 11 seconds, never straying more than 4 or 5 centimeters from its planned trajectory.
“This work demonstrates that soft and microrobots, traditionally limited in speed, can now leverage advanced control algorithms to achieve agility approaching that of natural insects and larger robots, opening up new opportunities for multimodal locomotion,” says Hsiao.
The researchers were also able to demonstrate saccade movement, which occurs when insects pitch very aggressively, fly rapidly to a certain position, and then pitch the other way to stop. This rapid acceleration and deceleration help insects localize themselves and see clearly.
“This bio-mimicking flight behavior could help us in the future when we start putting cameras and sensors on board the robot,” Chen says.
Adding sensors and cameras so the microrobots can fly outdoors, without being attached to a complex motion capture system, will be a major area of future work.
The researchers also want to study how onboard sensors could help the robots avoid colliding with one another or coordinate navigation.
“For the micro-robotics community, I hope this paper signals a paradigm shift by showing that we can develop a new control architecture that is high-performing and efficient at the same time,” says Chen.
“This work is especially impressive because these robots still perform precise flips and fast turns despite the large uncertainties that come from relatively large fabrication tolerances in small-scale manufacturing, wind gusts of more than 1 meter per second, and even its power tether wrapping around the robot as it performs repeated flips,” says Sarah Bergbreiter, a professor of mechanical engineering at Carnegie Mellon University, who was not involved with this work.
“Although the controller currently runs on an external computer rather than onboard the robot, the authors demonstrate that similar, but less precise, control policies may be feasible even with the more limited computation available on an insect-scale robot. This is exciting because it points toward future insect-scale robots with agility approaching that of their biological counterparts,” she adds.
This research is funded, in part, by the National Science Foundation (NSF), the Office of Naval Research, Air Force Office of Scientific Research, MathWorks, and the Zakhartchenko Fellowship.
Tech
Thursday’s Cold Moon Is the Last Supermoon of the Year. Here’s How and When to View It
A cold supermoon is on its way. On December 4, Earth’s satellite will delight us with one of the last astronomical spectacles of 2025. Not only will it be the last full moon of the year, but it’s also a cold moon—which refers to the frigid temperatures typical of this time of year—and, finally, a supermoon. Here’s how and when best to enjoy this spectacle of the year-end sky.
What Is a Supermoon?
The term supermoon refers to a full moon that occurs when our satellite is at perigee, the point at which its orbit brings it closest to our planet. (The moon’s orbit is elliptical, and its distance from Earth varies between about 407,000 km at apogee, the point of maximum distance, and about 363,000 km at perigee.)
In addition to being the third consecutive supermoon of the year, as reported by EarthSky, it will be about 357,000 km away from us, making it the second-closest full moon of the year. Consequently, it will also be the second-largest and second-brightest.
Although most of us won’t notice any difference in size compared to a normal full moon (it appears up to 8 percent larger to us), its brightness could exceed that of an ordinary full moon by 16 percent. This time, moreover, it will be 100 percent illuminated just 12 hours after its perigee.
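Those percentages follow directly from geometry: apparent diameter scales with the inverse of distance and brightness with the inverse square. A quick back-of-the-envelope check, assuming the moon’s mean distance of roughly 384,400 km and the 357,000 km figure quoted above:

```python
mean_distance_km = 384_400       # Moon's average distance from Earth
supermoon_distance_km = 357_000  # distance quoted for this full moon

size_gain = mean_distance_km / supermoon_distance_km   # ~1.08 -> ~8% larger
brightness_gain = size_gain ** 2                        # ~1.16 -> ~16% brighter

print(f"apparent diameter: ~{(size_gain - 1) * 100:.0f}% larger")
print(f"brightness: ~{(brightness_gain - 1) * 100:.0f}% brighter")
```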
The Cold Supermoon
In addition to its name, which refers to the cold temperatures of this period, December’s full moon will be the last of 12 full moons in 2025 and the highest of the year. With the winter solstice approaching on December 21, the sun is at its lowest point in the sky, so the full moon is at its highest. This means the cold supermoon will be particularly high in the sky. As EarthSky points out, however, it is not the closest full moon to the December 21 solstice. While it occurs 17 days before, the first full moon of 2026 will occur on January 3, just 12 days after the solstice. That will be the fourth and last consecutive supermoon.
How to Enjoy the Show
Although the moon may appear full both the night before and the night after, the exact moment of the full moon comes at 6:14 pm ET on Thursday, December 4. In general, moonrise is the best time to experience the so-called moon illusion, during which the moon appears larger than usual. NASA still doesn’t have a scientific explanation for why this happens, but as you might expect, the effect is greatest during a supermoon. Weather permitting, find an elevated place or a meadow with an unobstructed view of the eastern horizon and enjoy the last moon show of the year.
This story originally appeared on WIRED Italia and has been translated from Italian.