Scientists Have Identified the Origin of an Extraordinarily Powerful Outer Space Radio Wave



The Earth is constantly receiving space signals that contain vital information about extremely energetic phenomena. Among the most peculiar are brief pulses of extremely high-energy radio waves, known as fast radio bursts (FRBs). Astronomers compare them to a powerful lighthouse that shines for milliseconds in the middle of a rough, distant sea. Detecting one of these signals is an achievement in itself, but identifying its origin and understanding the nature of its source remains one of the great challenges of science.

That is why recent research led by Northwestern University in the United States has captured the attention of the astronomical community. The team not only detected one of the brightest FRBs ever recorded, but also traced its origin with unprecedented precision.

The pulse, identified as RBFLOAT, arrived in March 2025, lasted just a few milliseconds, and released as much energy as the sun produces in four days. Thanks to a new method of analysis, the researchers traced its origin to an arm of a spiral galaxy 130 million light-years away, in the direction of the constellation Ursa Major. The research was published in the journal The Astrophysical Journal Letters.
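
As a rough check on that comparison, four days of the sun's output can be estimated from its known luminosity (the luminosity constant is a standard value, not from the article):

```python
# Back-of-the-envelope: energy the sun radiates in four days, the
# figure used to describe RBFLOAT's output.
SOLAR_LUMINOSITY_W = 3.828e26   # watts (IAU nominal solar luminosity)
FOUR_DAYS_S = 4 * 24 * 3600     # seconds in four days

energy_j = SOLAR_LUMINOSITY_W * FOUR_DAYS_S
print(f"{energy_j:.2e} J")      # on the order of 1e32 joules
```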

The CHIME radio telescope in Canada, one of the world’s leading FRB observatories, and a subnetwork of smaller stations called Outriggers detected the anomalous outburst. CHIME characterized the signal, while the Outriggers triangulated it to a narrow region of space. Optical and X-ray telescopes then provided complementary data. The team achieved a precision of 13 parsecs, equivalent to 42 light-years, within the galaxy NGC 4141.
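
The two precision figures are consistent: a parsec is about 3.26 light-years, so 13 parsecs works out to roughly 42 light-years. A quick sketch of the conversion:

```python
LY_PER_PARSEC = 3.2616          # light-years per parsec (standard value)

precision_pc = 13
precision_ly = precision_pc * LY_PER_PARSEC
print(round(precision_ly))      # 42
```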

Astronomers had previously pinpointed other FRBs, but in those cases the signals were repeated, which made the analysis easier. “RBFLOAT was the first non-repeating source localized to such precision,” said Sunil Simha, coauthor of the study, in a university statement. “These are much harder to locate. Thus, even detecting RBFLOAT is proof of concept that CHIME is indeed capable of detecting such events and building a statistically interesting sample of FRBs.”

What Caused the RBFLOAT?

Scientists are still not sure what causes FRBs, but they have some ideas. Given the enormous energy released and the brevity of the phenomenon, they likely originate from extreme cosmic events, such as neutron star mergers, magnetars, or pulsars.

In the case of RBFLOAT, the data indicate that it is located in a star-forming region populated by massive stars; the triangulation places the signal in a galactic arm where new stars are being born. This suggests the source could be a magnetar, a subclass of neutron star with a magnetic field billions of times stronger than Earth's.

The experience with RBFLOAT will allow the team to apply the same triangulation technique to future signals. The authors estimate that they could achieve about 200 precisely localized FRB detections per year with just the signals CHIME captures.

“For years, we’ve known FRBs occur all over the sky, but pinning them down has been painstakingly slow. Now, we can routinely tie them to specific galaxies, even down to neighborhoods within those galaxies,” said Yuxin Dong, another member of the team.

This story originally appeared on WIRED en Español and has been translated from Spanish.




Gazing Into Sam Altman’s Orb Now Proves You’re Human on Tinder



Sam Altman’s iris-scanning, humanity-verifying World project announced at an event in San Francisco on Friday that Tinder users around the globe can now put a digital badge on their profiles signaling to potential suitors that they’re a real human. The catch: they must first have stared into one of World’s glossy white Orbs and allowed their eyes to be scanned. The announcement follows a pilot project for Tinder verification that World previously ran in Japan.

The global Tinder expansion is one of the biggest tests yet for World, and the company’s bet that everyday consumers will be willing to sign up for biometric verification services to use internet applications. Founded in 2019 by Altman and Alex Blania, the World project was designed for a future where the internet is overrun with highly capable AI agents that make it incredibly difficult, if not impossible, to tell who is really human. As companies like OpenAI—where Altman is CEO—and Anthropic push AI agents into the mainstream, the problem World was built to solve feels increasingly urgent.

But World has struggled to achieve mainstream adoption, and it has encountered resistance from governments around the globe that have probed the company over suspected violations of data protection laws. The company says 18 million people have now been verified with an Orb, up from 12 million last year.

In addition to the Tinder global expansion, Tools for Humanity, the company behind World, announced a number of other consumer and enterprise partnerships on Friday at its Lift Off event in San Francisco. The startup says Tinder users who verify with their World ID will receive five free “boosts,” typically a paid feature that increases the number of users who see a profile by up to 10 times for 30 minutes. The videoconferencing platform Zoom also says that users can now require other participants to verify their identity with World before joining a call. Docusign, the contract signing software, will allow users to require World’s identity verification technology.

Tiago Sada, Tools for Humanity’s chief product officer, tells WIRED the company sees major platform partnerships as key to helping World become a mainstream identity-verification technology. Sada said he’s especially interested in working with social media companies in the future, and was encouraged to see that Reddit has started testing World as a solution to help users distinguish bots from real people.

World is also launching a tool called Concert Kit, which lets artists reserve concert tickets for verified humans, a pitch aimed squarely at the bot-driven scalping problem that critics say has plagued sites like Ticketmaster. World will test the feature on the upcoming Bruno Mars World Tour featuring Anderson .Paak, who is scheduled to play a verified-humans-only show under his alias DJ Pee .Wee in San Francisco on Friday night.

No new hardware announcements or updates were made at Friday’s event. World first launched the iris-scanning Orb back in 2023, alongside a mobile app that contains “mini apps” for different verification and blockchain-related programs. After a person scans their eyeball with one of World’s Orbs, the startup creates a unique cryptographic key for each person—their World ID. This creates a private, decentralized way to verify people online, without requiring them to upload their government ID all over the internet.

The project was initially called Worldcoin, and in the early days the startup offered people free cryptocurrency to scan their irises. World still offers a cryptocurrency token and a wallet for digital currencies, but dropped the “coin” from its name in 2024 and has since shifted its focus to identity verification for the AI era. Jess Montejano, a spokesperson for Tools for Humanity, says the company still offers crypto as an incentive when new users sign up, but has also expanded its offerings to include Netflix and Apple TV subscription trials.



Do You Actually Need a Smart Bird Feeder With a Movable Camera?



Assembly was quick and tool-free, requiring only a handful of included knob screws. I also like that it included both fence- and pole-mounting options, the latter of which is critical for preventing squirrel damage.

Screenshot: Coolfly app via Kat Merck

Smart feeder companies continue to upgrade their cameras’ quality with each new model, but the general range still seems to be anywhere from 1080p photos and 2K video on the low end (as with the Birdfy Lite), all the way up to 32-MP photos and 4K video (as with Camojojo’s new Hibird Pro). The Aura falls somewhere in the middle of this range, with 4-MP photos and a respectable 2.5K Ultra HD video.

The camera’s 150-degree field of view is wider than that of a typical bird feeder camera, and it helps capture all angles of what is really the Aura’s signature feature: a wraparound perch with little platforms on the left and right sides, where you can position the camera upright (which shows pictures in a horizontal “landscape mode”) at the angle you prefer. If you want the camera on its side (vertical “portrait mode”), a little adapter connects to the back and screws into the platform. Note, though, that despite some marketing photos showing the Aura with two cameras, it comes with only one, and when mounted on its side, the camera fits only on the right side of the perch.

Portrait mode (the camera mounted on its side) allows for greater detail in photos, but it wasn’t always successful at capturing all the action, depending on where a bird stood. The biggest issue with this camera orientation, however, is that the app’s AI identification doesn’t work with it. I asked Coolfly if this was an error, but it turns out it’s how the camera was designed.

“To offer users ‘Limited Free AI’ without monthly subscription fees, our bird ID algorithm is hardcoded directly into the device’s hardware,” Coolfly’s rep told me. “Because this on-device neural network was trained exclusively on horizontal datasets, physically flipping the camera … disrupts the local algorithm’s spatial mapping.”

The solution? “If our users shoot vertically and spot an unknown bird, they can simply take a screenshot and send it to our in-app ChirpChat feature. Our interactive AI assistant will identify it perfectly from the image,” Coolfly’s rep said.

Though this step was cumbersome, it did correctly identify nearly all of the birds I proffered (as did the built-in AI ID). I liked seeing the birds slightly closer up with the side camera orientation, but the difference between the views wasn’t dramatic. Certainly not dramatic enough to justify the hassle of losing the AI ID or of having to go out and fiddle with taking the camera on and off its little mount to switch modes. So for the majority of testing, I kept the camera in its default upright position.

Birds on Film

The Aura uses the Coolfly app, which isn’t as intuitive as some of the bigger brands’ apps, like Birdbuddy’s, but it was perfectly usable. There’s the ChirpChat, a bird search, and a Facebook-esque “social feed” where you can follow other Coolfly feeder users and see their posted videos and images. (Note that there were only about 10 users total at the time of my test.)

What I liked the most about the app was that it immediately IDs all the bird captures in the album with a little bird-head icon of that species. It helped me visually sort at a glance which visitors were new and noteworthy that day, and clicking the icon leads to an informational page on the bird, as well as a sound clip of the species’ typical call, so you can see if you’ve heard it around. What I liked the least, however, was the number of marketing push notifications the app would send, for sales and other irrelevant topics. It became so irritating, in fact, that I ended up turning off notifications altogether, which meant I was only aware of bird activity if I went into the app.




How Can Astronauts Tell How Fast They’re Going?



Let’s use our car again, but this time we’ll get real numbers from the accelerometer in our smartphone. Say we start at a red light and then accelerate at 2 m/s² (meters per second squared) for five seconds. From the equation above, Δv1 would be 2 × 5 = 10 m/s, so that’s our velocity. Now, after cruising for a while, we accelerate again at 1 m/s² for five more seconds. Δv2 is then 1 × 5 = 5 m/s. Adding these two changes, our velocity is now 15 m/s. And so on.
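
This running sum, accumulating Δv = a × Δt for each segment, is the heart of dead reckoning and is easy to sketch (the acceleration segments are the article's example numbers):

```python
# Dead reckoning from accelerometer data: velocity is the running sum
# of (acceleration x duration) over each segment of the trip.
# Segments from the worked example: 2 m/s^2 for 5 s, then 1 m/s^2 for 5 s.
segments = [(2.0, 5.0), (1.0, 5.0)]   # (acceleration m/s^2, duration s)

velocity = 0.0
for accel, duration in segments:
    velocity += accel * duration      # delta-v contributed by this segment

print(velocity)   # 15.0 m/s, matching the worked example
```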

The only problem is that inertial measurement isn’t as accurate as the Doppler method over long periods, because small errors will keep accumulating. That means you need to recalibrate your system periodically using some other method.
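
The drift problem is easy to see with a toy example: even a tiny constant bias in the accelerometer reading produces a velocity error that grows linearly with time. The bias value below is invented for illustration, not taken from any real sensor spec:

```python
# Illustrative only: a constant accelerometer bias of 0.001 m/s^2
# (a hypothetical sensor error) integrated over one hour. The true
# acceleration is zero, so any computed velocity is pure error.
BIAS = 0.001    # m/s^2, hypothetical sensor bias
DT = 1.0        # s, integration time step

velocity_error = 0.0
for _ in range(3600):                # one hour of integration
    velocity_error += BIAS * DT

print(round(velocity_error, 3))      # about 3.6 m/s of drift after an hour
```

This is why inertial systems are periodically recalibrated against an independent reference.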

Optical Navigation

On Earth, people have long navigated by the stars. In the northern hemisphere, just find Polaris. It’s called the North Star because Earth’s axis of rotation points right at it. That’s why it appears stationary, while the other stars seem to revolve around it. If you point a finger at Polaris you’ll be pointing north, and you can use that orientation to go in whatever direction you want.

Now, if you can measure the angle of Polaris above the horizon, you’ll also know your latitude. If the angle is 30 degrees, you’re at latitude 30 degrees. See, it’s easy. And once you can measure position, you just need to do it twice and record the time interval to find your velocity.
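
As the paragraph says, velocity falls out of two position fixes and a clock. A minimal sketch, with invented fixes and the rough rule that one degree of latitude spans about 111 km:

```python
# Velocity from two position fixes: speed = distance / elapsed time.
# The fixes and times below are hypothetical, for illustration only.
KM_PER_DEG_LAT = 111.0               # approximate km per degree of latitude

lat_fix_1, t1 = 30.0, 0.0            # degrees latitude, hours
lat_fix_2, t2 = 31.0, 2.0            # one degree farther north, 2 h later

distance_km = (lat_fix_2 - lat_fix_1) * KM_PER_DEG_LAT
speed_kmh = distance_km / (t2 - t1)
print(speed_kmh)                     # 55.5 km/h, heading north
```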

But celestial navigation works because we know how the Earth rotates, and that doesn’t help in a spacecraft. Oh well, can we just use the stars the way you’d use cows on the side of the road? Nope. The stars are so far away that astronauts would need to travel for many, many generations to detect any shift in their position. Like the airplane flying over the sea, you’d seem to be stationary, even while traveling 25,000 mph.

But we can still use the basic idea. For optical navigation in space, a spacecraft can locate other objects in the solar system. By knowing the precise location of these objects (which change over time) and where they appear relative to the viewer, it’s possible to triangulate a position. And again, by taking multiple position measurements over time, you can calculate a velocity.
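
A two-dimensional sketch of that triangulation idea: the spacecraft measures the bearing (direction angle) to two bodies whose positions are known, and it must sit at the intersection of the two sight lines. The body positions and bearings below are invented for illustration:

```python
import math

def line_intersection(p1, d1, p2, d2):
    """Intersect lines p1 + s*d1 and p2 + t*d2 in 2D via Cramer's rule."""
    det = d1[0] * (-d2[1]) + d2[0] * d1[1]
    s = ((p2[0] - p1[0]) * (-d2[1]) + d2[0] * (p2[1] - p1[1])) / det
    return (p1[0] + s * d1[0], p1[1] + s * d1[1])

# Known positions of two bodies and the bearings measured toward them
# (all values hypothetical). Each sight line passes through the body
# along the measured bearing, and the observer lies on both lines.
body_a, bearing_a = (10.0, 0.0), math.radians(0.0)    # seen due +x
body_b, bearing_b = (0.0, 10.0), math.radians(90.0)   # seen due +y

dir_a = (math.cos(bearing_a), math.sin(bearing_a))
dir_b = (math.cos(bearing_b), math.sin(bearing_b))

fix = line_intersection(body_a, dir_a, body_b, dir_b)
print(fix)   # approximately (0.0, 0.0): the observer's position
```

Repeating the fix after a known time interval then gives velocity, just as in the celestial case.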

In the end, even though spaceships lack speedometers, it’s possible to track their speed indirectly with a little physics. But it’s just another example of how flying in space is really, totally different—and way more complicated—than driving or flying on Earth.


