Insta360 X3 for $250: You’ll have to settle for 5.7K footage here, captured on a smaller 1/2-inch sensor, and that works out to roughly 1080p once you crop to a rectangular video format. Still, you get nearly the same form factor as the X4, and you can use it as a 4K, single-lens action cam. At this price the X3 remains a viable option for anyone wanting to dabble in 360 video without spending a fortune.
Insta360 One RS for $300: The company’s interchangeable-lens action-camera/360-camera hybrid is another option. The video footage isn’t as good as what the other cameras in this guide produce, but you can swap the lens and have an action camera in a moment, which is the major selling point. That said, now that the X3 and X4 can also be used as 4K action cameras, the One RS is less tempting than it used to be. Still, if you like the action-camera form factor but want to be able to shoot 360 footage as well, the One RS is a great camera. The ideal combo would be the 360 lens paired with the Leica lens, but the price for that pairing is considerably higher.
GoPro Max for $822: GoPro’s entry into the 360 camera world, the Max is a capable action camera, featuring 6K video in a waterproof form factor with industry-leading stabilization. It has all the shooting modes you know from your GoPro, like HyperSmooth, TimeWarp, PowerPano, and more. Like the X4, the Max has a single-lens mode (called Hero mode), and, my favorite part, it’s compatible with most GoPro mounts and accessories. The main reason the Max is not one of our top picks is that the Max 2 is likely coming very soon. If you want a Max, you’re better off waiting.
Qoocam 3 Ultra for $599: It’s not widely available, and we have not had a chance to try one, but Kandao’s Qoocam 3 Ultra is another 8K 360 camera that looks promising, at least on paper. The f/1.6 aperture is especially interesting, as most of the rest of these are in the f/2 and up range. We’ll update this guide when we’ve had a chance to test a Qoocam.
360 Cameras to Avoid
Insta360 One X2 for $230: Insta360’s older X2 is a clear step down from the X3 that replaced it. The form factor is less convenient (the screen is tiny; you pretty much have to use it with a phone). It still shoots 5.7K video, but it isn’t as well stabilized, nor is it anywhere near as sharp as the X3 or X4. Unless you can get it for well under $200, the X2 is not worth buying.
Insta360 One RS 1-Inch 360 Edition: Although I still like and use this camera, it appears to have been discontinued, and there’s no replacement in sight. The X5 delivers better video quality in a lighter, less fragile body, but I will miss those 1-inch sensors, which managed to pull out a lot of detail even if the footage topped out at 6K. These are still available used, but at outrageous prices. You’re better off with the X5.
Frequently Asked Questions
There are two reasons you’d want a 360-degree camera. The first is to shoot virtual reality content, where the final product is viewed in 360 degrees, on a VR headset or the like. So far this is mostly the province of professionals shooting on very expensive 360 rigs not covered in this guide, though there is a growing body of amateur creators as well. If this is what you want to do, go for the highest-resolution camera you can get. Either of our top two picks will work.
For most of us, though, the main appeal of a 360 camera is to shoot everything around you and then edit, or reframe, down to the part of the scene you want to focus on, panning and tracking objects within the 360 footage, with the result being a typical rectangular video that gets exported to the web. The video resolution and image quality will never match what you get from a high-end DSLR, but the DSLR might not be pointed at the right place at the right time. The 360 camera doesn’t have to be pointed anywhere; it just has to be on.
This is the best use case for the cameras on this page, which primarily produce HD (1080p) or better video—but not 4K—when reframed. I expect to see 12K-capable consumer-level 360 cameras in the next year or two (which is what you need to reframe to 4K), but for now, these are the best cameras you can buy.
Whether you’re shooting virtual tours or your kid’s birthday, the basic premise of a 360 camera is the same. The lenses (usually two very wide-angle fisheye lenses stitched together) capture the entire scene around you, ideally stitching out the selfie stick if you’re using one. Once you’ve captured your 360-degree view, you can then edit or reframe that content down to something ready to upload to YouTube, TikTok, and other video-sharing sites.
Why Is High Resolution Important in 360 Cameras?
Camera makers have been pushing ever-higher video resolution for so long that it feels like a gimmick in many cases, but not with 360 cameras. Because the camera is capturing a huge field of view, the canvas, if you will, is very large. To get a conventional video from that footage you have to crop, which effectively zooms in on the image, meaning your 8K 360 shot becomes just under 2.7K when you reframe it.
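If you want to see where numbers like that come from, here is a back-of-the-envelope sketch. It assumes a simple pixels-per-degree model and a hypothetical reframe field of view of about 125 degrees; real reframing software accounts for lens projection, so treat the results as rough estimates rather than exact figures.

```python
# Rough reframe-resolution estimate: how wide (in pixels) a conventional,
# flat crop ends up when pulled out of an equirectangular 360 frame.
# Simple pixels-per-degree model; real reframing tools correct for projection.

def reframed_width(equirect_width_px: int, reframe_hfov_deg: float = 125.0) -> int:
    """Approximate horizontal resolution of a reframed (flat) video."""
    pixels_per_degree = equirect_width_px / 360.0
    return round(pixels_per_degree * reframe_hfov_deg)

# Typical 360-camera output widths (approximate pixel counts)
for label, width in [("5.7K", 5760), ("8K", 7680), ("12K", 11520)]:
    print(f"{label} 360 capture -> roughly {reframed_width(width)} px wide when reframed")

# Prints approximately:
#   5.7K 360 capture -> roughly 2000 px wide when reframed  (about 1080p-class)
#   8K 360 capture   -> roughly 2667 px wide when reframed  (just under 2.7K)
#   12K 360 capture  -> roughly 4000 px wide when reframed  (4K-class)
```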
How Does “Reframing” Work?
Reframing is the process of taking the huge, 360-degree view of the world that your camera captures and zooming in on just a part of it to tell your story. This makes the 360 footage fit traditional movie formats (like 16:9), but as noted above it means cropping your footage, so the higher the resolution you start with, the better your reframed video will look.
If you’re shooting for VR headsets or other immersive tools, you don’t have to reframe anything.
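In practice, reframing tools boil down to keyframing a virtual camera (yaw, pitch, and field of view) over the sphere and interpolating between those keyframes. The sketch below is purely illustrative Python, not any vendor’s editing API, and the keyframe values are invented for the example.

```python
# Illustrative only: reframing as interpolation between virtual-camera keyframes.
# Each keyframe records where the virtual camera points at a given time.

from dataclasses import dataclass

@dataclass
class Keyframe:
    t: float      # seconds into the clip
    yaw: float    # degrees, 0 = straight ahead
    pitch: float  # degrees, 0 = level with the horizon
    fov: float    # horizontal field of view of the crop, in degrees

def interpolate(a: Keyframe, b: Keyframe, t: float) -> Keyframe:
    """Linear blend between two keyframes (real editors usually apply easing)."""
    w = (t - a.t) / (b.t - a.t)
    blend = lambda x, y: x + (y - x) * w
    return Keyframe(t, blend(a.yaw, b.yaw), blend(a.pitch, b.pitch), blend(a.fov, b.fov))

# Pan from the trail ahead (0 degrees) around to a rider behind you (180 degrees)
start = Keyframe(t=0.0, yaw=0.0, pitch=0.0, fov=110.0)
end = Keyframe(t=4.0, yaw=180.0, pitch=-10.0, fov=90.0)

for t in (0.0, 1.0, 2.0, 3.0, 4.0):
    k = interpolate(start, end, t)
    print(f"t={k.t:.1f}s  yaw={k.yaw:6.1f}  pitch={k.pitch:5.1f}  fov={k.fov:5.1f}")
```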
I’ve been shooting with 360 cameras since Insta360 released the X2 back in 2020. Early 360 cameras were fun, but the video they produced wasn’t high enough resolution to fit with footage from other cameras, limiting their usefulness. Thankfully we’ve come a long way in the last five years. The 360 camera market has grown, and the footage these cameras produce is good enough to mix seamlessly with footage from your action camera and even your high-end mirrorless camera.
To test 360 cameras, I’ve broken the process down into different shooting scenarios, especially scenes with different lighting conditions, to see how each camera performs. No camera is perfect, so which one is right for you depends on what you’re shooting. I’ve paid special attention to the ease of use of each camera (360 cameras can be confusing for beginners), along with the helpful extras each offers, such as HDR modes and support for accessories.
The final element of the picture is the editing workflow and the tools available for each camera. Since most people are shooting for social media, the raw 360 footage has to be edited before you post it anywhere. All the cameras above have companion software for mobile, Windows, and macOS.
And thanks to a mention in Dan Brown’s new novel, The Secret of Secrets, the festival has gained even more global recognition. Just a few weeks after the release of Brown’s new bestseller set in contemporary Prague, viewers were able to see for themselves what drew the popular writer to the festival, which is the largest Czech and Central European showcase of digital art. In one passage, the Signal Festival has a cameo appearance when the novel’s protagonist recalls attending an event at the 2024 edition.
“We’re happy about it,” festival director Martin Pošta says about the mention. “It’s a kind of recognition.” Not that the event needed promotion, even in one of the most anticipated novels of recent years. The organizers have yet to share the number of visitors to the festival this year, but the four-day event typically attracts half a million visitors.
On the final day, there was a long queue in front of the monumental installation Tristan’s Ascension by American video art pioneer Bill Viola before it opened for the evening, even though it was a ticketed event. In the Church of St. Salvator in the Convent of St. Agnes, visitors could watch a Christ-like figure rise upwards, streams of water defying gravity along with him, all projected on a huge screen.
The festival premiere took place on the Vltava River near the Dvořák Embankment. Taiwan’s Peppercorns Interactive Media Art presented a projection on a cloud of mist called Tzolk’in Light. While creators of other light installations have to deal with the challenges of buildings—their irregular surfaces, decorative details, and awkward cornices—projecting onto water droplets is a challenge of a different kind, with artists having to give up control over the resulting image. The shape and depth of Peppercorns’ work depended on the wind at any given moment, which determined how much of the scene was revealed to viewers and how much was simply blown away. The reward, however, was an extraordinary 3D spectacle reminiscent of a hologram—something that can’t be achieved with video projections on static, flat buildings.
Another premiere was a projection on the tower of the Old Town Hall, created for the festival by the Italian studio mammasONica. It transformed the 230-foot structure into a kaleidoscope of blue, green, red, and white surfaces. A short distance away, on Republic Square, Peppercorns had another piece: on a circular LED installation, they projected a work entitled Between Mountains and Seas, which recounted the history of Taiwan.
Software development is associated with the idea of not reinventing the wheel: developers often select components or software libraries with pre-built functionality rather than writing code to achieve the same result.
There are many benefits of this approach. For example, a software component that is widely deployed is likely to have undergone extensive testing and debugging. It is considered tried and trusted, mature technology, unlike brand-new code, which has not been thoroughly debugged and may inadvertently introduce unknown cyber security issues into the business.
The Lego analogy is often used to describe how these components can be put together to build enterprise applications. Developers can draw on functionality made available through application programming interfaces (APIs), which provide programmatic access to software libraries and components.
Increasingly, in the age of data-driven applications and greater use of artificial intelligence (AI), API access to data sources is another Lego brick that developers can use to create new software applications. And just as is the case with a set of old-school Lego bricks, constructing the application from the numerous software components available is left to the creativity of the software developer.
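As a trivial illustration of that reuse principle, the sketch below leans on the standard library’s well-tested primitives for a password-reset token instead of hand-rolled code; the function and its use case are hypothetical, chosen only to make the point.

```python
# A small example of the reuse principle: issuing a password-reset token with the
# standard library's well-tested primitives instead of a hand-rolled token generator
# or home-grown hash function that has never been reviewed or fuzzed.

import hashlib
import secrets

def issue_reset_token() -> tuple[str, str]:
    """Return (token_to_email, digest_to_store), built entirely from stdlib components."""
    token = secrets.token_urlsafe(32)                            # vetted randomness
    digest = hashlib.sha256(token.encode("utf-8")).hexdigest()   # vetted hashing
    return token, digest

token, stored_digest = issue_reset_token()
print(f"send to user: {token}")
print(f"store in db:  {stored_digest}")
```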
A Lego template for application development
To take the Lego analogy a bit further, there are instructions, templates and pathways developers can be encouraged to follow to build enterprise software that complies with corporate policies.
Roy Illsley, chief analyst for IT operations at Omdia, defines an internal developer platform (IDP) as a developer self-service portal for accessing the tools and environments that the organisation’s IT strategy says it should standardise on. “A developer self-service platform provides a way for organisations to offer their developers almost pre-authorised assets, artefacts and tools that they can use to develop code,” he says.
The basic idea is to provide a governance framework with a suite of compliant tools. Bola Rotibi, chief of enterprise research at CCS Insight, says: “A developer self-service platform is really about trying to get a governance path.”
Rotibi regards the platform as “a golden path”, which provides developers who are not as skilled as more experienced colleagues a way to fast-track their work within a governance structure that allows them a certain degree of flexibility and creativity.
The question of why offering flexibility to developers matters falls under the umbrella of developer experience and productivity. SnapLogic effectively provides modern middleware: it is used in digital transformation projects to connect disparate systems, and is now being repositioned for the age of agentic AI.
SnapLogic’s chief technology officer, Jeremiah Stone, says quite a few of the companies it has spoken to that identify as leaders in business transformation regard a developer portal offering self-service as something that goes hand-in-hand with digital infrastructure and AI-powered initiatives.
SnapLogic’s platform offers API management and service management, covering the lifecycle of services, version control and documentation, through a developer portal called the Dev Hub.
Stone says the capabilities of this platform extend from software developers to business technologists, and now AI users, who, he says, may be looking for a Model Context Protocol (MCP) endpoint.
Such know-how captured in a self-service developer portal enables users – whether they are software developers, or business users using low-code or no-code tooling – to connect AI with existing enterprise IT systems.
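To make the MCP reference a little more concrete, here is a minimal, hypothetical sketch of exposing one internal lookup as an MCP tool using the open source MCP Python SDK’s FastMCP helper. The get_invoice_status tool and its stubbed data are invented for illustration; this is not SnapLogic’s Dev Hub or any specific vendor’s implementation.

```python
# Hypothetical sketch: exposing an internal lookup as an MCP tool so an AI
# assistant can call it. Uses the open source MCP Python SDK's FastMCP helper;
# the invoice lookup is a stand-in for a real enterprise system of record.

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("finance-tools")

@mcp.tool()
def get_invoice_status(invoice_id: str) -> str:
    """Return the processing status of an invoice (stubbed for illustration)."""
    fake_backend = {"INV-1001": "paid", "INV-1002": "awaiting approval"}
    return fake_backend.get(invoice_id, "not found")

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```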
Enter Backstage
One platform that seems to have captured the attention of the developer community when it comes to developer self-service is Backstage. Having begun life internally at audio streaming service Spotify, Backstage is now an open source project managed by the Cloud Native Computing Foundation (CNCF).
Pia Nilsson, senior director of engineering at the streaming service, says: “At Spotify, we’ve learned that enabling developer self-service begins with standardisation. Traditional centralised processes create bottlenecks, but complete decentralisation can lead to chaos. The key is finding the middle ground – standardisation through design, where automation and clear workflows replace manual oversight.”
Used by two million developers, Backstage is an open source framework for building internal developer portals. Nilsson says Backstage provides a single, consistent entry point for all development activities – tools, services, documentation and data. She says this means “developers can move quickly while staying aligned with organisational standards”.
Nilsson points out that standardising the fleet of components that comprise an enterprise technology stack is sometimes regarded as a large migration effort, moving everyone onto a single version or consolidating products into one. However, she says: “While that’s a critical part of standardising the fleet, it’s even more important to figure out the intrinsic motivator for the organisation to keep it streamlined and learn to ‘self-heal’ tech fragmentation.”
According to Nilsson, this is why it is important to integrate all in-house-built tools, as well as all the developer tools the business has purchased, in the same IDP. Doing so, she notes, makes it very easy to spot duplication. “Engineers will only use what they enjoy using, and we usually enjoy using the stuff we built ourselves because it’s exactly what we need,” she says.
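One way to act on that visibility is to query the portal’s software catalog and look for overlapping components. The hedged sketch below assumes a Backstage deployment whose catalog REST API is reachable at /api/catalog/entities and that you already have a valid token; the URL and grouping logic are illustrative, not a prescribed workflow.

```python
# Hedged sketch: pulling the software catalog from a Backstage instance and grouping
# components by type to spot likely duplication (say, three home-grown feature-flag
# services). Assumes the catalog REST API is exposed at /api/catalog/entities and
# that the token below has permission to read it; both values are placeholders.

from collections import Counter

import requests

BACKSTAGE_URL = "https://backstage.example.internal"  # placeholder
TOKEN = "replace-with-a-real-token"                   # placeholder

resp = requests.get(
    f"{BACKSTAGE_URL}/api/catalog/entities",
    params={"filter": "kind=component"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

by_type = Counter(
    entity.get("spec", {}).get("type", "unknown") for entity in resp.json()
)

for component_type, count in by_type.most_common():
    print(f"{component_type}: {count} components")
```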
The fact that Backstage is a framework is something IT leaders need to consider. In a recent blog post, Forrester analysts Christopher Condo and Lauren Alexander warned that most IDPs are frameworks that require assembly: “While many teams that implemented Backstage assumed that it would be an easy, free addition to their DevOps practices, that isn’t always the case. Backstage can be complex and requires engineering expertise to assemble, build and deploy.”
However, Forrester also notes that commercial IDP options are now available that include an orchestration layer on top of Backstage. These offer another option that may be a better fit for some organisations.
AI in an IDP
Beyond the assembly work organisations will need to carry out if they do not buy a commercial IDP, AI is revolutionising software development, and its impact needs to be taken into account in any decision around developer self-service and IDPs.
Spotify’s Nilsson believes it is important for IT leaders to figure out how to support AI tooling usage in the most impactful way for their company.
“Today, there is both a risk to not leveraging enough AI tools or having it very unevenly spread across the company, as well as the risk that some teams give in to the vibes and release low-quality code to production,” she says.
According to Nilsson, this is why the IT team responsible for the IDP needs to drive up the adoption of these tools and evaluate the impact over time. “At Spotify, we drive broad AI adoption through education and hack weeks, which we promote through our product Skill Exchange. We also help engineers use context-aware agentic tools,” she adds.
Looking ahead
In terms of AI tooling, an example of how developer self-service could evolve is the direction of travel SAP looks to be taking with its Joule AI copilot tool.
CCS Insight’s Rotibi believes the trend of integrating AI into developer tools and platforms is an area of opportunity for developer self-service platforms. Among the interesting topics Rotibi saw at the recent SAP TechEd conference in Berlin was the use of AI in SAP Joule.
SAP announced new AI assistants in Joule, which it said are able to coordinate multiple agents across workflows, departments and applications. According to SAP, these assistants plan, initiate and complete complex tasks spanning finance, supply chain, HR and beyond.
“SAP Joule is an AI interface. It’s a bit more than just a chatbot. It is also a workbench,” says Rotibi. Given that Joule has access to the SAP product suite, she notes that, as well as providing access, Joule understands the products. “It knows all the features and functions SAP has worked on, and, behind the scenes, uses the best data model to get the data points the user wants,” she says.
Recognising that enterprise software developers will want to build their own applications and create their own integration between different pieces of software, she says SAP Joule effectively plays the role of a developer self-service portal for the SAP product suite.
Besides whatever comes next with AI-powered functionality, there are numerous benefits to offering developer self-service to improve the overall developer experience, but it requires structure and standards.
Nilsson says: “When structure, automation and visibility are built into the developer experience, you replace bottlenecks with flow and create an environment where teams can innovate quickly, confidently and responsibly.”
First a confession: I own more MoonSwatches than I care to admit. Never let it be said that WIRED does not walk the walk when it comes to recommending products—Swatch has assiduously extracted a considerable amount of cash from me, all in $285 increments. This was no doubt the Swiss company’s dastardly plan all along, to lure us in, then, oh so gently, get watch fans hooked. The horological equivalent of boiling a frog. It’s worked, too—Swatch has, so far, netted hundreds of millions of dollars from MoonSwatch sales.
But while I’ve been a fan of the Omega X Swatch mashup since we reported on exactly how the hugely lucrative collaboration came to be in the first place, I have never liked the iterative Moonshine Gold versions. Employing a sliver of Omega’s exclusive 18K pale yellow gold alloy in marginally different ways on each design, they seemed almost cynical—a way of milking the MoonSwatch superfans on the hunt to complete the set.
A hidden Snoopy message on the Cold Moon’s dial is revealed under UV light.
Photograph: Courtesy of Swatch
The MoonSwatch comes with a rubber strap upgrade over the original launch models.
Photograph: Courtesy of Swatch
Now, though, just when I thought I was done with MoonSwatch—having gone as far as to upgrade all of mine with official $45 color-matching rubber straps—Swatch has managed to ensnare me once again, and with a Moonshine Gold model: the new MoonSwatch Mission To Earthphase Moonshine Gold Cold Moon.
Clumsy moniker aside, this version takes the all-white 2024 Snoopy model (WIRED’s top pick of the entire collection), mixes it with the Earthphase MoonSwatches, and swaps the inferior original strap for a superior white-and-blue Swatch rubber Velcro one. Aesthetically, it’s definitely a win, but that is not the Cold Moon’s party trick.
On each $450 Cold Moon MoonSwatch, a snowflake is lasered onto its Moonshine Gold moon phase indicator—and, just like a real snowflake, Swatch claims each one will be completely unique. When you consider the volumes of MoonSwatches Swatch produces each year, this is no mean feat.
The unique golden snowflakes appear on the moon phase dial of the Cold Moon.