The Instagram iPad App Is Finally Here

Apple debuted the iconic and now wildly popular iPad in 2010. A few months later, Instagram landed on the App Store to rapid success. But for 15 years, Instagram hasn’t bothered to optimize its app layout for the iPad’s larger screen.

That’s finally changing today: There’s now a dedicated Instagram iPad app available globally on the App Store.

It has been a long time coming. Even before Apple began splitting its mobile operating system from iOS into iOS and iPadOS, countless apps adopted a fresh user interface that embraced the larger screen size of the tablet. This was the iPad’s calling card at the time, and those native apps optimized for its precise screen size are what made Apple’s device stand out from a sea of Android tablets that largely ran phone apps inelegantly blown up to fit the bigger screen.

Except Instagram never went iPad-native. Open the existing app right now, and you'll see the same phone app stretched to the iPad's screen size, with awkward gaps on the sides. And you'll run into occasional problems when posting photos from the iPad, like low-resolution images. Weirdly, Instagram did introduce layout improvements for folding phones a few years ago, which means the experience is better optimized on Android tablets today than it is on the iPad.

Instagram’s chief, Adam Mosseri, has long offered excuses, often citing a lack of resources despite being a part of Meta, a multibillion-dollar company. Instagram wasn’t the only offender—Meta promised a WhatsApp iPad app in 2023 and only delivered it earlier this year. (WhatsApp made its debut on phones in 2009.)

The fresh iPad app (which requires iPadOS 15.1 or later) offers more than just a facelift. Yes, Instagram now takes up the entire screen, but the company says users will drop straight into Reels, the short-form video platform it introduced five years ago to compete with TikTok. The Stories module remains at the top, and you can hop between tabs via the menu icons on the left. There's also a new Following tab (the people icon right below the home icon), a dedicated section for the latest posts from the people you actually follow.




Impact of US judge’s ruling on Google’s search dominance



The antitrust case weighed the impact of Google’s search dominance on consumers.

Google has escaped a breakup of its Chrome browser in a major US competition case, but the judge imposed remedies whose impact remains uncertain just as AI starts to compete with search engines.

Here is what we know about how the antitrust ruling could affect the company, the wider tech sector and ordinary users of the giant’s services.

—What is the impact on Google?

Judge Amit Mehta, who found a year ago that Google illegally maintained monopolies in online search, did not order the company to sell off its widely used Chrome browser in his Tuesday ruling.

Neither did he halt Google’s agreements with companies like iPhone maker Apple or Firefox browser developer Mozilla, under which it pays them to make Google their default search engine.

Instead, he ordered remedies including requirements to share data with other firms so they could develop their own search products, and barring exclusive deals to make Google the only search engine on a device or service.

The ruling was “far milder than feared… (it) removes a significant legal overhang and signals that the court is willing to pursue pragmatic remedies,” Hargreaves Lansdown analyst Matt Britzman commented.

Google chiefs nevertheless still “disagree… strongly with the Court’s initial decision in August 2024,” the company’s Vice President of Regulatory Affairs Lee-Anne Mulholland said in a blog post—hinting at a likely appeal that could go all the way to the US Supreme Court.

Stock in Google parent company Alphabet surged on Wednesday as investors welcomed the ruling.

—How will this affect the wider tech sector?

Mehta himself noted that the landscape has changed since the US Justice Department and 11 states launched their antitrust case against Google in 2020.

The emergence of generative artificial intelligence as a challenge to traditional search “give(s) the court hope that Google will not simply outbid competitors for distribution if superior products emerge,” he wrote in his ruling.

“Competition is intense and people can easily choose the services they want,” Google’s Mulholland agreed.

Others in the sector were unhappy with the ruling.

“Google will still be allowed to continue to use its monopoly to hold back competitors, including in AI search,” said Gabriel Weinberg, chief executive of privacy-conscious search engine DuckDuckGo.

Beyond Google, observers have pointed out that Apple and Mozilla are both big winners from the decision.

Ending tie-ups like theirs with Google would “impose substantial—in some cases, crippling—downstream harms to distribution partners, related markets and consumers,” Mehta wrote.

“This is a huge win for Apple, but perhaps even more so for Mozilla, which may very well have died” without the cash infusions, former Google Ventures investor M.G. Siegler wrote on his blog.

—What about ordinary search and AI users?

In the near term, some search data will be shared by Google with competitors under the ruling—with Mulholland saying the company has “concerns about how these requirements will impact our users and their privacy”.

Looking further ahead, “Google Search is in the process of being disrupted” by chatbots, Siegler said.

A future where the company's flagship search product is completely displaced may yet be far off: Google Search notched up more than 85 billion individual visits in March 2024, the most recent month for which Statista has data available.

That compares with around 700 million weekly users reported by OpenAI for its ChatGPT chatbot, the biggest-name generative AI product.

What’s more, Google is not barred from entering into the same kinds of distribution deals as it struck for online search to place its own AI products on partner devices or services.

The company already reports 450 million monthly users for its Gemini chatbot app, and offers competitive tools in other areas like video generation.

© 2025 AFP

Citation:
Impact of US judge’s ruling on Google’s search dominance (2025, September 3)
retrieved 3 September 2025
from https://techxplore.com/news/2025-09-impact-google-dominance.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.






What is phantom energy? The hidden electricity drain explained



Electronics are plugged in at a home Tuesday, Sept. 2, 2025, in Cincinnati. Credit: AP Photo/Joshua A. Bickel

The lights are off, the house is quiet and nothing seems to be running. But electricity is silently flowing through the plugs in your home. This hidden drain is known as phantom energy.

Also called vampire energy, the wasted electricity comes from leaving devices plugged in when they're not in use. That could include anything from phone chargers and microwaves to TVs and gaming consoles.

This wasted electricity accounts for about 5% to 10% of home energy use, depending on factors like the age of the equipment, according to Alexis Abramson, dean of the Columbia Climate School.

“Phantom energy depends on … what kind of systems you have and how much they’ve improved over time,” said Abramson.

For example, televisions that are connected to the internet and have smart wake features, which let them interact with phones and other devices, can draw up to 40 watts during the hours when the TV would normally be off, according to Matt Malinowski, director of the buildings program at the American Council for an Energy-Efficient Economy. That's almost 40 times as much as a regular television.
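The arithmetic behind those standby figures is simple to check yourself. As a rough sketch: the 40-watt figure comes from the article, while the electricity rate of $0.15 per kilowatt-hour and the assumption of round-the-clock draw are illustrative, not from the source.

```python
def annual_standby_cost(watts: float, hours_per_day: float = 24.0,
                        rate_per_kwh: float = 0.15) -> tuple[float, float]:
    """Return (kWh per year, dollar cost) for a device drawing `watts` on standby."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year, kwh_per_year * rate_per_kwh

# A smart TV idling at 40 W around the clock:
kwh, cost = annual_standby_cost(40)
print(f"{kwh:.0f} kWh/year, about ${cost:.0f}")  # 350 kWh/year, about $53
```

At a 1-watt draw, the same formula yields under $2 a year, which is why a single charger seems negligible even though the total across a household's devices is not.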

“The good news is there have been new, renewed efforts to tackle this,” said Malinowski.

He said advocates and manufacturers have come up with a voluntary agreement seeking to reduce the amount of energy smart televisions use when they’re in standby mode.

Phantom energy contributes to climate change because power drawn by unused devices can increase demand for electricity from sources that release planet-warming emissions. Aidan Charron, associate director of Global Earth Day, said that while the amount may seem small when a person looks at their individual utility bill, the environmental toll of phantom energy is significant when multiplied over homes across the country.

“Just take a little step of unplugging the things that you’re not using,” said Charron. “It will save you money and it’ll save emissions in the long run.”

What you can do

Some of the main culprits when it comes to draining energy are appliances that are constantly connected to electricity, such as those with a clock.

“Do you really need your microwave to tell you the time, or can you unplug your microwave when you’re not using it?” said Charron.

While unplugging devices may seem burdensome, it significantly contributes to reducing emissions.

Charron recommends starting small, like unplugging chargers for phones and other devices once the battery is fully charged, then moving on to other appliances, such as an unused lamp.

If unplugging sounds too hard, regularly checking your settings and disabling any extra features you're not using that could be draining energy helps, too. For example, smart televisions often have optional features that can be turned off so the television isn't listening for signals from other devices while in standby mode.

"If you're not using it, then you're getting no benefit, yet you're paying the price and increasing the use," said Malinowski.

How individual actions can make a difference

Individuals also tend to take more sustainable actions, such as unplugging devices, once they learn how to decrease their household emissions efficiently. Those actions could contribute to reducing U.S. emissions by about 20% per year, which equals about 450 tons (408 metric tons) of carbon dioxide, according to Jonathan Gilligan, a professor of earth and environmental science at Vanderbilt University.

The choices individuals take in their daily lives all add up, Gilligan said, mainly because of how much the U.S. population contributes to direct greenhouse gas emissions.

“The question becomes, what can we do to try to address this?” said Gilligan. “Phantom power is one part of this.”

The more individuals decrease their footprint, the more likely it is that others will follow, and eventually those actions may turn into social norms, according to Gilligan, because individuals don't want to feel like they're being irresponsible.

"This is a place where psychologists find that this effect is real. If people see that other people are doing actions to reduce their greenhouse gas emissions, they want to do that," said Gilligan.

When it comes to daily choices, individuals may think what they’re doing isn’t really making a big difference. But what they tend to overlook is how they influence others around them by choosing to live a more sustainable life.

The impact may be much stronger than a lot of people realize, Gilligan said.

© 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
What is phantom energy? The hidden electricity drain explained (2025, September 3)
retrieved 3 September 2025
from https://techxplore.com/news/2025-09-phantom-energy-hidden-electricity.html







Physical AI uses both sight and touch to manipulate objects like a human



Based on camera information, the arm grips both ends of the Velcro (A.1, B.1). Using tactile information, it senses the orientation of the tape and adjusts the posture and angle to align the hook surface with the loop surface (A.2, B.2). The Velcro is fixed, and the right arm presses it to ensure a firm connection (A.3, B.3). Different tape manipulation movements are automatically generated to adapt to the situation. Credit: Tohoku University

In everyday life, grabbing a cup of coffee from the table is a no-brainer: multiple sensory inputs, such as sight (seeing how far away the cup is) and touch, are combined in real time. Recreating this in artificial intelligence (AI), however, is not quite as easy.

An international group of researchers has created a new approach that integrates visual and tactile information to control robotic arms while adaptively responding to the environment. Compared to conventional vision-based methods, this approach achieved higher task success rates. These promising results represent a significant advance in the field of multimodal physical AI.

Details of their breakthrough were published in the journal IEEE Robotics and Automation Letters.

Machine learning can help artificial intelligence (AI) learn human movement patterns, enabling robots to autonomously perform daily tasks such as cooking and cleaning. For example, ALOHA (A Low-cost Open-source Hardware System for Bimanual Teleoperation), developed by Stanford University, enables low-cost, versatile remote operation and learning for dual-arm robots. Both the hardware and software are open source, so the research team was able to build upon this base.

However, these systems mainly rely on visual information alone. As a result, they lack the tactile judgments a human could make, such as distinguishing the texture of materials or the front and back sides of objects. For example, it can be easier to tell the front side of Velcro from the back by touching it rather than looking at it. Relying solely on vision is therefore a significant weakness.






Video of the physical AI in action, successfully tying a zip tie. Credit: Tohoku University

“To overcome these limitations, we developed a system that also enables operational decisions based on the texture of target objects—which are difficult to judge from visual information alone,” explains Mitsuhiro Hayashibe, a professor at Tohoku University’s Graduate School of Engineering.

“This achievement represents an important step toward realizing a multimodal physical AI that integrates and processes multiple senses such as vision, hearing, and touch—just like we do.”

The new system was dubbed "TactileAloha." The researchers found that the robot could perform appropriate bimanual operations even in tasks where front-back differences and adhesiveness are crucial, such as with Velcro and zip ties. By applying vision-tactile transformer technology, their physical AI robot exhibited more flexible and adaptive control.

The improved physical AI method was able to accurately manipulate objects by combining multiple sensory inputs to form adaptive, responsive movements. The practical applications of such robots are nearly endless. Research contributions such as TactileAloha bring us one step closer to robotic helpers becoming a seamless part of our everyday lives.
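The article doesn't give implementation details, but the general idea behind a vision-tactile transformer is to embed both sensor streams as tokens in a shared feature space and let self-attention mix them. The NumPy sketch below is a generic illustration of that fusion step only, with assumed shapes and random stand-in weights; it is not the TactileAloha model, and every name in it is hypothetical.

```python
import numpy as np

def fuse_modalities(vision_tokens, tactile_tokens, rng):
    """Single self-attention pass over concatenated vision + tactile tokens.

    vision_tokens:  (n_v, d) array of visual features
    tactile_tokens: (n_t, d) array of tactile features
    All weights are random stand-ins for learned parameters.
    """
    d = vision_tokens.shape[1]
    # Modality embeddings let attention tell the two streams apart.
    tokens = np.vstack([vision_tokens + rng.normal(size=d),
                        tactile_tokens + rng.normal(size=d)])
    wq, wk, wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)       # softmax over all tokens
    return attn @ v                                # (n_v + n_t, d) fused tokens

rng = np.random.default_rng(0)
fused = fuse_modalities(rng.normal(size=(4, 8)), rng.normal(size=(2, 8)), rng)
print(fused.shape)  # (6, 8)
```

The point of the joint attention is that every output token can draw on both modalities, so a decision such as "which side of the Velcro is facing up" can be informed by touch even when the camera view is ambiguous.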

The research group was comprised of members from Tohoku University’s Graduate School of Engineering and the Center for Transformative Garment Production, Hong Kong Science Park, and the University of Hong Kong.

More information:
Ningquan Gu et al, TactileAloha: Learning Bimanual Manipulation With Tactile Sensing, IEEE Robotics and Automation Letters (2025). DOI: 10.1109/LRA.2025.3585396

Provided by
Tohoku University


Citation:
Physical AI uses both sight and touch to manipulate objects like a human (2025, September 3)
retrieved 3 September 2025
from https://techxplore.com/news/2025-09-physical-ai-sight-human.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.




