Heightened global risk pushes interest in data sovereignty | Computer Weekly



Heightened risk related to data sovereignty is universally acknowledged. Most IT decision makers see that risk increasing as a result of geopolitical instability, and that inadequate preparation could result in costly reputational damage and a loss of customer trust.

Those are the key findings of a Pure Storage-sponsored survey in which the University of Technology Sydney carried out interview-based qualitative research among IT practitioners in the Europe and Asia-Pacific regions.

The survey found:

  • 100% of those asked believed sovereignty risks that include potential service disruption have forced organisations to reconsider where data is located;
  • 92% said geopolitical shifts had increased sovereignty risks;
  • 92% believed inadequate sovereignty planning could lead to reputational damage;
  • 85% identified loss of customer trust as the key consequence of inaction;
  • 78% said they had embraced data strategies that included engaging with multiple service providers, adopting sovereign datacentres (on-premises or in-country), and building enhanced governance requirements into commercial agreements.

The survey commentary talks of a “perfect storm” where service disruption risks, foreign influence and evolving regulations converge to create huge exposure to risk for organisations that could result in revenue loss, regulatory penalties and irreparable damage to stakeholder trust if not addressed.

One IT decision maker talked about how complex data sovereignty can be to unpick, and how it now forms key planks of their organisation’s agreements with customers.

“The Access Group handles sensitive end user data for our customers across the world, from the NHS in the UK to the Tax Department in Australia,” said Rolf Krolke, regional technology director for APAC with The Access Group. “Data sovereignty is an absolutely critical issue for us and our customers. In fact, they ask that it be written into our contracts.”

The concept of data sovereignty centres on the idea that information created, processed, converted and stored in digital form is subject to the laws of the country in which it was generated. But data can travel, too, and when it does, it is the destination country’s laws on data held there that must be adhered to. That is known as data residency.

Difficulties can arise when the two concepts meet and the laws of one state contradict those of another, as with the European Union’s General Data Protection Regulation, which requires that data transferred to another jurisdiction be held with adequate safeguards and protections.

For such reasons, organisations often want to know where their data goes, and also might want to keep it in known – often home country – locations.

Such concerns have been heightened by recent geopolitical instability, and by the febrile atmosphere that has grown around international cyber crime.

The rise of cloud computing is at the core of many of these concerns and difficulties.

Datacentre locations

Datacentre locations and the global supply chain are also concerns, said Patrick Smith, EMEA chief technology officer at Pure Storage, who suggested that organisations and states will need to move towards, or are already moving towards, building their own sovereign capacity.

This, he said, means physical equipment and in-country datacentre capacity, and that’s not a trivial obstacle to surmount.

“It’s interesting when you think about some of the constrained components that we’ve seen on the global stage,” said Smith. “A great example is Nvidia GPUs [graphics processing units], which require almost a global village to produce them.

“As soon as you start looking at data sovereignty, you’re looking at, ‘How do I build my sovereign capability? Where do I get all the components from?’ Many countries have effectively outsourced datacentres. They’ve put them outside of their own geography.

“With a sovereign capability, you’re talking about having to host those datacentres within your own borders,” he said. “And that suddenly means that you need to have that energy production and water supply to support that datacentre.”




The Justice Department Released More Epstein Files—but Not the Ones Survivors Want



Over the weekend, the Justice Department released three new data sets comprising files related to Jeffrey Epstein. The DOJ had previously released nearly 4,000 documents prior to the Friday midnight deadline required by the Epstein Files Transparency Act.

As with Friday’s release, the new tranche appears to contain hundreds of photographs, along with various court records pertaining to Epstein and his associates. The first of the additional data sets, Data Set 5, consists of photos of hard drives and physical folders, as well as chain-of-custody forms. Data Set 6 appears to be mostly grand jury materials from cases out of the Southern District of New York against Epstein and his co-conspirator, Ghislaine Maxwell. Data Set 7 includes more grand jury materials from those cases, as well as materials from a separate 2007 Florida grand jury.

Data Set 7 also includes an out-of-order transcript of a 2019 interview between R. Alexander Acosta and the DOJ’s Office of Professional Responsibility. According to the transcript, the OPR was investigating whether attorneys in the US Attorney’s Office for the Southern District of Florida committed professional misconduct by entering into a non-prosecution agreement with Epstein, who was under investigation by state law enforcement on sexual battery charges. Acosta was the head of the office when the agreement was signed.

Leading up to the deadline to release materials, the DOJ made three separate requests to unseal grand jury materials. Those requests were granted earlier this month.

The initial release of the Epstein files was met with protest, particularly from Epstein victims and Democratic lawmakers. “The public received a fraction of the files, and what we received was riddled with abnormal and extreme redactions with no explanation,” a group of 19 women who survived abuse by Epstein and Maxwell wrote in a statement posted on social media. Senator Chuck Schumer said Monday that he would force a vote that would allow the Senate to sue the Trump administration for a full release of the Epstein files.

Along with the release of the new batch of files over the weekend, the Justice Department also removed at least 16 files from its initial offering, including a photograph that depicted Donald Trump. The DOJ later restored that photograph, saying in a statement on X that it had initially been flagged “for potential further action to protect victims.” The post went on to say that “after the review, it was determined there is no evidence that any Epstein victims are depicted in the photograph, and it has been reposted without any alteration or redaction.”

The Justice Department acknowledged in a fact sheet on Sunday that it has “hundreds of thousands of pages of material to release,” claiming that it has more than 200 lawyers reviewing files prior to release.




OpenAI’s Child Exploitation Reports Increased Sharply This Year



OpenAI sent 80 times as many child exploitation incident reports to the National Center for Missing & Exploited Children during the first half of 2025 as it did during a similar time period in 2024, according to a recent update from the company. The NCMEC’s CyberTipline is a Congressionally authorized clearinghouse for reporting child sexual abuse material (CSAM) and other forms of child exploitation.

Companies are required by law to report apparent child exploitation to the CyberTipline. When a company sends a report, NCMEC reviews it and then forwards it to the appropriate law enforcement agency for investigation.

Statistics related to NCMEC reports can be nuanced. Increased reports can sometimes indicate changes in a platform’s automated moderation, or the criteria it uses to decide whether a report is necessary, rather than necessarily indicating an increase in nefarious activity.

Additionally, the same piece of content can be the subject of multiple reports, and a single report can cover multiple pieces of content. Some platforms, including OpenAI, disclose both the number of reports and the total pieces of content they covered, for a more complete picture.

OpenAI spokesperson Gaby Raila said in a statement that the company made investments toward the end of 2024 “to increase [its] capacity to review and action reports in order to keep pace with current and future user growth.” Raila also said that the time frame corresponds to “the introduction of more product surfaces that allowed image uploads and the growing popularity of our products, which contributed to the increase in reports.” In August, Nick Turley, vice president and head of ChatGPT, announced that the app had four times the amount of weekly active users than it did the year before.

During the first half of 2025, the number of CyberTipline reports OpenAI sent was roughly the same as the number of pieces of content those reports covered: 75,027 reports about 74,559 pieces of content. In the first half of 2024, it sent 947 CyberTipline reports about 3,252 pieces of content. Both figures saw a marked increase between the two periods.
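As a back-of-the-envelope check, the growth can be computed directly from the figures in the preceding paragraph. This is a sketch using only the numbers cited in this article, not any data published by OpenAI or NCMEC:

```python
# All figures are the first-half (H1) numbers quoted in the article.
reports_h1_2024, content_h1_2024 = 947, 3_252
reports_h1_2025, content_h1_2025 = 75_027, 74_559

report_growth = reports_h1_2025 / reports_h1_2024    # ~79x, consistent with "80 times"
content_growth = content_h1_2025 / content_h1_2024   # ~23x
ratio_2025 = reports_h1_2025 / content_h1_2025       # ~1.0: roughly one report per item

print(f"report growth:  {report_growth:.1f}x")
print(f"content growth: {content_growth:.1f}x")
print(f"2025 reports per content item: {ratio_2025:.2f}")
```

The report count grew faster than the content count, which fits the article's caveat that report volume can reflect moderation and reporting-criteria changes as much as underlying activity.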

Content, in this context, could mean multiple things. OpenAI has said that it reports all instances of CSAM, including uploads and requests, to NCMEC. Besides its ChatGPT app, which allows users to upload files, including images, and can generate text and images in response, OpenAI also offers access to its models via an API. The most recent NCMEC count wouldn’t include any reports related to the video-generation app Sora, as its September release came after the time frame covered by the update.

The spike in reports follows a similar pattern to what NCMEC has observed at the CyberTipline more broadly with the rise of generative AI. The center’s analysis of all CyberTipline data found that reports involving generative AI saw a 1,325 percent increase between 2023 and 2024. NCMEC has not yet released 2025 data, and while other large AI labs like Google publish statistics about the NCMEC reports they’ve made, they don’t specify what percentage of those reports are AI-related.




The Doomsday Glacier Is Getting Closer and Closer to Irreversible Collapse



Known as the “Doomsday Glacier,” the Thwaites Glacier in Antarctica is one of the most rapidly changing glaciers on Earth, and its future evolution is one of the biggest unknowns when it comes to predicting global sea level rise.

The eastern ice shelf of the Thwaites Glacier is supported at its northern end by a ridge of the ocean floor. However, over the past two decades, cracks in the upper reaches of the glacier have increased rapidly, weakening its structural stability. A new study by the International Thwaites Glacier Collaboration (ITGC) presents a detailed record of this gradual collapse process.

Researchers at the Centre for Earth Observation Science at the University of Manitoba, Canada, analyzed observational data from 2002 to 2022 to track the formation and propagation of cracks in the ice shelf’s shear zone. They discovered that as the cracks grew, the connection between the ice shelf and the underwater ridge weakened, accelerating the upstream flow of ice.

[Video: University of Manitoba. A fast-motion video of Thwaites Glacier in Antarctica over a period of about 10 years.]

Cracks in the Ice Shelf Widen in Two Stages

The study reveals that the weakening of the ice shelf occurred in four distinct phases, with crack growth occurring in two stages. In the first stage, long cracks appeared along the ice flow, gradually extending eastward. Some exceeded 8 km in length and spanned the entire shelf. In the second stage, numerous short cross-flow cracks, less than 2 km long, emerged, doubling the total length of the fissures.

Analysis of satellite images showed that the total length of the cracks increased from about 165 km in 2002 to approximately 336 km in 2021. Meanwhile, the average length of each crack decreased from 3.2 km to 1.5 km, with a notable increase in small cracks. These changes reflect a significant shift in the stress state of the ice shelf; that is, in the interaction of forces within its structure.
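For scale, the satellite figures above imply rough crack counts, obtained by dividing total crack length by average crack length. This is a back-of-the-envelope sketch based only on the numbers quoted here, not a count reported by the study:

```python
# Figures quoted in the article: total and average crack lengths, in km.
total_km_2002, avg_km_2002 = 165, 3.2
total_km_2021, avg_km_2021 = 336, 1.5

count_2002 = total_km_2002 / avg_km_2002   # ~52 cracks
count_2021 = total_km_2021 / avg_km_2021   # 224 cracks

print(f"2002: ~{count_2002:.0f} cracks; 2021: ~{count_2021:.0f} cracks")
```

A roughly fourfold increase in the number of cracks, while their average length halved, matches the described shift toward many short cross-flow fissures.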

Between 2002 and 2006, the ice shelf accelerated as it was pulled by nearby fast-moving currents, generating compressive stress on the anchorage point, which initially stabilized the shelf. After 2007, the shear zone between the shelf and the Western ice tongue collapsed. The stress concentrated around the anchorage point, leading to the formation of large cracks.

Since 2017, these cracks have completely penetrated the ice shelf, severing the connection to the anchorage. According to researchers, this has accelerated the upstream flow of ice and turned the anchorage into a destabilizing factor.

Feedback Loop Collapse

One of the most significant findings of the study is the existence of a feedback loop: Cracks accelerate the flow of ice, and in turn, this increased speed generates new cracks. This process was clearly recorded by the GPS devices that the team deployed on the ice shelf between 2020 and 2022.

During the winter of 2020, the upward propagation of structural changes in the shear zone was particularly evident. These changes advanced at a rate of approximately 55 kilometers per year within the ice shelf, demonstrating that structural collapse in the shear zone directly impacts upstream ice flow.


