Linux fréttir
Power utility GM talks to El Reg about getting that call and what happened next
Nick Lawler, general manager of the Littleton Electric Light and Water Departments (LELWD), was at home one Friday when he got a call from the FBI alerting him that the public power utility's network had been compromised. The digital intruders turned out to be Volt Typhoon.…
An anonymous reader quotes a report from The Register: New York State has sued Allstate Insurance for operating websites so badly designed they would deliver personal information in plain text to anyone who went looking for it. The data was lifted from Allstate's National General business unit, which ran a website for consumers who wanted to get a quote for a policy. That task required users to input a name and address, and once that info was entered, the site searched a LexisNexis Risk Solutions database for data on anyone who lived at the address provided. The results of that search would then appear on a screen that included the driver's license number (DLN) for the given name and address, plus "names of any other drivers identified as potentially living at that consumer's address, and the entire DLNs of those other drivers."
Naturally, miscreants used the system to mine for people's personal information for fraud. "National General intentionally built these tools to automatically populate consumers' entire DLNs in plain text -- in other words, fully exposed on the face of the quoting websites -- during the quoting process," the court documents [PDF] state. "Not surprisingly, attackers identified this vulnerability and targeted these quoting tools as an easy way to access the DLNs of many New Yorkers," according to the lawsuit. The digital thieves then used this information to "submit fraudulent claims for pandemic and unemployment benefits," we're told. ... [B]y the time the insurer resolved the mess, crooks had built bots that harvested at least 12,000 individuals' driver's license numbers from the quote-generating site.
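To make the complaint concrete, here is a minimal, hypothetical sketch (not National General's actual code) of the server-side masking whose absence the lawsuit describes: returning only a redacted DLN to the quoting page instead of the full number in plain text. All names below are illustrative.

```python
# Hypothetical sketch: mask driver's license numbers (DLNs) server-side before
# they ever reach the quoting page, instead of returning them in plain text.

def mask_dln(dln: str, visible: int = 3) -> str:
    """Return a DLN with everything except the last `visible` characters hidden."""
    if len(dln) <= visible:
        return "*" * len(dln)
    return "*" * (len(dln) - visible) + dln[-visible:]

def build_quote_response(drivers: list[dict]) -> list[dict]:
    """Shape lookup results for the quoting page, masking every DLN."""
    return [{"name": d["name"], "dln": mask_dln(d["dln"])} for d in drivers]

if __name__ == "__main__":
    sample = [{"name": "Jane Doe", "dln": "123456789"}]
    print(build_quote_response(sample))  # [{'name': 'Jane Doe', 'dln': '******789'}]
```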
Read more of this story at Slashdot.
Agency willing to take huge risks with human exploration, but not willing to do it for some dirt?
Rocket Lab has been on a roll lately, with multiple Electron launches, plans for an ocean platform for its Neutron rocket, and a second mission for in-space manufacturing business Varda under its belt. However, NASA has apparently rejected the company's Mars Sample Return mission proposal. Why?…
Don't, don't, DON'T believe the hype
The developer of Free95 says it will be a free Windows 95-compatible OS, but we suspect an elaborate prank. At best, maybe an unknowing one.…
Leaders call for fewer contractors and more top talent installed across government
Senior officials in the UK's civil service understand that future cyber hires in Whitehall will need to be paid a salary higher than that of the Prime Minister if the government wants to get serious about fending off attacks.…
AmiMoJo shares a report from Electrek: The U.S. installed 50 gigawatts (GW) of new solar capacity in 2024, the largest single year of new capacity added to the grid by any energy technology in over two decades. That's enough to power 8.5 million households. According to the U.S. Solar Market Insight 2024 Year in Review report (PDF) released today by the Solar Energy Industries Association (SEIA) and Wood Mackenzie, solar and storage account for 84% of all new electric generating capacity added to the grid last year.
In addition to historic deployment, surging U.S. solar manufacturing emerged as a landmark economic story in 2024. Domestic solar module production tripled last year, and at full capacity, U.S. factories can now produce enough to meet nearly all demand for solar panels in the U.S. Solar cell manufacturing also resumed in 2024, strengthening the U.S. energy supply chain. [...] Total US solar capacity is expected to reach 739 GW by 2035, but the report forecasts include scenarios showing how policy changes could impact the solar market. [...] The low case forecast shows a 130 GW decline in solar deployment over the next decade compared to the base case, representing nearly $250 billion of lost investment.
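A couple of back-of-envelope figures can be derived from the numbers above; the calculation below is not from the report itself, just simple ratios of the quoted values.

```python
# Back-of-envelope figures implied by the report's numbers (not from the report):
# nameplate capacity per household served, and the implied investment per watt
# behind the "130 GW decline ~= $250 billion of lost investment" comparison.

new_capacity_gw = 50          # 2024 additions
households_millions = 8.5     # households that capacity can power
low_case_decline_gw = 130     # deployment lost in the low-case forecast
lost_investment_busd = 250    # roughly $250 billion of lost investment

kw_per_household = new_capacity_gw * 1e6 / (households_millions * 1e6)
dollars_per_watt = lost_investment_busd * 1e9 / (low_case_decline_gw * 1e9)

print(f"~{kw_per_household:.1f} kW of nameplate capacity per household")  # ~5.9 kW
print(f"~${dollars_per_watt:.2f} per watt of implied investment")         # ~$1.92/W
```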
Read more of this story at Slashdot.
Five years after it launched its first database service, the MySQL fork is trying again
MariaDB says it is building a database-as-a-service based on open source principles after offloading its old DBaaS before going into private ownership.…
Vendors just don't want machines to live double lives
Column My decade-old and very well-travelled 13" MacBook Pro finally died, and I hoped my new-ish M2 iPad Pro could replace it.…
Redmond insists it's got this right and has even more impressive results to share soon
Microsoft's claim of having made quantum computing breakthroughs has attracted strong criticism from scientists, but the software giant says its work is sound – and it will soon reveal data that proves it.…
Longtime Slashdot reader schwit1 shares a report from Behind The Black: According to information at this tweet from anonymous sources, parts of Starship will likely require a major redesign due to the spacecraft's break-up shortly after stage separation on its last two test flights. These are the key take-aways, most of which focus on the redesign of the first version of Starship (V1) to create the V2 that flew unsuccessfully on those flights:
- Hot separation also aggravates the situation in the compartment.
- Not related to the flames from the Super Heavy during the booster turn.
- This is a fundamental miscalculation in the design of the Starship V2 and the engine section.
- The fuel lines, wiring for the engines and the power unit will be urgently redone.
- The fate of S35 and S36 is still unclear. Either revision or scrap.
- For the next ships, some processes may be paused in production until a decision on the design is made.
- The team was rushed with fixes for S34, hence the nervous start. There was no need to rush.
- The fixes will take much longer than 4-6 weeks.
- Comprehensive ground testing with long-term fire tests is needed. [emphasis mine]
It must be emphasized that this information comes from leaks from anonymous sources, and could be significantly incorrect. It does, however, fit the circumstances, and suggests that the next test flight will not occur in April but will be delayed for an unknown period beyond that.
Read more of this story at Slashdot.
Skip the schnitzel with gravy and chips for lunch - this is an experimental device for transplant candidates
Australian company BiVACOR has revealed a patient implanted with its artificial heart survived for 100 days – and is still with us after receiving a donated organ.…
An anonymous reader quotes a report from Ars Technica: On Tuesday, OpenAI unveiled a new "Responses API" designed to help software developers create AI agents that can perform tasks independently using the company's AI models. The Responses API will eventually replace the current Assistants API, which OpenAI plans to retire in the first half of 2026. With the new offering, users can develop custom AI agents that scan company files with a file search utility that rapidly checks company databases (with OpenAI promising not to train its models on these files) and navigate websites -- similar to functions available through OpenAI's Operator agent, whose underlying Computer-Using Agent (CUA) model developers can also access to enable automation of tasks like data entry and other operations.
However, OpenAI acknowledges that its CUA model is not yet reliable for automating tasks on operating systems and can make unintended mistakes. The company describes the new API as an early iteration that it will continue to improve over time. Developers using the Responses API can access the same models that power ChatGPT Search: GPT-4o search and GPT-4o mini search. These models can browse the web to answer questions and cite sources in their responses. That's notable because OpenAI says the added web search ability dramatically improves the factual accuracy of its AI models. On OpenAI's SimpleQA benchmark, which aims to measure confabulation rate, GPT-4o search scored 90 percent, while GPT-4o mini search achieved 88 percent -- both substantially outperforming the larger GPT-4.5 model without search, which scored 63 percent.
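As a rough illustration, here is a minimal sketch of how a developer might call the new Responses API with the hosted web search tool through OpenAI's official Python SDK. The model and tool names ("gpt-4o", "web_search_preview") follow OpenAI's launch documentation but may change, so treat this as an assumption-laden example rather than a definitive reference.

```python
# Minimal sketch of calling OpenAI's Responses API with the hosted web search
# tool, per the launch documentation. Assumes a recent `openai` Python package
# and OPENAI_API_KEY set in the environment; tool/model names may change.
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4o",
    tools=[{"type": "web_search_preview"}],  # lets the model browse and cite sources
    input="Summarize this week's biggest AI research news and cite sources.",
)

# The SDK exposes the concatenated text output; source citations appear as
# annotations on the individual output items.
print(response.output_text)
```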
Despite these improvements, the technology still has significant limitations. Aside from issues with CUA properly navigating websites, the improved search capability doesn't completely solve the problem of AI confabulations, with GPT-4o search still making factual mistakes 10 percent of the time. Alongside the Responses API, OpenAI released the open source Agents SDK, providing developers free tools to integrate models with internal systems, implement safeguards, and monitor agent activities. This toolkit follows OpenAI's earlier release of Swarm, a framework for orchestrating multiple agents.
Read more of this story at Slashdot.
Election infosec advisory agency also shuttered
A penetration tester who worked at the US govt's CISA claims his 100-strong team was dismissed after Elon Musk's Trump-blessed DOGE unit cancelled a contract – and that more staff at the cybersecurity agency have also been let go.…
An anonymous reader quotes a report from TechCrunch: There's a power crunch looming as AI and cloud providers ramp up data center construction. But a new report suggests that a solution lies beneath their foundations. Advanced geothermal power could supply nearly two-thirds of new data center demand by 2030, according to an analysis by the Rhodium Group. The additions would quadruple the amount of geothermal power capacity in the U.S. -- from 4 gigawatts to about 16 gigawatts -- while costing the same or less than what data center operators pay today. In the western U.S., where geothermal resources are more plentiful, the technology could provide 100% of new data center demand. Phoenix, for example, could add 3.8 gigawatts of data center capacity without building a single new conventional power plant.
Geothermal resources have enormous potential to provide consistent power. Historically, geothermal power plants have been limited to places where Earth's heat seeps close to the surface. But advanced geothermal techniques could unlock 90 gigawatts of clean power in the U.S. alone, according to the U.S. Department of Energy. [...] Because geothermal power has very low running costs, its price is competitive with data centers' energy costs today, the Rhodium report said. When data centers are sited similarly to how they are today, a process that typically takes into account proximity to fiber optics and major metro areas, geothermal power costs just over $75 per megawatt hour. But when developers account for geothermal potential in their siting, the costs drop significantly, down to around $50 per megawatt hour.
The report assumes that new generating capacity would be "behind the meter," which is what experts call power plants that are hooked up directly to a customer, bypassing the grid. Wait times for new power plants to connect to the grid can stretch on for years. As a result, behind the meter arrangements have become more appealing for data center operators who are scrambling to build new capacity.
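To put the report's per-megawatt-hour figures in context, the sketch below works out the annual energy bill for a hypothetical facility at the two quoted prices. The 100 MW size and constant utilization are assumptions for illustration only, not figures from the Rhodium report.

```python
# Rough illustration of the $75/MWh vs. $50/MWh spread in annual energy spend.
# The 100 MW facility and 100% utilization are illustrative assumptions.

facility_mw = 100                 # hypothetical data center load
hours_per_year = 8760
annual_mwh = facility_mw * hours_per_year

cost_sited_conventionally = annual_mwh * 75   # siting driven by fiber/metro proximity
cost_sited_for_geothermal = annual_mwh * 50   # siting that accounts for geothermal potential

print(f"Annual energy: {annual_mwh:,.0f} MWh")
print(f"At $75/MWh: ${cost_sited_conventionally / 1e6:.1f}M per year")
print(f"At $50/MWh: ${cost_sited_for_geothermal / 1e6:.1f}M per year")
print(f"Difference: ${(cost_sited_conventionally - cost_sited_for_geothermal) / 1e6:.1f}M per year")
```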
Read more of this story at Slashdot.
Microsoft tackles 50-plus security blunders, Adobe splats 3D bugs, and Apple deals with a doozy
Patch Tuesday Microsoft’s Patch Tuesday bundle has appeared, with a dirty dozen flaws competing for your urgent attention – six of them rated critical and another six already being exploited by criminals.…