Linux news
Yokohama release also adds meta-observability and takes a tilt at CRM
ServiceNow has for years used the example of employee onboarding to explain the power of its wares, pointing out that a lot of people around an organization are needed to get new hires on the payroll, registered with HR, equipped with a computer, and assigned appropriate permissions to access applications.…
Power utility GM talks to El Reg about getting that call and what happened next
Nick Lawler, general manager of the Littleton Electric Light and Water Departments (LELWD), was at home one Friday when he got a call from the FBI alerting him that the public power utility's network had been compromised. The digital intruders turned out to be Volt Typhoon.…
An anonymous reader quotes a report from The Register: New York State has sued Allstate Insurance for operating websites so badly designed they would deliver personal information in plain text to anyone who went looking for it. The data was lifted from Allstate's National General business unit, which ran a website for consumers who wanted to get a quote for a policy. That task required users to input a name and address, and once that info was entered, the site searched a LexisNexis Risk Solutions database for data on anyone who lived at the address provided. The results of that search would then appear on a screen that included the driver's license number (DLN) for the given name and address, plus "names of any other drivers identified as potentially living at that consumer's address, and the entire DLNs of those other drivers."
Naturally, miscreants used the system to mine for people's personal information for fraud. "National General intentionally built these tools to automatically populate consumers' entire DLNs in plain text -- in other words, fully exposed on the face of the quoting websites -- during the quoting process," the court documents [PDF] state. "Not surprisingly, attackers identified this vulnerability and targeted these quoting tools as an easy way to access the DLNs of many New Yorkers," according to the lawsuit. The digital thieves then used this information to "submit fraudulent claims for pandemic and unemployment benefits," we're told. ... [B]y the time the insurer resolved the mess, crooks had built bots that harvested at least 12,000 individuals' driver's license numbers from the quote-generating site.
Read more of this story at Slashdot.
Agency willing to take huge risks with human exploration, but not willing to do it for some dirt?
Rocket Lab has been on a roll lately, with multiple Electron launches, plans for an ocean platform for its Neutron rocket, and a second mission for in-space manufacturing business Varda under its belt. However, NASA has apparently rejected the company's Mars Sample Return mission proposal. Why?…
Don't, don't, DON'T believe the hype
The developer of Free95 says it will be a free Windows 95-compatible OS, but we suspect it's an elaborate prank. At best, maybe an unknowing one.…
Leaders call for fewer contractors and more top talent installed across government
Senior officials in the UK's civil service understand that future cyber hires in Whitehall will need to be paid a salary higher than that of the Prime Minister if the government wants to get serious about fending off attacks.…
AmiMoJo shares a report from Electrek: The U.S. installed 50 gigawatts (GW) of new solar capacity in 2024, the largest single year of new capacity added to the grid by any energy technology in over two decades. That's enough to power 8.5 million households. According to the U.S. Solar Market Insight 2024 Year in Review report (PDF) released today by the Solar Energy Industries Association (SEIA) and Wood Mackenzie, solar and storage account for 84% of all new electric generating capacity added to the grid last year.
In addition to historic deployment, surging U.S. solar manufacturing emerged as a landmark economic story in 2024. Domestic solar module production tripled last year, and at full capacity, U.S. factories can now produce enough to meet nearly all demand for solar panels in the U.S. Solar cell manufacturing also resumed in 2024, strengthening the U.S. energy supply chain. [...] Total U.S. solar capacity is expected to reach 739 GW by 2035, but the report forecasts include scenarios showing how policy changes could impact the solar market. [...] The low case forecast shows a 130 GW decline in solar deployment over the next decade compared to the base case, representing nearly $250 billion of lost investment.
Read more of this story at Slashdot.
Five years after it launched its first database service, the MySQL fork is trying again
MariaDB says it is building a database-as-a-service based on open source principles after offloading its old DBaaS before going into private ownership.…
Vendors just don't want machines to live double lives
Column My decade-old and very well-travelled 13" MacBook Pro finally died, and I hoped my new-ish M2 iPad Pro could replace it.…
Redmond insists it's got this right and has even more impressive results to share soon
Microsoft's claim of having made quantum computing breakthroughs has attracted strong criticism from scientists, but the software giant says its work is sound – and it will soon reveal data that proves it.…
Longtime Slashdot reader schwit1 shares a report from Behind The Black: According to information at this tweet from anonymous sources, parts of Starship will likely require a major redesign due to the spacecraft's break-up shortly after stage separation on its last two test flights. These are the key take-aways, most of which focus on the redesign of the first version of Starship (V1) to create the V2 that flew unsuccessfully on those flights:
- Hot separation also aggravates the situation in the compartment.
- Not related to the flames from the Super Heavy during the booster turn.
- This is a fundamental miscalculation in the design of the Starship V2 and the engine section.
- The fuel lines, wiring for the engines and the power unit will be urgently redone.
- The fate of S35 and S36 is still unclear. Either revision or scrap.
- For the next ships, some processes may be paused in production until a decision on the design is made.
- The team was rushed with fixes for S34, hence the nervous start. There was no need to rush.
- The fixes will take much longer than 4-6 weeks.
- Comprehensive ground testing with long-term fire tests is needed. [emphasis mine]
It must be emphasized that this information comes from leaks from anonymous sources, and could be significantly incorrect. It does however fit the circumstances, and suggests that the next test flight will not occur in April but will be delayed for an unknown period beyond that.
Read more of this story at Slashdot.
Skip the schnitzel with gravy and chips for lunch - this is an experimental device for transplant candidates
Australian company BiVACOR has revealed a patient implanted with its artificial heart survived for 100 days – and is still with us after receiving a donated organ.…
An anonymous reader quotes a report from Ars Technica: On Tuesday, OpenAI unveiled a new "Responses API" designed to help software developers create AI agents that can perform tasks independently using the company's AI models. The Responses API will eventually replace the current Assistants API, which OpenAI plans to retire in the first half of 2026. With the new offering, users can develop custom AI agents that scan company files with a file search utility that rapidly checks company databases (with OpenAI promising not to train its models on these files) and navigate websites -- similar to functions available through OpenAI's Operator agent, whose underlying Computer-Using Agent (CUA) model developers can also access to enable automation of tasks like data entry and other operations.
However, OpenAI acknowledges that its CUA model is not yet reliable for automating tasks on operating systems and can make unintended mistakes. The company describes the new API as an early iteration that it will continue to improve over time. Developers using the Responses API can access the same models that power ChatGPT Search: GPT-4o search and GPT-4o mini search. These models can browse the web to answer questions and cite sources in their responses. That's notable because OpenAI says the added web search ability dramatically improves the factual accuracy of its AI models. On OpenAI's SimpleQA benchmark, which aims to measure confabulation rate, GPT-4o search scored 90 percent, while GPT-4o mini search achieved 88 percent -- both substantially outperforming the larger GPT-4.5 model without search, which scored 63 percent.
Despite these improvements, the technology still has significant limitations. Aside from issues with CUA properly navigating websites, the improved search capability doesn't completely solve the problem of AI confabulations, with GPT-4o search still making factual mistakes 10 percent of the time. Alongside the Responses API, OpenAI released the open source Agents SDK, providing developers free tools to integrate models with internal systems, implement safeguards, and monitor agent activities. This toolkit follows OpenAI's earlier release of Swarm, a framework for orchestrating multiple agents.
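As a rough illustration of the workflow the article describes, the sketch below assembles a request body for a Responses API call with the built-in web-search tool enabled. The model name and the `"web_search"` tool type are assumptions based on the article's description rather than verified API details, and the request is only constructed here, not sent.

```python
# Minimal sketch of a Responses API request with web search (assumed
# field names; check OpenAI's current SDK documentation before use).

def build_web_search_request(question: str) -> dict:
    """Assemble the JSON body for a hypothetical Responses API call.

    The "web_search" tool type and the model name mirror the article's
    description; both are assumptions, not verified against the live API.
    """
    return {
        "model": "gpt-4o",                  # search-capable model (assumed)
        "tools": [{"type": "web_search"}],  # built-in web-search tool (assumed)
        "input": question,
    }

request = build_web_search_request("Who won the 2024 Nobel Prize in Physics?")
# Actually sending it would look roughly like:
#   from openai import OpenAI
#   response = OpenAI().responses.create(**request)
#   print(response.output_text)
```

Declaring the tool in the request, rather than orchestrating search calls yourself, is the design point here: the model decides when to search and cites sources in its answer.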
Read more of this story at Slashdot.
Election infosec advisory agency also shuttered
A penetration tester who worked at the US govt's CISA claims his 100-strong team was dismissed after Elon Musk's Trump-blessed DOGE unit cancelled a contract – and that more staff at the cybersecurity agency have also been let go.…
An anonymous reader quotes a report from TechCrunch: There's a power crunch looming as AI and cloud providers ramp up data center construction. But a new report suggests that a solution lies beneath their foundations. Advanced geothermal power could supply nearly two-thirds of new data center demand by 2030, according to an analysis by the Rhodium Group. The additions would quadruple the amount of geothermal power capacity in the U.S. -- from 4 gigawatts to about 16 gigawatts -- while costing the same or less than what data center operators pay today. In the western U.S., where geothermal resources are more plentiful, the technology could provide 100% of new data center demand. Phoenix, for example, could add 3.8 gigawatts of data center capacity without building a single new conventional power plant.
Geothermal resources have enormous potential to provide consistent power. Historically, geothermal power plants have been limited to places where Earth's heat seeps close to the surface. But advanced geothermal techniques could unlock 90 gigawatts of clean power in the U.S. alone, according to the U.S. Department of Energy. [...] Because geothermal power has very low running costs, its price is competitive with data centers' energy costs today, the Rhodium report said. When data centers are sited similarly to how they are today, a process that typically takes into account proximity to fiber optics and major metro areas, geothermal power costs just over $75 per megawatt hour. But when developers account for geothermal potential in their siting, the costs drop significantly, down to around $50 per megawatt hour.
The report assumes that new generating capacity would be "behind the meter," which is what experts call power plants that are hooked up directly to a customer, bypassing the grid. Wait times for new power plants to connect to the grid can stretch on for years. As a result, behind the meter arrangements have become more appealing for data center operators who are scrambling to build new capacity.
Read more of this story at Slashdot.
Microsoft tackles 50-plus security blunders, Adobe splats 3D bugs, and Apple deals with a doozy
Patch Tuesday Microsoft’s Patch Tuesday bundle has appeared, with a dirty dozen flaws competing for your urgent attention – six of them rated critical and another six already being exploited by criminals.…
Sphere Entertainment Co, the company behind the Las Vegas Sphere, said it is considering opening scaled-down versions of the immersive venue in other cities. AV Magazine reports: While this has been feasible for its high-profile residencies such as U2, the Eagles, Dead & Company and Anyma, smaller venues could attract a broader range of artists who might not have the budget or demand to fill the flagship Las Vegas location. By scaling down the size while retaining the signature technology, Sphere Entertainment Co can offer a similar spectacle at a more sustainable cost for artists and spectators.
The possibility of mini-Spheres follows news that a full-scale venue will open in the UAE as a result of a partnership between Sphere Entertainment Co and the Department of Culture and Tourism -- Abu Dhabi. Beyond concerts, the Las Vegas Sphere has proven successful with immersive films such as V-U2: An Immersive Concert Film and the Sphere Experience featuring Darren Aronofsky's Postcard from Earth, which in January passed 1,000 screenings. "As we enter a new fiscal year, we see significant opportunities to drive our Sphere business forward in Las Vegas and beyond," said Sphere Entertainment CEO James Dolan. "We believe we are on a path toward realizing our vision for this next-generation medium and generating long-term shareholder value."
Read more of this story at Slashdot.
MojoKid writes: AMD just launched its latest flagship desktop processor, the Ryzen 9 9950X3D. Ryzen 9 9950X3D is a 16-core/32-thread, dual-CCD part with a base clock of 4.3GHz and a max boost clock of 5.7GHz. There's also 96MB of second-gen 3D V-Cache on board. Standard Ryzen 9000 series processors feature 32MB of L3 cache per compute die, but with the Ryzen 9 9950X3D, one compute die is outfitted with an additional 96MB of 3D V-Cache, bringing the total L3 up to 128MB (144MB total cache). The CCD outfitted with 3D V-Cache operates at more conservative voltages and frequencies, but the bare compute die is unencumbered.
The Ryzen 9 9950X3D turns out to be a high-performance, no-compromise desktop processor. Its complement of 3D V-Cache provides tangible benefits in gaming, and AMD's continued work on the platform's firmware and driver software ensures that even with the Ryzen 9 9950X3D's asymmetrical CCD configuration, performance is strong across the board. At $699, it's not cheap, but it's a great CPU for gaming and content creation, and currently one of the most powerful standard desktop CPUs money can buy.
Read more of this story at Slashdot.
Microsoft is ending support of its Remote Desktop app for Windows on May 27th. From a report: If you use the Remote Desktop app to connect to Windows 365, Azure Virtual Desktop, or Microsoft Dev Box machines then you'll have to transition to the Windows app instead.
The new Windows app, which launched in September, includes multimonitor support, dynamic display resolutions, and easy access to cloud PCs and virtual desktops. Microsoft says "connections to Windows 365, Azure Virtual Desktop, and Microsoft Dev Box via the Remote Desktop app from the Microsoft Store will be blocked after May 27th, 2025."
Read more of this story at Slashdot.
A new chapter has begun for two of the world's most popular preprint platforms, bioRxiv and medRxiv, with the launch of a non-profit organization that will manage them, their co-founders announced today. From a report: The servers allow researchers to share manuscripts for free before peer review and have become an integral part of publishing biology and medical research. Until now, they had been managed by Cold Spring Harbor Laboratory (CSHL) in New York. The new organization, named openRxiv, will have a board of directors and a scientific and medical advisory board. It is supported by a fresh US$16-million grant from the Chan Zuckerberg Initiative (CZI), the projects' main financial backer.
"It's just exciting to see this key piece of infrastructure really get the attention that it deserves as a dedicated initiative," says Katie Corker, executive director of ASAPbio, a scientist-driven non-profit organization, which is based in San Francisco, California. Preprints are "the backbone of the scientific publishing ecosystem, maybe especially at the current moment, when there's a lot of worries about who has control of information."
The launch of openRxiv "reflects a maturation of the projects," which started as an experiment at CSHL, says Richard Sever, a co-founder of both servers and chief science and strategy officer at openRxiv. It has "become so important that they should have their own organization running them, which is focused on the long-term sustainability of the servers, as opposed to being a side project within a big research institution," says Sever.
Read more of this story at Slashdot.