Linux fréttir
Roomba maker iRobot has warned it may cease operations within 12 months unless it can refinance debt or find a buyer, just one day after launching a new vacuum cleaner line. In its March 12 quarterly report, the company disclosed it had spent $3.6 million to amend terms on a $200 million Carlyle Group loan from 2023, as U.S. revenue plunged 47% in the fourth quarter.
"Given these uncertainties and the implication they may have on the Company's financials, there is substantial doubt about the Company's ability to continue as a going concern for a period of at least 12 months from the date of the issuance of its consolidated 2024 financial statements," the company wrote.
The robot vacuum pioneer has initiated a formal strategic review after a failed Amazon acquisition, the departure of founder Colin Angle, and layoffs affecting over half its workforce. iRobot cited mounting competition from Chinese manufacturers and expects continued losses for "the foreseeable future."
Read more of this story at Slashdot.
Artist formerly known as OpenStack to huddle under same umbrella as the Cloud Native Computing Foundation
The votes are in, confirming that the Open Infrastructure Foundation intends to join the Linux Foundation.…
Morgan Stanley has reduced its iPhone shipment forecasts after Apple confirmed the delay of a more advanced Siri personal assistant, dampening prospects for accelerating phone upgrades. The investment bank now predicts 230 million iPhone shipments in 2025 (flat year-over-year) and 243 million in 2026 (up 6%), down from previous estimates.
An upgraded Siri was the most sought-after Apple Intelligence feature among prospective buyers, according to the bank's survey data. "Access to Advanced AI Features" appeared as a top-five driver of smartphone upgrades for the first time, with about 50% of iPhone owners who didn't upgrade to iPhone 16 citing the delayed Apple Intelligence rollout as affecting their decision. The firm also incorporated headwinds from China tariffs in its assessment, noting Apple is unlikely to fully offset these costs without broader exemptions.
Fewer than 10 known victims, but Mandiant suspects others compromised, too
Chinese spies have for months exploited old Juniper Networks routers, infecting the buggy gear with custom backdoors and gaining root access to the compromised devices.…
Amazon, Alphabet's Google and Meta Platforms on Wednesday said they support efforts to at least triple nuclear energy worldwide by 2050. From a report: The tech companies signed a pledge first adopted in December 2023 by more than 20 countries, including the U.S., at the U.N. Climate Change Conference. Financial institutions including Bank of America, Goldman Sachs and Morgan Stanley backed the pledge last year.
The pledge is nonbinding, but highlights the growing support for expanding nuclear power among leading industries, finance and governments. Amazon, Google and Meta are increasingly important drivers of energy demand in the U.S. as they build out AI centers. The tech sector is turning to nuclear power after concluding that renewables alone won't provide enough reliable power for their energy needs. Microsoft and Apple did not sign the statement.
Yokohama release also adds meta-observability and takes a tilt at CRM
ServiceNow has for years used the example of employee onboarding to explain the power of its wares, pointing out that a lot of people around an organization are needed to get new hires on the payroll, registered with HR, equipped with a computer, and assigned appropriate permissions to access applications.…
Power utility GM talks to El Reg about getting that call and what happened next
Nick Lawler, general manager of the Littleton Electric Light and Water Departments (LELWD), was at home one Friday when he got a call from the FBI alerting him that the public power utility's network had been compromised. The digital intruders turned out to be Volt Typhoon.…
An anonymous reader quotes a report from The Register: New York State has sued Allstate Insurance for operating websites so badly designed they would deliver personal information in plain text to anyone who went looking for it. The data was lifted from Allstate's National General business unit, which ran a website for consumers who wanted to get a quote for a policy. That task required users to input a name and address, and once that info was entered, the site searched a LexisNexis Risk Solutions database for data on anyone who lived at the address provided. The results of that search would then appear on a screen that included the driver's license number (DLN) for the given name and address, plus "names of any other drivers identified as potentially living at that consumer's address, and the entire DLNs of those other drivers."
Naturally, miscreants used the system to mine for people's personal information for fraud. "National General intentionally built these tools to automatically populate consumers' entire DLNs in plain text -- in other words, fully exposed on the face of the quoting websites -- during the quoting process," the court documents [PDF] state. "Not surprisingly, attackers identified this vulnerability and targeted these quoting tools as an easy way to access the DLNs of many New Yorkers," according to the lawsuit. The digital thieves then used this information to "submit fraudulent claims for pandemic and unemployment benefits," we're told. ... [B]y the time the insurer resolved the mess, crooks had built bots that harvested at least 12,000 individuals' driver's license numbers from the quote-generating site.
Agency willing to take huge risks with human exploration, but not willing to do it for some dirt?
Rocket Lab has been on a roll lately, with multiple Electron launches, plans for an ocean platform for its Neutron rocket, and a second mission for in-space manufacturing business Varda under its belt. However, NASA has apparently rejected the company's Mars Sample Return mission proposal. Why?…
Don't, don't, DON'T believe the hype
The developer of Free95 says it will be a free Windows 95-compatible OS, but we suspect an elaborate prank. At best, maybe an unknowing one.…
Leaders call for fewer contractors and more top talent installed across government
Senior officials in the UK's civil service understand that future cyber hires in Whitehall will need to be paid a salary higher than that of the Prime Minister if the government wants to get serious about fending off attacks.…
AmiMoJo shares a report from Electrek: The U.S. installed 50 gigawatts (GW) of new solar capacity in 2024, the largest single year of new capacity added to the grid by any energy technology in over two decades. That's enough to power 8.5 million households. According to the U.S. Solar Market Insight 2024 Year in Review report (PDF) released today by the Solar Energy Industries Association (SEIA) and Wood Mackenzie, solar and storage account for 84% of all new electric generating capacity added to the grid last year.
In addition to historic deployment, surging U.S. solar manufacturing emerged as a landmark economic story in 2024. Domestic solar module production tripled last year, and at full capacity, U.S. factories can now produce enough to meet nearly all demand for solar panels in the U.S. Solar cell manufacturing also resumed in 2024, strengthening the U.S. energy supply chain. [...] Total US solar capacity is expected to reach 739 GW by 2035, but the report forecasts include scenarios showing how policy changes could impact the solar market. [...] The low case forecast shows a 130 GW decline in solar deployment over the next decade compared to the base case, representing nearly $250 billion of lost investment.
Five years after it launched its first database service, the MySQL fork is trying again
MariaDB says it is building a database-as-a-service based on open source principles after offloading its old DBaaS before going into private ownership.…
Vendors just don't want machines to live double lives
Column My decade-old and very well-travelled 13" MacBook Pro finally died, and I hoped my new-ish M2 iPad Pro could replace it.…
Redmond insists it's got this right and has even more impressive results to share soon
Microsoft's claim of having made quantum computing breakthroughs has attracted strong criticism from scientists, but the software giant says its work is sound – and it will soon reveal data that proves it.…
Longtime Slashdot reader schwit1 shares a report from Behind The Black: According to information in this tweet from anonymous sources, parts of Starship will likely require a major redesign due to the spacecraft's break-up shortly after stage separation on its last two test flights. These are the key takeaways, most of which focus on the redesign of the first version of Starship (V1) to create the V2 that flew unsuccessfully on those flights:
- Hot separation also aggravates the situation in the compartment.
- Not related to the flames from the Super Heavy during the booster turn.
- This is a fundamental miscalculation in the design of the Starship V2 and the engine section.
- The fuel lines, wiring for the engines and the power unit will be urgently redone.
- The fate of S35 and S36 is still unclear. Either revision or scrap.
- For the next ships, some processes may be paused in production until a decision on the design is made.
- The team was rushed with fixes for S34, hence the nervous start. There was no need to rush.
- The fixes will take much longer than 4-6 weeks.
- Comprehensive ground testing with long-term fire tests is needed. [emphasis mine]
It must be emphasized that this information comes from leaks from anonymous sources, and could be significantly incorrect. It does however fit the circumstances, and suggests that the next test flight will not occur in April but will be delayed for an unknown period beyond that.
Skip the schnitzel with gravy and chips for lunch - this is an experimental device for transplant candidates
Australian company BiVACOR has revealed a patient implanted with its artificial heart survived for 100 days – and is still with us after receiving a donated organ.…
An anonymous reader quotes a report from Ars Technica: On Tuesday, OpenAI unveiled a new "Responses API" designed to help software developers create AI agents that can perform tasks independently using the company's AI models. The Responses API will eventually replace the current Assistants API, which OpenAI plans to retire in the first half of 2026. With the new offering, users can develop custom AI agents that scan company files with a file search utility that rapidly checks company databases (with OpenAI promising not to train its models on these files) and navigate websites -- similar to functions available through OpenAI's Operator agent, whose underlying Computer-Using Agent (CUA) model developers can also access to enable automation of tasks like data entry and other operations.
However, OpenAI acknowledges that its CUA model is not yet reliable for automating tasks on operating systems and can make unintended mistakes. The company describes the new API as an early iteration that it will continue to improve over time. Developers using the Responses API can access the same models that power ChatGPT Search: GPT-4o search and GPT-4o mini search. These models can browse the web to answer questions and cite sources in their responses. That's notable because OpenAI says the added web search ability dramatically improves the factual accuracy of its AI models. On OpenAI's SimpleQA benchmark, which aims to measure confabulation rate, GPT-4o search scored 90 percent, while GPT-4o mini search achieved 88 percent -- both substantially outperforming the larger GPT-4.5 model without search, which scored 63 percent.
Despite these improvements, the technology still has significant limitations. Aside from issues with CUA properly navigating websites, the improved search capability doesn't completely solve the problem of AI confabulations, with GPT-4o search still making factual mistakes 10 percent of the time. Alongside the Responses API, OpenAI released the open source Agents SDK, providing developers free tools to integrate models with internal systems, implement safeguards, and monitor agent activities. This toolkit follows OpenAI's earlier release of Swarm, a framework for orchestrating multiple agents.
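To make the description above concrete, here is a minimal sketch of what a Responses API request combining web search and file search might look like. The tool type names (`web_search_preview`, `file_search`), the model name, and the vector-store ID are assumptions drawn from the announcement as described, not verified against OpenAI's current API reference; check the official documentation before relying on them.

```python
# Sketch of a Responses API payload, per the article's description of
# agents that can search the web and scan company files. Names marked
# "assumed" are illustrative, not confirmed API identifiers.
import json

def build_agent_request(question: str, vector_store_ids=None) -> dict:
    """Assemble a request payload enabling web search and, optionally,
    file search over previously uploaded company documents."""
    tools = [{"type": "web_search_preview"}]  # assumed tool name
    if vector_store_ids:
        tools.append({
            "type": "file_search",            # assumed tool name
            "vector_store_ids": vector_store_ids,
        })
    return {
        "model": "gpt-4o",  # the article's "GPT-4o search" family
        "input": question,
        "tools": tools,
    }

# The payload would then be sent to the Responses endpoint, e.g. via the
# official SDK: client.responses.create(**payload) -- not called here.
payload = build_agent_request(
    "Summarize our Q4 filings against recent industry news.",
    vector_store_ids=["vs_company_docs"],  # hypothetical store ID
)
print(json.dumps(payload, indent=2))
```

Keeping payload construction separate from the network call makes the agent's tool configuration easy to inspect and test before any request is sent.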
Election infosec advisory agency also shuttered
A penetration tester who worked at the US govt's CISA claims his 100-strong team was dismissed after Elon Musk's Trump-blessed DOGE unit cancelled a contract – and that more staff at the cybersecurity agency have also been let go.…
An anonymous reader quotes a report from TechCrunch: There's a power crunch looming as AI and cloud providers ramp up data center construction. But a new report suggests that a solution lies beneath their foundations. Advanced geothermal power could supply nearly two-thirds of new data center demand by 2030, according to an analysis by the Rhodium Group. The additions would quadruple the amount of geothermal power capacity in the U.S. -- from 4 gigawatts to about 16 gigawatts -- while costing the same or less than what data center operators pay today. In the western U.S., where geothermal resources are more plentiful, the technology could provide 100% of new data center demand. Phoenix, for example, could add 3.8 gigawatts of data center capacity without building a single new conventional power plant.
Geothermal resources have enormous potential to provide consistent power. Historically, geothermal power plants have been limited to places where Earth's heat seeps close to the surface. But advanced geothermal techniques could unlock 90 gigawatts of clean power in the U.S. alone, according to the U.S. Department of Energy. [...] Because geothermal power has very low running costs, its price is competitive with data centers' energy costs today, the Rhodium report said. When data centers are sited similarly to how they are today, a process that typically takes into account proximity to fiber optics and major metro areas, geothermal power costs just over $75 per megawatt hour. But when developers account for geothermal potential in their siting, the costs drop significantly, down to around $50 per megawatt hour.
The report assumes that new generating capacity would be "behind the meter," which is what experts call power plants that are hooked up directly to a customer, bypassing the grid. Wait times for new power plants to connect to the grid can stretch on for years. As a result, behind the meter arrangements have become more appealing for data center operators who are scrambling to build new capacity.