Linux fréttir
Longtime Slashdot reader sinij shares a press release from the White House, outlining a series of executive orders that overhaul the Nuclear Regulatory Commission and speed up deployment of new nuclear power reactors in the U.S. From a report: The NRC is a 50-year-old, independent agency that regulates the nation's fleet of nuclear reactors. Trump's orders call for a "total and complete reform" of the agency, a senior White House official told reporters in a briefing. Under the new rules, the commission will be forced to decide on nuclear reactor licenses within 18 months. Trump said Friday the orders focus on small, advanced reactors that are viewed by many in the industry as the future. But the president also said his administration supports building large plants. "We're also talking about the big plants -- the very, very big, the biggest," Trump said. "We're going to be doing them also."
When asked whether NRC reform will result in staff reductions, the White House official said "there will be turnover and changes in roles." "Total reduction in staff is undetermined at this point, but the executive orders do call for a substantial reorganization" of the agency, the official said. The orders, however, will not remove or replace any of the five commissioners who lead the body, according to the White House. Any reduction in staff at the NRC would come at a time when the commission faces a heavy workload. The agency is currently reviewing whether two mothballed nuclear plants, Palisades in Michigan and Three Mile Island in Pennsylvania, should restart operations, a historic and unprecedented process. [...]
Trump's orders also create a regulatory framework for the Departments of Energy and Defense to build nuclear reactors on federal land, the administration official said. "This allows for safe and reliable nuclear energy to power and operate critical defense facilities and AI data centers," the official told reporters. The NRC will not have a direct role, as the departments will use separate authorities under their control to authorize reactor construction for national security purposes, the official said. The president's orders also aim to jump-start the mining of uranium in the U.S. and expand domestic uranium enrichment capacity, the official said. Trump's actions also aim to speed up reactor testing at the Department of Energy's national laboratories.
An anonymous reader quotes a report from Jalopnik: With the gradual rise of semi-autonomous vehicles, there will likely be multiple cameras pointing back when you pull out a phone to take a photo or record video of a car. One Reddit user found out earlier this month that car-mounted lidar sensors can damage a phone camera under certain circumstances. It was the technological equivalent of staring directly into the Sun. Their phone's camera was toast, but only because it was close up and pointed directly at the lidar sensor.
Reddit user u/Jeguetelli posted worrying footage of a brand new Volvo EX90 from his iPhone 16 Pro Max. Nothing was wrong with the crossover SUV. That was the problem. The lidar sensor mounted in a pod above the windshield shot out a laser barrage of near-infrared light into the camera. The damage was immediate and obvious, leaving behind a red, pink and purple constellation of fried pixels. You can tell the permanent damage was to that specific lens because the image returned to normal after zooming out to a different lens. Jeguetelli didn't seem too concerned about the incident because he had Apple Care. In a statement to The Drive, Volvo confirmed that bad things can happen. "It's generally advised to avoid pointing a camera directly at a lidar sensor," the Swedish manufacturer said. "The laser light emitted by the lidar can potentially damage the camera's sensor or affect its performance."
"Using filters or protective covers on the camera lens can help reduce the impact of lidar exposure. Some cameras are designed with built-in protections against high-intensity light sources."
Richard Speed writes via The Register: It was 30 years ago when the first public release of the Java programming language introduced the world to Write Once, Run Anywhere -- and showed devs something cuddlier than C and C++. Originally called "Oak," Java was designed in the early 1990s by James Gosling at Sun Microsystems. Initially aimed at digital devices, its focus soon shifted to another platform that was pretty new at the time -- the World Wide Web.
The language, which has some similarities to C and C++, usually compiles to a bytecode that can, in theory, run on any Java Virtual Machine (JVM). The intention was to allow programmers to Write Once, Run Anywhere (WORA), although subtle differences in JVM implementations meant that dream didn't always play out in reality. This reporter once worked with a witty colleague who described the system as Write Once, Test Everywhere, as yet another unexpected wrinkle in a JVM caused their application to behave unpredictably. However, the language soon became wildly popular, rapidly becoming the backbone of many enterprises. [...]
However, the platform's ubiquity has meant that alternatives exist to Oracle Java, and the language's popularity is undiminished by so-called "predatory licensing tactics." Over 30 years, Java has moved from an upstart new language to something enterprises have come to depend on. Yes, it may not have the shiny baubles demanded by the AI applications of today, but it continues to be the foundation for much of today's modern software development. A thriving ecosystem and a vast community of enthusiasts mean that Java remains more than relevant as it heads into its fourth decade.
Google's new AI Mode for Search, which is rolling out to everyone in the U.S., has sparked outrage among publishers, who call it "the definition of theft" for using content without fair compensation and without offering a true opt-out option. Internal documents revealed by Bloomberg earlier this week suggest that Google considered giving publishers more control over how their content is used in AI-generated results but ultimately decided against it, prioritizing product functionality over publisher protections.
News/Media Alliance slammed Google for "further depriving publishers of original content both traffic and revenue." Their full statement reads: "Links were the last redeeming quality of search that gave publishers traffic and revenue. Now Google just takes content by force and uses it with no return, the definition of theft. The DOJ remedies must address this to prevent continued domination of the internet by one company." 9to5Google's take: It's not hard to see why Google went the route that it did here. Giving publishers the ability to opt out of AI products while still benefiting from Search would ultimately make Google's flashy new tools useless if enough sites made the switch. It was very much a move in the interest of building a better product.
Does that change anything regarding how Google's AI products in Search cause potential harm to the publishing industry? Nope.
Google's tools continue to serve the company and its users (mostly) well, but as they continue to bleed publishers dry, those publishers are on the verge of vanishing or, arguably worse, turning to cheap and poorly produced content just to get enough views to survive. This is a problem Google needs to address, as it's making the internet as a whole worse for everyone.
An anonymous reader quotes a report from Ars Technica, written by Nate Anderson: Don't worry about the "mission-driven not-for-profit" College Board -- it's drowning in cash. The US group, which administers the SAT and AP tests to college-bound students, paid its CEO $2.38 million in total compensation in 2023 (the most recent year data is available). The senior VP in charge of AP programs made $694,662 in total compensation, while the senior VP for Technology Strategy made $765,267 in total compensation. Given such eye-popping numbers, one would have expected the College Board's transition to digital exams to go smoothly, but it continues to have issues.
Just last week, the group's AP Psychology exam was disrupted nationally when the required "Bluebook" testing app couldn't be accessed by many students. Because the College Board shifted to digital-only exams for 28 of its 36 AP courses beginning this year, no paper-based backup options were available. The only "solution" was to wait quietly in a freezing gymnasium, surrounded by a hundred other stressed-out students, to see if College Board could get its digital act together. [...] College Board issued a statement on the day of the AP Psych exam, copping to "an issue that prevented [students] from logging into the College Board's Bluebook testing application and beginning their exams at the assigned local start time." Stressing that "most students have had a successful testing experience, with more than 5 million exams being successfully submitted thus far," College Board nonetheless did "regret that their testing period was disrupted." It's not the first such disruption, though. [...]
College Board also continues to have problems delivering digital testing at scale in a high-pressure environment. During the SAT exam sessions on March 8-9, 2025, more than 250,000 students sat for the test -- and some found that their tests were automatically submitted before the testing time ended. College Board blamed the problem on "an incorrectly configured security setting on Bluebook." The problem affected nearly 10,000 students, and several thousand more "may have lost some testing time if they were asked by their room monitor to reboot their devices during the test to fix and prevent the auto-submit error." College Board did "deeply and sincerely apologize to the students who were not able to complete their tests, or had their test time interrupted, for the difficulty and frustration this has caused them and their families." It offered refunds, plus a free future SAT testing voucher.
Spain's grid operator has accused some large power plants of not doing their job to help regulate the country's electricity system in the moments before last month's catastrophic blackout across the Iberian peninsula. From a report: Beatriz Corredor, chair of grid operator Red Electrica's parent company, said power plants fell short in controlling the voltage of the electricity system.
However, the heads of Spain's biggest plant owners linked the blackout to a lack of grid investment and insufficient efforts to boost electricity demand. The public blame game over the outage is intensifying: more than three weeks after 60 million people were left without power, Spanish government investigators insist they need more time to establish the root cause.
Out of 186 countries, only Guyana produces enough food to self-sufficiently feed all its citizens without foreign imports, according to new research. From a report: The study, published in Nature Food, investigated how well each country could feed its population across seven food groups: fruits, vegetables, dairy, fish, meat, plant-based protein and starchy staples.
Worldwide, the study found that 65% of countries were overproducing meat and dairy relative to their own populations' dietary needs. It also found that Guyana, located in South America, was the only country that could boast total self-sufficiency, while China and Vietnam were close behind, able to produce enough food in six of the seven food groups. Just one in seven of the countries tested was judged self-sufficient in five or more categories.
Several romance novelists have accidentally left AI writing prompts embedded in their published books, exposing their use of chatbots, 404 Media reports. Readers discovered passages like "Here's an enhanced version of your passage, making Elena more relatable" in K.C. Crowne's "Dark Obsession," for instance, and similar leftover AI-generated text in works by Lena McDonald and Rania Faris.
SQL Server and Cosmos DB added to data lake platform as lure for building AI features into transactional systems
Microsoft is throwing more transactional database systems into its Fabric analytics and data lake environment in the expectation that proximity will help users who are adding AI to their systems.…
Rice plants can inherit tolerance to cold without changes to their genomes, according to a decade-long study carried out by researchers in China. From a report: The work, published in Cell this week, strengthens the evidence for a form of evolution in which environmental pressures induce heritable changes that do not alter an organism's DNA. The researchers conducted experiments that demonstrate, for the first time, the mechanism for these changes -- 'epigenetic' tweaks to chemical markers on the plant's DNA that don't actually tinker with the sequences themselves.
"What they're showing is extremely convincing; I would say that it's a landmark in the field," says Leandro Quadrana, a plant geneticist at the French National Centre for Scientific Research in Paris-Saclay. Michael Skinner, who studies epigenetic inheritance at Washington State University in Pullman, says the study adds to the growing body of evidence challenging the prevailing view of evolution that the only way that adaptations emerge is through gradual natural selection of randomly arising DNA mutations. This study shows that the environment isn't just a passive actor in evolution, but a selective force inducing a targeted change.
Bank accounts, personal details all hoovered up in the attack
Nova Scotia Power on Friday confirmed it had been hit by a ransomware attack that began earlier this spring and disrupted certain IT systems, and admitted the crooks leaked data belonging to an unspecified number of its roughly 500,000 customers online. The stolen info may have included billing details and, for those on autopay, bank account numbers.…
Harvard University's Galileo Project is using AI to automate the search for unidentified anomalous phenomena, marking a significant shift in how academics approach what was once considered fringe research. The project operates a Massachusetts observatory equipped with infrared cameras, acoustic sensors, and radio-frequency analyzers that continuously scan the sky for unusual objects.
Researchers Laura Domine and Richard Cloete are training machine learning algorithms to recognize all normal aerial phenomena -- planes, birds, drones, weather balloons -- so the system can flag genuine anomalies for human analysis. The team uses computer vision software called YOLO (You Only Look Once) and has generated hundreds of thousands of synthetic images to train their models, though the software currently identifies only 36% of aircraft captured by infrared cameras.
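To make the approach concrete, here is a minimal sketch of how such a filter might look, assuming the open-source Ultralytics YOLO package; the model weights, class list, and confidence threshold are illustrative stand-ins, not the Galileo Project's actual pipeline.

```python
# Minimal sketch: flag frames whose confident detections fall outside a set of
# "mundane" aerial classes. Assumes the open-source "ultralytics" package; the
# pretrained COCO weights, class list, and threshold are illustrative only.
from ultralytics import YOLO

KNOWN_AERIAL = {"airplane", "bird", "kite"}  # objects we treat as explained
CONF_THRESHOLD = 0.5

model = YOLO("yolov8n.pt")  # small pretrained model as a stand-in

def flag_anomalies(image_path: str) -> list[dict]:
    """Return confident detections that are not in the known-object set."""
    anomalies = []
    for result in model(image_path):
        for box in result.boxes:
            label = model.names[int(box.cls)]
            conf = float(box.conf)
            if conf >= CONF_THRESHOLD and label not in KNOWN_AERIAL:
                anomalies.append({"label": label, "confidence": conf})
    return anomalies

print(flag_anomalies("sky_frame.jpg"))
```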
The Pentagon is pursuing parallel efforts through its All-domain Anomaly Resolution Office, which has examined over 1,800 UAP reports and identified 50 to 60 cases as "true anomalies" that government scientists cannot explain. AARO has developed its own sensor suite called Gremlin, using similar technology to Harvard's observatory. Both programs represent the growing legitimization of UAP research following 2017 Defense Department disclosures about military encounters with unexplained aerial phenomena.
Fastly acquisition asks that redirects be set up before December 31
Three years after confirming its acquisition by Fastly, Glitch is pulling the plug on its app hosting platform.…
Cyberbaddies are coming for your M365 creds, US infosec agency warns
The Cybersecurity and Infrastructure Security Agency (CISA) is warning that SaaS companies are under fire from criminals on the prowl for cloud apps with weak security.…
Glitch, the coding platform where developers can share and remix projects, will soon no longer offer its core feature: hosting apps on the web. From a report: In an update on Thursday, Glitch CEO Anil Dash said it will stop hosting projects and close user profiles on July 8th, 2025 -- but stopped short of saying that it's shutting down completely.
Users will be able to access their dashboard and download code for their projects through the end of 2025, and Glitch is working on a new feature that allows users to redirect their project subdomains. The platform has also stopped taking new Pro subscriptions, but it will continue to honor existing subscriptions until July 8th.
A simple text editor that dates back to Windows 1.0 is getting smartified
Microsoft has continued to shovel AI into its built-in Windows inbox apps, and now it's rolling out a Notepad update that will use Copilot to write text for you.…
Cornell University researchers have solved a kitchen mystery by demonstrating that sharp knives produce fewer and slower-moving droplets when cutting onions than dull blades do. The team used high-speed cameras and particle tracking, filming at up to 20,000 frames per second, to analyze droplet formation during onion cutting.
The team discovered that onion droplets form through a two-stage process: an initial violent ejection driven by internal pressure, followed by slower fragmentation of liquid streams in air. Blunter blades create up to 40 times more droplets because the onion's tough outer skin acts as a barrier, allowing the softer interior tissue to compress significantly before rupturing and releasing pressurized liquid.
The research reveals that droplets are ejected at speeds between 1 and 40 meters per second, with the fastest ones posing the greatest risk of reaching a cook's eyes. Beyond tear reduction, the study suggests sharp knives may also limit the spread of foodborne pathogens, since atomized droplets can carry bacteria like Salmonella from contaminated cutting boards.
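A rough back-of-the-envelope calculation shows why the fastest droplets matter; the half-metre board-to-face distance and blink time below are assumptions for illustration, and only the 1-40 m/s ejection speeds come from the study.

```python
# Back-of-the-envelope: how quickly a droplet could cross from cutting board to
# a cook's eyes. The 0.5 m distance and ~150 ms blink time are assumptions;
# only the 1-40 m/s ejection speeds come from the study.
BOARD_TO_FACE_M = 0.5
BLINK_TIME_S = 0.150

for speed_m_s in (1, 10, 40):
    flight_time_s = BOARD_TO_FACE_M / speed_m_s
    verdict = "faster" if flight_time_s < BLINK_TIME_S else "slower"
    print(f"{speed_m_s:>2} m/s -> {flight_time_s * 1000:6.1f} ms ({verdict} than a blink)")
```

At 40 m/s the flight time is about 12.5 ms, far shorter than a reflexive blink, which is consistent with the fastest droplets posing the greatest risk.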
The coffee shows no signs of cooling
Feature It was 30 years ago when the first public release of the Java programming language introduced the world to Write Once, Run Anywhere – and showed devs something cuddlier than C and C++.…
An interesting piece on Construction Physics that examines how Japan transformed discarded American wartime shipbuilding techniques into a revolutionary manufacturing system that captured nearly half the global market by 1970. The story reveals the essential ingredients for industrial dominance: government backing, organizational alignment, relentless will to improve, and the systematic coordination needed to turn existing technologies into something entirely new. A few excerpts: During WWII, the US constructed an unprecedented shipbuilding machine. By assembling ships from welded, prefabricated blocks, the US built a huge number of cargo ships incredibly quickly, overwhelming Germany's U-boats and helping to win the war. But when the war was over, this shipbuilding machine was dismantled. Industrialists like Henry Kaiser and Stephen Bechtel, who operated some of the US's most efficient wartime shipyards, left the shipbuilding business. Prior to the war, the US had been an uncompetitive commercial shipbuilder producing a small fraction of commercial oceangoing ships, and that's what it became again. At the height of the war the US was producing nearly 90% of the world's ships. By the 1950s, it produced just over 2%.
But the lessons from the US's shipbuilding machine weren't forgotten. After the war, practitioners brought them to Japan, where they would continue to evolve, eventually allowing Japan to build ships faster and cheaper than almost anyone else in the world.
[...] The third strategy that formed the core of modern shipbuilding methods was statistical process control. The basic idea behind process control is that it's impossible to make an industrial process perfectly reliable. There will always be some variation in what it produces: differences in part dimensions, material strength, chemical composition, and so on. But while some variation is inherent to the process (and must be accepted), much of the variation is from specific causes that can be hunted down and eliminated. By analyzing the variation in a process, undesirable sources of variation can be removed. This makes a process work more reliably and predictably, reducing waste and rework from parts that are outside acceptable tolerances.
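As a minimal illustration of the idea, the following sketch estimates Shewhart-style control limits from a baseline production run and flags later measurements that fall outside them; all of the numbers are made up for illustration.

```python
# Minimal sketch of Shewhart-style control limits: estimate limits from a
# baseline run, then flag new measurements outside mean +/- 3 sigma as
# "special cause" variation worth hunting down. All figures are made up.
import statistics

baseline = [50.1, 49.8, 50.0, 50.3, 49.9, 50.2, 50.0, 49.7, 50.1, 50.0]  # part width, mm
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
upper, lower = mean + 3 * sigma, mean - 3 * sigma

new_samples = [50.2, 49.9, 51.6, 50.1]
for i, x in enumerate(new_samples, start=1):
    status = "in control" if lower <= x <= upper else "special cause -> investigate"
    print(f"sample {i}: {x:5.1f} mm  ({status})")

print(f"control limits: {lower:.2f} .. {upper:.2f} mm")
```

Real process-control schemes add further rules (runs, trends, subgroup ranges), but the core move is the same: separate inherent variation from assignable causes that can be eliminated.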
From bit barn to algae farm?
Euro datacenter operator Data4 is trialling a project to reuse heat from its servers and captured carbon dioxide to grow algae that can then be used in the agri-food or pharmacology sectors.…