Mozilla Announces 'TABS API' For Developers Building AI Agents
"Fresh from announcing it is building an AI browsing mode in Firefox and laying the groundwork for agentic interactions in the Firefox 145 release, the corp arm of Mozilla is now flexing its AI muscles in the direction of those more likely to care," writes the blog OMG Ubuntu:
If you're a developer building AI agents, you can sign up to get early access to Mozilla's TABS API, a "powerful web content extraction and transformation toolkit designed specifically for AI agent builders"... The TABS API enables devs to create agents to automate web interactions, like clicking, scrolling, searching, and submitting forms "just like a human". Real-time feedback and adaptive behaviours will, Mozilla say, offer "full control of the web, without the complexity."
As TABS is not powered by a Mozilla-backed LLM you'll need to connect it to your choice of third-party LLM for any relevant processing... Developers get 1,000 requests monthly on the free tier, which seems reasonable for prototyping personal projects. Complex agentic workloads may require more. Though pricing is yet to be locked in, the TABS API website suggests it'll cost ~$5 per 1000 requests.
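The quoted figures (1,000 free requests per month, roughly $5 per 1,000 after that) make it easy to rough out a monthly bill. A minimal sketch, assuming overage is billed per-request at the quoted rate; the actual billing model isn't locked in yet:

```python
def estimate_monthly_cost(requests: int,
                          free_tier: int = 1_000,
                          price_per_1k: float = 5.0) -> float:
    """Rough monthly TABS API cost from the figures quoted in the
    article: 1,000 free requests, then ~$5 per 1,000. Per-request
    overage billing is an assumption -- pricing isn't final."""
    billable = max(0, requests - free_tier)
    return billable * (price_per_1k / 1_000)

print(estimate_monthly_cost(800))     # fits the free tier: $0
print(estimate_monthly_cost(26_000))  # 25,000 billable requests
```

At that rate a prototype stays free, while an agent firing a request every few minutes around the clock would land in the low hundreds of dollars a month.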
Paid plans will offer additional features too, like lower latency and, somewhat ironically, CAPTCHA solving so AI can 'prove' it's not a robot on pages gated to prevent automated activities.
Google, OpenAI, and other major AI vendors offer their own agentic APIs. Mozilla is pitching up late, but it plans to play differently. It touts a "strong focus on data minimisation and security", with scraped data treated ephemerally — i.e., not kept. As a distinction, that matters. AI agents can be given complex online tasks that involve all sorts of personal or sensitive data being fetched and worked with... If you're minded to make one, perhaps without a motivation to asset-strip the common good, Mozilla's TABS API looks like a solid place to start.
Read more of this story at Slashdot.
Categories: Technology
One Company's Plan to Sink Nuclear Reactors Deep Underground
Long-time Slashdot reader jenningsthecat shared this article from IEEE Spectrum:
By dropping a nuclear reactor 1.6 kilometers (1 mile) underground, Deep Fission aims to use the weight of a billion tons of rock and water as a natural containment system comparable to concrete domes and cooling towers. With the fission reaction occurring far below the surface, steam can safely circulate in a closed loop to generate power.
The California-based startup announced in October that prospective customers had signed non-binding letters of intent for 12.5 gigawatts of power involving data center developers, industrial parks, and other (mostly undisclosed) strategic partners, with initial sites under consideration in Kansas, Texas, and Utah... The company says its modular approach allows multiple 15-megawatt reactors to be clustered on a single site: A block of 10 would total 150 MW, and Deep Fission claims that larger groupings could scale to 1.5 GW. Deep Fission claims that using geological depth as containment could make nuclear energy cheaper, safer, and deployable in months at a fraction of a conventional plant's footprint...
The company aims to finalize its reactor design and confirm the pilot site in the coming months. [Company founder Liz] Muller says the plan is to drill the borehole, lower the canister, load the fuel, and bring the reactor to criticality underground in 2026. Sites in Utah, Texas, and Kansas are among the leading candidates for the first commercial-scale projects, which could begin construction in 2027 or 2028, depending on the speed of DOE and NRC approvals. Deep Fission expects to start manufacturing components for the first unit in 2026 and does not anticipate major bottlenecks aside from typical long-lead items.
In short "The same oil and gas drilling techniques that reliably reach kilometer-deep wells can be adapted to host nuclear reactors..." the article points out. Their design would also streamline construction, since "Locating the reactors under a deep water column subjects them to roughly 160 atmospheres of pressure — the same conditions maintained inside a conventional nuclear reactor — which forms a natural seal to keep any radioactive coolant or steam contained at depth, preventing leaks from reaching the surface."
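The "roughly 160 atmospheres" figure follows directly from hydrostatics: pressure under a water column grows with depth as ρgh. A quick check of the article's numbers (1.6 km of water, fresh-water density assumed):

```python
RHO_WATER = 1_000.0   # kg/m^3, fresh water (an assumption)
G = 9.81              # m/s^2
ATM = 101_325.0       # Pa per standard atmosphere

def hydrostatic_pressure_atm(depth_m: float) -> float:
    """Pressure, in atmospheres, exerted by a water column of the
    given depth (ignoring the atmosphere above the surface)."""
    return RHO_WATER * G * depth_m / ATM

# The 1.6 km (1 mile) depth in Deep Fission's design:
print(round(hydrostatic_pressure_atm(1_600)))  # ~155 atm
```

That lands within a few percent of the quoted 160 atmospheres — the same order as the pressure maintained inside a conventional pressurized-water reactor vessel.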
Other interesting points from the article:
They plan on operating and controlling the reactor remotely from the surface.
Company founder Muller says if an earthquake ever disrupted the site, "you seal it off at the bottom of the borehole, plug up the borehole, and you have your waste in safe disposal."
For waste management, the company "is eyeing deep geological disposal in the very borehole systems they deploy for their reactors."
"The company claims it can cut overall costs by 70 to 80 percent compared with full-scale nuclear plants."
"Among its competition are projects like TerraPower's Natrium," notes the tech news site Hackaday, saying TerraPower's fast neutron reactors "are already under construction and offer much more power per reactor, along with Natrium in particular also providing built-in grid-level storage."
"One thing is definitely for certain..." they add. "The commercial power sector in the US has stopped being mind-numbingly boring."
Could High-Speed Trains Shorten US Travel Times While Reducing Emissions?
With some animated graphics, CNN "reimagined" what three of America's busiest air and road travel routes would look like with high-speed trains, for "a glimpse into a faster, more connected future."
The journey from New York City to Chicago could take just over six hours by high-speed train at an average speed of 160 mph, cutting travel time by more than 13 hours compared with the current Amtrak route... The journey from San Francisco to Los Angeles could be completed in under three hours by high-speed train... The journey from Atlanta to Orlando could be completed in under three hours by high-speed train that reaches 160 mph, cutting travel time by over half compared with driving...
While high-speed rail remains a fantasy in the United States, it is already hugely successful across the globe. Passengers take 3 billion trips annually on more than 40,000 miles of modern high-speed railway worldwide, according to the International Union of Railways. China is home to the world's largest high-speed rail network. The 809-mile train journey from Beijing to Shanghai takes just four and a half hours... In Europe, France's Train à Grande Vitesse (TGV) is recognized as a pioneer of high-speed rail technology. Spain soon followed France's success and now hosts Europe's most extensive high-speed rail network...
[T]rain travel contributes relatively less pollution of every type, said Jacob Mason of the Institute for Transportation and Development Policy, from burning less gasoline to making less noise than cars and taking up less space than freeways. The reduction in greenhouse gas emissions is staggering: Per kilometer traveled, the average car or a short-haul flight each emit more than 30 times the CO2 equivalent of Eurostar high-speed trains, according to data from the UK government.
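The ">30x" multiple comes from comparing per-passenger-kilometer emission factors. A minimal sketch of that comparison — the specific factors below are illustrative assumptions in the ballpark of published UK conversion factors, not figures from the article, which only states the multiple:

```python
# Illustrative per-passenger-km emission factors in kg CO2e.
# These exact values are assumptions for demonstration.
EMISSIONS_KG_PER_PKM = {
    "eurostar_train": 0.005,
    "average_car": 0.17,
    "short_haul_flight": 0.25,
}

def multiple_vs_train(mode: str) -> float:
    """How many times a mode's per-km emissions exceed the train's."""
    return EMISSIONS_KG_PER_PKM[mode] / EMISSIONS_KG_PER_PKM["eurostar_train"]

for mode in ("average_car", "short_haul_flight"):
    print(f"{mode}: ~{multiple_vs_train(mode):.0f}x the train")
```

With factors anywhere near these, both the car and the short-haul flight clear the 30x threshold the article cites.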
Microsoft and GitHub Preview New Tool That Identifies, Prioritizes, and Fixes Vulnerabilities With AI
"Security, development, and AI now move as one," says Microsoft's director of cloud/AI security product marketing.
Microsoft and GitHub "have launched a native integration between Microsoft Defender for Cloud and GitHub Advanced Security that aims to address what one executive calls decades of accumulated security debt in enterprise codebases..." according to The New Stack:
The integration, announced this week in San Francisco at the Microsoft Ignite 2025 conference and now available in public preview, connects runtime intelligence from production environments directly into developer workflows. The goal is to help organizations prioritize which vulnerabilities actually matter and use AI to fix them faster. "Throughout my career, I've seen vulnerability trends going up and to the right. It didn't matter how good or how accurate our detection engine was, people just couldn't fix things fast enough," said Marcelo Oliveira, VP of product management at GitHub, who has spent nearly a decade in application security. "That basically resulted in decades of accumulation of security debt into enterprise code bases." According to industry data, critical and high-severity vulnerabilities constitute 17.4% of security backlogs, with a mean time to remediation of 116 days, said Andrew Flick, senior director of developer services, languages and tools at Microsoft, in a blog post. Meanwhile, applications face attacks as frequently as once every three minutes, Oliveira said.
The integration represents the first native link between runtime intelligence and developer workflows, said Elif Algedik, director of product marketing for cloud and AI security at Microsoft, in a blog post... The problem, according to Flick, comes down to three challenges: security teams drowning in alert fatigue while AI rapidly introduces new threat vectors that they have little time to understand; developers lacking clear prioritization while remediation takes too long; and both teams relying on separate, nonintegrated tools that make collaboration slow and frustrating... The new integration works bidirectionally. When Defender for Cloud detects a vulnerability in a running workload, that runtime context flows into GitHub, showing developers whether the vulnerability is internet-facing, handling sensitive data or actually exposed in production. This is powered by what GitHub calls the Virtual Registry, which creates code-to-runtime mapping, Flick said...
In the past, this alert would age in a dashboard while developers worked on unrelated fixes because they didn't know this was the critical one, he said. Now, a security campaign can be created in GitHub, filtering for runtime risk like internet exposure or sensitive data, notifying the developer to prioritize this issue.
GitHub Copilot "now automatically checks dependencies, scans for first-party code vulnerabilities and catches hardcoded secrets before code reaches developers," the article points out — but GitHub's VP of product management says this takes things even further. "We're not only helping you fix existing vulnerabilities, we're also reducing the number of vulnerabilities that come into the system when the level of throughput of new code being created is increasing dramatically with all these agentic coding agent platforms."
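The core idea — rank findings by runtime exposure rather than raw severity alone — can be sketched in a few lines. This is a toy illustration of the concept; the field names and weights are invented for demonstration, and the real integration's scoring model is not public:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """A scanner finding enriched with the kind of runtime context the
    article describes Defender for Cloud feeding into GitHub."""
    id: str
    severity: int                 # CVSS-like base score, 0-10
    internet_facing: bool
    handles_sensitive_data: bool
    exposed_in_production: bool

def runtime_priority(f: Finding) -> int:
    """Boost severity with runtime signals so live, exposed issues
    outrank dormant ones. Weights are illustrative assumptions."""
    score = f.severity
    if f.internet_facing:
        score += 5
    if f.handles_sensitive_data:
        score += 3
    if f.exposed_in_production:
        score += 4
    return score

backlog = [
    Finding("CVE-A", 9, False, False, False),  # severe, but dormant
    Finding("CVE-B", 6, True, True, True),     # moderate, but live and exposed
]
for f in sorted(backlog, key=runtime_priority, reverse=True):
    print(f.id, runtime_priority(f))
```

Under this kind of weighting the moderate-but-exposed CVE-B outranks the severe-but-dormant CVE-A — exactly the reordering the article says keeps critical alerts from aging in a dashboard.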
Engineers are Building the Hottest Geothermal Power Plant on Earth - Next to a US Volcano
"On the slopes of an Oregon volcano, engineers are building the hottest geothermal power plant on Earth," reports the Washington Post:
The plant will tap into the infernal energy of Newberry Volcano, "one of the largest and most hazardous active volcanoes in the United States," according to the U.S. Geological Survey. It has already reached temperatures of 629 degrees Fahrenheit, making it one of the hottest geothermal sites in the world, and next year it will start selling electricity to nearby homes and businesses. But the start-up behind the project, Mazama Energy, wants to crank the temperature even higher — north of 750 degrees — and become the first to make electricity from what industry insiders call "superhot rock." Enthusiasts say that could usher in a new era of geothermal power, transforming the always-on clean energy source from a minor player to a major force in the world's electricity systems.
"Geothermal has been mostly inconsequential," said Vinod Khosla, a venture capitalist and one of Mazama Energy's biggest financial backers. "To do consequential geothermal that matters at the scale of tens or hundreds of gigawatts for the country, and many times that globally, you really need to solve these high temperatures." Today, geothermal produces less than 1 percent of the world's electricity. But tapping into superhot rock, along with other technological advances, could boost that share to 8 percent by 2050, according to the International Energy Agency (IEA). Geothermal using superhot temperatures could theoretically generate 150 times more electricity than the world uses, according to the IEA. "We believe this is the most direct path to driving down the cost of geothermal and making it possible across the globe," said Terra Rogers, program director for superhot rock geothermal at the Clean Air Task Force, an environmentalist think tank. "The [technological] gaps are within reason. These are engineering iterations, not breakthroughs."
The Newberry Volcano project combines two big trends that could make geothermal energy cheaper and more widely available. First, Mazama Energy is bringing its own water to the volcano, using a method called "enhanced geothermal energy"... [O]ver the past few decades, pioneering projects have started to make energy from hot dry rocks by cracking the stone and pumping in water to make steam, borrowing fracking techniques developed by the oil and gas industry... The Newberry project also taps into hotter rock than any previous enhanced geothermal project. But even Newberry's 629 degrees fall short of the superhot threshold of 705 degrees or above. At that temperature, and under a lot of pressure, water becomes "supercritical" and starts acting like something between a liquid and a gas. Supercritical water holds lots of heat like a liquid, but it flows with the ease of a gas — combining the best of both worlds for generating electricity... [Sriram Vasantharajan, Mazama's CEO] said Mazama will dig new wells to reach temperatures above 750 degrees next year. Alongside an active volcano, the company expects to hit that temperature less than three miles beneath the surface. But elsewhere, geothermal developers might have to dig as deep as 12 miles.
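The 705-degree threshold the article cites is water's critical temperature in Fahrenheit (about 374 C). A quick conversion shows where the article's temperatures sit relative to it:

```python
# Water's critical temperature is ~374 C (~705 F); above it, at
# sufficient pressure, water becomes supercritical.
CRITICAL_TEMP_F = 705.0

def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

for label, temp_f in [("Newberry today", 629), ("Mazama target", 750)]:
    status = ("above the supercritical threshold"
              if temp_f >= CRITICAL_TEMP_F else "below threshold")
    print(f"{label}: {temp_f} F = {f_to_c(temp_f):.0f} C -> {status}")
```

So today's 629 F (about 332 C) falls short, while the 750 F target (about 399 C) would clear the critical point — provided the pressure at depth is also high enough, since supercriticality requires both.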
While Mazama plans to generate 15 megawatts of electricity next year, it hopes to eventually increase that to 200 megawatts. (And the company's CEO said it could theoretically generate five gigawatts of power.)
But more importantly, successful projects "motivate other players to get into the market," a senior geothermal research analyst at energy consultancy Wood Mackenzie told the Washington Post, predicting "a ripple effect" in which "we'll start seeing more companies get the financial support to kick off their own pilots."
