19 stories tagged with #moe, in publish-time order across the WeSearch catalog. Tag pages update as new stories are ingested.
Crusaders cruise behind new Moeller all-time RBI leader Conner Cuozzo
Granite 4.1: IBM's 8B Model Matching 32B MoE
Man dies covered in necrotic lesions after amoebas eat him alive
Doctors suspect three factors, each unremarkable on its own, contributed to his fate.…
Remains found in Tampa Bay identified as second missing USF student
Florida officials have identified human remains found in Tampa Bay as the second missing University of South Florida doctoral student. The Hillsborough County Sheriff’s Office said…
Does Your AI Agent Need a VPN? The Company Behind Norton and Avast Thinks So
You might use a VPN yourself, but have you considered giving one to your AI agent? It might be more important than you think.…
Unexpected regional Victorian towns where house prices rose nearly 20 per cent
Property prices in the regions are defying Melbourne’s slowdown, as home buyers and investors seek more affordable options.…
Earnings call transcript: Moelis & Co misses Q1 2026 forecasts despite record revenue
These bedazzled Crocs earn me more compliments than anything else in my closet
When I say these are the most comfortably stylish shoes you'll ever wear, I mean it with my whole chest (and wardrobe).…
Defect engineering lifts chalcopyrite thermoelectrics to record performance
Moelis & Company (MC) Q1 2026 Earnings Call Transcript
Moelis & Company (MC) Q1 2026 Earnings Call, April 29, 2026, 5:00 PM EDT. Company Participants: Matthew Tsukroff - Vice President of Investor Relations; Navid...…
AI Designs Thermoelectric Generators 10k Times Faster Than We Can
US startup Poolside debuts its first open-weight model, Laguna XS.2, a 33B-A3B-parameter MoE model, and Laguna M.1, a proprietary 225B-A23B-parameter MoE model (Carl Franzen/VentureBeat)
Carl Franzen / VentureBeat : US startup Poolside debuts its first open-weight model, Laguna XS.2, a 33B-A3B-parameter MoE model, and Laguna M.1, a proprietary 225B-A23B-parameter M…
Nvidia launches Nemotron 3 Nano Omni, an open multimodal model with a 30B-A3B hybrid MoE architecture; the Nemotron 3 family saw 50M+ downloads in the past year (Kyt Dotson/SiliconANGLE)
Kyt Dotson / SiliconANGLE : Nvidia launches Nemotron 3 Nano Omni, an open multimodal model with a 30B-A3B hybrid MoE architecture; the Nemotron 3 family saw 50M+ downloads in the p…
Russia stocks lower at close of trade; MOEX Russia Index down 1.27%
AI slashes the time needed to design better heat-harvesting devices
From wearable technology to industrial heat recovery, thermoelectric generators, which convert waste heat into electricity, have an enormous range of potential applications. So far, …
First direct side-by-side MoE vs Dense comparison.
'10,000 times faster than a human scientist' — New AI tool designed ultra-efficient heat-to-electricity generators at lightning speed, a breakthrough that could slash the cost of energy harvesters and help enable cheaper, high-performance home heat pumps
“10,000 times faster than a human scientist” — New AI tool designed ultra-efficient heat-to-electricity generators at lightning speed, a breakthrough that could soon slash the cost…
Mark Carney looks for investment the Liberal way
Prime Minister calculates bolstering investment is an immediate trade-war imperative for Canada’s economy…
Going from 3B/7B dense to Nemotron 3 Nano (hybrid Mamba-MoE) for multi-task reasoning — what changes in the fine-tuning playbook? [D]
Following up on something I posted a few days back about fine-tuning for multi-task reasoning. Read a lot since then, and I've moved past the dense 3B vs 7B question — landing on N…