Meta shifts from open- to closed-source AI in pursuit of profit, marking a major strategic reset. Meanwhile, space is emerging as the next frontier for AI data centers and their energy needs as companies begin launching compute infrastructure beyond Earth. And the Nvidia-to-China export dilemma continues, a true “damned if you do, damned if you don’t” moment for U.S. tech policy.
Let’s dive in and stay curious.
Following the failure of its Llama 4 model, Mark Zuckerberg has taken direct control of Meta’s AI division and replaced key leadership: Yann LeCun has departed, and Scale AI’s Alexandr Wang has been brought in. The company is investing $600 billion to develop “Avocado,” a closed-source proprietary model trained on data from rivals Google and OpenAI. The shift aims to monetize AI and compete directly with the top labs, though it has caused significant internal turmoil and raised regulatory concerns about safety.
Washington unexpectedly lifted export controls on Nvidia’s advanced H200 chips, forcing Beijing to choose between supporting its domestic chipmakers and accelerating AI development with U.S. hardware.
The shift comes as the U.S. holds a roughly 13:1 compute advantage over China, a lead analysts warn could shrink quickly if tens of billions of dollars’ worth of GPUs flow into the country. But if the U.S. keeps the ban in place, China will develop its own chips, accelerate its own innovation, and likely catch up anyway, with Nvidia seeing none of the financial benefit. Some argue the U.S. might as well export the chips, make money in the process, and perhaps install surveillance technology to learn how China is developing its AI.
National security experts call the move a “disaster,” arguing it boosts China’s military and intelligence capabilities at a time when U.S. agencies say they “cannot get enough chips” themselves. Major tech companies like Microsoft and AWS back restrictions such as the GAIN Act, which prioritizes U.S. demand, but Nvidia, facing declining reliance from U.S. hyperscalers, pushed aggressively for the policy change. Critics warn this could accelerate China’s frontier models like DeepSeek and Qwen, undermine export controls, and erode long-term U.S. AI dominance.
Open-source learning resources
1. Efficiently Serving LLMs — The Best “Crash Course” for Concepts. Focuses on performance optimizations like KV caching, continuous batching, and quantization, the key methods for saving money on inference.
2. vLLM Documentation — The Best “Engineer’s Bible” for Production. The definitive technical guide for the serving engine that delivers the high throughput and low latency a production environment needs; look for the “Serving an LLM” section (a minimal usage sketch follows this list).
3. Ollama — The Best “Try It Now” Tool (Laptop to Server). The simplest way to start running models like Llama 3, DeepSeek, and Mistral with a single command for rapid prototyping and testing.
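To make these concepts concrete, here is a minimal sketch of offline batch inference with vLLM’s Python API. It is an illustration rather than a production recipe: it assumes vLLM is installed (pip install vllm), a CUDA-capable GPU with enough memory is available, and you have access to the weights of whichever model you point it at; the model name below is just an example.

```python
# Minimal vLLM offline-inference sketch.
# Assumes `pip install vllm`, a CUDA-capable GPU, and access to the model weights.
from vllm import LLM, SamplingParams

prompts = [
    "Explain KV caching in one sentence.",
    "Why does continuous batching improve GPU utilization?",
]

# Sampling settings applied to every prompt in the batch.
sampling_params = SamplingParams(temperature=0.7, max_tokens=128)

# vLLM schedules these prompts with continuous batching and reuses attention
# keys/values through its paged KV cache, the optimizations named above.
llm = LLM(model="meta-llama/Meta-Llama-3-8B-Instruct")  # example model choice

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt)
    print(output.outputs[0].text)
```

For serving, the vLLM docs show how to expose the same models over an OpenAI-compatible HTTP endpoint, and Ollama covers the quick-start path: pulling and chatting with a model is a single command such as `ollama run llama3`.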
The next frontier for artificial intelligence is space. Starcloud, an Nvidia-backed startup, has successfully trained and run an AI model from orbit for the first time, marking a significant milestone in the race to build data centers off-planet.
Starcloud’s Starcloud-1 satellite was launched with an Nvidia H100 GPU, a chip reportedly 100 times more powerful than previous space computing hardware. The satellite successfully ran Google’s open-source LLM, Gemma, in orbit, proving that complex AI operations can function in space.
The motivation behind this move is the escalating energy crisis of terrestrial data centers, which are projected to more than double their electricity consumption by 2030. Moving computing to orbit offers a powerful alternative: abundant solar power without drawing on terrestrial grids.
Starcloud is now planning a 5-gigawatt orbital data center, a structure that would dwarf the largest power plant in the U.S., powered entirely by solar energy.
Starcloud’s success has intensified a high-stakes competition among the world’s most powerful tech and space companies, all aiming to capitalize on the promise of scalable, sustainable AI compute.
Technical hurdles remain, including radiation, maintenance, and space debris, but with the biggest names in tech betting on space, the future of AI may soon be floating above our heads.


BitGo’s move creates further competition in a burgeoning European crypto market that is expected to generate $26 billion in revenue this year, according to one estimate. BitGo, a digital asset infrastructure company with more than $100 billion in assets under custody, has received an extension of its license from Germany’s Federal Financial Supervisory Authority (BaFin), enabling it to offer crypto services to European investors. The company said its local subsidiary, BitGo Europe, can now provide custody, staking, transfer, and trading services. Institutional clients will also have access to an over-the-counter (OTC) trading desk and multiple liquidity venues.

The extension builds on BitGo’s previous Markets in Crypto-Assets (MiCA) license, also issued by BaFin, and adds trading to the existing custody, transfer, and staking services. BitGo acquired its initial MiCA license in May 2025, which allowed it to offer certain services to traditional institutions and crypto-native companies in the European Union.
