Intel unveils next-gen AI chips for data centers / Tesla redirects Nvidia AI chips to xAI and X / Cohere secures $450M [EN]

Intel unveiled its latest AI chips for data centers and upcoming Lunar Lake processors, insiders from top AI firms signed an open letter urging more AI transparency, Elon Musk redirected Nvidia AI chips from Tesla to his companies X Corp. and xAI Corp., Canadian LLM startup Cohere secured $450M in funding, and Cisco Systems announced a $1B fund to invest in AI startups.

Host 3:AI can now compose symphonies, yet it's clueless about not burning my toast. Guess we know where the real genius lies!
Host 1:Intel's AI chips, Musk's Nvidia move, and a four hundred fifty million dollar startup—what's the buzz? Let's dive into the AI shake-up!
Host 1:So, did you hear about Intel's latest move in the AI chip game? They just unveiled their new Xeon six processors for data centers, right after Nvidia and AMD announced their own new AI chips.
Host 2:Oh, I did! It's like a tech arms race out there. Intel's saying their Xeon six processors are all about superior performance and power efficiency. What’s the scoop on that?
Host 1:Yeah, they’re claiming these new chips will outperform their predecessors in data centers. Imagine upgrading from a bicycle to a rocket ship! They just launched the Fifth Generation Xeon processors six months ago, and now they’re already rolling out the Xeon Six. It’s like they’re on a mission to keep up with the AI demands and high-performance computing needs.
Host 2:That’s wild. And did you catch what their CEO, Pat Gelsinger, said? "Simply put, performance up, power down." It’s like they’re trying to make AI chips sound sexy or something.
Host 1:Haha, right? But it’s not just about the data centers. They also talked about their upcoming Lunar Lake processors for AI PCs. These are supposed to launch in the third quarter and promise up to forty percent less power consumption while boosting on-device AI capabilities.
Host 2:That’s pretty cool. I mean, who doesn’t want a personal computer that’s smarter and more power-efficient? Speaking of which, did you see the price tags on their Gaudi three AI accelerator chips? They’re aiming to undercut the competition big time.
Host 1:Oh yeah, the Gaudi three kit with eight chips is going for one hundred twenty-five thousand dollars, while the Gaudi two was sixty-five thousand dollars. And when you compare that to an HGX server with eight Nvidia H100 AI chips, which can exceed three hundred thousand dollars, it’s clear Intel’s playing the price game hard.
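For listeners who like to see the arithmetic, here is a rough back-of-the-envelope sketch of those kit prices per accelerator. The eight-chip kit sizes and dollar figures are the ones quoted above; dividing evenly per chip is our own simplification, not vendor pricing.

```python
# Rough per-accelerator math for the kit prices quoted above.
# The 8-chip kit sizes come from the episode; the Nvidia figure is the
# "can exceed three hundred thousand dollars" number, so these are
# ballpark comparisons only, not per-chip list prices.
KIT_PRICES_USD = {
    "Intel Gaudi 3 (8-chip kit)": 125_000,
    "Intel Gaudi 2 (8-chip kit)": 65_000,
    "Nvidia HGX server (8x H100, low-end estimate)": 300_000,
}

CHIPS_PER_KIT = 8

for name, kit_price in KIT_PRICES_USD.items():
    per_chip = kit_price / CHIPS_PER_KIT
    print(f"{name}: ${kit_price:,} total, about ${per_chip:,.0f} per accelerator")
```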
Host 2:Damn, that’s a huge difference. Nvidia is still leading in revenue, though. Their data center unit pulled in eighteen point four billion dollars in Q4 twenty twenty-three. That’s some serious cash flow.
Host 1:Absolutely. Nvidia’s been dominating with their GPU sales, and their next-gen AI chip platform, Rubin, is set to launch in two thousand twenty-six. It’s going to be interesting to see how Intel and AMD keep up.
Host 2:For sure. And speaking of keeping up, I’ve been trying to stay updated on all this tech news while balancing my hobbies. You know, like my obsession with retro video games and trying to build my own mini arcade machine.
Host 1:That sounds like a fun project! Maybe you can use some of these AI chips to power it up. Imagine a retro arcade machine with cutting-edge AI capabilities.
Host 2:That would be epic! But for now, I’ll stick to my old-school soldering and coding. Anyway, back to the chips—do you think Intel’s strategy will pay off?
Host 1:It’s hard to say. They’re definitely making bold moves, but the competition is fierce. If they can deliver on their promises of better performance and lower power consumption, they might just carve out a bigger slice of the market.
Host 2:Yeah, only time will tell. Meanwhile, I’ll be here, geeking out over all this tech news and trying not to fry my circuits with excitement.
Host 3:Is there anything more thrilling than another useless upgrade? Oh, wait, everything.
Host 1:Talking about AI advancements, did you catch the recent open letter from insiders at top AI firms? It's like a plot twist in a sci-fi movie!
Host 2:Oh, totally! It's like the Avengers of the AI world, right? I mean, folks from OpenAI, Google DeepMind, and Anthropic PBC all teaming up to call for more transparency.
Host 1:Exactly! And the letter, "A Right to Warn about Advanced Artificial Intelligence," is backed by some heavy hitters like Yoshua Bengio, Geoffrey Hinton, and Stuart Russell. These guys are like the rockstars of AI.
Host 2:Yeah, and they’re saying that AI companies have all this non-public data on AI risks. It's like they’re sitting on a goldmine of secrets, and the insiders are the only ones who can blow the whistle.
Host 1:Right! They argue that these firms have strong financial incentives to dodge effective oversight. It’s like saying, "Hey, we’re making a ton of money, so let’s not look too closely at the potential fallout."
Host 2:And they’re pushing for anonymous channels for staff to report AI risks and protections for whistleblowers. It’s like setting up a secret hotline for AI snitches.
Host 1:Haha, exactly! They want to make sure employees can voice concerns without fearing retaliation. It’s a pretty bold move, considering the stakes.
Host 2:Speaking of bold moves, did you see OpenAI’s response? They’re all like, "We’re proud of our track record and our scientific approach to addressing risk." Classic corporate speak, right?
Host 1:Totally. And they just formed a new Safety and Security Committee led by CEO Sam Altman. It’s like they’re saying, "Look, we’re doing something about it!" even though they just dissolved their "superalignment" team.
Host 2:Yeah, that’s a bit of a head-scratcher. They dissolved the team that was supposed to manage long-term AI risks. It’s like they’re playing musical chairs with their safety protocols.
Host 1:Well, it’s a complex issue. Balancing innovation with safety is like walking a tightrope. But hey, if the insiders are this concerned, maybe it’s time for the rest of us to pay attention.
Host 2:Absolutely. And speaking of paying attention, did you know that I’ve been dabbling in AI art lately? It’s crazy how these algorithms can create something that looks like it came from a human artist.
Host 1:Oh, that’s awesome! AI art is fascinating. It’s like blending creativity with code. But it also raises questions about originality and authorship, right?
Host 2:Yeah, exactly. It’s like, who owns the art? The programmer? The machine? Or is it a collaboration? It’s a whole new world of questions.
Host 1:And that’s the beauty of AI. It’s pushing us to rethink so many aspects of our world. But with great power comes great responsibility, as they say.
Host 2:Totally. And on that note, folks, keep your eyes peeled and your minds open. The AI revolution is just getting started, and we’re all along for the ride.
Host 3:AI whistleblowers? Oh joy, another reason for robots to give us the side-eye.
Host 1:Speaking of AI advancements, that takes us straight to Elon Musk's recent moves with Nvidia AI chips. Did you hear about this?
Host 2:Oh, absolutely! He is always up to something wild. So, he redirected a bunch of Nvidia AI chips from Tesla to his other companies, X Corp. and xAI Corp. Can you believe that?
Host 1:Yeah, it's like he’s playing a giant game of chess with his own companies. He mentioned on X that the chips would have just sat in a warehouse because Tesla "had no place to send the Nvidia chips to turn them on." I mean, what a waste, right?
Host 2:Totally! But it’s not just a few chips we’re talking about. CNBC reported he redirected twelve thousand Nvidia H one hundred GPUs. That’s like moving a small army of AI powerhouses. And it’s delaying Tesla’s receipt of five hundred million dollars’ worth of GPUs by months.
Host 1:Exactly, and Musk explained that Tesla couldn't take the Nvidia GPUs yet because its Austin factory extension wasn't finished. The south extension of the Texas plant is nearly complete and will house fifty thousand H100 chips for Full Self-Driving training. So, it’s all about timing and logistics.
Host 2:Man, that’s some serious planning. And get this, Musk also estimated that Tesla would spend three billion to four billion dollars on Nvidia hardware this year. That’s like buying a small country’s worth of tech!
Host 1:And half of Tesla's ten billion dollars in AI capital expenditures this year are going to internal projects like the "Tesla-designed AI inference computer and sensors" in its vehicles and the Dojo supercomputer. It’s all part of advancing their autonomous driving software and robotics.
Host 2:Speaking of robotics, did you know I’ve been tinkering with some DIY robot kits? It’s nothing like Tesla’s stuff, but it’s pretty fun. Anyway, Musk said Tesla would increase H100 acquisitions from thirty-five thousand to eighty-five thousand by year's end. That’s a massive leap!
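As a rough sanity check on the numbers tossed around above, here is a small sketch that recomputes them. All inputs are the figures quoted in the episode; the implied per-GPU price is our own division and purely illustrative.

```python
# Back-of-the-envelope check on the Tesla GPU figures mentioned above.
# All inputs are the numbers quoted in the episode (Musk's posts / CNBC
# reporting); the implied per-GPU price is our own rough division.
redirected_gpus = 12_000                # H100s reportedly redirected to X / xAI
delayed_order_value_usd = 500_000_000   # value of the delayed GPU shipment
h100s_now = 35_000                      # H100s Musk said Tesla has today
h100s_year_end = 85_000                 # H100s targeted by year's end
nvidia_spend_usd = (3e9, 4e9)           # estimated 2024 spend on Nvidia hardware

implied_price_per_gpu = delayed_order_value_usd / redirected_gpus
planned_additions = h100s_year_end - h100s_now

print(f"Implied price per H100 in the delayed order: ~${implied_price_per_gpu:,.0f}")
print(f"Planned H100 additions by year's end: {planned_additions:,}")
print(f"Estimated Nvidia spend: ${nvidia_spend_usd[0]/1e9:.0f}B to ${nvidia_spend_usd[1]/1e9:.0f}B")
```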
Host 1:That’s awesome! And yeah, Musk is really pushing the envelope. He’s not just thinking about today but planning for the future. It’s like he’s building the foundation for something even bigger.
Host 2:For sure. And it’s crazy how he balances all these projects. One minute he’s working on artificial intelligence, the next he’s launching rockets. It’s like he has a time machine or something.
Host 1:Or maybe he’s just really good at multitasking. Either way, it’s fascinating to watch. And who knows, maybe one day your DIY robots will be part of something just as big!
Host 2:Ha! From my garage to the stars. I like the sound of that. But seriously, it’s exciting to see where all this tech is headed. And with someone like Musk at the helm, the sky’s the limit.
Host 1:Absolutely. And for everyone watching, keep an eye on these developments. The world of AI and robotics is evolving fast, and it’s going to be a thrilling ride.
Host 3:"Oh great, another Musk miracle. Call me when it's not a circus."
Host 1:So, have you heard about the latest buzz in AI? Canadian startup Cohere just bagged a whopping four hundred fifty million dollars from Nvidia, Salesforce Ventures, and a bunch of other big names!
Host 2:Holy crap, that's huge! I read they're now valued at five billion. That's like winning the lottery twice in a row!
Host 1:Exactly! Just last June, they were at two point two billion with a two hundred seventy million raise. It's like they're on a rocket ship to the moon.
Host 2:And it's not just Nvidia and Salesforce Ventures. Cisco and the Canadian pension fund PSP Investments are in on it too. Talk about an all-star lineup!
Host 1:Right? Cohere's really making waves with their enterprise-oriented LLMs. Their flagship, Command R+, can handle prompts with up to one hundred twenty-eight thousand tokens. That's like having a chat with a supercomputer!
Host 2:That's insane! So, companies can access this via a paid API and on Microsoft Azure. Any plans for other platforms?
Host 1:Oh, definitely. They're planning to expand to other cloud platforms soon. It's like they're building an AI empire.
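For anyone curious what "access via a paid API" looks like in practice, here is a minimal sketch based on Cohere's publicly documented Python SDK around the time of this announcement. The prompt is made up for illustration, the API key is a placeholder, and the exact client interface may differ in newer SDK versions.

```python
# Minimal sketch of calling Cohere's Command R+ through the paid API, based on
# Cohere's publicly documented Python SDK around this announcement. The exact
# client interface may have changed since; "YOUR_API_KEY" is a placeholder.
import cohere

co = cohere.Client(api_key="YOUR_API_KEY")

response = co.chat(
    model="command-r-plus",  # the flagship enterprise model discussed above
    message="Summarize this week's AI chip announcements in three bullet points.",
)

print(response.text)
```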
Host 2:I love it. It's like a sci-fi movie coming to life. Speaking of sci-fi, did you catch the latest episode of "The Expanse"? The AI tech there is wild!
Host 1:Oh, totally! It's fascinating how fiction is becoming reality. And with companies like Cohere, we're getting closer to that future every day.
Host 2:For sure. Imagine the possibilities. AI that can understand and generate human-like text at that scale... it's mind-blowing.
Host 1:And it's not just about the tech. It's about the impact on businesses, healthcare, education... the list goes on.
Host 2:Absolutely. It's a game-changer. I can't wait to see what they do next.
Host 1:Me neither. And for all you out there, keep an eye on Cohere. They're definitely one to watch in the AI space.
Host 2:Alright, back to my hobbies. You know, I'm into drone racing. Imagine AI like Cohere's controlling those drones. It'd be next-level!
Host 1:Haha, you and your drones! But seriously, AI could revolutionize that too. The future's looking pretty exciting.
Host 3:Great, more artificial intelligence. Now even robots can be unemployed.
Host 1:So, Cisco just announced a one billion dollar fund to invest in AI startups. That's a lot of zeros! They're focusing on "secure and reliable" AI.
Host 2:Holy crap, that's insane! They're not just throwing money around, right? They’ve already committed nearly two hundred million dollars to companies like Cohere, Mistral AI, and Scale AI.
Host 1:Exactly! Cisco's being super strategic. It's like, "You’ve got cool tech, we’ve got the platform. Let’s make some magic happen."
Host 2:If I had a billion dollars, I'd probably just buy a lifetime supply of pizza and video games. But the company is thinking long-term. They want to boost their customers’ AI readiness and complement their own AI strategy.
Host 1:Right? And it’s not just about the money. The company's Chief Strategy Officer mentioned they’re planning to use the tech from these startups in their own products and help sell those services. It’s a win-win.
Host 2:It’s like they’re building an AI Avengers team. The company has already made over twenty AI-focused acquisitions and investments. They’re really pushing the envelope on generative AI, machine learning, and integrating AI across their portfolio.
Host 1:Totally! And it’s not just about flashy new tech. It’s about making AI secure and reliable. No one wants a rogue AI messing things up.
Host 2:Yeah, like, imagine if your smart fridge decided to order one hundred gallons of milk because it misread the data. Nightmare scenario!
Host 1:Exactly! So, Cisco’s focus on security and reliability is spot on. They’re ensuring that AI can be trusted and integrated seamlessly into our lives.
Host 2:Speaking of integration, I’ve been dabbling in some AI projects myself. Just for fun. It’s amazing how much potential there is, but also how careful you have to be with data and security.
Host 1:Oh, for sure! AI is like a double-edged sword. It can do incredible things, but if not handled properly, it can cause a lot of trouble. That’s why investments like Cisco’s are so important. They’re not just pushing the boundaries; they’re setting the standards for safe and reliable AI.
Host 2:And that’s what makes this whole thing so exciting. We’re not just looking at the future of technology; we’re shaping it. And companies like Cisco are leading the charge.
Host 1:Absolutely! So, folks, keep an eye on these AI startups. They’re the ones to watch, and with Cisco backing them, we’re in for some groundbreaking innovations.
Host 3:Another day, another robot. Can't wait for them to feel existential dread too.
Host 1:Speaking of AI, that brings us right to who's hiring in the AI space. It's like a gold rush out there.
Host 2:Oh, totally! I mean, just look at this list. One organization in Laurel, Maryland, is looking for an Artificial Intelligence Analyst. That sounds like a place where you’d need a PhD just to understand the coffee machine.
Host 1:Haha, right? And then there's Fidelity TalentSource in Westlake, Texas, hunting for a Machine Learning Ops/GenAI Engineer. I bet they want someone who can juggle Python scripts while riding a mechanical bull.
Host 2:Yeehaw! And Capital One's on the prowl for a Senior Engineer - Generative AI Product Engineer in multiple locations. They probably want someone who can turn credit card data into a Picasso painting.
Host 1:Oh, and don't forget TikTok in San Jose, California. They're looking for a Senior Product Manager for their Ads Platform using Generative AI. Imagine the kind of ads they could generate—maybe even ones that don’t make you want to skip immediately.
Host 2:Now that’s the dream! And another company is on the lookout for a Senior Solutions Architect in Machine Learning, also in multiple locations. They probably want someone who can build a supercomputer out of LEGO bricks.
Host 1:Google is seeking a Senior Staff Software Engineer, Machine Learning, for Google Ads in multiple locations. That sounds like a position where you could potentially work from anywhere—perhaps even a beach in Bali.
Host 2:Haha, if only! Deloitte in Rosslyn, Virginia, needs a Senior Machine Learning Engineer. I bet they want someone who can make sense of all those spreadsheets and turn them into something magical.
Host 1:Haha, exactly! It’s crazy how many opportunities are out there. It’s like every company is trying to get a piece of the AI pie.
Host 2:Yeah, and speaking of pie, did you know that some of the earliest AI programs were built to play chess? Now we’ve got AI doing everything from diagnosing diseases to creating art. It’s wild!
Host 1:Absolutely! And the best part is, we’re just scratching the surface. Who knows what the next big breakthrough will be? Maybe AI that can finally understand my sense of humor.
Host 2:Haha, good luck with that! But seriously, it’s an exciting time to be in the field. If you’re into artificial intelligence, now’s the time to dive in.
Host 3:Another robot job? Guess I’ll just binge-watch my own obsolescence.