MediaTek Bets on Facebook’s Meta Llama 2 For On-Device Generative AI



MediaTek, one of the leading mobile processor makers, has big AI plans for the future, and they include Meta's Llama 2 large language model.

Meta, the parent company of Facebook, has been using AI for a while to refine its social media algorithms, and MediaTek wants to create a generative AI-powered edge computing ecosystem based on Facebook's AI.

But what does that mean?

MediaTek's vision centers on enhancing a range of edge devices with artificial intelligence. They're focusing on smartphones and other edge devices (cars, IoT, etc.). In simpler terms, they want the gadgets and tools we use daily to become much smarter and more responsive.

What’s generative AI?

It refers to types of artificial intelligence that can create new content instead of just recognizing existing content. This could be images, music, text, or even videos. The most famous applications using generative AI with LLMs are OpenAI's ChatGPT and Google Bard.

Recently, Adobe launched new generative AI-powered features for Express, its online design platform.

The AI Model Behind the Vision: Meta’s Llama 2

MediaTek will be using Meta's Llama 2 large language model (or LLM) to achieve this. It's essentially a sophisticated pre-trained language model that helps machines understand and generate human language. This tool is special because it's open source, unlike its competitors from big companies like Google and OpenAI.

Open source means that any developer can look at its inner workings, modify it, improve upon it or use it for commercial purposes without paying royalties.

Why is this Important?

MediaTek is basically saying that with its upcoming chips, devices will host some of these advanced behaviors right inside them, instead of relying on distant servers. This comes with a bunch of potential benefits:

  •       Privacy: Your data doesn’t leave your device.
  •       Speed: Responses can be faster since there’s no waiting for data to travel.
  •       Reliability: Less reliance on distant servers means fewer potential interruptions.
  •       No need for connectivity: The devices can operate even if you’re offline.
  •       Cost-effective: It's potentially cheaper to run AI directly on an edge device.

MediaTek also highlighted that their devices, especially the ones with 5G, are already advanced enough to handle some AI models. That's true, but LLMs are in a category of their own.

We’d love to get more details

All of this sounds exciting, but it’s hard to gauge the true potential of using Meta’s Llama 2 on edge devices without more context. Typically, LLMs run in data centers because they occupy a lot of memory and consume a lot of computing power.

ChatGPT reportedly costs $700,000 per day to run, but that's largely because it serves a huge number of users. On an edge device, there's only one user (you!), so the economics would be very different. That said, even at home, running a ChatGPT-like service typically requires a powerful gaming-class PC.

For a frame of reference, today's phones can probably run AI models with ~1-2B parameters, because weights of that size (after compression) fit in their memory. This number is likely to rise quickly. However, GPT-3 has 175B parameters, and the next one is said to be 500X larger.
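To see why parameter counts translate into memory requirements, here's a back-of-the-envelope sketch (the function and the 7B example are illustrative assumptions, not figures from MediaTek's announcement): weight storage is roughly the parameter count times the bytes per parameter, which is why quantization matters so much on memory-constrained phones.

```python
def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GB (weights only; ignores
    activations, KV cache, and runtime overhead)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

# Hypothetical example: a 7B-parameter model (Llama 2's smallest size)
fp16_gb = model_memory_gb(7, 2)    # 16-bit floats -> 14.0 GB
int4_gb = model_memory_gb(7, 0.5)  # 4-bit quantization -> 3.5 GB
```

By the same arithmetic, a ~1-2B parameter model quantized to 4 bits needs well under 1-2 GB, which is consistent with what could plausibly fit alongside everything else in a phone's RAM.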

Edge devices are typically much leaner, and depending on their capabilities, it remains to be seen how much intelligence they can extract from Meta's Llama 2 and what type of AI services they can offer.

What kind of optimizations will the model go through? How many tokens per second are these devices capable of processing? These are some of the many questions MediaTek is likely to answer in the second half of the year.
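On the tokens-per-second question, one common rule of thumb (an assumption here, not a MediaTek figure) is that LLM text generation is memory-bandwidth bound: each generated token requires streaming the full set of weights from memory, so dividing memory bandwidth by model size gives a rough upper bound on decode speed.

```python
def max_tokens_per_sec(weights_gb: float, bandwidth_gb_s: float) -> float:
    """Rough ceiling on decode speed if every token requires reading
    all weights from memory once (ignores caching and compute limits)."""
    return bandwidth_gb_s / weights_gb

# Hypothetical numbers: a 3.5 GB (4-bit, 7B) model on a phone
# with ~50 GB/s of LPDDR memory bandwidth
ceiling = max_tokens_per_sec(3.5, 50)  # ~14.3 tokens/s at best
```

Real-world throughput would be lower, but the formula illustrates why smaller, more aggressively quantized models are the natural fit for phones.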

There is no question that mobile and edge devices can churn through AI workloads with high power efficiency. That's because they are optimized for battery life, while datacenters are optimized for absolute performance.

Also, it is possible that some AI workloads will happen on the device while others are still executed in the cloud. In any case, this is the beginning of a larger trend, as real-world data can be gathered and analyzed for the next round of optimizations.

When can we get the goods?

By the end of this year, we can expect devices that use both MediaTek's technology and Llama 2 to hit the market. Since Llama 2 is developer-friendly and can be easily added to common cloud platforms, many developers might be keen to use it. This means more innovative applications and tools for everyone.

While Llama 2 is still growing and isn't yet a direct competitor to some popular AI tools like ChatGPT, it has a lot of potential. Given time, and with the backing of MediaTek, it might become a major player in the world of AI.

In conclusion, the future looks bright for AI in our daily devices, and MediaTek seems to be at the forefront of this evolution. Let's keep an eye out for what's to come!
