Sony PSP, Evan-Amos, Public Domain.

Llama Habitat Continues To Expand, Now Includes The PSP

Organic llamas have a rather restricted natural range: the Andes Mountains, and that’s it. Humans weren’t content to let the fluffy, friend-shaped creatures stay in their natural habitat, however, and they can now be found on every continent except Antarctica. The Llama2 large language model is similar: it may have started out on a GPU somewhere, but thanks to enterprising hackers like [Caio Madeira], who has ported Llama2 to the PlayStation Portable (PSP), the fluffiest LLM can now be found just about anywhere.

The AI, in all its glory, dooming yet another system.

Ultimately this project has its roots in llama2.c by [karpathy], a project we’ve seen run on a Pentium II under Windows 98, on DOS machines with 486 processors, and even on the venerable Commodore 64, of all impossible things. Now it’s the PSP’s turn. This implementation uses the same 260K-parameter tinystories model as the C64 port, upon which it is based. The PSP’s RAM has room for a much larger model, of course, but [Caio] apparently prefers to run the tiny model quickly on this less-ancient gaming hardware.

It’s getting to the point where it’s harder to find systems that won’t run LLMs than systems that will. Given that Llama2 seems to be the new DOOM, it’s probably only a matter of time before their virtual fur is all over our old equipment. Fortunately for allergy sufferers, virtual fur cannot trigger a histamine response.

If you know of another system getting LLMs (Alpaca-adjacent or otherwise), send in a tip.

AI Helps Make Web Scraping Faster And Easier

Web scraping is usually only a first step towards extracting meaningful data. Once you’ve got everything pulled down, you’ve still got to process it into something useful. Here to assist with that is Scrapegraph-ai, a Python tool that promises to automate the process using a selection of large language models (LLMs).

Scrapegraph-ai accepts a URL along with a prompt, which is a plain-English instruction describing what to do with the data: summarize it, describe the images it contains, and so on. In other words, gathering the data and analyzing or formatting it can now be done in a single step.

The project is pretty flexible about the AI back-end, too. It can work with locally installed models (via ollama) or with API keys for hosted services like OpenAI. If you have an OpenAI API key, there’s an online demo that shows off the capabilities pretty effectively; otherwise, local installation is only a few commands away.
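To make that concrete, here is a minimal sketch of what driving Scrapegraph-ai from Python against a local ollama back-end can look like. The class name, config keys, and model names below follow the project’s documented usage at the time of writing, but treat them as assumptions, since the API has shifted between versions:

```python
# Minimal Scrapegraph-ai sketch: plain-English prompt + URL in, structured data out.
# Assumes ollama is running locally with the listed models pulled; config keys and
# model names follow the project's README and may differ between versions.
from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "ollama/llama3",        # any locally available ollama model
        "temperature": 0,
        "format": "json",                # ask for structured output
        "base_url": "http://localhost:11434",
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",
    },
}

# The prompt is just a plain-English description of what to pull from the page.
scraper = SmartScraperGraph(
    prompt="List every article title and its link on this page",
    source="https://example.com/blog",
    config=graph_config,
)

result = scraper.run()
print(result)
```

Pointing it at a hosted service instead is mostly a matter of swapping the llm section of the config for one that names an OpenAI model and supplies your API key.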

This isn’t the first time we’ve seen the flexibility of AI tools like large language models leveraged to ease the notoriously fiddly task of web scraping, and it’s great to see the results have only gotten better.