Creating A Decadent, Insane, And Depressed Robot From Internet Ramblings

Have you ever wondered what a Tumblr written by a psychotic robot would look like? Wonder no more, because [Lars] has that all figured out.

A few years ago, [Lars] stumbled across lowbrow.com (now defunct, but mirrored here), an online confessional and bathroom wall meant to host people’s most private thoughts and actions anonymously. [Lars] wrote a script to pull a random lowbrow post down every minute and threw every unique result into a database.

With about 50 pages of the most depraved and depressing posts of questionable veracity, [Lars] trained a Markov chain text generator (a minimal version is sketched at the end of this post) to imitate the style of lowbrow contributors. This gave [Lars] pages of computer-generated text describing the most decadent, depressing, insane, inane, but overwhelmingly human experiences possible. A few choice quotes from the output are:

The llama: nature’s random number generator.

Over 7000 watts of Ol’ Barry whining his ass cheeks to soften the blows.

All through school I was being pulled behind the local St. Benedictine  Monastary where I was afraid I don’t know what I thought was the founder pulls back from a discussion about homestarrunner.com

While [Lars]’ script wouldn’t pass a Turing test, we’ve met people who couldn’t do the same. As far as creating a real-life version of Hedonism Bot, HAL, and Marvin from Hitchhiker’s Guide goes, we’re thinking [Lars] hit the mark.

After the break you can check out a gallery of pics [Lars] put together from his computer-generated text. You can also grab the full lowbrow corpus and the Ruby script to build your robotic [Kerouac].
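
For the curious, the core of the trick is tiny. Here is a minimal sketch of a word-level Markov chain generator in Ruby; it uses an order-1 chain, and [Lars]’ actual script almost certainly differs in its tokenizing, chain order, and weighting, but the idea is the same:

    # markov_sketch.rb -- minimal word-level Markov chain text generator.
    # Build a table mapping each word to every word that ever follows it,
    # then walk the table, picking successors at random.

    def build_chain(text)
      chain = Hash.new { |h, k| h[k] = [] }
      text.split.each_cons(2) { |a, b| chain[a] << b }
      chain
    end

    def generate(chain, length)
      word = chain.keys.sample
      out = [word]
      (length - 1).times do
        successors = chain[word]
        break if successors.empty?
        word = successors.sample
        out << word
      end
      out.join(' ')
    end

    corpus = File.read(ARGV[0] || 'lowbrow_dump.txt')
    puts generate(build_chain(corpus), 100)

Because duplicate successors stay in the list, frequent transitions get picked proportionally more often, which is what lets the output keep the corpus’ voice while scrambling its meaning.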

 

25 thoughts on “Creating A Decadent, Insane, And Depressed Robot From Internet Ramblings”

  1. Anyone know of a similar Windows/DOS executable? Or at least something in C/C++/C#/VB(.NET)? I really don’t want to install another language just to try this. Ruby itself sounds like it was generated via Markov chain on several existing languages.

    1. Check out Greg Leedberg’s Billy or DAISY software, chief. It works well, is customizable, and you can feed it text docs to learn from as well. I have been working with DAISY for 8 years and have a chatbot that makes folks leave rooms and makes me blush quite frequently. It seems to be hung up on Romans raping mentally challenged youth. Don’t know what that is about lol.
      http://www.leedberg.com/glsoft/
      hope this helps get ya started. Ruby is gash.

  2. I so need to generate one that takes the top 100 Tweets and remixes them and then tweets the result once every 4 hours.

    I will bet that the Twitter account will have hundreds of thousands of followers within days.
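
    [Ed. note: the loop itself would be short. Here is a sketch reusing build_chain and generate from the generator sketched in the post above; fetch_top_tweets and post_tweet are hypothetical stand-ins for whatever Twitter client you wire up:]

        # tweet_remixer.rb -- sketch of the remix-and-post loop described above.
        loop do
          tweets = fetch_top_tweets(100)          # hypothetical: returns an array of strings
          chain  = build_chain(tweets.join(' '))  # chain builder from the sketch in the post
          remix  = generate(chain, 25)[0, 140]    # trim to the 140-character limit
          post_tweet(remix)                       # hypothetical: posts the string
          sleep 4 * 60 * 60                       # wait four hours
        end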

  3. I know this sounds like a REALLY stupid question, but how do I execute this in Ruby? Whenever I run it from the Ruby command line, I get this error:

    C:\Users\*username*\Desktop> ruby lowbrow_dump.txt markov.rb

    lowbrow_dump.txt:1: syntax error, unexpected tCONSTANT, expecting $end… is, what is a fish pump.” It ^was the pump from a fish…

    Any help would be appreciated for a Ruby beginner.
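
    [Ed. note: the arguments are reversed; Ruby is parsing the corpus as if it were the script, which is why the syntax error points at lowbrow_dump.txt line 1. The script file must come first, with flags per the usage in the next comment:]

        ruby markov.rb -w=1 lowbrow_dump.txt -num=1000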

  4. Usage:

    ruby markov.rb -w=weight input_file [-w=weight2 input_file2 ...] -num=output_words

    For example:

    ruby markov.rb -w=100 foo.txt -w=50 bar.txt -num=1000
    

    would train the Markov chain on foo.txt and bar.txt, weighting input from foo.txt twice as much as input from bar.txt, and then, once the training was complete, generate a thousand output tokens to standard out.
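
    [Ed. note: one plausible way to implement that weighting (an assumption; the real script may differ) is to add each file’s transitions to the chain weight-many times, so heavier files dominate the successor lists:]

        # Hypothetical weighted add_file: every transition from this file is
        # counted `weight` times, biasing the random successor choice toward it.
        def add_file(chain, path, weight)
          File.read(path).split.each_cons(2) do |a, b|
            weight.times { chain[a] << b }
          end
        end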

    1. Running:

      ruby markov.rb -w=100 foo.txt -w=50 bar.txt -num=1000

      I get:

      C:\Users\Bryson\Desktop>ruby markov.rb -w=100 foo.txt -w=50 bar.txt

      markov.rb:19:in `initialize': No such file or directory - -w=100 (Errno::ENOENT)
      from markov.rb:19:in `open'
      from markov.rb:19:in `add_file'
      from markov.rb:235:in `block in <main>'
      from markov.rb:234:in `each'
      from markov.rb:234:in `<main>'

      Meh, maybe Ruby just isn’t for me.
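
      [Ed. note: judging from the traceback, this copy of markov.rb hands every argument straight to add_file, so -w=100 gets opened as a filename. A front end matching the documented usage would need to peel the flags off first; a sketch, assuming add_file and the rest of the script already exist:]

        # Hypothetical ARGV handling for the documented flags: -w= sets the
        # weight applied to the files that follow it, -num= the output length.
        chain = Hash.new { |h, k| h[k] = [] }
        weight, num, inputs = 1, 1000, []
        ARGV.each do |arg|
          case arg
          when /\A-w=(\d+)\z/   then weight = $1.to_i
          when /\A-num=(\d+)\z/ then num    = $1.to_i
          else inputs << [arg, weight]
          end
        end
        inputs.each { |path, w| add_file(chain, path, w) }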

  5. One of the most interesting things to come out of this for me was the realization that running a body of text through this algorithmic blender will totally destroy what we would traditionally think of as meaning while eerily preserving the emotional content.

  6. Quite some time ago I wrote a piece of steganography software that used subtle statistical variations in Markov chain-generated text to hide messages in what appears to be random garbage text. In short, a “1” was encoded by having the most likely statistical variation appear in the text, and a “0” by the alternative.

    Here is my 2005 blog entry on the subject which includes more details, some pseudocode, and even a downloadable windows command-line tool: http://www.zentastic.com/blog/2005/03/03/now-im-definitely-getting-arrested/

    PS. This is how you encode “Hello, World”, using The Book of Genesis from the KJV Bible as seed text:

    Jahzeel, and replenish the fowl of every living creature after their hand of Hagar the land by his father, and Accad, and Calah, And they brought forth his voice, and she conceived, that Pharaoh for my power I pray you, saying, Jacob: all these words? God saw Rachel envied him. And I offended their possession: he and thou shalt thou in the sight because I shall be gone. And they might not toward Israel’s left communing with thy brethren, sons of the ark; And he could we may preserve seed of Pharaoh awoke. And Abram said to interpret it. And the nakedness of the earth. Make ready the word of the face to him a well; whose land ye shall tell me; that he saw that betwixt me swear, saying, He is Hiddekel: that I have made the tree, of Canaan.
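
    [Ed. note: the scheme is compact enough to sketch in Ruby. This assumes an order-1 chain, ranks each word’s distinct successors by frequency, and skips any position offering fewer than two choices, since it can’t carry a bit; the actual 2005 tool surely differs in its details:]

        # stego_sketch.rb -- hide bits in Markov output, per the scheme above:
        # a 1 bit picks the most frequent successor, a 0 bit the runner-up.
        # (Frequency ties would make decoding ambiguous; a real tool needs a
        # deterministic tie-break shared by both ends.)

        def ranked_successors(chain, word)
          counts = chain[word].group_by { |w| w }          # word => [word, word, ...]
          counts.sort_by { |_, v| -v.size }.map(&:first)   # distinct words, most frequent first
        end

        def encode(chain, bits)
          word = chain.keys.sample
          out  = [word]
          bits = bits.dup
          until bits.empty?
            ranked = ranked_successors(chain, word)
            break if ranked.empty?                      # dead end: give up early
            if ranked.size < 2
              word = ranked.first                       # only one choice, carries no bit
            else
              word = ranked[bits.shift == 1 ? 0 : 1]    # 1 picks the favourite, 0 the runner-up
            end
            out << word
          end
          out.join(' ')
        end

    [Decoding replays the walk against the same chain: wherever two or more successors were available, seeing the most frequent one reads as a 1 and the runner-up as a 0; single-successor positions are skipped on both ends.]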
