A Slew Of AI Courses To Get Yourself Up To Speed

When there’s a new technology, there’s always a slew of people who want to educate you about it. Some want to teach you to use their tools, some want you to pay for training, and others will use free training to entice you to buy further training. Since AI is the new hot buzzword, there are plenty of free classes from reputable sources. The nice thing about a free class is that if you find it isn’t doing it for you, there’s no penalty to just quit.

We noticed NVIDIA, one of the companies that has profited most from the AI boom, has some courses (not all free, though). "Generative AI Explained" and "Augment Your LLM Using Retrieval Augmented Generation" caught our eye. There's also "Building a Brain in 10 Minutes" and "Introduction to Physics-Informed Machine Learning with Modulus." These are all quite short, though.

Surprisingly, Google hasn't been as successful with AI, but it does have some longer and possibly more technical material in its Google Cloud Skills Boost program. There are several "paths" with AI content, such as "Generative AI for Developers." If you prefer Amazon, it has a training center with many free courses, including AI topics like "Foundations of Prompt Engineering." Of course, you can expect these offerings to center on what a great idea the Google or Amazon systems are. Can't blame them for that.

They are all, of course, playing catch-up to OpenAI. If you'd rather see what classes it offers, check out its partner DeepLearning.ai. Many other vendors offer training there as well.

If you want something more rigorous, edX has a plethora of AI classes ranging from Harvard’s CS50 introduction class that uses Python (see the almost 12-hour video below) to offerings from IBM, Google, and others. These are typically free, but you have to pay if you want grading and a certificate or college credit. Microsoft also offers a comprehensive 12-week study program.

Naturally, there are more. The good news is you have choices. The bad news is that it is probably easy to make the wrong choice. Do you have any you’ve taken that you’d recommend or not recommend? Leave us a comment!

We are always amazed at how much you can learn online if you are structured and disciplined about it. There is no shortage of materials from very reputable schools available.

12 thoughts on "A Slew Of AI Courses To Get Yourself Up To Speed"

  1. I tried a few courses, but I always abandon them when they finish teaching linear regression and other basic stuff and tell you to import one of the example image data sets.

    That's just disillusioning to me. Maybe I'm just dumb, but it makes no sense to me how everything works. What would make a lot more sense is giving me a very, very simple time-series data set and doing something with it, instead of images.

    Anyway, I can’t really comment because I don’t know jack shit about AI.
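For what it's worth, the kind of minimal time-series exercise the comment above asks for fits in a few lines. This is just a sketch of my own (the data, window size, and use of plain least squares are all choices I made here, not from any particular course): predict the next value of a noisy sine wave from the previous few values.

```python
import numpy as np

# A toy time series: a noisy sine wave, 200 samples.
rng = np.random.default_rng(0)
t = np.arange(200)
series = np.sin(0.1 * t) + 0.05 * rng.standard_normal(200)

# Build a supervised dataset: predict the next value from the
# previous `window` values (a classic autoregressive setup).
window = 5
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

# Fit ordinary least squares via numpy's solver.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead prediction from the most recent window.
print(f"predicted: {X[-1] @ w:.3f}, actual: {y[-1]:.3f}")
```

The same scaffolding (sliding window in, next value out) is exactly what you would later feed to a neural network instead of `lstsq`.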

    1. It might be they are trying to keep it simple and assume you can build on a single-image example yourself by iterating through your own set of images or video at a later stage. But I agree, it does not inspire confidence (excuse the pun) in the system if you don't see a time-series example working at an early stage, and it makes you want to quit.

        1. I'm not sure if I remember correctly from my college years, when we were playing with neural networks several years before any big public LLM, but your very basic approach is to treat the 2D data of the picture as 1D. You just take the first pixel line, then append the second, then the third, etc. The neural network doesn't care. You can do the same thing even with 3D data.
          Note: in this context, AI = neural network. I always hated how everybody calls it AI, when it is just a glorified XXL neural network with a huge amount of training data.
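The row-by-row flattening described above is a one-liner in numpy; this small sketch (array sizes are arbitrary, chosen just for illustration) shows both the 2D and 3D cases:

```python
import numpy as np

# A toy 8x8 grayscale "image" (2D array of pixel intensities).
image = np.arange(64, dtype=np.float32).reshape(8, 8)

# Flatten row by row: first pixel line, then the second, and so on.
# ravel() (or reshape(-1)) does exactly this in C (row-major) order.
flat = image.ravel()
print(flat.shape)  # (64,) -- a 1D vector a network's input layer can take

# Element 8 of the flat vector is the first pixel of the second row.
print(flat[8] == image[1, 0])  # True

# The same idea extends to 3D data, e.g. a color image (H, W, channels):
color = np.zeros((8, 8, 3), dtype=np.float32)
print(color.reshape(-1).shape)  # (192,)
```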

        2. @shinsukke I'm completing the DLS series at DeepLearning.ai now, and I have to say Prof. Ng is an excellent teacher. I can't speak to where everyone is on their journey, but you do need at least *some* experience with the requisite maths, as well as Python, since that isn't taught. Also, having experience with "traditional" ML (yes, regression, but also, say, kNN, SVM, etc.) really helps you grasp the concepts better.

          They even have an MLS course that covers some of that machine learning, but I haven't taken it as I did my ML elsewhere. Crucially, though, they don't automatically assume you know a good part of linear algebra, or have experience applying the chain rule for derivatives when doing backprop. He explains it, but then just gives you the formula.

          None of which is to say formal knowledge is not useful, both for making sure you *really* understand in the end what you are learning and for developing your own novel models. But there it is not 'entirely necessary'.

          Though yes, for other courses I'd agree: they either use proprietary libraries (I have a lot of respect for it, but Fast.AI comes to mind), strictly use frameworks such as TensorFlow or PyTorch, or conversely make the curriculum so simple that it just feels like you are not learning anything.
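The chain-rule bookkeeping for backprop mentioned above can be shown on the smallest possible case: one sigmoid neuron with cross-entropy loss, trained by hand-derived gradients. This is a generic sketch of the standard derivation, not anything taken from the DLS course itself; the toy data and learning rate are my own choices.

```python
import numpy as np

# A linearly separable toy task: label is 1 when x0 + x1 > 0.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
lr = 0.5

for _ in range(200):
    z = X @ w + b          # forward: linear part
    a = sigmoid(z)         # forward: activation
    # Backward pass. For cross-entropy loss with a sigmoid output,
    # the chain rule dL/dz = dL/da * da/dz collapses to (a - y).
    dz = (a - y) / len(y)
    w -= lr * (X.T @ dz)   # dL/dw = dz/dw chained onto dL/dz = X^T dz
    b -= lr * dz.sum()     # dL/db = sum over dL/dz

pred = sigmoid(X @ w + b) > 0.5
acc = np.mean(pred == y)
print(f"training accuracy: {acc:.2f}")
```

The "formula" a course hands you is usually exactly the `(a - y)` collapse in the backward pass; working it out once from dL/da and da/dz is where the chain-rule practice pays off.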

  2. heh i am not particularly interested in AI but i developed a vague curiosity about this whole ‘opencl’ thing. i figured that since integrated gpu/apu sort of things are so popular, probably my PCs had amazing opencl performance i wasn’t even aware of. who knows what possibilities i might imagine, once i know the tool exists?

    so i found a little reference material and made what i thought was a decent ‘hello world’, and i ran it on my celeron N4000 laptop and it was much slower than the regular CPU. not sure if it wasn’t being accelerated, or if my test case just measured overhead. so, just casting about for another thing to try, i ran the same example on my AMD ryzen 3 2200G. and it failed (unkillable zombie) and hard-locked the “opencl” part of my computer! after running my test, clinfo and radeontop also became unkillable zombies instead of working.

    i find i don’t really understand how any of the kernel interfaces to this stuff works. sometimes i have a hard time telling if an opengl thing is running accelerated or just happens to be fast enough on modern CPUs. it is all a mystery. but apparently the kernel interface is a PoS and full of bugs and probably a huge security vulnerability surface as well. so that’s reassuring.

    a little off topic but kind of a bummer about what you find when you start poking around. i guess that’s one of the reasons people use so many libraries-on-top-of-libraries for this sort of stuff.

  3. I'm promoting my first post on Instagram. I read through some of the comments on here about the AI logistics, and it seems like what I kind of expect with a new thing that's not well understood by outsiders. I don't really think having a moving thing from an early-stage object is any particular thing about it. I think having better-understood architecture, and maybe graphical-model user-interface systems that show the knowledge in drop-downs of some sort for portraying how the logic works in the architectural layouts of the CPUs or the logic units, could be better and are well understood at this point, for myself, after doing a single layout object in Rhino 3D with Grasshopper. I used an advanced parameter and put in the parameters that I had after laying it out in the computer; it still didn't recognize it, probably because it's just going to a different strategy area or a different graphical user interface from the particular area where the logic is called from. I wasn't sure; I thought maybe if I designed something to a certain stage and then used the logic that's in the computer, it would organically apprehend that logic and set it into something else. It didn't work like that.
