Linear algebra, Neural Network Mathematics, and other nerd stuff - Jan/Feb 2021
new year of things and stuff
Hi, reader, welcome back to my bimonthly organized rant about what I've been up to over the past two months. Thanks for checking this out.
I was able to do a few things this half-season, and that's what I'm going to outline below. Although I'll assume you know that already, since it's pretty much common sense. I don't know why I wrote this, or why I'm still writing this paragraph.
If you don’t know who this Adam Dhalla person is: I like learning about the mathematics behind machine learning algorithms, programming them, and seeing how they can help model and understand the natural world. You can visit my website adamdhalla.com for more stuff.
Subscribe to this newsletter if you love recaps.
Linear Algebra
For two months starting on the 25th of December 2020, I taught myself linear algebra using MIT OpenCourseWare’s online course 18.06, taught by Gilbert Strang, and following along with the reading and exercises in his textbook (Linear Algebra and Its Applications, 4th edition).
It was considerably hard but also quite fascinating, and really showed the ‘beauty’ of math - the ability to easily transition from two to 4213 dimensions is amazing.
I wrote a bunch of articles on my Medium about the linear algebra I was learning. The one with, imo, the most important subject matter is on the Normal Equation, the matrix equation used to find a least-squares line of best fit. I explain it in terms of vector projections and subspaces, and you can read it here.
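For the curious, here's a minimal sketch of what the Normal Equation looks like in code. The (x, y) data here is made up purely for illustration; the idea is that solving A^T A x_hat = A^T y gives the least-squares coefficients.

    # Minimal sketch of the normal equation for a least-squares line of best fit.
    # The (x, y) data is made up; the column of ones gives the intercept term.
    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([1.1, 1.9, 3.2, 3.8])

    A = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
    # Normal equation: A^T A x_hat = A^T y
    x_hat = np.linalg.solve(A.T @ A, A.T @ y)   # [intercept, slope]
    print(x_hat)

In the article I arrive at the same equation geometrically, by projecting y onto the column space of A.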
I’m currently ordering his [Gilbert Strang’s] sexy new textbook, Linear Algebra and Learning From Data, and once I get it (sometime mid-March, since I don’t have Amazon Prime) I’ll follow along with his newer course, Matrix Methods in Data Analysis, Signal Processing, and Machine Learning (18.065), for more applied material.
Neural Network Mathematics
A running theme might be how math dominates here, but never mind that. I devoted a couple of weeks to understanding how multivariable calculus operates in the context of neural networks and backpropagation, and one thing led to another, and I ended up recording and editing a comprehensive five-hour lecture series on the complete mathematics behind neural networks.
I’m no Andrew Ng, but it covers pretty much everything you need to understand the mathematics behind a standard ANN. Here’s the syllabus if you’re curious.
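To give a taste of what the series covers, here's a minimal sketch of one forward and backward pass for a one-hidden-layer network in NumPy. The data, layer sizes, ReLU activation, and squared-error loss are all assumptions for illustration, not the exact setup from the lectures.

    # Minimal sketch of backpropagation (chain rule, layer by layer) for a tiny
    # one-hidden-layer network. Shapes and data are arbitrary examples.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(3, 1))          # input (3 features)
    y = np.array([[1.0]])                # target

    W1 = rng.normal(size=(4, 3)); b1 = np.zeros((4, 1))
    W2 = rng.normal(size=(1, 4)); b2 = np.zeros((1, 1))

    # Forward pass
    z1 = W1 @ x + b1
    a1 = np.maximum(z1, 0.0)             # ReLU
    z2 = W2 @ a1 + b2
    loss = 0.5 * np.sum((z2 - y) ** 2)

    # Backward pass: propagate dL/dz back through each layer
    dz2 = z2 - y
    dW2 = dz2 @ a1.T
    db2 = dz2
    da1 = W2.T @ dz2
    dz1 = da1 * (z1 > 0)                 # ReLU derivative
    dW1 = dz1 @ x.T
    db1 = dz1

    # One gradient-descent step
    lr = 0.1
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

The lecture series basically builds up each of those backward-pass lines from the multivariable chain rule, then generalizes to arbitrary layers and batches.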
Alright enough of that. What else you got?
Non-learning Artificial Intelligence
I was lucky enough to get my hands on the 2020 copy of Artificial Intelligence: A Modern Approach, a beautiful (and large) textbook by Stuart Russell and Peter Norvig.
Working my way through the textbook, I wrote articles and coded a few systems along the way, including rule-based systems and genetic algorithms. I’ve stopped going through it for now while I attempt a few other problems.
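For flavour, here's a minimal sketch of a genetic algorithm in Python. The bit-string encoding, the toy "one-max" fitness function, and the hyperparameters are all assumptions for illustration, not the code from the book's exercises.

    # Minimal genetic algorithm sketch: selection, crossover, mutation on bit strings.
    import random

    GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

    def fitness(genome):
        # Toy objective ("one-max"): maximize the number of 1s in the genome.
        return sum(genome)

    def crossover(a, b):
        cut = random.randrange(1, GENOME_LEN)   # single-point crossover
        return a[:cut] + b[cut:]

    def mutate(genome):
        return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Fitness-proportional selection (+1 smoothing so weights never sum to zero)
        weights = [fitness(g) + 1 for g in population]
        parents = random.choices(population, weights=weights, k=POP_SIZE * 2)
        population = [mutate(crossover(parents[2 * i], parents[2 * i + 1]))
                      for i in range(POP_SIZE)]

    print(max(fitness(g) for g in population))

The systems from the textbook are more involved than this, but the loop of select, recombine, mutate is the same.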
Writing on Modern Transcendentalism
I usually don’t do a whole feature thing about an article, but my article on the philosophy of transcendentalism got quite a bit of attention.
I was fortunate enough to have my article translated into Greek and featured in the Greek philosophy publication Orthos Logos. You can take a look at the translation here (press the link below the blue button if you want to see the whole thing).
Small things
I did a few other small things that aren’t worth big headlines, so here they are:
I taught myself Julia, which is an awesome language that everyone should use for everything. It’s about as simple syntactically as Python but can run close to C speed.
I wrote eight or so articles.
I made my personal website.
A mobile game about birds and conservation that I co-created secured funding and is set to launch in spring 2021 (this is actually not a small thing, but I can’t share all the details yet).
My practical application task for the next couple of months: programming segmentation and computer vision software to detect protein localization within images of many cells, using weakly labeled data. More details later.
Thanks for reading this. See you in two months.