Lua logs
. A change of plan . Jul 9, 2025 .
I need to put C on the back burner for now. After a month and a half of hitting my head against the wall, I feel I haven't made much progress at all. I might come back to C in the future, but now is not the time. My goal, after all, is to make a game, and if I continue with C, I don't believe I'll have the skills to make one by the end of the year. I have to move away from this language for the time being and try something else.
That something else has been on my mind for a long time, since I learned a bit of it trying to make a game for the Playdate last year. The language is Lua. It has a simple syntax, garbage collection, and dynamic typing (sigh of relief). It's also a fast and lightweight language, so it can be used on embedded devices. With Lua, I'd be able to make games for both the Playdate and the Pico-8 Fantasy Console, as they both have a Lua API. Let's see where that leads. I'm very confident that moving to Lua will ease the learning curve quite a bit.
I will also move away from boot.dev, as they don't have a Lua course. I don't have a learning platform yet, but I plan on continuing/refactoring the Playdate game I started last year as I learn more about the language. At the time, I was mostly using AI (yeah, not proud of that), trying to vibe code a game. I want to be able to code the game with my own fingers and my own head, without AI constantly there as a crutch. As I mentioned in a previous log entry, being able to create something without much external help, relying only on your own knowledge and skills, is tremendously satisfying. It also really boosts your confidence, allowing you to undertake greater and greater projects as you evolve.
So this is it. That's what I'll do. That's the plan. LFG!
. Bad habit . Jun 20, 2025 .
It’s been about a month since I started learning C on my own time. It’s been challenging and slow, but I’m getting there. Pointers… man, POINTERS! That’ll get you scratching your head, that’s for darn sure. It can get quite messy to keep up with the flow of it all. I’m so used to Python where its VM takes care of collecting all that garbage that’s left on the curbs. Either way, I’m trying to keep my head cool and step away when I feel like my brain is getting fried. I’ve noticed that doing so helps me understand things better when I come back to it with a fresher perspective.
I remember a guy I used to work with a long, long time ago, a C/C++ programmer, who told me that even though Python is easy to learn and widely recommended as a first language, it teaches you the wrong reflexes. I think I get what he meant now. For one, in Python you don't need to worry about variable types as much, because they are dynamic, meaning they can change on the fly. It sounds like a minor detail, but it messes with my head a lot.
In Python, you can do something like this:
x = 10 # Python infers that 'x' is an integer
x = "Hello" # Now 'x' holds a string
x = [1, 2, 3] # Now 'x' is a list (array in C)
This is not something you can do in C, where you must declare a type for every variable up front, and that type can't change.
int x = 10;
x = "Hello"; # The compiler will say 'Nah bruh, ain't nobody can do this 'round here.
Static typing, like in C, forces you to think many steps ahead to ensure that things run smoothly. Python, being dynamically typed, reinforces a bit of carelessness, because you never have to think about this. Now, if you never write anything but Python code, that won't necessarily be a problem for you (assuming you're O.K. with poor performance). However, changing types on the fly like this can, and most certainly will, introduce bugs in your code. Some function down the line will expect a different input than what you're feeding it, because you changed the type of a variable you introduced to your codebase three years ago. No bueno.
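Here's a made-up Python sketch of the kind of bug I mean (the names and numbers are invented for illustration, not from any real project):
price_cents = 1050  # starts life as an integer number of cents

def add_tax(amount_cents):
    # expects an int; nothing in the language stops you from passing something else
    return amount_cents + round(amount_cents * 0.15)

# ...much later, someone reuses the same name for a display string
price_cents = "$10.50"

add_tax(price_cents)  # TypeError at runtime: can't multiply sequence by non-int
A C compiler would reject that reassignment before the program ever ran. Python only complains when the bad line actually executes, which might be long after the change was made.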
I was going to talk about pointers, but I feel highly unqualified at the moment, so I’ll talk to you next time when pointers are not a complete mess in my head.
. The whys . May 25, 2025 .
I've learned Python haphazardly and coded a few projects, but if one spends even a modest amount of time in the field of computers, in the nitty gritty of software or hardware, it becomes painfully obvious that Python is not it. Unless all you dabble in is AI or scripting, you realize how limited/limiting Python is. I've made a few small hardware projects with microcontrollers using a stripped-down version of Python called MicroPython. It worked, yes, but with tremendous limitations. It takes a lot of memory (in microcontroller terms) to run MicroPython, so not many people use it to write software for hardware projects, and in turn it doesn't have nearly as many libraries as C/C++ (around 300, as opposed to C/C++'s tens of thousands).
Anyways, I could go on and on bashing Python, but that's not the point, and to be fair, it's actually a good first language that's easy to learn. But we all grow up at some point. There are a few points I'd like to elaborate on as to why I've finally decided to learn C.
Why C?
As I was making the goose writer, I ran into a lot of roadblocks that ended up deciding the hardware and software choices. My initial goal was to create a device powered solely by one or two microcontrollers, with an ePaper display as the main display. I had those two building blocks in mind. One of my priorities was to get the longest battery life I could muster. To achieve this, I would have to choose the right libraries and optimize the code to cut out unnecessary CPU cycles. It didn't take long to realize that it was futile to try this with the language I knew (Python/MicroPython), for the simple reason that Python/MicroPython are resource gluttons. The wrong tool for the job.
Other than embedded systems, there are many other areas of computing where learning a lower-level language like C is tremendously valuable. Most modern operating systems are written largely in C, because C gives you direct access to memory and thus fine-grained control over how the program runs. This is also why most game engines are written in C or C++. One of my goals this year is to start developing a small game in C for the Playdate (it has a C SDK).
But, won’t AI send software development into irrelevancy?
Whether it does or it doesn't, I'll leave that debate to people who know what they're talking about. As for how I feel about it: I've arrived at a place in my mind where this very question becomes irrelevant, because my ambitions and intentions are probably very different from those of most people willing to tackle the difficult process of learning a language.
3 reasons why I chose to embark on this journey
- AI hinders your brain. It merely gives you half-assed results and obscures the process that went into them. Putting the quality of the output aside, AI hurts the learning process in the long run. There are ways to use AI that aid your learning, but you have to be careful. It is very easy to slip and get the AI to spit out the answer for you, but there's no learning in that, only leaning. Leaning, as on a crutch. Learning happens in the process between an idea and its result; if I go directly from the idea to the result, I skip all the learning. It can't be fast-tracked like that. I'm not in a rush, and for someone who isn't in a rush and for whom learning is the point, slow and steady is actually the best, and arguably the quickest, way to drive it into your brain so that it stays there. Repetition, repetition, repetition. ColdFusion made a great video about this that goes into further detail and explains it a lot better than I can.
- Interest, and learning for the sake of learning. The classical way, the self-sufficient way. A quote from Christopher McCandless (from the movie Into the Wild) has always stayed with me and encapsulates this point very well: "how important it is in life not necessarily to be strong, but to feel strong, to measure yourself at least once, to find yourself at least once in the most ancient of human conditions, facing the blind, deaf stone alone with nothing to help you but your hands and your own head." Now, we're not facing the blind and deaf stone, but you get the point. Learning a language is a long and arduous process, but a very rewarding one. To feel strong from your own knowledge is priceless. Nobody can take that away from you.
- I have a family history of dementia, and now, in my mid-30s, I've decided to consciously make the effort of training my brain like a muscle. Without mental health, we have nothing. Studies suggest that being bilingual can delay the onset of the condition by up to five years. I'm already fluent in French and English, but one is never too proactive. "But natural languages and computer languages are different," I hear you say. Yes, they are. However, a Vice article entitled Can Learning to Code Delay Alzheimer's? notes that "Professor Janet Siegmund of the University of Passau and her colleagues ran fMRI brain scans on 17 volunteers for a study in 2014." and that "They found the first empirical evidence that both natural language and programming language require the same areas in the brain and that based on this, they've inferred that understanding programming languages and natural languages appear to be similar."
There it is. This is why I'm doing this. I'll start this journey on Boot.dev, as I've heard a lot of positive things about it, while also reading The C Programming Language and writing some code as I go. I'll keep this log updated by writing here, here and there.