A picture book written in C code


  • Banned

    @Byte9 said in A picture book written in C code:

    @Gąska Here's an interesting study that shows "a faster language is not always the most energy efficient": https://thenewstack.io/which-programming-languages-use-the-least-electricity

    Thanks for the link! Although it's a bit of a bait-and-switch article, as it doesn't explain what actually consumes the energy. They only hypothesize it might be related to DRAM usage. That's quite surprising - you'd think it'd be a fairly easy question to answer by just plugging meters in various places.

    My point was that C programming is still the standard for embedded systems and is useful for beginners programming the Arduino.

    No doubt about that. It's just that embedded is such a small part of the IT world overall that it, like, skews the needs of the general audience. But as I said - it doesn't really matter for such a book.

    As for my book on Kickstarter, I've written all the code and I will be completing the illustrations with an artist.

    I see. I guess it makes sense to do it that way, although doing it "backwards" could possibly make things more interesting and non-standard - but also much harder. Thanks for the answer!


  • Considered Harmful

    @HardwareGeek The simplest method isn't difficult. You reduce any given image to a certain number of squares, each containing a normalized average of the color and/or brightness data therein. This gives you hash values. Then you use some distance function to decide how different they are. Slightly different lighting, a different angle, a resized image, or minuscule aspect-ratio changes will all result in very similar "blur maps".

    However, any image with blue sky over a greenish field, for example, will be considered similar at some point. Clever mathematics can only do so much when it's still just a one-way compression function and doesn't know what the picture actually shows.
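
    For illustration, a minimal sketch of that "blur map" scheme in C - assuming the image is already decoded into an 8-bit grayscale buffer at least GRID pixels on each side; the 8x8 grid and Euclidean distance are arbitrary choices, not the only options:

    #include <math.h>
    #include <stddef.h>

    #define GRID 8  /* hash resolution: 8x8 averaged cells */

    /* Reduce a w-by-h grayscale image to GRID*GRID normalized cell averages. */
    void blur_map(const unsigned char *pix, size_t w, size_t h, double out[GRID * GRID]) {
        for (int cy = 0; cy < GRID; cy++) {
            for (int cx = 0; cx < GRID; cx++) {
                size_t x0 = cx * w / GRID, x1 = (cx + 1) * w / GRID;
                size_t y0 = cy * h / GRID, y1 = (cy + 1) * h / GRID;
                double sum = 0.0;
                for (size_t y = y0; y < y1; y++)
                    for (size_t x = x0; x < x1; x++)
                        sum += pix[y * w + x];
                out[cy * GRID + cx] = sum / ((x1 - x0) * (y1 - y0)) / 255.0;
            }
        }
    }

    /* Distance between two maps; below some tuned threshold, call them "similar". */
    double map_distance(const double a[GRID * GRID], const double b[GRID * GRID]) {
        double d = 0.0;
        for (int i = 0; i < GRID * GRID; i++)
            d += (a[i] - b[i]) * (a[i] - b[i]);
        return sqrt(d);
    }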



  • @Gąska said in A picture book written in C code:

    @Byte9 also, I categorically disagree with the statement in the trailer at timestamp 0:56-1:00. No, knowing C is rarely ever useful, and certainly it doesn't make you any better at more modern languages based on C - if anything, it makes you worse, so suggesting people should learn C first is actively harmful.

    Man, you would have loved the CS department at the university I went to. People who had ostensibly never done any sort of programming before were cast off into the wilderness with C and doing all their work SSH'ed into a Solaris cluster. In their defense, this was circa 2003, and the department would switch over to Java in a couple years for entry-level classes, but they were still trying to fail out as many students as possible (one lecturer even admitted as much to his students).


  • Trolleybus Mechanic

    @Groaner said in A picture book written in C code:

    @Gąska said in A picture book written in C code:

    @Byte9 also, I categorically disagree with the statement in the trailer at timestamp 0:56-1:00. No, knowing C is rarely ever useful, and certainly it doesn't make you any better at more modern languages based on C - if anything, it makes you worse, so suggesting people should learn C first is actively harmful.

    Man, you would have loved the CS department at the university I went to. People who had ostensibly never done any sort of programming before were cast off into the wilderness with C and doing all their work SSH'ed into a Solaris cluster. In their defense, this was circa 2003, and the department would switch over to Java in a couple years for entry-level classes, but they were still trying to fail out as many students as possible (one lecturer even admitted as much to his students).

    Am I the only person who found C perfectly clear and generally easy from pretty much day 1 (that is: picking up my Dad's copy of K&R)?


  • Banned

    @GOG it is perfectly clear. So is Russian roulette.


  • Trolleybus Mechanic

    @Gąska It's not my fault if people can't keep their pointers pointing in the right direction. 🤷‍♂️


  • Java Dev

    @Gąska said in A picture book written in C code:

    @GOG it is perfectly clear. So is Russian roulette.

    dd if=/dev/random of=/dev/kmem seek=${RANDOM} bs=1 count=1


  • Banned

    @PleegWat

    Some time ago I had a crazy/funny idea for a local privilege escalation: run a privilege-granting operation in an infinite loop and wait for a random bit flip in CPU/RAM that would make a 'can this user do this' check return 'true' instead of 'false'. Is this theoretically possible? Yes. And practically? Almost impossible, due to the unlikelihood of a bit flip and, even more, the unlikelihood of a bit flip in just the right place. Nevertheless, I thought this idea was quite interesting and decided to dig into the topic. This post will summarize what I've found out and mention a few papers/posts that might be worth reading.

    As a starting note: most of the data I found is somewhat outdated (from 2003, etc.), so links to newer data are most welcome!
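
    A hedged sketch of the idea in C (check_permission is a hypothetical stand-in for whatever access check the privileged operation performs; the point is only the shape of the loop):

    /* Hypothetical: assume check_permission() normally returns 0 ("no") for this user. */
    extern int check_permission(void);

    void wait_for_bit_flip(void) {
        for (;;) {
            if (check_permission()) {
                /* a stray bit flip finally turned "no" into "yes" */
                break;
            }
            /* otherwise keep re-running the check and hope for cosmic rays */
        }
        /* ...perform the now-"authorized" operation... */
    }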



  • @GOG said in A picture book written in C code:

    @Groaner said in A picture book written in C code:

    @Gąska said in A picture book written in C code:

    @Byte9 also, I categorically disagree with the statement in the trailer at timestamp 0:56-1:00. No, knowing C is rarely ever useful, and certainly it doesn't make you any better at more modern languages based on C - if anything, it makes you worse, so suggesting people should learn C first is actively harmful.

    Man, you would have loved the CS department at the university I went to. People who had ostensibly never done any sort of programming before were cast off into the wilderness with C and doing all their work SSH'ed into a Solaris cluster. In their defense, this was circa 2003, and the department would switch over to Java in a couple years for entry-level classes, but they were still trying to fail out as many students as possible (one lecturer even admitted as much to his students).

    Am I the only person who found C perfectly clear and generally easy from pretty much day 1 (that is: picking up my Dad's copy of K&R)?

    For the most part, I did too. Pointers took a little while to understand at first, but aside from that it's pretty accessible. Of course, by the time I learned C I was already somewhat familiar with programming in other languages.


  • Trolleybus Mechanic

    @hungrier #MeToo, but it was mostly Basic, which is Considered Harmful.



    @GOG I had started with Basic at 9-10 years old on the family computer, had Pascal and a bit of Java in high school, and some more Java in university, concurrent with the C course. I had tried to learn C myself a couple of times before from various online tutorials, but had always got stuck on pointers until uni.



  • @hungrier said in A picture book written in C code:

    @GOG said in A picture book written in C code:

    @Groaner said in A picture book written in C code:

    @Gąska said in A picture book written in C code:

    @Byte9 also, I categorically disagree with the statement in the trailer at timestamp 0:56-1:00. No, knowing C is rarely ever useful, and certainly it doesn't make you any better at more modern languages based on C - if anything, it makes you worse, so suggesting people should learn C first is actively harmful.

    Man, you would have loved the CS department at the university I went to. People who had ostensibly never done any sort of programming before were cast off into the wilderness with C and doing all their work SSH'ed into a Solaris cluster. In their defense, this was circa 2003, and the department would switch over to Java in a couple years for entry-level classes, but they were still trying to fail out as many students as possible (one lecturer even admitted as much to his students).

    Am I the only person who found C perfectly clear and generally easy from pretty much day 1 (that is: picking up my Dad's copy of K&R)?

    For the most part, I did too. Pointers took a little while to understand at first, but aside from that it's pretty accessible. Of course, by the time I learned C I was already somewhat familiar with programming in other languages.

    I never had any problems understanding pointers, but as a hardware guy, I was already very familiar with the concept of memory locations, the difference between a location and the value stored in it, and the possibility that just because you have an address doesn't mean the thing contained at that address is what you think it is. But, like you, by the time I learned C, I was already familiar with several other languages: BASIC, FORTRAN, assembly for at least one architecture, and I had just finished a 6-month internship writing MAINSAIL. I wonder what ever happened to my old, pre-ANSI edition of K&R.


  • Trolleybus Mechanic

    @hungrier Sounds a lot like myself, 'xcept I never got into Pascal and back when I was in school Java didn't exist yet.

    I guess C made sense to me because I started in the 8-bit era and that necessarily meant programming close to the metal, as it were.



  • @GOG I've been exposed to C multiple times. The first time I was too inexperienced, didn't know there was a difference between C and C++, and couldn't even get the syntax right. I gave up after I realised C didn't even have string interpolation. ("what a useless language!") Much later, having obtained some real experience with Perl and Octave, I found pointers easy to understand, because they are "sort of like references". I guess it helped that references are usually explained in terms of pointers.

    Even later I stumbled upon an opinionated programming course where the author insisted on teaching people to program in three stages: (1) make them accustomed to control flow, algorithms and compilation, using Pascal and no pointers; (2) take a short dip into assembly to show what really happens inside and explain pointers; (3) switch to C to bring back the wonders of structural programming and combine them with newly discovered pointer shenanigans. The author stressed that, in his experience, starting with C right away bears a high risk of damaging students, either leaving them without any understanding of pointers or making them likely to abuse side effects, resulting in unmaintainable code.



  • @aitap said in A picture book written in C code:

    starting with C right away bears a high risk of damaging students

    I doubt you'll get much disagreement with that.

    (1) make them accustomed to control flow, algorithms and compilation, using Pascal and no pointers; (2) take a short dip into assembly to show what really happens inside and explain pointers; (3) switch to C to bring back the wonders of structural programming and combine them with newly discovered pointer shenanigans.

    Replace Pascal with FORTRAN (:belt_onion:), and that sounds not too unlike my experience.


  • Banned

    @aitap said in A picture book written in C code:

    Even later I stumbled upon an opinionated programming course where the author insisted on teaching people to program in three stages: (1) make them accustomed to control flow, algorithms and compilation, using Pascal and no pointers; (2) take a short dip into assembly to show what really happens inside and explain pointers; (3) switch to C to bring back the wonders of structural programming and combine them with newly discovered pointer shenanigans. The author stressed that, in his experience, starting with C right away bears a high risk of damaging students, either leaving them without any understanding of pointers or making them likely to abuse side effects, resulting in unmaintainable code.

    That sounds very interesting. Although my personal opinion, untested in a real classroom, is that starting with unmanaged languages makes many people put an unhealthy amount of focus on micro-optimizations and keep forgetting about the big picture. I see it all the time in semi-advanced coders who started with C++.

    @HardwareGeek said in A picture book written in C code:

    @aitap said in A picture book written in C code:

    starting with C right away bears a high risk of damaging students

    I doubt you'll get much disagreement with that.

    *looks at OP* :um-nevermind:


  • Trolleybus Mechanic

    @aitap Broadly agree with the three-step program, although it should probably be... ahem... pointed out that you can do the first step with C. In fact, there's a whole bunch of stuff you can do with C without needing to touch pointer arithmetic at all: K&R don't get to it until pretty late in the book.

    I honestly don't know how programming is taught these days, but back in my day it was absolutely expected that the student would be familiar with the internal workings of a computer. You wouldn't necessarily be doing any assembly (but it helps), but you would absolutely be expected to understand stuff like memory addresses, the program stack, etc. This knowledge is much less useful with modern systems (due to the various layers of abstraction; I remember how challenging the move to Windows 3.1 programming was for me), but I believe it still gives you much better grounding for whatever you'll be doing.

    Funnily, in the back of my mind I still think of strings as arrays of characters - despite the fact that these days I work in C# which has them as a built-in type.
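
    In C terms, that mental model is literal (a tiny refresher, nothing more):

    #include <stdio.h>

    int main(void) {
        char s[] = "cat";  /* really an array of four chars: 'c', 'a', 't', '\0' */
        s[0] = 'b';        /* poke one element and the "string" becomes "bat" */
        printf("%s\n", s);
        return 0;
    }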


  • Considered Harmful

    @GOG said in A picture book written in C code:

    how programming is taught these days

    npm init
    npm install electron
    npm install fuh-fuh-fuh
    ...
    electron .
    


  • @GOG I'm not including pointers in this book, just to let you know. It's a little too much for one picture book. :) As you said, there's a lot you can learn as a beginner in C without getting to pointers. This includes constructs that apply to other languages as well, like for loops, while loops, if/else, arrays, functions, structures, etc.
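
    For instance, a beginner snippet touching most of those constructs, with no pointer in sight (a sketch of the style, not code from the book):

    #include <stdio.h>

    struct point { int x, y; };  /* a structure */

    int square(int n) {          /* a function */
        return n * n;
    }

    int main(void) {
        int squares[5];                 /* an array */
        for (int i = 0; i < 5; i++)     /* a for loop */
            squares[i] = square(i);

        struct point p = { 3, 4 };
        if (p.x < p.y)                  /* if/else */
            printf("x is smaller\n");
        else
            printf("x is not smaller\n");

        int i = 0;
        while (i < 5) {                 /* a while loop */
            printf("%d ", squares[i]);
            i++;
        }
        printf("\n");
        return 0;
    }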


  • 🚽 Regular

    @PleegWat said in A picture book written in C code:

    That might be part of the reason why C programmers like print-based debugging so much.

    Flashbacks to having delayed a college project for a couple of hours after adding a printf for debugging, where the format string had a %s which should have been a %d.
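
    The one-character difference in question (a minimal reconstruction, not the actual project code):

    #include <stdio.h>

    int main(void) {
        int count = 42;
        /* printf("count = %s\n", count); -- the bug: %s makes printf chase 42 as a string pointer (undefined behavior) */
        printf("count = %d\n", count);    /* what was intended */
        return 0;
    }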



  • @Byte9 said in A picture book written in C code:

    As you said, there's a lot you can learn as a beginner in C without getting to pointers.

    Sure, there are plenty of other ways to shoot yourself in the foot; you don't necessarily need pointers for that, although they certainly make it a lot easier. :half-trolling:

    Filed under: footgun



  • @dkf said in A picture book written in C code:

    @PleegWat said in A picture book written in C code:

    That might be part of the reason why C programmers like print-based debugging so much.

    I like printf-based debugging, but that's often because I'm running code in weird configurations where attaching a standard debugger is difficult or outright impossible.

    Next, you'll be saying that console.log() is not the epitome of web debugicating!



  • @dkf said in A picture book written in C code:

    @PleegWat said in A picture book written in C code:

    That might be part of the reason why C programmers like print-based debugging so much.

    I like printf-based debugging, but that's often because I'm running code in weird configurations where attaching a standard debugger is difficult or outright impossible.

    Most of the C I run is on embedded processors in (software models of) chips that simply don't have any debugger support. (Technically, if it's an ARM CPU, it probably supports ARM's SWD interface, but our test environment usually doesn't support it, and it's painfully slow in software simulation of the chip.) There isn't even a console output; custom printf and similar functions either write the string into a fixed-location buffer from which the test environment can copy the strings to the stdout of the computer running the test, or call a function in the simulation runtime that generates the output. The alternative to printf-based debugging is tracing machine instructions through the CPU's fetch-decode-execute pipeline and matching the instruction pointer to the corresponding source, and ain't nobody got time for that, although it is occasionally necessary.

    Edit: Me done grammared good.
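
    A hedged sketch of that fixed-buffer scheme (the address, size, and function name here are invented for illustration; real test environments differ):

    #include <stdarg.h>
    #include <stdio.h>

    /* Hypothetical fixed-location window that the test harness polls and
       copies to the stdout of the machine running the test. */
    #define LOG_BUF_ADDR 0x2000F000u
    #define LOG_BUF_SIZE 256u

    static volatile char *const log_buf = (volatile char *)LOG_BUF_ADDR;

    void sim_printf(const char *fmt, ...) {
        char tmp[LOG_BUF_SIZE];
        va_list ap;

        va_start(ap, fmt);
        vsnprintf(tmp, sizeof tmp, fmt, ap);  /* format into a local buffer */
        va_end(ap);

        /* copy into the shared window, including the terminating NUL */
        for (unsigned i = 0; i < LOG_BUF_SIZE; i++) {
            log_buf[i] = tmp[i];
            if (tmp[i] == '\0')
                break;
        }
    }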



  • @GOG said in A picture book written in C code:

    Am I the only person who found C perfectly clear and generally easy from pretty much day 1 (that is: picking up my Dad's copy of K&R)?

    No, this is common with people already familiar with assembler. Picking up C as a "portable assembler" is quite natural.

    Now if you try to dump C on people who do not know assembler, machine code, and the way a "standard" computer architecture works... it does not make any sense. Actually, even Pascal pointers are very, very hard to explain. I have some experience with that (as a grad student, I TA'd a basic programming "101" course and an advanced "401" C/assembler course).

    It is natural for people who approach computers as "electronic machines" and work their way up. It is, however, not the only way - you can also start from mathematics, approach computers as "mathematical machines", and work your way down. Both approaches are valid, both have a long history, both are still relevant, but from my experience I'd say the second one is probably better today, with functional languages (unless you really know from the very start that the only thing you'll ever need is embedded programming).


  • Discourse touched me in a no-no place

    @sloosecannon said in A picture book written in C code:

    @Byte9 said in A picture book written in C code:

    Python is slow for embedded systems/microcontrollers, and also has higher energy consumption!

    Well sure, but that doesn't matter unless you're @dkf

    Horses for courses. Python's not bad as a user interface for some types of users. Some of the performance impacts can be lessened using Numpy, if you know what you're doing (as it lets you have one box per matrix rather than one box per cell, a good win). Other bits, they have to go in a lower level language because of their performance requirements.

    Big embedded systems (on the scale of a first-generation Pi or more) can use Python just fine for their non-critical parts.


  • Discourse touched me in a no-no place

    @Gąska said in A picture book written in C code:

    It's quite surprising - you'd think it'd be fairly easy question to answer by just plugging meters in various places.

    It's surprisingly difficult. Power monitoring systems tend to have a fairly slow response time, a hundred samples per second is fast. Measuring the power consumption of systems with clock speeds in megahertz or gigahertz using that sort of frequency is ridiculous; you just get noise, or what might as well be. (I've seen weirder things than that, such as the sampling frequency being just right for making the shape of the actual power curve come out backwards! And with a massively longer timescale. I had to totally rewrite how we did power monitoring in the wake of that, making it more like profiling, since that we could at least drive at megahertz rates.)
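
    A toy illustration of how that backwards curve can happen (the numbers are invented; the point is that sampling just below the waveform's repetition rate yields the same shape at a negative alias frequency, i.e. slowed down and time-reversed):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double pi = 3.14159265358979323846;
        const double fs = 100.0;  /* power meter: 100 samples per second */
        const double f  = 99.7;   /* actual repetition rate of the load's power curve, in Hz */

        /* The samples trace the waveform at an apparent f - fs = -0.3 Hz:
           the same shape, hugely stretched in time, running backwards. */
        for (int n = 0; n < 8; n++)
            printf("sample %d: %+.3f\n", n, sin(2 * pi * f * n / fs));
        return 0;
    }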


  • Trolleybus Mechanic

    @Kamil-Podlesak Fair point about assembly language, although I would honestly expect some minimum knowledge of how computers work from anyone aspiring to be a programmer.

    Personally, when I learned C I was "familiar" with assembly in the sense of having a general idea of what it is and how it works. I was by no means proficient with it. However, I did have a fairly good idea of how memory works (in the 8-bit/16-bit era, you kind of had to) so "a pointer stores the memory address of a value" - and the implications thereof - were simple enough to understand.

    At the same time, I realize that these days it's much harder to get that level of understanding, because we're usually no longer working with the actual hardware. I generally wouldn't recommend someone start with C in this day and age, because it's typically not the best tool for the job.


  • Discourse touched me in a no-no place

    @GOG said in A picture book written in C code:

    Am I the only person who found C perfectly clear and generally easy from pretty much day 1 (that is: picking up my Dad's copy of K&R)?

    It took me a couple of attempts. First time round, it was utterly confusing and I couldn't make head or tail of it. Second time (after I'd learned Pascal, so a bit under a year later) it was pretty easy.


  • Discourse touched me in a no-no place

    @Gąska quoted an article with the title in A picture book written in C code:

    Random bit flips in hardware and security

    They do happen, and are caused by a mix of radioactive materials in the system construction and cosmic rays. Not often in the CPU itself, but definitely in memory. Our figures suggest that static RAM is more susceptible to trouble than DRAM, and that the rate is really low until you have very big deployments (such as a whole cluster or datacenter). Moreover, ECC RAM mitigates this very strongly, since the overwhelmingly most common case is a single random bit flip, and that's easy to correct at the hardware level. The need to support that sort of thing is one of the main reasons why most computer memories are managed in units larger than bytes (and often actually larger than words; it's the sort of thing that a memory controller does).


  • Discourse touched me in a no-no place

    @skotl said in A picture book written in C code:

    @dkf said in A picture book written in C code:

    @PleegWat said in A picture book written in C code:

    That might be part of the reason why C programmers like print-based debugging so much.

    I like printf-based debugging, but that's often because I'm running code in weird configurations where attaching a standard debugger is difficult or outright impossible.

    Next, you'll be saying that console.log() is not the epitome of web debugicating!

    I'm typically debugging systems where there's multiple components running on different machines and/or in different security contexts, and all communicating with each other in real time. Logging works better for debugging that than a standard debugger.


  • Discourse touched me in a no-no place

    @Kamil-Podlesak said in A picture book written in C code:

    Actually, even Pascal pointers are very, very hard to explain.

    They're almost exactly as difficult as C pointers. The hard part isn't “oh, this indicates some storage somewhere else” but rather the level of indirection that this permits. Indirection is very hard to understand.

    C pointers have additional problems relating to address arithmetic, but that's not where the real trouble lies. (Horror variable in C: void ****ptr;)
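
    To illustrate just the indirection part with a tamer two-star example (a toy, of course):

    #include <stdio.h>

    int main(void) {
        int v = 7;
        int *p = &v;    /* one level: p refers to v */
        int **pp = &p;  /* two levels: pp refers to p, which refers to v */

        **pp = 8;           /* follow both arrows and write: v is now 8 */
        printf("%d\n", v);  /* prints 8 */
        return 0;
    }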

    It is natural for people who approach computers as "electronic machines" and working your way up. It is, however, not the only way - you can also start from mathematics, approach computers as "mathematical machines" and work your way down. Both approaches are valid, both have long history, both are still relevant, but from my experience I say that the second one is probably better today, with functional languages (unless you really know from the very start that the only thing you ever need is embedded programming).

    I think the most important early lesson of programming is that the computer does what it is actually told to do, not what you think you told it to do. The second most important lesson relates to splitting tasks up. These things are common to all approaches to programming.



  • C is quite easy to understand in its basic form, but it really falls down with its antiquated approach to strings. The first thing any novice programmer is likely to want to do is along the lines of

    Console.WriteLine("Please enter your name:");
    var name = Console.ReadLine();
    Console.WriteLine("Hello, " + name);

    That is unreasonably difficult in C, because it doesn't have strings as a first class type (not even through std::string type fakery).
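
    For comparison, a minimal C version of those same three lines (one plausible rendering; even this has to pick a buffer size and strip a newline by hand):

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char name[64];  /* fixed-size buffer: a decision C forces on the novice immediately */

        printf("Please enter your name:\n");
        if (fgets(name, sizeof name, stdin) != NULL) {
            name[strcspn(name, "\n")] = '\0';  /* strip the newline fgets keeps */
            printf("Hello, %s\n", name);
        }
        return 0;
    }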

    These days I'd probably start people on C#. The OO wrapper boilerplate on everything is a bit of a faff, but it's faff that you'll see in almost every real application anyway, and introducing classes and OO principles early is no bad thing.


  • Trolleybus Mechanic

    I agree with everything @bobjanova just said.


  • Considered Harmful

    @bobjanova said in A picture book written in C code:

    introducing classes and OO principles early is no bad thing

    But it needs to be done very carefully. If it's done too early, it can happen (and often does, which is one of the reasons why people don't get it) that a bunch of classes get written just because you've got to have classes. Students then fail to see their purpose and wonder why the task at hand is getting so complicated and introducing so many failure modes, when they just wanted to sort the apples by color or something.


  • Discourse touched me in a no-no place

    @Applied-Mediocrity It's inheritance that's trouble (because it's another form of indirection). A simple object that just acts as a holder of some bits of related state, that's pretty easy to understand.


  • Considered Harmful

    @dkf It doesn't help that even the simplest OOP examples you'll often see are variations of the notorious Dog extends Animal. Features are dangerous. They want to be used. And the languages that are most fit for business use are, often by necessity, full of features.


  • 🚽 Regular

    @bobjanova said in A picture book written in C code:

    The OO wrapper boilerplate on everything is a bit of a faff,

    Good news!


  • Trolleybus Mechanic

    @Zecc said in A picture book written in C code:

    @bobjanova said in A picture book written in C code:

    The OO wrapper boilerplate on everything is a bit of a faff,

    Good news!

    That is good news. Actually, there's a whole bunch of pretty interesting things there.


  • Considered Harmful

    @GOG said in A picture book written in C code:

    there's a whole bunch of pretty interesting things there

    And still no extension properties 😢


  • Banned

    @Applied-Mediocrity said in A picture book written in C code:

    @bobjanova said in A picture book written in C code:

    introducing classes and OO principles early is no bad thing

    But it needs to be done very carefully. If it's done too early, it can happen (and often does, which is one of the reasons why people don't get it) that a bunch of classes get written just because you've got to have classes. Students then fail to see their purpose and wonder why the task at hand is getting so complicated and introducing so many failure modes, when they just wanted to sort the apples by color or something.

    That's why I think that for the first year or so, it's best to teach classes without inheritance.

    Edit: :hanzo:



  • @Gąska In my (self-taught) opinion, the uses for inheritance in modern code basically come down to

    • GUI components
    • implementing INotifyPropertyChanged in C# apps. Which is basically about GUI work. And there you make a stupid-simple base class that handles it for you and inherit where you need to. You could even avoid it by just shoving that boilerplate into all your classes - it's totally generic and doesn't change from class to class either.

    Everything else is better done with interfaces (if available) and/or composition.

    Lots of people are now going to jump all over me and tell me where I'm wrong, this being TDWTF.


  • Banned

    @Benjamin-Hall pretty much. That's why I don't consider the lack of inheritance in Rust to be a big problem. But from a practical standpoint, you have to learn inheritance eventually, just so you can understand other people's code.



  • @Gąska said in A picture book written in C code:

    @Benjamin-Hall pretty much. That's why I don't consider the lack of inheritance in Rust to be a big problem. But from a practical standpoint, you have to learn inheritance eventually, just so you can understand other people's code.

    Correct. But it means you can learn it way later and don't have to do it right away.

    But then again, "modern code" and "code written in C/C++" are largely at odds. Because the conventions of those languages (at least at the beginning level) are anything but friendly to most of the modern styles that let you avoid inheritance and manual bit banging.



  • @Kamil-Podlesak said in A picture book written in C code:

    Actually, even Pascal pointers are very, very hard to explain.

    Not particularly. I don't recall anyone in my Pascal class in high school having much trouble with them. (What really messed with most folks was PRNGs, bit twiddling, and string manipulation.) It's when you get into C that pointers become really scary, despite Pascal's records and pointers being functionally pretty much identical to C's structs and pointers.

    I think a large part of it is actually the syntax. Pascal's pointer mnemonics are just an order of magnitude better than C's. In Pascal, you use @ to take the address of something and create a pointer to it, and ^ to dereference a pointer. One literally means "at", as in "where is this located at in memory?" The other is a pointy thing stuck after a pointer variable, so, "what this variable points to."

    In C, you use * and &. Both of these are also mathematical operators used elsewhere in the language, and neither one has any particular mnemonic significance to tie them to the concepts they represent. Which one is the address-of operator and which is the dereference operator? How are you even supposed to tell?

    When you're working with any concept whose syntactical representation is thoroughly unrelated to its semantics, it's inherently difficult to learn and understand. C's pointer syntax is a shining example of how to do it wrong. (And that's not even getting into the massive :wtf: that is array-to-pointer decay. That's one point where C's and Pascal's pointer models are objectively not equivalent; Pascal's is far better in this specific regard.)
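
    A side-by-side feel for the two C operators (Pascal spellings noted in the comments):

    #include <stdio.h>

    int main(void) {
        int x = 42;
        int *p = &x;  /* &x: address-of; Pascal spells this @x */
        int y = *p;   /* *p: dereference; Pascal spells this p^ */

        printf("%d\n", y);  /* prints 42 */
        return 0;
    }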


  • Considered Harmful

    @Mason_Wheeler Clearly, what we need is an :arrows: operator.


  • Notification Spam Recipient

    @dkf said in A picture book written in C code:

    Fitting such a language onto an Arduino might be a bit tricky,

    So long as you have a reliable debug connection system, I'd say you don't need to run an interpreter on the device regardless.

    Honestly kinda surprised Arduino doesn't have a basic sketch that does this, actually...


  • Banned

    @Mason_Wheeler said in A picture book written in C code:

    When you're working with any concept whose syntactical representation is thoroughly unrelated to its semantics, it's inherently difficult to learn and understand.

    Looking back at my school years, and in particular physics classes - I don't think I agree. Sure, to an English-speaking person it might make sense that force, power and velocity are denoted as F, P and v respectively. But in Polish the words are "siła", "moc" and "prędkość". It's not just that the first letters don't match - none of the letters match. I haven't noticed anyone having any problem remembering which is which, nor did it slow them down when solving problems (sure, many people sucked, but they sucked because they couldn't even understand the idea behind "siła").

    Speaking of mnemonics - has there ever been any scientific evidence that they actually make things easier to remember?



  • @Benjamin-Hall said in A picture book written in C code:

    But then again, "modern code" and "code written in C/C++" are largely at odds.

    Strongly disagree w.r.t. C++. You've probably just never seen a modern code base.

    The build infrastructure, OTOH, yeah, that's still stuck in the 80s.


  • Banned

    @dfdub while modern C++ is light-years ahead of C++03, it's still incredibly kludgy, needlessly verbose, and filled to the brim with deadly pitfalls compared to almost every other popular language. And as Rust has shown, being close to the metal has nothing to do with it.


  • BINNED

    @Mason_Wheeler said in A picture book written in C code:

    @Kamil-Podlesak said in A picture book written in C code:

    Actually, even Pascal pointers are very, very hard to explain.

    Not particularly. I don't recall anyone in my Pascal class in high school having much trouble with them. (What really messed with most folks was PRNGs, bit twiddling, and string manipulation.) It's when you get into C that pointers become really scary, despite Pascal's records and pointers being functionally pretty much identical to C's structs and pointers.

    I think a large part of it is actually the syntax. Pascal's pointer mnemonics are just an order of magnitude better than C's. In Pascal, you use @ to take the address of something and create a pointer to it, and ^ to dereference a pointer. One literally means "at", as in "where is this located at in memory?" The other is a pointy thing stuck after a pointer variable, so, "what this variable points to."

    In C, you use * and &. Both of these are also mathematical operators used elsewhere in the language, and neither one has any particular mnemonic significance to tie them to the concepts they represent. Which one is the address-of operator and which is the dereference operator? How are you even supposed to tell?

    When you're working with any concept whose syntactical representation is thoroughly unrelated to its semantics, it's inherently difficult to learn and understand. C's pointer syntax is a shining example of how to do it wrong. (And that's not even getting into the massive :wtf: that is array-to-pointer decay. That's one point where C's and Pascal's pointer models are objectively not equivalent; Pascal's is far better in this specific regard.)

    I don't think that's the problem. I agree that Pascal was the much saner language to learn in high school, but it still felt like almost nobody understood pointers, or for that matter the difference between pass-by-value and pass-by-reference (and what that means w.r.t. how changing a variable inside a function is or isn't reflected on the outside). For most people, it was just completely inscrutable.

    Honestly, I've never really got why the concept of a pointer/reference is so hard to grasp. Sure, you need to think about it for a bit the first time you're exposed to it, but it feels like it's not just pupils that struggle with it. A lot of beginner and not-so-beginner programmers don't understand it at all.
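
    The classic stumbling block, in C terms (the standard swap demonstration; nothing novel here):

    #include <stdio.h>

    /* Pass by value: the function gets copies, so the caller's variables are untouched. */
    void swap_value(int a, int b) { int t = a; a = b; b = t; }

    /* "Pass by reference", C style: pass pointers so the function can reach the originals. */
    void swap_ptr(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    int main(void) {
        int x = 1, y = 2;
        swap_value(x, y);
        printf("%d %d\n", x, y);  /* still 1 2 */
        swap_ptr(&x, &y);
        printf("%d %d\n", x, y);  /* now 2 1 */
        return 0;
    }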

