I learned the language of computer programming in my 50s
Post by Slinger on Sept 7, 2024 19:50:23 GMT
I'm not sure if anyone, except perhaps Martin, will find this interesting. I found it a good read though.
I learned the language of computer programming
in my 50s – here’s what I discovered
A writer with no technical background recounts his incredible journey into the realm
of coding and the invaluable lesson it taught him about the modern world
One day in 2017 I had a realisation that seems obvious now but had the power to shock back then: almost everything I did was being mediated by computer code. And as the trickle of code into my world became a flood, that world seemed to be getting not better but worse in approximate proportion. I began to wonder why.
Two possibilities sprang immediately to mind. One was the people who wrote the code – coders – long depicted in pop culture as a clan of vaguely comic, Tolkien-worshipping misfits. Another was the uber-capitalist system within which many worked, exemplified by the profoundly weird Silicon Valley. Were one or both using code to recast the human environment as something more amenable to them?
There was also a third possibility, one I barely dared contemplate because the prospect of it was so appalling. What if there was something about the way we compute that was at odds with the way humans are? I’d never heard anyone suggest such a possibility, but in theory, at least, it was there. Slowly, it became clear that the only way to find out would be to climb inside the machine by learning to code myself.
As a writer in my 50s with no technical background, I knew almost nothing about how code worked. But I had come across – and been intrigued by – coders when writing a magazine feature about Bitcoin a few years before. The cryptocurrency’s pseudonymous creator, Satoshi Nakamoto, had left few clues as to his identity before vanishing. Yet he had left 100,000 lines of code, which I found his peers reading like literature. I learned that there were thousands of programming languages used to communicate with the machines, including a few dozen big ones whose names tended to suggest either roses or unconscionably strong cleaning products (Perl, Ruby, Cobol, Go), and that each had its own distinct ethos and cultish band of followers, parlayed into subcultures as passionate and complete as the youth subcultures – punks, mods, goths, skinheads – I grew up with.
It seemed there could be rivalry, even mild animosity, between these tribes, a friction coders half-jokingly referred to as “religious wars” on the grounds that no one was ever going to change their mind. Suddenly, the coder’s realm looked rich and intriguing. Later, I spoke to a theoretical physicist who had been studying “high-frequency trading” on the stock market, wherein algorithms working outside human control fight to fool one another as to the market’s state. I was aghast but fascinated when he referred to this cosmos of code as “the first truly human-made ecosystem”. His team’s study was published, not in a physics or computing journal, but in Nature.
A residue of curiosity was all I had on my side as I set out to learn in a domain that proved quirkier – and often funnier – than I would have dared imagine. As with all code naifs, my first task was to choose a language. But on what basis? At length, I found an extraordinary website called freeCodeCamp, where I learned there was a classic trio of languages behind most websites and that many learners started with these. HTML, for Hypertext Markup Language, was created at the dawn of the World Wide Web by Tim Berners-Lee and is used to define the structure of a webpage, while CSS (Cascading Style Sheets) allows for the styling of HTML elements. Optionally, JavaScript could be used to animate those elements. I enjoyed the first two, working through my first code crisis and experiencing the joy of seeing the machine do something I’d intended. Until someone pointed out that I probably liked HTML and CSS because they weren’t “algorithmic”: I was just moving stuff around. This was coding, in other words, but not programming.
Some swearing happened then. Yet in my heart, I knew my choice hadn’t been random. Algorithms are slippery and hard to control in an essentially binary, alien and unforgiving environment, where a misplaced comma can cause a plane to crash or a satellite to explode. Obviously, part of me had wanted to avoid them. Then I looked at JavaScript, the powerfully algorithmic pillar of the web triad – and hated it.
At root, algorithms are simple things, mostly consisting of “if” statements (if “x” happens, do “y”; else do “z”) and “while loops” (so long as “x” applies, keep doing “y”; when “x” no longer applies, stop doing “y”). So by their nature, algorithms concentrate and reinforce what they are given. In principle, if those things are good, the world gets better; if they are bad, the world gets worse. In fact, it’s not so simple. My dismay at JavaScript was about more than discomfort with algorithms, though. Strange as it seemed for what I’d always thought of as a hyperrational realm, the primary problem was aesthetic. Emotional. Just looking at JavaScript, with its ugly flights of brackets and braces and unnecessary-seeming reams of semicolons, made me miserable. There also seemed to be 25 different ways to accomplish every task and these were constantly changing, turning the language into a kind of coding wild west. The more time I spent with it, the more I thought: “I can’t do this; coding’s not for me – I don’t have the right kind of mind (and never liked Star Wars).”
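The two constructs described above can be sketched in a few lines of Python (the language the author eventually settles on); the function and variable names here are purely illustrative:

```python
# An "if" statement: if "x" happens, do "y"; else do "z".
def classify(x):
    if x > 0:
        return "positive"       # the "y" branch
    else:
        return "non-positive"   # the "z" branch

# A "while loop": so long as "x" applies, keep doing "y";
# when "x" no longer applies, stop doing "y".
def count_down(n):
    steps = []
    while n > 0:         # while the condition "x" still holds...
        steps.append(n)  # ...keep doing "y"
        n -= 1
    return steps         # the condition failed, so the loop stopped
```

Even this toy example shows the concentrating effect the author describes: whatever rule is written into the condition gets applied again and again, unquestioned, for as long as it holds.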
At this low ebb, I had a stroke of luck when a pro-coder friend of a friend suggested I try another language before giving up. He put me in touch with a man called Nicholas Tollervey, who was prominent within the Python language community. Before calling Tollervey, I looked at Python and instantly felt more at home with it. The first thing I noticed was the spare simplicity of its syntax, which used indentation rather than ugly symbols to delineate instructions to the machine. The language was designed by a naturally collaborative Dutchman named Guido van Rossum, who prized communication, community and concern for how his language would behave in the wild – in other words, empathy – above all else. He named his language Python after Monty Python, a whimsical, human touch that seemed promising. When Tollervey suggested I travel to Cleveland, Ohio, to experience the 4,000-strong PyCon conference, I found myself agreeing, with no idea what I was agreeing to.
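A small, hypothetical snippet illustrates what the author means by indentation rather than symbols: in Python, whitespace alone tells the machine where each block of instructions begins and ends, with none of the braces or semicolons that put him off JavaScript:

```python
# Python marks the body of each block by indentation alone;
# there are no braces and no statement-ending semicolons.
def greet(names):
    greetings = []
    for name in names:        # the indented lines below belong to the loop
        if name:              # ...and this deeper indent belongs to the "if"
            greetings.append(f"Hello, {name}!")
    return greetings          # back at function level: the loop has ended
```

Because the visual shape of the code and its logical structure are the same thing, a reader can follow the flow of control at a glance, which is much of what drew the author to the language.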
The first day was less like the stiff gathering of my imagining than the first day back at Hogwarts. I met up with Tollervey, who graduated from the Royal College of Music as a tuba player before pivoting to code – the kind of backstory I’d hear often at PyCon. I learned that Python first appeared in the early 1990s but took the better part of two decades to catch on: Van Rossum tells of calling a meet-up at a large computing conference early this century, to find only a handful of enthusiasts turning up. Yet, as programs grew in size and complexity, his priorities for the language began to tell. When I asked the then Python Software Foundation chair Naomi Ceder how Guido – to “Pythonistas”, he is always just Guido – had foreseen the way the coding environment would change, she said he didn’t.
“No one could! This may sound weird coming from a coder, but I think what Guido brings is an aesthetic sense… his strong attention to the aesthetics of the language gave it a form and structure amenable to adaptation and scaling, like a classical building.”
This may sound unremarkable in the outside world but in code, it is not. Larry Wall, the fascinating polymath who created Perl – which seemed to be eclipsing Python and most of its peers in the 1990s – specifically defined his language in opposition to Python. The latter, he said, was a modernist creation, imposing its own aesthetic and limiting freedom of choice or interpretation, deprioritising the individual. Perl, he claimed, was explicitly postmodern, providing the individual with as many options as possible and leaving them to decide what to use. I think Wall is right on both counts, even if this is a discussion I never expected to have in connection with code. There is a serious point, though, which I started to glimpse at PyCon: that the values and assumptions contained in programming languages inform the software that’s written with them and change the world accordingly. By the time I’d learned that Brendan Eich, author of JavaScript, is an anti-vaxxer and was a supporter of a campaign to have same-sex marriage nixed in California, I wasn’t surprised.
I was surprised at how much fun I had with the Pythonistas. Coding has a gender and race problem, with only about 5% of professionals identifying as women, or as Black, African or Caribbean. It would take me several years to get to the bottom of why this is. However, strenuous efforts were being made to address the problem within Python communities around the world, notably in Africa. One organisation trying to reverse this imbalance is PyLadies, which traditionally holds a fundraising auction on the Saturday night of the conference. I scored a ticket and got a first real sense, in microcosm, of a community that, while still too narrow in terms of gender and race, is easily the most culturally and neurologically diverse group I’ve ever seen.
The auction highlight for me involved a painting by one of the younger PyLadies. The auctioneer explained how, the previous year, Lynn had suffered a severe burnout – common in a field where small actions can have massive effects. She’d retreated from code and started painting watercolours in her search for peace. The other PyLadies had worked hard to persuade her to offer a painting of her cats, but from where I sat I could see she was shaking with anxiety, clinging to a colleague behind the scenes. The painting had meant so much to her, but what could it mean to others?
I watched in wonder as the bidding started slowly and then accelerated to a peak of $1,410, as the young coder dissolved into tears then floods of tears with the visceral release of a cliff crumbling into the sea, and I went away thinking this was one of the most beautiful things I had ever seen; knowing this was a community I wanted to get to know better.
Two years later I would be writing my first nervy Python as a volunteer in the San Francisco brigade of Code for America, the nonprofit coding equivalent of the Peace Corps, working on a pandemic dashboard for the Bay Area and feeling like the world’s unlikeliest convert to code culture.
Even so, as I burrowed deeper into Silicon Valley and what I came to think of as the “microcosmos”, I did find a hidden wrinkle in the way we compute, something intrinsic to the code itself, which is at odds with the way we’ve evolved to be. Something that has been concentrating power, abrading society and casting an algorithmic spell over us as a species – and will continue to do so until we bring it under control. Just when I thought my work was done, it was about to begin in earnest.
Devil in the Stack: A Coding Odyssey by Andrew Smith is published by Grove Press (£16.99). To support the Guardian and Observer order your copy at guardianbookshop.com. Delivery charges may apply
SOURCE