Applesoft BASIC
Based on Wikipedia: Applesoft BASIC
In 1977, Steve Wozniak had a problem. He'd built the Apple II—a machine that would become one of the most influential personal computers ever made—but he'd written its programming language for playing games. Integer BASIC, as he called it, couldn't handle decimal points. It couldn't calculate the square root of two or figure out compound interest. For games like Breakout, where you only need to count whole numbers of bricks and ball positions, this was fine. For everyone else trying to do actual work with their expensive new computer, it was a glaring limitation.
The irony was rich. Microsoft had already solved this problem. They'd written a floating-point BASIC for the same processor that powered the Apple II—the MOS Technology 6502. But when Microsoft called Apple to offer it, Steve Jobs brushed them off. Apple already had a BASIC, he said.
Jobs was wrong. And within months of the Apple II's public debut at the West Coast Computer Faire, customers made that abundantly clear.
The Floating Point Crisis
To understand why the lack of floating-point math mattered so much, you need to understand what floating-point numbers actually are. When a computer stores an integer—a whole number like 7 or 42 or 1,000—it's straightforward. The number sits in memory as a simple binary value. But what about 3.14159? Or 0.00001? Or 6.022 times ten to the twenty-third power?
Floating-point representation is a clever trick borrowed from scientific notation. Instead of storing a number directly, you store two pieces: a significand (the meaningful digits) and an exponent (where to put the decimal point). This lets computers represent both incredibly tiny numbers and astronomically large ones using the same amount of memory.
Wozniak had skipped all this complexity because he wanted to ship fast and keep the code small. But real-world applications demanded it. Spreadsheets needed percentages. Scientific programs needed trigonometry. Financial software needed interest calculations that didn't round everything to the nearest dollar.
Making matters worse, the rival Commodore PET had floating-point from day one. The Apple II was losing sales to inferior hardware simply because that hardware could divide seven by three and get a useful answer.
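A minimal sketch of the difference, as it might be typed at the prompt (the outputs noted in the comments are approximate):

    PRINT 7 / 3 : REM INTEGER BASIC PRINTS 2 - THE REMAINDER IS SIMPLY DISCARDED
    PRINT 7 / 3 : REM APPLESOFT PRINTS 2.33333333
    PRINT SQR (2) : REM APPLESOFT ONLY: PRINTS 1.41421356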
The Thirty-One Thousand Dollar Solution
Apple went back to Microsoft, hat in hand. The deal they struck has become legendary in computing history: a flat fee of thirty-one thousand dollars for an eight-year license. No royalties. No per-unit fees. Just a one-time payment for unlimited use.
In retrospect, this might be one of the best bargains in software history. Microsoft essentially gave away the programming language that would power millions of Apple II computers for less than the cost of a nice car. When the license came up for renewal in 1985, Apple paid for it by cancelling its own in-house BASIC for the Macintosh and leaving that market to Microsoft—a deal that further strengthened Microsoft's hand while Apple moved away from BASIC entirely.
The new language combined Apple and Microsoft's names into "Applesoft." It kept compatibility with Wozniak's Integer BASIC where possible, but added a wealth of new capabilities.
What Applesoft Could Do
The headline feature was obvious: real math. Applesoft stored numbers using forty bits of memory—eight bits for the exponent, one for the sign, and thirty-one for the significand. This gave programmers about nine digits of precision and a range from roughly ten to the minus thirty-eighth power up to ten to the thirty-eighth. Trigonometric functions like sine and cosine appeared. So did logarithms and square roots.
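A few illustrative one-liners, with approximate output noted in the comments:

    PRINT 2 / 3 : REM .666666667 - ABOUT NINE SIGNIFICANT DIGITS
    PRINT SIN (3.14159265 / 6) : REM ROUGHLY .5
    PRINT LOG (100) : REM ROUGHLY 4.6 - LOG IS THE NATURAL LOGARITHM
    PRINT 1.5E30 * 2E7 : REM 3E+37 - SCIENTIFIC NOTATION FOR EXTREME VALUES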
But the improvements went far beyond arithmetic.
String handling transformed completely. In Integer BASIC, a string was just an array of characters—you had to manage memory yourself, decide how much space to allocate, worry about running out of room. Applesoft strings were garbage-collected objects, a technique pioneered in Lisp nearly two decades earlier. You could create a string, modify it, concatenate it with another string, and the computer would automatically reclaim unused memory. This might sound like a small thing, but it made text processing dramatically easier.
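A small sketch of how effortless this felt in practice (FRE is the Applesoft function that reports free memory and, as a side effect, forces a garbage-collection pass):

    10 A$ = "HELLO"
    20 A$ = A$ + ", WORLD" : REM CONCATENATION JUST WORKS - NO BUFFERS TO MANAGE BY HAND
    30 PRINT A$ : REM HELLO, WORLD
    40 PRINT LEN (A$) : REM 12
    50 PRINT FRE (0) : REM FREE BYTES REMAINING, AFTER DISCARDED STRINGS ARE RECLAIMED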
Graphics got serious attention too. The original Integer BASIC could draw colored blocks on screen in "low resolution" mode—basically a grid of forty by forty-eight chunky pixels in sixteen colors. Applesoft added commands for "high resolution" graphics: two hundred and eighty by one hundred and ninety-two dots in six colors. You could draw lines at arbitrary angles, not just horizontal and vertical ones. You could define shapes and then rotate and scale them. For 1978, this was remarkable capability in a home computer.
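A sketch of the new commands (the coordinates are arbitrary; HGR, HCOLOR= and HPLOT are the actual keywords, while shape rotation and scaling used the separate DRAW command with ROT= and SCALE= settings):

    10 HGR : REM HIGH-RESOLUTION MODE: 280 BY 192 DOTS, TOP 160 ROWS SHOWN ABOVE A TEXT WINDOW
    20 HCOLOR= 3 : REM WHITE
    30 HPLOT 0,0 TO 279,159 : REM A DIAGONAL LINE - SOMETHING INTEGER BASIC COULD NOT DRAW
    40 HPLOT 0,159 TO 279,0 : REM AND ITS MIRROR IMAGE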
Error handling appeared for the first time. A program could now catch mistakes and respond gracefully instead of crashing. Data statements let programmers embed tables of numbers or text directly in their code. User-defined functions allowed simple calculations to be packaged up and reused.
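Here is a small hypothetical program that leans on all three at once: ONERR GOTO traps the error raised when READ runs past the last DATA item, and DEF FN packages a conversion formula for reuse:

    10 ONERR GOTO 100
    20 DEF FN C(X) = (X - 32) * 5 / 9 : REM FAHRENHEIT TO CELSIUS
    30 READ F
    40 PRINT F; " F = "; FN C(F); " C"
    50 GOTO 30
    60 DATA 32, 98.6, 212
    100 PRINT "NO MORE READINGS" : END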
The Price of Progress
Nothing comes free in computing. Every one of Applesoft's improvements cost something.
Speed was the first casualty. Integer BASIC earned its name by using whole numbers for everything, and whole-number math is fast. The processor could add two integers in a handful of clock cycles. Floating-point arithmetic required elaborate conversion routines, careful handling of exponents, and tedious normalization steps. A simple addition that took microseconds in Integer BASIC might take milliseconds in Applesoft.
For business applications, this rarely mattered. Who cares if a spreadsheet recalculates in two seconds instead of one? But for games—Wozniak's original motivation—the slowdown was crippling. Few action games were ever written in Applesoft because the language simply couldn't keep up with real-time gameplay.
Memory usage increased too. The Applesoft interpreter occupied ten kilobytes of the computer's memory, compared to the more compact Integer BASIC. On a machine with only forty-eight kilobytes total, this was a meaningful chunk of space no longer available for programs.
And then there were the quirks.
Applesoft's Charming Eccentricities
Variable names in Applesoft were limited in a way that seems almost malicious in hindsight. Only the first two characters mattered. A variable named "SCORE" and one named "SCREEN" would be treated as the same variable—both reduced to just "SC" internally. A programmer who carefully named their variables "TEMPERATURE" and "TEMPO" would discover, to their horror, that changing one changed both.
Worse still, Applesoft would find BASIC keywords hidden inside variable names. The variable "SCORE" contains "OR"—a logical operator. The interpreter would try to parse it as "SC OR E" and throw a syntax error. "BACKGROUND" contains "GR," the command to enter low-resolution graphics mode. Trying to use it as a variable name would halt your program with a baffling syntax error.
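A sketch of both traps, using made-up names (LIMIT and LIVES are chosen because neither happens to hide a keyword):

    10 LIMIT = 10
    20 LIVES = 3
    30 PRINT LIMIT : REM PRINTS 3, NOT 10 - BOTH NAMES COLLAPSE TO "LI" INTERNALLY
    40 BACKGROUND = 1 : REM HALTS WITH ?SYNTAX ERROR - "GR" LURKS INSIDE THE NAME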
Programmers learned to use cryptic two-letter variable names or carefully check every name for hidden commands. It was a kind of dark art, knowing which letter combinations were safe.
The language was also entirely uppercase, at least in early models. You could store lowercase letters in strings—that worked fine—but if you tried to type a command using lowercase letters, the interpreter would reject it. "PRINT" was a valid command. "print" was a syntax error. This limitation persisted through several generations of Apple II hardware before finally being relaxed.
Sound and Fury
The Apple II had a speaker. Applesoft had essentially no way to use it.
The official sound support consisted of exactly one thing: you could print the ASCII bell character (character number seven) and the computer would beep. That was it. One beep.
You could, through heroic effort, make more complex sounds. There was a memory location you could read that would click the speaker once each time you accessed it. Click it fast enough and you'd get a tone. The problem was timing. Applesoft was too slow to click the speaker fast enough for anything but a low buzzing sound—roughly baritone in pitch, and thoroughly unpleasant.
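A sketch of both techniques. Negative addresses are the usual Applesoft shorthand for locations in high memory; minus 16336 is the speaker toggle:

    10 PRINT CHR$ (7) : REM THE ONE OFFICIAL SOUND: A BEEP
    20 FOR I = 1 TO 200
    30 X = PEEK ( - 16336) : REM EACH PEEK CLICKS THE SPEAKER ONCE
    40 NEXT I : REM AT APPLESOFT SPEED, THE RESULT IS A LOW, RASPY BUZZ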
Real music required machine language routines: hand-coded assembly programs that could click the speaker at precise intervals. Many commercial programs included such routines, and hobbyist programmers learned to write their own. But Applesoft itself was essentially mute.
The Disk Operating System
Here's something that surprises people today: Applesoft BASIC had no commands for working with files on disk. None. You could save a program to cassette tape. You could load one back. That was the extent of built-in storage capability.
The Apple II's disk operating system—known simply as DOS—wasn't part of Applesoft at all. It was a separate layer that added file commands to the language. When you typed "SAVE MYPROGRAM" on an Apple II with a disk drive, you weren't using Applesoft. You were using DOS, which had patched itself into the BASIC interpreter to intercept certain commands.
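The seam is visible from inside a program. DOS watches everything a program prints; a line that begins with a control-D character—CHR$(4)—is intercepted and treated as a disk command instead of output. A sketch, with a hypothetical file name:

    10 D$ = CHR$ (4) : REM CONTROL-D, THE PREFIX THAT HANDS A LINE TO DOS
    20 PRINT D$;"OPEN SCORES"
    30 PRINT D$;"WRITE SCORES"
    40 PRINT "HIGH SCORE: 1000" : REM AN ORDINARY PRINT, BUT IT NOW GOES TO THE FILE
    50 PRINT D$;"CLOSE SCORES"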
This architectural oddity persisted throughout the Apple II's life. It meant that a program written for disk would crash immediately on a computer without DOS loaded. It also meant that Apple could update the disk operating system independently from the BASIC interpreter—a form of modularity that was sophisticated for its time, even if it sometimes confused users.
The Development Nightmare
Applesoft's creation was a small adventure in software development chaos.
Apple received Microsoft's BASIC interpreter as a source code listing—essentially a printed document with the program written out line by line. This code needed to be adapted for the Apple II, with Integer BASIC commands grafted on and Apple-specific features added.
The problem was that Apple didn't have an assembler for the 6502 processor. An assembler is the tool that converts human-readable program code into the binary instructions a processor can actually execute. Without one, the Apple team had no way to build their modified version of BASIC.
Their solution was wonderfully jury-rigged. They sent their source code over telephone lines to a company called Call Computer, which offered remote compilation services. The process was agonizingly slow—transmitting code through acoustic modems at roughly thirty characters per second.
Then Call Computer's equipment failed and lost Apple's source code.
At this point, a programmer named Cliff Huston saved the project by using his personal IMSAI 8080 computer—a completely different machine from the Apple II—to cross-assemble the BASIC source. Cross-assembly means assembling code for one processor on a machine built around a different one, a technique still used today when developing software for embedded systems or new hardware platforms.
The Ampersand Escape Hatch
Applesoft's designers knew that no matter how many features they added, programmers would eventually need to do something the language couldn't handle. They built in two escape hatches.
The first was the USR function, which let you call machine language code and get back a number. Want to calculate something Applesoft couldn't handle? Write the math in assembly language, tell USR where to find it, and you'd get your answer.
The second was more powerful and more unusual: the ampersand command. Written as just "&" followed by whatever you wanted, it jumped to a predefined memory address where you could put any code you liked. That code could parse the rest of the line, looking at whatever came after the ampersand, and do essentially anything.
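Both hooks are just three-byte jump instructions sitting in low memory, waiting to be pointed at your own machine code. A sketch, assuming a routine has already been loaded at decimal address 768 (hexadecimal $300, a traditional scratch area):

    10 POKE 10,76 : POKE 11,0 : POKE 12,3 : REM USR VECTOR: JMP TO $0300
    20 X = USR (5) : REM JUMPS TO $0300 WITH 5 PASSED IN THE FLOATING-POINT ACCUMULATOR
    30 POKE 1013,76 : POKE 1014,0 : POKE 1015,3 : REM AMPERSAND VECTOR: JMP TO $0300
    40 & HELLO : REM THE ROUTINE IS FREE TO PARSE EVERYTHING AFTER THE "&"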
A whole industry of Applesoft extensions grew around the ampersand command. Commercial packages added everything from improved graphics to database capabilities to structured programming constructs. The ampersand was Applesoft's acknowledgment that a programming language is never complete—there will always be something new that users need.
Performance Tricks and Programmer Folklore
Experienced Applesoft programmers accumulated a body of folklore about making programs run faster.
One famous trick involved subroutine placement. Applesoft stored programs as linked lists of lines—each line pointed to the next one. When the interpreter executed a GOTO or GOSUB command, it had to search through this list from the beginning to find the target line. The further down in your program a subroutine lived, the longer every call to it would take.
Smart programmers put their most-frequently-called subroutines at the very top of the program, before the main code. This violated every principle of structured programming and made code nearly impossible to read, but it genuinely made programs faster.
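A minimal sketch of the pattern (the line numbers and the work done are arbitrary):

    10 GOTO 100 : REM HOP OVER THE HOT SUBROUTINE TO REACH THE MAIN CODE
    20 X = X + 1 : RETURN : REM CALLED CONSTANTLY, SO IT LIVES AT THE TOP
    100 FOR I = 1 TO 1000 : GOSUB 20 : NEXT I : REM EACH GOSUB FINDS LINE 20 AFTER SCANNING ONLY TWO LINES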
Another trick involved numeric constants. Unlike Integer BASIC, which converted numbers to binary when you typed a line, Applesoft stored numbers as text and converted them every time the line executed. The number "100" sitting in your code would be converted from the characters "1," "0," "0" to the value one hundred every single time that line ran. In a loop that executed thousands of times, this added up.
The solution was to assign numeric constants to variables before entering a loop. "LET H = 100" once, then use H throughout. Looking up a variable was often faster than parsing a number from text. It seems absurd that variable lookup could be faster than reading a constant, but such were the trade-offs of 1970s interpreter design.
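Side by side, the two styles look like this (the loop counts are arbitrary; on real hardware the second version is measurably faster):

    10 FOR I = 1 TO 1000 : A = A + 100 : NEXT I : REM "100" IS RE-PARSED FROM TEXT ON EVERY PASS
    20 H = 100 : FOR J = 1 TO 1000 : B = B + H : NEXT J : REM THE CONSTANT IS PARSED ONCE, THEN MERELY LOOKED UP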
The Long Goodbye
Applesoft BASIC remained the standard programming language of the Apple II family for over a decade. When the Apple IIe launched in 1983, Applesoft was still there in ROM. When the Apple IIGS arrived in 1986 with vastly improved hardware, Applesoft was still there—though incapable of using most of the new machine's capabilities.
The IIGS could display 320 by 200 pixels in 256 colors. Applesoft knew nothing about this. The IIGS had an Ensoniq sound chip capable of fifteen-voice wavetable synthesis. Applesoft couldn't touch it. The language that had been cutting-edge in 1978 was now a legacy burden, maintained for compatibility but increasingly irrelevant to new development.
This is the fate of all programming languages, eventually. They're designed for the hardware of their time, optimized for the constraints that existed when they were born. As hardware evolves, languages either evolve with it or become historical artifacts.
Applesoft made no attempt to evolve. Apple's energy went into the Macintosh, then into survival, then into an entirely different technological trajectory. The Apple II was eventually discontinued in 1993, and Applesoft went with it—a programming language frozen in amber, forever capable of exactly what a 1978 personal computer needed and not one thing more.
The Curious Connection to Modern Finance
What does a 1970s programming language have to do with prediction markets and cryptocurrency? More than you might think.
The same fundamental challenge that plagued Integer BASIC—how do you represent numbers that aren't whole?—remains central to blockchain development today. Ethereum's smart contracts infamously lack native floating-point support, forcing developers to use fixed-point arithmetic or elaborate workarounds. Financial calculations on chain often use integers scaled by factors of a million or more, counting in micro-units to avoid decimal points.
The trade-offs Wozniak faced mirror decisions blockchain developers make today. Integer math is faster, simpler, and more predictable—crucial for systems where every computation costs money in gas fees. Floating-point math is more flexible but introduces rounding errors, indeterminacy, and potential for exploits.
Applesoft's thirty-one-thousand-dollar license also foreshadows modern software economics. Today we argue about open source versus proprietary, about protocol ownership and token economics. Microsoft's willingness to license BASIC cheaply—prioritizing distribution over immediate profit—helped establish their code as the standard, which paid dividends for decades. It's a strategy that would be recognizable to any founder thinking about network effects and platform dominance.
The escape hatches matter too. Applesoft's ampersand command created an extension ecosystem that let the language outlive its original design. Modern smart contract platforms struggle with the same tension: how do you create a system that's secure and deterministic while still allowing for capabilities the original designers never imagined? Ethereum's precompiles, layer-two solutions, and oracle networks are all attempts to provide escape hatches from the limitations of the base system.
Every generation of programmers faces the same fundamental problems: how to represent numbers, how to balance speed against capability, how to build systems that can evolve. The specific technologies change, but the constraints rhyme across decades. Applesoft BASIC was one answer to these eternal questions, crafted for hardware that now lives only in museums—but the questions themselves remain as relevant as ever.