Tuesday, June 23, 2020

CONSCIENTIOUS VERSUS CONSPICUOUS CONSUMPTION

Consumption is the sole end and purpose of all production. ~ Adam Smith 


Any American knows personal spending on goods and services is the lifeblood of our economy. The statistic that economists universally trot out to emphasize personal consumption expenditures’ pre-eminence is that it represents 70% of our nation’s GDP. In the first quarter of 2020, that share was 69.5% of real GDP; close enough for a lot of jazz, and spending. In dollar terms, that’s $13.2 trillion, or $39,800 on a per capita basis. In April, consumption declined a whopping 13.4%, in large part due to Sheltering-in-Place and related orders that have curbed consumers’ and businesses’ behavior to restrict the spread of the coronavirus. This is why the recession was well underway in April.
Progressives and others have argued these government restrictions have unduly hurt lower-income people and increased inequality. Income and wealth inequality has been growing gradually for quite a while, and the pandemic has added to this trend. According to a recent commentary in the New York Times, income inequality is now being reinforced by consumption inequality, because the well-off aren’t consuming conscientiously enough.
Although income inequality has risen, no economist or policy-maker can authoritatively say what level of inequality is best. Should it be 18% lower? No one knows, other than it’s too high now. And because inequality is now acknowledged to appear in so many different guises, there’s no broadly-agreed upon policy remedy. Thinking just about unequal income or wealth distribution is passé. Only a decade or so ago, economists thought instituting more progressive income taxes (in the economic sense) and higher estate/inheritance taxes could effect a reduction in inequality.
No longer. Inequality has simply become too complex and multi-faceted. Its complexity exacerbates the policy dilemma of resolving it. I counted 25 different types of inequality that have been mentioned in the media during the past few years. Beyond traditional income and wealth inequality, everything from geography (urban v rural) and criminal justice to education, library fines and the pandemic is wrapped in inequality flags.
A key ingredient in any inequality assessment is its measurement, particularly of how rich, wealthy people are defined. The 2011 Occupy Movement popularized the “we are the 99%” slogan, and correspondingly defined “the 1%” as The Rich. The Nobel-laureate economist Joseph Stiglitz wrote an influential article in the May 2011 issue of Vanity Fair that talked about the harmful effects of the 1%ers’ disproportionately large ownership of society’s wealth and resources. For the past decade the top 1% of income earners has become an unofficial standard for defining the richest Americans, sometimes supplemented by using a thinner slice,  the top 0.1%, for the really, really richest people.
The recent NYT commentary about inequality characterized the rich in a much broader fashion, not the top 1%, but the top 25% (quartile) of the income distribution. The story concentrates on inequality in consumption rather than income per se. A household’s consumption is positively and strongly related to their income (in econo-speak, it’s their disposable, after-tax income). Economic studies have shown for decades that higher-income households directly spend more on consumption than lower-income ones, but proportionately less, relative to their income.
If you’ve taken Econ 101 you might ever-so-vaguely remember that once upon a time, economists fervently debated how to explain household consumption patterns. Now we debate other, usually more exotic economic behaviors. We use the term “average propensity to consume” (APC) to measure what percent of your disposable income you spend on consumption versus savings (non-consumption). If you spend $800 of your $1,000 monthly disposable income on consumption, your APC is 80%. Research by the San Francisco Federal Reserve indicates that the APC of households in the lowest 10% of the income distribution is 36% greater than that of households in the highest 10%, and 25% greater than that of households in the highest 25%.
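For readers who want to poke at the arithmetic, here is a minimal sketch of the APC calculation in Python; the function name and the $800-of-$1,000 figures are just the illustrative example from the paragraph above, not anything taken from the Fed research.

    # A small sketch of the average propensity to consume (APC):
    # the share of disposable (after-tax) income spent on consumption.
    def apc(consumption: float, disposable_income: float) -> float:
        """Return consumption as a fraction of disposable income."""
        if disposable_income <= 0:
            raise ValueError("disposable income must be positive")
        return consumption / disposable_income

    # The example from the text: $800 spent out of $1,000 of monthly
    # disposable income gives an APC of 80%.
    print(f"APC: {apc(800, 1_000):.0%}")  # -> APC: 80%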
The NYT analysis castigates rich, wealthy people for not spending enough of their income to keep low-wage workers fiscally healthy. The authors criticize their broadly-defined quartile of the richest people for not consuming conscientiously enough.
Thorstein Veblen, who created the term “conspicuous consumption” in his radical 1899 book The Theory of the Leisure Class, would be delighted. The Rich’s consumption patterns, especially the conspicuous ones, have been disparaged (and covertly envied) for over a century. Nevertheless, the NYT study authors’ criticism doesn’t seem to care as much about whether The Rich spend conspicuously – say buying a Porsche 911 GT1. They care whether The Rich’s consumption is conscientiously appropriate in benefiting low-wage workers.
The differences in adjusted gross income (AGI) needed to be in the top 25% versus the top 1% are stark. The top 25% needs at least an AGI of $77,372; the top 1% needs $437,404, about 5.7 times as much. For perspective, the 2019 US median household income was $63,030; the top 25%ers’ minimum $77,372 AGI is only about 23% higher than that.
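A quick back-of-the-envelope check of those comparisons, sketched in Python using only the AGI and median-income figures quoted above:

    # Back-of-the-envelope check of the AGI comparisons in the text.
    top_25_threshold = 77_372   # minimum AGI for the top 25%
    top_1_threshold = 437_404   # minimum AGI for the top 1%
    median_household = 63_030   # 2019 US median household income, as cited

    print(f"{top_1_threshold / top_25_threshold:.1f}x as much AGI")           # -> 5.7x as much AGI
    print(f"{top_25_threshold / median_household - 1:.0%} above the median")  # -> 23% above the median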
Who believes $77,372 is a rich household’s income? No one. It’s true the tippy top of the top 25%ers are indeed very rich, but this quartile of the income distribution is, by definition, very broadly drawn, and its lower half includes folks who usually aren’t considered very rich. For this reason, using the overly-broad top 25% to characterize The Rich is flawed. There are way too many not-rich folks in this segment, at least 20% too many, more likely 24% too many.
Why do the authors use this expansive, quartile-based definition of rich – and not something more suitable, like the top 1%? No rationale is provided. Back to consumption.
Early in their commentary the reader is introduced to both the victims and the villains, “The recession has crushed this kind of work [Low-wage labor done by The Workers, who are servers and staff in Manhattan restaurants around Lincoln Center.], in particular: service jobs that depend directly on the spending — and the whims — of the well-off.” The authors directly blame the well-off, aka The Rich, for this economic crushing due to their lack of consumption spending.
Curiously, the authors do not mention several other factors that have magnified small business workers’ woes during the covid crisis. First, there’s no mention that all restaurants throughout New York and elsewhere have been closed for sit-in dining for months due to the coronavirus. Nationally, one in five small businesses have closed down. Second, that some truly rich folks (the top 1%ers, not the 25% richest) have departed NYC for other locations, and thus aren’t consuming anything in the Big Apple. Interestingly, thousands of young, probably un-rich people (but some maybe in the lower reaches of the top 25%) are also leaving NYC because of the pandemic and its associated costs. Third, that richer people are more likely to consume goods and services via online services, including (shudder) Amazon. Thus the top 25%ers’ expenditures at small businesses during covid-19 may have shifted online and outside of some “small businesses.” Businesses have been rapidly adopting online sales, that now represent 11.8% of total US retail sales, in order to stay in business. Only 11.8%? The online proportion seems much larger given the media’s focus/hype about online sales.
The authors state that the top 25% of income-earners have disproportionately not returned to their full pre-covid “rich consumption” patterns at small businesses, and that this is causing severe problems for The Workers. However, no income group’s consumption has fully recovered to pre-covid levels. The “top-middle 25%ers’” spending at small businesses has also fallen, but that reduction merits no apparent alarm. It’s hard to imagine that The Workers care much at all where their income comes from. Whether it’s dollars from rich, middle-class or other customers isn’t likely a concern. The Workers only care that they are being paid with legal tender, not who hands it to them.
According to the story, by April Fools’ Day people at every income level had drastically and quickly dropped their consumption expenditures at small businesses, by between 35% and more than 40%; the top 25%ers had the largest decrease. After mid-April, when the first stimulus checks started arriving, small business revenue (SBRev) started to rise gradually, in fits and starts. By June 1, the top-middle 25%ers had increased their portion of SBRev to where it was only 16% below that of March; SBRev attributed to the top 25%ers was still 26% below March.
The authors’ vegan beef about The Rich is a relative one, not absolute. Their criticism focuses on this 10-percentage-point difference (26% v 16%) in the top 25%ers’ SBRev recovery gap compared to the top-middle 25%ers’. The story faults these rich folks, who earn as little as $77,372, for not spending more of their income to help The Workers. The authors argue that conscientious consumption – whereby consumers spend their money to benefit low-wage workers – will reduce inequitable consumption. The authors believe The Rich haven’t gotten this message. Thus, reduced consumption by The Rich, who haven’t lost their jobs like The Workers have, has devastated low-wage workers, who “count on high-income people spending money.”
The authors believe The Rich should make consumption decisions based on the degree to which their dollars will help low-wage workers. It sounds worthy, but oh my. Following their novel precept of conscientious consumption, before I decide where to purchase mouthwash, I should conscientiously learn how many low-wage workers are employed at CVS, Amazon, Target, Walgreens and, of course, the locally-owned mouthwash stores, and hopefully also get some understanding of how my mouthwash purchase will specifically benefit the workers at each business. I expect the share of consumers who would use this rationale for conscientiousness would, at most, be the proportion of folks who are now publicly wearing masks in South Carolina, Oklahoma or Montana – all states suffering from significant increases in coronavirus cases.
My bet is that low-wage workers in small businesses, like all workers in all businesses, are counting on everyone ramping up spending their money on goods and services, not just those who are rich. Castigating nearly 35 million households for not sufficiently consuming to conscientiously help low-income laborers is a fool’s errand.
Instead, how about biting a large, legislative bullet and, after November, significantly raising the federal minimum wage from the appalling $7.25/hr that was set in 2009, so that at least 21 states’ minimum wages would be increased.
Getting out of this horrible recession will take the rich, as well as you, me and everyone else, raising our consumption for whatever we want. The more the merrier.





Friday, June 12, 2020

EDUCATING WITH COVID

Education is not the filling of a pail, but the lighting of a fire. ~ William Butler Yeats

The fabric of education has been ripped apart by the coronavirus. In a different era (before March), the contact-full classroom relationships between students and teachers, and the myriad daily interconnections among them, served to build knowledge and personal self-awareness, the heart of education. No longer. That contact is purposefully missing in the now-necessitated norm of online distance-learning (DL) education for this fall’s expected 70.4 million primary, secondary and college students. That’s over 20% of our population.
Education specialists believe that DL in most school districts is not working and that some students are falling behind. A middle-school teacher states, “We know this isn’t a good way to teach.” Black, Hispanic and low-income students are struggling the most, research suggests, according to a NYTimes story.
Dana Goldstein reports the richest and poorest parents are spending about the same number of hours on remote school, but wealthier parents are inevitably able to provide more books and supplies at home, more quiet space, educational toys and often more knowledge of the curriculum. High-income school districts are usually providing strong remote instruction, rather than basic worksheet-like activities. Inequalities often are magnified.
What DL diminishes is the constructive, essential interactive nature of multi-student, classroom-based education – students’ vocally intermingling face-to-face with their teacher and their peers on a continuing basis. It’s something that we’ve taken completely for granted, until recently. Online meeting software like Zoom, TeamViewer or Google Meet allows some simultaneous serial communication, but screens afford a wholly different experience than actual physically-direct collaboration for a classful of students.
So the critics are correct, DL is a threadbare approximation of the education we all remember. It sucks, no matter what grade-level is being discussed. But what do DL critics recommend instead? Mum's the word.
I’ve seen discussions about “split-session” teaching (e.g., having only a portion of the students physically come to classes at any given time), but I can’t imagine how teachers could deal with this possibility, which, in effect, would multiply their required class-time, depending on what the allowed portion is. Also, even if “double-time” teaching (letting in one-half the class’s students at a time) could be made viable in some contexts, it would challenge everyone.
Double-timing in-school teaching for the earliest grades, where the students’ education happens in a single classroom and is as much social-learning as academic, would call for schools to “create” twice as many school hours each week in order to comply with state-mandated requirements. California, like 27 other states, requires a minimum of 180 days of formal school instruction each year.
Raise your hand if you’re in favor of a 12-day week (10 school days’ worth of double-time teaching and 2 “recovery, week-end” days; although I think at least 3 recovery days for teachers would be far better after working for 5, double-time days). Or how about daily day and night classes for PK-12 grades? Or mandated home-schooling? What a surprise, I don’t see any raised hands. No wonder local school districts are stymied.
College students face a similar dilemma, but they (or someone else) are directly paying for the privilege of being there, unlike public PK-12 schools. At least 100 lawsuits demanding that colleges-universities provide refunds for tuition, fees and/or room and board have been filed so far. The students are claiming that the online DL college experience they received this spring (with the unaccepted, uninvited coronavirus on campus and no “regular” classes) is an academic encounter that is not what they bargained or paid for. The courts have yet to decide whether these students have a legitimate claim for refunds. It’s apparently not a slam dunk for the students. Even if they’re successful, will the colleges-universities be able to provide the reimbursements? According to a person who works for an association representing state higher education programs, colleges’ ability to pay refunds would be “incredibly challenging” due to public education’s sizeable budget cuts and increased costs.
Many colleges-universities are now planning online DL-based education for the fall, including the California State University system, the nation’s largest. Universities are rewriting the rules for on-campus student life in order to avoid a Tragedy of the Campus Commons. Colleges will be demanding that their students diligently wear masks, as well as drastically restricting sporting events and somehow curbing social gatherings. Will college administrators be able to trust their 18- to 21-year-old undergraduates to follow such decrees? These rules will require behavioral changes that will tax the very being of young immortals. Time will tell.
College is a significant life-event for ever-more people. Thirty-six percent (36%) of US adults now hold at least a B.A. degree, the highest share ever, as shown in the chart below. Over 19.64 million people were enrolled in colleges, universities and other “degree-granting institutions” in 2018, 57% of whom were female. This fall, 19.74 million are expected to register. Yet it’s worth remembering that despite the well-deserved praise for our decades-long increase, college degree-holders still represent only a smidgen more than one-third of US adults. At times we may act like a deserving majority, but we’re far from it.
 Percent of US adults with a BA or higher degree, 1950-2019

 Source: NCES.ed.gov
Thus, even though it sucks, DL is the only practical, nontoxic means of providing public education now. It’s a version of formal education that can more or less adapt to the present, fraught circumstances: the scythe of the coronavirus, existing school and university infrastructure, and the teachers and staff actually available.
That is, unless I’ve missed a magical, superior education method that remains unmentioned because Albus Dumbledore never disclosed the Hogwarts secret handshake. In our current, pre-vaccine, coronavirus-filled world, it’s overwhelmingly online distance-learning, like it or not. And most of us don’t. Economists have a term for such schemes; they’re called “second-best.” At best, DL is a second-best solution, but better than any other.
Almost lost in the dark mists of this pandemic and our cavernous recession are progressives who continue roaring for free college and student debt-forgiveness. Yup, Bernie and Elizabeth have lost the race to be the Democratic Party’s presidential nominee, but some of their backers still actively pursue the provision of much vaster subsidies for college-goers. In the midst of giant, covid-related federal, state and local revenue reductions, adding these policies’ substantial costs ($2.2 trillion) makes little sense for reasons I’ve previously mentioned. Enacting such expensive, flawed plans for free college fades in importance compared to far broader, more pressing human priorities like public safety, adequate food and sufficient housing. Stow it, free-college folks; instead seek the secret handshake.




Thursday, June 4, 2020

WRITING THROUGH THE AGES: from Cuneiform to Word

There is nothing to writing. All you do is sit down by the typewriter and bleed. ~ Ernest Hemingway 


I recently wrote another postcard to my grandkids and to my mother-in-law. I like keeping in physical touch with them by sending cards that show places we’ve visited. My specific focus was on writing the words I wanted to put on the cards. But after I was finished and put them in the mailbox, I thought about my actual writing process.
How did I do it? In this case I used a modern rollerball pen that precisely spread its ink onto the card to form letters and words I was writing. As I’m writing these very words about word-writing, I’m using a keyboard and word-processing software to display my words digitally on the computer monitor in front of me. Both these examples of my wordspersonship (aka, penmanship) are the culmination of more than 5000 years of human history.
Over the great span of humanity, those 50 centuries are practically yesterday, and they began a very long time ago. Many creatures communicate with their friends and family by uttering sounds, including crows, chimpanzees, prairie dogs and whales. But our writing distinguishes humans from every other life-form we’ve so far discovered. It permits us and our societies to transmit information and to share knowledge over more than a moment.
I offer here the interesting, fluid tale of the history of writing. It winds its way from cuneiform tablets to quill and fountain pens and word processing.
As you know, written text is distinct from spoken language as well as from symbolic systems. Anthropologists believe humans developed the capacity for language at least 50,000 years ago. Human writing systems developed much more slowly than our spoken languages. Symbolic communication systems, which include painting, maps and signs, often do not require prior knowledge of a spoken language or written text to be understood. That’s not true for writing.
Humans have been painting for a long time. The oldest known cave paintings are more than 44,000 years old, during the Upper Paleolithic period. The Chauvet-Pont-d'Arc Cave in southern France contains some of the best-preserved figurative cave paintings, drawn about 32,000–30,000 years ago.
A few writing systems can be thought of as straddling symbolic systems and written text. An example is ancient Egyptian hieroglyphics, which used picture-words to write. Hieroglyphic writing started as early as 3000 BCE, at the onset of pharaonic civilization. It was a very complex way of writing. Depending on how one counts them, there are from 700 to 1000 different hieroglyphic symbols. That’s a lot more than 26 letters. Modern scholars were unable to read hieroglyphic scripts before the Rosetta Stone was discovered in 1799. Sign language and braille (probably the first digitally-based languages) are two examples of contemporary symbolic communication languages.
Researchers now believe that writing was independently developed in at least four ancient civilizations: Mesopotamia (between 3400 and 3100 BCE); Egypt, with the hieroglyphics mentioned above; China (2000 BCE); and Mesoamerica (by 650 BCE).
Cuneiform is a system of writing developed by ancient Sumerians of Mesopotamia, first in the city of Uruk. It is distinguished by its wedge-shaped marks on clay tablets made by a blunt reed stylus. Many Sumerians wrote cuneiform. Up to two million cuneiform tablets have been excavated in modern times. Cuneiform script was used for recording laws and maps, compiling medical manuals, documenting religious stories and beliefs as well as writing personal letters. In addition, businesses recorded their sales and inventories on tablets, delighting paleo-economists. I wonder what their Annual Reports looked like. Cuneiform had quite a run; it was used for more than 30 centuries, until the second century CE.
The modern English alphabet, which is the foundation of all our writing, is a Latin alphabet. It originated around the 7th century from Latin script that was in turn derived from the Greek alphabet, the first alphabet with both consonants and vowels. As you may remember from grade school, the word “alphabet” is a compound of the first two letters of the Greek alphabet, alpha and beta. For you etymological nerds, a question: What are the most- and least-frequently used letters in the English alphabet?[1]
So now we have the alphabet, all we need is a device to place the letters on something other than a clay tablet. Ta da, the pen.
There has been a surfeit of pen types used for writing through the ages. The first was the reed pen, made from sea rushes, most likely developed by Egyptians to write on papyrus scrolls or parchment as far back as the First Dynasty (3000 BCE). Reed pens have had a long ride in human hands. They were still widely used in the Middle Ages, and were slowly replaced by quill pens made from birds’ flight feathers after the 7th century. Even now, reed pens made from bamboo are used in parts of Pakistan and Afghanistan.
Like reed pens, quill pens were prominently used for more than 1,000 years. Quills were broadly available and the primary writing instrument in the western world from the 6th through the 19th century. Hence James Madison, Alexander Hamilton, Benjamin Franklin and other political VIPs wrote the US Constitution in 1787 with their quills. Geese were a preferred source of feathers for quill pens. Did regular citizens take a gander at these notables when they deliberated about the Constitution in Philadelphia? Most likely.
Next up in the hands of writers was the nib or dip pen. A nib is the end part of a quill, dip pen or fountain pen, that comes into contact with the writing surface in order to deposit ink. They’re called dip pens because, like quills, there is no reservoir of ink in the pen. You have to periodically dip your pen into the ink well to keep writing.
Ancient Egyptians experimented with metal-nibbed pens made from copper and bronze. Generally the writing quality from such metal-nibbed pens was poorer than that of reed pens. It wasn’t until the early 19th century that pens made with split-steel nibs became popular, principally because their nibs retained a sharp point far longer than quills. Pens with easily replaceable nibs – of differing widths and designs to allow for distinctive writing – were quite popular. Pen-makers have used many types of metal for their nibs, including copper, stainless steel and gold. A copper nib was found in the ruins of Pompeii (79 CE). After the platinum group of metals was discovered in the mid-18th century, extravagant dip pens became objets d’art with iridium-tipped gold nibs in the early 19th century. Most nibs now use stainless steel alloys.
The French government granted a patent for a fountain pen invented by a Romanian student in Paris in 1827. His fountain pen used a swan’s quill as its ink reservoir. The invention of the nibbed fountain pen solved the basic problem with dip pens. The “fountain” is an ink reservoir built into the pen itself that diminished the need for frequent, repeated trips to the ink well. The modern fountain pen nib can be traced back to an original gold nib that had a tiny fragment of ruby attached to its wear-point.
Interestingly, Leonardo da Vinci created an inventive fountain pen. According to Wikipedia, there is “compelling evidence” that he constructed a working fountain pen during the late-15th century Renaissance. Leonardo’s journals show detailed illustrations (is there any other kind for Leonardo?) of a reservoir pen. Leonardo bibliophiles note that his pen-based handwriting (which often was a mirrored, right-to-left shorthand) displays a consistent contrast, rather than the periodic fading typical of a quill pen’s ink re-dipping.
Time-consuming technological innovation was required to mass produce reliable, inexpensive and leak-less fountain pens. This happened by the mid-19th century, when more than half of the world’s steel-nib pens were manufactured in Birmingham, England. The rapid increase in fountain pens’ use in turn encouraged the expansion of education, literacy and, of course, writing.
Next up in this pen parade are ballpoint pens. The first patent for a ballpoint pen was issued in the US in 1888 to John J. Loud, who wanted a writing instrument that would work on rough surfaces like wood, which fountain pens could not. His pen was mildly successful operationally, but no one then wanted one. That would have to wait until after WWII. Rollerball pens utilize the same ballpoint mechanism, but apply water-based inks instead of oil-based inks.
Marcel Bich, a Frenchman, dropped the last letter of his name and created Bic pens. Bic introduced a ballpoint pen in the US in 1950. It was his firm’s first product. His inexpensive, disposable Bic Cristal pen has been the world’s most widely-sold pen for some time. They now come in 18 different colored inks. The 100 billionth Bic pen was sold in September 2006.
What about pencils? We’ve used pencils for ages as drawing instruments. Contrary to the usual term, lead pencils have no lead in them, only graphite usually surrounded by wood. In the mid-16th century a massive deposit of very pure, solid graphite was discovered near Cumbria, England. It’s the only large-scale solid graphite deposit ever found. Because graphite is quite soft, it needs some type of encasement to be used. Way back then, the graphite sticks were wrapped in sheepskin. Now we use wood casing.
One of my memories of grade-school is the bright yellow wood-encased pencils. Indeed, the hexagonal Ticonderoga #2 yellow pencil was created in the late 19th century and is still made and used. Its manufacturer, Dixon, painted its Chinese-graphite pencils yellow because in China yellow denoted royalty. Who knew? Alas for Dixon, pencils have long since become a commodity; their Ticonderoga pencil hasn’t been cool or prized for a protracted time.
Speaking of grade-school and pencils, learning long-hand cursive script may still be briefly taught, but its days are clearly numbered because of the gigantic rise of digital writing for everyone.
In high school and college, when writing papers, formal letters and other documents, I used a typewriter. In college I had a sea-green Hermes 3000 portable typewriter, shown below. A typewriter is a mechanical or electro-mechanical machine for placing alphanumeric characters on paper. In 1575, an Italian printmaker invented the scrittura tattile (literally: tactile writing), a machine to impress letters onto paper, a very crude ancestor of the typewriter. By the 1880s commercial typewriters were becoming more common. Interestingly, the QWERTY keyboard layout was introduced in 1873, and hasn’t changed much despite occasional hand-felt pleas. Like many students, I took a typewriting course in high school, which should have been called qwerty-ing, not typewriting.

 Hermes 3000 Typewriter

Standard designs for typewriters were not common until after 1910. Such standards spurred sales. Thereafter, the typewriter quickly became an indispensable tool for virtually all writing other than personal handwritten correspondence. Typically, a typewriter has an array of keys; when pressed by a finger, each key causes a different single character to be produced on the paper by a type element – originally a typebar – striking a ribbon of dried ink against the page. In the office market, IBM introduced its Selectric typewriter in 1961, which transformed the electric typewriter market by replacing the typebars with an easily-replaceable, spherical element (or typeball). By the 1970s, IBM had succeeded in establishing the Selectric as the prevailing typewriter in mid- to high-end office environments. During the 1980s typewriters began to be displaced by digital devices.
Here’s the digital last-stop on my tour. The majority of writing in Western nations now is done digitally, ultimately using zeros and ones, not styluses, pens or pencils of any sort. Digital writing involves word processing software on your computer and/or phone.
The term “word processing” first appeared in offices in the early 1970s, as a productivity tool for typists. Centralized, word processing-specific microcomputers slowly edged into businesses; Wang Laboratories’ systems became popular in the mid-1970s and early 1980s. When home/personal computers (PCs) became more prevalent in the late 1970s and 1980s, centralization fundamentally folded. Everyone became a “typist,” except the typists, who became nonessential. Both commercial and individual microcomputer owners bought WordStar and then WordPerfect to write documents after laboriously installing them on their computers. Microsoft Word was put on IBM PCs in 1984. Macintoshes had MacWrite. The “email and authoring market” software is currently dominated by MS Office, which had 87.5% of the 2019 market; Google Docs has 10.4%.
This synopsis of our 5000-year journey of script has reached its finale. It’s covered much territory and innumerable written pages of prose and poetry. Imagine what might be coming next.





[1] A: The most-frequently used letter is E; Z is the least-used. That’s EZ, isn’t it?