Tag Archives: Frustration

They paved paradise…

Did a story for the Ledger-Sentinel (although corporate has apparently decided to just call it the Ledger these days) a few weeks ago that recapped the most recent U.S. Department of Agriculture farm census for Kendall County.

This most recent national farm census was taken in 2012, with the results finally released in 2014. I’d been thinking of doing a piece about it around the time it was to be released, but then the whole thing slipped my mind until late spring this year.

Farm censuses have been taken for almost 200 years now, with the first one taken by the U.S. Census Bureau in 1820 as part of the regular decennial population census. That was the practice until 1950, when the census bureau started collecting farm data only in years ending with 4 and 9. In 1978, that was changed to taking the farm census in years ending in 2 and 7. Finally, in 1997, Congress moved responsibility for collecting farm data to the USDA—which seemed pretty logical to me—while keeping the requirement to collect the data in years ending in 2 and 7. Thus the 2012 census.

When I finally got the data entered into my trusty spreadsheet, I have to admit being surprised—astonished, really—at the amount of farmland the census showed had been lost to development in the five years prior to 2012.

Kendall is the only non-Collar County to border on three of the six Chicago metro region Collar Counties, putting it in the perfect spot to absorb overflow population from fast-growing Kane, DuPage, and Will counties.

Previously, the largest amount of farmland lost to development had been the 8,313 acres lost between 1992 and 1997. But between 2007 and 2012, Kendall County lost an astonishing 37,131 acres of farmland to development. In the 57 years prior to 2007, the county had lost a total of only 28,365 acres to development.
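To put those census figures in perspective, here’s a minimal back-of-the-envelope sketch in Python using only the acreage numbers quoted above (the variable names are mine, purely for illustration):

    # Farmland lost to development in Kendall County, per the USDA census figures above
    recent_loss = 37_131     # acres lost in the 5 years from 2007 to 2012
    earlier_loss = 28_365    # acres lost over the 57 years prior to 2007

    recent_rate = recent_loss / 5      # roughly 7,426 acres per year
    earlier_rate = earlier_loss / 57   # roughly 498 acres per year

    print(f"{recent_rate:,.0f} vs. {earlier_rate:,.0f} acres per year")
    print(f"About {recent_rate / earlier_rate:.0f} times the historical pace")

In other words, the county wasn’t just losing farmland faster than before; it was losing it at roughly fifteen times its historical pace.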

Granted, it was clear that the county’s strong growth was going to catch up with it sooner or later. Between 1990 and 2010, Kendall County’s population grew from 39,413 to nearly 115,000. My hometown of Oswego went from 3,914 to 30,303 during the same period.

But in the five years between 2007 and 2012 the biggest recession since the Great Depression hit the nation, and it hit Chicago’s collar counties particularly hard. It’s an indication of just how frenetically the financial industry was inflating the nation’s housing bubble in the years immediately prior to the crash of 2008. Billions of fraudulent dollars were changing hands as vast tracts of farmland in Chicago’s hinterland were purchased, subdivided, and developed. Infrastructure—streets, curbs, gutters, water and sewer lines—was being pushed into place as developers rushed to provide the new homes the financial industry required to keep the bubble inflated through a whole host of actions that ranged from simply unethical to downright illegal.

It took a while for the development train wreck to come to a standstill and the dust to settle. When it did, not only had a bunch of productive farmland been sold for development, but also vast swaths of it had been covered with all that infrastructure listed above. And that meant that while some land sold for development could still be farmed because it was vacant, a lot of it simply could not.

The disappearance of so much farmland capped a long-term period of population growth in Kendall County, particularly in its northern three townships, but also in the county’s eastern tier of three once almost entirely rural townships. Oswego, situated in Kendall’s northeast corner, is a member of both groups.

Until this most recent flood of growth hit in the 1990s, NaAuSay and Seward townships, situated directly south of Oswego, had no municipalities within their boundaries. But then Plainfield began expanding across the eastern border of NaAuSay Township, while Joliet and Minooka began intruding into Seward. And that’s how come some residents of Joliet and Plainfield send their children to Oswego schools. It’s also one more reason why so much farmland was lost to development in the five years prior to 2012.

The northern tier of Kendall’s townships—Little Rock, Bristol, and Oswego—had been undergoing growth for years prior to the inflation of the housing bubble. Oswego and Bristol, especially, were the subject of growth hurtling down the corridor along U.S. Route 34—called Ogden Avenue east of the Kendall County border—that accelerated to extraordinary levels after the construction of the huge Waubonsie Interceptor sewer line. The 60” diameter sanitary sewer line was built down the Waubonsie Creek valley from what was then called the Fox Valley Mall to Montgomery, where it crossed the Fox River to the Fox Metro Water Reclamation District’s treatment plant.

As soon as adequate sanitary sewer capacity was available, residential and commercial growth along the Route 34 corridor in Kendall County exploded. Why? For the same reason folks found the county a good place to live way back in the 1830s. Back then, the search was on for cheaper land that was good for farming in a location not too far away from the Chicago market, which even in the early 1830s had begun to grow. As soon as U.S. Army engineers figured out how to drive a permanent channel through the sandbar at the mouth of the Chicago River, creating for the first time a safe harbor for Great Lakes shipping, that growth turned exponential.

The Chicago region’s population grew outward from the Lake Michigan shore, first spurred by the Chicago, Burlington & Quincy’s commuter line that terminated at Aurora, and then after World War II by the web of multi-lane limited access expressways that stretched from Chicago north, south, and west.

Oswego Township was picked for industrial development in the early 1950s, with sprawling Caterpillar, Inc. and Western Electric plants built. Plenty of land was available at relatively low prices in the area that was outside the Chicago metro area, but close enough, and with the necessary rail connections industry of that era required.

And at the same time, developer Don L. Dise, hearing about the coming construction of those facilities, decided Oswego Township was the perfect spot to build Kendall County’s first super subdivision. He picked the huge Boulder Hill Stock Farm, owned by the Bereman family, as the location for his development, located right across the Fox River from the new Western Electric and Caterpillar plants, figuring the plants’ workers would need housing. Eventually, the Cat plant alone employed more than 7,000.

Calling his new planned development Boulder Hill after the former livestock farm, Dise proposed building out neighborhoods to attract all economic levels, from executives to factory workers. And he succeeded, attracting an eclectic mix of new homeowners, from CB&Q executives to Caterpillar and other local factory line workers, with most of the first homes financed thanks to the post-World War II GI Bill. Not only did the GI Bill promote home ownership, but it also encouraged veterans to get college degrees, which allowed the millions who served in the war to move up to better jobs, and then buy brand new houses from Dise and other developers.

The first families moved into their new Boulder Hill homes exactly 60 years ago.

The late 1950s brought the county’s first big spasm of growth. Between 1950 and 1960 Oswego Township’s population doubled. Then it doubled again between 1960 and 1970. As growth to the east continued to accelerate even faster, refugees from Cook and DuPage counties looking for cheaper housing, less traffic, and a small-town atmosphere continued to move into Kendall County, but growth was relatively restrained until the Waubonsie Interceptor literally opened the floodgates.

And that touched off the next era of growth, which both flowed and ebbed several times before the financial industry, with the help of Congress and President Bill Clinton, who removed the regulations that had kept it within reasonable limits, hit on the idea of securitizing mortgages. Not only did they securitize mortgages, they also figured out how to defraud the entire real estate financing system by methods ranging from forcing appraisers to artificially inflate existing home values to justify bigger loans, to gaming the home loan system itself so that mortgages could be awarded to those who could not afford them. Which was fine, because the goal was not to make money off house payments, but rather to make it by selling the mortgages (sometimes several times), bundling them, and dividing them into batches so they could be securitized into bonds for sale to investors. Since the bond rating agencies were in on the fraud, and since government was not allowed to regulate the bonds, the amount of farmland purchased at greatly inflated prices to feed the need for more and more mortgages to be sliced and diced and sold to suckers was substantial.

In Kendall County alone, it amounted to that 37,000-acre loss in just five years mentioned above. As the bubble inflated between 1990 and 2010, Kendall County’s population nearly tripled. In fact, according to the U.S. Census Bureau, between 2000 and 2007, Kendall was, in percentage terms, the fastest growing county in the nation.

Then the crash came, but here we sit nonetheless.

And what happened to all those farmers as land was gobbled up by developers? Glad you asked. More later…

 


Filed under Business, Farming, Frustration, History, Kendall County, Local History, Oswego, People in History, Transportation, Uncategorized

112-year-old urban barn in danger of demolition

One of Oswego’s most historic urban barns has been in the news lately, and in a troubling way.

Our local library district has bought the land on which the small barn sits, occupying just a tiny corner of the parcel, and they’ve announced plans to demolish the barn. According to the library district, there is “no record of any historical significance for the” barn and the small rental cottage on the property. They’re wrong, of course.

Urban barn plan with carriage room and stalls for horses and other livestock from Barn Plans and Outbuildings by Byron D. Halsted, Orange Judd Company, New York, 1881.

Urban barns, as an architectural class, are usually pretty unassuming. Virtually all of them (with the exception of the odd modern knock-off) were built in the 19th and early 20th centuries. Back in that day and age, lots in villages, and in many towns and cities as well, resembled tiny farmsteads plunked down in urban settings.

Until sanitary sewer systems were widely introduced in the first half of the 20th Century, each small town residential lot featured the family home, an outhouse, and an urban barn. Sometimes a smokehouse and a small chicken house were also included.

Like their larger rural siblings, urban barns housed the family driving horse as well as the family buggy or carriage and often a cutter—a one-horse open sleigh—for winter travel. In addition, the barn also provided a home for the family milk cow, and often a small flock of chickens provided they didn’t have their own chicken house on the lot.

The urban barn here at the Matile Manse started out its life as a timber-framed (oak and black walnut) saltbox-style house. It was moved a few dozen feet south in 1908 and my great-grandparents built their retirement home on its former site.

When the horse and buggy era ended, urban barns were easily converted into auto garages, workshops, and homes for lawnmowers and lawn sweepers. Over the years, some of those urban barns have even been converted into residences. In other cases back in the day, unwanted residences were also converted into urban barns—like the one here at the Matile Manse. When my great-grandparents bought the property our house sits on, it was already occupied by a timber-framed saltbox style house. Back then, in 1908, folks weren’t so quick to tear old buildings down. So the old house was put on rollers and moved a few dozen feet south to make room for the new house—and turned into an urban barn.

Today, Oswego has a fine collection of 19th and early 20th Century urban barns, possibly one of the best such collections in the Fox Valley. According to the village’s 2009 historic structure survey, conducted by Granicki Historical Consultants of Chicago, “Oswego stands apart from other towns in Northeastern Illinois with its enduring collection of urban barns.”

Granicki counted 22 urban barn examples in the village and labeled six as historically significant in their final report—including the one for which the library district could find “no record of any historical significance.” Which suggests they weren’t looking very hard.

Historical preservation got a good start back in the last decade when the village established the Oswego Historic Preservation Commission. And shortly after, they paid for the historic structures survey. But since then, it’s been pretty much downhill when it comes to preserving local historic structures. Changes in the village board’s membership, as well as turnover among other top elected and appointed officials, have basically led to the commission being severely marginalized, with officials withdrawing support and even treating the group with outright hostility.

Unless community pressure changes its mind, the Oswego Public Library District plans to demolish the old Kohlhammer Barn.

The urban barn the library district would like to tear down differs from many others in Oswego in that we know who built it, when, and why.

The May 18, 1904 Kendall County Record’s “Oswego” news column reported that “Fred Kohlhammer has completed the excavation for the basement for his barn on his lately acquired land north of the Waubonsie.”

Kohlhammer was a well-known German-American contractor in the Oswego area who built homes as well as farm and commercial buildings. The parcel he purchased was bordered to the south by Waubonsie Creek, to the east by the East River Road (now Ill. Route 25), to the north by North Street, and to the west by the Chicago, Burlington & Quincy Railroad’s right-of-way. The site Kohlhammer chose for his house was on a rise overlooking the creek, where the land fell off sharply towards the stream bank. Later, the Kohlhammers would extensively landscape that steep slope with perennials, rock gardens, and an artificial stream, the water for which was pumped from the creek by a scale model Dutch windmill. Interestingly enough, for the driveway to the house, Kohlhammer made use of a short stretch of the old Chicago to Galena Road that cut through the parcel on its way to the limestone-floored ford across the Fox River just to the west.

Kohlhammer placed his barn at the corner of the East River Road and North Street where it would be handy to the house and where it would be easy to store the family buggy and other equipment.

He chose to build a modified bank barn, with an upper floor for buggy and tack storage and a lower level for the horse and cow stalls. And he built neatly and well, because upon completion in July 1904, the whole Kohlhammer family moved in while construction was on-going on their new home right next door. In early 1905, the family moved into their new home, and the barn was given over to its original purpose.

Some years ago, the barn and small adjacent rental cottage and remnant of the oak savanna on which they stood were separated from the house and sold as separate properties. And now the library district has purchased the barn and rental cottage and the oak savanna remnant. While the rental cottage, which dates to the early 1950s, is not historically significant, the urban barn certainly is.

So what will happen to this endangered urban barn? Well, the folks who own the house Fred Kohlhammer built in 1904 want to buy it from the library district and restore it. It’s situated on the extreme northeast corner of the property, meaning that piece could easily be clipped off and sold to the homeowners, who really want to restore and preserve it. So the barn could be saved at no cost to the taxpayers and the community would continue to enjoy a link with Oswego’s past that’s been standing on the same spot for the last 112 years.

Seems like a win-win, but then again modern folks usually seem fixated on demolishing things they neither want nor understand. Time will tell on this one.

And on a somewhat related note, I’m going to be giving my “Barning Around: Kendall County” presentation this coming Thursday, April 14, starting at 7 p.m. at the Big Rock Historical Society, 48W445 Hinckley Rd., Big Rock. It’s free, and the Big Rock folks won’t mind a bit if you stop by.


Filed under Architecture, Farming, Frustration, History, Kendall County, Local History, People in History, Semi-Current Events

Common sense seems to be a vanishing commodity

In February, new legislation in the State of Kentucky legalized what is called ‘herd sharing,’ meaning folks can buy into a herd of dairy cows, which allows them to share the milk and other products the herd produces without such pesky requirements as pasteurization.

To celebrate the passage of the new law, a Kentucky legislator brought a jug of raw milk and passed it around to his lawmaking colleagues. Whereupon they all got sick. The politicians insist that it was mere coincidence they happened to become violently ill immediately after drinking raw, unpasteurized milk, and it may well have been. But probably not. At least nobody died that I know of, which may or may not be a good thing for the people of Kentucky.

The episode got me to thinking about my own childhood. I was either extremely lucky or disadvantaged, depending on your outlook on life, to have been born at a time when television had not yet become ubiquitous, indoor plumbing had only just become the norm, and diversified farming was morphing into the modern grain or livestock operations common today. It really was the end of an era, and the dawn of a new one.

The farm my parents rented from Clarence and Elsie Butcher. The barn where my dad milked Daisy is in the background, the disused outhouse is hidden beside the garage at left, and my mother’s chicken house is barely visible through the trees at right.

When my mother brought me home from Copley Memorial Hospital in 1946, it was to a house with a new indoor bathroom that my preteen sisters greatly enjoyed. Granted, the bathroom was in the basement, but that was a small trade-off given no more winter or rainy day trips across to the outhouse and the benefit of an indoor bathtub complete with a water heater. We listened to the radio, not TV, and my sisters’ music collections were filled with 78 rpm records that shattered if dropped—or if their little brother stepped on them.

By that time, subsistence farming was long gone in the United States, although vestiges of it remained. My mother did not work outside the house and my father did not work off the farm, but that aspect of farming life was beginning to change even then. Instead of working off the farm, my mother managed the garden and fruit orchard and raised chickens. It all sounds sort of like some modern suburban areas, but her garden was huge, and the orchard was sizable. Each year, she canned dozens and dozens of quarts of vegetables and fruit. And when my grandparents bought chest-type deepfreezes for their children one Christmas, all those veggies and fruits were frozen for use throughout the rest of the year.

Not only did her chickens produce eggs for family use, but the chickens themselves were a year-round source of fresh meat. The eggs over and above those used by our family were carefully washed, candled, and packed in egg crates to be taken to town on Saturday and traded for the groceries we didn’t produce ourselves on the farm.

Maybe if you’ve seen one Guernsey you’ve seen ’em all, but this lady really does look a lot like the Daisy I remember.

Which brings me to that raw milk I mentioned above. We had a cow named Daisy, a gentle Guernsey. My dad favored Guernseys for his family’s cow because they produce milk with a very high butterfat content. That meant Daisy produced more cream per gallon of milk than the Holsteins most dairy farmers favored, but less actual milk than those black and white Holsteins.

My dad milked Daisy morning and night out in one of the former horse stalls in the barn. He was an expert hand-milker, sitting on his three-legged stool, quickly filling the polished steel milk pail with quick, sure pulls on Daisy’s teats. The barn cats were always drawn by the sound of the milk hitting the bucket, and as they gathered around, Dad would send a squirt of milk first to one and then to another, which they learned to catch in mid-air. Just writing that last sentence brings back the sounds and the smells of that time and place…the warmth of the barn even on a cold winter’s day, the excitement of the cats and kittens, and the glint of humor in my dad’s eye as he accurately shot those milky treats around the semi-circle of hungry cats.

Our cream separator sat in a corner of the basement. It had to be thoroughly cleaned after each use, a job my sisters did, but not altogether willingly.

After Daisy was turned back out into the pasture, Dad took the milk in the house and down the basement, where he put a new filter in the separator and poured in the milk. Our separator was blue and sort of resembled a miniature municipal water tower on four legs. The milk went in the top, and the ingenious tool used gravity and centrifugal force to separate most of the cream from the milk.

My folks used Daisy’s rich cream for their coffee, and mom used it for cooking. After enough of it was saved up in the refrigerator, we took it to my grandmother, who used her electric churn to turn it into rich butter. After the butter was turned out of the churn, she would work it with a flat paddle in a large, shallow wooden bowl to force the buttermilk out, and then work in the salt she sprinkled on. My dad loved buttermilk. It’s a taste I never acquired.

We drank the milk that came out of the separator and used it on our cereal, and mom used it for cooking. I wasn’t so fond of cream in those days, and even though it had been run through the separator, after a fresh batch of milk sat in the refrigerator overnight, a thin skim of cream would form on the top—which my parents happily skimmed off with a teaspoon to color their morning coffee.

Daisy always produced more milk than the five of us could consume, so when enough extra was saved up, Mom and Dad took it over to Aunt Bess McMicken, who then turned it into cottage cheese by some magical process which I never really saw. All I knew was that milk went to Aunt Bess’s and wonderful cottage cheese came back packed in metal containers.

Some farmers during that era had begun pasteurizing their family’s milk, but my dad said that as long as you know where the milk came from, and if you made sure that cow was healthy, there would be no problems with drinking raw milk.

Which brings me back to those Kentucky politicians and their new law. My dad was a wise man, and his insistence that only raw milk from a known healthy cow was safe is, it seems to me, exactly what’s at risk with this ‘herd sharing’ scheme. Who, exactly, will be responsible for assuring every cow in that herd is healthy enough that its milk doesn’t need to be pasteurized? And what are the shared liability issues? If I were a parent with children, I certainly wouldn’t want to take a chance that milk might be safe when the simple process of pasteurization would assure its safety.

Assuring the safety of milk for human consumption was one of the great scientific achievements of the 19th Century, something that saved countless lives and avoided tragedy on what would be, for us modern Americans, an almost unbelievable scale. But now, well-meaning but essentially clueless people who are alive today thanks to food safety and health regulations, from vaccinations to milk pasteurization, are eagerly discarding them for their own children in some sort of back-to-nature scheme. I guess I’m not very worried that parents with crackpot ideas may poison themselves or give themselves preventable diseases, but it does concern me that they may be sentencing their children to disease and death.

 


Filed under Farming, Food, Frustration, Local History, Nostalgia, Uncategorized

Seward Township’s namesake a reminder bigotry can be fought

Religious bigotry is all the rage in some political circles these days as everyone from candidates for state office right up to Presidential candidates contend that Muslims are born terrorists. And I have to say that since the goal of the actual terrorists is to terrorize, they seem to have succeeded beyond their wildest dreams, at least among the right wing here in the U.S.

One of the real downers for those of us who enjoy reading and writing about history is that it seems no one ever learns a thing from it these days. And since learning from past mistakes is one of the major reasons for studying history in the first place, it is extremely frustrating.

The current rampant fear-mongering on the right is a good case in point. In fact, it’s several cases in point. There hasn’t been a national crisis in years that the right hasn’t used to sow fear to create panic and dissension. Last week’s shooting rampage, the one out in California by a couple of unhinged Muslim malcontents, seems to have driven more than a few folks right over the edge, something that didn’t happen, for some reason, with the recent rampages by the Christian terrorists who gunned down prayer meeting attendees at an African Methodist Episcopal church in South Carolina and people unlucky enough to be near a Planned Parenthood clinic in Colorado.

Hate, when you get right down to it, is pretty ecumenical. Time was, the fashionable folks to hate were Catholics, the Irish, and Germans. There was enough hate percolating around, in fact, to settle out and create the basis of an entire political party.

In the 1840s and ‘50s, the United States was experiencing a strong surge of ethnic and religious rancor and fear, with Catholics bearing the brunt of the hate of the dominant Protestants. The fear arose from the increasing numbers of Irish and Germans who were immigrating to the U.S. to escape everything from famine to war. Most of those new immigrants were Catholic and as their numbers increased, they posed a potential political challenge to the Protestant establishment.

In 1850 residents of one of Kendall County’s nine townships voted to name it after Benjamin Franklin. Unfortunately, they had to come up with a new idea shortly thereafter.

As the number of Catholics continued to grow, so did the conspiracy theories about why they were immigrating, chief among those theories being that the Pope was planning to subjugate the nation by increasing the Catholic vote. To combat the perceived threat, a number of secret societies were founded with names like the Order of the Star Spangled Banner and the Order of United Americans. Society members worked to establish political parties that would champion their views—the existing Whig and Democratic parties were considered far too cozy with foreigners—and the end result was the American Party. Because of its secret society foundation, members of the new party were encouraged not to divulge its stands on the most contentious issues, but instead were instructed to reply to questions with “I know nothing.” It didn’t take long for opponents to dub party members the Know Nothings.

The American Party ran candidates for state and local offices in the 1856 election with platforms that promised to crack down on crime, enforce Sunday saloon closings, encourage the use of the Bible and prayer in public schools, and appoint only American-born citizens to government positions. The party was strongest in Massachusetts and New York, but American Party candidates ran in many states.

In 1856, voters here in Illinois had the opportunity to vote for American Party candidates for President and Governor. Former Whig President Millard Fillmore was their Presidential candidate, while Buckner Stith Morris, a former mayor of Chicago and sitting Lake County circuit court judge, ran for Governor of Illinois.

In Kendall County, the appeals to religious and ethnic bigotry didn’t get much traction. Morris polled only 10 votes for Governor, while the GOP candidate, William Henry Bissell, not only swept Kendall with 1,615 of the 1,954 votes cast, but became the first Republican Governor of Illinois.

Fillmore, too, was crushed in Kendall County, polling just 13 votes out of 1,969 cast for President.

Unfortunately, the winner of the 1856 Presidential election, Democrat James Buchanan, proved to be one of the nation’s worst Presidents, whose incompetence sped the country’s march towards civil war.

Forced to discard the Franklin name, county residents next chose to name their township in honor of popular New York politician William H. Seward.

Which, strangely enough, brings us to Seward Township here in Kendall County. In 1850, the Illinois General Assembly allowed counties to adopt the township form of government as opposed to the former commission form. Under the commission form of government, three commissioners formed the entire county board. Under the township form, the supervisors from each township in the county formed the county board.

Since so many of Kendall County’s new residents came from states back east where township government was the norm, it was quickly adopted by the voters here, meaning the county’s townships needed official names. The folks living in Seward Township decided to name their township after Benjamin Franklin, and the name appeared on the township’s official U.S. Census returns for 1850. But there was a conflict with another Illinois township of the same name, and so residents, so many of whom had emigrated from New York State, decided to change it to Seward, after William H. Seward, former governor and U.S. Senator from New York—and Abraham Lincoln’s future U.S. Secretary of State.

Seward was born May 16, 1801 in Orange County, N.Y. He was admitted to the New York bar in 1822 and quickly became involved in state politics, including, in the late 1820s, the anti-Masonic movement. He was elected to the New York State Senate in 1830. After an unsuccessful campaign for governor in 1834, he ran again in 1838 and was elected.

He was an early anti-slavery advocate, a popular position in New York outside of New York City, and in 1848 his stand against slavery got him elected to the U.S. Senate under the Whig Party banner.

As Seward rose in New York’s political hierarchy, he came into constant conflict with the state’s nativists, who eventually turned into the Know Nothings. He was never sympathetic to the growing nativist movement, and after that brief fling with the anti-Masonic movement, he seems to have developed both ethnic and religious tolerance unusual for his day.

By the early 1850s, Seward’s Whig Party was beginning to disintegrate, split by the slavery issue into the “conscience” and “cotton” wings, and the Know Nothings were eagerly waiting to step in and pick up whatever pieces they could grab. Oddly enough, many northern Know Nothings, while rabidly anti-foreigner and anti-Catholic, were also anti-slavery. As a result, they worked hard to convince anti-slavery “conscience” Whigs to join the American Party at the expense of the then-brand new Republican Party.

Buckner Stith Morris, the American Party’s candidate for Illinois Governor in the 1856 election was trounced in Kendall County and the rest of the state.

Seward, by then a political power in his own right, decided the Know Nothings were both wrong and dangerous, and he determined to fight them for political control of New York. Even so, in 1852—nearing the height of their power—Know Nothings took virtual political control of the state. But just two years later, Seward won a resounding and overwhelming victory, gaining reelection to the U.S. Senate, this time as a Republican.

What had happened during those few years? Just as had happened to the Whigs, the northern and southern wings of the Know Nothings split over both slavery and Catholicism. Many Louisiana Know Nothings were Catholic, something northern party members couldn’t abide. Nor could southern party members abide the northerners’ anti-slavery position. After managing to run a Presidential candidate in 1856, the Know Nothings pretty much collapsed.

The Know Nothings’ residue drifted to either the new Republicans, or the Democrats—who themselves were beginning to splinter into slavery and anti-slavery wings. During Buchanan’s disastrous Presidency, slavery became the premier sectional issue, one that eventually split the entire nation, temporarily dampening nativist and religious bigotry as the nation was engulfed in war.

Seward Township’s name is a reminder of that unfortunate era of American history when bigotry was formalized into a national political party. Today’s politicians, especially Republican Presidential contenders, would do well to take heed of what happened to the Know Nothings when political bigotry got out of hand. The question is, will there be another William Seward waiting in the wings to save the GOP from itself?


Filed under Frustration, History, Illinois History, Kendall County, Local History, People in History

Is it twenty-five or six to four, or three?

Thought I’d put this post up today after I dredged it out of the archives, since it’s once again time to “Fall Back.” Hope you all successfully set your clocks back last night…

Seems to be a lot more whining and complaining this year about the switch to Standard Time from Daylight Savings Time.

Not sure why that is, except people seem to be getting more and more disgusted with just about everything these days. Not that our twice-annual clock movement ritual makes much sense. It seems to be one of those things we keep doing just because we’ve ‘always’ done it. Which isn’t entirely true, although we’ve been fiddling with the concept for a couple hundred years now.

Benjamin Franklin first proposed the idea of daylight savings time (DST) in 1784, but it wasn’t until 1895 that New Zealand entomologist George Vernon Hudson proposed its modern incarnation, apparently so that he would have more daylight hours in the summer to collect bugs. Insect collecting aside, it wasn’t until the idea was pitched as a way to save energy during World War I that it got a governmental boost. Although the idea was controversial, especially with farmers who argued their cows and chickens didn’t use clocks so what was the use, DST was approved as a patriotic measure to help win the war.

This World War I poster urged everyone to support the war effort by supporting Daylight Savings Time. Although willing to go along during the war, farmers in particular lobbied hard to get rid of it after Armistice Day.

Kendall County Record Editor Hugh R. Marshall observed that the idea hadn’t proven as annoying as many feared, asking in the April 3, 1918 edition: “Didn’t mind it, did you? You never noticed the change of time after the novelty wore off, but did you notice that you did not burn so much light at night as before?”

But farmers still didn’t like it, and they were a powerful lobby at the time. As a result, DST was repealed in 1919, despite a veto by President Woodrow Wilson—which Congress promptly overrode.

Although DST was out as a national mandate, local governments had been, unwisely it developed, given the authority to establish it in their own communities. The result was a confusing hodgepodge of times all over the country as some areas adopted it, while others did not.

As the Record reported on April 9, 1930: “Chicago daylight saving time, the bane of hundreds of commuters residing in the Fox valley cities, will be ushered in Sunday, April 27…Suburban trains and the third rail lines operate on the daylight schedule while through trains are operating on central time.”

The problems this situation caused are self-evident. And really, confusion did reign.

So why couldn’t everyone just vote on it? Well, some did. On April 14, 1937, the Record reported from Oswego that: “At the village election to be held next Tuesday, April 20, the question of whether or not to have daylight saving time in the village of Oswego will be voted on.”

The result: “The vote for daylight savings time in Oswego carried at the town election on Tuesday, April 20. All meetings of the churches and schools will be on the fast time. The Presbyterian prayer meeting on each Tuesday night will begin at 8:30.”

So the Village of Oswego would run on DST in the summer, but there was no mandate that anyone in the surrounding countryside had to. The grumbling and confusion continued, with some towns adopting it and others deciding against. Figuring out which communities were operating on “fast time” and which ones weren’t remained a challenge.

Still extremely unenthused about the whole thing, on Oct. 1, 1941, the Record’s Oswego correspondent complained: “Oswego is to be afflicted with daylight savings time for another month.”

Although it still wasn’t popular in rural areas, year round Daylight Savings Time–dubbed War Time–was adopted by Congress in 1942.

DST remained an often contentious local issue until the world went to war once again, and forced the federal government’s hand. Congress enacted the War Time Act on Jan. 20, 1942.

Kendall County communities quickly complied with the new mandate they saw on the horizon—in fact, they jumped the national gun. On Feb. 4, 1942, the Record reported: “The Yorkville village board voted at its meeting Monday night to adopt war time, which is one hour faster than central standard time. War time becomes effective on Monday, Feb. 9. If you will set your clock ahead upon retiring Sunday night, you will get up Monday all square as far as time is concerned. Oswego adopted war time at its board meeting on Tuesday night.”

Five days later, Congress established year-round DST—the War Time already in effect in Kendall County—throughout the United States. The reason given, just as during World War I, was the wartime need to conserve energy resources it was felt could better be used to fight the nation’s enemies.

War Time remained in effect until after the end of the conflict, when the Amendment to the War Time Act was passed on Sept. 25, 1945, ending DST as of Sept. 30, 1945.

And by the time the war was winding down, local folks were anxiously looking forward to the end of War Time. As the Record’s Oswego correspondent happily wrote in the Oct. 31, 1945 edition: “O! the joy and peace and contentment when the [radio] announcer is heard to say, ‘We have no two-timers this morning; Central Standard has come to stay,’ (we hope).”

But Kendall County was not done with DST after all. On April 24, 1946, the Record warned its readers: “Don’t forget to move your clock an hour ahead when you go to bed Saturday night. A large number of the towns in Kendall county are going on daylight saving time. It may be confusing until we find out for sure who is and who isn’t on fast time, but it will work out. Better check to be sure what time your church services are, and for train and bus times.”

The next year, the Record was still warning county residents: “If you don’t turn your clock ahead, bear in mind that most events in these parts will be held on daylight time.”

The confusion continued, not only with some states and some communities adopting DST and others not, but with some of those localities adopting it on different dates.

Since there are so few farmers left these days, it appears to be up to the rest of us to grumble about Daylight Savings Time.

In 1958, for instance, Minnesota switched from DST to standard time on Sept. 2. But Wisconsin and California didn’t “fall back” to standard time until Sept. 28. And so, reportedly, did 300 of the 800 Illinois localities who were operating on DST. But the Chicago metropolitan area, along with the East Coast, wouldn’t change back to standard time for an entire month. Indiana, of course, suffered through a veritable mishmash of changes from DST to standard time, depending on which county a traveler happened to pass through.

Clearly, something needed to be done, and the transportation industry was willing to pick up the ball and run with it. Back then, Congress was actually willing to pass legislation to benefit the entire country, and the result was the Uniform Time Act of 1966. Starting in 1967, the feds mandated when DST and standard time started and stopped nationwide. States could apply for exemptions, which some did—thus the continuing confusion in neighboring Indiana. But for us here in Kendall County, the era of DST in town and standard time out in the country was finally over.
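If you’re curious what all this fiddling actually does to the clock, here’s a minimal sketch using Python’s standard zoneinfo module (Python 3.9 or later assumed; the 2024 date is simply a convenient recent example) of how the “fall back” hour repeats itself in the America/Chicago zone we’ve been on since the Uniform Time Act took effect:

    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo

    chicago = ZoneInfo("America/Chicago")

    # On fall-back night (Nov. 3, 2024), 1:30 a.m. happens twice:
    # once on Daylight Savings Time, then again an hour later on Standard Time.
    first = datetime(2024, 11, 3, 6, 30, tzinfo=timezone.utc)   # 1:30 a.m. CDT
    second = first + timedelta(hours=1)                         # 1:30 a.m. CST

    for moment in (first, second):
        print(moment.astimezone(chicago).strftime("%Y-%m-%d %I:%M %p %Z"))

Run it and the same wall-clock time prints twice, once as CDT and once as CST: exactly the sort of two-timing ambiguity that delighted the radio announcer quoted above back in 1945, only now it’s quietly handled by software.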

So the next time someone asks what the deal is with Daylight Savings Time, you can explain it all revolves around expanded daylight hours to collect insects in New Zealand, which makes about as much sense as many of our other traditions do.


Filed under Farming, Frustration, Kendall County, Local History, People in History, Semi-Current Events

Grouchy old retired editor yells at punctuation clouds…

I consider myself a reasonable person. At least in most things. I don’t consider myself a grammar Nazi, either. But I have to admit there are some things, grammar-wise, that people do that drive me absolutely crazy.

Chief among these things is the misuse of the friendly, useful apostrophe and his little buddy, the comma.

Apostrophes are handy things. They give readers all sorts of useful clues, mostly concerning who owns what. There are, for instance, lots of moms, but my mom’s recipe for pie crust is superior. See what happened there? More than one mom turned into a single, possessive mom, and all it took was an apostrophe.

Commas, those little crescents that look like ground-based apostrophes, are our friends, too. They tell us what sorts of things go together, what things need to be considered separately, and sometimes where we ought to take a breath when we’re reading out loud.

Misuse of these entirely practical little squiggles is a plague on our society. Not to mention the world and quite possibly the universe. I’ve been fighting against it, in a quiet sort of way, ever since I got into the editing game. My general rule in life is “Moderation in all things,” and when it comes to punctuation it’s even more true. Fewer apostrophes and commas would, I think, be a kindness to everyone. It would certainly make for kinder, gentler editors.

Lo, those many years ago when I was toiling in the editorial fields, I gradually became aware that overuse of commas was driving me crazy. It was a serious problem when we were still typing stories on our trusty upright Royals. I became adept at the squiggle that tells the typesetter to treat all those invasive commas as invisible. But then we started using those little TRS-80 laptops, and removing excess commas—which was most of them—became a laborious pain since it had to be done one at a time.

And then glorious technological progress! Macintosh computers, friendly little boxes that looked like Wall-E, sort of sidled into the newspaper office and became our boon companions, running early versions of Microsoft Word and spitting out copy on nearly silent LaserWriters. And with Word came the wonderful ability to seek and destroy! Errant commas could no longer hide from my blue pen or amongst the legitimate characters on a small LCD screen; squiggles were no longer necessary to excise the little buggers from copy.

And this was a Godsend, especially when it came to editing sports copy. I really liked all the sports writers. I went along with jargon and buzzwords and clichés. But all those extra, extraneous commas? No! Which is where the search and destroy function came in so handy. First thing I’d do is search for commas and replace them with nothing at all (whoever thought up that idea is a genius on a par with Einstein), because there were generally only a dozen or so needed in any given piece and I was sometimes getting a dozen a sentence. Not that I begrudge the serial comma, of course. That’s the one place I make an exception. Strangely enough though, those comma nuts seldom use the serial comma, which would mean I’d actually have to insert commas.
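For the record, here’s roughly what that blanket search-and-destroy pass looks like as a few lines of Python rather than Word’s find-and-replace dialog (the sample sentence is invented for illustration; a real edit would then re-insert the handful of commas that actually belong):

    # A comma-afflicted sentence of the sort described above (invented example)
    copy = "Smith, scored, twice, in, the, fourth, quarter, to, seal, the, win,"

    print(copy.count(","))            # how bad is it? 11 commas here
    stripped = copy.replace(",", "")  # replace every comma with nothing at all
    print(stripped)                   # clean copy, ready for the dozen that belong

The blunt approach works because starting from zero commas and adding the few you need is far faster than hunting down the dozens you don’t.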

Unlike commas, apostrophes seemed to create confusion and hesitation. When it came to commas, writers threw handfuls, barrels full, boxcar loads of the things into perfectly innocent paragraphs and sentences. But with apostrophes, usage seems to be one of the universe’s particularly tangled mysteries to many writers. They appear to get nervous if they haven’t used one in a while, so they seem determined to stick them in randomly, just to keep their hands in and the copy interesting.

“The Smith’s liked that,” they’d write. “American’s are just fine,” they scribble. And what is the poor copy editor supposed to make of such writing? Smiths and Americans are just fine, all of them, without throwing apostrophes at them on the off chance they might make sense. Really they are.

I tell you, commas and apostrophes were banes of my existence, but they became less baneful after I hustled out of the office door following a particularly nice going-away party—even if I was pressed back into emergency service for a while afterwards, and even if I didn’t get a second nice going-away party. I was not bitter, however, because I knew I’d never have to edit another sports story written by someone with a comma fixation ever again.

However…however, I still read. A lot. And those misplaced commas and apostrophes still grate on me when I see them. I’m not quite as militant as Lynne Truss, author of Eats, Shoots & Leaves, who has been known to harangue theatre owners over errant apostrophes on marquees—and even steal them if she can reach high enough to snatch them away from places they should not be. Ever.

The title of this book is an obvious, transparent attempt to rattle the cages of those who prefer their apostrophes to be used correctly.

I don’t do that. But I grouse. I complain. I bore my wife. I can’t help it. When I see a book jacket with a really nice typeface spelling out the title, Unknown Wars of Asia, Africa, and The America’s That Changed History, I can’t help it. I ask myself, “America’s what?” No apostrophe is needed there; IT IS NOT A POSSESSIVE! It is meant to be a plural. Why is that apostrophe there? Did the book’s art director decide to stick it to grammarians because he had a bad experience trying to diagram sentences in seventh grade? Or perhaps he’s new to this country. Having come from Luxembourg only the week before, it’s possible he’s unfamiliar with proper apostrophe use. Or maybe she’s from south of the Ohio River. I understand they do terrible things to sentence structure down there because they’re still angry that Sherman invented urban renewal in Atlanta, only he started in the white parts of town.

So anyway, I think I’m feeling better now and besides, it’s time for supper. Writing is easier than a lot of us make it, and harder, too. Most of the time, less really is more. And a good supper cures many ills.


Filed under Frustration, Newspapers, Uncategorized

Fear’s not a bug; it’s a feature of our modern system

So I see by this morning’s news that the doc in New York who had contracted Ebola while treating patients in West Africa has fully recovered.

So far, we’ve had an Ebola epidemic consisting of one aid worker, a missionary, three doctors, and an NBC cameraman who brought the disease home with them from West Africa, all of whom fully recovered. In addition, one visitor brought the disease with him from Liberia and subsequently died. Two nurses were infected while caring for him and also fully recovered. Thus ends the great Ebola epidemic of 2014.

While the country was never in danger from Ebola, it certainly was from the panic, ignorance, and cowardice displayed by a huge chunk of the U.S. population and their political leaders.

Ebola is one of the viral hemorrhagic fevers that afflict mankind by interfering with the blood’s ability to clot. The viruses can also damage the walls of the body’s tiny blood vessels, causing them to leak. That can result in death, often made all the more frightening because of the mystery of what’s happening.

This, of course, was not our first go-round with viral hemorrhagic fever in this country, only the most recent. Given the scientific illiteracy of the modern United States, however, no one—least of all the politicians and media blowhards trying to make political hay out of the unreasoning fear they were busily propagating—remembered what had gone before.

There aren’t a lot of viral hemorrhagic fevers (VHF), but the list includes some of the most frightening names in modern medicine: Dengue, Ebola, Lassa, Marburg, and Yellow Fever.

Here in the U.S., our VHF experience was mostly with Yellow Fever, a disease that could honestly be nicknamed The Slaves’ Revenge.

There was no Yellow Fever in North or South America until the virus was brought here from Africa in the bodies of slaves, starting in the 16th Century. The Yellow Fever virus is transmitted only by the bite of an infected mosquito, although it would take hundreds of years for medical researchers to figure it out.

The disease was as horrifying as it was mysterious. Those stricken suffered high fevers, chills, nausea, muscle pain (particularly in the back), and severe headaches. After that first phase, most victims then suffered through a second, more toxic stage that caused severe liver damage, resulting in the jaundice that gives the disease its name, and a painful death.

Yellow Fever was apparently brought to the Caribbean islands and South America by African slaves imported by the Spanish. It didn’t take it long to spread north. The first outbreaks in what would eventually become the United States took place in New York City in 1668 and Philadelphia in 1669. At least 25 major outbreaks followed, including a major one in 1793 in Philadelphia—then the nation’s capital. The national government evacuated the city as nearly 10 percent of its population perished.

Periodic Yellow Fever outbreaks continued throughout the balance of the 18th and into the 19th centuries, with Louisiana and Florida suffering periodic flare-ups, some of which had effects and caused fear right here in Kendall County. For instance, on Sept. 19, 1878, the Kendall County Record’s Oswego correspondent, Lorenzo Rank, noted: “George W. Avery Jr. is selling out his house and furniture on the 28th inst.; he is bound for the yellow fever lands.”

In the Nov. 21 edition of the Record, Rank noted “L.N. Stoutemyer, an Oswego boy, now one of the editors and proprietors of the New Orleans Times, apparently has been the one that stood the hardest siege with the yellow fever without surrendering. About a week ago his friends here received word that for the first time in 43 days he sat up a little while.”

And what about George and Ed Avery and their families, who had headed to the “yellow fever lands” in 1878 and 1879? In the Oct. 19, 1882 Record, Rank reported of George’s brother, Ed: “The sad intelligence was received last week that Ed Avery had died at Pensacola, Florida from yellow fever.”

One of the worst of these periodic Yellow Fever outbreaks occurred in 1878 in Memphis, Tenn. It began with just one patient in August of that year, a steamboat crewman named William Warren who brought the disease with him from New Orleans, where one of those periodic outbreaks was then on-going. Although officials had attempted a quarantine of steamboats coming north from New Orleans, Warren managed to evade it before landing in a Memphis hospital, where he died, but not before Tennessee mosquitoes picked up the virus from him and spread it, first to a food stand operator on the Memphis waterfront, and then to dozens and then hundreds of others.

That’s when the panic hit, and residents began fleeing for their lives. Well over half the city’s 47,000 residents headed to rural areas outside of town, only to be met with “shotgun barricades” manned by small-town residents terrified the refugees would spread the disease to their families. Even so, the exodus led to the spread of the disease to Kentucky, Indiana, Illinois, and Ohio, although in none of those areas did Yellow Fever boil up with the ferocity it did in Memphis.

As if the disease wasn’t horrible enough, if it didn’t kill those it struck, the medical care of the day often did, with the normal treatment being bleeding and dosing with purgatives that often led to death through dehydration.

The black residents of Memphis, most too poor to flee what homes they’d made there, became the backbone of those who kept the city from disintegrating. That was because while blacks were no more or less susceptible to contracting the disease, their death rate was only about 7 percent of those who came down with Yellow Fever. Medical historians suspect that was because the African-American population had built up some immunities to the disease over the centuries. As a result, according to “Yellow Fever in Memphis, 1878” by Robert A. Dunn, “The African-American survivors in Memphis became the glue that held the city together, caring for the sick and dying, burying the dead, and taking over many positions in the Memphis police, fire, and other city services.”

After the epidemic was stopped by the first frost in the autumn of 1878, Memphis found itself bankrupt, with the State of Tennessee taking control of the city’s finances. Not only were the city’s debts paid off, but low-lying, swampy areas in the city were drained, trash and debris cleaned up, and an innovative sewer system was installed that, for the first time separated sanitary sewer lines from storm sewers. Those initiatives combined—despite another Yellow Fever outbreak in 1879 that caused 600 deaths—to stop further major outbreaks, although no one really knew why.

It wasn’t until Dr. Carlos Finlay suggested that Yellow Fever was actually spread by mosquito bites that the medical profession began taking a serious look at the idea. Then Walter Reed, a U.S. Army doctor working in Cuba, proved conclusively that mosquitoes were the culprits—work that later made construction of the Panama Canal possible.

Under the direction of Army doctors, mosquitoes were eradicated in Cuba and Panama, and with them went Yellow Fever. Similar efforts in the United States eliminated the disease from New Orleans and other low-lying cities that had been periodically afflicted with Yellow Fever.

Ebola is particularly dangerous because it’s spread by its animal hosts rather than by easily controlled insects, which means it can spread not only from animals to humans but from humans to other humans. Fortunately, as the recent nine-person epidemic in the U.S. showed, it’s not really easy to get Ebola. Someone has to be in close contact with a patient in the final phases of the disease, when the victim’s body is producing astonishingly huge numbers of the Ebola virus, and then be directly exposed to the victim’s bodily fluids.

Given the problems exposed in the Texas healthcare system with the outbreak in Dallas, it’s fair to wonder whether any further exposures would have occurred in the case of the Liberian patient had he been seen at a modern hospital in, say, New York or Chicago. And as for the New York doctor who was infected but has now recovered, the healthcare system in that state did what it was supposed to do and, like all the other hospitals treating cases, not only prevented any further transmissions but also cured the guy.

The problem, it seems to me, is not that Ebola got to the U.S., where it was quickly contained. It’s that there’s a horrible, ongoing Ebola epidemic in West Africa, and there’s a chance it could spread to, say, the crowded megalopolises of India or Bangladesh or Brazil. That could be a catastrophe of unimaginable proportions, something that should be encouraging us to move with all possible dispatch to stop the epidemic at its source.

Leave a comment

Filed under Frustration, Local History, People in History, Science stuff, Semi-Current Events, Technology

We dodged an authoritarian bullet in the 1860s…

Reading about the Civil War always makes me extremely glad the Union won, not to mention extremely angry that the war happened at all.

If a large portion of the officer corps of the U.S. military had not decided to surrender their honor and become traitors, it’s likely the deaths of an estimated 620,000 soldiers on both sides (a figure that doesn’t include war-related civilian deaths) and the millions of dollars of destruction could have been avoided.

Unfortunately for us future generations, by the time U.S. Grant and William Sherman had beaten the South into submission, the nation was so tired of war that it gave those military traitors a free pass, other than brief imprisonments for some. And thus was born the “Lost Cause” fable that ushered in decades of monstrous Jim Crow subjugation of anyone with African-American blood in their veins, no matter how little flowed there.

It’s not too strong a statement to say that the Confederate government had a lot in common with the authoritarian governments of the early 20th Century, and that it seems to have pioneered some of the same techniques fascist and communist governments later used to subjugate their own people.

The Civil War is often described as the first modern war since it made extensive use of railroads, mobilized heavy industry, and featured proto-modern military technology and tactics such as rifled firearms and elaborate entrenchments. It can also be considered a modern war in that it really didn’t settle much, other than eliminating slavery. Which, granted, was quite a major achievement. Southern attitudes took a breather for a few years but then began once again to eat away at the fabric of the nation right up to the present, to the point that the America envisioned by the Founders is in real danger of disappearing under a mound of hatred and lies.

Leave a comment

Filed under Frustration, Military History, People in History

The handwriting’s no longer on the wall…

So I was chatting with some friends recently, and the subject of handwriting in school came up. It turns out that many school districts across the country are now eliminating the teaching of cursive handwriting as an essential skill.

I’m not sure what the real reasons for this are, but I can think of a few right off the top of my head.

First, in today’s computer-driven society, where even our watches are becoming machines that put Dick Tracy’s wrist radio to shame, keyboarding skills have become paramount. Back in the stone age, we used to call it typing, but that was when there were machines called typewriters into which the typist rolled a sheet of paper, held their hands at just the right angle, and then typed. At 40 words per minute, that is, if they wanted to pass Typing I.

By that time, “typewriter” referred only to the machine, but the word had once also meant the person using it. As Lorenzo Rank put it in his “Oswego” column in the March 11, 1898 Kendall County Record: “Bessie Armstrong, now one of the stenographers and typewriters, came home from Chicago to spend Sunday.”

When I took high school typing, handwriting was still an essential skill that elementary kids spent a lot of time learning. Most elementary classrooms had depictions of correct upper and lower case cursive letters on cardboard strips up above the blackboard so there would be no excuse for failing to create a proper capital letter Q.

My first ink pen in second grade was a plastic one with a steel nib, just like the middle one here.

I learned cursive in second grade out in our one-room country school, first with pencil, and then graduating to (just like the big kids!) pen and ink. The ink pens we learned on were plastic dip pens with steel nibs that had to be dipped in an ink bottle every few letters. The wet ink then had to be blotted so you didn’t accidentally drag your shirt cuff through it and smudge your masterpiece. Ink blotters, in fact, were a major advertising medium during that era, with all sorts of businesses giving them out for free.

In the middle of my third grade year, when we moved into town, I was mildly shocked, and somewhat insulted, that my classmates were all still a) printing and b) writing in pencil.

The kids in country and town schools through the last decades of the 19th Century and the start of the 20th learned using the Spencerian Method invented and popularized by Platt Rogers Spencer. That was replaced by the Palmer Method, developed by A.N. Palmer and spread nationwide in Palmer’s Guide to Business Writing, a method in which, by the way, my mother was an expert. She learned it in grade and high school and perfected it in a business college course.

Our school handwriting was very similar to Palmer’s, and was practiced daily.

The cartridge pen allowed the look of a fountain pen without the muss and fuss of carrying a bottle of ink around in your pocket.

For reasons lost to the mists of time, we weren’t allowed to use ballpoint pens for some years. Fountain pens were fine, but the things leaked. So it was a lifesaver when the Sheaffer company came out with its cartridge ink pens. No filling from ink bottles any more; just buy a small box of plastic cartridges at the drug store and you were good to go. But eventually, the value of ballpoints penetrated the educational system. Our handwriting was a lot less messy, the blotter makers went the way of buggy whip manufacturers, and all was good and right with the world.

All the cool kids in high school used Sheaffer pens. I know that because this advertisement, from my senior year of high school, tells me so.

Then, as I noted above, some of us learned typing in high school, which proved a very valuable skill. It was also challenging. We learned on standard QWERTY typewriters, but with the exciting modification of blank keys. The keyboard layout was printed on a poster above the blackboard at the front of the room. And no, I don’t remember why our typing room had blackboards.

For some of us, typing was, literally, a life-saver. A friend of mine, drafted into the U.S. Army during Vietnam, was appointed to clerical duties in his engineering company because he could type. It didn’t stop him from hunkering in a bunker and shooting up the bad guys with an M-79 grenade launcher during the Tet Offensive, but to a great extent, it kept him out of lots of other potentially fatal situations.

Typing was also a money maker during college, since the skill wasn’t universal and, by that time, term papers were required to be typed in many classes.

Typing didn’t become keyboarding until the computer age dawned. In another interesting turn of events, “computer” had also once been the name of a person’s job, just like “typewriter.” But starting in the late 1970s, computers began requiring keyboards to input data. By the 1980s, school boards all over the country were coming to the conclusion that all this computer stuff was something more than a technological flash in the pan. And by the 1990s, “keyboarding” was starting to be considered a basic skill, right along with handwriting.

And then came laptops, smart phones, tablets, and all the rest of the revolution we’ve been living through the past few decades.

Now, it appears, keyboarding has overtaken handwriting, as have more esoteric skills such as texting with nimble thumbs, which all the cool kids know is all the rage these days.

Which brings us to the second reason handwriting is disappearing as a skill taught in school—which really has nothing to do with technology, and, when you stop to think about it, not much to do with improving education, either. Handwriting is simply not conducive to modern testing. And the modern mania for “high stakes” testing has pretty much left skills like handwriting in the dust. If it’s not on standardized tests, it is not, for the most part, taught.

So gone is handwriting, and so are lots of other things, like local history, because the giant testing companies, owned by conglomerates and overseen by distant financiers, understand such subjects can’t be shoehorned into a nationally-normed test. Education, of course, is not the goal here; making money is. For years, the folks who’ve been vacuuming up everyone’s tax dollars have been trying to figure out how to get at the huge pool of property taxes that supports local government. With the ‘education reform’ movement, charter schools, and the Common Core, they figure they’re good to go.

2 Comments

Filed under Frustration, Local History, Nostalgia, Oswego, People in History, Semi-Current Events, Technology

In which pop culture strikes back at geezer historian…

So anyway, I was cruising the Net this morning, got down to the Huff Post on my morning reading list, and came across a post stating that the guy who wrote “All I Want for Christmas” doesn’t consider it his favorite song. That I sort of understood, but then the post went on to say it had been recorded by Mariah Carey, which blew me away. “Wow!” I thought, “I didn’t know she was into that sort of music at all.” It also blew me away that the guy who wrote it was still alive, much less able to comment on whether it was his favorite or not.

But then I hit the button and played it, and suddenly understood. And I placed where I’d heard it: the movie “Love, Actually.”

Historians live in the past quite a bit–at least this one does–and that was my problem because I had the wrong song. Entirely. Which is a thing us geezers have to deal with on a regular basis. Pop culture whizzes past us, leaving us in the metaphorical dust as times change. Which makes me sound like my grandfather, but still.

See, here’s what I think of when I hear “All I Want for Christmas.” That was why I was a little surprised (truthfully, more like dumbfounded) that a looker like Mariah Carey would have recorded it, much less made it a hit, and why I hadn’t really remembered it, either. And I was right; the guy who wrote the song that tripped my memory published the lyrics in 1946, the year I was born. So I’ve pretty much grown up with the thing.

Trying to envision Mariah Carey singing it does boggle the mind, you have to admit.

1 Comment

Filed under Frustration, Nostalgia, People in History, Semi-Current Events