Like most towns Baker City lies in a valley, but this place, it seems to me, is defined by its mountains.
I use the possessive form here because cities tend to have a palpable pride of ownership in the peaks visible from their streets.
When you are blessed with mountains, and in particular with a truly imposing range such as the Elkhorns, you might as well flaunt them. And so we do, on T-shirts and postcards and coffee mugs among quite a lot of other items.
Nor is this trait peculiar to places of modest size.
Portland bills itself as the Rose City, but there is no quarreling with the fact that its true icon is Mount Hood.
Hood’s volcanic sibling to the north, Mount Rainier, fulfills an equally symbolic role for Seattle.
Baker City’s affections are not so singular.
Our mountains more resemble the Rockies than the Cascades, which is to say there are long ridges from which an occasional peak juts, as opposed to the Cascades’ solitary, but spectacular, fire mountains.
We harbor perhaps a special love for the Elkhorns because they are so near to the west, forming a sedimentary wall that casts its shadow clear across the valley.
But we lay claim as well to the more distant, but indisputably magnificent, Wallowas, which sprawl over the whole of the northeastern horizon.
I have been thinking recently of mountains, and the way we feel about them, after reading Robert Macfarlane’s book “Mountains of the Mind.”
Macfarlane, a British travel writer and mountain climber, wrote the book a decade ago. I managed somehow to avoid the volume for all those years although I relish travelogues of all sorts, and in particular ones dealing with mountains and people who climb them.
(I would like nothing more than to be a travel writer but am afflicted by the insurmountable handicaps of never really going anywhere, or doing anything interesting when I get there.)
The gist of Macfarlane’s book is that modern society’s veneration of mountains, their purple majesty and all that, is, well, modern.
Until around the start of the 19th century many people at least feared, and in many cases acutely loathed, some of the world’s greatest mountain ranges.
Macfarlane, being a European, devotes much of his book to the Alps.
He writes of 17th century travelers whose descriptions of crossing Alpine passes bear a decidedly Tolkienesque flavor. These accounts, largely taken from contemporary diaries or journals, lament the frightful precipices, the awful blizzards, the utter absence of civilization.
You have a sense that these writers, if they actually believed such creatures as dragons existed, would not have been altogether surprised to come across one in the icy wastelands of Mont Blanc.
Macfarlane explains how science, and especially the budding field of geology, contributed to a wholesale reversal in our opinions about mountains.
Pioneering geologists such as the Scotsman James Hutton, and Charles Lyell, a Briton, came to recognize that by studying mountains and glaciers they could understand how the Earth’s surface had been formed — and moreover, reformed — over the eons.
Their writings encouraged people, most of whom were not scientists, to have a look for themselves.
When they left the sanctity of the valleys and saw for the first time such awe-inspiring sights as the Mer de Glace or the Italian Dolomites, these visitors stopped worrying about ogres and started thinking about building chalets and cog railroads.
By the middle of the 19th century the Alps were, to the British aristocracy, what Vail and Sun Valley are to modern America’s upper class.
Writers and poets waxed rhapsodic about the sublime spectacles among the peaks.
Doctors touted the pure air as the ideal antidote for Londoners’ soot-stained lungs.
Alpinists, most of them Englishmen, breached summits long thought impregnable. In July 1865 Edward Whymper of London led a party to the top of the most famous peak of all, the Matterhorn.
(Four of the seven climbers plunged to their deaths on the descent; Whymper and two others were saved when the rope connecting the climbers snapped.)
Macfarlane’s book intrigued me because I can’t imagine standing in my yard, watching a snow squall sweep across the face of Elkhorn Peak, and feeling anything but ebullient at my good fortune to have such beauty so accessible.
That I might dread the mountains is a concept so foreign as to be beyond my ken.
Yet there was much in “Mountains of the Mind” that seemed familiar.
In particular I felt a kinship with those of Macfarlane’s subjects whose love of the mountains is broad and complex, who are equally entranced by sunlight exploding off a glacier’s surface and by the immensity of time represented in a band of layered stone.
Sometimes when I look at the Elkhorns I see them as objects to ogle. Science seems a minor matter in that moment when the alpenglow slides its pink brush across the slopes, at dawn of a January day when the temperature has plummeted below zero.
At other glances I am overwhelmed by the colossal scale, both in size and in time, that the mountains represent.
I ponder the forces required to move slabs of tropical seafloor thousands of miles — the great upheavals that elevated them and the ice that sculpted those slabs into pinnacles from which, on a fair day, you can see parts of three states.
Mountains, to borrow Macfarlane’s title, are indeed often on my mind.
And, I hope, they will never be far from my eyes.
Jayson Jacoby is editor of the Baker City Herald.
By Jayson Jacoby
Baker City Herald Editor
I went back in time this week and what a curious journey it was.
My destination was a day rather than a place.
Nov. 22, 1963.
Until Sept. 11, 2001, and with the exception of the monumental events that attended the nation’s birth in the 1770s, it was perhaps the most singular day in American history.
For many people, including some of those who served as my tour guides, I suspect that that day, when President John F. Kennedy was assassinated in Dallas, retains its unique status in their memories even after the terrorist attacks a dozen years ago.
I talked with several people who were in Baker that November day. Most were high school students.
Fifty years is a considerable span, of course.
Call this period by its other name — half century — and it seems longer still.
Autumn tends to be the most banal of seasons around here but this current version has gotten up to quite the dickens.
I was over on the breaks of the Snake River last week, immersed in fall.
I was wearing a red-and-black, all-wool hunting coat (warmer than manmade fleece but also considerably scratchier).
I had a bolt-action rifle slung over my shoulder, an elk tag in my backpack, and in my pocket a keen-bladed knife that has not touched blood in many years.
It was not cold, but the air had a proper autumnal chill.
Then I saw a flash of bright orange about 100 feet ahead, conspicuous among the whitish gray chunks of limestone littering the steep slope.
He was my best friend for most of my teenage years and when we met for the first time in almost a quarter century the occasion, as it so often is in such cases, was a sad one.
My friend’s dad had died.
I stood outside the restaurant at the golf course where the post-funeral reception took place and I waited for my friend to arrive.
You used to worry in these situations.
You used to wonder whether, after so much time had elapsed, you would even recognize the face that you once saw every day and that was as familiar as those of your family.
Wrinkles breach the formerly smooth planes.
Hair goes gray.
Often, pounds are added.
(Rarely, they’re subtracted.)
You fear the embarrassment of seeing someone who you feel you ought to know and then hearing, in the tentative timbre of your voice, the question mark when you say his name.
You fear, above all, being wrong.
But we live in a Facebook world, where, for millions of us, the inexorable weathering of our facial features (and other features) is chronicled in high-megapixel detail.
So anyway I knew my friend had aged well, and I thought there was little chance of my making a humiliating misidentification.
Indeed, when he drove past in the parking lot I immediately recognized him, even from the side, and through the haze of the safety glass.
He got out of his car and started walking toward me and we each raised a hand, in a sort of combined salute/wave, at almost the same instant.
He grinned and I grinned and the years, as they sometimes do when our past and our present collide, seemed suddenly to shed most of their oppressive weight.
Nothing was the same — nor could it be the same, so many missed weddings and births and deaths down the line — yet this was of no great consequence.
We knew that our bond, though not strong enough to keep us close through the years, was in its own way a powerful one.
Except I didn’t know, until that moment, that this was so.
I wondered, as I drove to the reception that morning, whether our friendship was forever trapped in childhood, the link severed when we collected our high school diplomas and left on our vastly different journeys into adulthood.
But as we shook hands and shared a clumsy embrace (I’m not a hugger, and invariably foul up the etiquette of greeting gestures) I understood that this was not true.
I understood that those distant and murky days of junior high and high school, when we played pool in his basement or watched TV in my living room or sat in his room and listened to Van Halen’s first album and wished we could make our guitars sound like Eddie’s, those days mattered.
There are many kinds of friendships, of course.
Some last a lifetime, or nearly so.
But most, it seems to me, and in particular those that begin in our childhood, do not.
I’d like to believe that this brief reunion with my old friend will revive in some way our dormant relationship.
I suspect, though, that this will not happen.
We live far apart, and not only in a geographic sense.
But even if our next meeting comes years from now, and even if the reason is again a somber one, I have a newfound appreciation for our friendship, indeed for all true friendships.
Each person must give something of himself to create such a connection, and this seems to me a mutually beneficial exchange.
I have an idea of the person I was back then but the picture is an incomplete one. Yet this vision comes clearer, comes closer to a whole and true thing, when I can reminisce with the one person who knows that part of me.
I don’t understand how this works, only that it does work, and that it’s a sort of magic.
Which, come to that, is a fair way to describe friendship.
Jayson Jacoby is editor of the Baker City Herald.
I’m inclined to dismiss the debate over sports teams’ Native American mascots as a trivial matter except for this:
The issue raises apparent contradictions, as well as questions about the context of words, that fascinate me.
I write “trivial” not to insult anybody.
My point, rather, is that if as a society we’re truly worried about Native Americans then we ought to focus on something other than the logo painted on football helmets and embroidered on the backs of cheerleaders’ sweaters.
The academic progress of kids who grow up on reservations, for instance.
Or the persistent problem of alcoholism among some tribes.
It’s a good thing fires never kill people.
Oh, wait — they do?
Even, sometimes, blazes that weren’t set by cretins bent on murder?
I’m feigning ignorance about the lethality of fires to illustrate what seems to me the absurdity of an argument proffered in print by a couple of professors from Linfield College. They imply, among other things in an op-ed published Saturday in The Oregonian, that people who burn ski resorts and torch SUVs have much more in common with Martin Luther King Jr. than with Timothy McVeigh.
The op-ed was written by David Sumner, an associate professor of English at the McMinnville college, and Lisa Weidman, an assistant professor of mass communication.
Their subject is Rebecca Rubin, a member of a group that caused $40 million in property damage in several incidents from 1996 to 2001, including the 1998 attempt to destroy a Colorado ski area.
Rubin, who fled to her native Canada after the crimes but later returned to the U.S., recently pleaded guilty to several charges. She has not been sentenced.
The professors acknowledge that Rubin is a criminal who deserves to be punished.
(Which is no great concession, considering she admitted as much herself, in court.)
But these educators devote most of their words to arguing that Rubin, and some of her fellow fire bugs, are getting a raw deal in two ways:
• They are routinely branded by prosecutors and the media as “eco-terrorists.”
• Some of these criminals have been given longer prison terms under the “terrorism enhancement” provision in the federal Patriot Act.
The professors contend that Rubin is a saboteur, not a terrorist, because her crimes were committed “in defense of the natural world” and were never intended to hurt people.
I don’t doubt the latter is true.
But it doesn’t matter.
Rubin and her confederates knew, as any lucid adult knows, that when fires get started people will show up to try to put them out.
Sometimes those people get hurt.
Indeed, a couple of firefighters were injured while extinguishing a fire that Rubin’s group started.
The flames don’t care that your goal was to protect lynx from a ski lodge or to reduce carbon dioxide emissions from gas-guzzlers.
Flames are as apolitical as a hurricane or a tornado.
They just burn, whatever and whoever happens to be within combustible range.
The notion, as the professors put it, that people such as Rubin are less culpable than actual terrorists because they’re “philosophically opposed to hurting or injuring other living things” is the sort of flimsy, “but I didn’t mean to” excuse most commonly employed by boys who bust one of their mom’s antique crystal goblets during an impromptu game of catch in the backyard.
Once you’ve lit the match you’ve potentially endangered others’ lives, and your philosophy, your intent, become worthless drivel.
It’s not as if Rubin and her pals waited around to make sure nobody got close enough to scorch an eyebrow. They scurried away, as criminals are wont to do.
The one minor point on which I might agree, in one sense, with the professors is the question of whether Rubin ought to be punished more severely because the word “terrorism” is attached to her crimes.
The difference is that they believe some arsonists deserve lesser punishment.
But I think all arsonists should be treated just as harshly as Rubin and her friends have been, whether the goal is to destroy a fleet of Hummers or to collect a big insurance check.
The truly appalling part of the professors’ argument, though, is their attempt to cast Rubin as a First Amendment warrior whose actions, though technically criminal, were prompted by noble causes.
The gist of their case is that America, by treating “eco-terrorists” as a special sort of criminal, infringes on our freedoms.
“Rubin should be punished for her property crimes, but to call what she did terrorism is to misuse the word,” the professors wrote. “And to misuse the word is to threaten our First Amendment rights.”
Notwithstanding that the professors are on shaky semantic ground with regard to the word terrorism, they leap from that questionable claim to one that seems to me preposterous.
“The cost,” they wrote, “is to our freedom to protest the actions of other people that we feel are wrong. Where does that leave Henry David Thoreau, Martin Luther King and the long American tradition of civil disobedience?”
The professors also refer to “Libertarian activist Ron Arnold,” who came up with the word “eco-terrorist” in 1983.
“If Arnold had his way,” they wrote, “sitting in at a lunch counter or blocking a factory gate would not be merely illegal, but would be terrorism.”
The obvious response here is “so what?”
Arnold has not had his way, nor is he ever likely to. He’s irrelevant.
If Rubin and her friends had merely blocked a factory gate — or in their case, say, chained themselves to heavy equipment at the ski resort site — they probably wouldn’t be in prison, and they certainly wouldn’t have been prosecuted as terrorists.
But they didn’t block things they don’t like.
They burned them.
The difference between these two tactics seems not to be a significant one for the two professors.
But then I didn’t detect much in the way of clarity or consistency elsewhere in their op-ed, either.
The professors start by arguing that Rubin is guilty of sabotage but not of terrorism.
Yet in later paragraphs the professors replace, as if by magic, “sabotage” with “protest,” as though these are synonymous in the context of Rubin’s crimes.
Then the authors stray even further down this path by implying that “vandalism” and “arson” are forms of protest — but again, neither act constitutes terrorism so long as the arsonists and vandals are careful not to injure or kill anyone.
In the case of fires this can be more a matter of good luck than of careful planning.
Which brings me back to Thoreau and King, who seem to have been brought in, against their will, from some wholly different discussion.
King’s career has been researched in exhaustive detail, yet I don’t recall that he ever employed fires — even carefully ignited ones — to further his righteous cause.
He marched from Selma to Montgomery, but he didn’t torch either place, nor any town between.
(Although a certain hood-wearing group that didn’t think much of Mr. King certainly didn’t hesitate to break out the lighter fluid to try to make its point.)
King did, however, possess the courage to stage his protests in the illuminated public square, to defy the unjust and to accept, with dignity, the punishment that came with his defiance.
He didn’t slink around under cover of night, lugging cans full of nasty, polluting petroleum products.
As for Thoreau, I’m sure he sparked quite a few blazes in his time.
But he can hardly be blamed for that. I imagine it got pretty cold, living in a cabin in Massachusetts before central heating.
Jayson Jacoby is editor of the Baker City Herald.
The season of the weeping birch tree has come round again and our city basks in its unique beauty.
I don’t own a birch myself but am partial to the species.
Most deciduous varieties please our eyes when they take on their temporary autumn dress, of course.
In New England an entire tourist industry is built on the ephemeral show.
Somebody has to sign off on all those checks the federal government writes, but why does it have to be Congress?
The Founding Fathers might have botched that one.
Although in their defense, none of those august men ever had to deal with John Boehner.
Or Harry Reid.
Or any other modern politician for whom the words “principle” and “posturing,” which have little in common except containing nine letters and starting with “p,” have become so corrupted as to be nearly synonymous.
This has been, and will continue to be, quite a year for 50-year celebrations.
There’s probably a term for these, something more pleasing to the ear than “half-centennial,” but I do not know it and am too lazy to look it up.
(Which, in the era of Google, is immensely lazy indeed.)
The year 1963 was, among much else, a crucial one in the civil rights movement. Most famously, the Rev. Martin Luther King Jr. held much of the nation spellbound with his “I have a dream” speech on Aug. 28 in Washington, D.C.
On Feb. 11 in London, a four-man beat group recorded its first long-play album, all in a single day. These clever lads called themselves The Beatles. They had some success later.
And in the best-known event of the entire year, of course, President John F. Kennedy was assassinated in Dallas on Nov. 22.
The publicity for the anniversary of that tragedy will be considerable.
This torrent of reminiscing has reminded me of another milestone, one which arrives in 2014.
That year — and specifically, July 28 — marks one century since the First World War began.
This, of course, puts the event beyond the living memory of almost everyone who’s around today.
(And even those rare methuselahs would have been just kids, and thus unlikely to have been following geopolitical events with any great enthusiasm.)
Yet a compelling case can be made — indeed, many historians have made it — that the First World War was the most significant event of the 20th century.
Many of the defining characteristics of that century — chief among them the nuclear age and the Cold War — are today linked more closely with the Second World War.
But their origins date to the earlier conflict.
Beyond the obvious chronological connection — you can’t have a second world war without a first — the historical record shows that the two wars are in effect one long fight, two bloody stanzas separated by a 21-year intermission during which no grievances were settled, and another major one was sown and bore its deadly fruit.
It is no coincidence, certainly, that the cast of characters was much the same in the two wars, the major differences being that Italy switched sides in the Second World War and Japan joined the Axis (what were known as the Central Powers in the First World War).
Even casual students of history understand that Adolf Hitler — the architect, as it were, of the Second World War — was in effect a prisoner of the First.
Not only did he fight in the 1914-18 war, but the whole of his monomaniacal life after the armistice was driven by his hatred for the punitive terms imposed on Germany by the Versailles Treaty of 1919.
The sequence of monumental events that happened during, or soon after, the First World War seems to us, at such a distance of time, inevitable, neatly laid out as it is in the chapters of our history books.
Yet we can’t know whether the Russian revolution, the seed of the Cold War, would have happened in 1917, or indeed at all, had that country not suffered through the calamity of the Eastern Front during the previous three years.
And, as mentioned, it is hard to imagine that, without the First World War, a minor artist from Austria would have been able, by sheer force of his charisma and psychosis, to unleash the greatest military conflict in the world’s history.
I’m sure the First World War centennial will get into the news.
But I doubt it will garner anything like the attention given to, say, King’s landmark speech.
This would hardly come as a surprise; a century is an awfully long time.
And in some ways the First World War seems even older than it actually is. There were a lot more horses than trucks, the soldiers carried rifles that had more in common with a musket than an M-16, and the airplanes were about as technologically advanced, by today’s terms, as a push lawnmower.
Perhaps most important, America’s role in the First World War, though significant, came late in the conflict, after France, Germany, Britain and Russia had squandered much of an entire male generation.
America’s experience in the Second World War was rich in iconic events and place names. “Pearl Harbor” and “D-Day” and “Iwo Jima” continue to resonate down through the decades.
By contrast, “Belleau Wood” and “The Meuse-Argonne” seem as foreign as, well, France itself.
Perhaps it’s just as well.
I don’t see that we need to use the Somme or Verdun to remind ourselves of how inhumane humans can be. Sadly, we can use more recent disasters to illustrate the point.
Still and all, 100 years after the guns of August blasted away the notion of war as a gentlemanly pursuit, we might do well to pause briefly to acknowledge that some mistakes carry greater consequences than we could ever conceive at the time.
Jayson Jacoby is editor of the Baker City Herald.
By Jayson Jacoby
Baker City Herald Editor
I often walk past the defunct Ellingson Lumber Co. sawmill, and the scene never fails to provoke a twinge of sadness.
I don’t go out of my way for these doses of maudlin.
It’s just that I live directly across 15th Street from the fence that marks the western boundary of the millsite. To avoid the place I’d have to reconfigure most of my normal routes, which strikes me as an unnecessary, albeit aerobically beneficial, hassle.
Last Sunday morning I walked along Broadway, on the north side of the property, and even the fading yellow of the rabbitbrush, a sort of farewell to summer’s palette, failed to enrich the somber scene.
If anything, the blooms accentuated the sense that something is missing here, that a site which once teemed with activity, where good salaries were earned and useful products were made, is being taken over gradually by the shrubs of the desert.
Lamenting the loss of a mill is a common refrain these days, of course, and it’s an emotion more often than not informed by the partisan politics pitting the timber industry against the environmentalists.
Yet I rarely consider that debate when I look at the barren buildings on the Ellingson parcel.
I don’t pine for a bygone era when stacks of ponderosa logs loomed over Auburn Avenue, some with butts almost as wide as the street itself.
That prosperous period could not have continued in perpetuity, at least not at the pace which marked much of the half century after the end of World War II.
In a region where a pine needs a century or more to attain such girth, there just weren’t enough trees to satisfy every saw.
Still and all, I can’t help but wonder whether this transition needed to be as abrupt as it was.
I ponder whether some minor tweaking of national forest logging policy might have made it possible for this industry, which had been a mainstay of Baker County’s economy for better than a century, to survive, albeit in diminished form.
I remember interviewing Rob and Pete Ellingson after they closed the mill in 1996.
They talked about multiple factors, including government-subsidized lumber from Canada that depressed prices for U.S.-produced boards.
But the most pressing problem, they said, was that they could no longer rely on the three nearby national forests to supply enough trees to augment the logs coming from the company’s own comparatively modest acreage.
The volume of timber cut on the national forests has risen a bit from its nadir in the mid 1990s, but the numbers remain trifling compared with those of previous decades.
Oregon’s congressional delegation has tried several times to craft a compromise that would get log trucks rolling in more significant numbers, but nothing has come of it.
Perhaps nothing ever will.
Or at least not until the hundreds of thousands of acres of young forests in the region have matured, and the timber on public lands once again is best measured in billions of board-feet.
The term “sustainable forestry” has been around for decades and although its creator was no doubt well-intentioned, his work, it seems to me, was for naught.
Our definitions of “sustainable” vary so widely as to render the term useless.
I used to believe that one apt description was that a small town which has a lot of productive forests nearby could sustain at least one sawmill, and in turn all the ancillary businesses which support it.
Moreover, I believed this could happen without our denuding those forests of the other qualities — wildlife habitat, sources of pure water, recreation — which we as a society prize.
It was not to be so in Baker City.
The city, of course, endured the loss of the mill.
I don’t mean to suggest the city’s future was ever in jeopardy. Baker City is a substantial place, and has been so for longer than most of Oregon’s cities. This is not Valsetz, nor any other town defined primarily, if not wholly, by lumbering.
Yet as the paint peels from the buildings which once housed the singing saws, as the wind blows without spreading the fresh scent of pine, I see, in my mind, the people who made careers here, the families which depended on this place, the homes and the cars and the Christmas presents which, in a sense, got their start here.
My eyes just see rabbitbrush, its luster gone again for another year.
Jayson Jacoby is editor of the Baker City Herald.