Friday, March 27, 2015

Another Pittsburgh Post-Gazette Op Ed of almost a decade ago.
Pittsburgh, Pa.
Monday, Sept. 25, 2006

Weekend Perspectives: Berlin's surprising model
Jewish history is honored in the German capital
Saturday, July 22, 2006
By Rudolph H. Weingartner
A recent visit to Berlin prompts me to make a statement that I had never expected to utter: With respect to Jews, the Berliners got something right -- on a matter, moreover, about which we here in the United States might think a bit more clearly. Jews are represented, so to speak, by two formidable Berlin sites.

Rudolph H. Weingartner and his family came to the United States in 1939 as Jewish refugees from Germany. He lives in Squirrel Hill.

The most recently completed (May 2005) is Peter Eisenman's Memorial for the Murdered Jews of Europe, placed in the very center of the city, next to the Brandenburger Tor and not far from the Reichstag. The other is Daniel Libeskind's Jewish Museum, which opened in 2001 in a residential area of Berlin.
The paths toward the erection of these two works were anything but smooth. In particular, the squabbling, sometimes fierce, about every aspect of the Memorial -- design, materials, cost, location and even just what it was to commemorate -- lasted for more than a decade and a half. In a sense, it continues to this day since by no means all of those who disapprove of the design or even of its purpose are reconciled to its existence.
But I am by no means alone in my admiration of Mr. Eisenman's achievement. Heinrich Wefing, a leading German architectural critic, refers to it as a beautiful abstraction "that does not dictate what its observer should think or experience, but is nonetheless thoughtful and moving," while The New York Times' Nicolai Ouroussoff wrote "how abstraction can be the most powerful tool for conveying the complexity of human emotion."
What must not be ignored is the sheer scope of the memorial: 2,711 steles of different heights spread over an expanse measured in the most ur-American way, as having the size of two football fields. If the paradigmatic individual memorial is the simple tombstone, Mr. Eisenman's expanse of slabs is an appropriate cenotaph for 6 million.
The motto of the Jüdisches Museum Berlin across town calls for "two millennia of German-Jewish History." And in a somewhat cluttered way, it emphatically lives up to that slogan. Its numerous displays convey a wealth of information about the many and changing roles that Jews have played in Germany.
As one descends from the starting point at the top of Mr. Libeskind's edifice, one moves forward in time until one reaches the lowest floor where information is provided both about the Holocaust and the emigration of Jews to other lands, reinforced by the Garden of Exile and Emigration just outside.
In two significant ways, Berlin got it right, even if it took years of controversy and argument to get there. First, because the account of the Shoah comes at the end of an elaborate overview of those 2,000 years of the intertwining of German and Jewish societies, that basement exhibit is not one of mere victimhood but is reached by museum visitors after they have been elaborately informed as to who those victims were.
This horrendous period of history is shown to have been both a murderous destruction of human lives and an attempt to eradicate a significant part of human civilization. It takes a Jewish museum to show that, rather than a Holocaust museum.
The second way in which Berlin got it right is that it was sensitive to the tension between the institutional goal of providing knowledge and that of fostering commemoration.
The didactic goal is indefinitely complex. Museums use displays and documents, film clips and computer screens, earphones and loudspeakers and ever more modes of communication to cram masses of information and impressions into the heads of their patrons.
Alert visitors -- moving this way and that, looking and listening here and there -- take in a lot. With luck, they will remember a goodly fraction of what they have experienced, and with even more luck, they will later reflect on what they have found out.
When all the stars are properly aligned, then, the absorbed stream of information may lead to retrospective contemplation. But a monument like Mr. Eisenman's has the power to do that on the spot. Inducement to reflection and meditation is focused, monolithic, immediate. By pressing different buttons from those multifaceted didactic ones, a thoughtful monument evokes thought, then and there.
That, finally, brings me to our own practices in the United States. To a degree, Washington's Holocaust Memorial Museum serves both the didactic and the memorial function, though in the latter role not as successfully as Israel's Yad Vashem.
But if we leave aside these major establishments of world capitals, accounts about the United States are not encouraging. The Israel Science and Technology Homepage reports that to date we have created 23 Holocaust museums, not counting our little one in Pittsburgh, which didn't make it onto the list. If you put aside actual Shoah sites, such as Dachau, Bergen-Belsen and Auschwitz, that is more than in all of the rest of the world together.
By way of contrast, the same source of information shows there to be 24 Jewish museums in our land, while there are 57 in the rest of the world, not counting numerous museums in Israel.
Ours is the wrong ratio, a wrong view of history, the wrong way to present the contributions of the Jewish people to the world's history and the wrong way to commemorate the Shoah. We must hold on to an inclusive meaning of Never Forget. Forget not what they did, but remember, too, who they were to whom they did it. Let's stop building Holocaust museums and create Jewish museums instead.

Friday, March 20, 2015

Pittsburgh Post-Gazette First Person 
Hanging out with movie people
March 31, 2012

By Rudolph H. Weingartner

Have tux; did travel. The plan was for me to spend my 85th birthday in Los Angeles with son Mark, who works in that city's No. 1 industry, the "moo'n' pitchers," as pronounced in Brooklyn, where I went to high school. After the plan was in place, Mark, a member of the Academy of Motion Picture Arts and Sciences, received an invitation to the annual bash of the American Society of Cinematographers, who had the nerve to do their thing on my (and Lincoln's) birthday. "But ... but ... " sputtered Mark, "that's my dad's birthday and he's flying out here for the occasion." "No problem," was the response, "I will invite your father as well."

"OK by me" was my reaction, to continue in the same Brooklyn argot, followed by a quick inspection that showed tux and shirt -- last worn singing Beethoven's Ninth in the Mendelssohn Choir -- were still in usable shape, ready to be packed.

The venue was a mega-sized room in a complex called Hollywood & Highland, after the intersection where it stands. The room sported a stage and four large screens that enabled all 1,200 people, arrayed around tables for 10, to follow the goings-on. Drinks were served outdoors on a patio in front of the hall's entrances, with enough time allotted before the festivities to network, however dubiously that noun parades as a verb.

With some exceptions, we males were in de rigueur penguin garb, with many, if minor, stylistic variations, while the women -- who were in a distinct minority -- were dressed at various levels of fanciness, with the depth of decolletage roughly inversely proportional to the wearer's age. A final sartorial observation: I was truly impressed not just by the variety of the men's hair styles, but by the care -- and, presumably, expense -- with which they were fashioned. For me, that tonsorial splendor was the most convincing evidence that I was not in Pittsburgh, but in the land of showbiz.
For a while we ate; it was surf and turf, accompanied by wine and quiet table conversation. But with dessert, the 26th Annual ASC Award ceremony began. The proceedings were managed smoothly and seriously; no Billy Crystal equivalent. An effort obviously had been made to maximize the number of participants, in that a different person introduced each of the presenters of award nominees, who were divided into nine categories -- such as Half-Hour Series/Pilot, Television Motion Picture/Miniseries and Theatrical Release -- plus several awards for distinction and achievement.

Two traits of the goings-on were noteworthy. First, the ASC, like the Masons, is a masculine society. All 22 officers and other board members are men and so were all 25 nominees for awards. Women seem not to have made much headway in the craft.

Second, the group takes great pride in the vital contribution cinematographers make to the production of films. The handsome book that all of us found on our chairs features numerous pictures of crews setting up difficult shots and wielding complex equipment. Throughout the evening, it was clear, without it having been said in so many words, that cinematographers are the essential right arms of directors. Directors rely on cinematographers to select cameras and lenses and ancillary equipment -- of which there is a great variety, conventional and esoteric. Cinematographers bring to the table the know-how and ingenuity to set up and use all that gear for shots under water, out of moving vehicles and in tricky terrain of every kind, to photograph moving objects and still, so as finally to create a movie that is pleasing to look at or interesting or both.

From category to category, clips of the nominees' work were shown and the winners introduced and given the opportunity to say a few words. Some did just that, others went on a bit long. All conveyed the flavor of their craft.
Near the end of the evening came the society's Board of Governors Award and the audience was treated to the presence of Harrison Ford, at nearly 70 erect and distinguished-looking, who made an eloquent little speech in praise of the cinematographers he had worked with through the years. The bar outside the hall had been dismantled. But that was just as well, since it was getting late. So, after a bit more conversation, most of us wended our way homeward. For me, it had been an enlightening glimpse into another world, a world of serious, high-powered professionals who work behind the scenes to entertain us.

Rudolph H. Weingartner is professor emeritus of philosophy and a former provost of the University of Pittsburgh. The second edition of his "Fitting Form to Function: A Primer on the Organization of Academic Institutions" was recently published.

Wednesday, March 18, 2015

Three Classes of Disbelievers

 Historically it may not be so unusual, but it is nevertheless noteworthy that today there are three large classes of people who deny what the conventional establishment believes. There are those who do not believe in evolution. They do not believe that biological species of the past and those flourishing today, including and especially human beings, were created by the process Charles Darwin identified, called natural selection. That biological scientists of every subspecies believe that this is how the world works does not impress them; according to the deniers, plant and animal species, and man in particular, were created in another way.
   A second class of people, who deny what scientists in a broad variety of fields hold to be the case, disbelieves that the earth is undergoing a process of global warming, with multiple effects to be expected—some of them unfortunate, many of them disastrous, some of them in a proximate future, others in a more distant one. A subclass of climate change skeptics concedes that some such warming is in progress, but holds that it constitutes yet another phase—of many different ones—in the many-million-year story of our planet, and that human activity has nothing to do with it.
   The third tribe of deniers asserts that there never was a Holocaust. It was not the case, they hold, that during the Second World War six million Jews were systematically murdered, the vast majority by Germans but some also by such German allies as Romanians. Yes, of course Jews were killed; there was a war, after all, in which much of Europe’s population perished, but there was no concerted effort to murder all the Jews of Europe.
   Three sizable populations thus dissent from what the vast majority—people who agree with the experts on these topics—believes. Not during the dark ages, when a tiny fraction of the population was literate, but in the 20th and 21st centuries and in the center of the developed world. It is a phenomenon (or these are phenomena) worth thinking about. Of the first two—which are much more American goings-on—I’ve at times thought that disagreement with what was widely believed was the equivalent, in the realm of belief, of the “rugged individualism” that spurns governmental welfare because people should only get what they have themselves worked for. So the analogue: I’m my own man regarding what I think, just as I myself earn what I consume. The fact that the majority of the dissenters are not the hoi polloi, but card-carrying members of the middle class and above, adds to this view’s plausibility. Still, that explanation is too generic; much more specific accounts are needed.
   But first, let me deal with the denial of the Holocaust, a phenomenon much more European and Middle Eastern than American. As far as I can see, there is no special anti-Holocaust ideology. Rather, that denial is rooted in the same-old, same-old anti-Semitism of since forever. Jews are manipulative and engage in conspiracies, vide the Protocols of the Elders of Zion. Their scheming has much of the world believe what is not so, with the additional undesirable consequence that the Holocaust serves as a justification for the creation and continued existence of the state of Israel. More broadly, disbelievers see Jews to be using the pretext of mass murders to exact other policies, economic and political, favorable to them in recompense for a Holocaust that never happened.
   I could cite other components of  the history of anti-Semitism, but that would not add much to the foundation, so to speak, of Holocaust denying by contemporary neo-Nazis and their sympathizers. For all I know, there is somewhere a legitimate scholar who has doubts about the evidence for the campaigns to murder Jews or about their magnitude, but if there is such a one, there surely are not many. In short, I cannot think of any other ground for this species of disbelievers than old-fashioned anti-Semitism brought up to date.
   Many of those who deny global warming, or at least human responsibility for it, are politicians, industrialists, entrepreneurs; they are college educated, with good jobs—in Congress or in branches of the corporate world. I find their latest and now quite popular response to the claims under discussion to be particularly brash and truly annoying. “I’m not a scientist” is the response that is intended to close the conversation. Yeah, of course you aren’t. But neither are you an epidemiologist or a meteorologist or an electrician. Yet you’ll take a flu shot when you are told it is appropriate, you will protect the windows of your store when you are warned that a hurricane will hit in less than twenty-four hours and you will replace wires in your house when an electrician tells you that some of them are unsafe.
   The world is full of specialists and the role of educated lay people is to seek out appropriate experts when wondering what to do and to listen to what they have to say. No, you are not a scientist; you need not tell us, no one thought you were. Rather, you are the consumer of the knowledge of a myriad of specialists who have studied and continue to study their chosen domains. Of course you are not a scientist, but as a supposedly educated person you should know how to access the knowledge that scientists have produced, if not in their professional articles and books, surely in the vast array of publications and internet postings addressed to the lay world.
   Not being a scientist, in short, is no reason to believe anything, nor a reason not to believe; so why disbelieve? Three broad reasons, I think, with the first somewhat speculative. For it, I come back to that peculiarly American rugged individualism which, in this context, might be translated as a peculiar form of pigheadedness. Hey, everybody else—they have their own beliefs and I have mine (which, needless to say, I prefer). Underneath this attitude is the view that there isn’t really any such thing as knowledge—claims that are true and confirmed by well-established evidence. There are only opinions, not objective propositions, would-be knowledge. All beliefs, instead, are person-dependent—in short, subjective. Climate warming is your thing; it isn’t mine.
   The second root of disbelief is more familiar. Mitch McConnell is the senior senator from Kentucky. Kentucky is a major coal-mining state, and burning coal is a major source of the greenhouse gases that contribute to global warming. The “correct” steps to take are, first, to trap those gases before they reach the atmosphere and, for the longer haul, to replace the burning of coal with other modes of producing energy. To require the first of these measures will impose costs on the Kentucky coal industry and the second would require a painful revamping of the state’s economy. Hence there really isn’t any global warming, and if there is, human actions, including burning coal, have little or nothing to do with it.
   I will refrain from adding any of the many similar examples that could be cited. What we are talking about is the phenomenon of an interest, especially a strong interest, determining what one believes. In a “lighter” context we call it wishful thinking; but in one form or another, to have desire determine what is held to be true is a common human failing. Skepticism that there is objective knowledge, beyond “mere” opinion, together with the influence of desire on what is held to be the case are surely the main reasons for denying that there is global warming.
   The third set of disbelievers dissents from what is more generally accepted in a much broader and more radical way than either of the groups above. To deny that there is such a process as evolution and to hold that the world is about six thousand years old propel those disbelievers into an entirely different world. The only surprising thing about them is that they do not also believe that the earth is flat, since that is also true of the biblical world from which they derive their opinions. We are in effect talking about a population that derives its “knowledge”—at least about a number of utterly fundamental issues—from completely different sources from those of the population in general. We are speaking of Fundamentalist Christians who interpret the Bible literally, holding it to be the inspired word of God. They very much go beyond biblical prescriptions and proscriptions regarding behavior to a large range of assertions as to what the world is like.
   It is fair to say that fundamentalists do not simply dissent from what is commonly believed; in effect they operate in a completely different sphere of thought from that of the world that surrounds them. Instead of experimenting and reasoning and observing the world as, say, astrophysicists do, they search in the two testaments to find out what to assert. I can’t think of a good analogy; perhaps it is like moving from a world of Euclidean geometry to a distant non-Euclidean one. At the root, then, of the disbelief in what most others hold to be the case is the literal belief in a document that many others interpret symbolically or poetically.

   Though these believers are very different from the disbelievers previously discussed, I do want to conclude by noting that the fundamentalism I am referring to is largely American—and if it has spread well beyond our shores, it finds its origins in this country. And it is my belief, though I have no evidence, that this particular deviation from tradition is also in part the result of that American rugged individualism, of the insistence that a person is not only the master of her or his fate, but a master of his or her truths.

Sunday, March 15, 2015

Global Warming: Why Act Now in Behalf of Future Generations?
This piece was written in mid-2009 and, as best as I recall, was never published. It’s on a good topic, but remains a quite superficial piece. Read it and find a couple of current comments appended.
            The negative effects of global warming we feel now are not horrendous.  If we knew for sure that what has been happening during the last few decades was just a passing, if cosmic, phase, we would essentially ignore such temporary sizzling in the expectation that things will return to normal before long.
            But that ain’t so.  Everyone but a minority of ostriches believes that if we carry on as usual, ever increasing calamities will befall our globe.  By now most people have become acquainted with predictions about sea levels rising so as to swallow up settlements of huge populations and more.
            But when?  Not next year, not during the next decade, perhaps not even during the lifetimes of many of us who are around now.  That raises a question that has not much been discussed: what does the present generation owe to future generations, many quite distant from our own?
            Economists have formulæ for calculating what I should pay now for an anticipated future gain, with the first sum smaller than the second because the formula considers inflation and the uncertain slips betwixt cup and lip.  But such calculations become tenuous when we are not speaking of events in a single lifetime, and they become inapplicable, except for pathological rationalists, when we are speaking of the cost of preventing the future loss of untold numbers of lives.
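            (For the curious, here is a simplified sketch of that economists’ calculation, with illustrative numbers of my own choosing: each intervening year divides the future sum by one more growth factor.)

```python
def present_value(future_sum, annual_rate, years):
    # Standard discounting: a dollar due in the future is worth less today,
    # because today's dollar could grow at the given rate in the meantime.
    return future_sum / (1 + annual_rate) ** years

# Illustrative numbers only: averting $1,000,000 of damage a century from
# now, discounted at 3 percent a year, is "worth" only about $52,000 today.
cost_today = present_value(1_000_000, 0.03, 100)
```

            The point, of course, is precisely that such arithmetic, sensible enough within a single lifetime, turns grotesque when the future loss is measured in lives rather than dollars.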
            In short, even though large sums of money are involved, the discipline of economics will not teach us what we ought to do.  But thinking of this question as a moral issue also raises difficulties.  Most people hold that they have responsibilities vis-à-vis their children and many also believe that they have obligations toward their grandchildren.  If that is so, we are not doing what we ought to do (not doing our duty) if we don’t contribute to the flourishing of our children and grandchildren—assuming that we are able to do so.  But I can’t see stretching that obligation to my grandchildren’s grandchildren and then to theirs.  As the generations roll on they soon become complete strangers, no more my family than the folks that now live on Chestnut Street in Philadelphia.  And I don’t owe them anything at all.
            This tack, then, isn’t getting us anywhere, even assuming that the people affected in a fairly distant future are descendants of ours.  Nevertheless, quasi-relatives or not, I believe in my gut that we ought to act, if we are able, so that the lives of future generations are not harmed or lost.  So I’ll try another way to justify this conviction.
            A huge number of decisions were made in the past and huge amounts of money have been spent from which I have benefited.  To be sure, when those schools and roads were built (to take two examples of thousands), those who built them did not have me in mind—a fact that is no bar at all to my profiting from their existence.  If all of that is true, do I now have an obligation to act in such a way that unknown future generations are better off because of what I did?
            That’s what people—usually affluent ones—call “giving back.”  Past generations gave to me and now I give to future generations.  Assuming I have what it takes in money and/or ability, that comes close to an obligation.  We tend to hold that people are (reprehensibly) selfish if they are rich and hold on to every penny they own.  (We don’t think that Scrooge is a good guy.)  Children, we say, ought to be helpful to their old parents—in return, so to speak, for having been brought up by them.  Society has nurtured me, goes a similar argument, so now I owe society something in return.  And in this context I see no moral difference between “giving back” to the present generation or to future ones (though thanks are to be had only from those who are here now).  We also don’t approve of Louis XV’s “Après moi le déluge,” that is, not giving a hoot about what comes after us.
            If this talk of morality is getting you down, let me conclude with some quite mundane local analogies.  Not very long ago it was decided to spend quite a bit of money to widen the parkway that goes from downtown Pittsburgh to the airport and beyond.  The beneficiaries of this expenditure, however, are certainly not limited to those whose taxes were used; rather, all the  future people who use that road are included.  There may be arguments about how traffic situations are best improved, but it is taken for granted that easing up on clogged roadways is desirable, with no one fretting that most of the beneficiaries live in a distant future. 
            There is nothing wrong, moreover, in wishing that the authorities many years ago had anticipated ever-increasing traffic and had then widened the road.  We would have been grateful to them and thought them meritorious.  Well, the need for effective transportation has been around for a long time, while the need for saving  low-lying land from being overtaken by the ocean is newly discovered; but the structure of the argument for acting now is the same.

The analogy to road building is too facile for at least two reasons. First, the issue is not to produce for future generations the sort of “facilities” that are expected to be desired because they are similar to features that are at present desired, where the future is expected to resemble the present. The future in the climate-change case, on the other hand, is expected to be quite different—with many of its features unknown—and the reason for acting now is to eliminate or at least to mitigate future disasters. These are important differences. Second, before too long I want to do a piece in which I try to understand why so many simply deny that significant climate change is in progress or, if it is, that human activity is a cause.

Tuesday, March 10, 2015

Size Matters

   Who says that sculpture has to be three dimensional? Take a piece of cardboard, say ten inches long by four inches high, gently curve it, but not symmetrically, and with a bit of Scotch tape attach one ten-inch edge to your desk or table, so that it stands up, so to speak. Now you have a work of sculpture of sorts—at least if you are generous about your use of that term. If it’s sculpture, it’s art, and to call something a work of art is surely honorific.
   But I see that you are not impressed. So, let’s have someone expand the dimensions of that cardboard—say using sheet metal—keeping the shape and proportions, making a piece 40 inches high and 100 inches long, curved like the cardboard. Are you more impressed? Do you like it better as a work of sculpture?
   Honestly, “Yeah, but . . . blah . . .” might well be the answer. Happily no one, to my knowledge, ever produced works of the kind just described. However, sculptures of such and similar shapes were created by the American artist Richard Serra. But his works, while perhaps originating in simple drawings of shapes and curves as just described, dramatically exploded to heights of twenty feet and more, emphatically asserting themselves in textured Cor-Ten steel. Size matters.
   Go to YouTube and see a demonstration: There you will see, accompanied by an explanatory voice, two gloved hands variously twisting and twirling a sausage-shaped blue balloon, until, lo and behold, it turns into a cheerful balloon dog. By my guess that shiny beast is about ten inches high and perhaps a foot long. Balloon twisting is not an ancient art, but it seems to have entertained children at birthday parties since the early forties of the last century.
   That is Jeff Koons’s starting point. The sculpture with which he winds up is also shiny, to the point that you can use it as a mirror, and, like the original, apparently seamless. But Koons’s dogs are made of metal, come in several vibrant colors and are big. I mean BIG: over ten feet high and about twelve feet long. We’re talking feet, not inches. We’re also talking millions: dollars, that is.
   To get there, Koons had to devise manufacturing techniques and supervise a large squadron of assistants with a variety of technical skills. His mega-baubles cost a lot to produce, but they cost much more to the mega-collectors who have bought them. Size matters.
   Size matters: that is how Serra and Koons wind up on the same page. But there are profound differences. Serra has invented a profusion of shapes—variously curved planes that are born at the size at which they are created and displayed. Koons, on the other hand, appropriates previously existing models and lavishes on them his considerable technical know-how to create his multi-million dollar Collector’s Items. “Koons is a man who gives a whole new meaning to the term lightweight,” says Felix Salmon in a Guardian review of the recent Koons exhibit at the Whitney, its last in the Breuer building.

   Size matters. Has it ever before in the history of art?  For sure it has, starting with the Parthenon. But I cannot think of many other instances where size was IT. Here are two different examples.

Sunday, March 8, 2015

The AT&T Monopoly as Creator of Bell Labs
   It was of course sheer coincidence that on the day after I finished Jon Gertner’s fascinating book, The Idea Factory: Bell Labs and the Great Age of American Innovation, I read in the New York Times that AT&T will shortly be replaced by Apple “in the Dow Jones Industrial Average, an index that tracks 30 US blue-chip companies and is a leading bellwether of the nation's economic performance.” Ma Bell had entered the Dow in 1916.
   A coincidence, of course, but uncannily symbolic of the theme of the blog post I had thought of writing, also last night. I will not review the book; there are plenty of good discussions and Google will get you to them. I merely want to make a couple of observations that you might say are embedded in Gertner’s account, but not brought out as explicitly as I will here. But before I do so, I need to remind you of what Bell Labs was.
   It was created in 1925 as AT&T’s Research and Development unit and became, many agree, the greatest institution of that species—ever, anywhere. At its height the Labs employed 12,000 people—theoretical scientists, experimental scientists, engineers of various brands, and more practical people who had ideas about how to manufacture (economically!) the things thunk up by those more abstracted ones. Some problems were set, many were “invented” by the “employees”—of whom no one could earn more than ten times the salary of the lowest paid—and other projects were brought to them as they arose from AT&T’s needs.
   Interaction among all these folks was controlled—or rather, very much not controlled—by practices and mores that sharply distinguished Bell Labs from the departmental organization of universities. No sharp lines were drawn between genres and interaction among them all was encouraged and even facilitated by the very design of buildings—with long corridors that made encounters likely.
   Given this set-up (of which this is a mere outline), the Bell Labs’ accomplishments were phenomenal. A recital of them all would indeed require a review. Herewith just a flavor. Crucial work on radar, a significant World War II project. Many concerns with long-distance telephoning, with innovations pertaining to undersea cables and communication via satellites. The stars, however, are two. Above all, the “invention” of information theory by the mathematician Claude Shannon, which makes possible (or should I say practical?) the communication of text via telephone or computer or whatever comes next. The other star, more (if not very) familiar to us ordinary folks, is the transistor, which replaced vacuum tubes at a minuscule fraction of their bulk and consumption of power, making it possible to, literally, amass millions of them. Although Bell Labs accomplished many more feats that affect all of our lives, I’ll let those two symbolize how that organization changed technology in the latter 20th century and beyond.
   Clearly, to have those 12,000 people do all that and much more cost mega-bucks—many millions in the currency of the day, probably pushing into the billions if the same effort were mounted today. This was not Washington money, but money from an outfit that had to cultivate customers to stay alive, like zillions of others to be found on the stock market. But there was one big—very big—difference: AT&T had been accepted as a monopoly by the US government. With a steady growth of telephone users—and with rates not pressed down by competition—the company was rich enough to satisfy its stockholders as well as invest heavily in providing for its future. Bell Labs, whose work called for the investment of many years of effort, could only have been created by a company that was autonomous, capable of planning into an extended future.
   That status as a recognized monopoly—in a context that was, in general, not at all friendly to monopolistic practices—had another outcome that was very favorable to widespread technological progress. AT&T management was well aware that the US government might change its position and end the company’s privileged status as a monopoly. It thus became Bell Labs’ policy to license its many patents to others for only minimal charges. Numerous industries in the US and throughout the developed world thus benefited from the work of the Labs, almost as if it were a governmental research organization—only better run and more successful than any.
   It was not to last—not because the Labs ran out of steam, but because, after years of negotiations that began with a 1974 Justice Department antitrust suit, agreement was reached in early 1982 that led to the break-up of the Bell empire, separating the regional operating companies from AT&T, which retained Western Electric, its manufacturing subsidiary, and Bell Labs itself.
   With the introduction of competition, the consequences, not just for the telephone industry but for a much broader span of communication providers, were complex and at times dramatic. I do not have a grasp of this multifaceted set of changes and am not in a position to sort out what was for the better and what for the worse. Most free-market proponents will probably agree that in some domains competition must give way to regulated monopolies, but there is probably little agreement as to which domains those are and how the regulation should work. These are big topics—well beyond my ken.

   I do, however, want to conclude this small essay by noting that Bell Labs, almost certainly the greatest “idea factory” ever created, depended for its formation and continued functioning on the existence of a significant monopoly, and that its end came rapidly when that monopoly ceased to exist.

Monday, March 2, 2015

Another Oldie
       While I intend to post a historic op-ed only now and then, I do want to match one of my favorites (see previous post) with an op-ed in which I was dead wrong. Since I have a copy of the printed version on my computer, you will also see how it was treated by the Pittsburgh Post-Gazette.

Pittsburgh Post-Gazette, Wednesday March 10, 2010
Tea Party paranoia is nothing new
Fear-mongering has always been with us, but it never wins in the end
By Rudolph H. Weingartner

It is a notorious fact that the Monarchs of Europe and the Pope of Rome are at this very moment plotting our destruction and threatening the extinction of our political, civil and religious institutions.
-- So it was "reported" in a Texas newspaper in 1855.
There is much to regale one when reading historian Richard Hofstadter's 1965 essay "The Paranoid Style in American Politics." While the citation above speaks darkly of foreign foes, plenty of others from the beginnings of the American experience berate the enemy within.
Masons and Mormons have been accused of nefarious plots against the right-thinking; so have Catholics and Jews. Add the racial or ethnic group of your choice. Muslims have been a favorite lately.
Some enemies never seem to go out of fashion. A 1954 book was titled "The Income Tax: The Root of All Evil," more than half a century before Andrew Stack aimed his plane at an IRS office in Austin, killing himself and one employee.
This dramatic event was not an outlier. In just five years, threats against IRS employees have increased by nearly 25 percent, to 1,014 in 2009. The enemy is within, didn't you know?
If you thought Gen. George C. Marshall was the last word in competence and rectitude, be enlightened by Sen. Joseph McCarthy, who was quite sure that "his decisions, maintained with great stubbornness and skill, always and invariably served the world policy of the Kremlin."
If you thought Dwight Eisenhower was a mildly conservative president, be enlightened by Robert (grape jelly) Welch, founder of the John Birch Society, who considered him "a dedicated, conscious agent of the Communist conspiracy" -- a conclusion "based on an accumulation of detailed evidence so extensive and so palpable that it seems to put his conviction beyond any reasonable doubt."
And so, with Yogi Berra, I observe that it is déjà vu all over again.
The best evidence of this is contained in a brilliant Feb. 16 New York Times article by David Barstow: "Tea Party Lights Fuse for Rebellion on Right."
Many of the tea partiers were not involved in politics until prodded by misfortunes attributed to the recession. The newness of their plunge into the fray explains in part the radical nature of their proposals: Get rid of the Fed, the income tax, Social Security, not to mention bailouts and stimulus bills, even Medicare (a government program the government should keep its hands off!).
As for so many in the past, to the tea partiers the world is full of conspiracies, with President Barack Obama the master of them all. He, not even a citizen of the United States, is intent on controlling the Internet, depriving Americans of their guns, killing the economy and so much more.
But take note that a Nevada Republican running for Congress blames both the Democratic and Republican parties for moving the country toward "socialist tyranny." An equal opportunity accuser!
Therein lies a clue.
I would not, with Mr. Barstow, characterize the tea party movement as an expression of "conservative populist discontent." Populist, probably; discontent, surely. But not conservative.
Both Edmund Burke, the father of modern conservatism, and William F. Buckley, his modern, if imperfect, reincarnation, would shudder in their graves to see the tea partiers given the respectable label of "conservative."
Conservatism is a rational position. Paranoia is neither rational nor a position. It is, the dictionary informs us, a derangement, derived from Greek words that translate as "outside the mind."
History teaches us that we've been here before. History teaches us that fear-mongering can cause great annoyance, injury, turmoil, even death. But history also teaches us that paranoia in American politics, in the end, does not prevail.
This too shall pass.
Rudolph H. Weingartner is professor emeritus of philosophy at the University of Pittsburgh, where he served as provost from 1987 to 1989 (rudywein@