KIRKUS REVIEWS, ONE OF THE INDUSTRY’S LEADING NEW BOOK ANNOUNCEMENT SPACES, IS BEGINNING TO TAKE NOTICE OF A FRAGMENT TOO FAR

REVIEW: Kirkus Reviews, July 15, 2019 (online), August 1, 2019 (print), “Lynch’s series debut offers a sharp, engaging hero and a mystery that amiably simmers. . . ” Full review pasted below.

A FRAGMENT TOO FAR
Author: Dudley Lynch

Review Issue Date: August 1, 2019
Online Publish Date: July 15, 2019
Publisher: ECW Press
Pages: 352
Price (Paperback): $16.95
Publication Date: October 1, 2019
ISBN (Paperback): 978-1-77041-499-0
Category: Fiction
Classification: Mystery

An industrious Texas sheriff tackles a strange multiple murder with roots in weird science and the Roswell crash of 1947. Mild-mannered Sheriff Luke McWhorter, whose Yale degree in Divinity gives him a relaxed, straightforward narrative voice, marks an ignoble milestone when he vomits for the first time in his career. The cause is the discovery in an empty house of nine corpses already ravaged beyond easy recognition by a swarm of buzzards. McWhorter, who’s never met a bit of received wisdom he wasn’t eager to share, immediately calls FBI Special Agent Angie Steele, who also happens to be his lover. Angie and several local detectives start to untangle the case, complicated by the buzzard attacks and, strangely, by the lack of any personal identification around the bodies or in the house. An equally anonymous 10th body is discovered behind the dwelling, which belongs to a professor Huntgardner, who hasn’t been seen lately. Huntgardner’s extensively documented, and somewhat outrageous, claims about the legendary UFO crash at nearby Roswell raise even more eyebrows than usual in light of the multiple murders. McWhorter’s wide-ranging investigation involves searching for the professor, identifying the bodies, and making a deep dive into recent history with the few locals who are old enough to remember. With Angie’s guidance, McWhorter unravels the complex case, separating conspiracy theories from historical facts and offering numerous sidebar anecdotes along the way. Lynch’s series debut offers a sharp, engaging hero and a mystery that amiably simmers without ever quite catching fire.

UPCOMING REVIEW: Publishers Weekly. Run date TBD.

BLURB: “Make room for author Dudley Lynch and protagonist Sheriff McWhorter in the Mystery Fiction Big Time. A Fragment Too Far delivers.” — Carlton Stowers, two-time winner of the Edgar Award

Upcoming Ad: Publishers Weekly, full-page print ad in the Fall Announcement edition


A FRAGMENT TOO FAR: ‘TRUE DETECTIVE’ MEETS ‘THE X-FILES’ IN THE FIRST INSTALLMENT OF DUDLEY LYNCH’S SHERIFF LUKE MCWHORTER MYSTERY SERIES

Dudley Lynch’s debut mystery novel is available October 1. To preorder, go here!

Nine physicists are dead. The medical examiner has determined that the victims died from drinking coffee laced with rat poison. The owner of the house, Professor Thaddeus Huntgardner, isn’t a suspect, but his claim that a piece of debris from Roswell’s 1947 UFO crash was hidden in Flagler might be true.

Is the fragment real? If so, who is trying to locate it? And what has fueled the byzantine activities of Abbot County’s two secret societies for the past 70 years?

Enter Luther “Luke” Stephens McWhorter, a Yale Divinity School–educated West Texas sheriff with all the right questions. Working with FBI agent and girlfriend Angie Steele, Sheriff Luke begins to put the pieces together and to understand the connection between seemingly unrelated phenomena.

To arrange reviews, interviews or events involving A Fragment Too Far, contact Susannah Ames at susannah@ecwpress.com.

For information regarding social media promotion, print or digital advertising, or promotional materials (such as postcards, bookmarks, or posters), contact Leah Kleynhans at leah@ecwpress.com.

Reviews of the author’s books, including A Fragment Too Far:

“An engaging narrative voice, likeable hero, and unpredictable plot make for an outstanding mystery debut!” — Jay Brandon, Edgar Award-nominated author of Fade the Heat and Shadow Knight’s Mate

“[A] superb writer [who] deftly guides you through apocalyptic terrorism, academic over-reach, religion as entertainment, the chances of fate, the sustenance of friendship and the hunger for love, all out in the sunbaked badlands of West Texas.” — Victor L. Hunter, co-author of The New York Times-reviewed novel, Living Dogs and Dead Lions

“Lynch takes Sheriff Luke McWhorter and the reader on a wild ride through the West Texas prairie to deliver a bang of a story.” — Stephanie Jaye Evans, author of the Sugar Land Mystery Series

Joe Holley, writer of The Houston Chronicle’s weekly “Native Texan” column, calls Lynch a “clever and quirky” author who has pulled off an “unlikely” plot “with panache.”

“[Dudley Lynch] has done the almost impossible. He has turned the Raiders of the Lost Ark into a Sunday School lesson. . . . There is a film there!” — Robert M. Randolph, Chaplain at Massachusetts Institute of Technology

“A Fragment Too Far is a gripping thriller as full of twists and turns as a mountain road. Lynch’s beloved sheriff works through a situation combining a very real West Texas setting and its history, science, and what we all sincerely hope is science fiction. A first class read.” — Rose Williams, Latin and English instructor, McMurry University, and author of Caesar’s Blood

“I just wanted to thank you and let you know that there is, somewhere in Africa, a 56-year-old Frenchman who is grateful to you for this gift through your [Strategy of the Dolphin] book.” — Jean-Pierre Brosset, Dakar, Senegal

“Dudley Lynch is a toolmaker for the brain.” — Dr. Steven Feinberg, author of The Advantage-Makers

“Your creative works absolutely transformed my life.” — Dr. Sirichai Preudhikulpradab, organizational development professor, Assumption University of Thailand

“If you haven’t read any of Dudley’s books, you’ve missed out on top-notch material for change work.” — Per Rehné, Cornelius Group, Copenhagen, Denmark

“Dudley altered my world view in such a powerful way, opening new and powerful leadership possibilities.” — Katherine Carol, co-author, Shining Beautiful

“I have a hard time putting into words the tremendous joy and satisfaction I am experiencing in my life right now as a direct result of your work.” — Brian Lundquist, Publisher, NanoTech-now.com

Blurb about the author:

Dudley Lynch was born on America’s Southern Great Plains to a fundamentalist minister and his wife. He has lived much of his life in Texas, some of it in small cities like Abbot County’s Flagler, the spellbinding, ever-surprising setting for his first two mystery novels. With two university degrees in mass communications and an honorary doctorate from Newport University, Lynch has written sixteen non-fiction books and articles for more than 250 publications worldwide. His first published books were about Texas storms (tornadoes) and Texas political figures, including The President from Texas (Lyndon Baines Johnson) and The Duke of Duval (the infamous political boss George B. Parr).

In A Fragment Too Far, the first of Lynch’s fictional mystery series, he introduces Sheriff Luther Stephens McWhorter, a Yale Divinity School–educated West Texas law officer. Sheriff Luke ushers crime fighting into the Post-Extraterrestrial Age with his discovery that a fragment of a UFO with potentially dangerous secrets inscribed on it may now reside in Abbot County. The UFO plunged into a New Mexico pasture shortly after World War II ended. Is there more to be known? More to be feared? More to prepare for? The answers should be available soon. The 75th anniversary of the Roswell UFO crash will occur in late July 2022. The owners of the fragment said this was when they planned to return to Earth to retrieve it.

Lynch’s second Sheriff Luke adventure, One Good and Deadly Deed, will be released in the summer of 2020. In this mystery, Sheriff Luke finds Abbot County’s tranquility disrupted by the arrival of a 4,000-year-old mummy retrieved by local explorers on Turkey’s Mt. Ararat. Thanks to breakthroughs at his local research laboratory, a talented Flagler biologist has managed to impregnate a Palestinian-born woman with DNA that may have come from the prophet Noah. Thinking that a half-Israeli, half-Palestinian baby would end Middle Eastern conflicts thousands of years old, the biologist and two more of Flagler’s ever-intrepid adventurers discover they have badly misjudged the consequences. The two pilots who brought the mummy to Flagler die when someone shoves them into their aircraft’s propellers. Bullets start flying at the local hospital, more people die, and the new mother and her babies—the mummy’s DNA produced twins—have to be rushed into the U.S. Marshals Service’s Witness Protection Program. And the mummy? The fight to possess it continues.

These works are the first two of a projected five-book series about the erudite sheriff, his feisty, fetching girlfriend, FBI special agent Angie Steele, and the byzantine secrets of their hyper-active West Texas city.


SO JUST HOW SKILLED AT LYING DO WE AMERICANS WANT OUR PRESIDENT TO BE? SOME THOUGHTS FROM THE FRONT LINES OF FALSEHOOD.

We first posted this blog item back in July, 2006. Change a name or two, and it seems to be just as apt now as then:

On the one hand, scientific proof is growing that George W. Bush is a very intelligent man. The argument centers on knowledge that has become so widespread that it’s something of a worldwide joke: the president is so good at, so at home with, so nonchalant about . . . lying. And, on the other … well, let’s reflect for a moment on the issue of leaders and lying.

Salon.com’s Tom Grieve gave us an example of the wherefores of recent presidential lies earlier this month as he revisited Bush pronouncements over the past five years on how he feels about Osama bin Laden. Bush has said, variously:

Sept. 17, 2001: George W. Bush is asked if he wants Osama bin Laden dead. “I want justice,” he says. “There’s an old poster out west, as I recall, that said, ‘Wanted: Dead or Alive.’”

March 13, 2002: At a press conference, Bush says that he doesn’t know if bin Laden is dead or alive. “You know, I just don’t spend that much time on him. . . . And I wouldn’t necessarily say he’s at the center of any command structure. And, again, I don’t know where he is. I — I’ll repeat what I said. I truly am not that concerned about him.”

Oct. 13, 2004: “Gosh, I just don’t think I ever said I’m not worried about Osama bin Laden. It’s kind of one of those exaggerations.”

Jan. 31, 2006: “Terrorists like bin Laden are serious about mass murder — and all of us must take their declared intentions seriously.”

May 25, 2006: “I learned some lessons about expressing myself maybe in a little more sophisticated manner — you know, ‘Wanted dead or alive,’ that kind of talk. I think in certain parts of the world it was misinterpreted, and so I learned from that.”

July 4, 2006: The New York Times reports that the CIA last year disbanded a secret unit assigned to track down bin Laden and his top lieutenants in an effort to focus on “regional trends rather than on specific organizations or individuals.”

July 7, 2006: At a press conference in Chicago, Bush calls the Times report “just an incorrect story.” “I mean, we got a — we’re — we got a lot of assets looking for Osama bin Laden. So whatever you want to read in that story, it’s just not true, period.” Asked if he’s still on the hunt for bin Laden, the president says: “Absolutely. No ands, ifs or buts. And in my judgment, it’s just a matter of time, unless we stop looking. And we’re not going to stop looking so long as I’m the president.” Bush said he had announced regret over the “dead or alive” comment only because “my wife got on me for talking that way.”

But let’s be fair about this. President Bill Clinton wasn’t called Slick Willie because of his hair gel. In her Feb. 5, 2006, cover story on lying in The New York Times Sunday Magazine, science writer Robin Henig recalled watching a videotape of Clinton at a presidential news conference in early 1998. These were the early days of the Monica Lewinsky scandal. You probably remember the scene as well as I do, when the Prez shook his finger at the collective us and said, “I want you to listen to me. I did not have sexual relations with that woman.”

With Henig as she viewed the tape was Dr. Paul Ekman, retired from the psych faculty at UC San Francisco, creator of the Facial Action Coding System and author of the book Telling Lies. Among the clues Ekman counsels us to look for in watching for the lie are (1) demeanor that is different from a person’s usual demeanor, (2) “distancing language,” like referring to others more in the third person, and (3) “verbal hedges,” useful in buying time to figure out how to phrase the lie.

It’s all there in Mr. Clinton’s denial, Ekman told Henig. “I want you to listen to me.” Verbal hedge (like the shark in the cartoon, standing in the courtroom looking up at the judge and saying “Define ‘frenzy.’”). “I did not have sexual relations with that woman.” Distancing language. And there was, noted Ekman, an almost imperceptible softening of the president’s voice at the end of the “that woman” sentence. Demeanor departure. Ekman leaves the impression that a trained human lie-detector can only conclude that “That man did something nefarious with that woman.” In fact, said Ekman, the moment the press conference ended, he started getting calls from people he has trained, saying, “The President is lying!”

And yet the experts that Henig interviewed seem to be pretty unanimous that you wouldn’t want a president who couldn’t lie. Ekman is one of those. He ticks off three qualities needed to tell a lie: (1) the ability to think and plan moves ahead of time—that is, to think strategically; (2) the ability to observe others therapeutically—to put yourself in their shoes; and (3) the ability to act like a grown-up—to manage your emotions.

Two Scottish primatologists have devised the Machiavellian Intelligence Hypothesis. It contends that the bigger the neocortex, the better a creature is at deception. And the better the deception, the more social the species. The more social the species, the greater the intelligence. Which may be why, as Henig reports, researchers at the University of Southern California found that pathological liars have more white matter in their prefrontal cortexes than nonliars. Another researcher, Sean Spence at the University of Sheffield, notes, “White matter is pivotal to the connectivity and cognitive function of the human brain.”

So are we verging on “what a liar you are” becoming not only a compliment but also a reason why a person might make a good president?

I very much like the perspective that Jeremy Campbell’s gray matter offered on all this in his book, The Liar’s Tale: A History of Falsehood:

“The irony… is that lying cannot hope to succeed in its aim unless truth is the normal practice of a society. In the nineteenth century there was a sense that democracy, more than other forms of government, needed truthfulness if it was to increase and flourish, that mendacity in a politician was more to be deplored than another category of offense. The converse of that view is that in a system which draws much of its strength from candor, lies are all the more effective, all the more insidious. For that reason, so this argument goes, they will never be removed from our type of democratic community. But if lying becomes the norm, on the thesis that it softens the “cruelty” of life, it defeats its own purpose. Truth might then become more powerful than untruth, as in George Orwell’s bureaucratic nightmare, 1984, where a person who dared to speak the truth was so dangerous to the state as to be in urgent need of liquidation.”

I think this may be what I want in a president: a person immensely skilled at telling a whopper but who never does so without agonizing over the damage the telling nearly always does to the fabric of our shared social character.

I’m 99.9% convinced that Mr. Bush doesn’t meet my qualifications, and Mr. Clinton may not have either. Of course, we’ll probably never know. Mr. Clinton was a much more skilled liar than Mr. Bush has proven to be.


BACK WHEN I TRIED TO POST AN ORIGINAL FULL-BLOWN BLOG ITEM ONCE A WEEK, ONE WEEK I PROVIDED THIS ENCAPSULATION OF WHAT I CONSIDERED MY TEN BEST BLOG ITEMS FROM THE PAST

A reader in North Carolina writes:

I still believe in “significant coincidences,” which were demonstrated—once again—by a death in the family that brought me to Ohio for a week. As I woke up this morning, I ended up browsing issues related to my Web site and found your blog posting of the poem I had sent you a few months ago. That led to your blog, and I started reading many of your other postings, with pleasure. First, I realized that I have to link my newsletter readers to your blog. Second, I found great notes on books I need to read. Third, it confirmed that many, many people can benefit from learning more about the Beta Mind.

My kind of guy, for sure. And he set me to thinking: Here’s one of my closest colleagues, and he forgets even that I maintain a blog. And I understand why. Information-wise, we live in a dim sum world. It’s all you can do to sample a little here, a little there. So this may be your first visit to my blog, or your first visit in a good while, or your first visit since your last first visit. In any event, I have managed to haul my bifurcated brain out of bed now for the past 16 months and post, on average, one blog item every week or two. And other than my wife, Sherry, to whom I pointedly mention “my latest blog item” within 24 hours of each item’s posting and then pointedly make reference to something in that item within the next 24 hours to find out whether she’s read it, I strongly suspect that there’s not another person on earth (in the heavens, either) who has read every single one of my musings.

Forever intending to be your humble servant, I want to save you the trouble of backtracking thoroughly. So here’s a guide to what I consider the Ten Best of the Lot as of the day I wrote this blog item (not in any particular ranking of importance, but beginning with the most ancient of the postings first). To read an item, find the date provided in the list at right on this page.

Happy timewarping!

Just When I Was Ready to Discuss What We Could Do to Encourage New Thinking Skills in a Seminar at Her Employer, I Get This Question about Believing in God Posted on November 28, 2005

While the Greedy Merchandisers of Children’s Electronic Entertainment Are Counting Their Shekels, Their Viewers—or So It Appears to Grammie and Me—Are Simply Learning to Count Posted on December 16, 2005

Yes, I’m Convinced That We Are Progressively “Evolving” How We Wire and Use the Wiring in Our Brains, But We Still Don’t Have Any Means to Stand Back and Take a Good Look at How It All Works Posted on March 04, 2006

Six Years Ago I Wrote About Where Mr. Bush Clocked Out on the Timepiece of Presidential Candidates. I Continue to Think It Was a Timely Reading Posted on May 09, 2006

The Minds We Use Have Consequences in the Lives We Live. Here Are Three Telling Examples Posted on July 05, 2006

“To Be or Not To Be?” Really Isn’t the Question, and Never Has Been. So What IS the Really Important Question that the Brain Needs to be Trained to Handle Adeptly and Maturely? Posted on July 05, 2006

Unhappily, When This Talented Academician’s Dual Worlds of Art and Science Meet in His “Brain on Music” Book, the Bridge Often Seems to Be Out Posted on September 19, 2006

Philosophers Aren’t a Modest Bunch: They Argue That Few of Us Would Know Much About Anything If Philosophy Didn’t Know Something About Something Posted on October 25, 2006

The Buck Stops Here on the Issue of Breaking the Cycles and the Spells That Cauterize Our Brain’s Ability to Provide Sane and Suitable Actions and Answers Posted on December 14, 2006

One of the World’s Smallest “Engines of Change” Is Also One of Its Most Powerful. On An Almost Unimaginable Scale, the Amygdala Rules Posted on January 07, 2007


“TO BE OR NOT TO BE?” REALLY ISN’T THE QUESTION, AND NEVER HAS BEEN. SO WHAT IS THE REALLY IMPORTANT QUESTION THAT THE BRAIN NEEDS TO BE TRAINED TO HANDLE ADEPTLY AND MATURELY?

The future of the human species, and the future of the many other species whose fate is tied to ours, however directly or indirectly, hinges on what the human brain can be taught to do with this question: Is there another way to explain or do this?

This has always been the question. Every advance in tool capability and efficiency has resulted because someone either imagined another way to do or explain something, or else simply stumbled onto it. The same is to be said for progress in religious thought. And in philosophy. And medicine. And all else.

At the biological level, when a new way has proven better suited to delivering a result more useful or powerful or adaptive to general circumstances, or often to very specific circumstances, the result has not infrequently been a reordering or reconstitution of the biological pecking order or of the biological mechanics themselves.

Adroit handling of the question—is there another way to explain or do this?—seems not to come naturally to us humans. It is, for most of us, an acquired taste at best. What we think of the question, if we think of it at all, is most often a consequence of whether we were born to parents who were products of a culture that welcomed the question. Most cultures, and most parents, have not encouraged the question. So unless you found yourself living in a democracy, there has usually been a risk in asking the question. And even in a democracy as formally devoted as the United States of America to the idea that it is always permissible to ask “Is there another way to explain or do this?”, it can sometimes be dangerous to ask the question. It was pervasively so during the Civil War years, during the McCarthy Era, and during the reign of Jim Crow in the South, and it can still be so, to a disturbing extent, in today’s obsessed-with-terrorism political environment.

We have spent years at Brain Technologies developing and perfecting, often assisted by the trenchant and imaginative work of others, ways to forecast how a given brain may handle the question.

Generally, or so our experience suggests, the brain will react in one of four ways:

1) In most circumstances, it will reject the idea that there is anything to be gained in asking the question. Thus it will defend, sometimes to the death or to others’ dying, the explanations it already has.

2) It will accept the idea that the question is a good one, but typically be indiscriminate in seeking, judging and acting on answers to the question. The first answer that happens by that seems to work is, for this category of brain functioning, accepted and acted on, whatever the outcomes.

3) It will see the creation of hypotheses and the investigation of them as the “be-all and end-all” of the process, so that the challenge becomes understanding a set of answers in great detail but not necessarily the efficient and imaginative use of any of them.

4) It will automatically assume that there is an infinite variety of ways to explain almost anything and will work to experience as many varieties of ways as possible, giving precedence to the newest and most novel.

Of course, the human brain being what it is, most any healthy brain, and especially a fully formed one (adults over 30, for the most part), can and does move between these four approaches if coached, encouraged and provided with a safe haven for doing so. However, such safe havens, such encouragement and such coaching are in extremely short supply. It is so today, and it has always been so.

So nothing approaches in importance how human brains handle the question, “Is there another way to explain or do this?” At this stage in our development as a species, handling the question well and effectively and with political astuteness requires unusual pluck, luck and maturity. It is a most intriguing reality that while our species often seems to take three steps backwards for every half-step forward, we do seem to be making some progress in handling the question.

Now explaining the reasons for that has come close to antiquating virtually all foundations of religion and philosophy. Nor are suitable answers in immediate prospect. It may first be necessary to have some good explanations for such questions as what is the world made of (we still don’t know) and what happened before anything happened (we don’t have a clue) and is there conceivably any point or place or combination of circumstances in the universe when it will cease to make sense to ask the question, “Is there another way to explain or do this?”

Stay tuned as long and as healthily as you can. It has really begun to get interesting in these recent times.
_______________________
The above commentary has appeared previously on one of my blogs.


FOR A SELLER OF BOOKS AND MUSIC PRODUCTS, IT SEEMED LIKE THE MOTHER LODE, AND STILL DOES. BUT IT ALSO TURNED OUT TO BE A REMARKABLE WINDOW ON A GIFTED AND DISTURBED MIND

It started as purely a business transaction—a business coup, it seemed then and still does. Seventy-two moving boxes (12×13×16 inches in size), each packed like a sardine tin with books, CDs, audio tapes or phonograph records. We bid $1,000 and got the lot. It took a rental trailer and a pick-up truck (and my brother-in-law’s generous help) to get all this to our double-car garage. My rotator cuff injury ached for days. And that was only the beginning.

The thousands of items had to be unboxed, one at a time, and catalogued for the online bookstore we operated at the time (Brain Books To Go) and other services where we were selling reading and listening materials. That, obviously, was the initial attraction. What we didn’t realize at the time was that our thousand dollars had done more than simply glut our intellectual-properties supply line for several months. We’d also acquired a window on a remarkable, and remarkably shattered, brain.

We knew going in that this collection carried a “must-sell” urgency because its compiler was in a coma from which he was not expected to emerge. We heard vaguely that he had suffered a lifetime of schizophrenia. That added an element of intrigue to the deal, because we purchased the library blind. The items were already packaged when we bought them.

Months later, we’d opened every box and examined every item. And it was a singular experience for us.

Sherry took charge of the CDs, audio tapes and albums. I took the books. Both categories, though, produced the same response. Our minds boggled over another mind’s remarkable achievement, given the obvious depth of its despair and brokenness.

Sherry gave me a guided tour through the albums, the audio tapes and the CDs the other night. She’s put them in clusters alphabetically by artist. It appears that our archivist started in the late 1960s. For the rest of that decade and in the ‘70s and ‘80s, it was all albums. The Beatles, Grateful Dead, Beach Boys, Marshall Tucker Band, Abba, Kenny Rogers, Jimi Hendrix—those names and many others we recognized, even though their album photos often pictured them earlier in their careers than we remembered. And then there were hundreds of singles by performers we weren’t familiar with at all: Dan Fogelberg, Lee Michaels, Savoy Brown, Lightfoot, The Jim Carroll Band and Barclay James Harvest, to name a handful I turned up at random while rifling through Sherry’s orderly storage system.

In the ‘80s, our archivist turned to audio tapes. And in the 1990s and the 21st Century, to CDs. Not an unprecedented undertaking, of course. There are thousands of collectors worldwide of this sort of thing. But when combined with the book collection, we’ve been made to realize that our potential “white elephant” purchase has thrust us into the role of archaeologists for a mind that, if deeply troubled, was also profoundly gifted, active and productive.

That’s because the same thoroughness that made his music products collection a veritable “history” of what music producers were packaging over nearly four decades did the same for his book collection.

Clearly, he didn’t buy everything. But it is difficult to think of a title … or a writer … of importance that he missed. At one point I had to wonder, “Will I ever get all of his copies of Anthony Trollope’s works catalogued?” But I quickly forgot Trollope because then came Dickens. Book after book after book. Some a bit bunged up, but many brand new. Eleven, spankin’ clean volumes of The Diary of Samuel Pepys. The entire set of the gorgeously printed and bound Library of America series. Copy after copy of the prodigious Oxford University Press’s dictionaries and anthologies and histories and “companions to.” Somehow, he either got on the mailing lists of or prowled the bookstore stacks housing the publications of numerous university presses, and certainly the biggest and busiest: Harvard, Yale, Princeton, Chicago, Johns Hopkins, the State Universities of New York, Berkeley, Stanford, North Carolina, Indiana, Oklahoma, Nebraska and on and on. But then he’d also purchased practically every book Billy Graham ever wrote. And Robert Schuller. And Joan Didion. And John Grisham. He’d bought copious numbers of books about military history. And race relations. About philosophy and literary criticism. (And languages. He never seemed to have passed up a Berlitz “learn to speak it yourself”-type tape set and instructional book. But not just Italian or Japanese. The languages of the Lakota Sioux, and the Shoshoni, and the Navajo—he had those sets, too.)

But did he actually read any of his books? As I kept moving through box after box of the best and the most acclaimed (and sometimes not so acclaimed) of 300 years of writing in the English language, I had my doubts. But then, by the bye, I’d pick up one of Oxford U. Press’s 1,200-page tomes, for example, and there deep in its bowels I’d notice a series of repetitive notes. “I read this … I read this … I read this … I read this,” he’d pen in his small, slightly irregular handwriting.

And then I discovered the journals. We’d been told by one of the workers who had packaged all this about the journals. He said they were just spiral-bound notebooks filled with gibberish and they’d tossed them in the trash. But not all of them. I found a half-dozen. And it was in them that the extent of his illness became instantly and achingly clear. And also, the extent of his devotion and passion to his collections.

I’ll not quote a single word from his notebooks. It would be a violation of his copyright, not to mention his privacy. But leave it said that he read copiously. He would plan the night before to read Dostoevsky or William James or Eugene O’Neill the following day. He might even have a favorite chapter in mind (indicating that he’d read it before), and would note how eager he was to place a checkmark by it once he was finished. Every day for years, he wrote a single page about each day of his hopelessly fragmented life. When he reached the end of a page and a day, he stopped, often in mid-sentence. Yes, he read a lot in his books. And, no, he couldn’t possibly have done more than open many of them a time or two, if that.

We understand that he did emerge from his coma. Afterwards, he was cared for in a health facility in the Midwest. We wish him every solace that contemporary medicine of the mind could offer. And we wondered if the store clerks checking out his endless purchases over the decades had any idea of the chaos in the brain they were conversing with.
_______________________
The above commentary has appeared previously on one of my blogs. I’m choosing to recycle it here because I think the story it tells is fascinating.


YOU CAN CALL ME A CONSERVATIVE-LIBERAL-SOCIALIST, IN NO PARTICULAR ORDER, AND ALL AT THE SAME TIME, IF YOU WISH … AND HERE’S WHY

Thanks to invaluable assistance from the late Polish philosopher Leszek Kołakowski, who taught at All Souls College, Oxford, after being exiled from Poland in 1968, I think I’ve figured out what my real political orientation is. I am a Conservative-Liberal-Socialist with the following views (all borrowed, most liberally and described most conservatively, in a very social sense, from Kołakowski):

A conservative believes:

1. That in human life there never have been and never will be improvements that are not paid for with deteriorations and evils; thus, in considering each project of reform and amelioration, its price has to be assessed. Put another way, innumerable evils are compatible; but many goods limit or cancel each other, and therefore we will never enjoy them fully at the same time….

2. That we do not know the extent to which various traditional forms of social life—family, rituals, nation, religious communities—are indispensable if life in a society is to be tolerable or even possible. There are no grounds for believing that when we destroy these forms, or brand them as irrational, we increase the chance of happiness, peace, security, or freedom….

3. That the idée fixe of the Enlightenment—that envy, vanity, greed, and aggression are all caused by the deficiencies of social institutions and that they will be swept away once these institutions are reformed—is not only utterly incredible and contrary to all experience, but is highly dangerous. How on earth did all these institutions arise if they were so contrary to the true nature of man?

A liberal believes:

1. That the ancient idea that the purpose of the State is security still remains valid. It remains valid even if the notion of “security” is expanded to include not only the protection of persons and property by means of the law, but also various provisions of insurance: that people should not starve if they are jobless; that the poor should not be condemned to die through lack of medical help; that children should have free access to education—all these are also part of security. Yet security should never be confused with liberty. The State does not guarantee freedom by action and by regulating various areas of life, but by doing nothing….

2. That human communities are threatened not only by stagnation but also by degradation when they are so organized that there is no longer room for individual initiative and inventiveness….

3. That it is highly improbable that a society in which all forms of competitiveness have been done away with would continue to have the necessary stimuli for creativity and progress. More equality is not an end in itself, but only a means….

A socialist believes:

1. That societies in which the pursuit of profit is the sole regulator of the productive system are threatened with as grievous—perhaps more grievous—catastrophes as are societies in which the profit motive has been entirely eliminated from the production-regulating forces. There are good reasons why freedom of economic activity should be limited for the sake of security, and why money should not automatically produce more money. But the limitation of freedom should be called precisely that, and should not be called a higher form of freedom.

2. That it is absurd and hypocritical to conclude that, simply because a perfect, conflict-free society is impossible, every existing form of inequality is inevitable and all ways of profit-making justified. The kind of conservative anthropological pessimism which led to the astonishing belief that a progressive income tax was an inhuman abomination is just as suspect as the kind of historical optimism on which the Gulag Archipelago was based.

3. That the tendency to subject the economy to important social controls should be encouraged, even though the price to be paid is an increase in bureaucracy. Such controls, however, must be exercised within representative democracy. Thus it is essential to plan institutions that counteract the menace to freedom which is produced by the growth of these very controls.

Observed Dr. Kołakowski, “So far as I can see, this set of regulative ideas is not self-contradictory. And therefore it is possible to be a conservative-liberal-socialist. This is equivalent to saying that those three particular designations are no longer mutually exclusive options.”

I’ve left out some of his comments to shorten the above. The complete essay appears on pages 225-227 of his Modernity on Endless Trial, a book whose every paragraph I’ve found to be enlightening and engrossing. To order a copy, click on the title.
_______________________
The above commentary has appeared previously on one of my blogs. I’m choosing to recycle it here because I think the points it makes are fascinating and important.


TWO DIFFERENT “TRIUNE” BRAIN THEORIES BUT THE SAME CRUCIAL CONCLUSION: WE ARE MAKESHIFT ENTITIES STILL UNDER DEVELOPMENT, AND THAT CAN CREATE SERIOUS PROBLEMS FOR US

I do not remember exactly the first time that I heard about pioneering neuroscientist Paul MacLean’s concept of the triune brain. The idea of a neocortex sitting atop a primordial cortex sitting atop the brain stem. The brain of a human sitting atop the brain of a horse sitting atop the brain of a reptile, all three brains located inside each of our heads. I do remember being electrified by the idea. Instantly struck by what a gorgeous, evocative, instructive, illuminating insight this was.

But like so many other gorgeous, evocative, instructive, illuminating discoveries, the idea of the triune brain has not always stood the test of further, better scientific inquiry all that well. The problem, mainly, is that the roles of the trio of brains are not nearly as independent as Dr. MacLean had thought. What is going on in the general neighborhood of one of Paul MacLean’s trio of brains often has an outsized influence on what is going on in other brain areas.

But the idea that the brain has separate “processing” areas that don’t cooperate well—that’s a MacLean idea that has stood the test of time.

For example, the region where MacLean located his middle (primordial) brain contains a little almond-shaped organ called the amygdala. It turns out that the amygdala has a mind of its own. That is, it can learn—reason?—independently of the (higher) cortex. Moreover, the means that the amygdala and the cortex have for communicating what each “is thinking” are imperfect at this point in our evolving capabilities, and that creates endless trouble for us.

For non-brain-scientists (me, for one), no one whom I know about has offered better, clearer explanations of all this than Joseph LeDoux at New York University’s Center for Neural Science. In Synaptic Self: How Our Brains Become Who We Are, Dr. LeDoux suggests that the reason the all-important amygdala can’t “talk” well with its higher-up synapses is that the wires leading there aren’t well enough developed. And the reason for that is that the development of language by humans required so much space and so many connections to pull off. Consequently, the cognitive systems in our heads have inordinate trouble communicating with the emotional and motivational systems, and vice versa.

Writes Dr. LeDoux, “This is why a brilliant mathematician or artist, or a successful entrepreneur, can like anyone else fall victim to sexual seduction, road rage, or jealousy, or be a child abuser or rapist, or have crippling depression or anxiety….Doing the right thing doesn’t always flow naturally from knowing what the right thing to do is.”

The trio of brain functions that LeDoux finds most compelling are indeed those governing thoughts, emotions and motivations. If this triune grouping breaks down, he writes, “the self is likely to begin to disintegrate and mental health to deteriorate. When thoughts are radically dissociated from emotions and motivations, as in schizophrenia, personality can, in fact, change drastically. When emotions run wild, as in anxiety disorders or depression, a person is no longer the person he or she once was. And when motivations are subjugated by drug addiction, the emotional and intellectual aspects of life suffer.”

In short, Dr. LeDoux says that the self is synaptic: “You are your synapses.” Meaning that what happens between key parts of the brain—or doesn’t happen—can be all-important and all-defining. On this point, Dr. MacLean would most likely have been in full agreement.
_______________________
The above commentary has appeared previously on one of my blogs. I’m choosing to recycle it here because I think the points it makes are fascinating and important.


IF YOU REALLY WANT TO KNOW WHAT I HAVE AGAINST “MOTIVATIONAL EXPERTS,” I’M GLAD YOU BROUGHT THE SUBJECT UP

Four of the most egregiously unfair and misused words in this language are “You can do it.” And I’m guilty of abusing them, too.

Because in using those words to urge our children or employees or students or anyone else forward in the performance of a task they’ve not done before or at which they are performing poorly, we are often claiming ownership of information and insight that, in most cases, is simply absent.

Who really knows exactly what your brain is capable of? I certainly don’t. And how could you possibly know what my brain is capable of? You shouldn’t presume to know. And neither of us should be telling each other, or anyone else, that we can do something unless there is evidence that this might be so, and even then, there are important intermediate steps that usually get left out. We can call them The 3-Way Test of Achievability:

• Would you like to do it?
• How do you think you might best go about it?
• Is it worth the effort that is going to be required?

When, and only when, we have affirmative answers to those questions do you and I have any reasonable right to offer someone the encouragement that “You can do it.”

In the past few days, I’ve had at least three experiences reminding me that there are things that, in all likelihood, I can’t do. At least, in all likelihood, I’m not going to do them, and so, on these subjects, I fail The 3-Way Test of Achievability.

1) Sitting in our neighborhood deli, Sherry and I were still waiting on our food when the private envelope of our morning conversation was suddenly pierced by a sheet of drawing paper. On the paper, with remarkable fidelity to visages we both were used to observing in the bathroom mirror, were two people seated at a deli restaurant table, having their morning conversation. When we looked up, the artist was beaming at us. He’d been sitting at the table across the aisle, sketching away, unnoticed by either of us. I’m quite sure I’ll never be able to do what he had just done because my brain doesn’t work that way. He said his gift was something he had discovered in himself. He doesn’t use it professionally but, wanting to do something with it, he does things like draw unsuspecting strangers in their morning conversation and spring their portraits on them.

2) One of our local high school seniors has taken the three-hour exam that’s supposed to measure a high school student’s chance of academic success in the first year of college—the dread SAT—twice . . . and achieved a perfect score both times. Asked to explain how he does this, the best he could offer was, “It helps to remember what you have studied.” I don’t need to test this talented mind to be very suspicious that he can’t help but remember what he has studied. This is just the way his brain works. I’ve always marveled at how quickly and totally my brain erases what I’ve just studied once the immediate reason for cramming has been satisfied. I’m quite sure I was not designed to achieve perfect scores on the SAT. Not even once, much less twice.

3) At a used book sale the other day, I spotted a thin, jacket-less little volume titled Mind’s Eye of Richard Buckminster Fuller. There was a time when I spent a lot of time devouring Bucky Fuller’s writings—and pretending to understand most of what I’d just read. Two things in life I’m pretty certain of: (1) Buckminster Fuller was a genius. (2) Virtually no one really understands very much of what he had to say. A really gifted mind can understand a part of it. But by the time you understand that part, Bucky is off rattling the tea cups in some other authority’s buffet. Here, though, was a guy—Bucky’s patent attorney!—ready to show us how Mr. Fuller’s mind worked. So I snatched up Donald W. Robertson’s book (it’s only 109 pages long) and figured I was about to be handed the secret to deciphering one of the 20th Century’s most creative intellects. But no such luck. All that attorney Robertson knew was how to describe approximately how Bucky happened to think up an invention so it stood a chance of being awarded a patent. (Robertson’s applications weren’t always successful because sometimes the patent office attorneys didn’t understand Robertson well enough to understand if Bucky, on that occasion, could be understood).

Three more things in life I’m pretty sure of. No matter how many times you tell me “you can do it!” I’ll never be able to (1) draw a detailed likeness of you eating breakfast that will cause you to say, “That’s amazing!”, (2) take the SAT and get a perfect score (once, much less twice), or (3) look at much of anything with the kind of unique visioning capabilities of one of modern times’ most fascinating minds, Richard Buckminster Fuller’s.

The moral of the story: Please save your encouragement for my doing something reasonably doable, and something that I really want to do (and maybe that the world would benefit from my doing), and I’ll return the favor. Thanks!

P.S. Never pass up a chance to buy a copy of Mind’s Eye of Richard Buckminster Fuller. It’s a rare book.

_______________________
The above commentary has appeared previously on one of my blogs. I’m choosing to recycle it here because I think the points it makes are fascinating and important.


IF EVERYTHING IS PROGRESSING LIKE THE IDEA OF PROGRESS SUGGESTS IT SHOULD BE, WHY DOES IT FEEL LIKE THINGS ARE GOING WELL FOR ONLY A FEW?

One week a few years ago, I chanced upon two mostly forgotten books, and probably would not have spent much time with either had not both mentioned—on the very first page—an event that itself has been mostly long forgotten: the Century of Progress Exposition that the city of Chicago staged in 1933-34 to commemorate the 100th anniversary of the city’s incorporation.

In The Next Hundred Years: The Unfinished Business of Science, Yale University chemical engineering professor C.C. Furnas lost no time in pointing out how disappointing and overblown the Hall of Science at the Chicago event was to many astute visitors.

Among his observations:

“They [visitors] found most of the loudspeakers on the grounds sadly out of adjustment and the television exhibitions to be more imagination than vision. They saw the latest, swiftest and safest airplanes on display, but during the Fair one sightseeing and one regular passenger plane fell in the vicinity of Chicago killing an even score of men and women.

“They saw exhibit after exhibit featuring the advance of modern medicine but were faced with a preventable and inexcusable outbreak of amebic dysentery, centering in two of the city’s leading hotels, which claimed 41 lives out of 721 cases….They saw a motor car assembly line in operation but, if they investigated carefully, they found that as a mechanism for converting the potential energy of fuel into mechanical work the average motor car is only about 8 per cent efficient.

“They marveled at the lighting effects at night but, in talking the matter over with experts, they found that most of the lights were operating with an efficiency of less than 2 per cent.” There was much more—several more paragraphs, in fact—in the way of observations and cautions and laments from Professor Furnas based on his visit to the Century of Progress Exposition.

Bottom line to The Next Hundred Years: the Century of Progress wasn’t all it was cracked up to be.

Then I opened a copy of J.B. Bury’s The Idea of Progress and learned on the first page that the Century of Progress Exposition was partly why the Macmillan publishing house decided in 1932 to bring out an American edition of Cambridge historian Bury’s 1920 masterpiece of historical/economic analysis.

In it, Bury sought to pooh-pooh the idea that “the idea of progress” was a johnny-come-lately concept crystallized by self-promoting business people and thus a rather superficial invention. He traced the roots of the idea back at least as far as St. Augustine in the Middle Ages (not that Augustine was a father of the idea of progress but rather that he and other Christian Fathers booted out the Greek theory of cycles and other ideas that stood in the way of a theory of progress) and characterized the idea as one of those rare world-makers.

But even so, after 300 pages of trenchant, sometimes breath-taking reporting and analysis, Bury—on the final page of his book—cautioned that the Idea of Progress might not be all it was cracked up to be. After all, he argued, the most devastating arrow in the idea’s quiver was the assertion that finality is an illusion, that the truth is that what comes, eventually goes.

Bury wrote, “Must not it (the dogma of progress), too, submit to its own negation of finality? Will not that process of change, for which Progress is the optimistic name, compel ‘Progress’ too to fall from the commanding position in which it is now, with apparent security, enthroned?…In other words, does not Progress itself suggest that its value as a doctrine is only relative, corresponding to a certain not very advanced stage of civilization; just as Providence, in its day, was an idea of relative value, corresponding to a stage somewhat less advanced?”

Bury thought it might be centuries in the future before the Idea of Progress was dethroned and replaced.

But looking at an exceedingly rough start for the 21st Century, especially in America, it can be suspected that a persistent undercurrent of change may already be underway less than one century after Bury raised the question of whether the Idea of Progress was going to prove insufficient and undesirable as “the directing idea of humanity.”

Never in history have the shibboleths and ideals of the Idea of Progress been praised and promoted to the extent that they have in the U.S. in the past five years. And with each passing day, the conclusion seems to be more and more unavoidable: they are only working for a tiny part of our population, the very rich and powerful.

It is becoming more and more obvious that the highly stylized, sound-bite-polished, PowerPoint-presentation-perfected, U.S. flag-draped version of the Idea of Progress isn’t all that it was cracked up to be.

Which leaves us to wonder if the time isn’t much riper than we could have imagined a few short years ago for, if not the emergence of a new directing idea of humanity, then at least the beginning of the disintegration of the current one.

For as the late Peter Drucker argued in a book published in the 1960s that perhaps should be considered the third in a trilogy of works on this whole subject of progress, it appears that we may already be much deeper into an “age of discontinuity” than we had realized.

_______________________
The above commentary has appeared in a blog on another of my websites. I’m choosing to recycle it here because I think the points it makes are fascinating and important.
