I have a great excuse, however; I’ve been working hard on my latest book, Master the Language of the Universe, which is out now. Check it out and leave me a review. Doing so may improve liver function and help reduce the signs of aging*.
That’s all for now. Short and sweet, unlike this 370-page book.
* This statement has not been evaluated by anyone.
Assuming for a moment that the whole thing isn’t a hoax perpetrated by a talented and mischievous pianist, this case raises profound questions about the nature of human knowledge. It’s commonly accepted that we are born with instinct and nothing more. Most believe that—like clay sculptors—we acquire information and add it to our repository of knowledge, fitting pieces together into something useful through study and practice. Derek Amato’s case, like those of other acquired savants, calls this into question. Where did his knowledge come from?
Though acquired savantism is poorly understood and exceptionally rare, a few hypotheses exist, some more ridiculous than others. For instance, some posit that individuals like Mr. Amato draw their knowledge from a sort of (to use a Jungian term) collective unconscious. A fun introduction to the idea can be found in Richard Linklater’s film, Waking Life, when actor Ethan Hawke (cheekily reprising his character from the film, Before Sunrise) talks of a study involving crossword puzzles:
“They did this study where they isolated a group of people over time, and monitored their abilities at crossword puzzles in relation to the general population, and they secretly gave them a day-old crossword, one that had already been answered by thousands of other people, and their scores went up dramatically. Like twenty percent. So it’s like once the answers are out there, people can pick up on them. Like we’re all telepathically sharing our experiences.”
Those who buy into the idea of a collective unconscious would claim that, similarly, larger aspects of human knowledge such as language and music are also “out there,” waiting to be accessed. As an annoying skeptic (I prefer the term to “party-pooping asshole”), the idea of a collective unconscious doesn’t exactly fit into my worldview. As far as I’ve been able to tell, it lacks sufficient evidence to be taken seriously. As with anything, however, I’ll keep an open mind should new evidence present itself.
I’m not going to hold my breath.
Over the last few years, acquired savant syndrome has received more scientific attention. Some experts believe that when the left (logical, sequential) hemisphere of the brain is damaged, the right (creative, associative) hemisphere can sometimes pick up the slack. At the risk of oversimplifying the result, the right does the left’s job, but “does it in its own way.” Can this phenomenon (referred to as a “release of dormant potential”) result in sudden and extreme piano skills?
Surprisingly, it seems to be the closest thing to a reasonable explanation yet. Most savants tend to exhibit abilities that fall into a relatively narrow spectrum that includes things like music, calculation, and memory. Some neuroscientists and psychiatrists argue that these specific types of skills share an intrinsic logical foundation that may be instinctively programmed within us. This so-called “genetic memory” may be innate but muted, and the average person may simply expose it through experience. That is to say, we are more like sculptors of stone than of clay—with study and practice, we chip away at the stone, revealing inherent internal knowledge as time progresses. This theory would argue that both acquired and congenital (i.e., “from birth”) savants simply bypass the “chipping away” part.
Based on this, some scientists are apparently looking into ways to release these abilities without introducing brain trauma. Could future generations bypass years of sacrifice and practice, opting instead to undergo a small “rerouting” procedure that would expose prodigious musical, mathematical, or mnemonic abilities?
Though seemingly more plausible than the collective unconscious theory, this “innate ability” explanation seems incomplete to me. Still, acquired savants are real people who have somehow acquired real skills, and many seem able to reasonably prove that they didn’t have to work for them.
Savantism remains a fascinating (and often tragic) mystery, but it’d be great to be able to take a pill and play piano like Derek Amato. Sure, there’s a certain romance associated with the process of learning, but I think I’d get over it somewhat quickly as I played Rachmaninoff’s Piano Concerto No. 3 like I’d been slaving away over the keys for years.
I am very curious to see what the coming decades hold.
You’re motivated. You’re inspired. You have awesome ideas and want to act on them. However, the bigger your goals and the more you achieve, the more complex things become.
Managing Personal Complexity is about developing organizational strategies that allow you to stay productive and pursue the things you’re passionate about. Unfortunately, schools, businesses and mentors rarely teach you how to do so.
Managing Personal Complexity proposes a unique and holistic definition of time management and teaches you how to pursue your goals while minimizing stress and confusion. You will walk away with strategies and suggestions for building a highly personalized productivity framework.
Cost: Free – Sponsored by Rutgers University
When: Thursday, May 2, 2013 at 12 noon
Where: Rutgers University Camden, Executive Dining Room; 303 Cooper Street, Camden, New Jersey 08102
I hope to see you there.
The league all-star was Denny Walls. I remember him well. I tried to look him up recently, to no avail. Though he was only a kid, he looked like a grown man—at least in my memory. He could hit home runs, was an all-star pitcher, and could outrun anyone else in the league. I remember thinking about him once when I was back in my yard after a game, throwing a ball against the pitch-back net my mom bought me. I wondered what I was doing wrong that he was doing right.
Over twenty years later, I read Malcolm Gladwell’s acclaimed book, Outliers. In it, he mentioned something interesting about the relationship between birth date and sports success. The following excerpt from New York Magazine explains:
Relying on the work of a Canadian psychologist who noticed that a disproportionate number of elite hockey players in his country were born in the first half of the year, Gladwell explains what academics call the relative-age effect, by which an initial advantage attributable to age gets turned into a more profound advantage over time. Because Canada’s eligibility cutoff for junior hockey is January 1, Gladwell writes, “a boy who turns 10 on January 2, then, could be playing alongside someone who doesn’t turn 10 until the end of the year.” You can guess at that age, when the differences in physical maturity are so great, which one of those kids is going to make the league all-star team. Once on that all-star team, the January 2 kid starts practicing more, getting better coaching, and playing against tougher competition—so much so that by the time he’s, say, 14, he’s not just older than the kid with the December 30 birthday, he’s better.
When I originally read this, it blew my mind. Due to my birthday, I was the third youngest boy in my grade and consistently the youngest kid on any of my sports teams. A year’s difference can certainly be huge at such a young age, and it didn’t occur to me that this could have contributed to my mediocrity until I read Gladwell’s book.
Was my average-ness—which I had attributed to genetics, lack of drive, and a series of other factors—due in part to something as simple as my birthday?
Over the years, I slowly stopped playing organized sports and focused my considerable energy elsewhere, resulting in some significant choices and experiences that shaped me into the person I’ve become.
I can trace a loose path from the little league field to my life as a thirty-something. I had always been academically successful, but possessed an energy that consistently got me into mischief. After it became clear that I wouldn’t have a career in sports, I began to channel my energy into another love: music. I learned to play multiple instruments, joined bands, and started playing local shows. By sixteen, I found myself deeply involved in the local underground music subculture. I met many unique people and received a great education (most teens don’t have to worry about splitting van insurance five ways or the difference in profit margins between two t-shirt printers). Later, as a young adult, I toured the world several times over while playing with bands.
I’m no longer a touring musician and haven’t been for many, many years. But those experiences made me who I am. They affected my ambition, my self-reliance, my DIY ethic, and the way I interact with others and work in groups. They affected my sense of adventure and lust for travel. They taught me a good deal about trust, risk, and how to work through exhaustion.
Considering Gladwell’s comment, I asked a question: would I have had these experiences had I been better at baseball? I’d surely have been less prone to travel out of state as a youth to play a concert if I had baseball practice early the next morning. Had I stayed involved with baseball, I may never have gotten involved with music at all, let alone continued my involvement into adulthood.
Furthermore, I’m forced to consider that my birthday may have directly shaped not only my experiences, but my social circle. Almost all of my closest male friends from elementary school, high school, and college shared a path similar to my own. We all played sports, but never really excelled. We all found music, and through that shared social network became friends. I even met my wife through music.
Interestingly, a good number of these friends were born within a few months of me—so much so that we all celebrated a group thirtieth birthday party. Over the past decade or so (when a one-year difference no longer meant a massive difference in muscle mass, cognitive skills, or reach), many of us found ourselves involved with sports again. One friend became a competitive cyclist. Two joined an adult hockey league. I became very serious first about weightlifting and then about combat sports. With a head nod to Gladwell, it turns out we weren’t that bad.
I talked to a few people who followed through with sports into adulthood. They can trace a path from their early sports successes to their current situations as adults, and—no surprise—they overwhelmingly tended to be the older kids in their respective grades and teams.
Fascinating, sure, but why do I bring this up?
As I prepare to become a first-time father (as much as anyone can really prepare for such a thing), I can’t help but wonder: will my child’s life path be determined (or limited) by her birthday? If so, what date is ideal for producing an athlete or a mathlete? More importantly, what birthday will ensure that my child has the most options?
Most US states currently have a school year cut-off at the end of August or beginning of September, and my daughter is due to be born at the end of March, which should mathematically put her in the middle of the age distribution for a given class. This was certainly not a factor in planning the pregnancy, and while I will be the first to suggest that such planning seems ridiculous, I must admit that in light of Gladwell’s point, my daughter’s prospective March birthday puts me at ease.
We are armed with more data than any generation before us, and theoretically, you could attempt to plan your child’s birthday around your favorite paranoia. Consider the following: RSV (the most common cause of hospitalization in babies under 6 months old) peaks around the flu season. Should birth be planned around peak disease seasons to minimize the risk of complications? How about peak seasons of hawk attacks, newborn kidnappings, or heatwaves?
I pose these somewhat ridiculous questions to illustrate my point: though it’s certainly interesting to consider your or your children’s birthdays as potential factors in certain circumstances (such as struggling with sports), it seems a bit robotic to actually plan birth dates based on these types of factors. Call me old-fashioned.
That said, I propose a theory: The Jeopardy contestant who writes their name the largest is disproportionately likely to win the game.
I started to notice this a few years ago, but kept it to myself. After all, I’m not a regular watcher of the show, or any game show for that matter. Eventually, I began boldly predicting the winners before games started, and was enjoying an unusual amount of success in doing so.
After catching the show quite a few more times over the years, I can say that while it’s certainly not a sure thing, I’m willing to bet that the largest-name player wins measurably more often than the one-third of the time that randomness would suggest.
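For what it’s worth, the theory is testable in principle. If someone did tally games, a one-sided exact binomial test against the one-in-three chance baseline would tell you whether the largest-signature player wins more often than luck allows. Here’s a minimal sketch in Python; note that the 45-of-100 tally below is entirely invented for illustration, not real Jeopardy data:

```python
from math import comb

def binom_p_value(wins: int, games: int, p: float = 1/3) -> float:
    """One-sided exact binomial test: the probability of seeing at least
    `wins` wins in `games` games if the true win rate were only `p`."""
    return sum(comb(games, k) * (p ** k) * ((1 - p) ** (games - k))
               for k in range(wins, games + 1))

# Hypothetical tally: the largest-signature player won 45 of 100 games.
# A result this far above the expected 33 is unlikely under pure chance,
# so the p-value comes out small (on the order of 0.01).
print(binom_p_value(45, 100))
```

A small p-value would mean the big-signature effect is hard to explain by randomness alone; a value near 0.5 would mean the tally looks like chance.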
Writing this post has caused me to consider my definition of a “large” signature. Unscientifically, this is based partly on perceived size (taking into account “big” style and flamboyance), as opposed to sheer surface area, width or height.
If I’m correct, what is causing this phenomenon? Is it a matter of confidence? After years of being the smart kid—the dweeb, the outcast, the nerd—these individuals find themselves standing in the one place in the world where their eggheadedness is overtly celebrated. Overwhelmed with a sudden cathartic pride, perhaps these individuals decide that it’s time to stop downplaying their brilliance. “My name? I’m Jill, goddammit. I worked hard to get here. J-I-L-L. Underline.” In contrast, perhaps the ones with the tiny, understated signatures are those who stare out at the expectant crowd, hearts pounding and vision fading, thinking, “What am I doing here? I only got a B in AP History. Oh my god, look how big Jill wrote her name.”
In fact, maybe I should revise my theory to state that the person with the smallest signature is the least likely to win.
Anyhow, to make science, one needs data. I set out on an Internet odyssey to find information on Jeopardy. Given the nature of the show—and therefore its fanbase—I wasn’t surprised to quickly find the J! Archive, an extensive fan-curated history of the series, complete with painstaking details about every episode. Though I could easily locate information about each episode’s winner, I was unable to find any screen captures. After more fruitless searching, I tried to match a few random Google image search results with their respective episodes on the J! Archive, but was unable to do so (and such a small data set could easily suggest a pattern where none exists). So as not to taint the good name of science, I gave up on my quest.
So, until the J! Archive gets their act together, my theory shall remain a theory. However, the next time you sit down in front of the television for your nightly date with Alex, I encourage you to place a friendly wager against anyone who dares take you on.
If there’s one thing you should do to prepare for a catastrophe, what should it be?
I know a man who buries ammunition in waterproof cases around his yard. I knew a family who stockpiled gasoline and spent New Year’s Eve 1999/2000 inside their home, armed and waiting for looters in the wake of Y2K madness. Shows like Doomsday Preppers boast massive viewership. Rock-and-roll scribe Neil Strauss wrote Emergency a few years back, brilliantly detailing his journey down the slippery slope of survivalism.
And then there’s the genre of apocalyptic fiction. Doomsday scenarios have become increasingly popular as of late in film, literature, and television, with humanity meeting its end due to zombies, plagues, or—in cases such as Cormac McCarthy’s spare masterpiece, The Road—causes left entirely to the reader’s imagination.
The apocalypse has become a multimillion-dollar industry.
The doomsday preppers are good for an easy laugh and post-apocalypse fiction can be incredibly entertaining. Yet our intrigue seems to run deeper than entertainment; perhaps its roots exist in the fact that such lore reminds us that civilization is at all times only a single cataclysmic event away from breakdown and martial law. Perhaps, deep down, we all know we’ve gotten too comfortable. I remember a conversation I once had with a man who came of age during the Bosnian War. We spoke of something that I’d always known but rarely considered: Life could be relatively tame and sane one day, and then, a few short months later, you could find yourself in a hell on earth. To quote Strauss,
All it would take is one war, one riot, one dirty bomb, one natural disaster, one marauding army, one economic catastrophe, one vial containing one virus to bring it all smashing down. We’ve seen it happen in Hiroshima. In Dresden. In Bosnia. In Rwanda. In Baghdad. In Halabja. In New Orleans.
I’d be surprised if the average person living in the prime of ancient Rome gave much thought to its fall, and the United States is—chronologically speaking—in its very infancy. We’re not immune to civil unrest or breakdown. No one is. With this in mind, when I scoff at the doomsday preppers, smiling smugly along with everyone else, I can’t help but recall something an old man once told me: “You’re only paranoid until you’re right.”
This brings us to the theme of this post: making reasonable plans for worst-case scenarios. We take out insurance policies to account for relatively unlikely events, and yet most of us really have no sense of how to survive if the proverbial shit hits the fan, even though such insurance requires very little investment. Most of us don’t know what to eat in the wild. Most of us don’t know how to collect drinkable water. Most of us can barely defend ourselves, can’t use weapons effectively, and would make simple mistakes that would result in our freezing to death if left to the whim of the elements.
These facts, coupled with the rise in apocalyptic media, have spawned an entire industry of survival. All over the US, one can attend (urban or rural) survival classes, often taught by retired military personnel. This is great, but how much preparation is appropriate? In short, I believe the answer to be, “more than none.” I’m no doomsday prepper, but I think there’s certainly a reasonable minimum level of knowledge that everyone should have (or, at least, have access to).
Let’s first get self-defense out of the way. In many doomsday scenario films, shows, and books, battle is abundant; however, you’re not going to learn to fight or handle a weapon by reading a book or watching YouTube videos. Serious self-defense is a lifetime commitment, and some of the best training programs out there, such as Krav Maga, can be brutal as well as time-consuming. If you can find the fun in such brutality, go for it; it’s easy to argue that self-defense-related hobbies offer more “odds of surviving a disaster” potential than playing video games or watching television.
Due partly to the fact that I sincerely love it and partly to a poorly masked desire to prepare for the worst, I’ve been involved with combat sports since I was a teen; however, I am lucky enough to be able-bodied and relatively young, and these things can’t be said for everyone. Over the past five years of Brazilian Jiu Jitsu and combat grappling, I’ve suffered everything from black eyes and loose teeth to multiple torn ligaments and fractures. With that in mind, it’s perfectly understandable if this doesn’t meet your criteria for “reasonable” and you feel as though your survival efforts would be best directed elsewhere. Moving on.
You could learn to shoot. You could learn to dig trenches. Hell, you could slowly introduce common poisons to your system in order to prepare yourself for chemical warfare. But the key word we’re focusing upon here is “reasonable.” What reasonable measures can we take? We are parents, employees, brothers, sisters, tee ball coaches and crocheters, after all. There is only so much time in the day.
The US government suggests a first aid kit, water, and a few other goodies. While there are certainly benefits to preparing in this way, I argue that your best bet for survival “bang-for-your-buck” comes in the form of two books that explain exactly what to do if things do go down.
The first is called Tom Brown’s Field Guide to City and Suburban Survival. In it, survival expert Tom Brown lays the foundation for skills you should possess in the event of a catastrophic event in a major metropolitan area. Though the book is almost thirty years old at this point, the principles it teaches are priceless. It includes everything from harvesting water from the pipes of an abandoned home to the scenario I now find myself in (hurricane preparedness).
The second is Tom Brown’s Guide to Wild Edible and Medicinal Plants. If something drastic were indeed to occur, you’d eventually run out of food and find yourself competing with the hungry throngs for the leftovers at the local Shop-n-Bag. In a word, this book teaches you how to forage. As an alternative, you should also consider Identifying and Harvesting Edible and Medicinal Plants in Wild (and Not-So-Wild) Places by “Wildman” Steve Brill. I once took a “wild edibles” nature walk guided by Steve, in which he taught the group how to identify food sources in the wild (you can find out more at http://www.wildmanstevebrill.com/).
So, if you ask me, this should fall into everyone’s definition of reasonable. It’s two books, and they’re dirt cheap. This is the “more than none” that could one day save your life. You could easily argue that there’s no need to hoard gasoline, bury ammunition, or practice stabbing a dummy to death over a can of baked beans; however, there’s no shame in taking out just a little insurance. With these two books alone (get hard copies, as you’ll need electricity for a Kindle to work), you may find yourself in a much better place than most, should you find the world around you crumbling. Hell, you don’t even need to read them; just have them in your home in case you do need them some day.
…which actually seems likely right now, as the lights flicker yet again. Wish me luck. And to everyone else on the east coast, see you on the other side.
For example, imagine that you wake at 7AM, go through a normal day, fly out at 7PM (heading west), and arrive at your destination at 7AM. There you meet with friends, family, professional contacts or the like, and continue on without sleeping. You find a second wind. A third. A fourth.
I recently experienced such a day, and was reminded of just how much I cherish them. I find myself alone in this, as most people I know tend to hate them.
A friend recently asked me why I like these freakish mega-days.
I love the surreal sensory clarity that ensues; I love the sharpness of sunlight and depth of darkness. I love simultaneously experiencing exhaustion and jittery enthusiasm.
I love the otherworldly silence of my long-awaited hotel room and the dreamless, death-like sleep that claims me when I finally turn in.
The concept of User Experience Design has existed, in one form or another, for quite some time. Although nowadays much attention is given to its application to computer interfaces, its roots are in more physical, concrete applications, often referred to as “human factors.” According to the Human Factors and Ergonomics Society, the essence of human factors — and thus the root of UX Design — is “the interaction between human users, machines and the contextual environments to design systems that address the user’s experience.” From this concrete root I draw the comparison that lies at the heart of this blog post.
While visiting the Museum of Modern Art in Manhattan during the winter of 2010, I came upon an exhibit called “Counter Space: Design and the Modern Kitchen.” The exhibit featured both media and actual historical artifacts relating to the transition from the formative kitchens of centuries past to those which we now know and love in the modern western world.
While browsing the offerings of the exhibit, I came upon the following image:
Upon seeing the image, it suddenly dawned on me: This is the very essence of User Interface Design; this single drawing sums up exactly what is important about the craft. Anyone can technically dictate the layout of a kitchen (or website, or magazine page, or ATM interface…), but it takes skill, creativity, empathy, and problem-solving prowess to create a functional space that takes into account all reasonable applications while retaining aesthetic. A well-designed kitchen (or website, or…) should result in an efficient, elegant, attractive, and safe end product.
I’ve since referenced this very image when introducing inexperienced individuals to the concept (and importance) of User Interface Design. In doing so, I’ve often witnessed immediate understanding wash over their faces.
I suppose it’s true that a picture is sometimes worth a thousand words.
A few years ago, I moved to an area outside Philadelphia, and, after a few months, decided it was time to visit a dentist. I wanted to find someone local, as my previous dentist was now about forty-five minutes away. I searched online, and arbitrarily chose a practice that was both close to my home and well reviewed.
Upon entering the lobby, I was faced with the following painting:
Had I been there due to the mumps or some sort of threatening, throbbing abscess, this may have been a bit off-putting. I just hope that this was painted entirely from the artist’s imagination, and not from the actual likeness of a wriggling, pained child model.
The office was very nice, and awash in activity; even at seven in the evening, it was bustling with nurses, doctors, and patients. Due to the location, decor, and equipment, I deduced that the practice was doing quite well. I was promptly seen by a nurse, who took a series of about fifteen X-rays. Soon, the doctor arrived.
I had “a few cavities.”
For many, this may not seem odd. However, at the time, I was approaching thirty years old, and had never had a cavity in my life. Despite admittedly performing bare minimum dental care (brushing twice per day and a visit to the dentist every other year or so; no flossing, no mouthwash), I’ve always had perfectly healthy, strong teeth. Genetically, I had done well in the dental dice game (not so much in the eyesight one).
Legitimately, naively curious, I asked to see the X-rays. The doctor pointed to an X-ray, and, with the lid of a marker, traced wide, casual circles around my back teeth, and said something that she couldn’t possibly have expected me to understand without a degree in dentistry.
As a society, we have a tendency to trust dentists and other medical professionals in a somewhat unquestioning manner. Though “unconditionally” isn’t quite the word to use, I think it’s safe to say that we put more stock in doctors’ assessments than we do in those of other professionals and specialists who work within domains of knowledge with which the average person is unfamiliar. If something seems “off” with our mechanic’s assessment of our car (especially when a hearty price tag is concerned), we are likely to distrust or seek a second opinion. This is not so much the case with doctors. We trust that, with our health (and, at times, lives) in their hands, they are always going to “do the right thing,” and put our interests well ahead of the almighty dollar.
That said, if a dentist tells you that you have a cavity, would you question it?
I was experiencing no pain or discomfort, and could see nothing on the X-ray films that indicated anything out of the ordinary. In this case, given my lack of rapport with this practice, it occurred to me that there was a chance — paranoid as it may seem — that I was being swindled. Hustled. Gamed. Taken for a ride. Bamboozled. Having the ol’ wool pulled over the eyes.
With so many terms for the same thing, you’d think we’d be more conscious of the risk across all domains.
Conjuring up my most innocent, non-accusatory tone, I said to the doctor, “If I were to let this go and not have them treated, how long would it be before I start to notice pain or other symptoms?” Her response was, “Three months, tops.”
So, I waited three months. In fact, I waited well over a year, a time during which I received about a dozen mailers, a birthday card, and several phone calls from the office, hoping to schedule an appointment to remedy my alleged condition. Each time, I simply put it off.
Almost two pain-free years later, I went to see another dental professional: a local doctor whom my wife had begun to see. Without mentioning a word about my previous, questionable visit, I got a full checkup, cleaning, and suite of X-rays.
As you have probably guessed by now (unless you were expecting an M. Night Shyamalan-style twist), I had no cavities. After confirming a clean bill of dental health, I told her the story of my previous encounter. My new dentist confirmed several of my suspicions.
After this episode, I spoke with a few friends and asked whether they had ever had symptomless cavities treated; almost every single one had. Although I’d like to believe that the likelihood of corruption in the dental industry isn’t some rampant, unchecked plague, I can’t help but wonder what percentage of these procedures were, in fact, unnecessary. Though I’ve so far never had to undergo such a procedure, it is my understanding that they are generally far from pleasant.
It turns out I’m not alone in my experience. After scouring the internet, I was able to find quite a few individuals who claim to have gone through a similar ordeal.
This episode got me thinking — How much unchecked trust do we put in medical professionals? Insulated through our widespread use of medical insurance, it’s easy to forget that many doctors are paid quite well for the performance of outpatient procedures. I began to think back. As a young boy, I had had a series of questionable freckles removed. As an adult, I had had my wisdom teeth removed, as well as a tonsillectomy and septoplasty (correction of a deviated septum). Told that I had very limited airflow in my right nostril, I underwent the septoplasty, under the impression that it would improve my ability to breathe, lessen my seasonal allergies, and help with snoring. The recovery was tremendously difficult and painful; and the results? Arguably negligible. The snoring subsided for a few weeks, only to return again to the same extent. My allergies remain the same to this day. My breathing didn’t seem to improve (though it was never a problem to begin with).
My mind began to race; were these procedures flesh extractions or wallet extractions?
According to the Journal of the American Medical Association (JAMA), 12,000 people die every year as a result of “unnecessary surgery.” One could easily assume that in order to make such a bold claim, for each of these 12,000 individuals, a suit was filed or an investigation conducted, resulting in the court’s assessment that the procedure was indeed unnecessary. Most likely, quite a few more died from such unnecessary surgeries, but the individuals’ loved ones never thought them unnecessary (or simply failed to pursue the cases for one reason or another).
It seems likely that substantially more still lived through such surgeries, and to this day have no suspicion that they were anything but necessary. Have I been one of them?
Of these 12,000 reported deaths, the likely greater number of unreported ones, and the likely greater number still of undetected unnecessary procedures, how many were due to a misjudgment on the part of the doctor, and how many were due to his or her personal interests?
Just something to think about. Sometimes a second opinion is well worth the effort and cost.

With all teeth intact,
Matthew