






          Ancient Athens is called the birthplace of democracy, and the influence of its legendary government casts a long shadow.  Free, native-born men who owned property were known as citizens, and they were allowed to join the assembly on the hilltop called the Pnyx.  There they proposed laws and actions, spoke about issues in turn, and cast their votes to run their city-state.  The number of men willing to attend did not always meet the quorum, so rules were made to force people to show up.  The story is that slaves would carry a rope wet with red dye through the agora, and any eligible man whose clothing was marked by the dye had to head straight to the Pnyx or else face punishment.  This system of direct democracy was not perfect – it excluded women, foreigners, and slaves; it existed in a deeply unequal, slave-owning society; and participation was apparently lacking.  Flawed as it was, the Athenian ideal of direct democracy stood as a model through the ages.  As people suffered under the dictatorial rule of kings and emperors, they could read about a long-ago time when the opinions of ordinary citizens counted for just as much as the opinions of the elites: one person, one vote.  This dream of equality continued to kindle the Western imagination across the millennia.

          When modern republics formed, they looked back to Greece and Rome for inspiration.  From Rome they took the model of representative government divided into three branches with balanced powers, operating under a constitution.  From Greece they took the democratic aspect of their systems.  Every eligible voter cast a ballot, and these were tallied equally: one person, one vote, independent of wealth.  These votes were not for policies or laws; they were votes for politicians to represent the people in government and make the laws and policies for them, supposedly reflecting the popular will.  Since the population of these republics was too large to gather regularly in one place to discuss issues and make decisions, it was assumed that Athenian-style direct democracy was impossible.

          Over the last two centuries, a number of important improvements have been made to our “democratic republics.”  Property requirements for voting were eliminated.  Suffrage was extended to women and, eventually, to people of all races.  The Progressive Era created avenues for citizens to directly affect legislation and office-holders through the initiative, the referendum, and the recall.  In Australia, voter turnout has been over 90% for the last century because voting is compulsory, with fines imposed on those who fail to vote.  Some might complain that this sort of rule takes away a person’s freedom, but should we really have the right not to participate in our own governance?  What kind of democracy is it if people don’t vote?

          Voter participation remains a problem.  Without automatic enrollment, political games ensue as parties attempt to disenfranchise those likely to vote for their opponents.  Polling places and voting requirements are adjusted to make it harder for certain people to vote.  Even where there is no barrier to registration or the polls, many people do not feel involved or motivated enough to properly maintain a citizen-run government.  In most countries that hold elections, somewhere between 25% and 50% of the eligible electorate fails to cast a vote in any given election.  In close elections with a small turnout, we may in some cases be allowing a quarter of eligible voters to decide things for the rest of us.  These problems come on top of the influence of money in politics, corruption, lobbying, gerrymandering, media distortion and propaganda, the ignorance of the electorate, and so on.

          The solutions to our plethora of problems must be many-sided and nuanced, but one big part of the answer can be found by looking back to ancient Athens, where it all started.  Our assumption since the 1700s has been that direct democracies can’t work because our modern populations are too large.  This might have been true 250 years ago, when people had to physically gather to discuss issues and vote, but today, with modern computer technology, direct democracy is entirely possible.  If all citizens 18 or older were automatically registered to vote and required to be involved, online forums could be created that allowed everyone to learn about issues, propose legislation, discuss and debate, make amendments, and finally cast their votes.  Systems could be designed to make votes verifiable and tamper-evident, so that large-scale fraud would be next to impossible.  Voter participation would be near 100%.  There would be no more sense of alienation among the electorate.  There would be no more political gridlock and no more government inaction in defiance of the people’s will for immediate action.  If cultural expectations made it clear that everyone needed to be knowledgeable about current issues, and if educational systems and community practices evolved to facilitate this, then the actions of the government and the will of the informed electorate would be one and the same.  In a virtual sense, we can now all meet on the Pnyx to make our voices heard.  Of course, we can’t be everywhere at all times.  We can’t know everything and follow every aspect of government workings.  Systems would need to be devised to create committees (staffed by volunteers, on a rotational basis, or according to whatever protocols the majority determined to be best) for different projects or types of legislation, as well as a fair justice system.
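          To give one small illustration of what a tamper-evident vote record could look like, here is a minimal sketch in Python using only the standard library.  The function names and the hash-chain scheme are illustrative assumptions, not a blueprint for a real election system (a real system would also need voter authentication, ballot secrecy, independent auditing, and much more); the point is simply that chaining each entry to the hash of the previous one makes any after-the-fact alteration detectable.

```python
import hashlib
import json

def record_vote(ledger, voter_id, choice):
    """Append a vote as a hash-chained entry.  Each entry stores the
    hash of the previous entry, so altering any earlier vote breaks
    every hash that follows it."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"voter": voter_id, "choice": choice, "prev": prev_hash}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    ledger.append(entry)
    return entry

def verify(ledger):
    """Recompute every link in the chain; True only if no entry was altered."""
    prev = "0" * 64
    for entry in ledger:
        body = {k: entry[k] for k in ("voter", "choice", "prev")}
        payload = json.dumps(body, sort_keys=True).encode()
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = entry["hash"]
    return True

ledger = []
record_vote(ledger, "citizen-001", "yes")
record_vote(ledger, "citizen-002", "no")
assert verify(ledger)          # the untouched ledger checks out
ledger[0]["choice"] = "no"     # tamper with the first recorded vote...
assert not verify(ledger)      # ...and verification immediately fails
```

This is only one ingredient; anyone (a citizen, a watchdog group, a rival party) holding a copy of the ledger can run the same verification independently, which is the property the paragraph above gestures at.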

       Technology-enabled direct democracy is the only way for us to move forward and realize the dream of an effective, responsive, and just government.  In such a system, the people themselves would be the government.  They would no longer vote for others to represent them; they would primarily vote for policies and laws, and only secondarily select the right people to carry out those policies and enforce those laws (not politicians, but technicians, engineers, law enforcement officers, court officials, etc.).  Computer technology and the internet are capable of doing far more than the tasks for which we currently employ them; they can revolutionize government and allow us to combine the promise of ancient Athenian democracy with the norms of a multiracial, multicultural, modern society.  We can break out of our current cycle of apathy, corruption, stagnation, gridlock, inefficiency, voter disenfranchisement, and general inability to deal with the problems that face us.  But this can only happen if we redesign our systems of government to empower the people themselves to run the show.




    The suggestion that we will soon enter a Golden Age sounds too incredible to believe.  As a historian, I want to point out that incredible events have made up our world from start to finish.

    Never mind the odds that existed against the formation of the universe, stars, planets, the protective magnetic field around the earth, and the development of life on our planet.  Never mind the amazing pairing of mitochondria with early single-celled organisms to create the complex cells from which all plants and animals are built.  Never mind the unbelievable growth of pre-human cranial capacity.  Let’s just look at a few of the events in recorded history that would have seemed like fantasy prior to their occurrence.

    In ancient Greece, when the massive Persian Empire invaded, any outside observer would have written off the little Greek city-states.  The Battle of Marathon and the Battle of Thermopylae irritated and humiliated the Persians, and then the Battle of Salamis damaged their navy to the point that resupplying their forces in Greece became difficult.  Finally, the Greeks banded together and routed the Persians in the fantastic victory at Plataea.  Then, a century and a half later, Alexander the Great had the gall to turn on the Persian Empire.  After winning two battles against the Persians, he faced an enemy army perhaps five times larger than his own puny force at Gaugamela.  In a battle that defies logic to this day, he won against all odds, and Greek culture was extended to Afghanistan and the Indus River Valley.  Because of this cultural mixing, Buddhist statues in Tibet, Vietnam, China, Japan, and Korea all show the influence of the realistic Greek style of art.

    A hundred years later in China, a look at the map of the time would have led one to believe that the large state of Chu would probably take over the other kingdoms and unify the Hua people.  If not Chu, the victor would probably have been Qi.  Against all odds, the small, backward, semi-barbaric state of Qin conquered all the rest.  The short-lived Qin Dynasty collapsed into chaotic rebellions, and in the end, the two main rebel leaders were Xiang Yu, a young man from a Chu military family, and Liu Bang, a commoner in his 40s with a force one-tenth the size of Xiang Yu’s.  Over years of fighting, Liu Bang wore down Xiang Yu’s forces, and this man, the least likely ruler you could imagine, established the Han Dynasty, creating the foundation for all the Chinese dynasties to follow.  To this day, the people of China call themselves “Han people” after the name he invented for his dynasty, and Liu Bang is known as the “Great Ancestor.”  Chinese writing was codified at this time and is still known as “Han” characters.  Confucianism, which was almost destroyed under the Qin, became the state philosophy of the Han Dynasty, and it has influenced all of East Asia to this day.

    If we went back to early Italy and looked at a map, we would expect the Etruscans to dominate the peninsula.  Rome barely existed.  Then this little city conquered its neighboring city-states and forged alliances of a new sort, offering most of the rights of citizenship to its one-time enemies.  Rome was then both a city and a republic spanning many cities.  Through sheer force of will, this republic, and later empire, expanded to dominate the area from England to Israel, and from the Black Sea to Morocco.  If we looked at the religion of the empire, we would pity the poor, persecuted Christians, never expecting them to triumph.  Yet through persistence and slow conversion, this small sect turned the tide and became the primary religion of the whole region, and later the largest religion in the world.

    If anyone had looked at the map at that time and attempted to predict where the world’s next major religion would come from, the best guesses would have been heavily populated areas like China, India, or the Mediterranean.  Yet, lo and behold, about a hundred years after the fall of Rome, in the sparsely populated desert peninsula of Arabia, Muhammad claimed to have received a revelation from the angel Gabriel.  The religion of Islam exploded onto the scene in the most unexpected way, taking both the exhausted Persian Empire and the Byzantine Empire completely by surprise.  Within a few hundred years, this new religion extended from the Atlantic Ocean to the Indian Ocean, and later all the way to the western Pacific.

    After the fall of Rome, the Middle Ages were a time of interesting contradictions.  Europeans were backward in many ways but thirsty for knowledge.  They discovered, of all unlikely things, the lost writings of Aristotle (translated via Arabic into Latin) and began to objectively question everything, including the Church.  Europeans were technologically far behind the Chinese and the Islamic world, yet they were so in love with contraptions that by the 1300s – long after the Chinese had dismantled their own giant clock (the first of its kind in the world, made in the 1000s) because they didn’t know what to do with it – each town in Europe clamored for its own clock in the town square.  The simple metal spring, ironically invented by the Romans (the least inventive people of all), was transformed into a coil and used to hold and release energy slowly, like a battery.  When used to power a clock, this opened up huge potential for miniaturization, and soon European engineers were making the most finely tuned precision devices in the world.  The Renaissance was another unlikely surprise that no one could have predicted or imagined.  It created the idea of progress and, along with the Aristotelian view of the world, led to the Age of Reason and the development of science.

    The English in the 1200s were the last people in the world one would have expected to create a democratic parliament, but because of the barons’ indignation at the king’s excessive taxation, the Magna Carta was signed in 1215.  The ideas of English liberty expanded until they were turned against the mother country by the colonists in the American Revolution.  When the shot heard ’round the world was fired on Lexington Green, it must have seemed a foregone conclusion that the Americans would be crushed by the mighty British army.  Like Alexander the Great, George Washington had that blend of magic and luck needed to survive and win against all odds.  From the Revolutionary War onward, the story of the world has largely been the story of expanding American influence.  Once again, the speck on the horizon that no one believes will materialize has become the foundation upon which all that follows is built.

    Ordinarily, the safest bet is to expect that tomorrow will be pretty much the same as today, but if history teaches us anything, it is to expect the unexpected.  Sometimes drastic, sudden changes do take place.  According to numerous prophecies, we are living in one of those times right now.  The first step is believing that positive change is possible.  Don’t fall into the trap of thinking that we cannot solve our problems and create a Golden Age.  We can do whatever we put our minds to.  Once the idea exists, it is only a matter of taking action to make the dream a reality.  As it says in Acts 4:11, the stone which was rejected by the builders will be made into the cornerstone of the foundation.  The people of the future will thank us for it.




        The culture wars rage over public spaces.  What books should be in our public libraries?  What should be the policies for our public bathrooms and sporting events?  What should the curriculum be in public schools?  How should public accommodations laws be interpreted when religious views come into question?  The battle lines of these arguments are based on our current conception of what is public and what is private.  As it is, the only private spaces are your own house, your car (to a lesser degree), and your own mind (so long as you keep your mouth shut in public).  Everything else in the world, from the sidewalk line onward, is pretty much public space.  But what if there were an entirely new paradigm for separating public and private, so that we had many new layers between what is purely private and what is fully public?

        If we redesigned our socio-economic system to create democratically run cooperative communities, and people joined these communities voluntarily according to shared beliefs, we would be living among people who agreed with us in most ways.  Let’s look at some examples of the cultural norms that would meet with the approval of different groups.  In Pennsylvania, the Amish are conservative, pacifist Christians who refuse to use modern technology.  In Canada, Hutterite communities are conservative Christian communes.  They are a bit more open-minded than the Amish in the use of technology, but are not willing to accept any non-traditional culture, especially where gender roles are concerned.  Some church communities are more liberal and endorse same-sex weddings.  People who belong to the Humanist Society might form a community that was atheist and focused on scientific and artistic accomplishments.  This is only the tip of the iceberg.  The full spectrum of human cultural opinions is vast: it includes all religions and all aspects of life from diet to education.  If you put conservative Hutterites and progressive Humanists in one community and asked them to agree on dress codes, cultural norms, and a school curriculum, it would be an impossible mission.  This is partly why the Hutterites form their own communities.  Understandably, they want their own space.  This may seem strange to most of us, but the truth is that we should all take a cue from them.

        While I strongly urge people not to create insular groups like the Hutterites, but instead to form intentional cooperative communities where members of all races and religions are represented, it is also true that adequate space and good fences make good neighbors.  The humanist family who wants to provide a non-judgmental environment where their children are free to determine their own identity should be allowed to do so, but not at the expense of the Amish family next door that does not want their neighbor’s liberal views to be normalized in their household.  Both have a right to live as they see fit.  Until people are 18, their parents have the legal authority to determine their upbringing, and parents should be allowed to form the best community and culture possible with like-minded people.  It would be relatively easy to find people who shared your cultural attitudes but came from a broad selection of religious backgrounds: Christian, Muslim, Hindu, Buddhist, Sikh, etc.  You could find people from all of these religious groups who were either (A) vegetarian, anti-firearm, and LGBTQ-friendly or (B) meat-eating, pro-gun, and committed to traditional gender roles.  The same could go for policies about technology use, social media use, cell phone use in public, frequency of shared meals, dress code, acceptable language in public, the amount of community service work required, shared traditions, expectations about stating one’s pronouns (or not), and the bylaws of the community in general.  We do not have to live in places where some people can swear loudly in front of kids or wear a shirt with a vulgar word written across it (unless we choose to live in such a place); we can make shared covenants with our neighbors and hold each other accountable.

        These co-op communities might or might not be fenced off and gated, but the important thing about them is that they would provide a new kind of zone between our old definitions of public and private.  When outside your home but within the boundaries of your co-op, you would be in public, yes, but only within the community of your peers and friends (plus guests).  The rules for behavior would be those agreed upon by all members of the community.  Everyone would be on a first-name basis with everyone else.  If you saw something that upset you in your community, you would not just turn away, and neither would you call the police.  You would bring everyone's attention to the issue, go over the shared covenant, and get the issue resolved democratically.  This is what you could call a meta-private space: not your own house, per se, but shared property that functions according to an agreed-upon set of rules.


       If a co-op was part of a network, the network could create institutions to serve only its members.  These could be educational, recreational, medical, research-related, etc., creating even more options for spaces where people can be with others who agree with them about codes for living.  Imagine universities or other institutions of equal size created by and reserved for the use of the group or groups that supported them.  A retreat center could be as spectacular as Hearst Castle, made for use on a rotational basis.  Conference areas could be combinations of hotel, garden, and inspirational architecture.  Roman bath complexes could be re-created.  Projects could be undertaken to build historical re-enactment zones for students.  These would operate according to rules and norms that suited their members; they would not have to try to please everyone’s sensibilities.  In more conservative areas, people would not expect one another to state their pronouns.  In some of these zones, it would be understood that all the food would be vegetarian and organic.  Some spaces would have more of a rustic, cowboy, hunter’s-lodge atmosphere.  Some would be for nudists.  Others would mix and match preferences in new ways according to the wishes of their supporting communities.  These sorts of zones would be another layer of the onion, what we might call mini-public spaces.  You would be with people who agreed with you in general about certain things, but not to the same degree of specificity as the members of your own co-op.  With the option of getting your entire education, your employment, and much of your business needs fulfilled within the bubble of these sorts of spaces (if you so chose), many of the conflicts and lawsuits we experience today could be avoided.  Sometimes good fences make good neighbors.

       In the larger environments between these community spaces would be truly public spaces, as we know them today.  These would be places where democratically determined rules had to be followed, and where people with vastly different attitudes toward life had to get along together.  The culture wars we have today would be lessened because everyone would have more spaces in which to live according to their own set of values and beliefs.  Instead of asking people whether they approve of a law – essentially asking what standard they themselves think is appropriate – and then passing laws to force everyone else to live by that standard (laws which would have to be followed on co-ops as well), the more appropriate question would be, “Is this issue so crucial to protecting human rights that I must limit other people’s freedom in this particular area by making a restrictive regulation?”  If the answer is no, but we personally feel that something is wrong, we can refrain from making a law against it, but still consider making a co-op rule against it.  One good example is the prohibition on eating meat that once existed in Japan.  A dietary rule should be a personal or group decision, not a nationwide decision.  If your co-op declines to pass a rule that you feel strongly about, you could consider changing co-ops to find a more suitable place.  You will never find a community that fits your preferences 100%, but in this proposed system of designer co-ops, you can come close.

       We all like the idea of live and let live – right up to the point when our neighbors do something that irritates us.  As the culture wars rage, it seems that we are attempting to legislate morality for each other.  Political theater is becoming the norm, and performances of outrage to either gain viewers or posture for one’s political base are what pass for civil debate.  When we live in places that are not communities with shared values, where our agreed-upon norms end at the property line, everything in the remaining public sphere must be determined in this blunt, inexact way, driven by forces that are not able to lead us to good decisions.  The system is like a watch repair shop whose only tool is a sledgehammer.  As it is, everything is either black or white.  Today most communities exist in name only.  Until we redesign our communities to make them more than just collections of houses with people who don't know each other or have anything in common, there can be no nuanced shades of gray.  We need a spectrum of community types that reflects the diversity in the population.  People need more space to be themselves without having to reach a consensus with others who think completely differently.  Then it will be a lot easier to live and let live.




        When moving to a new place with kids, one has to be aware of the schools in the area.  That’s because, unless you are rich, you will probably be sending your kids to a public school, and these operate according to attendance areas.  You are not asked which school you want to attend; you are told which school your kids will attend.  You might prefer a STEM school, an arts-based school, a history-based school, etc., but these choices are not offered.  As in the old Soviet Union, you get no choice.  Why is it that Americans become so upset over the requirement that they have insurance so they can see a doctor, calling it socialized medicine (although it leaves them a choice of which doctor to see), but have no qualms about being told they have no choice in which taxpayer-supported school they will attend?  It is all determined by your address.  In Moscow on the Hudson, Robin Williams played an immigrant from the U.S.S.R. who had waited in line for coffee in the old country.  When he saw the selection of coffee in an American supermarket, he swooned in disbelief.  Strangely enough, we are not strangers in a strange land; we are Americans living in America, and when told there is only one flavor of education (which we have to pay for ourselves through taxation), we are okay with that.  Are you okay with that, comrade?


        With the exception of those who live in spots where a few charter schools have been set up as a token gesture (or as a way to weaken unions), the vast majority of us have no real choice when it comes to education.  The idea is that the people themselves decide by electing school boards, state superintendents, etc., but the reality is that after the system was set in motion, it is easier to steer an iceberg in the opposite direction than to make serious reforms to the public school system.  Parents have expectations of child care, unions have contracts, millions of employees are in place, thousands of schools are up and running, and so forth.  The system isn’t about to go from having one flavor of ice cream to offering 31 flavors overnight.  But if there is no choice in education, how can the system possibly meet all the needs of its students and their families?  We are trying to accommodate parents with political beliefs all over the spectrum and students with interests and learning styles that run the gamut.  How can we give everyone the best experience possible in a single classroom?


        When Europeans created universities in the Middle Ages, there were two main models: the Paris model and the Bologna model.  In Paris, a group of teachers got together and offered a course of study to students.  When the students passed the teachers’ exams at the end, they received a certificate called a degree.  In the Bologna model, groups of students came together and hired teachers to tutor them so they could receive their degrees.  In the Paris model, the teachers made the rules, and students who didn’t follow them could be punished or expelled.  In the Bologna model, the students made the rules.  Lifespans were shorter then, and college students were younger – usually in their early teens.  One of the rules the students wrote into their system in Bologna was that teachers had to stop teaching the instant the bell rang to end the class – or else be punished.  Today, we have a weird, hybridized system in which the administration of a school (and district) makes its own rules but, since it is funded by the public, is subservient to the whims of the public.  Professional educators are told what to do and how to do it by people who have never tried teaching and have no degrees.  Under the Tenth Amendment, control of education is reserved to the states and the people, and yet this power to shape your region’s schools is exercised through such Rube Goldberg contraptions that it has resulted in a one-size-fits-all school system that fits no one perfectly.


        If I go to a restaurant and I dislike anything about it, I have the freedom to avoid that place again.  I may never find a restaurant that fits me 100% perfectly, but I can keep looking.  In the end, I have to either settle for the best of all available restaurants, cook at home, or open my own business.  This is the beauty of the Paris system.  It is designed by professionals.  They have real-world experience and believe in what they are doing.  They offer services for a fee, and if you don’t like it, you can go somewhere else.  With competition, there is a wide selection, so the students and their families have plenty of choices.  The teachers are also the administrators, and, like restaurant owners, they must be concerned about content, quality of delivery, and customer satisfaction.  If we tried to guarantee that no diner would be left behind by eliminating supermarkets, requiring that everyone eat at their local corner restaurant, and then having all the diners in that sector vote for the chef they liked best, you can imagine how well that would turn out.  If all restaurants were run by the government’s Department of Food, this would not be a very good solution either, but this is essentially what we do in America, the land of the free.


        We all love choices, so why do we surrender this freedom of choice when it comes to education?  Why not allow teachers to group together according to the educational system they believe in and let parents choose among them?  Students would opt for fun and ease, while parents would emphasize test score improvement and college admission rates; these negotiations between parents and students could be settled in the home.  Once a student arrived in class, it would pretty much be (from the teacher’s point of view) my way or the highway.  Students would not cuss the teacher out on Monday and then be back in class on Tuesday.  Students who didn’t want to be in a particular school could go elsewhere and make room for someone who did want to be there.  Schools would have the right to refuse service to students who could not follow the rules, and new schools, more focused on discipline or using unorthodox approaches, could be created for students who couldn’t find any other school that fit.  Where there was no demand, supply would shrink or adjust accordingly, and where there was demand, new supply would rise to meet it.


        All of this would work most smoothly in a society based on intentional co-op communities.  In an environment like this, people would have more equality and group assistance, so the system would not devolve into elitism.  Schools would be smaller and better-suited to meet the needs of students and families.  The focus would be on making sure every student got 90-100% of the skills and/or material under their belt before they moved on.  Staff would make certain that student attitude, behavior, and citizenship were excellent, instead of being forced to tolerate egregious abuses because the conveyor belt is moving too fast and there are too many limits on the interventions that can be applied.  Instead of resembling prisons, schools of the future could be a cross between Disneyland and an interactive science museum.  Students would either be fully engaged or else they would move on to another school that fit them better.  Students would no longer be allowed to be disruptive, disrespectful, or just take up space and do no work.  If they learned no more in school than the fact that these basic rules were non-negotiable, it would in many ways be a massive improvement over the current system.  People who grew up with an education like this would probably know more when they finished high school than the average college graduate does today.  Isn’t that the point of an education system: to educate people and improve over time?  Most states have been doing things the same old way for almost 200 years now.  Does it look to you like things are getting better?

Julius Caesar



        You may have noticed that the election (or appointment) of a single person can radically alter policies that affect hundreds of millions of people.  These small, seemingly random choices can mean the difference between life and death for many.  For others, they can mean tremendous changes in lifestyle, often in the most negative sense.  Whether a position of power is held by candidate A or candidate B (who holds diametrically opposed views) comes down to voter turnout, media spin, voter suppression, gerrymandering, various dirty tricks, and social media algorithms.  It seems we might as well just flip a coin.

        The problem isn’t democracy; it’s a lack of democracy.  It’s true that the people could be better informed and educated, but they do have opinions, and those opinions should be valued.  We have a system wherein we select people to run the government for us, and then we are forced to choose between them in a two-party system where most people feel left out.  We select between the two top candidates based on a combination of their policies and their personalities.  Unfortunately, it is human nature to pick based on image.  We assume that the image reflects the person’s true personality.  In actuality, the image the public sees has usually been crafted by public relations managers, and may be very misleading.  The policies of these politicians are often unstated, unclear, based on connections, formed by telling the base what they want to hear, totally impractical, or totally arbitrary.  The leaders, once elected, are under no obligation to actually follow through with the policies they promised the electorate.  The whole system seems more suitable to the 1800s than the 2000s.

       With today’s computer technology, we should vote for policies, not people.  When we have decided in a democratic fashion on the policies we want, we can then hire or appoint people, according to a predetermined system, to enact these chosen policies.  Today, an issue as momentous and morally charged as whether or not America will continue to support a democratic Ukraine against Putin’s invasion may be entirely decided by our choice for president.  Add to that the ability to appoint judges, control the military, propose a budget, wield veto power, etc., and you begin to see how our system is like a giant roulette wheel.  This is far too much power for one person to have.  These matters are too important to be bundled in one person’s personality and decided all in one vote.  When we marry someone, we must accept a package deal.  When we craft public policy, economic policy, foreign policy, the political persuasion of judges, etc., it does not have to be done this way.

       Why should one person be the standard-bearer for all those who hold a certain political view?  This runs the risk of creating a cult of personality that eclipses both party and belief.  The personality of a party chief becomes all-important.  Their successes or failures, witticisms or foibles, cleverness or stupidity, wisdom or psychoses all become inflated.  Everything they do or say becomes a battleground for partisan propaganda.  Those who are allied to the emperor cannot bring themselves to point out that the emperor has no clothes, even when he strides down the avenue in his birthday suit.  Objective truths become a subjective set of “alternative facts,” and we reach a twilight zone state where we can agree on nothing and public officials get death threats for merely doing their jobs.  Whether it is the sun or the moon in the sky depends on which party you ask and which ruler is in charge at a given time.

       Julius Caesar is one of Shakespeare’s greatest plays.  The complex character of Caesar and the dilemma facing Brutus make it a classic, timeless tale.  Caesar is caring and magnanimous, but he is also an egomaniac and a dictator.  At what point does concern for the future of one’s country outweigh personal loyalty?  What will happen if one chooses to act against a potential tyrant?  What will happen if one chooses not to act?  Are the fears of tyranny overblown?  Caesar is a patrician, but he is loved by the masses.  It is ironic that his main supporters are those who have nothing in common with him, and those who fear him the most are members of his own class.  His assassination was the most famous in history.  It was the subject of Marc Antony’s famous funeral oration, where he tactfully spun the incident not as the just execution of a tyrant, as Brutus claimed, but as the murder of a loving and just leader.  Caesar’s death is controversial even today.  Was he killed because he was a dictator for life, threatening the republic itself, or because he offended the elite by threatening to reapportion wealth and power in favor of the poor?  In 44 B.C., Romans had to take sides.  They had to be either for Caesar (and with Antony and Octavian), or against him (and on the side of Brutus and Cassius).  In the 2020s, we should be able to run our countries in a more nuanced way.  We do not have to be either for Caesar or against him.  From here on, we can do without any more Caesars.




       Once upon a time in America, there used to be an FCC regulation called the Fairness Doctrine.  If a radio or television station ran a story with a slanted viewpoint, a person or political party with an opposing view had to be given airtime to make a counter-argument.  The doctrine was repealed in 1987 on the grounds that it limited broadcasters’ freedom of speech.  Once upon a time, there was also an agreed-upon, common-sense understanding that “news” meant a reality-based description of current events, but this understanding has disappeared as well.  Certain people and corporations have played fast and loose with the gentleman’s agreement that a news agency must actually contain news - and report it in an unbiased fashion.  Now, the fox has moved into the henhouse and turned it into a propaganda factory with the label “news” still on top of the signboard.  When there is an event that everyone is covering (a fire, a hurricane, an earthquake), these propaganda stations cover it about the same as everyone else.  This occasional correspondence between their reporting and the undeniable reality on the ground lends credence to the lie that they are reliable reporters of the facts.  Whenever there is a lull in breaking news, or whenever they can slip in some commentary, it is so one-sided that it resembles what viewers in Russia see on their all-Putin, all-the-time, government-run stations.  It isn’t balanced, and it often isn’t even grounded in reality.


       You would think the public would be savvy enough to resist being fooled by this kind of skewed, partisan reporting, but you would be wrong.  As in the movie Anchorman 2, a large audience was sucked in by waving the flag and telling people what they wanted to hear.  People started following certain on-screen personalities and believing everything they said as if they were infallible deities, even though their talking points came straight from the corporate desk above and had only a tangential relationship with any objective reality.  It quickly became clear just how easily people could be brainwashed by the talking heads on their screens to believe just about anything.  In 2020, Fox "news" won a defamation lawsuit because the judge agreed that no "reasonable viewer" would take what the big-name host said seriously.  This may make legal sense, but the real-world problem is that tens of millions of viewers do take what these folks say seriously.  These are grown-ups who should know better but don't, and they can cause quite a ruckus when they are outraged by phony stories meant to outrage them.  Over the last few years, four dangerous lies were driven home on right-wing fake news channels that a frightening number of people have accepted as true.  First, by repeating over and over that there has been a lot of voter fraud (when in fact there has been almost none), citizens were made more accepting of restrictive laws, ostensibly to protect the ballot, which in actuality made it harder for targeted segments of the population to vote.  Second, the validity of the 2020 presidential election was repeatedly challenged despite over 60 court rulings that found no evidence in support of these claims.  The net result of this disingenuous haranguing was to create a violent movement to overthrow the government in order to save the country, as we saw on January 6, 2021.  
Third, their racist nonsense about a “replacement theory” has led crazy white men to shoot people of other races in shopping centers.  Fourth, their lies and misrepresentation of what they call “Critical Race Theory” have created a rabid reaction against a thing that does not even exist.  As Hitler said, if you repeat a lie often enough, people will believe it.  Fox “news” mentioned what they called “Critical Race Theory” over 1,900 times in less than three and a half months in 2021.  There actually is a thing called CRT, but the real CRT is the study of how government policy has worked against minorities in the past.  These policies were very real, right through the 1900s, from Woodrow Wilson’s firing of all African-American employees in the federal government, to redlining by government policy to prevent people from borrowing money to buy homes in areas where African-Americans lived, to the policies that made it harder for African-American farmers to borrow money.  This study of CRT is pretty much restricted to law schools and postgraduate studies.  No one outside of that had ever even heard of it until Fox took the title and redefined it, based on their own wild imagination, to try to rile people up.  They say, with no evidence except their own echo, that CRT calls all white people racist and teaches kids to hate their country.  Politicians then followed suit, treating the make-believe thing as if it were real and, like "Professor" Harold Hill in The Music Man, getting people all worked up over nothing.  The parallels with Putin’s Russia are startling.  In Russia, non-profit agencies working to document Stalin’s atrocities have been shut down.  The reasoning for this is part of a nationalistic propaganda narrative that portrays Stalin as a hero who saved Russia from the Nazis.  To maintain a simple, easy-to-understand storyline, any facts that do not fit this narrative must be edited out.  
This means anyone who focuses on negative aspects of Stalin’s rule must be anti-Russian.  The facts that the people in the agencies were all Russians who love their country, and that Stalin was a psycho who ordered millions of innocent Russians to be imprisoned and executed are considered irrelevant in Putin’s eyes.  Similarly, the Fox-invented definition of CRT is the one we mostly use in what passes for our national discourse on their pre-fabricated “issue.”  People make passionate statements about a thing that does not exist, others react to these statements as if the non-existent thing were real, and on we go, with no one ever pausing to make it clear that the entire discussion is based on fantasy and disinformation.  Twenty-eight states have even passed legislation to ban the imaginary version of CRT.  They might as well pass laws to protect themselves from werewolves, vampires, witches, and zombies.  It is as if we are all living inside the movie Wag the Dog.


       So how can we extricate ourselves from this hole?  How do we create a system where some people report on current events without spin, half-truths, and lies?  One component of a workable solution is the requirement that agencies that want to be licensed as news providers in the future follow very clear and narrow parameters.  This includes refraining from making any statement that is not both objective and immediately verifiable.  If reporting on a workers’ strike, news agencies would not be allowed to comment on the workers’ actions or the company policy, only report on what is happening.  A reporter shouldn't say things like, "The workers are angry.  They are in a frenzy."  The reporter can show us videos of the strike, and ask striking workers, "Can you please tell us what you're doing and why?"  People on the spot could be interviewed, but reporters would no longer be able to cherry-pick the most positive or negative of these interviews to share with viewers, characterize the situation with their own positive or negative adjectives, or sum up the meaning or import of the event with their own spin.  If reporting on a president, you cannot call him a "wannabe dictator" and still act as an official, accredited news agency.  You cannot declare that the Department of Justice, when searching with a warrant for classified documents that an ex-president refuses to return, is “out of control.”  These sorts of statements are not facts.  These are opinions.  They have no place in a news report.  A person or group that reports in this way should not be allowed to use the name “news” to describe what they provide.  CNN and MSNBC, while at least operating in a fact-based universe, are also guilty of slanting the news.  These outlets make no secret of their disdain for Trump with sarcasm, pejoratives, shaking of the head, facial expressions, and headlines about "Trump's Dark Vision."  
Just report the facts without a running commentary of opinion and let us decide for ourselves whether something is dark.  In the future, viewers who acted in conjunction with public regulatory agencies would need to monitor broadcasts carefully to see that no licensed news agency was omitting important details, lying, or skewing their portrayal of events with a political agenda in mind.  News reporting should be neutral and honest, like a scientist recording data.  Reporters are not supposed to tip the scales. 


       Free speech is wonderful in theory, but there are always limits.  One clear limit should be in the reporting of the news.  Your own personal political opinion should not be a factor in gathering or reporting the news, and the viewers who come to the well for a drink should not receive a poisonous potion of editorial comment mixed in.  When this sort of biased reporting becomes the norm, the populace becomes so drugged that they can’t tell news from propaganda anymore.  When right and left news sources mix fact and opinion, or just substitute fact with opinion and outright lies, people cannot even agree on the basic objective facts that constitute our reality.  They can no longer have a civil conversation.  We see messages of hate on social media between armed members of paramilitary groups (who ironically think they are saving the country) openly asking each other why they shouldn't just kill all the people who disagree with them.  If one believes the outright, baseless lies that are being substituted for truth, it leads a person to this insane conclusion, and we let loose the four horsemen of the apocalypse.


       Can we trust government to fix this problem?  Can we trust corporations?  Can we trust the fox to guard the henhouse?  It seems we can only rely on ourselves to make sure that we wind up with a fair, honest, and informative news system in the future.  This will not be easy to achieve, but it is necessary.  With clear guidelines, a decentralized, fact-based system could be maintained, so long as it relied on strict monitoring, constant verification, and many checks throughout the entire news-gathering and reporting process.  This is hard to picture right now, with giant multinational corporations dominating the infotainment and outrage politics landscape, but it would fit hand-in-glove with a socio-economic structure based on intentional co-op communities that operated according to cooperative economic principles.  An accountable news media is a key component of an informed society and a world that works.  It may seem a long way off right now, but the basic outline is not so difficult to understand, and aligns perfectly with international norms of fairness, openness, and democratic principles.  If we can conceive of an effective, impartial reporting system and agree that it is an absolute necessity to have such a thing, we can, in the not so distant future, make it a reality.  If you allow yourself to be so jaded as to think it will never happen, you will be throwing away your children’s future.  Do you want them to live in a world where billionaire corporate owners can go on brainwashing anyone into believing anything?  If not, take some time to consider what a transparent, unbiased, and responsible system of news gathering and reporting would look like, what laws are needed, and how we, the people, can keep a good system from going bad over time.  This is not something that can be left to others to fix; we all have to be part of the solution.





       In the racist South, there used to be a legal principle known as the “one-drop rule."  If it could be proven that a white person had even one ancestor who was black, no matter how far back in time, then their blood was said to be corrupted.  They were not allowed to vote, serve on a jury, or hold office.  No matter how light their skin was, they were categorized as black.  The question of their story, their family background, culture, upbringing, education, life experiences, principles, character, etc., was not seen as relevant.  The term they used was “miscegenation,” an abominable concept that assumed any mixed-race person was somehow flawed.  All of this stems from the flawed way that people, Americans chief among them, view race in terms of categories.

       Writer Sarah Cooper was recently interviewed on NPR.  She described her own experience with the oversimplification of race in America through categorization.  Sarah’s family was from Jamaica.  Her best friend, Stacey, was Jewish.  Stacey had been called a “N-lover” by other kids, and Sarah asked her why.  Stacey said, “Because my best friend is black.”  Hurt and confused, Sarah said, “I thought I was your best friend.”  Stacey said, “You are.”  Sarah, who had never thought of herself as black, processed this and said, “But wait.”  Stacey then informed Sarah that she was black.  Sarah’s parents had said they were Jamaican and had not used the word black to self-identify.  Probably they thought this term referred only to people who were of purely sub-Saharan African descent.  The family history was especially nuanced because one of Sarah’s grandparents was Chinese and another was German.  To them, identity had been a matter of telling the complex story of where they were from and who their ancestors were, which makes a lot more sense than using a single word to describe a relative shade of skin color and having this word supposedly encapsulate your entire racial identity.  She said that in Jamaica, everyone was very mixed, but in America, everyone had to pick one group with which to identify (in her own words: “everyone needs to be: you’re black, you’re white, you’re this, you’re that”).


       So what is race?  Is it a question of where you are from, your culture, your self-identification, or the color of your skin?  President Obama was called America’s first black president by one and all with virtually no discussion about his racial identity, but there was an inherent leap of logic in categorizing him like this.  Obama’s mother was European-American and his father was from Kenya.  So why then must he be categorized as one or the other?  Was this identification based on his appearance?  This practice of calling people black who have one black parent is normally meant as a politically-correct sign of African-American pride and a positive, progressive way of describing a person, but this all-or-nothing definition has a lot in common with the old one-drop rule.  There is a cultural pressure for people to identify as one or the other.  For people who are biracial and given a questionnaire asking them to pick one race or another, it can be confusing.  If a person belongs to two or more categories, why should they be asked to pick a single one?  Why must we categorize race in such black and white terms at all?  It reminds me of the question Hank Hill asked his Laotian neighbor Kahn in King of the Hill: “So, are ya Chinese or Japanese?”  Hank was trying to be friendly, but as these were the only two categories in his simple Texan mind, it was the only question he could think of.  Kahn gave a brief overview of the location and population of Laos, which went right over Hank’s head.  Hank then repeated the question verbatim, and Kahn lost his patience.  This brings up an important point: when people do not understand your story because of limited education and/or experience, do not become frustrated with them.  You just need to explain your full back-story to them (sometimes repeatedly) until they get it.  
Once they finally do comprehend the complexity of your origin story (and possibly others like you), they will be able to pass this understanding on to others.  This is how culture is changed: one mind at a time.


       Race is a construct and always has been.  Most Han Chinese have no knowledge of their non-Han origins, but virtually every Chinese person alive today has DNA from Huns, Turks, and Mongols, along with genes from dozens of tribal groups.  Japanese, who have been fanatic about racial purity at times in their past, all have ancestors from the Korean peninsula, from China, and from the indigenous Ainu.  If we rewind to a time before the rise of Rome, the Latin League was just one group among a patchwork of many in Italy, the most powerful being the Etruscans.  The various groups on the peninsula spoke different languages and saw each other as enemies.  Little did they know that in a few hundred years they would all consider themselves Romans and venerate their mythic origins.  Although the notions of “Rome” and “Romanness” were created out of thin air, the Romans were racist in the beginning of their rise, viewing all non-Romans as inferior.  They mellowed in many ways on the subject of race after they allowed all subjects of empire to share in citizenship, but they never fully lost their sense of superiority before the final fall of Rome.  It was the mixing of “Romans” with invading “barbarians” that formed the states of the European Middle Ages, where people were later to become insanely racist with ideas of their own purity of blood.  If we rewind back far enough, of course, all humans come from small bands of hominins in Africa a few million years ago.


       When people focus on these supposed differences in the wrong way, the outcome is never good.  As an extreme example, take the case of Rwanda.  In Rwanda, the Hutus and Tutsis shared a common language and origin.  Their only difference seemed to be that Hutus farmed and Tutsis herded cattle.  The Kingdom of Rwanda was run by Tutsis from the 1700s onward.  At this time, some Tutsis formed a ruling class and disenfranchised the more numerous Hutus.  It was still understood at that point that the two groups were castes, not races, and that a person could move from one group to the other.  During the Belgian occupation after W.W. I, however, a rigid classification system was created and the groups were kept separate.  The Belgians supported the Tutsi hegemony over the Hutus in return for Tutsi support of colonial rule.  Hutus understandably resented the arrangement.  In 1994, following a civil war, over 500,000 Tutsis were massacred, mostly with machetes.  These two groups were virtually indistinguishable culturally, physically, or linguistically, yet Hutus were driven by hate-filled speeches over the radio into a murderous rage.  Tutsis were called cockroaches by “Radio Machete,” and Hutus were told to exterminate them.  The movie Hotel Rwanda is a fact-based drama that portrays the events of the genocide.  The protagonist is a Hutu hotel owner with a Tutsi wife.  If they had been living anywhere outside Rwanda, no one would have been able to find the slightest racial or cultural difference between the two, yet these manufactured differences within Rwanda were the cause of mass slaughter.


       Something similar could be said of couples living in the West in which:


       *One partner was Japanese and the other was Ainu.


       *One partner was Japanese and the other was Korean-Japanese.


       *One partner was Japanese and the other was Burakumin.


       *One partner was Vietnamese and the other was Hmong.


       *One partner was from the Kshatria caste in India and the other was Dalit.


       *One partner was Croat and the other was Serbian.  


       In each case, the differences between people are so slight that outsiders cannot distinguish between them, yet within the microcosm of their own national milieus, these differences have been magnified to justify discrimination, persecution, warfare, and genocide.  We could expand this list to cover more national and religious differences that are used as casus belli, but you get the idea.  These differences are important in that we should seek to understand each other in full and not lump people together in general categories that deny them their unique identities.  These differences are not important in that they should never be used to divide people or incite negative feelings.  The only way forward to world peace is to eliminate all divisions between people.  We need to integrate on a larger scale and in a different way than ever before into true communities.  We need to provide protections against speech that incites racial hatred, discrimination, or even a sense of superiority.  The ludicrous notion of “replacement theory” disseminated by Fox “news” is a prime example of free speech straying into dangerous racism.  There should be sanctuary from racism.  People should see that no race is intrinsically better than any other and realize that racial mixing, which has been going on as long as there have been people on the earth, is a positive thing in every way.  The only thing lost in an interracial union is the narrowness of belonging to a single group.  The gains are immeasurable. 


       As Walt Whitman wrote in his poem, Passage to India,


       Lo, soul, seest thou not God’s purpose from the first?

       The earth to be spann’d, connected by network,

       The races, neighbors, to marry and be given in marriage,

       The oceans to be cross’d, the distant brought near,

       The lands to be welded together.


       In the future, the human race will be a melting-pot wherein these artificial differences will disappear.  For our grandchildren, it will be normal for people to be as mixed as Tiger Woods: African, Native American, Chinese, Thai, and European.  There really is only one race: the human race.  All distinctions that have biases attached are nonsense.  We are one species.  Variety is the spice of life, and we need to recognize our differences, but always in a positive, celebratory way.  All the drops of blood that make us who we are genetically and culturally should be researched because they are fascinating, as shown on Dr. Henry Louis Gates Jr.'s show, Finding Your Roots, not because they are useful for categorizing and dividing.  We don’t have to choose part of our lineage and declare that to be our one true identity.  We need to tell our stories in their fullness without concern for the meaningless classification systems that have existed up to now.  If we ignore these empty abstractions and instead define ourselves as we want to be understood, we can do so independent of previous stereotypes and limitations.  In addition to relating our own stories, we need to be willing to listen to the stories of others as well.  In the end, we will realize that the stories of others belong to us, the listeners, as much as they belong to the storytellers themselves.  Each of us is unique, and at the same time we are all one.  Love is the glue that binds us together.  This is the wisdom that will save us.




       “Intelligence Quotient” does not mean what most people seem to think it means.  Your so-called “IQ” does not tell “how smart you are,” as people assume.  What is known as an “IQ level” is a confusing misnomer that has large implications for the way we think of human intelligence.  An “IQ” is really just the score a person gets on an “Intelligence Quotient” test.  This is a test, of which there have been many versions over the years, to roughly gauge a person’s ability to solve problems, reason, and demonstrate understanding of such things as vocabulary, instructions, sequence, relationships, patterns, and rules.  A test like this is useful to a large organization, like the military, which in times of emergency feels the need to rapidly determine which inductees may be able to think on a higher level than others.  Beyond that, a test like this has little or no use.  To be clear, an “IQ” is only a score on a test, which can change based on the amount of knowledge a person has, as well as their preparation for the test.  It is not a fixed number, as many mistakenly assume, telling a person in some authoritative way how “intelligent” or “smart” they are.  There is no way to accurately determine this about an individual and assign them a score that is unchanging and meaningful.  At the heart of the matter is the mistaken notion, which we have all accepted far too easily, that a human being’s brain power can somehow be correctly measured and assigned a number like a bar code on an item in a store.  This is not true and never will be. 


       There are countless different kinds of intelligence describing all the skills we could attempt to categorize and assess: mathematical skills, language skills, listening skills, memory (in all its forms), attention to detail, survival skills, street smarts, decoding skills, problem-solving ability, intuition, social skills, ability to read people, ability to play a role or lie convincingly, ability to think ahead, ability to follow through on necessary actions to achieve a goal, etc.  A failure in any one of these areas might cause people to have a lower estimation of our "intelligence."  Their lowered estimation may not matter at all, however.  A person with amazing survival skills is needed when lost in the wilderness, not a person with great abilities as an accountant.  When we need advice on our relationships, we need help from people who understand emotions, not people with the best engineering knowledge.  For every purpose, there is a person with a skill, and we all have different strengths and weaknesses.  Who is to say which is better or worse?  It all depends on the situation.  As the saying goes, a blind cat is better than a horse, if what you need is to catch mice.


       We are people with names and unlimited potential, not numbers.  That is why this seemingly small issue of accepting an "IQ number" is actually a big deal.  Is each young person to be counted as “just another brick in the wall,” as the Pink Floyd song claims public schools treat children, or are we individuals with names and personalities, filled with infinite creativity and power?  It all depends on how we define ourselves.  No one can be adequately summed up by a number, yet we have been conditioned by our culture to do this without question for the last century.  So from now on, when people say that something will “lower their IQ” as if an “IQ” is a true measure of cerebral ability, you should call them on their mistake.  Point out that this is nonsense based on a misunderstanding of what “IQ” scores really are, and explain to the speaker that the very notion of a person being defined by a number degrades all human beings.  Describing people according to their IQ as if it is a fixed number is akin to describing them according to their SAT scores.  If you want to get a higher score on an IQ test, just study and take the test again, as with any test.  The persistent use of IQ tests and the practice of labeling people by their scores is part and parcel of our whole dysfunctional system.  As long as we say that so-and-so has an IQ of X, we perpetuate this erroneous train of thought that makes us believe we have predetermined limitations.  Don't buy this insidious lie.  You are not a machine; you are a human being.  Pay no attention to the labels, numbers, or preconceptions that others have about you.  Don't let anything stop you or slow you down.  Get out there and do something great that no one else has ever done – something that no one but you could possibly do. 





       Like most people, I consider travel to be one of the most enjoyable and rewarding experiences in life.  I have spent endless hours reading about far-off lands and dreaming of visiting them.  I sincerely believe there are few things in life with more power to change a person for the better than travel.  That being said, I think it is important that we travel in a mindful manner.  When engaging in non-essential travel, it is healthy to (1) remain thankful for the opportunity and not take such a thing for granted, (2) remain humble about what travel does and does not do for a person, (3) consider the cost to the environment, and (4) always try to travel with a purpose.


       Almost everyone loves to travel.  There is a particular kind of excitement inherent in entering the portal of an international airport and exiting at any other point on earth.  Air travel connects the far-flung parts of the planet like a giant internet with teleportation ability.  The great epic journeys of our history – crossing seas, traversing the vast and varied expanses of the Silk Road, exploring continents and cultures by foot like Ibn Battuta – can all be replicated with ease in the space of a single day.  From the moment you step off the plane, you are met with new smells, stimulating sounds, and a whole different feel to the air (a different temperature, humidity, elevation, level of purity, etc.).  All of it is so pleasurable as to be intoxicating.  Travel is truly wonderful, and almost indispensable for maximum growth in the modern world.  I would not be who I am today if not for the time spent overseas.  As important and beneficial as travel can be, however, I have observed that it has become so ubiquitous as to be a kind of mindless universal fixation, a burning desire to travel in order to relieve boredom.  Essentially, it has become a matter of travel for the sake of travel.


       One problem with our common conception of travel is that we all too often view it not in terms of growth, learning, or exploration, but with a consumer mindset.  The whole experience of travel is viewed in terms of how much satisfaction we get from a trip, as if it were no more than a visit to a restaurant.  Like a newspaper critic, we assess and score all the individual components that make up each travel opportunity the way we critique the various aspects of a dinner: the service, the decor, and every single item on the menu.  At the most simplistic level, we view the whole kaleidoscopic experience of a journey as if we are consuming a bottle of soda.  This is beautiful: I want to look at it.  There is delicious food there: I want to eat it.  The beach is lovely there: I want to laze on it.  Of course there is nothing wrong with doing these things in these places, and indeed it would be silly not to partake in local cuisine and see local beauty spots while you are there.  When we have no other purpose to our travels (or to our existence), however, the entirety of a trip (or of our lives) can unfortunately be reduced to a series of such sensory experiences, each to be rated zero to five stars for its quantity and quality.  The complexity of the analysis may be sophisticated if we reach the level of a connoisseur, but the attitude behind it is not.  While it may appear cultured, it is an approach mired in selfish, materialistic attitudes toward life.  It is based on a gilded but superficial view of life, much like that of the wealthy, foppish protagonist brothers in the comedy series Frasier.  I once saw Chinese tourists who had been traveling in California on different itineraries compare notes, and they did exactly this.  How many hours did you get at Disneyland?  How many hours at Universal Studios?  How many hours at Yosemite?  It was like kids who went to different birthday parties comparing who ate the most cake and candy.  
Their competitive, consumerist view of their overseas experience was like my dad’s attitude at the all-you-can-eat buffet when I was a kid: I had to eat at least an amount of food equal to the price of my admission, and if I exceeded that, we had “won” and “gotten our money’s worth.”  Treating travel as if it were the equivalent of consuming a candy bar makes us think and act like the swarm of ever-ravenous monkeys in the Buddha's parable.  They move like locusts from tree to tree devouring fruit.  They are never thoughtful, they never learn or grow, and they are never satisfied.  Their attitude represents the flame of desire that keeps the cycle of samsara spinning.  To see a perfect human representation of this seemingly perpetual state of hormone-driven craziness, just watch some footage of college students during spring break in Florida.


       Another problem with pop culture's view of travel is the growing lack of gratitude.  I have heard millennials with a decent income declare haughtily that travel is now considered a minimum measure of success, as if it were, in a sense, a God-given right, like cell phones and streaming video services.  The attitude is that the world is (or should be) your playground.  Problems like poverty, war, disease, oppression, etc. are a drag, and should not interfere with your enjoyment of Airbnb, sushi, bubble tea, tapas, margaritas, skiing, waterskiing, parasailing, ziplining, etc.  Any thoughtful and compassionate person must remain aware of the immense suffering that much of the world presently endures and be constantly on a mission to work in some way to alleviate its causes.  We must be thankful for what we have and remember our true purpose in life.  We must not allow ourselves to become snobs with an inflated sense of entitlement; we must maintain a higher vibration and set a better example than imperialist Europeans on tour in their colonies.


       I used to assume that travel automatically made people wiser due to their expanded wealth of experiences.  When I was overseas, however, I met people who showed me that it is in fact possible to travel widely and gain absolutely no wisdom.  For some, travel just means becoming more and more snobbish and arrogant.  When you have visited more than half the countries on earth, it is unfortunately all too easy to look down on those who have hardly been anywhere and to tease them for their poverty or lack of worldly experience.  To assume that travel automatically leads to an increase in knowledge and wisdom is to repeat the terrible mistake made by the makers of the awful 1979 film Star Trek: The Motion Picture, in which the Voyager 6 probe encountered so many life forms on its travels that it became an all-powerful sentient being.  Wrong!  Even if the Voyager had been instantly exposed to the entire universe, it would have been about as likely to become sentient as your calculator is to become sentient because you had it in your pocket when you walked across a university campus.  Similarly, if your mindset during your travels is on the same superficial and materialistic level as the above-mentioned troop of monkeys jumping from tree to tree, devouring fruit and moving on, you have zero chance of developing wisdom.  Your travels will probably just make you more blasé and egotistical.


       A recent ad for Expedia asked: “Do you think any of us will look back on our lives and regret the things we didn’t buy, or the places we didn’t go?”  It resonated deeply because this is the way most of us think.  When we meet a new person and want to get to know them, we regularly ask, “Where have you been?” as if this really mattered in and of itself in determining a person’s identity.  Traveling in order to party in Las Vegas is not in any way comparable to traveling in order to meet people and learn from them, whether that means learning a culture and language, studying with a professor, or spending time with a spiritual master.  Travel for the most profound of purposes is on such a higher plane than merely traveling to consume pleasant experiences that it deserves an entirely different name (the word travel, after all, can refer either to a forgettable, mindless lark or to an epic odyssey of self-discovery).


       We must also consider the environmental cost of travel when deciding whether or not to visit a place.  Until clean technologies become available, there is an added cost/benefit analysis that must be calculated.  Notice how Greta Thunberg, most admirably, chose to cross the Atlantic by sailboat rather than fly.  At present, we are burning fossil fuels to travel almost everywhere we go, and so we must weigh the reality of our carbon footprint when justifying the need to go anywhere, especially halfway around the world.  Flying to the Maldives to stay in a luxury hotel for a few days might not be worth it from an ecological standpoint, and will probably help put the whole island chain underwater.


       Yes, sometimes we all need to travel to take a break, get some exercise, or recharge our batteries, but sometimes this is just an avoidance technique, whereby we delay doing the things we know we should do.  I for one often put off meditating and go to the gym or for a hike instead.  When I was in college and knew I should be studying for a big test, I sometimes snuck off to the movie theater rather than act responsibly.  The idea of physical movement is more attractive than stationary focus: it is easier and more immediately gratifying.  It provides the illusion of achieving something simply because we are changing the scenery around ourselves.  If we tell people we went to a beach, a mountain, a distant place, it provokes oohs and ahhs.  People are impressed.  We become the center of attention and people are envious.  They ask detailed questions about what we did, what we saw, what we ate or drank, etc., as if it made any difference.  If, on the other hand, we tell them we read a book or meditated all weekend, people think it is boring and have no follow-up questions.  For a spiritually minded person, meditation is the most important thing to master in this lifetime.  If our travel helps with this, it is worth it to go to the ends of the earth.  If we are merely traveling to avoid learning to control the self, there is no point in it.  Sometimes it is better to stay put and practice.


       When we travel, we should travel with a greater purpose in mind.  We should make it our aim to learn something, to improve ourselves, or to solve a problem, always with the ultimate objective of helping others in the long run.  In the course of doing so, it is better to go to one place and drink deeply than to visit 100 places in a hurry, glancing briefly at the flowers before moving on to the next spot.  If our goal were to brag about how many places we have visited or how many languages we have learned to ask for the bathroom in, we would aim for quantity over quality, but this would be unwise.  If you travel, I recommend you view it through a mature lens.  Plan your trip so as to do good for others directly, or else plan to gain knowledge or increase wisdom so you can better help others in the long run.  Think about how you will take the harvest from your experiences and pay it forward to benefit the next generation.


       Don't automatically assume that the more traveling you do, the better.  Yes, it seems really cool and fun, and it makes great videos that get people's attention, but none of that is important.  The main thing is to realize that you are put on this earth to work unceasingly for the good of others, not yourself.  This may require you to travel or it may not, depending on your situation.  You need to see through the ego-based cultural standards connected to travel and rid the self of its desire to be constantly on the move, enjoying a multitude of external stimuli.  Remember what Milarepa said on the subject.  He told his disciples not to waste their time going on pilgrimage, but to stay where they were and meditate instead.  “The Buddha is complete within you,” he reminded them.  Also remember the truth in the Daoist adage: “The entire world may be known without leaving the house.”





       The Second Amendment to the United States Constitution reads: “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.”  This was written in the late 1780s, right after the American Revolution.  The idea of people owning their own muskets and swords went back to the era of Cromwell in England, when Parliament found it necessary to fight for the people’s rights in the face of a king who did not honor the social contract.  The right to bear arms was enshrined in the English Bill of Rights of 1689, after the Glorious Revolution.  The British residents of the 13 colonies inherited this right and put it to good use when the distant authorities in London began to deny them the right to negotiate their tax burden through parliamentary representation.  It was the British attempt to deny the colonists their arms that led to the opening of hostilities on Lexington Green, sparking the revolution.  With this war fresh in their minds, and still keenly aware of their vulnerability to attack by European nations, Congress quickly passed the first ten amendments, the Bill of Rights, including the right to bear arms.


       The Second Amendment is problematic for many reasons.  First, it is specifically intended to ensure the existence of a militia, not the type of private ownership of firearms we have today.  Second, at that time, there was neither a standing army nor a military-industrial complex.  In the 1780s, the right to bear arms was a guarantee that the country would have the basis of an army that could assemble for defense as needed.  In a world with a multi-branched military of millions, supersonic planes, nuclear weapons, and a navy that spans the globe, there is little need for such a militia as originally conceived.  Third, the nature of the weapons at our disposal as individuals is completely different.  When the founders wrote the Second Amendment, they were thinking of inaccurate, smooth-bore muskets that took a minute to load, not today’s rapid-fire assault rifles.  A person with an assault rifle can kill dozens in one minute, and with enough ammunition, fully-automatic upgrades, enough targets, and a place to hide, can inflict hundreds of casualties, as occurred at the Mandalay Bay Hotel in Las Vegas in 2017.  If only the Founding Fathers could have seen all the mass murder events in the U.S. over the last 50 years, they definitely would have added some footnotes to this amendment.


       There are certainly times and places when it is advisable to be well-armed.  When the United States was facing the Empire of Japan in early 1942, and considering the possibility of Japan controlling the entire Pacific, the fact that Americans were well-armed made it next to impossible for Japan to seriously consider a land invasion.  In addition to foreign invasion, other reasons to be well-armed include the possibility of internal instability, such as times of natural disaster when unrest could result, civil war, or the need to resist a dictator like Cromwell who wants to deprive people of their rights.  All of these dangers, however, could best be guarded against with a well-regulated militia, not with the possession of lethal weapons by everybody regardless of training, coordination with neighbors, or state of mental health.  Switzerland is a good example of a state that has many firearms for national defense (in the form of militias), but which carefully regulates the private ownership of guns.


       Since I was born in 1968, over 1.5 million Americans have been killed by gun violence: about double the number of Americans killed in all wars since 1776.  In 2023 alone, an estimated 42,888 people were killed by firearms in the U.S.  That amounts to about 117 deaths per day, or a rate of 10.89 deaths per 100,000 people.  Compare that with Japan, which has an annual rate of 0.08 gun deaths per 100,000.  Even in firearm-rich Switzerland, the rate was only 0.1 per 100,000.  Why should people in the 21st century live in fear of being shot like this?  We are no longer living in log forts with British soldiers besieging us from all sides and marauding pirates swarming offshore.  Why should we have to worry about being shot by nutcases in schools, malls, theaters, or street fairs?  Whenever common-sense legislation is proposed to restrict gun ownership by lunatics, the knee-jerk reaction from gun-rights advocates is to invoke the Second Amendment’s “right to bear arms.”  Notice how they ignore any mention of the “well-regulated militia.”  We can argue all day about the causes of gun violence and crime in general and get no closer to a solution.  Yes, it is technically true that people kill people, not guns (although, as David Letterman once noted, it's really those darned bullets that do it), and that if all guns were outlawed, then only outlaws would have guns (along with police and soldiers).  It is also true that it is a lot harder to kill people with knives than with guns, and that the presence of a gun in the house actually increases the risk of death by firearm (often by suicide or in a family dispute when tempers flare).  We can take a pacifist attitude or a survivalist attitude and there will be pros and cons to each.  In the short term, the best course for each family to take vis-à-vis firearms all depends on what happens next.
Whatever decision we make as individuals regarding gun ownership – to have or not to have – can potentially lead to a life-threatening situation.  In terms of the national debate on the gun issue, we are currently as stuck as a saber-toothed tiger in the La Brea Tar Pits.


       This impasse will not last forever.  Eventually, most likely sooner than anyone thinks, events that no one can fully foresee will change the entire landscape of the political, social, and economic status quo.  We will have an opportunity to reframe the entire fabric of our socio-economic existence in such a way that our descendants will be able to protect their human rights without having to endure such an insanely dangerous world.  If we put our heads together, we can make war and invasion a thing of the past.  We can create communities that are prosperous, equitable, and safe enough that violent crime is extremely rare.  The best way to prevent crime is to raise children properly and to work together with neighbors to create an environment in which all members have a stake in society.  We can create governments that are decentralized and democratic, so that despotism is all but impossible.  There will always be a need for something akin to a military or police force in some form, but this can be low-profile and low-risk in terms of its ability to install a leader by force of arms.  We should not look at the carnage of today and imagine that it must continue into the future.  Creating a future in which people outnumber guns is not a betrayal of the Constitution.  The Founding Fathers wanted the people of North America to be secure in their right to life, liberty, and the pursuit of happiness.  An important prerequisite for this is a continued absence of bullet holes in our bodies.  We cannot expect people in the future to go on accepting the collateral damage we have been tolerating for the last fifty-plus years.  Do you want to see your grandson or granddaughter blown away by someone who should be in a mental hospital?  If not, we need to make major changes pronto.  We cannot expect a better world to take shape by accident; we have to use wisdom and foresight to create a plan right now, and then work tirelessly to make it a reality.





       Spoiler alert.  In the Black Mirror episode “Fifteen Million Merits,” people live in a future where most are relegated to pedaling exercise bikes all day to generate power.  This activity earns the workers a kind of currency called merits, which they can use to purchase food or television privileges.  All day long, the bike-pedaling populace is glued to screens filled with mindless entertainment.  Workers retire at the end of the day to their cubicles, little rooms where the walls are giant television screens.  What they are able to see or avoid seeing depends on the amount of merits in their account.  Bing is a bike-pedaler with an extraordinary amount of drive.  He accrues 15 million merits and gives them to Abi, a love interest of his, but this backfires and ends in heart-breaking separation.  Bing then redoubles his cycling efforts and raises another 15 million merits, which he uses to purchase a contestant’s spot on a talent show that could give him a whole new life.  He shocks everyone by holding a piece of glass to his own throat and launching into an extended rant about the madness of their dystopian society.  The judges are impressed enough to give him a television time slot.  In the final scene, we see Bing delivering his bi-weekly rant before the camera, then turning to enjoy a luxury apartment with a view.  He has become part of the system that he despises.


       This is one of the most powerful episodes in the series.  It encapsulates a great deal of the current insanity of our daily lives.  Our amazing technology is used by millions around the world, not to learn or to better ourselves, but to watch silly videos of people dancing, engaging in hedonism, or doing stupid things on purpose.  Our minds have created mass media whose main purpose seems to be to destroy our minds.  Our days are spent doing things as pointless as pedaling bikes and getting nowhere, and yet we, the people, are not in charge.  The system that governs our lives was designed by distant authority figures and is not open to discussion.  Each of us does what we can to stay sane in an insane world, trying in our own small ways to make the experience of life as pleasant as possible.  When one of us is observant enough to make an impressive critique of our dysfunctional system, it usually leads to a moment of fame and perhaps some wealth, but makes no impact on the system itself.  The person who so ingeniously points out the flaws in the system becomes co-opted by the system.


       When we turn on the TV, what we usually see is 99% insipid entertainment and 1% insightful observation by people who have become a part of the system about which they complain.  What we need now is not another 15 million merits.  That amounts to little more than an extra gallon of water over Niagara Falls.  Fame and fortune can come to people who write books or produce films that make brilliant observations, but if these works do not translate into a revolution in thought and action, what good do they do?  What we need now is a practical plan for comprehensive change.  We must think in a brand new way if we are to make the necessary alterations to save ourselves and our planet.  We cannot keep going along to get along: prioritizing our own financial security (essentially selling out because we see no other alternative) and assuming that the world of tomorrow will be just like the world of today.  We need to completely remake our socio-economic system, and with it the entire world.  We need to think of the whole more than our own piece of the pie.  We must dare to dream big.  Anything less will only amount to having a better room on the Titanic.
