Friday, April 18, 2014

The Constitution and the Failures of Contemporary American Politics

Note:  This column recently appeared in Politics In Minnesota.

Is the polarization and dysfunctionalism in contemporary American politics an accident or a product of design failure? The more one thinks about it, the more the conclusion may well be that many of the problems now confronting the United States are the product of a faulty Constitution, or at least one that has outlived its time.
    Many mythologize our Constitution and the men who wrote it. This seems especially true among the Tea Party faction of the Republican Party. They see in James Madison, Alexander Hamilton, and the other Founding Fathers a “genius” in the American political process (as historian Daniel Boorstin described it), the product of their efforts being the creation of a representative democracy that really reflected the first three words of the Constitution–“We the people.” Yet historian Richard Hofstadter counseled against seeing the constitutional Framers as gods, urging us instead to see them as who they really were–smart politicians with their own interests, prejudices, and limitations who effected compromises to create the American political system.
    Among the lost milestones of 2013 was the one hundredth anniversary of the publication of Charles Beard’s An Economic Interpretation of the Constitution of the United States. In that book Beard made the radical argument that the Framers were economic elitists who did not trust the common man, writing a Constitution to further their economic interests, which they felt were threatened by America’s first constitution, the Articles of Confederation. The Articles government, according to Beard, was an economic disaster for business interests, and many of the constitutional Framers were being hurt by it. The tipping point for them was Shays’ Rebellion, which demonstrated to many of them, including Alexander Hamilton, the dangers that the people could pose to the rich.
    Beard’s book catalogues the economic backgrounds of the constitutional Framers, all but a couple of them slaveholders or wealthy businessmen. They wrote a document giving Congress vast powers to regulate and strengthen commerce, and it was also a constitution that preserved slavery, stood silent on voting rights, and otherwise created a system of checks and balances, separation of powers, and other power-dividing mechanisms that made it difficult, as James Madison said, for majority factions to take over the political process. Political scientist Robert Dahl likewise described the Constitution as a mechanism to slow down political change, making it difficult to effect reform unless there was significant time and consensus to achieve it.
    Beard’s controversial challenge was to assert that the complex constitutional system was not meant to produce democracy, but instead to shield the rich from the poor and to entrench the power of the former forever. John Jay, one of the Framers and a co-author of the Federalist Papers, once exclaimed that “Those who own the country should rule it,” while James Madison famously declared in Federalist No. 10: “But the most common and durable source of factions has been the various and unequal distribution of property. Those who hold and those who are without property have ever formed distinct interests in society.” For Beard, the real genius of American politics was how the Framers recognized the inevitability of class conflict but designed a political system that transformed it into group competition, forever dividing the people among various interests and thereby sublimating strife between the rich and poor. In short, as former Supreme Court Justice Thurgood Marshall observed, whatever the promise of “We the people,” the reality of the Constitution was that it excluded many, and it took a Civil War, two civil rights movements, and more than a score of amendments to give even faint meaning to the promise of those three words.
    Looking back over time, one wonders to what extent Beard was correct that the Constitution was designed to assure rule by a privileged elite or, to update his thesis, whether the polarization and dysfunctionalism in contemporary American politics is not just an accident but exactly what the Framers wanted. America is a society where economic privilege allocates political power. Who votes, who runs for office, who gives money, and who benefits from our public policies is significantly determined by economic status. The “winners” in the American political system look surprisingly like the constitutional Framers of 1787.
    The Electoral College mechanism for electing the president, along with the federalism it embodies, has split America into regions since the early days of the republic. Small states, especially in the Senate, can gang up and filibuster legislation, thwarting majorities even though they constitute only a small portion of the population. In the House, the requirement that every state receive at least one member likewise gives disproportionate influence to small populations. Couple that with gerrymandering and we have created a political system with fewer and fewer incentives to compromise. This means fringe voices, especially on the political right these days, are given a virtual veto over reform.
    Certainly the American political system is not meant to be winner-take-all pure populism. It is a balancing of majority rule with minority rights, but is the minority the Koch brothers and others with money, or those hostile to the rights of women, people of color, and the GLBT community? The political process, itself designed as a compromise, seems increasingly unable to create the incentive to compromise, or at least it does not work because some do not want it to. A system whose slowness to change and demand for compromise were once viewed as virtues that protect liberty now appears unable to act, paralyzed by gridlock.
    If Charles Beard, as updated, is correct, one should not be surprised by what is happening across the United States. The polarization and dysfunctionalism is either an intentional feature to preserve power for a few or, as law professor Sanford Levinson contends, a sign of original design flaws in the Constitution that are now coming to haunt America more than two centuries later. In either case, as we head into the 2014 and then 2016 political cycles, we should ask whether the Constitution we mythologize really is up to the demands of the twenty-first century, and whether it is the cause of, or the impediment to resolving, many of the pressing problems in contemporary American politics.

Friday, April 4, 2014

The Lessons of McCutcheon: The First Amendment as Thuggery

Since when did the First Amendment become a tool of thuggery to suppress speech rather than enhance it? This is essentially what the Roberts Supreme Court did in the recent McCutcheon v. F.E.C. decision striking down aggregate political contribution limits.
    At the core of McCutcheon is the argument that all individuals have a right to expend unlimited money for political purposes. Because of that right, federal laws limiting an individual’s overall contributions to candidates and political parties to approximately $123,000 per two-year election cycle violated free speech. Who knew that such a cap was so suppressive and chilling of free speech? At least this is what the Roberts Court wants us to believe. We should all rejoice in our newfound freedom to spend as much as we want to affect the political process. Yes, now the rich and poor equally have the right to spend more than $123,000 per cycle for political purposes, in much the same way that writer Anatole France once said the rich and poor were equally free to sleep under the bridge.
    That false sense of equal rights is one of several points that the Roberts Court misses in McCutcheon. The rich and poor may equally have the same right to spend unlimited amounts of money, but the reality is that only the rich will be able to use this right. For the other 99.9% of the population, McCutcheon has nothing to do with rights. It does not mean that your tired, your poor, your huddled masses yearning to breathe free will spend more. No, it only means a few will do so, because only they will be able to. This is what defenders of McCutcheon either fail to see, or see clearly and embrace because the decision’s implications favor their views.
    The shallowness of the McCutcheon argument lies in a misguided notion of what the First Amendment free speech clause is supposed to be about. While John Stuart Mill of course wrote long after the Founding Fathers penned the First Amendment, his On Liberty remains perhaps the single best defense of free speech. Free speech is necessary not simply to express ourselves, but to gather and circulate the information necessary to make informed political choices and to make democracy and the search for truth possible. Free speech is not a silencing tool but an invitation to dialogue, a recognition that were only one side to speak, dogma would be shielded and censorship invited. As Mill stated: “There can be no fair discussion of the question of the usefulness (of an idea) when an argument so vital may be employed on one side, but not on the other.” It takes at least two to have a conversation; one person shouting at another is not free speech, it is intimidation.
    The First Amendment free speech clause is not meant to be a right for one or the few but for all. It is a recognition that in a society all of us have a right to speak, and that to do so, as in any social situation, there are rules of communication that make a conversation possible. A rule that says all of us have an unlimited right to speak is not viable; at some point one has to understand that the First Amendment rights of some have to be read or understood in light of the First Amendment rights of others. The right to free speech cannot be interpreted in such a way that the rights of a few suppress the free speech rights of everyone else. As philosopher John Rawls once declared: “[E]ach person is to have an equal right to the most extensive basic liberty compatible with a similar liberty for others.” Rights to free speech must be read within a social context of like liberty for all. McCutcheon and its defenders fail to recognize this principle.
    At one time the Supreme Court recognized it. In cases such as Burson v. Freeman the Court confronted conflicting First Amendment rights when it ruled that states could ban political advertising within one hundred feet of a polling place. Here there were contending free speech rights–to vote and to engage in political advertising–and it was impossible to allow for both in absolutist fashion. An absolutist position for both rights probably would have meant that neither was possible. Contending rights of different individuals have to be read or understood in ways that respect the rights of both. Similar tradeoffs or balancing of rights were made in Red Lion Broadcasting v. Federal Communications Commission, which upheld the fairness doctrine, ruling that the free speech rights of broadcasters must be balanced against the First Amendment rights of the public. The same was true in Smith v. Allwright, where the Court ruled that the First Amendment associational rights of parties had to be balanced against the rights of individuals to participate in politics.
    McCutcheon, along with the 2010 Citizens United v. Federal Election Commission decision allowing corporations to expend unlimited money for political purposes, ignores the social context for free speech rights. These decisions privilege the rights of the few at the expense of others. Even worse, they assume that everyone has an unlimited right to speak and that this speech includes the expenditure of money. They forget an old adage many of us learned growing up–my right to extend my arm goes no further than your face.
    McCutcheon’s crabbed absolutism seems to assume that everyone has a right to spend unlimited amounts of money, even if that means that some have a right to drown out or suppress the free speech rights of others. That is what McCutcheon will do. Its impact will be to give some a megaphone to speak, serving as a sledgehammer to silence others. The First Amendment free speech clause was never meant to be a bullying tool or an instrument of thuggery, but that is what the Roberts Court made it in deciding that the rich and poor equally have the right to spend unlimited amounts of money for political purposes.

Wednesday, April 2, 2014

Metaphysical Not Empirical: The Problems with McCutcheon



The Supreme Court decision in McCutcheon v. F.E.C. striking down aggregate contribution limits is flawed for many reasons. Critics will complain that the Court adopted a crabbed and narrow definition of corruption, or that it seemed inured to the role of money in politics, or that it is one more extension of giving more rights to the wealthy and of sanctifying one dollar, one vote as the defining philosophy of the Roberts Court’s view of American democracy. All these criticisms have merit. But the deeper flaw lies in something more fundamental–the decision is the triumph of legal metaphysics, devoid of a real theory or understanding of how American democracy should and does operate in the real world.
            As I argue in my new book Election Law and Democracy Theory, the most curious feature of election law scholarship and adjudication, including that by the Supreme Court, is the degree to which it is theoretically rudderless. What is meant by rudderless? Simply put, it is the extent to which the critical debates and issues at the center of many election law disputes are addressed in the most minimal of ways, generally without regard to any broader political theory that should guide decisions. In reaching decisions that weigh political speech against the integrity of elections in campaign financing, ballot access against electoral integrity, voting rights against fraud prevention, or any of innumerable other issues, election law scholars and judges seem to assume that the matters at stake are divorced from any broader context of political or democratic theory. This is what occurred in McCutcheon.
            On one level the Supreme Court yet again issued a decision in which it examined one issue about American politics and elections–the role of money, or the right of individuals to make political contributions–without adequately considering the broader impact of that decision on the actual performance of American democracy. The Court treats in isolation one aspect of our political democracy–the right of an individual to spend money–without considering other competing values and how they come together to form a more complete theory about government, politics, and elections. Yes, individuals may have a right to expend money for political purposes, and such an act may further the important value of free speech, but that is not the only act and value that must be furthered or considered in a democracy.
            Democratic theorists such as Robert Dahl point out that a theory of democracy includes several values, such as voting equality, effective participation, enlightened understanding, control of the agenda, and inclusion. For each of these values there is a need to construct institutions that help sustain them or give them meaning. Effective participation, for example, requires institutions that create free and fair elections, opportunities for non-electoral participation, and competitive parties. However, none of these values operates in isolation; a real concept of democracy requires understanding how they interact, coming together to form a fuller theory of American democracy.
            Democratic theories have ontologies. Each theory defines its object of inquiry, the critical components of what makes a political system work, and the forces, structures, and assumptions that are core to its conception of governance. This ontology includes not only a discussion of human nature but also an examination of concepts such as representation, consent, political parties, liberty, equality, and a host of other ideas and institutions that define what a democracy is and how it is supposed to operate. The Supreme Court, along with most election lawyers, has no such sense of theory. In McCutcheon, the Supreme Court considered one value or practice–expending money–in isolation from many others, asserted that the practice was protected by the First Amendment, and either called it a day or mistook that claim for a theory. It is hardly one. At best it is the most minimal concept of a democracy; at worst it is no theory at all. Many election lawyers have made the same mistake, confusing advocacy of a single claim with a broader theory of democracy. Or, on the contrary, their view of democracy is reductionist–it holds that the allocation of political power and influence is no different from the selling of cars or toothpaste. Markets may be great ways to allocate commodities, but they are not appropriate tools to sell or distribute political power or democratic influence. Those who think they are appropriate are confusing politics with economics, elections with markets.
            Thus on one level the Supreme Court in McCutcheon had no theory and was all empirical–some individuals were denied the right to max out their political contributions to as many candidates and organizations as they wanted. But in another sense the decision was all theory and not empirical at all. The Supreme Court had its own metaphysics about how it thought people acted. The series of hypothetical ways money could be diverted in elections that the majority opinion waltzed out was conjecture at best, devoid of real empirical evidence. Moreover, the majority opinion, along with many of its defenders, makes many assertions that simply lack empirical foundation. Is it really true that the decision means groups and individuals will be more likely to shift giving to candidates and away from third-party groups? Are political parties strengthened by taking more special interest money? We have no real evidence to support these claims.
            For the most part, the assumptions made by the Court and many election lawyers are devoid of empirical political science analysis. They are highly rationalistic models of human behavior, akin to the theoretical models economists and other social scientists often build about worlds and behavior that do not exist in reality. Decisions such as McCutcheon are what many of us call formalistic. They ignore the wisdom of Supreme Court Justice Oliver Wendell Holmes, Jr., who once declared: “The life of the law has not been logic; it has been experience.” It should be experience, evidence, and data, and not blind assertions or theories, that guide decisions about the role of money in politics.
            Overall, the real failure of McCutcheon is that it is both too theoretical and not sufficiently theoretical, too empirical and not empirical enough. It ignores how American democracy should operate, and how its institutions actually work, both within a comprehensive theory and in the real world.

Sunday, March 30, 2014

So why is my college tuition so high? Or why learning no longer seems like the primary goal of colleges and universities.

This is the season. It’s the time when high school seniors are waiting to hear from colleges regarding whether they have been accepted. But once the joy or disappointment sets in after acceptance or rejection letters have been mailed, another set of emotions and questions kicks in for students and parents who ask: “How do we know we made the right choice to get a good education, and how will we pay for college?”
    The simple answer is don’t look to my salary or that of my colleagues for the reason why college is so expensive. Instead, one needs to understand how higher education has changed in the last generation or so to realize that getting a good education is hardly the major purpose or goal of colleges and universities anymore and that the real drivers of educational costs are factors that take us well beyond the classroom.
    A recent report by the Delta Cost Project entitled Labor Intensive or Labor Expensive? Changing Staffing and Compensation Patterns in Higher Education highlights how the changing structure and employment patterns of American colleges and universities really de-emphasize classroom learning. What we find in that report is that over the last decade colleges have experienced a bloat in hiring of non-teaching staff. For the most part, schools are employing more and more administrators and ancillary staff and fewer faculty, or at least fewer traditional full-time tenured or tenure-track faculty. According to the report:

    *  The overarching trends show that between 2000 and 2012, the public and private nonprofit higher education workforce grew by 28 percent, more than 50 percent faster than the previous decade.
   
    *  Growth in administrative jobs was widespread across higher education—but creating new professional positions, rather than executive and managerial positions, is what drove the increase.
   
    *  As the ranks of managerial and professional administrative workers grew, the number of faculty and staff per administrator continued to decline. The average number of faculty and staff per administrator declined by roughly 40 percent in most types of four-year colleges and universities between 1990 and 2012, and now averages 2.5 or fewer faculty and staff per administrator.

The Delta Cost Project points out that several things are going on in higher education. First, as the Millennial generation has gone off to college, these students have traded in their helicopter parents for helicopter schools. Colleges are increasingly providing more services and programs to attract and retain students, especially as the number of eighteen-year-olds is decreasing. Tighter competition for a declining pool of students means schools are spending more and more money to woo and retain students: more lavish dorms, sports, food, bandwidth, gee-whiz technology, and campus aesthetics.
    Second, colleges and universities are experiencing administrative bloat. As the Delta report points out, the ratio of faculty to administrators keeps falling, now standing overall at 2.5 or fewer faculty and staff per administrator. Many of these administrators have little or no experience in teaching, often coming from the private sector and demanding salaries comparable to what they earned there. This shift in administration is different from a generation or more ago, when colleges were run by real faculty (who actually taught and published) who rose through the ranks.
    Third, the amount of money being spent on faculty–especially full-time tenured or tenure-track faculty–is going down. In efforts to reduce teaching costs, more adjuncts are employed on an as-needed basis, or teaching loads are increased. A recent Star Tribune article highlights this trend. Additionally, faculty salaries have been more or less flat for the last decade. While there are many competent adjuncts, often they are overworked, undervalued, and just do not have the same time and commitment to serve students.
    The Delta report really highlights a trend in higher education I have been writing about for nearly a decade (the corporate university, the neo-liberal university). Colleges and universities have lost their purposes. They have become corporate universities. In the corporate university, many decisions, including increasingly those affecting curriculum, are determined by a top-down, pyramid style of authority. University administrations, often composed not of typical academics but of those with business or corporate backgrounds, have pre-empted many of the decisions faculty used to make. Under a corporate model, the trustees, increasingly composed of more business leaders than before, select the president, often with minimal input from the faculty, and the president in turn, again with minimal or no faculty voice, selects the deans, department heads, and other administrative personnel. The business of higher education has essentially become just that–a business, often with little regard for the quality of education.
    So much of what is invested in higher education is completely incidental to learning. Schools spend a ton on learning technologies, but there is little evidence that they make much difference in learning outcomes. For the last five years I have edited a journal devoted to public affairs teaching and have yet to find an article or study demonstrating the value of all the high-tech toys in the classroom. Such technologies impress parents and students, but they do little more than drive up costs needlessly. Pedagogy should determine what technology is used; instead, the opposite is the case.
    Additionally, bandwidth and sports are nice amenities but secondary to what happens in the classroom. Yes, support services that help learning are needed–especially for students with disabilities–but it is not clear how necessary many of these expenses are. I am the first to argue that a good chunk of college is teaching students how to grow up and take responsibility, but it is not clear that higher education is fostering this type of social learning or maturation.
    What really encourages learning are well-trained professors who have both the substantive knowledge and the teaching skills to work with students. In my 25+ years of teaching I have largely remained a professor whose most extensive use of technology is a piece of chalk. I assign lots of reading and writing and expect students to do both. I ask tough questions in class, I give students the chance to rewrite assignments, and I set high standards for myself and my students. I tell my students that if they work hard, I will too. I also read, write, and publish, making sure that I stay up to date with my research and that of others. It takes two to tango, and it takes both a hardworking teacher and a hardworking student to foster good learning. This is what colleges and universities need to encourage.
    So where are we? Today’s higher education displays all the worst traits of the private sector–it is top-heavy with middle and upper management and spends heavily on items not essential to its core mission, while spending on what really is the core mission, and on those who generate the real value for the school–the faculty–goes down. In the private sector, a company run like this would go out of business. Yet colleges do not, because they are able to pass the costs on to their customers–students and parents–who know that a college degree is essential to most successful careers.
    So as you contemplate why your college tuition or that of your children is so high, remember that it is not my salary that is doing it.

Sunday, March 23, 2014

What does it mean to be a progressive Democrat today?

What does it mean to be a Democrat, let alone a progressive one, these days? The question was prompted by my recent op-ed in MinnPost, where, in response to an argument against the State of Minnesota granting the NFL tax exemptions to host the Super Bowl, one reader wrote that he supported public funding for the stadium along with the tax breaks, and that he was a Democrat and a “fairly far to the left one too.”
    Since when does a progressive Democrat support tax subsidies and breaks for billionaires and hugely profitable private companies that generate few jobs for working people and provide entertainment that, in person, only a few can afford? I thought that was what the Republican Party did. With Democrats like this, who needs Republicans?
    But the debate over tax breaks for the Vikings stadium and the NFL does prompt a broader question about what it means to be a Democrat or a progressive these days. It is certainly not good old-fashioned economic liberalism. By that I do not mean the Bill Clinton liberalism that supported NAFTA and welfare reform and that Mitt Romney once warmly embraced as the kind of Democratic Party politics he liked.
    Instead, the progressive politics that appears dead is that of Lyndon Johnson, John Kennedy, Franklin Roosevelt, and even Teddy Roosevelt. It is a 21st-century version of the Great Society and the New Deal. It is a redistributive politics that seeks to raise those at the economic bottom, narrow the gap between rich and poor, and wrest control of political power in the United States from corporations and plutocrats. It is the spirit of John Rawls, Michael Harrington, and Dorothy Day and a commitment to the belief that government has an important role in making sure we are not a nation one-third ill-fed, ill-clothed, and ill-housed, that kids should not go off to school hungry, and that corporations should not have the same rights as people. It is the idea that we help the least advantaged and most vulnerable first and that the rich have an obligation to help the poor.
    What has taken over Democratic Party politics is warmed-over Republicanism–the centrist sort of corporate politics that some in the GOP once represented but have now abandoned as the party races further and further to the right, embracing xenophobia, homophobia, and a market fundamentalism that Social Darwinists would endorse. Oh, and vaccines cause mental retardation and global warming does not exist, at least according to many current Republicans. Even the Republican Party of Abe Lincoln supported civil rights, but not this party–instead it is committed to a vision that a nineteenth-century politician would weep over. But now consider the Democrats.
    Start at the top. Obama ran promising change. The reason so many are disappointed in him is not that he was too far left but that he failed to deliver on his lofty promises. At his inauguration Obama had a window to change America, but he flinched. Carpe diem was not his motto. In reality, Obama was never a progressive. He ran for president opposing a single-payer health insurance plan and instead embraced the Republican plan that Mitt Romney adopted in Massachusetts. Obama was not originally in favor of repealing “don’t ask, don’t tell,” and he did not embrace same-sex marriage until public opinion and political necessity dictated that he do so.
    Obama has deported more individuals than any other president, he supports coal and nuclear power, and his big victory in repealing the Bush-era tax cuts came with a reinstatement of the payroll tax, imposing on Americans a more regressive and costly tax system than before. Obama also defends the use of drones to kill Americans abroad, and he refuses to make any serious changes to an NSA surveillance program that runs roughshod over the civil liberties of Americans. And in 2008 he took in more money from Wall Street than any presidential candidate in history.
    Across the board, many Democrats seem confused about their identity. They support public subsidies for downtown ballparks and convention centers ahead of neighborhoods. They defend NSA spying on Americans except when they are the ones spied on. They take little action to address the impact of money in politics and instead beg for money from big donors and PACs. They offer few real substantive ideas about how to tackle issues such as the achievement gap and economic discrimination against women (who still make only 77% of what men make).
    Worst of all, Democrats lack the guts to fight. Why? Democrats (and one should not confuse the current party with progressivism) believe that they are the caretakers of government. They believe that they need to be responsible and not run the risk of shutting the government down for fear of how it would ruin the economy or hurt people. But conservatives know this and take advantage of the Democrats’ willingness to blink. And guess what? By blinking, the Democrats are slowly screwing over poor people and the economy, giving ground one inch at a time and seeming unable to recapture it. Until Democrats are willing to fight, to show conservatives they are willing to shut the government down, and to hold conservatives responsible, they will never win.
    What passes for progressive Democratic Party politics seems so bland. Same-sex marriage? Supporting it a decade ago was progressive, but now that is mainstream. Opposing NSA spying on Americans? Even Rand Paul does that. No one should be against strengthening anti-bullying legislation; that is not progressive politics but just common sense. Yes, raising the minimum wage to an adequate level is good progressive politics, but few talk of living wages these days.
    Progressive politics is dead so long as it is married to the current Democratic Party. Progressives need their own Tea Party revolution on the left–one that engineers a new rhetoric and a takeover of the party, one that is not willing to play it safe and worry that if a few Democrats lose, the Republicans win. It means a willingness to fight for what you believe in. It also means believing in something worth fighting for.

Saturday, March 15, 2014

Marijuana and the Criminal Justice-Prison Industrial Complex

     America has fought a losing war and it is time to end it. No, this is not a reference to Afghanistan or the War on Terrorism. It is to the four-decade-long war on drugs, which has failed miserably. It is time to shift away from a drug policy that criminalizes drug use to one that treats it as a public health problem. This should be the policy regardless of whether Minnesota endorses medical marijuana.
    Richard Nixon launched the “war on drugs” with his presidency in 1968 and coined the phrase in a 1971 speech. Since Nixon, the war on drugs has been a mainstay of Republican if not bipartisan politics. The 1974 New York Rockefeller Drug Laws penalized individuals with sentences of 15 years, 25 years, or even life in prison for possession of small amounts of marijuana. Mandatory minimum sentences were ratcheted up for drug crimes, and the move toward “three strikes and you’re out” laws in the 1990s was driven in part by the push to prosecute drug crimes. All told, over the last decade the federal government has annually spent $20-25 billion on drug enforcement, with states kicking in an additional $10-15 billion if not more. What has this money purchased?
    There is little evidence that drug usage is down. Nearly 40% of high school students report having used illegal drugs, up from 30% a decade ago. Some studies suggest 30 million or more Americans use illegal drugs in any given year. Several hundred thousand individuals per year are arrested for mere use or possession of marijuana. Hard-core use is not down; in some cases it has stabilized or increased over time. Programs such as DARE show little sign of success, and the “Just say no” campaign that began with Nancy Reagan also does not seem to have had much impact on drug usage.
    But if the war on drugs has done little to decrease demand for drugs, it has had powerful unintended consequences. Interdiction and enforcement have created a significant and profitable market for illegal drugs both in the United States and across the world. Estimates are that marijuana is one of the most profitable cash crops in California, and the drug violence in Mexico, which has resulted in approximately 55,000 deaths in the last six years, is tied to American demand for drugs. The price of cocaine is now at record lows, courts are jammed with drug dockets, and prison populations have swelled with individuals whose only crimes were minor drug possession. States are now saddled with overcrowded, bloated, and aging prison populations, lives have been lost to drug incarceration, and tax dollars that could have been spent on education, roads, or simply saved have been wasted on drug enforcement. American politicians never seemed to lose points by ranting against drugs or demanding tougher enforcement. Clearly they were addicted to our drug policies.
    Drug criminalization has failed. This is not to say that drug use is not a problem. In some cases it is. But put into perspective, the use of alcohol and tobacco, or the consumption of fatty foods and sugary drinks that exacerbates obesity and heart disease, are far greater problems in this country than the use of illegal drugs. In many cases recreational use of drugs is harmless; in others, such as with medical marijuana, its use may in fact be beneficial. For others, personal and occasional use of drugs is a matter of privacy. But yes, one can concede that use of illegal drugs–including abuse of prescription drugs, which is perhaps the biggest problem–is a public health issue. Lives can be lost to addiction and families broken up through abuse or neglect. Many of us know of friends or family members whose lives read like a drug version of Billy Wilder’s 1945 classic The Lost Weekend. These individuals need medical help, not a prison term. Drug policy needs to be decriminalized and shifted to a public health approach. But many oppose decriminalization. Why?
    The basis for opposing the use of drugs generally rests on one of two grounds. First, there is the moral claim that drug use is inherently immoral or bad because it alters the mind, debases human nature, or reduces the capacity for autonomy. The second claim is social, arguing that the use of drugs and drug-related activity produces certain social costs in terms of deaths, black marketing, and crime. Another variant of this claim is that drug use diminishes social productivity by sustaining bad work habits, or by generating other social costs including increased health care costs.
    OK, one might concede that the use of illegal drugs is bad or that it constitutes a public health problem that needs to be addressed. Having acknowledged this, the question is whether the current practice of drug criminalization and the use of police resources is the most effective policy for addressing the problem. One major objection to the strategy proposed here is the “sending signals” argument: the claim that decriminalization would lead to an increase in drug usage and experimentation. Legalizing drugs, the argument goes, would send a signal to individuals that drug usage is permissible, and therefore more people would use them.
    It is just not clear what impact making drugs legal or illegal has on their usage. Conceivably, making them illegal creates a “forbidden fruit” aura around them that encourages their usage and that would be abated by legalizing them. The same might be said for tobacco products and teenagers, or perhaps for any other products or practices socially shunned. Regardless of the reasons why individuals choose to use drugs, there is little evidence that legalization has resulted in increased usage. In the Netherlands, decriminalization of some drugs has not led to an increase in usage or in users trading up from soft to harder drugs. Five years after Portugal decriminalized many drugs in 2001, there too was little evidence of increased drug use; Portugal’s drug usage rates remain among the lowest in Europe, while rates of infection among IV drug users and other public health problems dropped. With the legalization of medical marijuana in California, attitudes toward the drug may have changed, but there was no evidence of a change in its use. So far the same is true in Colorado with outright legalized marijuana. There simply is no real evidence that legalization sends a signal that drugs are permissible and therefore more people use them.
    The point here is that the war on drugs has failed. It was a political narrative used by politicians for four decades to promote their electoral interests at the expense of the public good and taxpayers. The criminal justice-prison industrial complex has become addicted to the war on drugs, making billions of dollars off the criminalization of drugs, especially marijuana. If we truly wish to win the war against drugs, whatever that means, jailing people is not the way to do it. It is time to end that narrative and establish a different approach that treats drug usage as a public health issue. The $40 billion or so expended per year on drug enforcement could be better spent on other things. This is a taxpayer issue, and maybe in these difficult fiscal times the opportunity is there to rethink drug policy in Minnesota and America.

Monday, March 3, 2014

Bridging the Achievement Gap: The Limits of Education Reform



Note:  This blog originally appeared in Politics in Minnesota.

The educational achievement gap will not be solved by better teaching, or by firing teachers or bashing unions. Nor will it be solved by vouchers, charter schools, teacher pay for performance, pre-school, all-day kindergarten, or simply a new curriculum. The achievement gap is a matter of race and class that may not be solved by schools or educators alone. It requires attention to the socio-economic forces that define the lives of students and affect their ability to learn.
            Addressing the educational achievement gap is the issue du jour. The Minneapolis and St. Paul mayors want to be the education mayors. R.T. Rybak sees his gubernatorial future in talking about the gap, and politicians and educators of all stripes are talking about it. A recent Pew Research Center report entitled The Rising Cost of Not Going to College points to the erosion in the value of a high school degree and the need to get more students of color into college. The gap, nationally and in Minnesota, is real. Simply stated, while Minnesota has one of the highest graduation rates in the nation, with student standardized test scores second only to Massachusetts, the story is very different for people of color and for the poor. The gap in graduation rates and test scores between white students and students of color in Minnesota is the largest in the country. We are largely failing (in both meanings of the term) students of color–the children who will be the future of this state. This failure also overlaps with poverty, meaning that many poor whites also fall into the category of those victimized by the gap.
            So now the question is what to do. Minnesota to a large extent has been an education innovator over time. We were the first to introduce open enrollment, allowing students to cross district lines to attend school. Yet after more than a generation of experimenting with open enrollment, few parents participate in it and there is little data showing that it has made much difference in outcomes. Minnesota also led the nation in pushing for charter schools, believing somehow that these educational experiments, freed from normal rules and bureaucratic constraints–and teachers unions–would be better run by a bunch of educational amateurs. Here too the evidence is largely inconclusive regarding their efficacy, although there is powerful data offered by the University of Minnesota’s Institute on Metropolitan Opportunity that charter schools have enhanced segregation. Finally, Minnesota has experimented with magnet schools, tinkered with class size, and given lip service to rectifying educational funding disparities across school districts. It has also talked of full-day kindergarten and universal pre-school–both laudable ventures–but so far little money has been forthcoming for them.
            In so many ways Minnesota is a terrific microcosm of the reforms many advocates propose to fix public schools and address the achievement gap. So many of the current ideas revolve around vouchers, school choice, holding teachers accountable with merit pay, and closing poorly performing schools. For the most part, as education scholar Diane Ravitch points out in recent books such as Reign of Error: The Hoax of the Privatization Movement and the Danger to America's Public Schools and The Death and Life of the Great American School System: How Testing and Choice Are Undermining Education, these fads have mostly failed. There is little evidence that they have improved performance overall for students, let alone addressed the achievement gap. Instead, they seem more the product of ideology–conservative attacks on teachers’ unions, government, and taxes–and less about education reform.
            What Ravitch is hinting at is that part of the reason why Johnny and Jane cannot read is about what schools are or are not doing, and part is about what society is or is not doing.
            Perhaps the best recent book on the failure of American education is Amanda Ripley’s The Smartest Kids in the World. She examines what the best performing school systems in the world are doing by looking at South Korea, Finland, and Poland. What she finds is that–to paraphrase President Obama–“that used to be us.” These countries take education seriously. Teaching and education are held out to be important. Only the best and brightest are selected to be teachers, educated at a small number of colleges that impose rigorous standards. Teachers are subject to constant training and support and–most importantly–are paid well for their efforts. High demands are also set for students, and families are expected to support schools and their children, and they do. Moreover, the purpose of schools is clear and unambiguous–to educate–and not confused with distractions such as sports. In short, for those of us who grew up in the age of Sputnik and the race with the Russians to the Moon, education was culturally taken seriously. While the singer Sam Cooke may have lamented that he did not know much about history, ignorance was not accepted as bliss. This is part of the message that South Korea, Finland, and Poland teach.
            But what the Ripley and Ravitch books also point out, and what we learned in the 1960s, is that students cannot learn if they come to school hungry, sick, or abused. The school lunch and breakfast programs, as well as Head Start, begun under Lyndon Johnson, recognized that Johnny cannot study if he is hungry or starts at an educational disadvantage. That is still true today. Income and educational achievement are powerfully correlated, and if we want to address achievement gaps we need to address poverty. We also need to address racism–the racism that still condemns many students of color to inferior schools. The 1954 Brown v. Board of Education decision was supposed to desegregate schools and banish separate but equal from America. But the reality is that 60 years later America’s schools remain as segregated as ever, with race and class reinforcing one another. One need only read Jonathan Kozol’s Savage Inequalities, Amazing Grace, or Fire in the Ashes to see how racial and economic discrimination plague American education.
            So what is the point of all this when it comes to the achievement gap? Yes, there are some things we can do in the classroom to improve educational outcomes and performance, but they are not what we are currently doing. Maybe smaller class sizes will help, but the evidence suggests only up to a point. Tracking, or separating students out by ability, also lacks data supporting its efficacy. But all-day kindergarten, universal pre-school, and even all-year school demonstrate improved outcomes and life prospects for students. Programs such as HOSTS, which feature one-on-one reading with students, yield results. Frankly, all students do better when they all do better, and that means all of them must be given the same chance and encouragement to learn.
            Perhaps the most important thing we can do to address the achievement gap is to confront the underlying poverty and racism that prevent students from learning. Governor Dayton was correct in proposing that the government pay for lunches for students who cannot afford them. We need to go further. We need to stabilize the family situations of many students–nutrition programs, health care, housing, and other social service programs need to be strengthened so that children and families do not have to worry about where the next meal is coming from, or where their next bed will be. We are never going to solve the achievement gap in the classroom until we address the gap that separates students before they even walk into the classroom.