City College of San Francisco: down but not out

Ask the students, ask the faculty, ask anyone, really, who knows the institution: City College of San Francisco is one of the best community college systems in the nation. Just don’t ask the Accrediting Commission for Community and Junior Colleges, which last week chose to revoke the college’s accreditation for a host of reasons having everything to do with politics and finance, and little to do with the actual quality of education offered.

Good community colleges make good communities. They serve as vital centers not only for students actively working towards degrees, but for a broad swath of the population looking to uncover new talents and hobbies, improve professional capacities, stay sharp and engaged later in life, or simply learn for the joy of learning (and because, as studies show, learning about anything helps the brain grow better at everything).

The loss of accreditation becomes effective next year, and though the decision is being appealed, it is a tremendous and unfair blow to an institution serving 85,000 students, one that has been struggling to resolve funding issues, as I've already written, in the face of California’s ever-evolving budget crisis. In fact, finances have been at the root of the problem since even before City College was first put on sanction by the commission back in 2012, around the same time President Obama’s State of the Union address slashed at community education with a double-edged sword:

"States also need to do their part, by making higher education a higher priority in their budgets," Obama told the nation. "And colleges and universities have to do their part by working to keep costs down." The policy outcome of this practical-sounding sentiment is what’s known in education circles as the “completion agenda,” which emphasizes numbers-based evaluations and focuses on providing services to students seeking to complete a degree – to the detriment of broader learning and community-based initiatives.

The nitty-gritty of Obama’s education policy forced local accrediting agencies to comply with the new agenda – ironically using affordability as a measure by which to determine accreditation, which is necessary for colleges to receive state funding (in addition to dictating whether students can apply for federal loans, apply credits earned to other institutions, and graduate with a generally recognized degree). So institutions undergoing financial hardship, regardless of the quality of their services, were subject to losing their state funding, resulting in – that’s right – financial hardship. 

The irony of the situation is that it is driven by a closed-minded focus on economic development — one that equips students with the bare minimum of skills they need to enter the workforce as quickly as possible. But this is only one model of economic development, and a flawed one at that. Developing whole, well-rounded individuals and fostering educational and cultural linkages among community members at all levels of education is the better, more far-reaching economic goal. And most importantly, the benefits can be reaped even before students hit the workforce. By treating the college as a community hub – attracting people for lectures, events, classes – and focusing on its impact on public space and community cross-pollination, economic gains become generative and enduring, like in this arts-focused re-imagining of an Oregon community college. It's an obvious next step from the 21st Century (21C) paradigm for K-12 education, an integrated education model developed at Yale University, already at play in 1,300 schools across the country. Replacing the community-centered function of these institutions with a narrow economic agenda, as even well-meaning and progressive prescriptions tend to do, further marginalizes and alienates already disadvantaged students, robbing them of the chance to mingle and collaborate with a wider swath of the population who may use the institution, despite already having achieved a high level of academic success, for enrichment and continuing education. And in removing enrichment-type classes from the curriculum (dance, art, general interest and humanities classes), community colleges not only suffer the loss of students of diverse academic backgrounds, they further restrict marginalized students from the benefits of a rich and broadly integrated education.

The completion agenda goes hand-in-hand with other outcome-focused initiatives like Obama's controversial "Race to the Top" program and the wider outcry to shore up performance in STEM subjects – a call to arms with similarly narrow economic underpinnings. Rather than focusing solely on predicting needs and frantically bolstering education for a changing job market, we can build strong, diverse educational communities (Sanford C. Shugart, president of Valencia College in Florida, calls this the "educational ecosystem") that emphasize creativity, non-linearity and strong analytical thought processes, which are adaptable to any shift in the job market. But not if we undercut our educational system rather than taking advantage of all it has to offer.

It’s not a new story for City College. It’s just one more symptom of an education system – and a larger society – which shortsightedly emphasizes quantitative over qualitative analysis, discounting the benefits of an engaged and educated community, both economically and, more importantly, socially.

Get involved at: www.saveccsf.org

City College Task Force to redefine 'Success'

My best friend from back East recently told me about NBC’s television show Community, now in its third season, which is set at a Colorado community college and tells the story of an offbeat group of students: a lawyer suspended from practice, an aging millionaire, a straightedge and straight-A student with an erstwhile Adderall addiction, a football star, a single mother, and so forth.

If I were on the show, I guess I’d be the studious and serious Annie Edison – not because an addiction to prescription study aids caused me to have a nervous breakdown and jump through a plate-glass window, but because I am not, according to a task force assembled to review educational and financial policy at City College of San Francisco, the community college "type."

I take classes at City College, despite already having a degree from a well-regarded university back East, and I love it. I have amazing teachers whose dedication to their jobs measures up to anything I experienced during my undergraduate career. But if the Student Success Task Force’s agenda passes in Sacramento this spring, my access to these opportunities will be targeted — along with that of many other students who don’t fit into the task force’s streamlined model of successful community education.

The SSTF has assembled an eighty-page document recommending that sweeping changes to the funding model of California’s community colleges be passed in the state legislature. The intention is to make more funding available for “typical” community college students – those on the fast track to their Associate’s Degrees or to transfer to a four-year institution – but the point (one that NBC's comedy makes lavish use of to draw its laughs) is that community college students are rarely typical.

The recommendations are meant to support full-time students, but even among students who have the same goals in mind as the task force – an AA or transfer – the ability to attend classes full time is rare. Many students can only take a partial load because of work or family obligations; students struggling hardest to make ends meet, working multiple jobs, are those most in need of the funding the SSTF would deny part-timers. In addition, there are those who want to improve their skills in order to find work or do a better job in the work they already have — goals which will ultimately serve to boost the state economy, which is, of course, where the motivation for the task force’s recommendations lies in the first place. There are older citizens looking to stay sharp and expand their horizons, there are high school students seeking enrichment — and yes, there are those, like me, who are simply there to be educated. After all, that’s the whole point of a “community” college in the first place, right?

In addition to eliminating state funding for any student not transferring to a university within a strict two-year deadline (regardless of that student’s residency),  the report recommends eliminating non-credit courses, creating a one-size-fits-all placement test system, and cutting down on any course offerings which don’t feed directly into a degree-granting program.

These changes would not only be detrimental to students who see ongoing education as a vital part of a fulfilling life, and to professionals seeking to develop their skills, but to the degree-seeking students themselves. They would lose the opportunity to interact with a wide range of students from all sorts of educational and professional backgrounds. They would lose the opportunity to supplement their core courses with a wider and more enriching curriculum, and they would lose the opportunity to participate in a system of community education that values learning for learning’s sake – not because a degree or a job depends on it, but because it makes us better, fuller human beings.

The Path of the Urban Indian

On tramping through the woods as a kid, reconciliation, and the new urban frontier . . .


To explain where the name “Urban Indian” came from — and why, as a white, Jewish girl lacking a speck of Indian blood, I feel the right to claim it — we'll have to start with a childhood rooted in the not-so-vast wilderness of suburban Cleveland, Ohio.  Growing up, my family’s primary signs of affiliation with the Tribe were parsimony, unmanageable curly hair, and a love of things fried in the name of religion.  So though I was given a Hebrew name — Shoshanna, which means lily — it was not to honor a dear departed relative (as per Jewish tradition) or because anyone had visions of a delicate retiring beauty, tinged with a pale blush.  Which is good, because I turned out to be more often brash, bruised, and unlikely to be tinged with anything but dirt.

No, my mom named me Shoshanna because she thought it sounded like the Shoshone people of the Western United States — the tribe from which Sacajawea, the celebrated interpreter for Meriwether Lewis and William Clark, was captured as a youth.

In my own youth, I was quite happy to live up to the romantic image of what I thought my name connoted: I tramped through the woods, caught tadpoles, built houses from sticks, and coerced my young playground-mates to grind up rocks of different colors, mix them with water, and paint their faces.  I was astounded to meet my literary double in Sharon Creech’s Newbery Medal-winning book Walk Two Moons — the main character Salamanca is so named because her mother mistook it for the name of her great-grandmother’s tribe — the Seneca.

I shared with Sal a well-meaning if misguided vision of what emulating my namesakes might entail: treading lightly upon the earth, living frugally off the fruits of the forest, and perhaps even casually passing the time of day with wild beasts both fearsome and cuddly — in their own language, of course.

All of this is surely to the consternation of plenty of real American Indians, and the scholars, Indian and not, who strive to understand and preserve their history. But then, it is also much in keeping with a long tradition of pasty and bewildered Europeans on this continent — one that we struggle, even today, to address and remediate.

Take the word “Indian” itself.

It was used, generally unchallenged, well into the 20th century (the American Indian Movement was self-christened in 1968) despite the fact that the term was long understood to be a misnomer. When Columbus bumped haplessly into the Antilles, he’d been looking for the "India" he knew to be the source of valuable spices; the name was often applied to the entirety of South and East Asia, and on some maps of the time, it referred to basically anything that wasn’t Europe.  Needless to say, the “Indians” were so-called upon the mistaken belief that Columbus had hit his intended mark, and the name stuck.

After the civil rights movements, “Native American” became the preferred term because many felt that “Indian,” in addition to being the result of serious geographical discombobulation, had accrued an unshakeable set of pejorative undertones.

But then, “native” has a storied and troublesome past of its own, raising plenty more objections, which have spawned a plethora of additional phrases, each accepted by some and deplored by others to the point where we’re tongue-tied with the task of disentangling our language from the social histories it preserves and, indeed, generates. In such a context, using the term Indian to describe my own urban wanderings may seem frivolous at best. But stay with me.

The battle over terminology is a valid one.  Our language patterns fossilize old power structures, but also create a template for the construction of new ones — often with far-reaching, if not immediately obvious, effects.  

In this country, one of the most enduring examples of this phenomenon has been described by geographer William Denevan as the Pristine Myth. It is the idea that the pre-Columbian Americas were "natural" — empty and untouched, save for a mere smattering of natives who stirred nary a leaf, living in a prelapsarian paradise free from the ills of modern mankind — wait, that sounds a little familiar.

So what exactly is the problem with the romantic, if naive, bent of this fantasy that so stirred me as a child?

One problem with this idealized vision is that it is simply wrong. Denevan and others have fully discredited the notion, showing how the landscape European explorers first marveled at was, in fact, extensively shaped by human populations — in numbers far exceeding what scholars would generally acknowledge for the next several centuries.

But the Pristine Myth goes beyond a merely whimsical re-rendering, supporting a number of crucial biases: the view that the land was unused and there for the taking, and the suggestion that the decimation of Indian populations following European settlement was less extreme than it actually was. It fails to acknowledge the very real needs and desires of native populations who, historical accounts have shown, traditionally warred with other tribes, stole from each other, hunted some animal populations so unsustainably that they at times risked starving themselves, and made eager use of technologies from Europeans (like guns, metal, and horses) that made their lives more convenient, even at the expense of the natural world. 

The false dichotomy between greedy, disruptive white man and the noble savage creates a difficult paradox for indigenous people today, whose political will may run counter to our deep-seated romantic notions.  Furthermore, this sort of schism between reality and fantasy makes it difficult, despite even the best intentions, to remediate troubled pasts, by falsely invoking some identifiable point from which we can measure the damages, and some irreproachable ideal to which we can return.

When John Muir famously effused on the beauty of the Yosemite Valley, the “natural” churches that so moved him were grassy meadows that had been maintained by very real, very extreme human intervention. Over time, environmental advocates seeking to uphold Muir’s dedication to preserving the immaculate landscape found themselves with quite a dilemma: their ardent protection of the lands had, in fact, begun to destroy the very scenes which drove Muir to such heights of religious ecstasy. Without the periodic burning practiced by native tribes, the grasslands had succumbed to the natural cycle of forestation, and trees were creeping in and squeezing out those lovely sun-filled meadows. Now the movement was in a pickle. Restore the land, yes, but restore it to what? Muir’s vision? A truly wild state? What type of intervention was preservation, and what was degradation?

Humans and landscapes have complex, intertwined pasts, just like humans and other humans. Though understanding these pasts can be key to healing them, there may be times when finding footing amid forever-shifting historical accounts is less important than taking stock of where we are now and figuring out how to go from there — even if it means approaching bigger questions of truth and justice by addressing more granular, even prosaic, matters first: are we enabling a diversity of species to thrive? Preserving cultural autonomy and promoting economic prosperity for those who need it most? Are we holding extinction at bay? Advocating for clean air and water?  Leaving something breathtaking for our children to see? 

One way forward may be to borrow from the emerging study of political ecology, which seeks to understand the complex interplay between socio-economic structures and the natural and cultural landscapes in which they are set. One of the field's leading voices, Paul Robbins, has introduced the concept of the Hatchet and the Seed: a methodology that emphasizes not only exposing and pruning away — deconstructing — the myths that prop up many of today’s undesirable social and environmental realities, but also using our knowledge to plant new and better realities.

Political ecology can be a practical means to explore the legacies of, and make amends for, the complex histories between postcolonial cultures and the populations they have marginalized in the quest for land and resources. But it is a field that aims for a moving target — a productive, sustainable relationship with nature, and with each other, is best described as a dynamic equilibrium, a balancing act that requires constant reevaluation, innovation, and compromise.

This is not to negate the very real need for serious inquiry and reparation in this country — many have called for a Truth and Reconciliation Commission modeled after the one that followed the dismantling of apartheid in South Africa, and I don’t think it would be a bad idea. But we must also recognize that talking about the past — reconstructing our notions of what happened or refining the terminology we use today — will not alone move us forward.

In Twelve Step programs, there is the concept of living amends. Some things you just can’t apologize for, and you can’t truly take back. Instead, you must simply change how you live, do better as you go along. And — this is where I’m sure to get myself into trouble — I say, if holding onto our romantic notions, just a bit, helps us do that, then why not?

Increasingly, experts in fields ranging from architecture to zoology are making the case that cities hold the key to a more just and sustainable future. While they have always created efficiencies in housing, transportation, and energy consumption, it is the relative novelties of electricity and, oh, say, sewage treatment that have made them infinitely more livable than they were even a short hundred years ago.

Meanwhile, those aspects of life affecting population and environment on a global scale that have always fallen under rural purview — agriculture, for instance — are increasingly manipulable by city-dwellers, either indirectly through choice in the marketplace, or directly through burgeoning interest in urban foraging, home cooking, preserving, community-supported agriculture, and the thriving urban homesteading movement.

We needn’t each of us own a hundred acres in the woods — indeed, this would be the biggest ecological and political disaster of all — to honor the values we ascribe, accurately or inaccurately, to the first people of this land. “Real” or not, those values, applied today, can help overcome the obstacles of land and resources that have germinated over time into stories of oppression and destruction.

Frederick Jackson Turner famously expounded on the importance of the frontier in defining the American spirit of gumption and pluck. He delivered his thesis at the 1893 Chicago World’s Fair (officially, the World’s Columbian Exposition, held to honor the 400th anniversary of Columbus’s “discovery” of America), at what he saw as the end of an era. In the 1890 Census, just three years earlier, the US government had declared that there was no longer a discernible line between civilization and the savage, untamed wilderness — the frontier was dead.

Since then, many things have been dubbed “the new frontier,” not the least significant among them urban America itself. The retreat from this country’s urban centers in the second half of the twentieth century did, undeniably, leave untamed jungles of despair in many places. In this context, the term “urban frontier” has been used for many purposes. Some have been brash and mercenary, resulting in the wholesale re-development of city districts where another solution may have sufficed. But many have been less dubious, and even valiant: sensitive, fine-grained approaches to urban renewal, incentives drawing energy back to the urban core, and the many not-insignificant grassroots efforts to create urban culture and prosperity at street level.

Amid this new wave of exploration, we might do well to think of ourselves not just as Urban Settlers, but also as Urban Indians. Cities are ecosystems, webs as tenuous as any forest or meadow, and change — plucking any string — can have dramatic results, both good and bad. It is crucial that we understand issues like gentrification, attendant to this new wave of Manifest Destiny, and recognize the impacts upon all city dwellers — after all, urban landscapes, even those ravaged by disinvestment and disenfranchisement, are no more “empty” than were the forests and plains of the frontier. “Urban Indian” is my nod to the past, a reminder to stay humble, move slowly, tiptoe where necessary, through landscapes that each have a unique ecology, a matchless balance of triumphs and challenges.

And, problematic as the term may be for some, I think there is something deeply evocative about it. “Urban Indian” calls upon us all to live — among skyscrapers or suburban lawns — more like we imagine Indians of the past living among the trees: with creativity and parsimony, attention to our surroundings, a sense of pride and self-reliance, abundance wrought by using what is available and wasting little, and above all, joy in simple things and in each other.

As for why this blog isn’t called “Urban Native American”? I’ll let Sharon Creech answer that one through Sal’s mother, the character who shares so much in common with my own mom: “My great-grandmother was a Seneca Indian, and I’m proud of it. She wasn’t a Seneca Native American. Indian sounds much more brave and elegant.”