The Crucible of American Indian Identity
Part 2

Native Tradition versus Colonial Imposition

in Postconquest North America

By Ward Churchill

 

The Impositions of U.S. Policy

Probably the first concerted effort on the part of U.S. officialdom to use the incorporation of whites and their mixed-blood offspring as a wedge with which to pry indigenous societies apart began in the late 1700s, when Moravian missionaries were asked to serve as de facto federal emissaries to the Cherokee Nation. Imbued with the mystical notion that white "Aryan" genetics correlated to such "innate" endowments as intellect and "moral capacity"--which in their minds corresponded with the potential to adopt "civilized" (Christian) outlooks and values--the Moravians and, after 1803, their Presbyterian colleagues "went out of their way to befriend" mixed-bloods rather than "pure" Indians while pursuing their goals of obtaining religious converts cum political allies.

Predictably, this racial bias translated into a privileging of mixed-bloods in both political and material terms--regardless of their rank within the Cherokee polity and irrespective of whether they desired such "benefits"--a situation which was quite reasonably resented by other Cherokees (most especially those whose authority was undermined or supplanted by such external manipulation). The result, obviously intended by the United States, was the opening of deep cleavages among the Cherokees that greatly weakened them in military as well as political and cultural terms, circumstances which amplified considerably the decisive advantages the U.S. already enjoyed in its drive to dispossess them of their property. Meanwhile, similar initiatives had been undertaken vis-à-vis the Creeks, Choctaws, Chickasaws and others.

Although the U.S. refrained from attempting such maneuvers during the first 30 years of treaty-making with indigenous nations--an interval roughly corresponding to the period in which the young republic, a veritable revolutionary outlaw state, desperately required the legitimation which could be bestowed via native recognition of its sovereign status (indigenous sovereignty having already been recognized through treaties with the European powers)--special provisions pertaining to mixed-bloods entered its formal diplomacy with Indians, beginning with an 1817 Treaty with the Wyandots and several other peoples of the Ohio/Pennsylvania region. Thereafter, the performance was repeated in compact after compact, at least 53 times by 1868.

In only a few instances, such as the 1847 Treaty with the Chippewa of the Mississippi and Lake Superior, in which the U.S. recognized that "half or mixed bloods of the Chippewas residing with them [should simply] be considered Chippewas," is there acknowledgment of the right of indigenous nations to naturalize citizens as they saw fit. In the great bulk of cases, such treaty provisions were plainly designed to accomplish the opposite effect, distinguishing those of mixed ancestry from the rest of their people, almost always by unilaterally privileging them in a material fashion. Usually, this followed the model established in the above-mentioned 1817 treaty, the eighth article of which provided that, while the Indians themselves would hold certain lands in common, those "connected with said Indians, by blood or adoption" would receive individual tracts averaging 640 acres each.

There were several variations on the theme. In one, exemplified by the 1818 Treaty with the Miami, chiefs as well as mixed-bloods and intermarried whites were assigned individual parcels, one to six sections each in this case, while the rest of the people were assigned a tract in common. Hence, not only were mixed-bloods figuratively elevated to the same standing as chiefs by external fiat, but the Miamis' actual leaders were implicitly linked to them rather than to their people as a whole. On other occasions, as in the 1855 Treaty with the Winnebago, missionaries were substituted for chiefs. On still others, as in the 1837 Treaty with the Sioux, money and/or other special provisions were substituted for land. Even in cases like the 1861 Treaty with the Cheyenne and Arapaho, where full-bloods and mixed-bloods were nominally treated the same regardless of rank--i.e., everyone was allotted a parcel and/or monetary award--mixed-bloods were often singled out to receive larger quantities.

In a number of instances, as with the 1857 Treaty with the Pawnee, provisions were explicitly designed to induce an outright physical separation of mixed-bloods from their people, a practice which was particularly odious in instances such as that addressed in the 1865 Treaty with the Osage, where "breeds" were the only group allowed (or coerced) to remain within a traditional homeland from which the rest of their nation was removed. In the 1831 Treaty with the Shawnee, the notion of blood quantum was first applied in a formal way to determine who would--or, more importantly, who would not--be recognized by the U.S. as a "real" Indian. Moreover, racism aside, the treaties often employed a virulent sexist bias--tracing descent, acknowledging authority and bestowing land titles along decidedly patriarchal lines even (or especially) in contexts where female property ownership, political leadership and matrilinearity were the indigenous norms--as a means of subverting the integrity of native cultures, undermining their sociopolitical cohesion and confusing or negating their procedures for identifying member/citizens.

In 1871, sensing that the capacity of most indigenous nations to offer effective military resistance was nearing an end, Congress suspended further treaty-making with Indians. There then followed a decade of reorganization during which the government shifted from what had been primarily a policy of physically subjugating native peoples to an emphasis upon assimilating what remained of them, both geographically and demographically. While there were a number of aspects to this transition-notably, the extension of U.S. criminal jurisdiction over reserved native territories via the Seven Major Crimes Act of 1885--its hallmark was passage of the 1887 General Allotment Act, a measure expressly intended to dissolve the collective relationship to land which was the fundament of traditional cultures by imposing the allegedly superior Anglo-Saxon system of individuated property ownership.

The main ingredient of the Act was that each Indian, recognized as such by the U.S., would be assigned an individually-deeded parcel of land within existing reservation areas. These varied in size, depending on whether the Indian was a child (forty acres), an unmarried adult (eighty acres), or the head of a family (160 acres). Once each Indian had received his/her personal allotment, becoming a U.S. citizen in the process, the law prescribed that the balance of each reservation be declared "surplus" and opened up to homesteading by non-Indians, turned over to corporate usage, or placed in some form of perpetual federal trust status (i.e., designation as national parks and forests, military installations, etc.). In this manner, some 100 million of the approximately 150 million acres of land still retained by indigenous nations for their own exclusive use and occupancy at the outset "passed" to whites by 1934.

The bedrock upon which the allotment process was built was the compilation of formal rolls listing those belonging to each native people, reservation by reservation. While the Act itself posited no specific criteria by which this would be accomplished, responsibility for completing the task was ultimately vested in the individual federal agents assigned to preside over the reservations. Endowed as they were with staunchly racialist perspectives, and fully aware that whatever definitional constraints might be applied in determining the overall number of Indians would translate directly into an increased availability of property to their own society, it was predictable that these men would rely heavily upon the sort of blood quantum "standards" already evident in treaty language.

In practice, it was typically required that a potential enrollee/allottee be able to demonstrate that s/he possessed "not less than one-half degree of blood" in the particular group in which s/he wished to be enrolled ("intertribal" pedigrees were seldom accepted, even for ostensible full-bloods, and the overall standard was almost never allowed to slip below quarter-blood). The upshot was that anywhere from a third to two-thirds of all those who might otherwise have been eligible to receive allotments were denied not only land but federal recognition as member/citizens of their nations. In total, government functionaries admitted to the existence of only 237,196 native people within U.S. borders by the late 1890s, of whom only a small percentage were less than half-blood members of specific groups.

To ice the cake of racialist reconfiguration of Indian identity, the Act provided that those enrolled as full-bloods would, under the legal presumption that they were genetically incompetent to manage their own affairs, be issued "trust patents" for their allotments, to be "administered in their behalf by the Secretary of the Interior or his delegate" (the latter term meaning the local Indian agent) for a quarter-century. Mixed-bloods, by virtue of their white genetics, were deemed to be competent for such purposes and therefore issued patents in fee simple. This, along with other blatantly preferential treatment bestowed as a matter of policy upon those of mixed ancestry, drove the final wedges into many once harmonious indigenous societies. In the more extreme instances, such as that of the Kaws in Kansas, the full-bloods' visceral response was to repudiate mixed-bloods altogether, demanding their elimination from the tribal roll and seeking to expel them as a body from their society.

By the turn of the century, then, virtually every indigenous nation within the U.S. had, by way of an unrelenting substitution of federal definitions for their own, been stripped of the ability to determine for themselves in any meaningful way the internal composition of their constituencies. The manner in which this had been accomplished, moreover, ensured that rifts even among those still recognized by the government as being Indians were of a nature which would all but guarantee eventual dissolution of native societies, at least in the sense they'd traditionally understood themselves. Allotment and the broader assimilation policy of which it was part had truly proven themselves to be, in the words of Indian Commissioner Francis E. Leupp, "a mighty pulverizing engine for breaking up the tribal mass."

 

Internalization

The breakup and diminishment of the reservation landbase were not the only factors leading to confident predictions that there would be no Indians culturally recognizable as such in the United States by some point around 1935. Beginning in the 1860s, there had been an increasing emphasis on "educating" native youth in the ways of the dominant society, a trend which was rapidly consolidated in the 1880s as a concomitant to allotment and other assimilationist techniques. While there were several options available--reservation-based day-schools, for example, all of them less expensive and more humane--the mode selected for delivery of such instruction was primarily that of "off-reservation boarding schools" located in places as remote as possible from native communities.

The model for what became an entire system was the Carlisle Indian School, established in Pennsylvania in 1879 by Captain Richard Henry Pratt, a man whose main qualification for the task seems to have been that he'd earlier served as warden of a military prison at Fort Marion, Florida. Following Pratt's stated objective of "killing the Indian" in each student, Carlisle and other such facilities--Chilocco, Albuquerque, Phoenix, Haskell, Riverside; by 1902, there were two dozen of them--systematically "deculturated" their pupils. Children brought to the schools as young as age six were denied most or all direct contact with their families and societies for years on end. They were shorn of their hair and required to dress in the manner of Euro-America, forbidden to speak their languages or practice their religions, and prevented from learning their own histories or being in any other way socialized among their own people.

Simultaneously, all students were subjected to a grueling regimen of indoctrination in Christian morality--mainly the "virtues" of private property, sexual repression and patriarchy--"proper" English and arithmetic, and officially-approved versions of history, civics and natural science, the latter devoted mostly to inculcating prevailing notions of racial hierarchy. To instill the "work ethic"--that is, to prepare students for the lot assigned their racial group once they'd been absorbed by Euroamerica--they were also required to spend half of each day during the school year engaged in "industrial vocational training" (i.e., uncompensated manual labor). During the summers, most of the older boys were "jobbed out" at very low wages to work on white-owned farms or in local businesses; girls were assigned as domestics and the like.

Individual native families and, often, whole societies resisted the process. In 1891, and again in 1893, Congress authorized the use of police, troops and other forcible means to compel the transfer of children from reservation to boarding school, and to keep them there once they'd arrived. Hence, despite the best efforts of their elders, and not infrequently of the students themselves, a total of 21,568 indigenous children--about a third of the targeted age group--were confined in the schools in 1900. As of the late 1920s, the system had been diversified and expanded to the point that upwards of eighty percent of each successive generation of native youth was being comprehensively "acculturated" in a more-or-less uniform fashion.

By 1924, assimilation had progressed to the point that a "clean-up bill" was passed through which the responsibilities, though not necessarily the rights, of U.S. citizenship were imposed upon all Indians who had not already been naturalized under the Allotment Act or other federal initiatives. Although it appeared as though this might represent the culminating statutory ingredient necessary to allow for the final absorption of Native America, fate intervened in a most unexpected fashion to avert any such outcome (formally, if not in terms of more practical cultural, political and economic realities). This, rather ironically, took the form of resources: the mostly barren tracts of land left to Indians after allotment--thought to be worthless by nineteenth-century policymakers--had by the late 1920s been revealed as some of the more mineral-rich territory in the world.

Loath to see these newfound assets thrown into the public domain-many had strategic value, real or potential-the more forward-looking federal economic planners quickly perceived the utility of retaining them in trust, where they might be exploited at controlled rates by preferred corporations for designated purposes (and in the most profitable fashion imaginable). This resulted, in 1925, in the recommendation by a committee of one hundred officially selected academic experts and business leaders that allotment and the more draconian objectives of assimilation policy be immediately abandoned in favor of preserving the reservations in some permanently subordinated capacity and inaugurating a policy of carefully-calibrated "economic development" therein.

This, in turn, led to passage of the 1934 Indian Reorganization Act (IRA), through which what remained of traditional native governments were for the most part supplanted by federally-designed "tribal councils" meant to serve as the medium for long-term administration of the newly-conceived internal colonial domain. Although the IRA was imposed behind the democratic facade of reservation-by-reservation referenda, the record reveals that BIA field representatives obtained favorable results by presenting skewed or patently false information to voters in a number of instances, flatly rigging the outcomes in others. And, while democratic appearances were reinforced by the fact that the government of each reorganized reservation functioned on the basis of its own "tribal constitution," the reality is that these "founding" documents were essentially boilerplate contraptions resembling corporate charters, hammered out on an assembly-line basis by Bureau personnel.

Nowhere is this last more obvious than in the language of the IRA constitutions pertaining to criteria of tribal membership. Although there are certain variations among instruments, most simply aped the then-prevailing federal quantum standard of a quarter-blood minimum, while all of them, regardless of the degree of blood required, advanced genetics as the linchpin of identity. That there was no noteworthy resistance among native supporters of the IRA to this conspicuous usurpation of indigenous tradition is unsurprising, given that they were all but invariably drawn from the ranks of those indoctrinated in the boarding schools to see themselves in racial rather than national/political or cultural terms.

With the embrace of the IRA constitutions by what were projected as solid majorities on most reservations, Euro-American definitions of and constraints upon Indian identity were formally as well as psychologically/intellectually internalized by Native America. From there on, the government could increasingly rely upon Indians themselves to enforce its race codes for it. Indeed, whenever the existence of the latter has been made a point of contention, Washington has been able to lay the onus of responsibility directly at the feet of the IRA governments it not only conceived and installed, but which remain utterly and perpetually dependent upon federal patronage for their base funding and whatever limited authority they might wield. They, in turn, defend such negation of indigenous sovereignty in the name of maintaining it. A more perfect shell game is impossible to imagine.

 

Enter the "Purity Police"

The reconfiguration and structural assimilation of the mechanisms of indigenous governance--by the early 1990s, IRA-style councils were being openly referred to as a "third level" of the federal government itself--were facilitated and reinforced not only through the increasingly pervasive indoctrination of native students via the educational system, but by lingering effects of allotment. Foremost in this respect was the "heirship problem" created by the fact that the reservation landbase had been reduced to a size corresponding to the number of Indians recognized by the BIA as existing during the 1890s, with no provision made for a population rebound of any sort.

There was no reserved land available to accommodate the fifty percent increase over the turn-of-the-century number of recognized Indians recorded in the 1950 U.S. Census. Rather than remediating the problem by transferring some portion of the lands unlawfully stripped away from native people back to their rightful owners, the government launched a massive and sustained program to relocate the native "population surplus" from the land altogether, dispersing them for the most part into major urban areas. At the same time, as an incentive for them to leave, funding for on-reservation programming of all sorts was sliced to the bone and sometimes deeper. One result is that, while well over ninety percent of federally-recognized Indians lived on the reservations in 1900, fewer than 45 percent do so today. Another federal cost-cutting measure, beginning in the mid-1950s, was to simply "terminate" recognition of entire nations whose reservations were found to be devoid of minerals, or who were deemed too small and insignificant to warrant the expenditures necessary to administer them. A total of 103 peoples, ranging from large groups like the Menominee in Wisconsin and the Klamath in Oregon to the tiny "Mission Bands" of Southern California, were unilaterally dissolved, their remaining lands absorbed into the U.S. territorial corpus and their populations effectively declared to be non-Indians before the process ran its course in the early '60s. Only a handful, including the Menominee but not the Klamath, were ever reinstated.

Predictably, rather than seeking to combat such trends, federally-installed and supported tribal councils amplified them. In the face of declining federal appropriations to Indian Affairs, they by-and-large set out to reduce the number of Indians eligible to draw upon them. Arguing that the fewer people entitled to receive benefits such as healthcare and commodity foodstuffs, or to receive per capita payments against mineral extraction, water diversions and past land transfers, the larger the share for those who remained, the councils were able to peddle their bill of goods to many--though by no means all--of their increasingly impoverished reservation constituents. In short order, the IRA constitutions on many reservations were amended or rewritten to reflect higher blood quantum requirements for tribal enrollment. In a number of instances, reservation residency was required as well, a stipulation which excluded the children of relocatees, regardless of their documentable degree of Indian blood. The council heads, through a federally-funded lobbying organization dubbed the National Tribal Chairmen's Association (NTCA), then launched an aggressive campaign to recast the definition of "Indian" in the public consciousness--and, they made it clear, in law--as being only those "enrolled in a federally-recognized tribe." Redefined as "non-Indians" in this perverse scenario was everyone from terminated peoples like the Klamaths to the unenrolled traditionals still living on and about many reservations, from nations like the Abenakis of Vermont, who had never consented to a treaty with the U.S. and were thus formally "unrecognized," to the NTCA members' own nieces and nephews residing in cities. Also sacrificed in the proposed ethnic purge were thousands of hapless children, orphaned and otherwise, whom federal welfare agencies had caused to be adopted by non-Indian families.

The government declined to adopt the NTCA's simplistic nomenclature of Indianness. Instead, it conjured up a proliferation of what by now amount to at least eighty different and often conflicting definitions of its own, each of them conforming to some particular bureaucratic or policy agenda, most sporting a larger or smaller claque of Indian subscribers queued up to defend it under the presumption that they would somehow or another benefit by their endorsement. Under such conditions, it is possible to challenge the legitimacy of virtually anyone identifying herself as Indian on one or several grounds (often having little or nothing to do with genuine concerns about identity, per se). The result has been a steadily rising tide of infighting--occasioned in most instances by outright race-baiting--between and among native peoples over the past forty years.

Things became truly pathological in 1990, with passage of the so-called Act for the Protection of American Indian Arts and Crafts, a measure which purportedly makes it a criminal offense punishable by fines of $250,000 to $1 million and imprisonment of up to fifteen years for anyone not enrolled in a federally-recognized tribe to identify as an Indian while selling artwork. Although Congress never provided the statute an enabling clause to allow its enforcement--not least because to do so would have technically required the arrest and prosecution of individuals deemed to be Indian under other elements of federal law--its very existence unleashed an utter frenzy of witch-hunting among Indians themselves. Within months, ad hoc patrols of "identity monitors" were prowling selected museums and galleries, demanding to see documentation of the "pedigrees" of the native artists exhibited therein, while freelance "Indian spokespersons" such as Suzan Shown Harjo advocated that comparable legislation pertaining to "ethnic fraud" be enacted with respect to writers, educators, filmmakers, and journalists, among others.

The theme was quickly picked up, tabloid-style, by papers like Indian Country Today and News From Indian Country, while the Internet came figuratively alive with a swarm of essentially anonymous rumors that dozens of Native America's most distinguished artists, authors, thinkers and activists weren't "really" Indians after all. Perhaps most disgustingly, a literal flying squad of self-appointed "purity police" in the San Francisco Bay Area took it upon itself in 1992 to systematically disrupt the functioning of all manner of community service organizations--everything from the native programming on radio station KPFA, to an AIDS clinic administered by the Indian Health Service, to the local school district's Indian education project--to ensure that everyone involved fit their particular notion of what an Indian should be (children as young as eight years of age were buttonholed and ordered to prove they were "genuine" Indians). Meanwhile, back on the rez, at least some IRA leaders were arguing that the tribal constitutions should be amended yet again, this time to disenroll members who married non-Indians, on the premise that such measures had become vital "to protect the purity of our Indian blood."

 

The Way Ahead

The internalization of Euro-America's conception of race by native peoples, the virulence with which it is now being manifested in all-too-many sectors of the indigenous community, and the ubiquity of the confusion and divisiveness this has generated among Indians and their potential supporters represent a culmination of federal policy initiatives originating nearly two hundred years ago. To all appearances, Native North America has been rendered effectively self-colonizing and, if present attitudes persist, it stands to become self-liquidating as well. The tale is told in the demographic data pertaining to those who are federally-recognized.

"During the twentieth century population recovery of American Indians there has been an increasing mixture between them and non-Indian peoples. Data concerning this may be obtained from the 191D

and 1930 U.S. censuses of American Indians… [In 1910] 56.5 percent of American Indians enumerated in the United States were full-blood--150,053 out of 265,682-with the blood quantum of 8.4 percent (22,207) not reported... In the U.S. census of 1930, however, 46.3 percent--153,933 out of 332,397--were enumerated as full-bloods and 42.4 percent (141,101) were enumerated as mixed bloods, with the d of Indian blood of 11.2 percent (37,363) not reported. Thus, whereas the American Indian population size increased by slightly over 66,000 from 1910 to 1930, the number of full-blood American Indians increased by only 4,000; most of the increase was among mixed-blood Indians."

Such trends have not only continued but accelerated. By 1970, approximately two-thirds of the marriages of those on the tribal rolls were to people who were not, with the result that only 59 percent of births reflected a situation in which both parents registered themselves as possessing any Indian blood at all. The number of supposed full-bloods has thus dropped to almost nothing--among populous peoples like the Minnesota/Wisconsin Chippewa they now represent only five percent of the whole--while the proportion of mixed-bloods has climbed dramatically. At present rates of intermarriage, the segment of the federally-recognized native population evidencing less than one-quarter degree blood quantum, presently less than four percent, will have climbed to 59 percent or more by 2080. To tighten or even adhere to quantum requirements in the face of such realities is to engage in a sort of autogenocide by definitional/statistical extermination.

Some smaller peoples, like the Umatillas in Oregon, have already undertaken to preserve racial cant while offsetting the consequent prospect of definitional self-extinguishment by proposing revision of their constitutions to require that future enrollees demonstrate some degree of Umatilla blood, no matter how minute, in addition to "at least one-quarter degree of blood ... in another federally-recognized tribe or tribes." Left conspicuously unexplained in such convoluted formulations is exactly how being a quarter-blood Lakota or Mohawk supposedly makes a person one whit more Umatilla than does being a quarter-blood Irish, Ibo or Han. Conversely, no explanation is offered as to why a person genealogically connected to the group, but lacking some sort of generic "Indian" genetic structure, would be any less Umatilla in orientation than a person who possessed it.

The implications of such nonsense become most striking when it is considered in juxtaposition to the actual--rather than federally-recognized--size of the present indigenous population of the United States, and the potential power deriving from its scale. Jack Forbes, perhaps the closest examiner of the issue, has noted that since 1969, "the Bureau of the Census, conspiring with the Office of Management and Budget and political special interests, has [deliberately obfuscated] the 'racial' character of the U.S. population and, as part of the process, has 'lost' some six to eight million persons of Native American ancestry and appearance with a scientifically useless 'Hispanic/Spanish' category. In addition, [seven million or more] persons of mixed African and Native American ancestry remain uncounted as such because of the way census questions were asked and the answers tallied."

Forbes estimates that, even using standard blood quantum criteria, the actual native population of the "lower 48" in 1980 was well over fifteen million rather than the 1.4 million officially admitted by the Census Bureau. Employing traditional indigenous methods of identifying population rather than racial criteria per se would have resulted in an even higher number. And, as of 1990, when the official count reached nearly two million, inclusion of these most rapidly growing sectors of the native population results in an aggregate of as many as thirty million persons overall. The ability to wield the political and economic clout inherent in the latter tally, as opposed to the former--which comes to less than .5 percent of the overall U.S. population--is self-evident.

Fortunately, there is at least one concrete example of how things might be taken in the direction of realizing this potential. The Cherokee Nation of Oklahoma (CNO), in its 1975 constitution, took the unprecedented step, still unparalleled by other twentieth-century indigenous governments, of completely dispensing with blood quantum requirements in its enrollment procedures and resuming its reliance upon a more traditional genealogical mode of determining citizenship. This had the effect of increasing the number of persons formally identified as Cherokees from fewer than 10,000 during the late 1950s to slightly over 232,000 by 1980 (and about 300,000 today). On this basis, the Cherokees, whose reservation was dissolved pursuant to the 1898 Curtis Act, have been able to assert what amounts to a split jurisdiction over their former territory. Moreover, while much has been made by assorted race-mongers about how this course of action was "diluting" whatever was left of "real" Cherokee culture and society, the precise opposite result has obtained in practice.

Plainly, in and of itself, the CNO initiative has not ended the internecine bickering over identity which has precluded anything resembling unity among native people, much less established the basis upon which to free even the Cherokees from internal colonial domination by the U.S. It does, however, represent a substantial stride in the right direction. If the model it embodies is ultimately seized and acted upon by a broadening spectrum of indigenous nations in the years ahead, the tools required for liberating Native North America may at long last be forged. In the alternative, should the currently predominating racialist perspectives associated with the IRA regimes prevail, the road to extinction can be traversed rather quickly.