Author Archives: keeneshort


About keeneshort

I am a writer in Southern Indiana.

For Me, the Year is Only Half Over

I won’t be making New Year’s Resolutions on January 1. To be honest, I never have, but not because I’m against resolutions. It’s because for me a new year won’t begin on January 1. As long as I can remember, I’ve never marked new and old years by the Gregorian calendar. These twelve-month chunks don’t reflect my own endings and beginnings. Instead, I’ve always marked years by the academic calendar.

I count school years instead of Gregorian years because summers have always marked the major changes for me: every June I leave behind classes and teachers and prepare to meet new ones in August. Friends graduate and leave, relationships end, and the next school year offers new possibilities. The end of 2016 means nothing to me. It’s still winter, I’m still in grad school, I’m still 24. What will actually change tomorrow?

Now, while folks wallow in the regret of not fulfilling their 2016 resolutions, I still have six months left until I have to wallow in regret, and even then I have the whole summer to do my wallowing. I have plenty of time to not get in shape and not get published in The Paris Review.

I also have half a year left to finish my MA, improve my teaching, become a regular at a bar where everybody knows my name, and find inner peace. Piece of cake. Then, in summer, I can start the next year fresh and accomplished. I still don’t know where I’ll be next year, how many publications I’ll have, and whether or not I’ll have to cope with martial law, but that’s fine, because I still have half a year to figure it out.

For the rest of you folks celebrating 2017 like it somehow means something, I wish you a Happy New Year. For me, though, kindly hold your New Year’s wishes until summer. The weather will be nicer then, anyway.

Peace,

-jk

Coming Home for Christmas After the Boston Tea Party


The Destruction of Tea at Boston Harbor, by Nathaniel Currier, 1846, Hand-Colored Lithograph

On December 16, 1773, the Sons of Liberty checked their phones for messages about the plan. Some Tweeted about it as they crept on board the British ship; others posted Instagram pictures of the tea crates they dumped into the Boston Harbor, one after another. #coffeefromnowon. #revolution. #dumptea. Throughout the night, several Sons posted updates on the SoL Forum. Meanwhile, crate after crate of imported tea splashed into the salty, frigid water.

John Adams live-tweeted the affair with considerable criticism, but a new hashtag surfaced: #sitdownjohn. Frustrated, he stayed inside while the protest unfolded. Several Native American pages posted their own frustration that the Sons of Liberty were dressing up as Mohawks, pointing out the inaccuracies and retribution the British might take against them, but the protest continued unabated. Some tagged King George in their posts.

The next morning, King George deleted his Twitter account, then reopened it again to post “Not cool” several times. The Sons of Liberty felt like they had accomplished a good shaming.

A week later, Sons and Patriots returned home for Christmas. The media expressed a disorganized uproar about the protest, with Loyalist blogs calling the Sons of Liberty terrorists and the Sons of Liberty tagging everything #donttreadonme and #goteabagyourself. Some Sons returned to divided families: a Loyalist cousin here, a Quaker moderate in-law there.

It was particularly awkward at the Adams Christmas Party. Refusing to yield his position, John spent the entire time standing up, while his cousin Sam spent his time in a corner liking and retweeting every post of a tarred-and-feathered British tradesman. John called it grotesque of him to like so much shaming; Sam told him to stop shaming him for his views. Sam pointed out that John defended the Red Coats after the Boston Massacre three years earlier, calling him out for defending people who killed Americans; John called out Sam for passively defending a whiny group of protestors. Meanwhile, Abigail Adams drank whiskey in the billiard room and thought very seriously about tarring and feathering both John and Sam. She was, after all, ashamed of both of them. They liked the shock and awe of sharing listicles reinforcing their stances, like preaching to two different choirs. “Ten Horrible Things King George Has Done in Ireland,” “Nine Ways the Revolution Fails at Intersectionality,” “You Won’t Believe the Feathers on This Loyalist Cuck.”

Abigail had visited a Boston general hospital weeks earlier after a tax collector she had befriended was tarred and feathered at the docks. She remembered the way the hot tar stuck to his skin, the difficulty of pulling it off, the way it stuck to doctors who tried to remove it, making him untouchable, unapproachable. He refused to speak to Abigail for her husband’s politics, and instead stared at the ceiling while doctors treated his burn wounds.

Sam called John a feisty little tea drinker, and John called Sam a caffeinated warmonger. They were on the verge of tarring each other right there at the party, and if they did, Abigail knew that she would pull the dried tar from both morons while they lay side by side, listening to each other’s crying. Even that, she posted on Tumblr passive aggressively, wouldn’t get them to meet one another halfway.

-jk

Reflections on a First Semester of Teaching

To the astonishment of many, I finished my first semester as a graduate instructor, and I now have a break from graduately instructing people. I have ambitious writing goals for the break (two new stories, four revisions, eight submissions), and I intend to stick to those goals (not just because my nonfiction instructor challenged me to email her if I succeeded). Now that I’ve submitted final grades, I have time to think about my first time being fully responsible for teaching forty-six people to write arguments.

I still mostly don’t know what I’m doing, but I’m learning and have learned plenty, and I now know what not to do (mostly). Even with a syllabus, plans change, and even when I realize a lesson plan is about to fail (much like hope or democracy) ten minutes into class, I still have to go through with it. Teaching is a kind of theater, and I can hide my uncertainties about a lesson plan well enough.

I should be honest with my students, but not too honest. Teaching is still theater, but actors bring pieces of themselves on stage when they perform, even in subtle ways. I don’t want to be a mysterious professorfiguredude, because I’m not. I’m a graduate instructor trying to figure out the mechanics of a syllabus and how to factor in participation. I should be honest with my students if I make a mistake, and I expect the same from my students (and despite this semester’s rough patches, I still have high expectations).

A good cohort makes teaching easier, and not just because it’s lovely to have a group of friends with whom I can praise and complain about students, plan lessons, work on assignments, and stay motivated. It also helps to have people who need to stress-drink as much as I do.

A bad lesson plan does not make a bad semester, and I often have a hard time remembering that. Mistakes might feel worse and worse as the semester goes on, but it helps to remember that over Winter Break, students will forget most of them, and in a few years I probably will too.

Hypocrisy is inevitable, and that’s also okay. I’m a quiet student, and when confronted with a class of people who, like me, are very quiet, I’m forced to speak more, because avant-garde pedagogy in which students and teachers sit in a room silently meditating on a reading is very uncomfortable. It’s hard to fill fifty minutes three times a week with discussions and lectures, and it makes me want to apologize to all my professors for having been such an aggressively quiet student.

A new semester means a new syllabus, which means countless more ways to make mistakes and learn, but now I know what to expect.

-jk

Genre, Nostalgia, and The Love Witch

I don’t normally do film reviews/analysis on this blog, but a recent viewing of The Love Witch with its “aggressive strangeness,” as a friend described it, warrants a closer look.

Anna Biller’s 2016 The Love Witch begins as a parody of late 1960s/early ’70s sci-fi/fantasy sexploitation B movies such as Barbarella (1968) and A Touch of Satan (1971). Biller’s film establishes a concrete cinematic nostalgia that it then goes to great lengths to critique.

Spoilers and such: The main character, Elaine, is a witch utilizing her witchiness to seduce men in her supposedly endless search for love. After having killed her husband, Elaine quickly finds a new partner (Wayne) and uses her excessive witchitudes to convince him to take her to his getaway cabin. There, she gives him a “potion,” which causes him to “feel love” too intensely and die. (Audiences can recognize that taking beverages from strangers is also a possible way to die). Elaine attributes Wayne’s death to men being unable to cope with their emotions, and moves on from there, as we all should when someone feels emotions to death.

When Elaine finds Griff, “the right one,” the film has veered from its established genre, becoming at different points a Hallmark romcom and a buddy cop drama. Disappointed that he fails to feel love for her (and also happens to be the cop who finds that she is guilty of “loving” people to death), Elaine stabs Griff in the chest. The film ends with a disturbingly quiet fantasy of Elaine marrying Griff in a Renaissance setting, interrupting the campy tone and ending with a serious meditation on the consequences of the film’s own logic.

The Love Witch exploits viewers’ nostalgia for a unique cultural moment that existed only after the emergence of birth control and before the HIV/AIDS epidemic, an era that felt like it was going places and might have were it not for the sudden death of hope that came with Reagan and the moral majority.

Elaine’s treatment of men is cynical and essentialist, and the film’s male characters buy into it just as much as she does. Audiences are meant to see her views as a product of her frustration with social expectations for happiness and monogamy, and she turns to witchliness to prevent further disappointment. Witchcraft here functions as a cult-like ideology: Elaine seeks improvement in relationships and cedes her agency to something beyond her (a program/product/cult), but as a result of that program/product/cult, she only ends up killing people (which, to no surprise, ends up hurting her relationships). Witchcraft is a stand-in for any commodified, pre-packaged self-help ideology, such as Sedona’s vortexes, Scientology, or the books of Rob Bell (all of which, I’m sure, have resulted in someone’s death).

Marriage appears most colorfully in a scene at a Renaissance fair, pulling back the curtain on Elaine’s personal investment in the program/product/cult. Elaine, like many Americans, is drawn to witchcraft simply for the nostalgia of a pre-globalized Europe, one without the intrusion of Christianity but all the aesthetics of a Pagan religion without the human sacrifice and patriarchy. All the fun without the historicity. Tolkien made even whiter and somehow less sexy. Halloween ruined by Medieval and Renaissance Studies majors. The list goes on. The film critiques the audience for participating in nostalgia for a style it portrays as commodified, pre-packaged, and self-consuming.

The Love Witch dissects the way many Americans imagine the sexual revolution, which existed between two periods of extreme repression, the 1950s and 1980s, and captures the strangeness of that moment of hope while simultaneously undermining it just as violently as it was subverted historically.

The film’s generic experimentation acts as a mechanism exploiting viewers’ nostalgia. The Love Witch tricks audiences the way time and politics often do, by taking its viewers on a trip they never signed up for but feel unable to step out of, forcing them to walk away with questions and pretentious blog post ideas about witch hunts and the 1970s or something.

-jk

American Discourse and Islamic States

In contemporary American discourse, the ways we talk about Islam and the Muslim world tend to be limited. The phrase “Middle East” has become synonymous with Islam in the American imagination. In recent years, the self-proclaimed “Islamic State” has dominated western discourse about a large and malleable region of the world, but the concept of an Islamic state has appeared in numerous other historical moments, warranting a more nuanced understanding of the phrase.

Edward Said points out that “before the sudden OPEC price rises in early 1974, ‘Islam’ as such scarcely figured either in the culture or the media. One saw and heard of Arabs and Iranians, of Pakistanis and Turks, rarely of Muslims” (36). Discussions of nationality and ethnicity were practical for American discourse. Economically and politically, American discourse began homogenizing these polities under one overarching category: Islam. Oil price changes, revolutions in Iran, protests in India, and socialism in Afghanistan in the 1970s and 1980s slipped away as Americans perceived dozens of countries as simply “The Middle East.”

The concept of the Islamic World actually has its roots in Medieval Islamic thought as the dar al-Islam, or the abode of Islam, which was a (most likely idealized) view of the Medieval world in which a Muslim could move freely throughout regions with Muslim rulers, ranging from Spain to the borders of China. The dar al-Islam was not a state, but a conceptualization of territories.

An older article in The Atlantic defined a Caliphate as “an Islamic State,” which is a historically insufficient definition. Nation-states emerged in Europe as a result of geographic borders solidified by absolutist monarchs who dictated what qualified as citizenship, namely religion, taxation policy, and loyalty to the crown. As European nations and colonies swept aside absolutism and attempted to create secular liberal republics, the concept of the state as a geographic fence with a common language and fiscal arrangement remained the same: a homogeneous block of identity.

Thomas Barfield calls this the American Cheese model of statehood, and uses Swiss Cheese as a metaphor for premodern regions of Central Asia such as Afghanistan (Barfield 67). Rather than a solid block, polities were porous, malleable, and not always ruled through and through by a dominant king or ideology. This is true, I think, of what most Americans call the Middle East. It is largely Islamic, but it is far from homogeneous. The relationship between citizen and state often differs from the easy system many Americans paint onto the world, trying to mark which populations are with us or against us. The U.S. and Pakistan have more in common historically, as republics formed from anti-British/anti-colonial independence movements, yet the U.S. has a better working relationship with Saudi Arabia, an oppressive regime that likes to bomb its neighbors and censor its people. (Maybe the U.S. has more in common with Saudi Arabia than I’d thought).

Likewise, the Caliphate did not function the way we often think state-religion relationships function today. The nineteenth century Egyptian reformer Muhammad ‘Abduh wrote that Muslims never experienced “something that resembled the power wielded by the Papacy of Europe, nor were they ever exposed to a Pope-like figure who could and did exert power to remove Kings and banish princes, extract taxes and decree Divine laws” (Haj 93). Writing at the turn of the twentieth century, ‘Abduh was describing the historical record accurately. Caliphs were not believed to rule the way Popes and monarchs claimed to, as infallible and acting as spokespeople of God to his otherwise hapless subjects. This is not to say that Caliphal rule was always just, but suggests that religion and state in the Islamic world grew up functioning alongside one another, but never competing with one another for control.

For most of Islam’s history, the initial Caliphate “remained head of the umma [community of believers] and a symbol of Muslim unity” but “would represent the administrative and executive interests of Islam while the scholars and Sufis defined Islamic religious belief” (Lapidus 102), and even that diminished as the Caliphate moved around, ending up in the Ottoman Empire where, after World War One, it was officially abolished. Smaller caliphates appeared every so often, but the use of the phrase “Islamic state” to describe a caliphate is too simplistic, because for much of history the Caliphate represented the separation of Islamic doctrine from political administration, at least in theory.

As such, the concept of a secular state grew up differently than it did in the west, perhaps with a greater dissonance. A single glance at the United States today, which passes laws about abortion based on religiously inspired magical definitions of personhood, suggests that we have yet to actually implement the separation of church and state.

Depending upon what is convenient for media and politicians, the Middle East contains parts of Africa, the Arab world, and Central Asia. If used literally, the Muslim World should be expanded to include China, Russia, the Caucasus, Southeast Asia, the Balkans, and regions of the Western Hemisphere where African Muslims were forcibly shipped during the Atlantic slave trade. The majority of the world’s Muslims are in Indonesia, not western Asia. The Islamic World is neither unified nor homogeneous, and instead encompasses a broad spectrum of religious, philosophical, and political discourses.

When Americans talk about the Islamic world, they typically think only of the Arab world plus Iran, because, as Said points out, it became convenient for Americans to think of themselves as persecuted by a collective polity (Islam) during the 1970s and 1980s. Violent extremists exist within a unique historical context; their crimes are not justified by that history, but they should nevertheless be understood as stemming from particular origins. It is neither useful nor intelligent to homogenize one billion people. States are intrinsically porous and malleable; Americans should recognize that this applies to the U.S. as well as the rest of the world.


Barfield, Thomas. Afghanistan. Princeton University Press, 2010.

Haj, Samira. Reconfiguring Islamic Tradition. Stanford University Press, 2009.

Lapidus, Ira. A History of Islamic Societies. Cambridge University Press, 2002.

Said, Edward. Covering Islam. Random House Vintage, 1997.

The Novel That Wasn’t (But Will Be)

The last time I wrote anything for NaNoWriMo this year was November 8. After November 9, I mysteriously lost interest in a genre-bending crime-western about four elderly women who witness a murder and can only recall the gritty details of a bad acid trip they had together in their college days in the late 1960s.

I still have that overwhelming disinterest now, as I apply for more graduate schools in the humanities, an area continually asked to justify its existence to university administrators who want higher salaries for themselves at the expense of faculty and student budgets. We’re constructed as the enemy, put on watchlists by paranoid Internet users, and constantly reminded that our pursuit of art is a waste of time in a fastpacedgrabitall economy. What good is an MFA to a post-apocalyptic society struggling to save the last bee colony? What good is a genre-bending novel to a pipeline oil leak? In a few years, will we even be publishing novels?

So I put it aside. I was also busy studying and teaching. I want to enjoy the rest of my education; I don’t know if I’ll have an opportunity to enjoy it again. It’s a lot of work for little, if any, profit. I’m lucky I have no student debts, but every time I look at the news, I can’t help but feel that I’ve squandered my education for a pursuit that now only exists to sustain itself. The power of the written word has betrayed us. The written, texted, tweeted word can be undeniably a lie, and people still believe it. Meanwhile, if a poem goes viral, it only reaches the people who already love it.

I still have the urge to write, though. I enjoy it, when I manage to find the time and peace. Even this meager blog is satisfying. It’s work, pleasurable work, but it can’t exist only for me. The most successful writer, as I’ve heard, is the one who can write a story and put it in a drawer forever. Until I reach that level of inner peace, I need an audience. Maybe this post will reach someone who needs it, or at least enjoys it. Probably not. I want to be realistic about my prospects, but the pleasure I derive from writing propels me forward through this muddy, hopeless, disgusting month.

I’ll get back to my novel soon. I don’t think a genre-bending novel will make a difference, but if I ever stop writing, the anti-intellectuals win. If they want me to justify my existence as a writer, reader, and academic, I’ll have to give them one hell of a novel full of well-written dynamic characters and compassionate portrayals of inner conflicts and meticulous attention to the beauty of environmental and historical landscapes. I’d rather write hopelessly than not at all.

-jk

Nobody Actually Has Time to Write a Novel (But We Do It Anyway)

November 1 kicked off the beginning of National Novel Writing Month, or NaNoWriMo, the long-standing tradition in which writers and readers alike decide to write a novel (or 50,000 words at least) during the month of November. The idea isn’t to have a novel finished by December 1, but to have written enough of a first draft of a novel (or memoir or novella even) to build on during the next year, something to return to and tinker with at a more casual, realistic pace.

Writing at about 1,600 words a day, writers might finish. Most don’t. I’ve only finished once, and I was a very sleepless college freshman, full of ideas and not much else. Now, I’m a graduate student in English. Now, I’m full of ideas and stress, and not much else. I have deadlines to meet, books to read, authors to research, research to catch up on, and workshop material to write. I have classes to teach and a full 13 credit hours of graduate coursework to focus on, as well as graduate applications and a brand new hip flask to make proper use of. Do I really want to add the pressure of a novel to that?

The answer, of course, is no. But I’m doing it anyway. I don’t expect to finish, even if I write during all of Thanksgiving break. Even when I inevitably don’t meet the standard 50,000 words, I’ll still have a novel draft to tinker with in 2017. Like many writers across the globe, I’m enjoying more ambition than I can justify having, and the companionship of fellow writers struggling near me.

So, to my fellow November-long novelists, I wish you luck and sudden bouts of free time. Say goodbye to your loved ones and the prospect of having clean dishes. It’s noveling season.

-jk

Before Wounded Knee


Photograph of civilians collecting the dead at Wounded Knee.

The Wounded Knee Massacre of December 29, 1890 is widely considered the end of military hostilities between the U.S. government and Native American Indian tribes. The Standing Rock protest today, however, is building up in similar ways to the Wounded Knee Massacre, and although there are key differences, it seems that the relationship between the U.S. government and American indigenous peoples has remained largely the same since 1890.

In 1888, a Paiute man named Wovoka began a religious movement centering around the Ghost Dance. Wovoka’s movement asserted that the Messiah would return as a Native American Indian and the continent would be freed from pioneering and settler oppression, and the Ghost Dance would usher in the Messiah’s return. The movement quickly swept across Native American communities, reaching the Dakotas by summer of 1890.

Followers of Wovoka, such as Arnold Short Bull, brought the Ghost Dance to the Lakota Pine Ridge Indian Reservation during a drought and amid numerous treaty violations, which included reduced food rations for the reservation and white settlement on land designated for Lakota use. The Ghost Dance accompanied federally sanctioned violence, starvation, and a small environmental disaster. The U.S. government was suspicious of the Ghost Dance as early as May of 1890, and continued to treat it as a militaristic threat rather than a religious movement. On October 30, an agent for the Pine Ridge office of the Bureau of Indian Affairs wrote a letter to the BIA commissioner indicating that, in his view after observing the Ghost Dance,

“. . . the only remedy for this matter is the use of the military, and until this is done, you need not expect any progress from these people; on the other hand, you will be made to realize that they are tearing down more in a day than the government can build up in a month” (Royer 65).

Here, the BIA acted as an observation tool for the U.S. government, keeping track of Native American Indians forced onto reservations with little water and food. A religious spectacle became a mode of unity, an expression of organization, which the government deemed, without question, a threat. Earlier, BIA commissioner R. V. Belt wrote in a letter dated October 17, 1890, that the Pine Ridge Agency should inform those

“. . . engaged in encouraging the Ghost Dance and other like demoralizing conduct, and inciting and fomenting dissatisfaction and discontent among the peaceably disposed Indians that [the Secretary of the Interior John Noble] is greatly displeased with their conduct” (Belt 75).

Belt went on to describe the Ghost Dance as “bad advice and evil,” and to write that the Secretary of the Interior would “exert whatever influence he may have over any of the Indians to turn their backs upon the medicine men who are seeking to divert the Indians from the ways of civilization” (75-76). A chain of correspondence linked BIA agents at Pine Ridge to the White House, all expressing anxiety about the Ghost Dance. These agents wanted “peaceably disposed Indians” who did not express discontent.

But all evidence suggests that they had every reason to express discontent. They were surviving a genocide, forced onto difficult land after military engagements against them, after numerous other massacres and battles. It seems that BIA agents and the U.S. government associated Native American discontent with militaristic hostility, conflating the two, because to the U.S., the moment a tribe became vocal, the moment its members made themselves visible, they challenged the established systematic erasure of an indigenous population and the colonial narrative of European settlement on an otherwise unpeopled land rich with untapped resources.

The Ghost Dance as a religious practice did not emphasize military struggle or armed combat. On October 31, Short Bull gave a sermon to his followers, referring mostly to the coming of the Messiah and mentioning combat only once, when he said,

“You must not be afraid of anything. The guns are the only things we are afraid of, but they belong to our Father in Heaven. He will see they do no harm. Whatever white men tell you, do not listen to them. My relations, this is all” (Sitting Bull 65).

Anxieties over Native Americans not listening to those attempting to defeat, control, indoctrinate, and relocate them culminated in the military’s arrival in November at Pine Ridge, to keep the peace. Following Royer’s suggestions, the military became a remedy to stop the Ghost Dancers from breaking down what the U.S. government had built up. Cavalry divisions arrived at Pine Ridge, forcing surrender and disarmament. On December 29, in the process of disarming a few Ghost Dancers, a rifle went off, and soldiers panicked after being informed that an armed insurrection would take place. Fueled by fear and rumors, soldiers fired at the Ghost Dancers, and a massacre ensued. There were casualties on all sides as some Ghost Dancers attempted to defend themselves. Estimates vary, but up to 300 Lakota were killed, most of them unarmed, many of them children.

The logic leading up to the massacre might be difficult to track, but was built on a number of assumptions. First, that Native Americans practicing a large, organized demonstration was the equivalent of cultural and military dissent, or in other words, a problem. Second, that the only way to “solve” the problem was through the use of military force. Third, that expressing dissatisfaction with an understandably bad situation was unacceptable.

One of the defining features of the 21st century is the blurring of police and military forces. In a post-9/11 surveillance state in which citizens and combatants are considered difficult to distinguish from one another, the police and military begin to serve similar functions. While this fact has become more obvious in recent years, and while there have been many instances in the U.S. in which the state treated its citizens as combatants, this has always been the case for Native Americans. Since the founding of the United States, Native Americans have always been designated a threat to westward expansion simply by their presence, their visibility, their voice. Historically, soldiers keeping peace and soldiers engaged in combat have served the same purpose for the U.S. when engaging its indigenous population.

I’m not a proponent of the notion that history repeats itself; I find it too simple. However, the events surrounding the Standing Rock protest are eerily similar to those that led up to the Wounded Knee Massacre: Native American Indians express discontent over treaty violations, land abuse, and environmental disasters, and as a reaction, a militarized police force steps in. Tensions have already resulted in violence against protestors and the arrest of journalists for covering the events. Contexts may be different, but the logical framework the U.S. uses to understand and address the protest remains almost identical to how the U.S. addressed and understood the Ghost Dance. Whether or not there will be another massacre remains to be seen.

Coleman, William S. E. Voices of Wounded Knee. University of Nebraska Press, 2000.

“Wounded Knee Massacre,” Encyclopedia of the Great Plains, 2011. Accessed October 30, 2016.

Apparently I’ve Been Blogging for Three Years


A photo I took exactly three years ago.

I don’t tend to celebrate anniversaries. I don’t actively celebrate my birthday and I ignore my country’s independence day. But WordPress insists on reminding me that I started this blog three years ago, and I may as well mark the occasion.

Since my last blogiversary, I’ve attended a rad academic conference in Albuquerque, had poems, a short story, and a nonfiction essay published, and visited multiple national parks. I’ve completed a draft of my creative writing Master’s Thesis, a collection of interconnected short historical fiction stories (is there a more pretentious phrase? If so, I’ll find it), as well as poetry collections and essays. My writing has improved (I think), and I’ve developed a better understanding of literature.

I’ve also been in Nebraska for over a year, and my relocation here has started to set in. I’m finding a community in Lincoln. I’m forming connections with friends and colleagues. Sadly, I may be leaving again for another graduate program. Once again, I’ve decided to apply to graduate programs to pursue either a PhD or MFA program, and once again, I have no idea where I’ll be living a year from now.

But wherever I am, I’ll at least have a blog. It may not be much, but if I leave my friends, colleagues, and relations, if I leave them all behind for another new start in another state and another program, I’ll still have this little journal of my affairs. It may not be much, but it can be a grounding ritual, or a way to kill time. In either case, I enjoy it.

Wherever I am, wherever I will be, wherever I’ve come from, here’s to three years of fairly sporadic blogging. Cheers, peace, and until another autumn.

Peace,

-jk

Useful Tips for Making Time to Write

Stephen King once suggested that aspiring writers carve out time to write every single day, which probably works for wealthy retired people like him. For the rest of us proles trying to be writers, carving out time to write can be a challenge. There are, however, numerous ways one can make time to write.

  1. Give up sleep. Talk to your doctor to let her know that you no longer require the recommended seven to eight hours of sleep; four or three or two should be sufficient. That’s what coffee is for, right? If your doctor protests, just let her know that she can have a free signed copy of your fantastic novel-in-progress, American Noun, once you finally get it written.
  2. If sleep is too difficult to give up, try giving up on friends. Thanks to social media, dropping off the face of the Earth has become quite easy. Delete your Facebook, Instagram, Twitter, Pinterest, Bumble, Thorax, PurpleDeth, and whatever other social media you have. No more notifications from you will quickly let your friends know that you are now and always have been an illusion. No more friends, no more distractions! Now get writing!
  3. If you somehow need friends and sleep, another way to make time is to quit your job. Many aspiring artists have done it. You can call it breaking out of the system, but we all know it’s to make time to write. After a month or so, you can consider more tenable versions of giving up on a well-paying job to pursue your dream, like getting an MFA in something or volunteering with the Green Party.
  4. If you’re the kind of loser who needs sleep, friends, and a job, another way to make time to write is to literally create time. For many writers, this is the most realistic option. Build a time machine (instructions are on Wikipedia) and spend a day writing, then go back twenty-four hours and respend that day working with friends and sleeping. You’ll have a novel in no time, but the problem is that, to the rest of us, you will age twice as fast.
  5. Making time to write is difficult, and you may have to give up a few things: regular TV, some social events, a few good meals. The important thing is to not give up on writing, if you really, really want to write. You can’t have it all, but the parties you get invited to after publication will make up for it.

-jk