Tuesday, May 21, 2013

Haunted by the Spectre of Ghost Writing


Yesterday a university colleague referred me to an article in one of the best-known Lebanese newspapers citing me on student cheating: “Students Buy Assignments As Semester Ends”. I had warned colleagues that I had been interviewed on the subject by a Daily Star reporter and that an article that could refer to any or all of our courses was due out soon; we had been anticipating the piece. The startled young writer had politely knocked on our Fisk Hall office door last week and asked a colleague and me whether we could answer some questions about student cheating on written assignments, including the basis on which we suspected it. The reporter had already investigated the ghost writing business in Lebanon and was seeking more information, determined to cite the views of faculty members. The problem was widespread, she said, wondering whether anything more could be done to curb it.

The ghost writing business is illicit and generally operates surreptitiously. Though alarmed by the information she had gathered, the reporter seemed comfortable talking to us about it, having worked as a graduate assistant in our department in the past. Her findings, focused on universities in Lebanon, are in my view local examples of a global issue: a disease that is geographically pervasive and plainly visible online, where paper mills advertise and sell their services. The phenomenon itself is hardly new; the internet has simply brought the problem to light more clearly in the past decade or two.
To my knowledge, some celebrities and politicians use ghost writers for their biographies, speeches and blogs; some artists use them; and a number of pharmaceutical companies have resorted to medical ghost writing to promote their products, so students are not the only culprits. Still, the academic use of ghost writers requires special attention, as it is a problem worse than plagiarism. Some plagiarism is unintentional: for example, when students complete their own assignments but lack skills in summarizing, paraphrasing, or quoting, especially in a foreign language, or when they have not grasped the importance of crediting their sources even though they have been taught about it. While a partially plagiarized text might include some student effort, a ghost-written paper generally does not, except possibly when the students provide the dealer with articles and other sources as content.
It takes courage to discuss this taboo subject and bring it into the open, as it can undermine the credibility of those involved, be they companies, celebrities, students or others. English teachers have a special role to play in reducing the problem: motivating students to write, engaging them with relevant topics, teaching effective research and writing skills, emphasizing processes rather than products, and discussing students’ projects with them, providing feedback from start to finish. Teachers of other subjects should also pay attention to the types of assignments they set, as the more unreasonable the assignment, the more likely students are to resort to external help or "services".
 
This is my initial reaction to the newspaper article on students purchasing papers. I hope to return to the issue in future blog posts, as a short posting such as this cannot do such a profound subject justice.

Saturday, April 27, 2013

The Most Popular Languages

Most sources would agree that the top five languages in 2013, judging by the number of native speakers, are Mandarin Chinese, Spanish, English, Arabic and Hindi. Does this mean that these languages are “better” than others? Does it mean they are more in demand? The answers are not straightforward. Linguists tend to concur that no language is superior to any other. Chomsky, for example, is famous for his theory of Universal Grammar: that all languages are fundamentally similar, even without exact point-to-point correspondences, and that all people are born with a capacity for a “universal grammar” which manifests itself through concrete languages. He theorized that UG was in people’s genes not only metaphorically but also literally.

What about the importance given to the English language worldwide? The Telegraph’s study-advice section recently warned that it is no longer safe to assume that everyone is happy with English in international business: “It’s no longer permissible to simply assume that clients will be comfortable speaking English, particularly if you’re looking to set-up lucrative ongoing business links” (“What’s the Best Language to Learn to Further Your International Business Career?”). Rather, the advice is to learn Mandarin Chinese, as many companies are moving to China, or Russian, as Russia is important in oil and gas production. On the other hand, one is advised not to forget the continuing global importance of European languages such as German, French and Spanish.

For UK-based native speakers of English, Arabic and Polish are almost equally important these days – the former partly because of Qatari investment and the latter because of the huge influx of Polish immigrants over the years. Still, these languages are outranked in importance by Mandarin as well as Spanish, French and German. Germany is one of the UK’s largest export markets while Spanish-speaking Latin America includes important, fast-developing markets (“Graduate Jobs: Best Languages to Study”).

Corinne McKay, a US-based certified translator who recently blogged on the question of “Which Language Is ‘the Best’?”, thinks that Middle Eastern and Asian languages score highest in terms of “critical need”. She admits, however, that translating from these languages into English is difficult because of significant cultural differences, unlike translating from other European languages. Translators brought up in the US who have not lived in China or the Arab world, for example, may find it more challenging to translate from the relevant languages than from French or German – and culture is not the only hurdle.
Arabic is a good example of a “difficult” language. It is a Semitic language, like Hebrew, whereas French is Indo-European and shares more vocabulary and word structure with English: some French words are almost identical to their English equivalents, and both languages mark most plurals with a final s. Arabic vocabulary, in contrast, is very different, and hence harder to learn; plural formation works differently, not to mention that Arabic has “dual” pronouns besides singular and plural. Furthermore, because Arabic script runs from right to left and is cursive, like English handwriting, it appears tougher to decipher. While the correspondence between spelling and sound is better in Arabic, some sounds are not easy for native speakers of English, as certain “phonemes” do not exist in their language. Then there is the issue of diglossia – the difference between colloquial Arabic and standard Arabic – and the complications of regional dialects. In some cases, the dialects are so different that native speakers of the language have trouble decoding each other’s utterances. Besides, dialects are only spoken; unlike standard Arabic, they are not meant to be written. They are still important to learn, though, complicating matters for the language learner.

Whether English is really in decline would be an interesting question to answer. While the percentage of native speakers of English appears to be decreasing as other populations multiply more quickly, English remains important as a second or foreign language – if not in business, then definitely in science. For native speakers of Arabic, one may assume, English is still crucial as the language of science, whereas Chinese may increasingly be the language of future business.

Tuesday, March 26, 2013

Why Some of Us Still Don't Use Twitter

When I discovered Twitter several years ago, I did not find it appealing at all and therefore did not subscribe. I was already using Facebook for social networking and LinkedIn for professional networking. Twitter presented itself as a mere distraction in comparison, a redundant tool that would waste my valuable time. It also looked and sounded dry: not much in terms of pictures, stories or jokes; no “friends” – just “followers” and followees. The idea of being a “follower” of others did not appeal to me (doesn’t the word carry connotations of subservience, stalking, or both?), nor was I excited by the prospect of being “followed”. Moreover, from an English teacher’s point of view, I was discouraged by the abbreviated language of Twitter, which defies spelling, grammar and punctuation conventions. Why would an English teacher want to be involved in such an environment when we are supposed to set a good example of Standard English, including complete sentences and well-developed paragraphs? My initial impression was that Twitter was for those who don’t know how to write!

My view of Twitter has changed only slightly over the years. Knowing that many highly educated people use it, including public figures, I now see it as a tool for three categories of people: those who don’t need to set a good example language-wise, those who don’t have the time to write properly or at length, and those whose writing is not presentable in the first place. One can see the wisdom of valuing content and meaning over style, yet how much content can you squeeze into 140 characters, and how much depth, analysis or synthesis can go into that? Twitter is an excellent tool for brief announcements and comments. Beyond that, I believe Facebook and LinkedIn are superior – and so is proper, old-fashioned blogging. In this regard, I agree with Devin Coldewey, a Seattle-based writer and photographer, who said in 2009, “…if someone is so regularly finding content of merit, why don’t they have a blog where the content can be given context, discussion, and perhaps a preview so people aren’t going in blind? I like reading interesting blogs. I don’t want to receive links every time someone finds something they think everyone should see. Twitter just adds another layer to the equation — and I don’t like layers” (“Why I Don't Use Twitter”). A large-scale study by the data analytics provider Pear Analytics actually concluded that 40% of tweets were “pointless babble”, more than a third were “conversational”, and around 9% had only “pass along” value (Mashable).

From a business point of view, companies are using social media for public relations purposes. People like to see what CEOs think, and they can now find some of them on Twitter. Ellen Roseman of the Toronto Star hopes that Twitter “sticks around forever” if it truly connects corporate leaders to customers more effectively (“Why Smart Consumers Should Use Twitter”). On the other hand, if – like me – you are neither a company CEO nor a particularly worried “consumer”, why would you join Twitter over and above other online networking tools? For news gathering and information on current events? If you already use Facebook, you would need extra time for Twitter and you might end up finding the same information there in any case; besides, you can always go to news sources directly rather than waiting for others to share, layer upon layer. So many tools, so little time to juggle!

Monday, March 4, 2013

Researching “Research”

On the occasion of National Grammar Day in the U.S., this posting focuses on a puzzling grammar point.
Recently, I managed to provoke an online discussion in our English Communication Skills Program at AUB about a controversial grammar issue. The subject of the discussion was “Students Pluralizing ‘Research’: Right or Wrong?”. What triggered my initial posting was my disappointment with students pluralizing the noun “research” even after I had explained that it is better not to pluralize it because it is generally uncountable – plus the fact that, to my dismay, some of the best-known dictionaries have started accepting the usage.
After seeing the Macmillan Dictionary’s entry on “research”, which makes perfect grammatical sense, providing examples of usage “errors” in a “Get it right” section along with corrections, I was surprised to find that a number of other dictionaries, including Wiktionary and the Cambridge and Oxford dictionaries, accept the countable form of research – “researches”. This plunged me into a deep depression, but I guessed that, since the better dictionaries depend on statistics – aiming to be descriptive rather than prescriptive – it is hard to argue with them. (There is more information on how words enter dictionaries in a previous blog post of mine, “How Dictionaries Cope With Language Change”; that post also happens to include a link to “How a New Word Enters the Oxford Dictionary”.)

In any case, when I solicited input from fellow English Communication Skills teachers on how they handle the matter with their students, it was clear that the instructors were divided in their opinions. Of the six colleagues who contributed to the discussion, two seemed to be in favour of accepting the plural, or at least not penalizing students for it; one appeared to be between the two extremes, though her answer was somewhat vague; and the remaining three were vehemently opposed to the usage. Here are extracts from what they said:
·     “I usually (if not always) cross out the 'es' when students pluralize 'research' - I like the examples/samples listed in Macmillan dictionary and their complete rejection of the plural form.” (Rima Shadid)

·     “I automatically cross out the ‘es’ and replace it with ‘studies’ as I mark my students' papers...’research studies’...I do so not necessarily because ‘researches’ sounds a little odd to me, but rather simply because ‘research studies’ is usually more accurate.” (Missan Laycy Stouhi)

·       “With regard to dictionaries:  Just because a dictionary does not set a particular standard, this does not mean that the standard does not exist (dictionaries are not the be-all and end-all of language use)…. If American society still equates nonstandard with substandard after all of this effort, how can we expect an individual here or there who uses nonstandard English to have much impact…?” (Kathryn Lincoln)

Apparently, we are not the only people in the world (or on the web) debating this grammar point. Take a look at this forum, for example, where someone asks, “I am not sure about the plural of research. Can you help me?”: http://forum.wordreference.com/showthread.php?t=1828694
·         One person replies “researches”.

·         Another person says, “No, I would argue ‘research’ is uncountable because it doesn't sound right to say ‘Yesterday I did three researches.’ It would either be ‘Yesterday I did research’ or ‘Yesterday I did three research assignments/cases/files’ etc. The only time you would have ‘research’ in plural is to refer to the person who does research or their job title. i.e. ‘We have three researchers.’ (Note the spelling- not ‘researches’)."

·         Yet another comments, “Sorry, Jack. It can be a countable noun in some cases or, at least, it's starting to be used that way…. (Definition of research noun from the Cambridge Advanced Learner's Dictionary).”

·         The final comment on the thread is, “This is an excellent example of the difference between what one finds in the dictionary and how one speaks. With respect to modern spoken English (at least in AmE), Jack is absolutely right: we do not use the plural ‘researches’. The fact that it's in the dictionary is secondary to modern usage.”

Teachers – and students – out there, what do you think? My advice is that, if something is going to sound jarring to your readers or listeners, use a safer alternative – never forget the audience. Besides, in this case, if you still see research as a process rather than a product or an object, why pluralize it?

Tuesday, February 5, 2013

Word of the Year


Words, Words, Words
Different organizations have voted for different words as “Word of the Year 2012”.
The American Dialect Society chose “hashtag”, the well-known symbol used on Twitter. That is not surprising, given the increasing popularity of the social network and the networking and sharing tools it provides. Still, for some, the choice was somewhat unexpected – leading New York Times blogger Jennifer Schuessler to refer to the word as a “dark horse” winner. Other top candidates, including acronyms and phrases, were “YOLO”, short for “you only live once”, “fiscal cliff”, “marriage equality” (referring to the legalization of gay marriage), and “Gangnam style”. Of these, “YOLO” was voted least likely to succeed, contrary to “marriage equality”, rated most likely to. Interestingly, among the categories was one for the most euphemistic expression, where the winner was “self-deportation”, meaning a “policy of encouraging illegal immigrants to return voluntarily to their home countries” (that is, by making life difficult for them rather than officially expelling them).
As for the Merriam-Webster Words of the Year, the two most looked-up words in 2012 were “capitalism” and “socialism”, probably prompted by the year’s U.S. elections, including the healthcare debate; people tended to look up the words together, Peter Sokolowski, the dictionary’s editor-at-large, told CBS News (“‘Socialism’ and ‘Capitalism’ Revealed as 2012 ‘Word of the Year’”) – a bit like looking up “depression” and “mania” together, one might reckon! The 2011 Merriam-Webster word of the year was “austerity”, not surprising considering the world economy that year.
The Oxford dictionaries of the U.S. and the U.K. also had their 2012 favourites, respectively “to gif”, from the well-known file format, and “omnishambles”, meaning a disastrous situation, whichever perspective you take. Still, though chosen by the relevant lexicographers as the most interesting words of the year, the terms do not necessarily enter the dictionaries and may fade away with time.
What about “Arab Spring”, you might ask? Does it not deserve a place in all this? Well, the term was actually chosen by the Global Language Monitor as the 2011 phrase of the year, along with the word of the year, “occupy”. The Monitor's 2012 choices were “Gangnam style” and “apocalypse”.
Time will tell which words make it to the top in 2013. Let’s watch and see.

Monday, January 21, 2013

Myths About University Faculty

A recent Forbes magazine article by Susan Adams created such a stir, through torrents of reader comments, that she had to update it quickly, acknowledging that she may have been mistaken – though not apologising for the offence. In “The Least Stressful Jobs of 2013”, she had given the impression that university faculty had such easy-going jobs that they were to be envied for their generally stress-free (i.e. possibly lazy) lifestyles:

"University professors have a lot less stress than most of us. Unless they teach summer school, they are off between May and September and they enjoy long breaks during the school year, including a month over Christmas and New Year's and another chunk of time in the spring. Even when school is in session they don't spend too many hours in the classroom ... Working conditions tend to be cozy and civilized…."
Note the phrase “unless they teach summer school”, and note the focus on “hours in the classroom”, as if work outside the classroom such as grading, student office hours, faculty meetings and committees, research, creating new activities and exams, and updating syllabi and course material does not count.
Although Adams cites her major source of information, CareerCast, she clearly overlooks the work university faculty are involved in outside the classroom, which her source does mention: “Work beyond the lecture hall is also a vital facet of the professor’s day. Postsecondary educators can assist in scholarship committees and curriculum development. Research is also a critical part of the university professor’s responsibilities, as educators typically are expected to produce published works” (Kensing).
One of Susan Adams’ critics, Forbes colleague David Kroll, has countered her article, expressing surprise and disappointment at her “misguided” piece in “Top 10 Reasons Being a University Professor is a Stressful Job”. Based on his personal experience, the reasons for faculty members’ stress include the following: a “the customer is always right” mentality applied to students unprepared for higher education; abuse of part-time faculty (and threats that full-timers may be replaced by adjuncts); administrators and the public undervaluing teaching loads; administrators undervaluing online teaching (“If you’re already teaching the class, it’ll be nothing to throw it up online, right?”); hiring too many administrators at the expense of faculty members; and the fact that “tenure” has become meaningless: “I’ve rarely seen a tenured professor be fired but a professor with tenure who is deemed unproductive by whatever anonymous review can certainly be made to wish they didn’t have a job.”
Any experienced educator would rightly side with Adams’ critics. The fact that Adams cites a source is not enough to justify her warped, overgeneralized claims.

Sunday, December 30, 2012

New Year's Madness?

Seeing the traffic and the hustle and bustle at the end of the year in our cities can get one wondering about the meaning of a new year. Why does the whole world celebrate the New Year? Why is the end of December such a special turning point? Is this demarcation not rather arbitrary compared with other celebrations? The meaning of New Year’s is not as clear as that of Christmas, for example (good will to all), or Independence Day. Are the celebrations completely vacuous, or do they have a deep psychological significance for people around the globe?

In ancient times people welcomed the New Year with rituals meant to attract good fortune. The Ancient Romans caroused in the streets for several days, around food-laden tables. The Ancient Babylonians, Hindus, Chinese, and Celts sought to restore order after chaos at the turn of the year. To this day, starting fresh is a common concept in many cultures.

The month of January is actually named after the Roman god Janus, the god of gates, doorways and thresholds: a two-faced god whose faces look in opposite directions, representing the past and the future. No wonder, then, that at the end of the year people reflect on past achievements and plan for a brighter year ahead. How deeply people reflect on their values, however, is not always apparent in the resolutions they proclaim. Around the world there are resounding, common themes: people want to be healthier, to exercise more and smoke less, to be more active members of their communities, to be more productive at work, and so on. Psychologically, people want to improve themselves and their lives in general.

While there is nothing wrong with recalling one’s values and wanting to advance, the question remains: why only now? Isn’t December the thirty-first, technically speaking, the same as any other day of the year? Why not remember our values daily, throughout the year? Why not seek improvement all year round, regularly reflecting on our successes and failures and our goals for the future?

The point here is not to belittle New Year’s celebrations, although they can be extravagant out of proportion to the real significance of the occasion, nor is it to undermine New Year’s resolutions. The point is that one can sympathize with those who laugh at the crowds flooding the gyms in January only to dwindle out of sight in February, and one can understand those who decide not to celebrate at all.

Monday, December 10, 2012

Breaking Language Barriers

Your voice in language 1 – Machine Translation – Your voice in language 2

Last May I blogged about the ways in which natural language processing is changing our world, mentioning a number of applications, including automatic machine translation of speech using one’s own voice. I wrote that computer speech synthesis had advanced to the point that, in the near future, systems would be able to translate your speech using your own voice. Theoretically speaking, you would be able to hear your voice speaking a foreign language without necessarily having learnt that language. With sufficient samples of your speech, such systems would be capable of putting together new sentences for you in the new language. The systems would just need to know your voice, and they would do the rest of the work.

Well, that future is here now. Rick Rashid, Microsoft’s Chief Research Officer, has recently demonstrated automatic translation of his English speech into Chinese at Microsoft Research Asia’s 21st Century Computing event. One of his blog posts includes a recent video of his demonstration, entitled “Speech Recognition Breakthrough for the Spoken, Translated Word”. In that post, he explains that the first challenge in such systems is for the computer to actually understand what you are saying – a challenge acknowledged by experts decades ago; the past decade has seen breakthroughs reflected in a “combination of better methods, faster computers and the ability to process dramatically more data” (“Microsoft Research Shows a Promising New Breakthrough in Speech Translation Technology”).

Rashid adds that over two years ago Microsoft researchers made their systems far more intelligent by using a technique called Deep Neural Networks, which simulates the way the human brain works. He asserts that error rates in speech recognition have been reduced from around one word in five to about one in eight, and that machine translation has similarly become more reliable. In the case of English to Chinese, the system works in two steps: first, it translates the English words into Chinese equivalents; then, it re-arranges the words to form proper Chinese sentences. While acknowledging the persistence of errors, including amusing ones, in both the English text and the translation, he projects that, in a few more years, these systems will be astonishingly reliable.
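
That two-step description invites a toy illustration. The sketch below, in Python, shows the substitute-then-reorder idea in miniature; the three-word lexicon and the crude subject-object-verb reordering rule are my own inventions (closer to what English-to-Japanese would require, since Chinese reordering is subtler), not Microsoft's actual method, which is learned statistically from large parallel corpora.

# Toy two-step "translation": (1) word-for-word substitution,
# (2) re-arrangement into the target language's word order.
# The lexicon and the SVO-to-SOV rule are invented for illustration only.
LEXICON = {
    "i":    ("watashi", "SUBJ"),
    "eat":  ("taberu",  "VERB"),
    "rice": ("gohan",   "OBJ"),
}

def step1_substitute(english_words):
    """Step 1: map each English token to a (target word, role) pair."""
    return [LEXICON[w.lower()] for w in english_words]

def step2_reorder(tagged):
    """Step 2: re-arrange subject-verb-object into subject-object-verb."""
    rank = {"SUBJ": 0, "OBJ": 1, "VERB": 2}
    return [word for word, role in sorted(tagged, key=lambda t: rank[t[1]])]

print(" ".join(step2_reorder(step1_substitute("I eat rice".split()))))
# -> watashi gohan taberu

Even this toy makes the division of labour visible: substitution alone produces the right words in the wrong order, and it is the reordering step that turns them into a sentence a native speaker would accept.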

This is definitely breaking news on breaking language barriers. The implications for academic institutions might warrant some consideration.

Saturday, November 24, 2012

An Online “Tsunami”

AUB is tentatively experimenting with hybrid formats of learning at a time when leading global universities have formally embraced online learning. Despite some continued resistance, including liberal-arts technophobia, the shift became official this year when Stanford University President John Hennessy proclaimed in a Wall Street Journal interview that a “tsunami” is approaching and that his goal is “not to just stand there” but to “try to surf it”, along with the other elite US universities. Some of the criticism aimed at this trend revolves around standards, superficiality versus depth of learning, the relevance of online formats to some subjects, such as philosophy, and the applicability of distance education to young undergraduate populations that may need more guidance than the more mature “continuing education” type of students. Still, it seems, there is no stopping this wave. Rather, efforts are now directed at preparing for it.

The World Economic Forum brought together a diverse group of university stakeholders this summer to discuss this online “tsunami” everyone is talking about, as reported by Ivarsson and Petochi. One issue debated was the central role of students: as student expectations change, who should decide on the best forms of teaching and learning – academic institutions or students? Another was certificates versus degrees, where “Certification and degrees may have to be aligned”. The participants agreed that, in any case, online learning would be an inevitable part of future universities – it is already here (“What Will the University of the Future Look Like?”).


Will some institutions continue to babble while others sing at the top of their voice? Time will tell.

Friday, October 12, 2012

Arabic in Unofficial English


I recently came across an interesting slang dictionary by the American lexicographer Grant Barrett: The Official Dictionary of Unofficial English. Although it dates back to 2006, it was definitely new to me. What caught my attention most were the Arabic and Middle East related words it includes. Many of them have crept into English since 2003, through the Iraq war and the wider “War on Terror”. Here is a listing:

Ali Baba: “thief. After the government of Saddam Hussein was toppled, uncontrolled looting ravaged the country—anything of value, and many things that weren’t, were stolen or destroyed. Looters, and, generally, any thieves, are called ali baba, by Iraqis, after the tale of ‘Ali Baba and the Forty Thieves,’ told by Scheherazade in the stories known in the West as Thousand and One Nights. American soldiers who have served in Iraq say they tend not to use the term as a noun, but as a verb meaning ‘to steal’: ‘We’re gonna ali baba some scrap metal from their junkyard.’”

Dhimmi: “a non-Muslim living with limited rights under Muslim rule”

Eurabia: “the (perceived) political alliance of Europe and Arab nations (against Israel and Jews); a name for Europe if its Muslim or Arab immigrants become a large or powerful minority”

Haji: “an Iraqi; any Muslim, Arab, or native of the Middle East”

Hawasim: “a looter or thief”

Muj: “among (Anglophone) foreigners in Middle Eastern or Islamic nations, a guerrilla fighter or fighters. Clipped form of Persian and Arabic mujahideen, plural for mujahid, ‘one who fights in a jihad or holy war.’”

Shako Mako: “loosely translated as ‘what’s up?’ or more specifically, ‘what do and don’t you have?’ or ‘what’s there and not there?’ It’s similar to shoo fee ma fee used in Lebanese Arabic. Commonly one of the first Iraqi Arabic expressions learned by coalition forces. A common response is kulshi mako ‘nothing’s new’.”

Ulug: “thug or lout. Repopularized by the former Iraqi Minister of Information Muhammad Saeed Al Sahhaf as a term for Americans. The word had previously been rare.”

Of course, these are not the only expressions that will be of interest, so happy reading!



Monday, September 10, 2012

Nouns that Were Verbed in the Olympics


Now that both the Olympics and Paralympics are over, it is time to reflect on the language used at the events. The connection between the Games and the English language is not an obvious one, but some controversy did brew this year over sports terms such as “medal” and “podium”, which are now occasionally used as verbs. However, as Liz Potter of the Macmillan Dictionary blog notes in “They Came, They Medalled, They Podiumed”, the verbing of nouns is not a new phenomenon in the language (nor, one might add, is resistance to such evolution). In fact, many other nouns, unrelated to the Olympics, have recently become common verbs: to blog, from web log, to friend and unfriend, from Facebook features, and to Facebook itself are just a few examples.

So why are Olympics-related terms so controversial? Possibly because the event is a high-profile one with global coverage. The furore over “medal”, which has been used not only as a verb but also as an adjective, as in “the most medalled Olympian”, has been documented by The Guardian newspaper’s style editor, David Marsh, who defended the usage in relation to the 2008 Olympics, commenting that it was neither illegal nor immoral, while, for the purists, it was a sign that “the linguistic barbarians are not only at the gates: they have battered their way through, pulled up a chair, helped themselves to a beer and are now undermining our very way of life by rewriting our grammar books to suit their evil purpose” (“Mind Your Language”).

Historically in English, nouns have been verbed, and verbs have been nouned: the process is called conversion. Those who react violently to such verbal variation are simply undermining the linguistic creativity of others, as well as the natural evolution of the language.



Thursday, September 6, 2012

Metaphors for Teaching


At the start of a new academic year, what better metaphors to explore than metaphors for teaching? The Annenberg Foundation has surveyed school teachers in the U.S. about the metaphors they would use for their work. Dozens emerged, including a dolphin riding the waves of the classroom, an actress with many roles to play, an encyclopedia maximizing students’ learning, a detective diagnosing students’ needs, and a farmer planting the seeds of knowledge (“What’s Your Metaphor? Survey Results”). The better metaphors touch upon the diagnostic and formative aspects of teaching, not just the summative, end-product aspects.

Kim McShane, a university lecturer in Sydney, Australia, has researched metaphors for university teaching, focusing on academic teacher identity in the light of the integration of ICT in teaching. Metaphors for teachers who use technology differ from those used to describe old-fashioned teachers. The facilitator metaphor – the “guide on the side” – supplants that of the “sage on the stage”. Such teachers are seen as leaders, hosts, managers, mediators, or resources to be consulted. Traditional teachers, on the other hand, are seen as authoritarian providers of knowledge: performers and deliverers of content. McShane worries that some of the new metaphors may actually be interpreted as devaluing or ignoring teachers’ work – and one may or may not agree.

My favourite metaphor is that of the transmission of cultural DNA, comparing cultural propagation to genetic propagation. After all, culture is not just a matter of history or people’s rituals, let alone how they dance or sing; it is about how they react to current issues, including the way they solve problems. Harold McDougall, a law professor, examined the idea in a recent Huffington Post blog post, “Cultural DNA: Bringing Civic Infrastructure to Life”. His post ends on a particularly relevant note: “As we teach them and send them on their way, we have a responsibility to pass on the tenets of progressive social change as our generation understands them: learn by experience; respect context; encourage participation; honor place; accept limits; and acknowledge temporality. These strands of cultural DNA, traditional and modern, can help us construct a culture of empathy and sustainability that is the proper foundation for progressive social change.”

In their book Metaphors We Live By, linguist George Lakoff and philosopher Mark Johnson argued that our minds actually process the world primarily through metaphors and that the way we conceptualize abstract ideas affects the way we understand them. Metaphors define roles. Therefore, those that represent students as passive receivers of knowledge, such as the gardener or shepherd analogies, implying that students are expected to behave like plants or sheep, are not as useful as those that focus on what students can do. A famous metaphor attributed to William Butler Yeats underlines the need to motivate students: "Education is not the filling of a pail, but the lighting of a fire". Very true: education should transcend the mere transmission of content; it should be more about sparking curiosity, teaching students how to learn, and encouraging independent, lifelong learning.

An insightful Chinese proverb can be valuable in this connection: “Teachers open the door, but you must enter by yourself”.



Sunday, August 12, 2012

Blogging as Lifelong Learning


While reading blogs may certainly contribute to one’s education, blogging itself is also a form of lifelong learning. Writing about anything means understanding it well enough to express yourself clearly: you need to learn about it first, experience it in a way, and reflect on it before you can effectively share your thoughts.

Even the briefest blog post may be preceded by hours or days of reading or mulling over a topic. There will be times when not much, if anything at all, will have been written about your idea – as was the case with my previous post, linking the Olympic motto to blogging. I could not find a single online resource applying the “faster, higher, stronger” maxim to blogging. I was thrilled that no one had written about blogging from that particular vantage point in the past.

Yes, blogging can be thrilling – and thought-provoking. Was the allusion eccentric, I wondered, or was it simply creative? Either way, on such occasions, a few clicks later, the post is published. For more ordinary topics, on the other hand, there will be more information out there than you can handle, and you need to be selective. Wading through tonnes of others’ online pronouncements on an issue is not always a zappy experience; it can be slow and painstaking. One article leads to another; one video leads to another, and so on and so forth. You compare against your prior learning and experience. Ideas flow. Some sink in; others drop out. New insights form. You shape your new ideas; you shape and reshape the text through which you will express them; you check your word choices for accuracy and appropriacy; you reflect on your choices, semantically and pragmatically; then you share. Repeatedly, you go through this process. Now if that is not lifelong learning, what is?



Wednesday, August 1, 2012

Citius, Altius, Fortius


Social media can provide faster, higher, stronger platforms for expression. They are clearly faster than more traditional forms of publication. The parallel which opponents of blogging, and other moralists, may draw with the tortoise and the hare does not hold, as that would be comparing apples and oranges. Take this blog, for example: the content would rot if it were kept and published later as a book, or even as a traditional “article”, in tortoise-like fashion!

The “higher” part is not so well-defined. While it would be hard to argue that blogging is always morally superior, it may be viewed as being above traditional publishing in the sense of bypassing the hurdles of conventional reviewers, editors, etc. One is always a click away from publishing the next idea – no bureaucracy, and no fuss. The spontaneity of the pieces, and the transparency of reader feedback, may actually provide a slight moral edge.

For addicted bloggers and readers, of course, “higher” may take on a special, added meaning.

Finally, social media are stronger in the sense of their immediacy, global reach and impact. In their accessibility to writers and readers, they may also be considered fairer than traditional media, especially for the traditionally disadvantaged.

The Olympic motto “Citius, Altius, Fortius”, Latin for “Faster, Higher, Stronger”, can therefore apply to blogging, whether in the sense of civic engagement or not.

Let me know if you disagree.



Wednesday, July 25, 2012

A Waste of Time or Digital Social Capital?


The term civic engagement has been used to reflect many different approaches to citizenship, whether local or global, covering a variety of activities, from informal individual ones to formal collective ones.

Both blogging and commenting on blogs may be considered forms of civic engagement. In an interview for The Chronicle of Higher Education, Berkeley’s Howard Rheingold emphasized the need to encourage students to blog, saying that 21st-century civic education is “participatory media-literacy education”, distinguishing passive consumers of broadcast media content from active citizens who blog, share videos, comment on newspaper articles online, and so on (“Why Professors Ought to Teach Blogging and Podcasting”).

On the other hand, not everything posted by ordinary citizens is influential at this point in time, as explained by Ryan Rish of MIT in the paper “User Generated Content of an Online Newspaper: A Contested Form of Civic Engagement”. Regretting that user-generated content, such as feedback provided on online newspaper sites, is not currently considered a legitimate form of civic engagement, he expects greater impact from future civic and participatory journalism. While civic journalism involves professional journalists encouraging interactive news reporting, participatory journalism places citizens more centrally, involving them in the collection, analysis and publishing of news and ideas. Focusing his study on an online school newspaper in the U.S., Rish reports that “Members of the local school district leadership discounted user-generated content associated with the online newspaper as a legitimate form of communication with school district officials, while users of the online newspaper and the editorial staff of the newspaper argued for the user-generated content to be considered a form of community conversation”.

Digital social capital or a waste of time? You decide.



Monday, July 2, 2012

Online Civic Engagement


Over a year has passed since I started blogging. What keeps this blog going when many academics fear blogs as unconventional, non-peer-reviewed forms of publication?

Since blogs are open to the world, anyone can scrutinize their content and comment on it, including experts – something not entirely different from peer review. Additionally, blogging may be seen as a form of civic engagement. It is useful not only in teaching and building community with one’s immediate environment but also in outreach to a broader community. And don’t forget, it’s much faster than other forms of publishing.

One blogger and teacher, Michael Kuhne, sees wikis such as Wikipedia as a form of civic engagement: “When it works, Wikipedia is this great social experiment where people with a vested interest in an article (actually, their interest is not the article itself, but what the article (re)presents) can exchange ideas, debate, deliberate, and create. How many civic institutions exist today that can promise the same?” (“What Does Wikipedia Have to Do With Civic Engagement?”). Traditionally, civic engagement has taken the form of non-profit contributions to society, usually by powerful groups of people providing services to their communities through channels such as charities, scouts, social welfare groups and religious organizations.

The Pew Research Center reported in 2009 that, in the U.S., the internet was still mirroring traditional socioeconomic power relations: the more advantaged are more likely to be civically engaged, whether online or not, just as they have been historically. Yet things are changing: “There are hints that forms of civic engagement anchored in blogs and social networking sites could alter long-standing patterns that are based on socioeconomic status” (Smith et al., “The Internet and Civic Engagement”).

In future postings I shall continue to reflect on the idea of civic engagement online – a thought-provoking subject.



Thursday, June 7, 2012

Why We Quote


Having recently blogged on the subject of originality, I see quoting as something of an antithesis. Still, if you are interested in the culture and history of quotation, this book by Open University scholar Ruth Finnegan will be of interest: “Why Do We Quote?: The Culture and History of Quotation”. Finnegan dedicates her book to “the many voices that have shaped and resounded in my own”; then she asks interesting questions in her preface: “What does it mean, this strange human propensity to repeat chunks of text from elsewhere and to repeat others’ voices? How does it work and where did it come from? Does it matter? Why, anyway, do we quote?”.

Admitting that her book is biased towards western culture, with her research focused on southern England, she begins in contemporary England, the “here and now”, and moves back chronologically to trace the background of her subject. A large-scale survey conducted in 2006 shows that English people quote for various reasons and that proverbs constitute a large proportion of quotations. The proverb “more haste less speed” appears repeatedly in her survey results; others include “Sticks and stones will break your bones, but words will never hurt you”, “Too many cooks spoil the broth”, and “Laugh and the world laughs with you, cry and you’ll weep alone.” Quotes are used not only to share information but often, especially in conversation, to evoke irony, a pun or an analogy which the listener must catch. In e-mail, quotes have become fashionable as a “tag or sort of signature”. In persuasion, quoting famous people tends to add credibility to what is being argued; it may add “weight” or “ammunition”. Quoting may also be used for the sake of humour or sarcasm, as in “A bad workman blames his tools”. On the other hand, many of those surveyed had reservations about quoting: it can border on plagiarism; it is unoriginal, a sort of “parroting” to be avoided, a sign of possible laziness. Some had no objections to it, as long as it was not overdone. Still, what was at issue was not the quantity but the appropriacy of what was being quoted: was it necessary, or was it done for the sake of pompously showing off?

Historically speaking, Finnegan says, the origin of quotation marks is not clear. Different versions of the Bible, for example, used different devices to indicate reported speech: while newer versions include angled quotation marks, older versions used devices such as indentation, capital letters, or colons, and some texts used only verbs to indicate spoken or written words. She indicates that the ancient Greeks were probably the first to use inverted commas: a wedge shape, >, was used as a marginal sign to draw attention to anything especially important in a text, whether linguistic, historical or controversial. This diple mark eventually metamorphosed into the quotation marks we use today. She notes stylistic and cultural differences in the way people quote across languages, identifying a disadvantage to quotation marks: they are too binding: “they impose too exact an impression…. Quote marks are too crude a signal, it could be said, for the subtleties of submerged or fading quotation, allusions, parody, intertextuality, leaving no room for finer gradations” (p. 109).

Finnegan distinguishes between quoting to endorse another and quoting to set oneself apart, keeping the other at a distance; the way the quotation interacts with the rest of the text should reflect whether the other is respected or “mocked… parodied, or belittled” (p. 171). In a chapter entitled “Controlling Quotation”, the author indicates that quoting has become a regulated social activity: not only is plagiarism frowned upon, but there are laws protecting intellectual property and copyright. In “Harvesting Others’ Words”, Finnegan notes that collecting quotations has been common in the west for millennia but is not restricted to the western tradition: ancient Mesopotamia, early China, India and the Arab world, among others, have their own collections. There seems to be a moral force – a certain wisdom – to the words of past generations. As for proverbs with pictorial illustrations, the west first saw these in medieval times, as reflected in the French collection Proverbes en Rimes, later translated into English (p. 179).

The book ends with the conclusion that there is no single, clear-cut answer to the question of why people quote: quoting is a “normal” aspect of language which, like other human behaviours, has its own social regulations. Finnegan’s final question is: why not?

It is hard to disagree with this book. After all, it is based on facts rather than conjecture. It is highly relevant to historians, teachers, and university students who write substantive essays. It is comprehensive in that it tackles both written and oral texts, viewing them in different contexts: those of religion, philosophy, the family, and society at large. On the other hand, as the author rightly indicates, excessive use of others’ words – and ideas – may give the impression of laziness or lack of originality, so students need training in how, what, and when to quote.

The challenge of original expression is of course multiplied for those writing or speaking in a second or third language, so language teachers take heed.



Wednesday, June 6, 2012

Linguistic Inflation


The Macmillan Dictionary blog recently hosted two attention-grabbing articles on hyperbole by Stan Carey: “Is Linguistic Inflation Insanely Awesome?” and “The Unreality of Real Estate Language”.

In the first article, Carey explains that linguistic inflation devalues words by associating them with what is of lesser value, as in referring to a clever person as a “genius” or labelling an internet link that we share as “insanely amazing” simply because that draws better attention than “pretty good” or “rather interesting”. Still, Carey does not see this as seriously problematic because we will never be short of words to express what we want: as grand-sounding words become routine, other words take their place by “further shift or by coinage” as indicated in the Economist article “Dead, or Living Too Well?”; as the meanings of “awesome” and “terrible” changed, for example, “awe-inspiring” and “horrifying” took their place. Similarly, the Economist anticipates that a new word will soon be needed to signify a “curator” in the sense of an art warden because the meaning of “curator” has been significantly diluted: “A curator is no longer a warden of precious objects but a kind of freelance aesthetic concierge” (“Curator, R.I.P”).

While scientific and academic writing resist linguistic inflation, some less formal contexts, such as real estate language, illustrate the phenomenon very well, according to Carey: “In this world, medium is ‘large’, average is ‘first rate’, and unusual is ‘extraordinary’. Any site that isn’t a ruined shack sinking into a swamp may be described as ‘superb’…. Even run down houses can be made appealing, since they offer ‘immense potential’”.

English language learners beware: You need to understand the language 110%!



Friday, June 1, 2012

Gender Neutral Language


First in France earlier this year – now in Sweden: the feminists are changing the language. The Swedes, known as the most gender-equal people in the world, are now striving beyond equality, towards neutrality, and this is being reflected in their language. A new gender-neutral pronoun, “hen”, can now be used instead of the feminine “hon” or the masculine “han”. Suggested by linguists in the 1960s, the pronoun finally made it into the mainstream this April, when it was added to the National Encyclopedia in Sweden.

Nathalie Rothschild has reported on how Swedish society is no longer satisfied with gender equality; pressure groups are working on the elimination of gender distinctions from society at large, including government institutions. The purpose is not simply to accommodate those who do not identify with a particular gender, or who wish to marry someone of the same sex: “What many gender-neutral activists are after is a society that entirely erases traditional gender roles and stereotypes at even the most mundane levels” (“Sweden’s New Gender-Neutral Pronoun: Hen”). Rothschild gives examples of how this is happening: parents are being encouraged to use unisex names for their children, a Swedish clothes company no longer has a “girls” section as distinct from a “boys” section, and toy catalogues are following the same logic. Schools, sports, and even restrooms are following the trend. Of course, there has been opposition, including complaints that the feminists are destroying the language, but this has not stopped “gender pedagogues” from monitoring schools and taking action where necessary.

Sweden is a perfect example of a new world order, including a “new word order”, in sharp focus. Others will follow – slowly but surely, wouldn’t you agree?



Sunday, May 27, 2012

What Is Originality?


Educators like to promote original thought and creative expression, but what exactly is originality? If you go to the plagiarism-detection web site Turnitin, you will see one meaning of an “originality report”: the percentage of matching text. It is easy to infer that the lower the percentage of matching text, the greater the originality of ideas. Yet stolen ideas that are paraphrased are not easily detectable by such systems. In theory, students can recycle entire “research” papers, submit them to such services, and get away with it. Those who are too lazy to paraphrase their stolen ideas are caught more easily!
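
To see why paraphrase slips through, consider how a matching-text percentage might be computed in the first place. Turnitin’s actual algorithm is proprietary, so the following Python sketch is only an assumption of mine: it scores a submission by the share of its five-word sequences that also appear in a set of source texts.

# A toy "matching text" score: the percentage of a submission's word
# 5-grams that appear in any known source text. An illustrative
# stand-in, not Turnitin's proprietary method.
def ngrams(text, n=5):
    """Return the set of lowercase word n-grams in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def matching_percentage(submission, sources, n=5):
    """Percentage of the submission's n-grams found in the sources."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    known = set().union(*(ngrams(s, n) for s in sources))
    return 100.0 * len(sub & known) / len(sub)

source = ["the quick brown fox jumps over the lazy dog every single day"]
copied = "the quick brown fox jumps over a sleeping cat"
print(f"{matching_percentage(copied, source):.0f}% matching text")  # -> 40%

Note how cheaply such matching is defeated: swap one word in every five and the overlap collapses towards zero even though the ideas remain stolen – exactly the loophole described above.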

Few are truly original, since writers build on others’ ideas, as do innovators in various fields – scientists and engineers, fashion designers, chefs, etc. On her web site Brain Pickings, Maria Popova has posted thoughts from Henry Miller that are worth sharing:

And your way, is it really your way?

[…]

What, moreover, can you call your own? The house you live in, the food you swallow, the clothes you wear — you neither built the house nor raised the food nor made the clothes.

[…]

The same goes for your ideas. You moved into them ready-made.


Originality, it seems, is not a matter of black and white; there are different degrees and types of it. If students are encouraged to take fresh angles on their topics, synthesize ideas in new ways, and express themselves creatively, the chances of their producing “original” writing are raised – all the while, of course, remembering the need to acknowledge any sources.



Thursday, May 24, 2012

Banned Words


Words can be banned for various reasons. Let’s examine examples from France, the U.K. and the U.S.

Among various efforts to eliminate gender discrimination, feminists in France managed to have the word “mademoiselle” banned from official documentation a few months ago; it has been replaced by a generic “madame”. Last year, France banned the words “Facebook” and “Twitter” from TV and radio, dictating that general terms such as “social networking sites” be used instead, since such terms do not advertise specific companies. Years earlier, the culture ministry in Paris had published a list of 500 English words, such as “email”, “blog”, and “podcast”, recommending that certain French equivalents be used instead. Besides gender equality, national pride is clearly an issue for the French.

In the U.K. this month, Scotland Yard banned the terms “whitelist” and “blacklist” in an effort to reduce racism in the police force; staff have been advised to use “green list” and “red list” instead. Some police officers are not convinced that this will change anything, but following repeated allegations of racism, senior officials at Scotland Yard will go to any length to reduce sensitivities (“Blacklist Is Blacklisted”). Generally, though, the U.K. may be moving in the opposite direction – that of eliminating a 1986 law that bans “insulting” words but does not clearly define them. The BBC recently reported on the opposition to the law in “Law Banning Insulting Words and Behaviour 'Has to End'”.

In educational contexts, some expressions may be avoided if they are considered distracting for students. New York City’s Department of Education recently banned over fifty such items from the city’s standardized tests. Most of the words, such as “Halloween” and “dinosaurs”, appear innocuous on the surface, so no wonder the list has sparked controversy. Valerie Strauss, reporting on the ban for The Washington Post, says, “Why Halloween? Its pagan roots could offend the religious. Dinosaurs? Evokes evolution, which could upset creationists. You get the point” (“50-plus Banned Words on Standardized Tests”).

Watch your words. While some word bans may appear silly, others are clearly justified. It is worth staying up to date on these matters in order to adapt to different contexts, both synchronically and diachronically.


Posted by May Mikati on 24 May 2012, 11:56 PM

Saturday, May 19, 2012

How Natural Language Processing is Changing Our World


From speech recognition to speech synthesis, and from machine translation to data mining, natural language processing is changing our world.

In language-related applications, computers are gaining intelligence at an amazing speed. Some can now not only recognize basic spoken words and sentences but also resolve lexical and sentence-level ambiguity from context, and even recognize some idioms and metaphors. To top it off, computers are learning to detect emotion and respond appropriately. Automatic translation, meanwhile, is advancing daily, which may diminish future generations’ need to learn foreign languages. Speech synthesis has advanced to the point where systems will soon be able to translate your speech using your own voice. Theoretically, you will be able to hear yourself (or your voice, more correctly) speaking Hindi, Mandarin Chinese or even Mongolian in the not-too-distant future, without necessarily having learnt any of those languages. Given sufficient samples of your speech, such systems will be able to compose new sentences for you in the new language: they just need to know your voice, and they will do the rest of the work.
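
Sketched as a pipeline, the idea decomposes into three stages. The Python function names below are hypothetical placeholders of my own, not a real API; each stands in for an entire research area:

    # A hypothetical speech-to-speech translation pipeline (placeholder
    # functions only; none of these is a real library call).
    def transcribe(audio):                  # speech recognition
        raise NotImplementedError

    def translate(text, target_lang):       # machine translation
        raise NotImplementedError

    def synthesize(text, voice_model):      # synthesis in the speaker's voice
        raise NotImplementedError

    def speech_to_speech(audio, voice_model, target_lang="hi"):
        text = transcribe(audio)                    # e.g. English words
        translated = translate(text, target_lang)   # e.g. Hindi words
        return synthesize(translated, voice_model)  # Hindi audio, your voice

The hard part, of course, lies inside each placeholder, particularly the voice model that keeps the output sounding like you.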

Of course, automatic translation is a complicated task. Poetic language and uncommon metaphors and puns pose special challenges, as do certain expressions that may be considered “untranslatable”, requiring borrowing from the source language, adaptation, substantial paraphrasing or annotation. Still, as emphasized in tcworld, an international information management magazine, machine translation is becoming inevitable: “Over the next few years, every organization’s content strategy will rely on some type of machine translation” (“As Content Volume Explodes, Machine Translation Becomes Inevitable”).

As for data mining, we all know how search engines are speeding up our research, but more advanced searches can produce even better, more focused results, eliminating many of the irrelevant “hits” that ordinary search engines return. Just watch this video to see how search results can be refined through more intelligent queries: “How Natural Language Processing Is Changing Research”.

In this impressive video, Aditi Muralidharan, a Berkeley graduate student, explains her work on a new system called Word Seer. The system can save researchers reading time by quickly analysing digitized literary texts, using parsing that targets useful parts of sentences, such as adjectives and verbs. Instead of performing a simple keyword search, the system extracts very specific data. She gives the example of slave narratives being analysed for their references to God. Rather than simply typing in “God”, one asks specific questions about God: “What did God do?” elicits verbs, such as “gave”, “knew” and “blessed”, while “How is God described?” extracts adjectives, such as “good”, “holy”, “just” and “great”. The conclusion could be that slaves generally had a positive relationship with God despite their misery. For those working on the project, the hope is that the efficiency of the results will convince researchers in the humanities to adopt the technology: rather than having a graduate student spend five days reading through selected texts (those with the word God in them), one can extract the relevant information with the parser in five minutes.
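
As a crude approximation of such grammatical search, here is a short Python sketch using NLTK part-of-speech tags; it is my own illustration, and far shallower than Word Seer's actual parsing:

    # Crude Word Seer-style queries with NLTK POS tags (illustration only;
    # Word Seer itself uses full grammatical parsing, not word adjacency).
    # Requires: nltk.download("punkt"), nltk.download("averaged_perceptron_tagger")
    import nltk

    text = "God blessed the people. God gave them hope. God is good and merciful."
    tagged = nltk.pos_tag(nltk.word_tokenize(text))

    # "What did God do?": verbs directly following "God" (copulas excluded).
    verbs = [w for (prev, _), (w, tag) in zip(tagged, tagged[1:])
             if prev == "God" and tag.startswith("VB") and w not in ("is", "was")]

    # "How is God described?": adjectives in this God-centred passage.
    adjectives = [w for w, tag in tagged if tag == "JJ"]

    print(verbs)       # e.g. ['blessed', 'gave']
    print(adjectives)  # e.g. ['good', 'merciful']

Even this crude version hints at why grammatical search beats plain keyword matching.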

Such advances in natural language processing herald a bright future for the not-so-bright technologies we use today.


Posted by May Mikati on 19 May 2012, 9:17 AM

Wednesday, May 2, 2012

The Origin and Progress of the Academic Spring


In case you were wondering how the current “Academic Spring” started, it was apparently triggered by a posting on a university mathematician’s blog in January 2012. On April 9, The Guardian published an article on this, entitled “Academic Spring: How an Angry Maths Blog Sparked a Scientific Revolution”. The article identifies Tim Gowers, a distinguished, prize-winning Cambridge mathematician, as the initiator of the Elsevier boycott. Gowers received hundreds of comments on his post “Elsevier — My Part in Its Downfall”, and one of his supporters started a site to collect the names of academic boycotters. Thousands have signed up in just a few months; incidentally, two are from Lebanon already, including one from AUB.

A more recent Guardian article, dated May 1, shows British government progress on facilitating public access to research results: with the help of Jimmy Wales, all “taxpayer-funded research [will be] available for free online” (“Wikipedia Founder to Help in Government's Research Scheme”). The same article reports that Harvard University, angry at the high cost of journal subscriptions, has followed suit: it has encouraged its faculty members to publish openly and “resign from publications that keep articles behind paywalls”. The article cites David Prosser, executive director of Research Libraries UK (RLUK): "Harvard has one of the richest libraries in the world. If Harvard can't afford to purchase all the journals their researchers need, what hope do the rest of us have?...There's always been a problem with this being seen as a library budget issue. The memo from Harvard makes clear that it's bigger than that. It's at the heart of education and research. If you can't get access to the literature, it hurts research."

Having attended, in 2009 and 2011, international conferences on distance, open, and e-learning, and having witnessed the enthusiasm of participants, including that of UNESCO representatives, for open access to information, I am not really surprised by the momentum building up behind the Open Access movement; the Wellcome Trust and the World Bank are now also on board.

With one eye on the Arab Spring and another on the Academic Spring, one can easily lose sight of other important issues, however. One’s inner eye must always be on the lookout for less obvious but equally worthy causes.


Posted by May Mikati on 02 May 2012, 5:44 PM

Thursday, April 26, 2012

Open Access: An “Academic Spring”


One of the many academic databases we have access to at the American University of Beirut is the well-known Elsevier Science Direct, a resource that faculty members and students use to a considerable extent. Yet I was recently surprised to learn that thousands of academics worldwide had boycotted it as part of the boycott of the Anglo-Dutch science publishing giant Reed Elsevier, one of the world’s largest publishers of scientific, technical, and medical information, and owner of Lexis Nexis (another popular resource).

On April 1 – though this was no joke – The Chronicle of Higher Education published “An Open Letter to Academic Publishers About Open Access”, written by Jennifer Howard. Howard warned publishers that they should be nervous because of the new “Academic Spring” – the revolt against expensive publishers spreading throughout academia and represented by the Open Access movement.

Open access, of course, may be defined in various ways; the definition may be restricted to the relatively new open access journals, or it may include the less formal posting of working papers, blogs, and other non-peer-reviewed work. While the traditional requirements of conventional academic careers may dictate otherwise, who knows what the future might bring for academia? Web 2.0 has worked miracles so far. Besides, the United Nations, represented by UNESCO, supports open access.

If spring is here, can summer be far behind?


Posted by May Mikati on 26 April 2012, 5:51 PM

Monday, April 16, 2012

Softening Up the Language


Language teachers often find themselves teaching about euphemisms, whether intentionally or not. A euphemism is a relatively harmless word or expression meant to replace a more offensive one. The blind are commonly referred to as “visually impaired” or “visually challenged”, spying is “surveillance”, and stealing can be “appropriation”.

A particularly interesting word often used euphemistically is “overqualified”. Applied to a rejected job applicant, it can cover up the employers’ real reasons for the rejection: that the applicant is too old for the job, resistant to new technologies, or too demanding in terms of compensation.

The Economist editors recently published a report on euphemisms from around the world. Entitled “Making Murder Respectable”, their article defines euphemism as “a mixture of abstraction, metaphor, slang and understatement that offers protection against the offensive, harsh or blunt”. Noting that the British are “probably the world champions of euphemism”, the article concludes that, without euphemisms, the world would be a more honest but harsher place to live in. No witness to “the global war on terror” with its “friendly fire”, “collateral damage”, and “enhanced interrogation techniques” could possibly disagree.

Euphemisms definitely soften up the language, don’t they?


Posted by May Mikati on 16 April 2012, 8:34 AM

Friday, April 6, 2012

A "New Word Order" - 06 April 2012

A "New Word Order"


My previous blog post was about lexicography in the Internet age: how dictionaries are coping with the speed of language change. Here is solid background, and further reflection, on this ever more mercurial subject.

A Guardian article dated 2001 shows that the “New Word Order” was already setting in back then. Competition between dictionary makers was suddenly hotting up, and lexicographers had started implementing more sophisticated methods to keep up with language evolution. The author, D. J. Taylor, noted that speed had become of paramount importance in a field not particularly known for it. Hopeful for the future, he used a most revealing analogy: “If language is a butterfly, endlessly and effortlessly soaring above the heads of the entomologists who seek to track it down, then the nets are getting larger every year.” He reminded readers of Samuel Johnson, the most influential English lexicographer and the first to vehemently reject the prescriptivist approach, arguing that language is so variable that trying to police it is a doomed endeavour. Taylor added that while language does need close tracking, it is like a beast that has transformed itself into something else by the time one has captured and dissected it: some words take on new meanings between detection and publication.

Policing language is one thing; tracking it is another – hence the constant searches, solicitation of user input, statistics and research. Will any of the well-known dictionaries ever implement live online updates to their definitions, or will they continue to solicit new input between editions, adding appendices of possible new words? If they all go “live”, that may be better for users, but any print editions would automatically become obsolete. Will these dictionaries follow in the footsteps of the Encyclopedia Britannica soon?

Far-sighted thinkers, such as Michael Rundell, might ask whether there is a future at all for lexicography, or whether dictionaries will simply “dissolve” into our computers (“The Future of Lexicography: Does Lexicography Even Have a Future?”). It would be interesting to watch and see.


Posted by May Mikati on 06 April 2012, 2:46 PM

Thursday, April 5, 2012

How Dictionaries Cope With Language Change


Can English language dictionaries cope with the current speed of language change? While such change usually involves grammar, pronunciation, spelling, and vocabulary, English appears to be changing particularly fast in the realm of vocabulary: the incorporation of new words and expressions on all manner of topics, driven, among other things, by the speed of technological change. Yet technological change alone cannot explain it: Japanese, for example, has changed little compared with English, according to a recent National Science Foundation report; other social and cultural factors appear to play a role ("Language Change").

Paul McFedries’ intriguing web site Word Spy (The Word Lover’s Guide to New Words) is a good illustration of the density of new expressions entering the English language, some of which are making it into the dictionaries. To cope with the phenomenon, well-known dictionaries are providing constant online updates. Merriam-Webster, for example, has a section for words proposed by the public: “You know that word that really should be in the dictionary? Until it actually makes it in, here's where it goes” (“New Words and Slang”). How, then, amid the perpetual tsunami of new vocabulary, do dictionary editors decide which new words to include? First, an unabridged dictionary is likely to admit more new words than an abridged one, for reasons of space. Secondly, new words go through a long process before they are either incorporated or dropped, as Merriam-Webster illustrates. In short, a typical procedure involves three broad phases: editors “reading and marking” a variety of published material, noting neologisms, variant spellings, and the like; saving the marked passages, along with their citations, in a searchable database that shows not only where each text came from but in what context the new word was used; and “definers” reading through the citations and deciding which words to keep, based on the number of citations found for each word and the variety of publications using it over a substantial period of time (“How Does a Word Get into a Merriam-Webster Dictionary?”). The process is almost identical at Oxford Dictionaries (“How a New Word Enters the Oxford Dictionary”).
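
As a back-of-the-envelope illustration of that final "defining" phase, in Python, the decision logic might look like this; the thresholds are invented for the example, since Merriam-Webster publishes no such numbers:

    # Toy "definer" logic: keep a candidate word only if it is cited often
    # enough, in enough different publications, over a long enough period.
    # (Thresholds are invented; real editorial judgment is far subtler.)
    def worth_including(citations, min_citations=20, min_sources=5, min_years=10):
        """citations: list of (source, year) sightings of the candidate word."""
        if len(citations) < min_citations:
            return False
        sources = {source for source, _ in citations}
        years = [year for _, year in citations]
        return len(sources) >= min_sources and max(years) - min(years) + 1 >= min_years

On such logic, a word with dozens of citations from many publications across a decade would pass, while a one-summer buzzword from a couple of blogs would be dropped.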

Dictionaries are coping with the speed of change with the help of technology: easier access to a variety of publications, searchable electronic databases, user input and faster statistics. With time, dictionaries can only become more objective – descriptive rather than prescriptive, as most used to be. They are also becoming more democratic. A Wikimedia era of McDictionaries or a regulated lexicographic democracy? You decide.


Posted by May Mikati on 05 April 2012, 4:43 AM

Tuesday, March 27, 2012

April Fools?


Have you ever been tricked on April Fools’ Day? Apparently, some of the best known April first pranks have taken place in higher education settings.

In 1983, a Boston University professor of history, Joseph Boskin, when interviewed about the origin of April Fools’ Day, fabricated a story that was published by the Associated Press and later withdrawn. He claimed that in the days of Constantine some court jesters had told the emperor they could run the empire better than he did, and that an amused Constantine allowed a jester called Kugel to become king for a day, April 1. When the young AP reporter got the “story” published, Boskin used the incident to teach his students about false reports in the media: how the media can take a joke, innuendo, or story, treat it as authentic, and spread it. Luckily, the credulous reporter’s career was not ruined; he is now an associate professor in the College of Communication (“How a BU Prof April-Fooled the Country”).

The Massachusetts Institute of Technology has been associated with other well-known April first pranks. Among these was the 1998 hacking of the institutional web site by students who announced the “unprecedented acquisition of a non-profit educational institution by a Fortune 500 company”. They claimed that a huge Disney scholarship fund would reimburse past and future students for the following twenty years; the Engineering School would switch to “Imagineering”; the Sloan School would be renamed the Scrooge McDuck School of Management; there would be a Donald Duck Department of Linguistics; and Disney characters would appear in lectures to keep students alert, facilitating the learning process ("Walt Disney Corporation to Acquire MIT for $6.9 Billion").

The University of Cambridge has also had its fair share of April Fools’ Day stories. A posting on a student forum in 2011 announced that, due to government spending cuts, the Vice Chancellor had decided Cambridge would soon become a science-only university ("Cambridge to Cease Arts Teaching by the End of the Decade"). While some naive readers were shocked, others realised it could only have been a joke.

Let us all be on the alert this April first, and every day of every year; few are as fortunate as that former AP reporter in Boston, though many are equally, if not more, gullible.


Posted by May Mikati on 27 March 2012, 12:13 AM

Saturday, March 24, 2012

Fooling Around in the Classroom?


Does humour detract from the quality of teaching and learning? I would say that joking in the classroom is a high-risk activity for educators: much depends on the quality – and quantity – of the humour, as well as its timing. Personally, on the rare occasions that I do use humour, I relate the jokes to the subject matter I am teaching, and the first joke I tell in any class usually receives a positive reaction. From the second or third witticism onwards (if there is one), students’ reactions vary; alarm bells seem to start ringing for the paranoid in the audience. Yet while some appear uneasy, others may start imitating the humour in an effort to reciprocate.

Psychologist Ted Powers has written on the usefulness of humour in both teaching and assessment, citing Baughman’s famous statement, “One of the greatest sins in teaching is to be boring”. Powers defines humour broadly, as any event that elicits laughter: “It is not limited to jokes or humorous stories but can include props, puns, short stories, anecdotes, riddles, or cartoons. It can be anything that creates a positive feeling in students and makes them smile and laugh.” He refers to studies that have shown the benefits of occasional appropriate humour: increased attention and focus, a more liberal atmosphere, help with class management, better retention of information, and reduced anxiety on a test or quiz ("Engaging Students With Humor"). Similarly, Melissa Wanzer’s “Use of Humor in the Classroom” discusses research on the benefits and challenges of using humour.

Retired linguist and humour specialist Don Nilsen advises caution about the timing of humour, which can be counterproductive when students are under stress, for example before exams or when major projects are due; he also warns against the use of sarcasm ("Humor Studies: An Interview with Don Nilsen"). Still, he and his wife Alleen are great advocates of humour, having started a conference on it in the 1980s that was always held on April Fools’ Day weekend. They published a journal and wrote books on the subject, including an encyclopedia of American humour ("Twenty Five Years of Developing a Community of Humor Scholars"). Don Nilsen also gave undergraduate and graduate courses in linguistic humour and language play, illustrating devices such as chiasmus (criss-cross structures) through funny examples, as in the bumper sticker “Marijuana is not a question of ‘Hi, how are you?’ but of ‘How high are you?’”.

The field of humour studies is a well-established one now. Take a look at the International Society for Humor Studies, for example. You’ll see a journal, conferences, seminars and workshops, and resources galore.

Humour, then, may be more serious than you think! Language teachers, especially, should try some language play every now and then to lighten up their material. Challenging as this may be, it can be quite rewarding.


Posted by May Mikati on 24 March 2012, 10:14 PM