Friday, September 12, 2014

Language Learning for a Globalized World

In my previous blog post on motivation in language learning, I indicated that I would be following up on the subject.

An opinion piece by Jocelyn Wyburd in Times Higher Education stresses the need for UK policies to encourage language learning at all levels of education. Entitled “Give Languages a Fair Shout”, the article reminds native speakers of English that the status of English as a lingua franca should not be an excuse for ignoring other languages. Since Wyburd is Director of the Cambridge Language Centre, her opinion clearly carries weight.

Wyburd adds her voice to others decrying the decline of foreign languages in UK education, reminding readers that language learning not only enhances communication as “a gateway to understanding the world through the words, thoughts and cultures of others” but also has educational, cognitive and cultural value. To her, losing languages means losing “international insight”. She contrasts the situation with that of the rest of Europe, and now Scotland, where educational policy aims to equip students with two foreign languages, whereas in the rest of the UK only elite schools appear to mandate a foreign language. On a more positive note, Wyburd observes that employment pressures and research needs have motivated some students to pay attention to languages, yet she describes this interest as “instrumental”, as opposed to “the deeper, more specialist study of languages, cultures and societies, and the accompanying linguistic and intercultural competence.”

Finally, Wyburd supports the efforts of the All Party Parliamentary Group on Languages, with its recently released Manifesto for Languages. The manifesto starts with the following strong statement: “English is an important world language, but the latest cutting-edge research shows that, in the 21st century, speaking only English is as much of a disadvantage as speaking no English.”

So which other languages are important to learn? A BBC article on the “‘Alarming Shortage’ of Foreign Language Skills” cites the British Council’s Top 10 Languages, among which Arabic ranks second, after Spanish, followed by French, Mandarin Chinese, German, Portuguese, Italian, Russian, Turkish and Japanese (Languages for the Future). Teachers of Arabic will no doubt be happy with this news.

Wednesday, July 9, 2014

Motivation in Language Learning

The Guardian newspaper’s Education section on “The Case for Language Learning” shows a number of interesting recent entries.

Geoffrey Bowden regrets the declining interest in the UK in foreign language learning, asserting that “If There Aren't Enough Linguists, We'll Need Immigrants”; he sees the disappointingly low numbers of foreign language learners reported by the Higher Education Funding Council for England as a serious threat – “It is difficult to measure the financial cost of poor language skills to the UK economy.” Bowden suggests that government incentives should be provided. On the other hand, John Mackey, in “Wanting It Enough”, discusses the importance of motivation, “the secret to success”, in language learning. He emphasizes the role of learning in context, as in traveling to relevant countries and interacting with people. He says that most people who succeed at second language learning are highly motivated to learn, whether “intrinsically” or “extrinsically”, as language researchers put it. Intrinsic motivation stems from factors such as the need to make personal connections, while extrinsic stimuli could include wanting to pass a language test. Mackey warns, however, that motivation is not enough: research shows that, for success in language learning, aptitude and access to proper instruction must accompany motivation. He cites Steven Pinker on the neurophysiology of language in the brain, concluding that “The idea of people being hard wired for second language learning is fascinating and, perhaps, appealing in that it might be used to get some of us off the hook if our language learning journey is less than successful.”

The Guardian advertises a live debate in London on July 10 on whether medicinal drugs should be used to enhance language learning. Apparently, scientists have noticed that mood disorder drugs can improve language learning. The controversy revolves around various implications - ethical, practical, social and medical - and whether the advantages exceed the risks (“Are Drugs the Answer to Learning Languages?”). One is definitely motivated to read more on the subject, whether in the Guardian or elsewhere.

Wednesday, May 14, 2014

English or Globish?

Anthropological linguists Edward Sapir and Benjamin Whorf emphasized the link between language and culture decades ago. They hypothesized that one’s language affects one’s view of the world. For example, they reckoned that if your language did not have a word for a certain colour, you would not distinguish that colour very easily.  

Recently, The Guardian’s Peter Scott, a university professor and administrator, wondered what universities would be like if English were no longer the world’s lingua franca (“Will UK Universities Cope if English No Longer Rules the World?”). He began his article with the thesis that “Being an English-speaking country is a blessing – and a curse”. While being a native speaker of the language of Shakespeare, science, popular culture, tourism and business may appear to be a privilege, Scott rightly argues that in fact it locks one into “an anglophone prison”: this situation is disadvantageous because the less concerned native English speakers are about other languages, the less they will comprehend other cultures. Their understanding of others will remain superficial. He regrets that the number of Mandarin speakers who know English is much higher than the number of English speakers who know Mandarin, pointing out that this is an advantage for the Chinese and that “monolingualism inhibits multicultural sensitivity”.
Scott then suggests that English is no longer English in this globalized world: rather, it should be referred to as “Globish”, as there are many international versions of it. Additionally, he regrets the complacent monolingualism of anglophone students compared with the confident bilingualism of other students, who are also highly skilled in their fields. Further, the open source publishing movement, being freer, will promote other languages, Scott believes, bypassing the traditional “gatekeepers” of international science publishing. He sees a bias towards Anglophone universities in the global league tables, which may soon change in a more pluralist world, noting that Chinese dominance is not the only “alternative future”. The message is that one must be able to imagine alternative futures: Anglophone universities must be prepared for a more inclusive world, not just encouraging other languages, but also appreciating other cultures.
Robert McCrum, a British novelist and editor, has drawn a convincing parallel between English as a lingua franca and Latin: “Globish may be … a global phenomenon, but, like Latin before it, is vulnerable to change and decay. It won't be global forever” (“Globish and Its Discontents”). We are living in a fast-changing world. It will be interesting to see what the coming decades bring in terms of global language and culture.

Thursday, March 27, 2014

More Banned Words


A couple of years ago, I blogged about Banned Words on the occasion of the official French ban of “Mademoiselle”, which was replaced with “Madame” for all. Among the reasons some words are being banned across the world are sensitivities regarding gender, race, and religion, as well as national pride. This month, it has been reported that Saudi Arabia has banned fifty names, so it is time for an update on this subject.

In an article entitled “Saudi Arabia Bans 50 Baby Names”, Gulf News indicated that “The names fit into at least three categories: those that offend perceived religious sensibilities, those that are affiliated to royalty and those that are of non-Arabic or non-Islamic origin.” On the other hand, it remains a mystery why some others have been blacklisted: “A number of other names appear that do not necessarily fit into any category and it is therefore unclear as to why they would have been banned”. In any case, one of the banned names, “Al Mamlaka”, means “the kingdom”, so one can imagine why it might have been banned.

It isn’t just the Saudis who are banning words. The feminists are still at it as well. In The Guardian’s Women in Leadership section, Harriet Minter reports on the “#banbossy campaign” (“Open Thread: If We're Banning Bossy, Which Other Words Need a Rebrand?”). Started by Sheryl Sandberg, Chief Operating Officer of Facebook and author of Lean In: Women, Work, and the Will to Lead, the campaign aims at removing the word “bossy” from our lexicons because it is offensive when used to refer to women leaders: “…there's a sting that comes with being called bossy. A feeling that whilst you might be running the group nobody likes you for it and that's not something I'd wish on any child.” Minter rightly wonders what other words should also go, reflecting on the following examples: Aggressive. Ruthless. Ambitious. Forward. Go-getting.

She asks the reader to decide whether they see these as positive or negative adjectives, guessing that readers are probably influenced by whether they think they are applied to a man or a woman.

The author ends her article with the important question as to whether it is the words themselves that should go or the way we think about them: “So what would you ban? Or instead of banning words should we be campaigning for their acceptance?”

It is usually easier to remove a word than a mentality. Is that always the best solution though? I’ll let my readers think about it.

Sunday, March 2, 2014

Uncertainty About Uncertainty


I recently gave a conference presentation on “Risk Control in the Blended Learning Environment”. My research showed that online teaching and learning were generally viewed as risky in many ways, including the following: possible student cheating; absence of face-to-face cues; retention issues (MOOCs currently serve as extreme examples); technical hurdles; reliance on doubtful or contradictory web sources; and lack of recognition. The problem, however, is that these risks have not been quantified properly, if at all. There is uncertainty about uncertainty. I recommended blended learning as a compromise between the perceived risks of online learning and the assumed safety of the traditional teaching/ learning environment. Yet even when risks are objectively quantified, as in the health field, there are individual and cultural differences in risk perception and uncertainty avoidance. There are also complications concerning definitions; for example, definitions of cheating may vary, and what may be an irrelevant source in one sense may be highly relevant in another.    
By chance this week I came across a Macmillan blog post by Liz Potter on different ways of expressing uncertainty in English (“Life Skills Tip of the Week: Ways of Expressing Uncertainty”). What a coincidence, I thought. It was only logical to connect this to my idea of uncertainty. Here are some ways of indicating uncertainty in English, as expressed by Potter, each with a slightly different pragmatic application. If you are uncertain about the difference between them, check the above site.
  • perhaps/maybe
  • possibly/probably
  • apparently
  • as far as I know
  • to the best of my knowledge/recollection/belief
  • not to my knowledge
  • I imagine/suppose
I hope all this makes sense - or perhaps not!

Sunday, January 26, 2014

Unglamorous Grammar?

Most of the students I come across do not seem to consider grammar to be an exciting part of their learning though they do appear to realize its importance in their academic work, everyday correspondence, and future career prospects. Every year a number of current and former students ask me to check their language on various documents before they apply for jobs or graduate work. A student recently wrote a computer application and asked me to check it for grammar before he shared it online; it was a text-based application, so correctness was paramount.

It is time to bring the glamour back into grammar. In fact, etymologically the two words are related. Believe it or not, “grammar” is the precursor of “glamour”. Here is what the Oxford Dictionaries say about the origin of the word “glamour”:
Origin
early 18th century (originally Scots in the sense 'enchantment, magic'): alteration of grammar. Although grammar itself was not used in this sense, the Latin word grammatica (from which it derives) was often used in the Middle Ages to mean 'scholarship, learning', including the occult practices popularly associated with learning.
The Scottish online newspaper Caledonian Mercury confirms the origin of the word: “Glamour was originally a modified form of the word grammar. Grammar originally meant learning in general, rather than its modern sense, and it also referred to a knowledge of the occult or magic. Thus, grammar and glamour were both caught up in witchcraft” (“Useful Scots Word: Glamour”).
It is interesting to note a connection between magic and learning. The association between learning and power has traditionally been more salient, as in Francis Bacon’s “Knowledge is power”. Yet students need to realize the magic of grammar. Since it can enchant or disenchant readers, it can transform people’s lives. In “information literacy”, grammar is one of the criteria used in judging the credibility of a source!
Mind you, it is not only students that need guidance in grammar. Faculty members and non-teaching staff can also benefit from polishing it up as indicated in The Huffington Post article “Why Grammar Is Important” by William Bradshaw, author of The Big Ten of Grammar: Identifying and Fixing the Ten Most Frequent Grammatical Errors. Bradshaw reminds us that effective grammar gives leaders an advantage and that correct grammar is the basis of clear, effective communication: “… the better the grammar, the clearer the message, the more likelihood of understanding the message's intent and meaning. That is what communication is all about.” The author interestingly notes that non-native learners of English often have better knowledge of grammar than native speakers: “For those of us who have had international students in our classrooms, although they usually speak with a noticeable accent, their knowledge of English grammar is frequently superior to that of our own students.” A British Council source has made a similar observation:
Isn't there any difference between “knowing grammar” and “knowing about grammar”? In fact, there is a difference as “knowing grammar” is a facility which developed when we were small children and “knowing about grammar” is a reflective process, i.e. to be able to describe what the rules are. It is not a secret that sometimes native speakers of English don’t know any grammar and foreigners speak more correctly than the natives. The native speakers often fail to describe their own grammar knowledge and it is either because they have not thought to do so or because of poor teaching methods. (“The Importance of Grammar”)
A couple of grammar sites that I have found particularly useful are Mignon Fogarty’s Grammar Girl and the University of Northern Iowa’s Dr Grammar. On the other hand, research into the teaching of grammar and writing indicates that grammar should preferably not be taught separately, out of context, but as part of the teaching of writing: the most effective way of improving students’ grammar is to relate it to their own writing. After all, grammar without content can be pretty vacuous. Still, without grammar, where are real success and glamour?

Monday, December 30, 2013

The Selfish “Selfie”?

Different dictionaries have identified different choices as “Word of the Year” for 2013. Merriam-Webster, America’s leading dictionary publisher, has announced its top ten words of the year based on the top look-ups in its online dictionary, Merriam-Webster.com. The words are quite mundane: “science”, “cognitive”, “rapport”, “communication”, “niche”, etc. More interesting is the Collins Word of the Year – “geek”, not in the old sense of a boring unsociable nerd, but in the new sense of “a person who is knowledgeable and enthusiastic about a specific subject” (a little broader than the 2003 definition focusing on a preoccupation with computing). The evolution of this word shows just how fluid and fast-changing the language is.
For Dictionary.com, the Word of the Year is an old word with growing importance: “privacy”! From airport body scanners to global spying and corporations accessing user data, privacy has become a huge concern globally, triggering an open letter by fifty prominent writers urging the United Nations to establish an international bill of digital rights.
Most interestingly, according to the Oxford Dictionaries Online Blog, “selfie” is the word of the year, reminding us that “A picture can paint a thousand words”. While the word is not very new, having been “on the radar” for quite a while, it became popular in 2013. Besides, as the blog notes, “It seems like everyone who is anyone has posted a selfie somewhere on the Internet. If it is good enough for the Obamas or The Pope, then it is good enough for Word of the Year.” The dictionary blog notes that while self-portraits are not new historically, technology has made them much easier; the word was first spotted on an Australian online forum in 2002, after which it gradually gained some currency on social networking forums such as Flickr and MySpace before becoming most prominent in the last year or two. The blog also points out a promising feature of the word – its linguistic productivity; take for instance “helfie” (a picture of one’s hair), “welfie” (a workout selfie) and “drelfie” (a drunken selfie).
I wish my readers a successful new year, with lots of selfies, welfies, etc. After all, selfies are not necessarily as selfish as they sound!

Wednesday, November 27, 2013

Is Teaching Prestigious?

Let’s face it: teaching is not a prestigious profession in many parts of the world – even university teaching. Having researched this subject lately on the web, I have come up with a number of findings. First of all, the perception of low prestige is not a new one. An article entitled “How Can Teachers' Prestige Be Raised?”, dated Summer 1964, confirms this. The article discusses relevant U.S. surveys, beginning with two that had been conducted nationwide in 1960 and 1961. While the first survey was addressed only to elementary and secondary school teachers, and the second only to school superintendents, they clearly reflected perceptions regarding teaching back then. School teachers were perceived as being lower class to middle class and only “slightly above the average in prestige in a list of 90 representative occupations” (Chu, 1964, p. 333). A study of parents’ attitudes in New York at that time showed that fewer than a quarter of parents admired teachers, while a UNESCO survey showed cultural differences between the U.S. and the former Soviet Union, where teachers were highly esteemed, provided with a considerable range of services, and often elected for government positions (Chu, 1964, p. 333).

In the same article, Chu argued that since the teaching profession is far more influential than many other professions, it should be held in greater regard in the U.S. This, he claimed, could remedy “the shortage of teachers, the lack of permanency in the field of teaching, and the lower qualifications of teachers” (p. 334). Interestingly, Chu concluded from the surveys that teachers themselves should play the greatest role in raising their own prestige by, for example, “enriching their knowledge in the teaching field” while others who could influence perceptions include parents, school administrators and teachers’ organizations.

On the other hand, an Indonesian study conducted in 1961 showed that university teachers ranked at the top of a list of occupations in terms of prestige while other teachers ranked significantly lower (Murray, 1962, “The Prestige of Teachers in Indonesia”). School teachers also ranked low in a 2003 UK nationwide study conducted by researchers from Cambridge and Leicester, though university faculty seem to have been excluded (“The Status of Teachers and the Teaching Profession”).

Linda Hargreaves has more recently analyzed perceptions of teacher prestige across nations (“The Status and Prestige of Teachers and Teaching”, 2009). She concludes that there are clear differences in teacher prestige globally. With regard to Taiwan, for example, she refers to Fwu and Wang’s analysis of “the high status of teachers in Taiwan in traditional Chinese culture, which placed teachers in the realm of heaven, earth, the Emperor and parents, and deemed them especially privileged to explore and explain the essence and operation of the ‘True Way’” (p. 222).

While regard for teachers is high in countries such as Finland, Japan, and Taiwan, it tends to be low in countries where pay is lower. Still, Hargreaves warns that though pay may determine status, it does not necessarily determine prestige. She also warns against subjective self-perceptions of low prestige among teachers, referring to Turner’s 1988 analysis of the “distinctive American construct of ‘subjective status’”: “The subjective dimension is especially relevant in the case of teachers, whose subjective status typically underestimates, and, arguably, limits their objective status” (Hargreaves, 2009, p. 218). The author adds that in 2005 one of the OECD’s highest priorities was “the improvement of the image and status of teaching” (p. 219); she also points out that political instability may undermine teacher status (p. 221). One may add that economic instability can have similar effects (McCartney, 2011, “Budget Cuts, Falling Prestige Beset Teachers”).

The good news is that many governments across the world are aware of the importance of encouraging the teaching profession. The 2012 promise of the Ukrainian Prime Minister is one example: “Azarov Vows to Restore Prestige of the Teaching Profession”. In the U.S., the Woodrow Wilson Foundation similarly drew attention in 2012 to the need to improve attitudes towards public school teaching: “PDK/Gallup Poll on Education Affirms Need for Rigor, Prestige in Teaching”.

It is my belief that one way for teachers to encourage appreciation of their work is by blogging about it. As I mentioned in my previous blog post, only a few teachers in Lebanon are currently blogging about teaching or work-related matters; here are links to recently established blogs by a couple of colleagues – writing teachers at the American University of Beirut:

Amany Al Sayyed’s Blog
Jessy Bissal’s Blog.

Let’s hope these blogs inspire other teachers to similarly reflect and connect.

Monday, October 28, 2013

Starting a Teachers' Blogging Community in Lebanon

Perseverance pays.

When I started blogging in 2011, few - if any - other teachers in Lebanon were blogging about their teaching or work-related matters. In Lebanon, it is much more common to find teachers, especially university academics, blogging about politics and society in general. By 2012, I wished to encourage colleagues to blog so that we could share ideas and connect with each other and with teachers elsewhere. My departmental bloggers’ special interest group struggled to take off last year; fellow instructors of English were interested in the idea but could not find the time to get their blogs going. In any case, we agreed on a set of objectives which are beginning to bear fruit this year:

• encouraging blogging among English Department faculty members by initiating the first AUB blogging community

• maintaining our own, separate blogs in order to
  • reflect on our teaching and on writing and language matters in general
  • reflect on student issues and workplace issues
  • connect and share ideas with colleagues, and possibly with the outside world
A couple of colleagues have this year joined me by blogging on teaching-related subjects. Their blogs are currently active, and they intend to continue posting on a regular basis. Hopefully, I will be sharing links to their blogs soon.

Do stay tuned!

Monday, September 16, 2013

Elision & Ellipsis


My last blog post was about apocope, the dropping of one or more sounds at the end of a word. In fact, sounds may be dropped at the beginning of a word (a phenomenon known as apheresis) or in the middle of a word as well. The general term for the elimination of sounds is “elision”; the loss of sounds from the middle of a word is known as “syncope”, a term often used in the narrower sense of omitting vowels between consonants.

Examples of elision are abundant in fast and informal speech: “gonna” for “going to”, “ain’t” for “are not”, “’im” for “him”, and “cats ‘n dogs” for “cats and dogs”, etc. The pronunciation of a word such as “family”, omitting the second vowel, also illustrates elision. These words are spelled normally in writing (i.e. without the elision) unless one is trying to reflect the dialect or the exact level of informality. For second language learners, elision is a challenging part of listening comprehension, especially when the learners have not had sufficient contact with native speakers.
Omission of one or more redundant words from a sentence is known as ellipsis. Examples of dropped verbs include sentences such as “We did ( )”, while examples of dropped nouns include “There were two ( )”. In a special phenomenon called “answer ellipsis”, omission may be extreme, as in answering a question such as “Who borrowed the book from the library yesterday?” with “Mary” instead of “Mary borrowed the book from the library yesterday.”

Another form of ellipsis can be very useful when you are quoting lengthy texts. In such cases, you would want to skip unnecessary parts of sentences – without changing the meaning. This is done simply by placing three dots between the surrounding words or punctuation marks, as in “We must finish … promptly”. Unlike elision, ellipsis may be used in both formal and informal writing, and the Modern Language Association recommends putting square brackets around the three dots if they are your creation rather than original components of the quoted text.
In informal writing, such as email, some people use ellipsis excessively, replacing other punctuation marks, such as full stops and commas, with it. This is not advisable – nor is it proper grammar of course … unless you are being very liberal with your grammar rules. And what are those last three dots for, you may ask? You’ve guessed it: expression of hesitation or a pause in thought. It is a legitimate use, to be used sparingly. The Chicago Manual of Style distinguishes confident pauses, represented by dashes, from hesitant ones using dots: “Ellipsis points suggest faltering or fragmented speech accompanied by confusion, insecurity, distress, or uncertainty.” Still, Virginia Woolf, a highly successful writer, was an ellipsis enthusiast. In “Phases of Fiction” she points this out clearly: "Better it would be, we feel, to leave a blank or even to outrage our sense of probability than to stuff the crevices with this makeshift substance."

Now it is up to you, the reader, to decide whether and when to use these punctuation choices and omissions … or not.

Saturday, August 24, 2013

Welcome to Uni!

Do you find the word “university” formal and stiff? Do you often say “uni” instead? If so, then you are committing apocope: the omission of sounds from the end of a word.
The word “apocope” comes from Greek “apokoptein”, meaning “to cut off”. Many English words have become abbreviated in this way: advertisement/ad; application/app; administrator/admin; decaffeinated/decaf; magazine/mag; and teenager/teen.
What do you think of this trend? Does it make you sound sloppy? Or do you find it fab? While there is no right and wrong in such matters, one must pay attention to the context – the situation. In formal academic writing, for example, such abbreviated forms may be inappropriate. For more cred, use your head!

Wednesday, July 24, 2013

Don't Say Goodbye ...


Have you ever left a gathering without saying goodbye? If so, then you have “ghosted” – in a different sense, of course, than that of ghosting student essays and articles, or writing for others. What caused me to reflect on the different senses of the word was a recent article by Seth Stevenson in Slate magazine, entitled “Don’t Say Goodbye, Just Ghost”.

Stevenson argues that while party hosts might appreciate the politeness of guests bidding them farewell before leaving, it might be impractical and time-consuming for everyone to do this at a large gathering. His advice is to “just ghost”, though he admits that the act of suddenly disappearing from a group might have been frowned upon in the past, as evidenced by “ethnophobic” synonyms such as “the Irish goodbye”, “the French exit” (French leave) or the less commonly known “Dutch leave”. Clearly, as a semantic choice, “ghosting” is a milder alternative to “Irish goodbye”, with its negative connotations of inconsiderateness and rudeness. It is also less culture-laden than the “French exit”, which, ironically, the French themselves render as “filer à l’anglaise”, leaving the English way. Perhaps the fact that a ghost is colourless (and cultureless) helps.
What other meanings are there for the word “ghost”? A good dictionary will provide several possible senses for the noun, with meanings ranging from that of a soul in general to the soul of a dead person, to the Holy Ghost, to a demon, to that of a red blood cell without haemoglobin, in medicine. There is also the sense of a trace or memory of something, or a false image on a photographic film or screen. Additionally, the term may refer to a fictitious employee, business, or publication listed in a bibliography. The verb “ghost” appears as both transitive and intransitive, in multiple senses including that of “ghost write”, haunt, and move silently like a ghost. Idioms include "pale as a ghost"; to “give up the ghost”, meaning to stop trying or - euphemistically - to die (also used humorously in relation to machines); to “look like you’ve seen a ghost”, in the context of fear; “not have the ghost of a chance”, meaning to not have any chance at all, and "the ghost at the feast", that is, someone who spoils your enjoyment by constantly reminding you of something unpleasant.

In computing terminology, the term has multiple senses. Ghost computers and ghost web sites are used by hackers and phishers respectively, and ghost imaging clones the software on one computer for other computers, using ghosting software. In drug slang, the word may apparently refer to LSD or cocaine.

Enough of this word! Let us lay the ghost of this subject to rest.

Sunday, June 23, 2013

Ghost Writing: In Need of a Cure




As mentioned in my previous blog post, ghost writing is a sad global development in academia. In fact, it is not only global in scope – meaning very widespread – but also international in dimensions, in the sense that impoverished students and graduates in certain parts of the world are writing for others who can afford it elsewhere. The supply and demand coexist and there are writers’ “factories” out there fuelled partly by the massification of higher education.
Many of the ghost writing stories one finds on the Web are U.S.-related, but there are also reported cases from elsewhere. UK-based A-level students apparently hire writers locally and internationally, from places as far away as Canada, Egypt, Romania, India and Pakistan. Chinese academics, as well as students, buy papers to boost their publication lists, as reported in 2010 by the BBC ("Chinese Academia Ghostwriting 'Widespread'"). More recently, Chinese students in New Zealand universities came under suspicion when a service catering to them was identified, yet it has been argued that they have been targeted out of racism and xenophobia as they are not the only ones using such services (“Ghost Writing is Ubiquitous”). Similarly, students in Russia have been reported to buy not only academic papers but entire degree certificates as well.
In this blog entry, the ghosts are smiling because they do not realize the seriousness of the problem. One “senior corporate marketeer” interviewed by the Bangkok Post admitted some guilt, saying that if she were to think hard about what she was doing, she would stop it – but she doesn’t give it much thought (“Lost for Words”). A former ghost writer interviewed by the same newspaper said he decided to stop because he realized it was wrong – he had simply been helping lazy students – though he found nothing wrong with “editing” work. Besides writing essays, some in Bangkok, as elsewhere, write statements of purpose for students applying to universities, claiming the students are bright but linguistically deficient, while others label the latter as “rich and stupid”.
Euphemisms abound with regard to ghost writing. Besides “editorial work” for students, there are the more serious cases, such as those of medical publications impacting public health, with “guest authors” or “honorary authors”. The marketing of the drug Vioxx, for example, which was withdrawn in 2004, has been linked to such writers. In the worst-case medical scenarios, a poorly tested drug is marketed after ambitious, well-known medical specialists have been invited to put their names on articles relating to clinical trials they have not been involved in. The drugs are then sold with little or no reference to possible side effects, let alone confirmation of actual efficacy. Public health disasters follow, and the authorities are alerted only when it is too late. Reuters has used the euphemism of “omission from a published study's author list of a person who substantially contributed to the work”; other sources have used the term “invisible author” to show that the real authors are not the same as the named authors, whose names, legally speaking, should not be on the articles.
What can be done globally to combat ghost writing? The role of language teachers in university courses is limited as the problem is not simply a language matter; still, university faculty members should remain on the alert, actively encouraging proper writing practices. The medical examples illustrate serious corruption, in academia and elsewhere, as well as conflicts of interest requiring prompt attention and legal action. The interest of universities in advancing knowledge for the benefit of humanity (while assessing people's work fairly) clashes in these cases with the interest of the businesses involved (such as pharmaceuticals) in making profit quickly and, unfortunately, unscrupulously in some cases.

Tuesday, May 21, 2013

Haunted by the Spectre of Ghost Writing


Yesterday a university colleague referred me to an article in one of the best-known Lebanese newspapers citing me on student cheating: “Students Buy Assignments As Semester Ends”. I had warned colleagues that I had been interviewed on this subject by a Daily Star reporter and that an article that could refer to any or all of our courses was due out soon; we had been anticipating the piece. The startled young writer had politely knocked on our Fisk Hall office door last week and asked a colleague and me whether we could answer some questions about student cheating on written assignments, including the basis on which we suspected it. The reporter had already investigated the ghost writing business in Lebanon and was seeking more information, determined to cite the views of faculty members. The problem was widespread, she said, wondering whether anything more could be done to curb it.

The ghost writing business is illicit and generally done surreptitiously. Though alarmed by the information she had gathered, the reporter seemed comfortable talking to us about it, having worked as a graduate assistant in our department in the past. Her findings, focused on universities in Lebanon, are in my view local examples of a global issue: a disease that is geographically pervasive and reflected online, as many paper mills advertise and sell on the internet. The phenomenon is hardly new to the world, though the problem has come to light more clearly in the past decade or two through the internet.
To my knowledge, some celebrities and politicians use ghost writers for their biographies, speeches and blogs; some artists use them; and a number of pharmaceutical companies have resorted to medical ghost writing to promote their products, so students are not the only culprits in this world. Still, the academic use of ghost writers requires special attention as it is a problem worse than plagiarism. Some plagiarism is unintentional, when, for example, students complete their own assignments but lack skills in summarizing, paraphrasing, or quoting, especially in a foreign language; or when they have not understood the importance of crediting their sources though they have been taught about it. While a partially plagiarized text might include some student effort, a ghost-written paper generally does not, except possibly when the students provide the dealer with articles and other sources as content.  
It takes courage to discuss this taboo subject and bring it into the open, as it can undermine the credibility of those involved, be they companies, celebrities, students or others. English teachers have a special role to play in reducing this problem by motivating students to write, engaging them with relevant topics, teaching effective research and writing skills, emphasizing processes rather than products, and discussing students’ projects with them, providing feedback from start to finish. Teachers of other subjects should also pay attention to the types of assignments they set, as the more unreasonable the assignment, the more likely the students are to resort to external help or “services”.
 
This is my initial reaction to the newspaper article on students purchasing papers. I hope to respond further on this issue in future blog posts as a short posting such as this cannot do this profound subject justice.

Saturday, April 27, 2013

The Most Popular Languages

Most sources would agree that the top five languages in 2013, judging by the number of speakers, are Mandarin Chinese, Spanish, English, Arabic and Hindi. Does this mean that these languages are “better” than others? Does it mean they are more in demand? The answers are not straightforward. Linguists tend to concur that no language is superior to any other. Chomsky, for example, is famous for his theory of Universal Grammar: that all languages are fundamentally similar, even without exact point-to-point correspondence, and that all people are born with a capacity for a “universal grammar” which manifests itself through concrete languages. He theorized that UG was in people’s genes not only metaphorically but also literally.

What about the importance given to the English language worldwide? The Telegraph’s study advice section recently warned that, in international business, it is no longer safe to assume that everyone is happy with English: “It’s no longer permissible to simply assume that clients will be comfortable speaking English, particularly if you’re looking to set-up lucrative ongoing business links” (“What’s the Best Language to Learn to Further Your International Business Career?”). Rather, the advice is to learn Mandarin Chinese, as many companies are moving to China, or Russian, as Russia is important in oil and gas production. On the other hand, one is advised not to forget the continuing global importance of European languages such as German, French and Spanish.

For UK-based native speakers of English, Arabic and Polish are almost equally important these days – the former partly because of Qatari investment and the latter because of the huge influx of Polish immigrants over the years. Still, these languages are outranked in importance by Mandarin as well as Spanish, French and German. Germany is one of the UK’s largest export markets while Spanish-speaking Latin America includes important, fast-developing markets (“Graduate Jobs: Best Languages to Study”).

Corinne McKay, a US-based certified translator who recently blogged on the question of “Which Language Is ‘the Best’?”, thinks that Middle Eastern and Asian languages score highest in terms of “critical need”. She admits, however, that translating from these languages into English is difficult because of significant cultural differences, unlike translating from European languages into English. Translators brought up in the US who have not lived in China or the Arab world, for example, may find it more challenging to translate from the relevant languages than from French or German – and culture is not the only hurdle.
Arabic is a good example of a “difficult” language: it is a Semitic language, like Hebrew, unlike French, which is Indo-European and has more word and word structure similarities with English; some vocabulary items in French are almost the same as their English equivalents, and both languages use an s at the end of words for the plural. Arabic vocabulary, in contrast, is very different, and hence more difficult to learn, and plural formation is different, not to mention that Arabic has “dual” pronouns besides singular and plural. Furthermore, because Arabic script goes from right to left and is cursive, as in English handwriting, it appears tougher to decipher. While there is a better correspondence between spelling and sound in Arabic, some sounds are not easy for native speakers of English; certain “phonemes” do not exist in their language. Then there is the issue of diglossia, the difference between colloquial Arabic and standard Arabic, and complications with different dialects depending on the region. In some cases, the dialects are so different that native speakers of the language have trouble decoding each other’s utterances. Besides, dialects are only spoken; unlike standard Arabic, they are not meant to be written. They are still important to learn though, complicating matters for the language learner.

Whether English is really in decline would be an interesting question to answer. While the percentage of native speakers of English appears to be decreasing as other populations multiply more quickly, English remains important as a second or foreign language, if not in business, then definitely in science. For native speakers of Arabic, one may assume, English is still crucial as the language of science, whereas Chinese may increasingly be the language of future business.

Tuesday, March 26, 2013

Why Some of Us Still Don't Use Twitter

When I discovered Twitter, several years ago, I did not find it appealing at all and therefore did not subscribe to it. I was already using Facebook for social networking and LinkedIn for professional networking. Twitter presented itself as a mere distraction in comparison, a redundant tool that would waste my valuable time. It also looked and sounded dry: not much in terms of pictures, stories or jokes; no “friends” – just “followers” and “followees”. The idea of being a “follower” of others did not appeal to me (doesn’t the word carry connotations of subservience, stalking, or both?). Nor was I excited by the prospect of being “followed”. Besides, from an English teacher’s point of view, I was discouraged by the abbreviated language of Twitter, which defies spelling, grammar and punctuation conventions. Why would an English teacher want to be involved in such an environment when teachers are supposed to set a good example of Standard English, including complete sentences and well-developed paragraphs? My initial impression was that Twitter was for those who don’t know how to write!

My view of Twitter has changed only slightly over the years. Knowing that many highly educated people use it, including public figures, I now see it as a tool for three categories of people: those who don’t need to set a good example language-wise, those who don’t have the time to write properly or at length, and those whose writing is not presentable in the first place. One can see the wisdom of valuing content and meaning over style, yet how much content can you squeeze into 140 characters, and how much depth, analysis or synthesis can go into that? Twitter is an excellent tool for brief announcements and comments. Beyond that, I believe Facebook and LinkedIn are superior – and so is proper, old-fashioned blogging. In this regard, I agree with Devin Coldewey, a Seattle-based writer and photographer, who said in 2009, “…if someone is so regularly finding content of merit, why don’t they have a blog where the content can be given context, discussion, and perhaps a preview so people aren’t going in blind? I like reading interesting blogs. I don’t want to receive links every time someone finds something they think everyone should see. Twitter just adds another layer to the equation — and I don’t like layers” (“Why I Don't Use Twitter”). A large scale study conducted by the data analytics provider Pear Analytics actually concluded that 40% of tweets were “pointless babble”, more than a third were “conversational”, and around 9% had only “pass along” value (Mashable).

From a business point of view, companies are using social media for public relations purposes. People like to see what CEOs think, and they can now find some of them on Twitter. Ellen Roseman of the Toronto Star hopes that Twitter “sticks around forever” if it truly connects corporate leaders to customers more effectively (“Why Smart Consumers Should Use Twitter”). On the other hand, if – like me – you are neither a company CEO nor a particularly worried “consumer”, why would you join Twitter over and above other online networking tools? For news gathering and information on current events? If you already use Facebook, you would need extra time for Twitter and you might end up finding the same information there in any case; besides, you can always go to news sources directly rather than waiting for others to share, layer upon layer. So many tools, so little time to juggle!

Monday, March 4, 2013

Researching “Research”

On the occasion of National Grammar Day in the U.S., this posting focuses on a puzzling grammar point.
Recently, I managed to provoke an online discussion in our English Communication Skills Program at AUB about a controversial grammar issue. The subject of the discussion was “Students Pluralizing ‘Research’: Right or Wrong?”. What triggered my initial posting was my disappointment with students pluralizing the noun “research” even after I had explained that it is better not to pluralize it because it is generally uncountable – plus the fact that, to my dismay, some of the best-known dictionaries have started accepting the usage.
After seeing the Macmillan Dictionary’s entry on “research”, which makes perfect grammatical sense, providing examples of usage “errors” in a “Get it right” section along with corrections, I was surprised to find that a number of other dictionaries, including Wiktionary and the Cambridge and Oxford dictionaries, accept the countable form of research – “researches”. This plunged me into a deep depression, but I guessed that, since the better dictionaries depend on statistics – aiming to be descriptive rather than prescriptive – it is hard to argue with them. (Here is more information on how words enter dictionaries from a previous blog post of mine, “How Dictionaries Cope With Language Change”; the post also happens to include a link to “How a New Word Enters the Oxford Dictionary”).

In any case, the input I solicited from fellow English Communication Skills teachers on how they handle the matter with their students made it clear that the instructors were divided in their opinions. Out of the six colleagues who contributed to the discussion, two seemed to be in favour of accepting the plural, or at least not penalizing students for it. One appeared to be between the two extremes, though her answer was somewhat vague, and the remaining three were vehemently opposed to the usage. Here are extracts from what they said:
·     “I usually (if not always) cross out the 'es' when students pluralize 'research' - I like the examples/samples listed in Macmillan dictionary and their complete rejection of the plural form.” (Rima Shadid)

·     “I automatically cross out the  ‘es’ and replace it with ‘studies’ as I mark my students' papers...’research studies’...I do so not necessarily because ‘researches’ sounds a little odd to me, but rather simply because ‘research studies’ is usually more accurate.” (Missan Laycy Stouhi)

·       “With regard to dictionaries:  Just because a dictionary does not set a particular standard, this does not mean that the standard does not exist (dictionaries are not the be-all and end-all of language use)…. If American society still equates nonstandard with substandard after all of this effort, how can we expect an individual here or there who uses nonstandard English to have much impact…?” (Kathryn Lincoln)

Apparently, we are not the only people in the world (or on the web) debating this grammar point. Take a look at this forum, for example, where someone asks, “I am not sure about the plural of research. Can you help me?”: http://forum.wordreference.com/showthread.php?t=1828694
·         One person replies “researches”.

·         Another person says, “No, I would argue ‘research’ is uncountable because it doesn't sound right to say ‘Yesterday I did three researches.’ It would either be ‘Yesterday I did research’ or ‘Yesterday I did three research assignments/cases/files’ etc. The only time you would have ‘research’ in plural is to refer to the person who does research or their job title. i.e. ‘We have three researchers.’ (Note the spelling- not ‘researches’)."

·         Yet another comments, “Sorry, Jack. It can be a countable noun in some cases or, at least, it's starting to be used that way…. (Definition of research noun from the Cambridge Advanced Learner's Dictionary).”

·         The final comment on the thread is, “This is an excellent example of the difference between what one finds in the dictionary and how one speaks. With respect to modern spoken English (at least in AmE), Jack is absolutely right: we do not use the plural ‘researches’. The fact that it's in the dictionary is secondary to modern usage.”

Teachers – and students – out there, what do you think? My advice is that, if something is going to sound jarring to your readers or listeners, use a safer alternative – never forget the audience. Besides, in this case, if you still see research as a process rather than a product or an object, why pluralize it?

Tuesday, February 5, 2013

Word of the Year


Words, Words, Words
Different organizations have voted for different words as “Word of the Year 2012”.
The American Dialect Society chose “hashtag”, the well-known symbol used on Twitter. Surely, that is not surprising, knowing the increasing popularity of the social network and the networking and sharing tools it provides. Still, for some, the choice was somewhat unexpected – leading New York Times blogger Jennifer Schuessler to refer to the word as a “dark horse” winner. Other top candidates, including acronyms and phrases, were “YOLO”, short for “you only live once”, “fiscal cliff”, “marriage equality”, referring to the legalization of gay marriage, and “Gangnam style”. Of these, “YOLO” was voted least likely to succeed, contrary to “marriage equality”, rated most likely to. Interestingly, among the categories was one for the most euphemistic expression, where “self-deportation” was the winner, meaning “policy of encouraging illegal immigrants to return voluntarily to their home countries” (that is, by making life difficult for them rather than officially expelling them).
According to the Merriam-Webster Words of the Year, the two most looked-up words in 2012 were “capitalism” and “socialism”, probably prompted by the year’s U.S. elections, including the healthcare debate; people tended to look up the words together, Peter Sokolowski, the dictionary’s editor-at-large, told CBS News (“’Socialism’ and ‘Capitalism’ Revealed as 2012 ‘Word of the Year’”) – a bit like looking up “depression” and “mania” together, one might reckon! The 2011 Merriam-Webster word of the year was “austerity”, not surprising considering the state of the world economy that year.
The Oxford dictionaries of the U.S. and the U.K. also had their 2012 favourites, respectively “to gif”, from the well-known file format, and “omnishambles”, meaning a disastrous situation, whichever perspective you take. Still, though chosen by the relevant lexicographers as the most interesting words of the year, the terms do not necessarily enter the dictionaries and may fade away with time.
What about “Arab Spring”, you might ask? Does it not deserve a place in all this? Well, the term was actually chosen by Global Language Monitor as the 2011 phrase of the year, along with the word of the year, “occupy”. The Monitor's 2012 choices were “Gangnam style” and “apocalypse”.
Time will tell which words make it to the top in 2013. Let’s watch and see.

Monday, January 21, 2013

Myths About University Faculty

A recent Forbes magazine article by Susan Adams created such a stir, drawing torrents of reader comments, that she had to update it quickly, acknowledging that she may have been mistaken – though not apologising for the offense. In “The Least Stressful Jobs of 2013”, she had given the impression that university faculty had such easy-going jobs that they were to be envied for their generally stress-free (i.e. possibly lazy) lifestyles:

"University professors have a lot less stress than most of us. Unless they teach summer school, they are off between May and September and they enjoy long breaks during the school year, including a month over Christmas and New Year's and another chunk of time in the spring. Even when school is in session they don't spend too many hours in the classroom ... Working conditions tend to be cozy and civilized…."
Note the phrase “unless they teach summer school”, and note the focus on “hours in the classroom”, as if work outside the classroom such as grading, student office hours, faculty meetings and committees, research, creating new activities and exams, and updating syllabi and course material does not count.
Although Adams cites her major source of information, CareerCast, she clearly overlooks the work university faculty are involved in outside the classroom, which her source does mention: “Work beyond the lecture hall is also a vital facet of the professor’s day. Postsecondary educators can assist in scholarship committees and curriculum development. Research is also a critical part of the university professor’s responsibilities, as educators typically are expected to produce published works” (Kensing).
One of Susan Adams’ critics, Forbes colleague David Kroll, has countered her article, expressing surprise and disappointment at her “misguided” piece in “Top 10 Reasons Being a University Professor is a Stressful Job”. Based on his personal experience, among the reasons for faculty members’ stress are the following: a “the customer is always right” mentality applied to students unprepared for higher education; abuse of part-time faculty (threatening full-timers that they may be replaced by adjuncts); administrators and the public undervaluing teaching loads; administrators undervaluing online teaching (“If you’re already teaching the class, it’ll be nothing to throw it up online, right?”); hiring too many administrators at the expense of faculty members; and the fact that “tenure” has become meaningless: “I’ve rarely seen a tenured professor be fired but a professor with tenure who is deemed unproductive by whatever anonymous review can certainly be made to wish they didn’t have a job.”
Any experienced educator would rightly side with Adams’ critics. The fact that Adams cites a source is not enough to justify her warped, overgeneralized claims.

Sunday, December 30, 2012

New Year's Madness?

Seeing the traffic and the hustle and bustle at the end of the year in our cities can get one wondering about the meaning of a new year. Why does the whole world celebrate the New Year? Why is the end of December such a special turning point? Is this demarcation not rather arbitrary compared with other celebrations? The meaning of New Year’s is not as clear as that of Christmas, for example (good will to all), or Independence Day. Are the celebrations completely vacuous, or do they have a deep psychological significance for people around the globe?

In ancient times, people welcomed the New Year with rituals meant to attract good fortune. The Ancient Romans caroused in the streets for several days, around food-laden tables. The Ancient Babylonians, Hindus, Chinese, and Celts sought to restore order after chaos at the turn of the year. To this day, starting fresh remains a common concept in many cultures.

The month of January is actually named after the Roman god Janus, the god of gates, doorways and thresholds: a two-faced god with faces looking in opposite directions, representing the past and the future. No wonder, then, that at the end of the year people reflect on past achievements and plan for a brighter year ahead. How deeply people reflect on their values, however, is not always evident in the resolutions they proclaim. Around the world there are resounding, common themes: people want to be healthier, to exercise more and smoke less, to be more active members of their communities, to be more productive at work, etc. Psychologically, people want to improve themselves and their lives in general.

While there is nothing wrong with recalling one’s values and wanting to advance, the question remains: why only now? Isn’t December the thirty-first, technically speaking, the same as any other day of the year? Why not remember our values daily, throughout the year? Why not seek improvement all year round, regularly reflecting on our successes and failures, our goals for the future?

The point here is not to belittle New Year’s celebrations, although they can be extravagant out of all proportion to the real significance of the New Year, nor is it to undermine New Year’s resolutions. The point is that one can sympathize with those who laugh at the crowds flooding the gyms in January only to dwindle out of sight by February, and one can understand those who decide not to celebrate at all.