Sunday, December 30, 2012

New Year's Madness?

Seeing the traffic and the hustle and bustle at the end of the year in our cities can get one wondering about the meaning of a new year. Why does the whole world celebrate the New Year? Why is the end of December such a special turning point? Is this demarcation not rather arbitrary compared with other celebrations? The meaning of New Year’s is not as clear as that of Christmas, for example (good will to all), or Independence Day. Are the celebrations completely vacuous, or do they have a deep psychological significance for people around the globe?

In ancient times people welcomed the New Year with rituals meant to attract good fortune. The Ancient Romans caroused in the streets for several days around food-laden tables. The Ancient Babylonians, Hindus, Chinese, and Celts sought to restore order after chaos at the turn of the year. To this day, starting fresh remains a common theme across cultures.

The month of January is actually named after the Roman god Janus, the god of gates, doorways and thresholds: a two-faced god with faces looking in opposite directions representing the past and the future. No wonder then that at the end of the year, people reflect on past achievements and plan for a brighter year ahead. How deeply people reflect on their values, however, is not always apparent in the resolutions they proclaim. Around the world there are resounding, common themes: people want to be healthier, to exercise more and smoke less, to be more active members of their communities, to be more productive at work, etc. Psychologically, people want to improve themselves and their lives in general.

While there is nothing wrong with recalling one’s values and wanting to advance, the question remains: why only now? Isn’t the thirty-first of December, technically speaking, the same as any other day of the year? Why not remember our values daily, throughout the year? Why not seek improvement all year round, regularly reflecting on our successes and failures and our goals for the future?

The point here is not to belittle New Year’s celebrations, although they can be extravagant out of proportion to the real significance of the occasion, nor is it to undermine New Year’s resolutions. The point is that one can sympathize with those who laugh at the crowds that flood the gyms in January only to dwindle out of sight in February, and one can understand those who decide not to celebrate at all.

Monday, December 10, 2012

Breaking Language Barriers

Your voice in language 1 → machine translation → your voice in language 2

Last May I blogged about the ways in which natural language processing is changing our world, mentioning a number of applications, including automatic machine translation of speech using one’s own voice. I wrote that computer speech synthesis had advanced to an extent that, in the near future, systems would be able to translate your speech using your own voice. Theoretically speaking, you would be able to hear your voice speaking a foreign language without your necessarily having learnt that language. With sufficient samples of your speech, such systems would be capable of putting together new sentences for you, in the new language. The systems would just need to know your voice, and they would do the rest of the work.

Well, that future is here now. Rick Rashid, Microsoft’s Chief Research Officer, has recently demonstrated automatic translation of his English speech into Chinese at Microsoft Research Asia’s 21st Century Computing event. One of his blog posts includes a recent video of his demonstration, entitled “Speech Recognition Breakthrough for the Spoken, Translated Word”. In that post, he explains that the first challenge in such systems is for the computer to actually understand what you are saying – a challenge acknowledged by experts decades ago; the past decade has seen breakthroughs reflected in a “combination of better methods, faster computers and the ability to process dramatically more data” (“Microsoft Research Shows a Promising New Breakthrough in Speech Translation Technology”).

Rashid adds that over two years ago Microsoft researchers made their systems far more intelligent by using a technique called Deep Neural Networks that simulates the way the human brain works. He asserts that error rates in speech recognition have been reduced from around one in five words to about one in eight, and that, similarly, machine translation has become more reliable. In the case of English to Chinese, the system works in two steps: first, it translates the English words into Chinese equivalents; then, it re-arranges the words to form proper Chinese sentences. While acknowledging the persistence of errors, including amusing ones, in both the English text and the translation, he projects that, in a few more years, these systems will be astonishingly reliable.
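To make the two-step idea concrete, here is a toy sketch in Python. The five-entry lexicon, the pinyin output, and the single adverb-before-verb re-ordering rule are all invented for illustration; the actual system is statistical, trained on vast amounts of bilingual data, and vastly more sophisticated.

```python
# Toy two-step translation: (1) substitute words via a bilingual lexicon,
# (2) re-order to match target-language word order. Purely illustrative.

LEXICON = {"i": "wo", "love": "ai", "you": "ni", "very much": "hen"}
VERBS = {"ai"}
ADVERBS = {"hen"}

def substitute(tokens):
    """Step 1: greedy longest-match lookup, so 'very much' maps to one word."""
    out, i = [], 0
    while i < len(tokens):
        pair = " ".join(tokens[i:i + 2]).lower()
        if pair in LEXICON:
            out.append(LEXICON[pair])
            i += 2
        elif tokens[i].lower() in LEXICON:
            out.append(LEXICON[tokens[i].lower()])
            i += 1
        else:
            out.append(tokens[i])  # pass unknown words through untranslated
            i += 1
    return out

def reorder(tokens):
    """Step 2: move adverbs in front of the verb, as Chinese word order requires."""
    adverbs = [t for t in tokens if t in ADVERBS]
    rest = [t for t in tokens if t not in ADVERBS]
    out = []
    for t in rest:
        if t in VERBS and adverbs:
            out.extend(adverbs)
            adverbs = []
        out.append(t)
    return out + adverbs

print(" ".join(reorder(substitute("I love you very much".split()))))
# -> wo hen ai ni  (pinyin word order of the Chinese for "I love you very much")
```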

This is definitely breaking news on breaking language barriers. The implications for academic institutions might warrant some consideration.

Saturday, November 24, 2012

An Online “Tsunami”

AUB is shyly experimenting with hybrid formats of learning at a time when leading global universities have formally embraced online learning. Despite some continued resistance, including liberal arts technophobia, the shift became official this year when Stanford University President John Hennessy proclaimed in a Wall Street Journal interview that a “tsunami” is approaching and that his goal is “not to just stand there” but to “try to surf it”, along with the other US elite universities. Some of the criticism aimed at this trend revolves around standards, superficiality versus depth of learning, the relevance of online formats to some subjects, such as philosophy, and the applicability of distance education to young undergraduate populations that may need more guidance than the more mature “continuing education” type of students. Still, it seems, there is no stopping this wave. Rather, efforts are now directed at preparing for it.

The World Economic Forum brought together a diverse group of university stakeholders this summer to discuss this online “tsunami” that everyone is talking about, as reported by Ivarsson and Petochi. One of the issues debated was the central role of students: as student expectations change, who should decide on the best forms of teaching and learning, academic institutions or students? Another was that of certificates versus degrees, with the suggestion that “Certification and degrees may have to be aligned”. The participants agreed that, in any case, online learning would be an inevitable part of future universities – it is already here (“What Will the University of the Future Look Like?”).


Will some institutions continue to babble while others sing at the top of their voices? Time will tell.

Friday, October 12, 2012

Arabic in Unofficial English


I recently came across an interesting slang dictionary by the American lexicographer Grant Barrett: The Official Dictionary of Unofficial English. Although it dates back to 2006, it was definitely new to me. What caught my attention most in the text were the Arabic and Middle East related words that were included. Many of them have crept into English since 2003, during the war in Iraq and the “War Against Terror”. Here is a listing:

Ali Baba: “thief. After the government of Saddam Hussein was toppled, uncontrolled looting ravaged the country—anything of value, and many things that weren’t, were stolen or destroyed. Looters, and, generally, any thieves, are called ali baba, by Iraqis, after the tale of ‘Ali Baba and the Forty Thieves,’ told by Scheherazade in the stories known in the West as Thousand and One Nights. American soldiers who have served in Iraq say they tend not to use the term as a noun, but as a verb meaning ‘to steal’: ‘We’re gonna ali baba some scrap metal from their junkyard.’”

Dhimmi: “a non-Muslim living with limited rights under Muslim rule”

Eurabia: “the (perceived) political alliance of Europe and Arab nations (against Israel and Jews); a name for Europe if its Muslim or Arab immigrants become a large or powerful minority”

Haji: “an Iraqi; any Muslim, Arab, or native of the Middle East”

Hawasim: “a looter or thief”

Muj: “among (Anglophone) foreigners in Middle Eastern or Islamic nations, a guerrilla fighter or fighters. Clipped form of Persian and Arabic mujahideen, plural for mujahid, ‘one who fights in a jihad or holy war.’”

Shako Mako: “loosely translated as ‘what’s up?’ or more specifically, ‘what do and don’t you have?’ or ‘what’s there and not there?’ It’s similar to shoo fee ma fee used in Lebanese Arabic. Commonly one of the first Iraqi Arabic expressions learned by coalition forces. A common response is kulshi mako ‘nothing’s new’.”

Ulug: “thug or lout. Repopularized by the former Iraqi Minister of Information Muhammad Saeed Al Sahhaf as a term for Americans. The word had previously been rare.”

Of course these are not the only expressions that will be of interest, so happy reading!


Posted by May Mikati on 12 October 2012, 12:47 AM

Monday, September 10, 2012

Nouns that Were Verbed in the Olympics


Now that both the Olympics and Paralympics are over, a reflection on the language used at the events is due. The connection between the Games and the English language is not an obvious one, but some controversy did brew this year over sports terms such as “medal” and “podium”, which are now occasionally used as verbs. However, as Liz Potter of the Macmillan Dictionary blog notes in “They Came, They Medalled, They Podiumed”, the verbing of nouns is not a new phenomenon in the language (nor, one might add, is resistance to such evolution). In fact, many other nouns, unrelated to the Olympics, have recently become common verbs: to blog, from web log; to friend and unfriend, from Facebook features; and to Facebook itself are just a few examples.

So why are Olympics-related terms so controversial? Possibly because the event is a high profile one with global coverage. The furore over “medal”, which has not only been used as a verb but also as an adjective, as in “the most medalled Olympian”, has been documented by The Guardian newspaper’s style editor, David Marsh, who defended the use in relation to the 2008 Olympics, commenting that it was neither illegal nor immoral, while, for the purists, it was a sign that “the linguistic barbarians are not only at the gates: they have battered their way through, pulled up a chair, helped themselves to a beer and are now undermining our very way of life by rewriting our grammar books to suit their evil purpose” (“Mind Your Language”).

Historically in English, nouns have been verbed, and verbs have been nouned: the process is called conversion. Those who react violently to such verbal variation are simply undermining the linguistic creativity of others, as well as the natural evolution of the language.


Posted by May Mikati on 10 September 2012, 10:57 AM

Thursday, September 6, 2012

Metaphors for Teaching


At the start of a new academic year, what better metaphors to explore than metaphors for teaching? The Annenberg Foundation has surveyed school teachers in the U.S. for metaphors they would use for their work. Dozens emerged, including that of a dolphin riding the waves of the classroom, an actress with many roles to play, an encyclopedia maximizing students’ learning, a detective diagnosing students’ needs, and a farmer planting the seeds of knowledge (“What’s Your Metaphor? Survey Results”). The better metaphors touch upon the diagnostic and formative aspects of teaching, not just summative, end product aspects.

Kim McShane, a university lecturer in Sydney, Australia, has researched metaphors for university teaching, focusing on academic teacher identity in the light of the integration of ICT in teaching. Metaphors for teachers who use technology are different from those used to describe old-fashioned teachers. The facilitator metaphor, that of the “guide on the side”, supplants that of the “sage on the stage”. Such teachers are seen as leaders, hosts, managers, mediators, or resources to be consulted. Traditional teachers, on the other hand, are seen as authoritarian providers of knowledge: performers and deliverers of content. McShane worries that some of the new metaphors may actually be interpreted as devaluing, or ignoring, teachers’ work – and one may or may not agree.

My favourite metaphor is that of the transmission of cultural DNA, comparing cultural propagation to genetic propagation. After all, culture is not just a matter of history or people’s rituals, let alone how they dance or sing; it is about how they react to current issues, including the way they solve problems. Harold McDougall, a law professor, examined the idea in a recent Huffington Post blog post, “Cultural DNA: Bringing Civic Infrastructure to Life”. His post ends on a particularly relevant note: “As we teach them and send them on their way, we have a responsibility to pass on the tenets of progressive social change as our generation understands them: learn by experience; respect context; encourage participation; honor place; accept limits; and acknowledge temporality. These strands of cultural DNA, traditional and modern, can help us construct a culture of empathy and sustainability that is the proper foundation for progressive social change.”

In their book Metaphors We Live By, linguist George Lakoff and philosopher Mark Johnson have argued that our minds actually process the world primarily through metaphors and that the way we conceptualize abstract ideas affects the way we understand them. Metaphors define roles. Therefore, those that represent students as passive receivers of knowledge, such as the gardener or shepherd analogy, implying that students are expected to behave like plants or sheep, are not as useful as those that focus on what students can do. A famous metaphor attributed to William Butler Yeats underlines the need to motivate students: "Education is not the filling of a pail, but the lighting of a fire". Very true: education should transcend the mere transmission of content; it should be more about sparking curiosity, teaching students how to learn, and encouraging independent and lifelong learning.

An insightful Chinese proverb can be valuable in this connection: “Teachers open the door, but you must enter by yourself”.


Posted by May Mikati on 06 September 2012, 2:35 PM

Sunday, August 12, 2012

Blogging as Lifelong Learning


While reading blogs may certainly contribute to one’s education, blogging itself is also a form of lifelong learning. Writing about anything means understanding it well enough to express yourself clearly about it; you need to learn about it first, experience it in a way, and reflect on it before you can effectively share your thoughts about it.

Even the briefest blog post may be preceded by hours or days of reading or mulling over a topic. There will be times when not much, if anything at all, will have been written about your idea – as was the case with my previous post, linking the Olympic motto to blogging. I could not find a single online resource applying the “faster, higher, stronger” maxim to blogging. I was thrilled that no one had written about blogging from that particular vantage point in the past.

Yes, blogging can be thrilling – and thought-provoking. Was the allusion eccentric, I wondered, or was it simply creative? Either way, on such occasions, a few clicks later, the post is published. On the other hand, for more ordinary topics, there will be more information out there than you can handle. You need to be selective. Wading through tonnes of others’ online pronouncements on an issue is not always a zappy experience; it can be slow and painstaking. One article leads to another; one video leads to another, and so on and so forth. You compare against your prior learning and experience. Ideas flow. Some sink in; others drop out. New insights form. You shape your new ideas; you shape and reshape the text through which you will express them; you check your word choices for accuracy and appropriacy; you reflect on your choices, semantically and pragmatically; then you share. Repeatedly, you go through this process. Now if that is not lifelong learning, then what is?


Posted by May Mikati on 12 August 2012, 5:13 PM

Wednesday, August 1, 2012

Citius, Altius, Fortius


Social media can provide faster, higher, stronger platforms for expression. They are clearly faster than more traditional forms of publication. The parallel which opponents of blogging, and other moralists, may draw with the tortoise and the hare does not hold, as that would be more like comparing apples and oranges. Take this blog, for example: the content would rot if it were to be kept and later published as a book, or even as a traditional “article”, in tortoise-like fashion!

The “higher” part is not so well-defined. While it would be hard to argue that blogging is always morally superior, it may be viewed as being above traditional publishing in the sense of bypassing the hurdles of conventional reviewers, editors, etc. One is always a click away from publishing the next idea – no bureaucracy, and no fuss. The spontaneity of the pieces, and the transparency of reader feedback, may actually provide a slight moral edge.

For addicted bloggers and readers, of course, “higher” may take on a special, added meaning.

Finally, social media are stronger in the sense of their immediacy, global reach and impact. In their accessibility to writers and readers, they may also be considered fairer than traditional media, especially for the traditionally disadvantaged.

The Olympic motto “Citius, Altius, Fortius”, Latin for “Faster, Higher, Stronger”, can therefore apply to blogging, whether in the sense of civic engagement or not.

Let me know if you disagree.


Posted by May Mikati on 01 August 2012, 3:10 PM

Wednesday, July 25, 2012

A Waste of Time or Digital Social Capital?


The term civic engagement has been used to reflect many different approaches to citizenship, whether local or global, encompassing a variety of activities – from informal individual ones to formal collective ones.

Both blogging and commenting on blogs may be considered forms of civic engagement. In an interview for The Chronicle of Higher Education, Berkeley’s Howard Rheingold emphasized the need to encourage students to blog, saying that 21st century civic education is “participatory media-literacy education”, distinguishing passive consumers of broadcast media content from active citizens who blog, share videos, comment on newspaper articles online, etc. (“Why Professors Ought to Teach Blogging and Podcasting”).

On the other hand, not everything posted by ordinary citizens is influential at this point in time, as explained by Ryan Rish of MIT in the paper “User Generated Content of an Online Newspaper: A Contested Form of Civic Engagement”. Regretting that user-generated content, such as feedback provided on online newspaper sites, is not currently considered a legitimate form of civic engagement, he expects greater impact for future civic and participatory journalism. While civic journalism involves professional journalists encouraging interactive news reporting, participatory journalism places citizens more centrally, involving them in the collection, analysis and publishing of news and ideas. Focusing his study on an online school newspaper in the U.S., Rish reports that “Members of the local school district leadership discounted user-generated content associated with the online newspaper as a legitimate form of communication with school district officials, while users of the online newspaper and the editorial staff of the newspaper argued for the user-generated content to be considered a form of community conversation”.

Digital social capital or a waste of time? You decide.


Posted by May Mikati on 25 July 2012, 4:34 PM

Monday, July 2, 2012

Online Civic Engagement


Over a year has passed since I started blogging. What keeps this blog going when many academics fear blogs as unconventional, non-peer-reviewed forms of publication?

Since blogs are open to the world, anyone can scrutinize their content and comment on it, including experts – something not entirely different from peer review. Additionally, blogging may be seen as a form of civic engagement. It is useful not only in teaching and building community with one’s immediate environment but also in outreach to a broader community. And don’t forget, it’s much faster than other forms of publishing.

One blogger and teacher, Michael Kuhne, sees wikis such as Wikipedia as a form of civic engagement: “When it works, Wikipedia is this great social experiment where people with a vested interest in an article (actually, their interest is not the article itself, but what the article (re)presents) can exchange ideas, debate, deliberate, and create. How many civic institutions exist today that can promise the same?” (“What Does Wikipedia Have to Do With Civic Engagement?”). Traditionally, civic engagement has taken the form of non-profit contributions to society, usually by powerful groups of people providing services to their communities through channels such as charities, scouts, social welfare groups and religious organizations.

The Pew Research Center reported in 2009 that, in the U.S., the internet was still mirroring traditional socioeconomic power relations: the more advantaged are more likely to be civically engaged, whether online or not, just as they have been historically. Yet things are changing: “There are hints that forms of civic engagement anchored in blogs and social networking sites could alter long-standing patterns that are based on socioeconomic status” (Smith et al., “The Internet and Civic Engagement”).

In future postings I shall continue to reflect on the idea of civic engagement online – a thought-provoking subject.


Posted by May Mikati on 02 July 2012, 5:23 PM

Thursday, June 7, 2012

Why We Quote


Having recently blogged on the subject of originality, I see quoting as something of an antithesis. Still, if you are interested in the culture and history of quotation, this book by Open University scholar Ruth Finnegan will be of interest: “Why Do We Quote?: The Culture and History of Quotation”. Finnegan dedicates her book to “the many voices that have shaped and resounded in my own”; then she asks interesting questions in her preface: “What does it mean this strange human propensity to repeat chunks of text from elsewhere and to repeat others’ voices? How does it work and where did it come from? Does it matter? Why, anyway, do we quote?”.

Admitting that her book is biased towards western culture, and her research focused on southern England, she begins the book in contemporary England, the “here and now”, and moves back chronologically to understand the background to her subject. A large-scale survey conducted in 2006 shows that English people quote for various reasons, and that proverbs constitute a large proportion of quotations. The proverb “more haste less speed” appears repeatedly in her survey results. Other proverbs include “Sticks and stones will break your bones, but words will never hurt you”, “Too many cooks spoil the broth”, and “Laugh and the world laughs with you, cry and you’ll weep alone.” Quotes are used not only to share information but often, especially in conversation, to evoke irony, a pun or an analogy which the listener must catch. In e-mail, quotes have become fashionable as a “tag or sort of signature”. In persuasion, quoting famous people tends to add credibility to what is being argued; it may add “weight” or “ammunition”. Quoting may also be used for the sake of humour or sarcasm, as in “A bad workman blames his tools”. On the other hand, many of those surveyed had reservations about quoting: it can border on plagiarism, it is unoriginal, a sort of “parroting” to be avoided, a sign of possible laziness. Others had no objections to it, as long as it was not overdone. Still, what was at issue was not the quantity but the appropriacy of what was being quoted: was it necessary, or was it done for the sake of pompously showing off?

Historically speaking, Finnegan says, the origin of quotation marks is not clear. For example, different versions of the Bible used different devices to indicate reported speech: while newer versions include angled quotation marks, older versions used devices such as indentation, capital letters, or colons. Some texts only used verbs to indicate spoken or written words. She indicates that the ancient Greeks were probably the first to use inverted commas: a wedge shape > was used as a marginal sign to draw attention to anything especially important in a text: linguistic, historical or controversial. This diple mark eventually metamorphosed into the quotation marks we use today. She notes stylistic and cultural differences in the way people quote across languages, identifying a disadvantage to using quotation marks; they are too binding: “they impose too exact an impression…. Quote marks are too crude a signal, it could be said, for the subtleties of submerged or fading quotation, allusions, parody, intertextuality, leaving no room for finer gradations” (p. 109).

Finnegan distinguishes between quoting to endorse another and quoting to set oneself apart, keeping the other at a distance; the way the quotation interacts with the rest of the text should reflect whether the other is respected or “mocked… parodied, or belittled” (p. 171). In a chapter entitled “Controlling Quotation”, the author indicates that quoting has become a regulated social activity; not only is plagiarism frowned upon, there are laws protecting intellectual property and copyright. In “Harvesting Others’ Words”, Finnegan notes that collecting quotations has been common in the west for millennia, but is not restricted to the western tradition. Ancient Mesopotamia, early China, India and the Arab world, among others, have their own collections. There seems to be a moral force to the words of past generations – a certain wisdom. As for proverbs with pictorial illustrations, the west first saw these in medieval times, as reflected in the French collection Proverbs en Rimes, later translated into English (p. 179).

The book ends with the conclusion that there is no single, clear-cut answer to the question of why people quote: quoting is a “normal” aspect of language, which, like other human behaviours, has its own social regulations. Finnegan’s final question is why not?

It is hard to disagree with this book. After all, it is based on facts rather than conjecture. It is highly relevant to historians, teachers, and university students who write substantive essays. It is comprehensive in that it tackles both written and oral texts, viewing them in different contexts: those of religion, philosophy, the family, and society at large. On the other hand, as the author rightly indicates, excessive use of others’ words – and ideas – may give the impression of laziness or lack of originality, so students need training in how, what, and when to quote.

The challenge of original expression is of course multiplied for those writing or speaking in a second or third language, so language teachers take heed.


Posted by May Mikati on 07 June 2012, 11:24 AM

Wednesday, June 6, 2012

Linguistic Inflation


The Macmillan dictionary blog recently hosted two attention-catching articles on hyperbole by Stan Carey: “Is Linguistic Inflation Insanely Awesome?” and “The Unreality of Real Estate Language”.

In the first article, Carey explains that linguistic inflation devalues words by associating them with what is of lesser value, as in referring to a clever person as a “genius” or labelling an internet link that we share as “insanely amazing” simply because that draws better attention than “pretty good” or “rather interesting”. Still, Carey does not see this as seriously problematic because we will never be short of words to express what we want: as grand-sounding words become routine, other words take their place by “further shift or by coinage”, as indicated in the Economist article “Dead, or Living Too Well?”; as the meanings of “awesome” and “terrible” changed, for example, “awe-inspiring” and “horrifying” took their place. Similarly, the Economist anticipates that a new word will soon be needed to signify a “curator” in the sense of an art warden because the meaning of “curator” has been significantly diluted: “A curator is no longer a warden of precious objects but a kind of freelance aesthetic concierge” (“Curator, R.I.P.”).

While scientific and academic writing resists linguistic inflation, some less formal contexts such as those of real estate language illustrate the phenomenon very well, according to Carey: “In this world, medium is ‘large’, average is ‘first rate’, and unusual is ‘extraordinary’. Any site that isn’t a ruined shack sinking into a swamp may be described as ‘superb’….Even run down houses can be made appealing, since they offer ‘immense potential’ ”.

English language learners beware: You need to understand the language 110%!


Posted by May Mikati on 06 June 2012, 2:45 PM

Friday, June 1, 2012

Gender Neutral Language


First in France this year – now in Sweden: the feminists are changing the language. The Swedes, known as the most gender-equal people in the world, are now striving beyond equality – towards neutrality – and this is being reflected in their language. A new gender-neutral pronoun, “hen”, can now be used instead of the feminine “hon” or masculine “han”. Suggested by linguists in the 1960s, the pronoun finally made it into the mainstream this April when it was added to the National Encyclopedia in Sweden.

Nathalie Rothschild has reported on how Swedish society is no longer satisfied with gender equality; pressure groups are working on the elimination of gender distinctions from society at large, including government institutions. The purpose is not simply to accommodate those who do not identify with a particular gender, or who wish to marry someone of the same sex: “What many gender-neutral activists are after is a society that entirely erases traditional gender roles and stereotypes at even the most mundane levels” (“Sweden’s New Gender-Neutral Pronoun: Hen”). Rothschild gives examples of how this is happening: parents are being encouraged to use unisex names for their children, a Swedish clothes company no longer has a “girls” section as distinct from a “boys” section, and toy catalogues are following the same logic. Schools, sports, and even restrooms are following the trend. Of course, there has been opposition, including complaints that the feminists are destroying the language, but this has not stopped “gender pedagogues” from monitoring schools and taking action where necessary.

Sweden is a perfect example of a new world order, including a “new word order”, in sharp focus. Others will follow – slowly but surely, wouldn’t you agree?


Posted by May Mikati on 01 June 2012, 9:24 PM

Sunday, May 27, 2012

What Is Originality?


Educators like to promote original thought and creative expression, but what exactly is originality? If you go to the plagiarism detection web site, Turnitin, you will see one meaning of an “originality report”: the percentage of matching text. It is easy to infer that the lower the percentage of matching text, the greater the originality of ideas. Stolen ideas that are paraphrased are not easily detectable by such systems. In theory, students can recycle entire “research” papers and submit them to such services, and they can get away with it. Those who are too lazy to paraphrase their stolen ideas are caught more easily!
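As a minimal sketch of the idea, assuming a naive five-word matching window (real services such as Turnitin compare against huge databases and use far more elaborate fingerprinting), a matching-text percentage might be computed like this; note how paraphrasing drives the score to zero even when the idea itself is stolen:

```python
def ngrams(text, n=5):
    """All word n-grams in a text, lower-cased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def matching_percentage(submission, source, n=5):
    """Share of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return 100.0 * len(sub & ngrams(source, n)) / len(sub)

source = "the quick brown fox jumps over the lazy dog near the river"
copied = "the quick brown fox jumps over the lazy dog near the barn"
paraphrased = "a fast auburn fox leaps across an idle hound by the water"

print(matching_percentage(copied, source))       # 87.5: exact runs of words match
print(matching_percentage(paraphrased, source))  # 0.0: same idea, no matching text
```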

Few are those who are truly original, since writers build on others’ ideas, as do innovators in various fields – scientists and engineers, fashion designers, chefs, etc. On her web site Brainpickings, Maria Popova has posted thoughts from Henry Miller that are worth sharing:

And your way, is it really your way?

[…]

What, moreover, can you call your own? The house you live in, the food you swallow, the clothes you wear — you neither built the house nor raised the food nor made the clothes.

[…]

The same goes for your ideas. You moved into them ready-made.


Originality, it seems, is not a matter of black and white. There are different degrees and types of originality. If students are encouraged to take fresh angles on their topics, synthesize ideas in new ways, and express themselves in a creative manner, the chances of their producing “original” writing are raised – all the while, of course, remembering the need to acknowledge any sources.


Posted by May Mikati on 27 May 2012, 8:19 PM

Thursday, May 24, 2012

Banned Words


Words can be banned for various reasons. Let’s examine examples from France, the U.K. and the U.S.

Among various efforts to eliminate gender discrimination, the feminists in France managed to ban the word “mademoiselle” from official documentation a few months ago; it has been replaced by a generic “madame”. Last year, France banned the words “Facebook” and “Twitter” from TV and radio, dictating that general terms such as “social networking sites” be used instead as the latter do not advertise for specific companies. Years earlier, the culture ministry in Paris had published a list of 500 English words, such as “email”, “blog”, and “podcast”, recommending that certain French equivalents be used instead. Besides gender equality, national pride is clearly an issue for the French.

In the U.K. this month, Scotland Yard banned the terms “whitelist” and “blacklist” in an effort to reduce racism in the police force. Staff have been advised to use the equivalents “green list” and “red list”. Some police officers are not convinced that this will change anything, but following repeated allegations of racism, senior officials at Scotland Yard will go to any length to reduce sensitivities (“Blacklist Is Blacklisted”). Generally, though, the U.K. may be moving in the opposite direction – that of eliminating a 2006 law that bans “insulting” words but does not clearly define them. The BBC recently reported on the opposition to the law in “Law Banning Insulting Words and Behaviour 'Has to End'”.

In educational contexts some expressions may be avoided if considered distracting for students. New York City’s Department of Education recently banned over fifty such items from the city’s standardized tests. Most of the words, such as “Halloween” and “dinosaurs”, appear innocuous on the surface, so no wonder the list has sparked controversy. Valerie Strauss, reporting on the ban for The Washington Post, says, “Why Halloween? Its pagan roots could offend the religious. Dinosaurs? Evokes evolution, which could upset creationists. You get the point” (“50-plus Banned Words on Standardized Tests”).

Watch your words. While some word bans may appear silly, others are clearly justified. It’s good to stay up-to-date on these matters in order to adapt to different contexts, both synchronically and diachronically.


Posted by May Mikati on 24 May 2012, 11:56 PM

Saturday, May 19, 2012

How Natural Language Processing is Changing Our World


From speech recognition to speech synthesis, and from machine translation to data mining, natural language processing is changing our world.

In language-related applications, computers are gaining intelligence at an amazing speed. Some computers can now not only recognize basic spoken words and sentences but also resolve lexical and sentence ambiguity based on context; moreover, they can recognize some idioms and metaphors. To top it off, computers are learning to detect emotion and respond appropriately. By extension, automatic translation is advancing daily, which may diminish the need for future generations to learn foreign languages. Speech synthesis has advanced in such a way that systems will soon be able to translate your speech using your own voice. Theoretically speaking, you will be able to hear yourself (or your voice, more correctly) speaking Hindi, Mandarin Chinese or even Mongolian in the not too distant future, without your necessarily having learnt any of those languages. With sufficient samples of your speech, such systems will be capable of putting together new sentences for you, in the new language. The systems just need to know your voice, and they will do the rest of the work.

Of course, automatic translation is a complicated task. Poetic language and uncommon metaphors and puns pose special challenges, as do certain expressions that may be considered “untranslatable”, requiring borrowing from the source language, adaptation, substantial paraphrasing or annotation. Still, as emphasized in tcworld, an international information management magazine, machine translation is becoming inevitable: “Over the next few years, every organization’s content strategy will rely on some type of machine translation” (“As Content Volume Explodes, Machine Translation Becomes Inevitable”).

As for data mining, while we all know how search engines are speeding up our research, more advanced searches can produce even better, more focused results, further eliminating the unwanted, irrelevant types of “hits” one normally obtains with ordinary search engines. Just watch this video to see how future search results can be refined with more intelligent searches: “How Natural Language Processing Is Changing Research”.

In this impressive video, Aditi Muralidaharan, a Berkeley graduate student, explains her work on a new system called Word Seer. The system can save reading time for researchers by analysing digitized literary texts quickly, using parsing that targets useful parts of sentences, such as adjectives and verbs. Instead of performing a simple keyword search, the system extracts very specific data. The student gives the example of slave narratives being analysed for their references to God. Rather than simply typing in “God”, one asks specific questions about God: “What did God do?” elicits verbs, such as “gave”, “knew” and “blessed”, while “How is God described?” extracts adjectives, such as “good”, “holy”, “just” and “great”. The conclusion could be that slaves generally had a positive relationship with God despite their misery. For those working on this project, the hope is that researchers in the humanities will be convinced to use the technology based on the efficiency of the results. Rather than having a graduate student spend five days reading through selected texts (with the word God in them), one can extract relevant information using the parser in five minutes.
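For the curious, here is a rough sketch of this kind of parse-based querying, using the spaCy library rather than the actual Word Seer code (spaCy and its small English model are assumed to be installed, and the miniature “narrative” is made up):

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed
doc = nlp("God gave me strength. The holy God knew my sorrow, "
          "and the good God blessed us.")

# "What did God do?" -> verbs whose grammatical subject is "God"
actions = [tok.head.lemma_ for tok in doc
           if tok.text == "God" and tok.dep_ == "nsubj" and tok.head.pos_ == "VERB"]

# "How is God described?" -> adjectives attached to "God" in the parse
descriptions = [tok.text for tok in doc
                if tok.dep_ == "amod" and tok.head.text == "God"]

print(actions)       # expected: ['give', 'know', 'bless']
print(descriptions)  # expected: ['holy', 'good']
```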

Such advances in natural language processing herald a bright future for the currently not so bright technologies we use.


Posted by May Mikati on 19 May 2012, 9:17 AM

Wednesday, May 2, 2012

The Origin and Progress of the Academic Spring


In case you were wondering how this current “Academic Spring” started, well, apparently it was triggered by a posting on a university mathematician’s blog in January 2012. On April 9, The Guardian newspaper published an article on this, entitled “Academic Spring: How an Angry Maths Blog Sparked a Scientific Revolution”. The article identifies Tim Gowers, a distinguished, prize-winning Cambridge mathematician as the initiator of the Elsevier boycott. Gowers received hundreds of comments on his post “Elsevier — My Part in Its Downfall”, and one of his supporters started a site for collecting the names of academic boycotters. Thousands have signed up in just a few months, and incidentally, there are two from Lebanon already, including one from AUB.

A more recent Guardian article, dated May 1, shows British government progress on the issue of facilitating public access to research results: with the help of Jimmy Wales, all “taxpayer-funded research [will be] available for free online” (“Wikipedia Founder to Help in Government's Research Scheme”). The same article reports that Harvard University, angry at the high cost of journal subscriptions, has followed suit: it has encouraged its faculty members to publish openly and “resign from publications that keep articles behind paywalls”. The article cites David Prosser, executive director of Research Libraries UK (RLUK): "Harvard has one of the richest libraries in the world. If Harvard can't afford to purchase all the journals their researchers need, what hope do the rest of us have?...There's always been a problem with this being seen as a library budget issue. The memo from Harvard makes clear that it's bigger than that. It's at the heart of education and research. If you can't get access to the literature, it hurts research."

Having attended, in 2009 and 2011, international conferences on distance, open, and e-learning, and having witnessed the enthusiasm of participants, including that of UNESCO representatives, for open access to information, I am not really surprised by the momentum building up behind the Open Access movement; the Wellcome Trust and the World Bank are now also on board.

With one eye on the Arab Spring and the other on the Academic Spring, one can easily lose sight of other important issues, however. One’s inner eye must always be on the lookout for less obvious but equally worthy causes.


Posted by May Mikati on 02 May 2012, 5:44 PM

Thursday, April 26, 2012

Open Access: An “Academic Spring”


One of the many academic databases we have access to at the American University of Beirut is the well-known database Elsevier Science Direct. Faculty members and students use this resource to a considerable extent. Yet, I was recently surprised to learn that the database had been boycotted by thousands of academics worldwide as part of the boycott of the Anglo-Dutch science publishing giant Reed Elsevier, one of the world’s largest publishers of scientific, technical, and medical information, and owner of Lexis Nexis (another popular resource).

On April 1 – though this was not a joke – The Chronicle of Higher Education published “An Open Letter to Academic Publishers About Open Access” written by Jennifer Howard. Howard warned publishers that they should be nervous because of the new “Academic Spring” - the revolt against expensive publishers spreading throughout academia and represented by the Open Access movement.

Open access, of course, may be defined in various ways; the definition may be restricted to the relatively new open access journals, or it may include the less formal posting of working papers, blogs, and other non-peer-reviewed work. While the traditional requirements of conventional academic careers may dictate otherwise, who knows what the future might bring for academia? Web 2.0 has done miracles so far. Besides, the United Nations, represented by UNESCO, supports open access.

If spring is here, can summer be far behind?


Posted by May Mikati on 26 April 2012, 5:51 PM

Monday, April 16, 2012

Softening Up the Language


Language teachers often find themselves teaching about euphemisms, whether intentionally or not. A euphemism is a relatively harmless word or expression meant to replace a more offensive one. The blind are commonly referred to as “visually impaired” or “visually challenged”, spying is “surveillance”, and stealing can be “appropriation”.

A particularly interesting word often used as a euphemism is “overqualified”. When referring to a rejected job applicant, the term may be used as a cover-up for the fact that the employers do not wish to reveal their reasons for the rejection, or that the applicant is too old for the job, resistant to new technologies, or too demanding in terms of compensation.

The Economist editors recently published a report on euphemisms from around the world. Entitled “Making Murder Respectable”, their article defines euphemism as “a mixture of abstraction, metaphor, slang and understatement that offers protection against the offensive, harsh or blunt”. Noting that the British are “probably the world champions of euphemism”, the article concludes that, without euphemisms, the world would be a more honest but harsher place to live in. No witness to “the global war on terror” with its “friendly fire”, “collateral damage”, and “enhanced interrogation techniques” could possibly disagree.

Euphemisms definitely soften up the language, don’t they?


Posted by May Mikati on 16 April 2012, 8:34 AM

Friday, April 6, 2012

A "New Word Order" - 06 April 2012

A "New Word Order"


My previous blog post was about lexicography in the Internet age: how dictionaries are coping with the speed of language change. Here is solid background, and further reflection, on this ever more mercurial subject.

A Guardian article dated 2001 shows that back then the “New Word Order” was beginning to set in. Competition was suddenly hotting up between dictionary makers. Lexicographers had started implementing more sophisticated methods to keep up with language evolution. The author, D. J. Taylor, notes that speed had suddenly become of paramount importance in a field not particularly notable for speed. Hopeful for the future, he used a most revealing analogy: “If language is a butterfly, endlessly and effortlessly soaring above the heads of the entomologists who seek to track it down, then the nets are getting larger every year.” He reminded readers of Samuel Johnson, the most influential English lexicographer, who was the first to vehemently reject the prescriptivist approach, indicating that language is so variable that trying to police it is a doomed endeavour. Taylor added that while language does need to be tracked closely, it is like a beast that transforms itself into something else by the time one has finished the process of capturing and dissecting it. Some words take on new meanings between detection and publication.

Policing language is one thing, and tracking it is another. No wonder the constant searches, solicitation of user input, statistics and research. Will any of the well-known dictionaries ever implement live online updates to their definitions, or will they continue to solicit new input, adding appendices of possible new words, between editions? If they do all go “live”, that may be better for users, but any print editions published would automatically become obsolete. Will these dictionaries follow in the footsteps of the Encyclopedia Britannica soon?

Far-sighted thinkers, such as Michael Rundell, might ask whether there is a future at all for lexicography, or whether dictionaries will simply “dissolve” into our computers (“The Future of Lexicography: Does Lexicography Even Have a Future?”). It would be interesting to watch and see.


Posted by May Mikati on 06 April 2012, 2:46 PM

Thursday, April 5, 2012

How Dictionaries Cope With Language Change


Can English language dictionaries cope with the current speed of language change? While such change usually involves grammar, pronunciation, spelling, and phrasing, the English language appears to be changing particularly fast in the realm of phrasing: the incorporation of new words and expressions relating to various topics, influenced, among other things, by the speed of technological change. Yet technological change alone does not explain the pace: Japanese, for example, has changed little compared with English, according to a recent National Science Foundation report; other social and cultural factors appear to play a role ("Language Change").

Paul McFedries’ intriguing web site Word Spy (The Word Lover’s Guide to New Words) is a good example of the density of new expressions entering the English language, some of which are making it into the dictionaries. To cope with the phenomenon, well-known dictionaries are providing constant online updates. Merriam-Webster, for example, has a section for words proposed by the public: “You know that word that really should be in the dictionary? Until it actually makes it in, here's where it goes” (“New Words and Slang”). How, then, in the perpetual tsunami of new vocabulary, do dictionary editors decide which new words to include in updates to their dictionaries? First of all, an unabridged dictionary is likely to include more new words than an abridged one because of space considerations. Secondly, new words go through a long process before they are either incorporated or dropped, as illustrated by the example of Merriam-Webster. To make a long story short, a typical procedure involves the following broad phases: editors “reading and marking” a variety of published material, noting neologisms, variant spellings, etc.; saving the marked passages, along with their citations, in a searchable database, showing not only where each text came from but in what context the new word was used; and “definers” reading through the citations, deciding which words to keep based on the number of citations found for each word as well as the variety of publications where it is used over a substantial period of time (“How Does a Word Get into a Merriam-Webster Dictionary?”). The process is almost identical at the Oxford Dictionaries (“How a New Word Enters the Oxford Dictionary”).
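As a minimal sketch of that final filtering step, under assumed thresholds (the real editorial decision is a judgment call, not a fixed formula), the logic might look like this:

```python
from dataclasses import dataclass

@dataclass
class Citation:
    word: str          # the neologism spotted by a reading editor
    publication: str   # where it appeared
    year: int          # when it appeared

def qualifies(citations, min_count=50, min_sources=10, min_span_years=5):
    """Keep a word whose citations are numerous, drawn from a variety of
    publications, and spread over a substantial period of time."""
    if len(citations) < min_count:
        return False
    sources = {c.publication for c in citations}
    years = sorted(c.year for c in citations)
    return len(sources) >= min_sources and years[-1] - years[0] >= min_span_years
```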

Dictionaries are coping with the speed of change with the help of technology: easier access to a variety of publications, searchable electronic databases, user input and faster statistics. With time, dictionaries can only become more objective – descriptive rather than prescriptive, as most were in the past. They are also becoming more democratic. A Wikimedia era of McDictionaries or a regulated lexicographic democracy? You decide.


Posted by May Mikati on 05 April 2012, 4:43 AM

Tuesday, March 27, 2012

April Fools?


Have you ever been tricked on April Fools’ Day? Apparently, some of the best known April first pranks have taken place in higher education settings.

In 1983, a Boston University professor of history, Joseph Boskin, when interviewed about the origin of April Fools’ Day, fabricated a story that was published by the Associated Press and later withdrawn. He claimed that some court jesters in the days of Constantine had told the emperor they could run the empire better than he did and that, amused, Constantine allowed a jester called Kugel to become king for a day, April 1. When the young AP reporter got the “story” published, Boskin used the incident to teach his students about false reports in the media: how the media can take a joke, innuendo, or story, consider it authentic, and spread it. Luckily, the credulous reporter’s career was not ruined; he is now an associate professor in the College of Communication (“How a BU Prof April-Fooled the Country”).

The Massachusetts Institute of Technology has been associated with other well-known April first pranks. Among these was the 1998 hacking of the institutional web site by students who announced the “unprecedented acquisition of a non-profit educational institution by a Fortune 500 company”. They claimed that a huge Disney scholarship fund would reimburse past and future students for the following twenty years; the Engineering School would switch to “Imagineering”; the Sloan School would be renamed the Scrooge McDuck School of Management; there would be a Donald Duck Department of Linguistics, and Disney characters would appear in lectures to keep students alert, facilitating the learning process ("Walt Disney Corporation to Acquire MIT for $6.9 Billion").

The University of Cambridge has also had its fair share of April Fools’ Day stories. A posting on a student forum in 2011 claimed that, due to government spending cuts, the Vice Chancellor had announced Cambridge would soon become a science-only university ("Cambridge to Cease Arts Teaching by the End of the Decade"). While some naive readers were shocked, others realised it could only have been a joke.

Let us all be on the alert this April first, and every day of every year; few are as fortunate as the former AP reporter in Boston, though many are equally, if not more, gullible.


Posted by May Mikati on 27 March 2012, 12:13 AM

Saturday, March 24, 2012

Fooling Around in the Classroom?


Does humour detract from the quality of teaching and learning? I would say that joking in the classroom is a high risk activity for educators. It depends on the quality – and quantity – of the humour, as well as its timing. Personally, on the rare occasions that I do use humour, I relate jokes to the subject matter I am teaching, and the first joke I tell in any class usually receives a positive reaction from students. Beginning with the second or third witticism (if there is one), students’ reactions vary. Alarm bells seem to start ringing for the paranoid in the audience. Yet while some appear uneasy, others may start imitating the humour in an effort to reciprocate.

Psychologist Ted Powers has written on the usefulness of humour in both teaching and assessment, citing Baughman’s famous statement, “One of the greatest sins in teaching is to be boring”. Powers’ definition of humour is a broad one, including any event that elicits laughter: “It is not limited to jokes or humorous stories but can include props, puns, short stories, anecdotes, riddles, or cartoons. It can be anything that creates a positive feeling in students and makes them smile and laugh.” He refers to studies that have shown the benefits of occasional appropriate humour: increased attention and focus, a more liberal atmosphere, help with class management, better retention of information, and reduced anxiety on a test or quiz ("Engaging Students With Humor"). Similarly, Melissa Wanzer’s “Use of Humor in the Classroom” discusses research on the benefits and challenges of using humour.

Retired linguist and humour specialist Don Nilsen advises caution regarding the timing of humour, which could be counterproductive when students are under stress, such as before exams or when major projects are due. Additionally, he warns against the use of sarcasm ("Humor Studies: An Interview with Don Nilsen"). However, he and his wife Alleen are great advocates of humour, having started a conference about it in the 1980s, which was always held on April Fools’ Day weekends. They published a journal and wrote books on the subject, including an encyclopedia of American humour ("Twenty Five Years of Developing a Community of Humor Scholars"). Don Nilsen also gave undergraduate and graduate courses in linguistic humour and language play. He illustrated language devices such as chiasmus (the use of criss-cross structures) through funny examples, as in a bumper sticker that read “Marijuana is not a question of ‘Hi, how are you’ but of ‘How high are you?’”.

The field of humour studies is a well-established one now. Take a look at the International Society for Humor Studies, for example. You’ll see a journal, conferences, seminars and workshops, and resources galore.

Humour, then, may be more serious than you think! Language teachers, especially, should try some language play every now and then to lighten up their material. While this can be a challenging activity, it may prove quite rewarding.


Posted by May Mikati on 24 March 2012, 10:14 PM

Thursday, March 22, 2012

Bloggers' Block?


For those who enjoy writing, the blogosphere beckons with magnetic force. Yet even experienced writers suffer from occasional writer’s block. They may run out of topics to write about or things to say about their subjects. Apparently, stress is one of the main enemies of creative writing; brain science has shown that the mind tends to “freeze” when an individual feels threatened or under pressure, whereas a relaxed mood promotes creative writing (see Rosanne Bane’s “The Writer’s Brain: What Neurology Tells Us about Teaching Creative Writing”). Other causes of writer’s block include worrying too much about the audience or the appropriacy of the topic.

To spice up their sites, some bloggers write joint blogs or occasional joint articles. Others invite guests to write pieces for them to publish; however, guest blogging has been criticized for drowning out the voice of the guest by merging it with that of the host.

Extensive reading, of course, helps generate ideas for blog posts, and reader feedback may trigger future ones. I would therefore like to encourage readers to comment on my postings. I have received scattered responses so far: positive but vague verbal remarks from a few colleagues, a number of “likes” through Facebook, and one anonymous student comment on a post entitled “Reflecting on Student Expectations”. In that comment, the student remarks that while instructor humour can aid learning, disrespectful humour is counterproductive. That reader’s interest in classroom humour has encouraged me to dedicate a future posting to the subject, so do stay tuned.


Posted by May Mikati on 22 March 2012, 11:31 AM

Sunday, March 18, 2012

No More Britannica in Print


In January I blogged on the topic of print versus online publications, asking the question “Are Books Out of Fashion?”. Today, a follow-up post is due as Britannica, the oldest continuously published English-language encyclopedia, goes completely digital. This is a turning point, I believe, not just for the publishing industry but for humankind at large, as other such publishers are likely to follow suit.

Is it a question of literacy? Is it that people don’t read any more? Of course not. Students read and cite Britannica and other reference works all the time; they simply use the online version rather than the hard copy. The death of print does not mean the end of publishing; it is simply a matter of medium. Publishers are realizing that online versions are more popular, easier to maintain, and better at delivering their databases in high quality. Nor is it a matter of competition from the free encyclopedia Wikipedia, according to Britannica’s president, Jorge Cauz ("Encyclopedia Britannica Ends Print, Goes Digital").

Isn’t this kind of move also a triumph for the environment? Are we saving the trees at last? Definitely – and technology has advanced in such a way that the paper we still use need not even be recycled any more. A simpler, faster process for re-using paper has been developed at the University of Cambridge: laser toner removal, which renders paper ready for re-use in no time. The new approach would not only save trees but also reduce emissions from the pulp and paper recycling industry ("Use a Laser, Save a Tree").

News for educators and environmentalists alike.


Posted by May Mikati on 18 March 2012, 12:40 AM

Sunday, March 11, 2012

Language Purists versus Language Change


Languages change whether the prescriptivists like it or not. The purists out there are irritated by the truncation and blending of words, the borrowing from other languages, the use of gender-neutral terms (such as “chair” and “server”) promoted by feminists, and so on. This blog would still be a “web log” if it hadn’t been for the process of language change, and France would have been stuck with “Mademoiselle” forever (“Au Revoir ‘Mademoiselle’”).

Grammars change with time as well. In English, the levelling of “whom” and “who” is one example; the move from expressions such as “if I were” to “if I was” is another. Those who stick to the old ways may be outed by their language use, as illustrated in this popular language joke:

St. Peter (at the Pearly Gates of Heaven): Who is it?

Voice: It is I!

St. Peter: Go to hell, we already have all the English teachers we need!

Language teachers, beware!


Posted by May Mikati on 11 March 2012, 4:55 PM

Thursday, March 8, 2012

Student Perfectionism: A Two-Edged Sword


Does perfectionism ail our students at the American University of Beirut?

Besides worrying about their course loads, students are concerned about excelling in their courses. At U.S.-style universities, counseling centers play a role in identifying issues that affect student performance. The University of Texas Counseling and Mental Health Center identifies a number of important student concerns that could apply to any university, among them perfectionism. The Center distinguishes perfectionism from a healthy pursuit of excellence. Perfectionism involves setting unrealistic standards, never being satisfied, becoming depressed, constantly fearing failure and rejection, being over-sensitive to criticism, and seeing mistakes as disasters rather than as stepping stones to success. A healthy pursuit of excellence, in contrast, involves high but realistic standards, enjoyment of process as well as product, tenacity in the face of challenge, appreciation of constructive criticism, and seeing mistakes as opportunities for learning and improvement.

The University at Buffalo Counseling Services site has a web page entitled “Preventing Perfectionism”, which points out that the condition can be crippling, inviting disappointment because of the unrealistically high expectations set for oneself and others. Similarly, Dr. Anthony Komaroff of Harvard Medical School weighs the pros and cons of perfectionism in “Perfectionism Is a Two-Edged Sword”, warning that it may be exhausting and counterproductive.

UK and Australian universities have likewise identified links between perfectionism and mental health problems. See “Perfectionism and Mental Health in Australian University Students: Is There a Relationship?”, the University of Leicester Graduate School section on “Managing Problems”, and the University of Dundee leaflet on “Perfectionism”. Definitely food for thought for AUB students.


Posted by May Mikati on 08 March 2012, 8:45 PM

Tuesday, February 14, 2012

Encouraging Creativity


Are you a creative individual?

Researchers and policy makers have recently started stressing the need to promote creativity among students in higher education, although the definition of creativity varies across fields, so the teaching of creativity may need to be discipline-specific (Marquis & Vajoczki, “Creative Differences: Teaching Creativity Across the Disciplines”). A 2007 European University Association report points out that “the complex questions of the future will not be solved ‘by the book’, but by creative, forward looking individuals and groups who are not afraid to question established ideas and are able to cope with the insecurity and uncertainty that this entails” (Creativity in Higher Education, p. 6). The report emphasizes the need for diversity of teaching staff, students, and learning experiences in order to promote creativity. A 2010 publication by the same association further stresses the importance of creativity and diversity as part of quality assurance ("Creativity and Diversity: Challenges for Quality Assurance Beyond 2010").

Do all teachers encourage creativity among students? Unfortunately not. Schoolteachers may confuse student creativity with unruliness, preferring discipline and conformity. Rather than the spontaneity and critical thinking usually associated with creative students, instructors may prefer traits associated with obedience to authority, seeing creativity as more of a burden than an asset in the classroom. Since creative people tend to ignore social conventions, they can make life hard for teachers trying to manage a class of twenty or more students.

Sternberg and Williams, both psychology professors, have pointed out that young children tend to be more creative than older ones because society curbs spontaneity over time – for example, by expecting children to colour within the lines in their colouring books. Innovative ideas are not readily accepted by the masses:

"When creative ideas are proposed, they are often viewed as bizarre, useless, and even foolish, and are summarily rejected, and the person proposing them regarded with suspicion and perhaps even disdain and derision…. Creative ideas are both novel and valuable. Why, then, are they rejected? Because the creative innovator stands up to vested interests and defies the crowd and its interests. The crowd does not maliciously or willfully reject creative notions; rather it does not realize, and often does not want to realize, that the proposed idea represents a valid and superior way of thinking. The crowd generally perceives opposition to the status quo as annoying, offensive, and reason enough to ignore innovative ideas….Although people typically want others to love their ideas, immediate universal applause for an idea usually indicates that it is not particularly creative". (How to Develop Student Creativity)

Sternberg and Williams suggest various ways of encouraging creativity among students. The main way is for educators to serve as role models for creativity. Other ways include cross-fertilizing ideas, rewarding creative ideas and products, encouraging sensible risks, promoting self-responsibility and self-regulation, and delaying gratification.

Of course there have been some cynical takes on creativity, as in the statement often attributed to Albert Einstein, “The secret to creativity is knowing how to hide your sources.” However, the concept has also been associated with leadership, career success, energy, sanity, empowerment and individuality:

• “Innovation distinguishes between a leader and a follower.” Steve Jobs

• “But the person who scored well on an SAT will not necessarily be the best doctor or the best lawyer or the best businessman. These tests do not measure character, leadership, creativity, perseverance.” William J. Wilson

• “I firmly believe that all human beings have access to extraordinary energies and powers. Judging from accounts of mystical experience, heightened creativity, or exceptional performance by athletes and artists, we harbor a greater life than we know.” Jean Houston

• “For me, insanity is super sanity. The normal is psychotic. Normal means lack of imagination, lack of creativity.” Jean Dubuffet

• “I think it's fair to say that personal computers have become the most empowering tool we've ever created. They're tools of communication, they're tools of creativity, and they can be shaped by their user.” Bill Gates

• “Living creatively is really important to maintain throughout your life. And living creatively doesn't mean only artistic creativity, although that's part of it. It means being yourself, not just complying with the wishes of other people.” Matt Groening

University students need to understand that there’s more to life than conventional textbook information (or web information for that matter). Get a life – be creative.


Posted by May Mikati on 14 February 2012, 12:52 PM

Sunday, February 12, 2012

Must We Still Travel?


Has the internet relieved us of the need to travel? Partially, perhaps.

St. Augustine once said, "The world is a book, and those who do not travel, read only one page." That held true until very recently; the internet, however, has changed the world. In this global village we now inhabit, communication across borders is easier than ever, and virtual worlds are already reducing the need for travel. Online education and training, virtual business meetings, and applications such as Google Earth are just a few examples. Before we know it, tele-immersion will be at our fingertips.

“What is tele-immersion?” you may ask. It is a technology, based on holographic environments, that will allow users in different parts of the world to interact virtually, in real time, in three-dimensional space, giving them the illusion of talking face-to-face in the same room. While teleportation is a far-fetched futuristic idea, tele-immersion is not. Its applications will include conferences, theatre and sports performances, education and training (of soldiers and doctors, for example), and tele-presence in other remote or hazardous situations. The technology will give users unrestricted views of other users’ environments, greatly surpassing current video-conferencing. Some holiday travel may also be replaced with tele-immersion. Of course there will be technical hurdles, such as bandwidth limitations and the need for expensive supercomputers, but, as with any new technology, these can gradually be overcome.
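Why are bandwidth and computing power such hurdles? A rough back-of-envelope estimate makes it clear. All of the figures below are hypothetical assumptions of mine, chosen for illustration only, not the specifications of any actual tele-immersion system:

# Back-of-envelope data rate for a hypothetical tele-immersion rig.
# Every figure here is an illustrative assumption, not a real specification.

cameras = 10                 # cameras capturing the user from all sides
width, height = 1920, 1080   # pixels per camera
bytes_per_pixel = 4          # colour plus depth information
frames_per_second = 30

bits_per_second = cameras * width * height * bytes_per_pixel * frames_per_second * 8
print(f"Raw stream: {bits_per_second / 1e9:.1f} Gbit/s")  # about 19.9 Gbit/s

# Typical home broadband offers well under 0.1 Gbit/s, which is why
# aggressive compression and powerful computers are essential.

Even allowing for generous compression, moving such streams across the internet in real time remains a serious engineering challenge.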

In the future, the curious and restless among us will still want to explore far-away places at first hand, in the manner of Robert Louis Stevenson, who once said, “For my part, I travel not to go anywhere, but to go. I travel for travel’s sake. The great affair is to move.” Once the new technology succeeds, however, travel will be more a matter of choice than of necessity.


Posted by May Mikati on 12 February 2012, 9:37 AM

Monday, January 30, 2012

Promoting Significant Learning


In the days of our students’ parents and grandparents, learning revolved around memorization, mostly out of context. Much of it was abstract, theoretical, dry, and irrelevant to people’s careers or everyday lives. Teachers clung to their “content” as if it were Holy Scripture that could not but benefit their pupils. Times have changed, though, and that type of education is now considered inappropriate.

Researchers have realized that what engages students is the usefulness of the knowledge gained and the likelihood that it will impact others. That is why teachers these days are expected to demonstrate the relevance of their courses to their students, promoting creative applications; showing students the significance of a course fosters intrinsic motivation. An excellent definition of significant learning comes from Dr. L. Dee Fink, author of the book Creating Significant Learning Experiences. Dr. Fink came up with a “Taxonomy of Significant Learning”, which he sees as a successor to the classic taxonomy of cognitive skills developed by Benjamin Bloom and his associates in the 1950s. In his view, “individuals and organizations involved in higher education are expressing a need for important kinds of learning that do not emerge easily from the Bloom taxonomy, for example: learning how to learn, leadership and interpersonal skills, ethics, communication skills, character, tolerance, the ability to adapt to change, etc.” (“What Is Significant Learning?”).

Fink’s taxonomy revolves around the following kinds of learning:

• Foundational knowledge

• Application

• Integration

• The human dimension

• Caring

• Learning how to learn

In a recent interview, Fink elaborated on the importance of the shift from the content-centred approach to a learning-centred approach, stressing the need for change not just at the classroom level, but also at the organizational and national levels ("Creating Significant Learning Experiences: An HETL Interview with Dr. Dee Fink"). If you’re an educator still stuck on Bloom’s ideas, read Fink’s work. You’ll surely find it significant.


Posted by May Mikati on 30 January 2012, 2:36 PM

Thursday, January 26, 2012

Are Books Out of Fashion?


Are books out of fashion? Five years ago, Thomas Benton, a college English lecturer, observed that “The library -- perhaps like the human body -- must be purged of its decadent physicality and relocated into the realm of pure intellect, pure information, pure rationality, eternally updated, preserved as an endless stream of instantaneous electronic data” (“Red-Hot Library Lust”). Back then, researchers were wondering whether print books would still be available in five years’ time: “Does print really have an anticipated life span of five more years? Will e-books finally take off? After nearly two decades of talking about how e-books are right around the corner, have we finally reached the corner?” (Nelson, "E-Books in Higher Education: Nearing the End of the Era of Hype?")

Electronic publishing is clearly not erasing books in the sense of content: even “old books”, including out-of-print and rare titles, are available online in electronic form. It is not just a matter of hard copy versus electronic books, though: people prefer shorter texts these days, and they read in a different way. Our students are an excellent example. They skim, scan, and read small chunks of text, unlike previous generations. Their preference seems to be for information from web sites rather than books, whether electronic or hard copy; the wider web is more appealing in its immediacy than the e-book section of the online library, just a few more clicks away. On the other hand, some academics claim “We're Still in Love With Books”; the transition away from old-fashioned reading has been slower than anticipated. As William Pannapacker put it, when new media emerge, they do not immediately replace old media.


Posted by May Mikati on 26 January 2012, 11:08 AM

Friday, January 20, 2012

In Defense of Cheating?


One of my favourite essays on contemporary approaches to assessment is Donald Norman’s “In Defense of Cheating”. First, the title is clever; second, the message is well thought out: change the educational system instead of accusing students of cheating, since they only “cheat” because of the way they are taught and assessed. Emulate real life by replacing memorization and individual work with engaging activities and more collaborative work.

Norman emphasizes from the beginning of his article that his purpose is not to encourage deception but to reform outdated curricula and assessment practices. In his view, eliminating the need to “cheat” is more important than punishing students after the act: prevention is better than cure. Changing instructional philosophies is a must if we are to avoid situations where “students cram for exams, regurgitate the material at exam time, and seldom retain it afterwards.” He underlines the need to emphasize process – giving students credit for the way they reach their answers, including collaborative work (required in the workplace), and stressing comprehension rather than seeking answers in a vacuum.

Next, Norman discusses plagiarism and grading. On plagiarism he has something clear to say: “The sin of plagiarization is not that it involves copying -- this should be rewarded -- but that it doesn't give credit for the originator.” I have to admit that while “copying” is not necessarily as great an idea as Norman makes it sound, using and acknowledging sources is an important skill, not just in academic work but in real life. What he probably means is that the worst part of plagiarism is the unethical claiming of others’ ideas or work as one’s own. As for grading, Norman opposes grading on a curve rather than for mastery: currently, “a person can only get a higher grade if someone else receives a lower grade.” He prefers a system in which competition is de-emphasized and absolute standards are spelled out, even if that means everyone receiving an A. Additionally, he proposes dividing the curriculum into modules that students can master at their own pace: “Admission to higher grades or to universities -- or even employment -- could be based upon what students know. Schools or employers would not look at grade point averages, rather they would judge students by their particular skills, by their ability to work in teams, and by the set of modules that they have mastered."
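To make Norman’s contrast concrete, here is a minimal sketch in Python of the two grading philosophies. The grade quotas and the mastery threshold are hypothetical numbers of my own, chosen purely for illustration; Norman proposes no specific figures.

def curve_grades(scores):
    # Relative ("on a curve") grading: each grade depends on rank within the
    # class, so one student's A is possible only because others rank lower.
    ranked = sorted(scores, reverse=True)
    n = len(scores)
    def grade(score):
        share_above = ranked.index(score) / n  # fraction of the class ranked higher
        if share_above < 0.25:                 # hypothetical quota: top quarter only
            return "A"
        if share_above < 0.60:
            return "B"
        return "C"
    return [grade(s) for s in scores]

def mastery_grades(scores, threshold=85):
    # Absolute ("mastery") grading: everyone who meets the spelled-out
    # standard earns an A, regardless of how classmates perform.
    return ["A" if s >= threshold else "not yet" for s in scores]

scores = [92, 90, 88, 86, 85]
print(curve_grades(scores))    # ['A', 'A', 'B', 'C', 'C'] -- the top grade is rationed
print(mastery_grades(scores))  # ['A', 'A', 'A', 'A', 'A'] -- mastery is not rationed

With identical scores, the curve withholds the top grade from students who have demonstrably mastered the material, which is precisely the problem Norman highlights.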

“In Defense of Cheating” should not fail to grab the attention of educators, employers, or – of course – students!


Posted by May Mikati on 20 January 2012, 1:02 PM

Friday, January 13, 2012

The Psychology of Projection


I recently came across this well-expressed observation on Dr. Wayne Dyer’s Facebook page and have been mulling it over ever since: “Persistently viewing others as dishonest, lazy, sinful, and ignorant can be a way of compensating for something you fear. If there's a pattern of seeing others as failures, you need to notice this pattern as evidence of what you're attracting into your life.”

The first part of Dyer’s statement struck me because I have encountered suspicious, cynical people who turned out to be unscrupulous themselves. The psychological mechanism at work in their case is known as “projection”: such people project their own shortcomings onto others. You may encounter these individuals anywhere: at school, in the workplace, and in society at large. The lazy may suspect hard-working people of laziness; cheats may see honest people as probable cheats, and so on. This phenomenon, first identified by Freud as a psychological defence mechanism, is generally thought to be unconscious; mentally ill people, especially paranoid schizophrenics, are notorious for their displays of projection. The second part of the statement also rang true: even if those around you are actual, rather than imagined, failures, it would only be fair to ask yourself why you are in that situation. Why haven’t you managed to attract better people into your life? Couldn’t you be partly to blame? Could you possibly even have caused others’ failure? These are interesting questions to ponder for people in the workplace in general, and in education in particular.


Posted by May Mikati on 13 January 2012, 11:55 PM

Sunday, January 8, 2012

What Employers Expect from Our Graduates


Our students want an education that satisfies the requirements of potential employers, but what do employers look for in fresh graduates these days? Globally, employers may be shifting their attention from grades and experience to softer qualities, with communication skills appearing to be the top requirement. In the U.S., writing skills are a “threshold requirement”, as reflected in a 2004 report by the National Commission on Writing, “Writing: A Ticket to Work … Or a Ticket Out”.

A recent survey by the National University of Singapore Careers Centre, based on the responses of 118 companies, also ranked communication at the top of the requirements list. The survey was conducted as part of the Graduate Global Talent Development Programme (GGTP), a new NUS initiative to produce global-minded graduates. The other top criteria identified were passion, analytical thinking, interpersonal skills, and the desire to learn (see Andrew Abraham’s “Top 5 Qualities Employers Seek in Fresh Graduates”). In Australasia, similar results were obtained: in the 2010 Graduate Outlook Survey of 350 graduate employers, academic results and experience do appear among the criteria, but they rank only fourth and sixth respectively (see “Skills Employers Want”). Likewise, U.K. companies seek soft skills, which they often find lacking in fresh graduates. According to a study by Industry in Education, a national education trust, employers "are looking as much (or more) at personal skills for immediate deployment, as they will be at the specialist content of the degree" ("Graduate Job Seekers 'Lack Personal and Interactive Skills' Demanded by Industry").

With the massification of higher education, finding the right job is no longer a piece of cake for the average university graduate. In an increasingly competitive global job market, students should know which qualities to cultivate in order to strike the right chord with potential employers. It is also important that educators integrate these soft skills into their teaching, or at least bring them more to light.


Posted by May Mikati on 08 January 2012, 6:57 PM