Scott Thornbury’s “E is for E-coursebook” – 10 years later


A is for AI… which has made coursebooks even less relevant.

In 2012, two months before I ventured across the Atlantic from my home in upstate New York to Budapest to take the CELTA course and enter the world of ELT, Scott Thornbury published a marvelous piece on his blog. It would only become fully relevant and comprehensible to me four or five years later, as I navigated my new role as a corporate business English trainer in Germany.

Teaching in corporate training rooms for six or seven back-to-back sessions was a challenge the CELTA had not prepared me for. I had small groups of professionals who had very specific reasons for learning English, a lot to talk about, and very little time to waste. I, in turn, had no hours to spare for lesson planning and photocopying.

Around the time I discovered Dogme, which revolutionized my teaching, I found a blog post by Thornbury called E is for E-coursebook. It was both a call not to simply use new tech to do the same old things (publishing mass-market coursebooks that deliver “grammar McNuggets,” present dry, static texts, and so on) and a glimpse of how tech could be used to craft relevant, engaging language lessons spontaneously, based on what was actually going on with the actual people in the room. I was hooked.

Thornbury presented a quick sketch of how tech could be used to replace published materials rather than peddle them:

  1. A topic arises naturally out of the initial classroom chat. The teacher searches for a YouTube video on that topic and screens it. The usual checks of understanding ensue, along with further discussion.
  2. A transcript of the video, or part of it, is generated using some kind of voice recognition software; alternatively, the learners work on a transcription together, and this is projected on to the interactive whiteboard, which is simply a whiteboard powered by an eBeam.
  3. A cloze test is automatically generated, which students complete.
  4. A word-list (and possibly a list of frequently occurring clusters) is generated from the text, using text processing tools such as those available at The Compleat Lexical Tutor. A keyword list is generated from the word list. Learners use the keywords to reconstruct the text – using pen and paper, or tablet computers.
  5. On the basis of the preceding task, problematic patterns or phrases are identified and further examples are retrieved using a phrase search tool.
  6. The target phrases are individually ‘personalised’ by the learners and then shared, by being projected on to the board and anonymised, the task being to guess who said what, leading to further discussion. Alternatively, the phrases are turned into questions to form the basis of a class survey, conducted as a milling activity, then collated and summarised, again on to the board.
  7. In small groups students blog a summary of the lesson.
  8. At the same time, the teacher uses online software to generate a quiz of some of the vocabulary that came up in the lesson, to finish off with.

https://scottthornbury.wordpress.com/2012/01/29/e-is-for-ecoursebook/
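Step 3 of Thornbury’s sketch, the automatically generated cloze, is the kind of thing a teacher can script in a few lines today. Here is a minimal illustration of the idea in Python (my own sketch, not any tool Thornbury mentions), which gaps every nth word of a transcript and keeps an answer key:

```python
import re

def make_cloze(text, every=7):
    """Blank out every nth word; return the gapped text and the answer key."""
    words = re.findall(r"\S+", text)
    answers = []
    for i in range(every - 1, len(words), every):
        answers.append(words[i])
        words[i] = "_____"
    return " ".join(words), answers

# Example with an invented snippet of transcript:
transcript = ("The speaker argues that short daily habits, repeated for thirty days, "
              "are enough to change how we see ourselves.")
gapped, key = make_cloze(transcript, every=7)
print(gapped)
print(key)
```

A real classroom version would gap content words rather than every nth word, but even this crude approach turns any transcript into an instant gap-fill.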

Countless twists on the above have inspired my teaching ever since. Sometimes it’s an infographic or a short text, sometimes a TED talk or a YouTube video. Very often, it’s just the language that emerges from discussion or from tasks on a shared virtual whiteboard. And with a good grasp of digital tools, virtual teaching, far from being bland and monotonous, became an immersive experience.

Enter OpenAI.

Last week, OpenAI released a beta version of its newest language processing model, and it is currently free to use. The tool can perform a stunning range of language-related tasks instantly. I began experimenting and haven’t been able to stop since. So, to pay homage to Scott’s 2012 post, and hopefully to start a conversation with other teachers about applying this new technology, here is an example of how it can be used.

One example

A B2 conversation class entered the (virtual) room and learners were immediately talking about books. After some chit-chat, I asked about their favorite authors. One of them mentioned Stephen King, so I logged into the playground section of OpenAI and asked it to write three book reviews of a Stephen King novel. It churned out the following within two seconds:

Then I prompted it to list all of the collocations that could be found in the text:

Brilliant. I started by displaying the list of collocations on a whiteboard. We went through the terms one by one, and I asked learners to decide whether each one most likely belonged in a positive or a negative book review… or whether they couldn’t tell.

Next, I displayed the three texts and they read them, deciding which review was positive, negative, or neutral, and why. Afterwards, they went through the texts again to highlight the terms we had already covered and to check whether their initial guesses about which reviews the terms belonged in were right. Then we went through the terms once more and personalized them, discussing them with regard to the learners’ own experiences with books and reading.

After discussing a bunch of emergent language, learners went on to write their own short book reviews, encouraged, but not mandated, to use language we had already covered.

Just getting started

That is just one of countless examples of how this can be used in the language classroom. I’m inclined to think the primary benefit is that it lets us create compelling, level-appropriate input and good models without worrying about copyright issues or spending hours searching the web for texts… though the interface will also rewrite texts for you and give you written feedback on texts. OpenAI will write in a variety of styles and will take pragmatics into consideration, depending on the instructions you input. It will also explain grammar rules if you ask it to (another nail in the coffin of spending lots of classroom time explicating grammar rules). Below are some prompts similar to ones I have used, with impressive results. If you are interested in using this platform but don’t know where to start, use them as inspiration:

  • write a summary of Matt Cutts’s TED talk
  • write two more summaries of the talk, each containing two pieces of incorrect factual information
  • write me a list of the top 20 most important accounting terms in English and use those terms in a text about accounting in 2022. Then, create a version where the terms are replaced with gaps.
  • write me a list of eight true and two false sentences about Germany
  • write me a chain of professional emails. Email one should be a customer complaining about…. Email two should be a response from Mr Smith at…. Email three… Then mix up the order of the emails.
  • according to author David Brooks, what are the advantages and disadvantages of older and younger employees
  • write a dialogue of two people meeting in a cafe. Include a bit of small talk, a funny joke, and something surprising.
  • write a 250 word text in CEFR B2 level English about the world cup. Create a list of common collocations from the text. Replace the first word in each collocation with a gap and provide a wordlist
  • write me a story that ends with “so that’s how I got my nickname, ‘Turkey Lips.’”
  • write the steps of how to change a tire
  • write me a script of a phone conversation between a…. include….
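Prompts like these can also be sent programmatically rather than pasted into the playground. Below is a minimal sketch assuming the 2022-era openai Python library and an API key; the build_prompt helper and its exact wording are my own illustration, not part of the original post:

```python
def build_prompt(topic, level="B2", words=250):
    """Assemble a playground-style prompt from a topic and a CEFR level."""
    return (f"Write a {words}-word text in CEFR {level} level English about {topic}. "
            "Create a list of common collocations from the text. "
            "Replace the first word in each collocation with a gap and provide a wordlist.")

prompt = build_prompt("the World Cup")
print(prompt)

# To actually send it (requires `pip install openai` and a valid key):
# import openai
# openai.api_key = "sk-..."  # your key here
# response = openai.Completion.create(
#     model="text-davinci-003", prompt=prompt, max_tokens=600)
# print(response.choices[0].text)
```

Keeping prompts as reusable templates like this makes it easy to regenerate a text at a different level or on a different topic between lessons.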

I really think the possibilities are endless here, and as always…principles before tools. I don’t think this kind of tech is anywhere near replacing the value of the genuine social context of the classroom or virtual classroom when it comes to language learning…but it can certainly save teachers a ton of time and provide us with comprehensible input that can easily be adapted into engaging tasks.

What do you think?

17 responses to “Scott Thornbury’s ‘E is for E-coursebook’ – 10 years later”

  1. Very interesting. As you say, the possibilities are endless. Its “studying” potential makes me a bit uneasy – the Johns & Davies (1983!) article on Text As A Vehicle for Information (TAVI) versus TALO (Text As a Linguistic Object) comes to mind. But I’m sure Tim Johns would have been excited by the potential. LOTS to think about!


    1. Thanks Geoff… I’ll try to track that article down…sounds interesting.


      1. I did a post on it years ago:

        * TAVI and TALO: Reading and ELT


    2. Thanks for the breakdown! I do think, like everything, how we use these things and the principles we abide by are key.

      So I suppose in the example above with the book reviews, I’m using the text as both TALO and TAVI: exploring some of the language within the text, but also having learners DO something meaningful and realistic with it; in this case, decide which review they resonated with the most (it was a book they had all read) and then produce some reviews of their own. (A further step I forgot in the blog post: they then read each other’s reviews and decided which of the reviewed books they would most want to read.) Since my primary focus is communication and meaning, I guess it’s still loads better than merely using the text to present particular language and then asking students to “practice” using it. It is a fine line, though!


  2. I’m wondering how we can get AI to generate level-appropriate, CI-correct texts. But then again, it does depend on your objective: whether to use a leveled text or not. Many factors involved. But I do think it could be done; AI generating the simpler, abridged stuff we hire so many folks to do.


  3. If you prompt the AI to decrease or increase the complexity of a text, it will do that… so that’s one possibility. Other than that, I think it’s just like selecting the usual texts… modify them, elaborate on them, etc. Although… I’m sure it’s already possible to ask it to communicate with you at an i+1 level in the chatbot function if you prompt it correctly… lots to explore!


  4. The time-saving issue struck me in particular. Interesting post. Thanks.


  5. […] Scott Thornbury’s “E” is for E coursebook- 10 years later  […]


  6. […] my first post, I had just barely dipped my toes into the OpenAI playground and its possibilities for teachers. I […]


  7. Excellent post Sam – late to this as I no longer Twitter but I look forward to hearing your future experiments. Like you, I’m pretty excited by its potential, and in particular have been playing around creating texts that can be used as part of a task cycle in a TBL lesson (well, I did so once I’d stopped asking it questions like “What makes Al Pacino a great actor?”, “Can you write an essay about cats?” and “tell me a Dad joke” – not that these may not one day be relevant to my lessons….).


    1. Thanks for the comment, Neil. That seems to be one of the better uses, so far: creating texts that are level-appropriate and relevant…the kinds of short, language-rich texts that fit well into the tasks in your “Activities for Task Based Learning” for example. See my part 2 post (a few more AI things) for a funny example of how I created a text instantly in a Dogme lesson that led to lots of laughs and also the purchase of a real-life anniversary gift 🙂

      We’re organizing an online conference soon to talk about all of this and it would be great to have you there!


      1. Thanks Sam, let me know more about the conference when relevant (at neilpma74@gmail.com or messenger). Have you seen the booklet “A Teacher’s Prompt Guide to ChatGPT aligned with ‘What Works Best’”? Someone just sent me a copy. At a glance it seems to be brimming with useful ideas.


  8. […] Scott Thornbury’s “E” is for E coursebook- 10 years later  […]


  9. […] have AI and large language models like GPT. The possibilities are rich and quite a few people, like Sam Gravell and Svetlana Kandybovich, have already started suggesting interesting and creative ways of using […]

