
Opinion: Whistler is the best ski resort on Earth (says ChatGPT)

'AI—it’s just like us! Laughably confident even in the face of embarrassing ineptitude and incompetence.'

After weeks of ignoring it, probably out of some primal, unspoken fear of technology and the future, I took the ChatGPT*** plunge this week.

So, of course, I used my first query to put all the haters and doubters to rest.

“What is the best ski resort in the world?” I asked.

After a short disclaimer about it being “difficult” to name the single “best” ski resort due to things like “individual preferences and priorities,” the machine did not disappoint.

Whistler Blackcomb is No. 1, my new robot friend said, before waxing poetic about its 8,100 acres (3,277 hectares) of skiable terrain, stunning scenery and lively après scene.

(Courchevel, France; Verbier, Switzerland; Aspen Snowmass, Colo.; and Niseko, Japan rounded out the top 5, in case you were curious.)

Inspired and mildly impressed, I decided to put ChatGPT to the test, and asked it to “write me a poem about a big, fat bear in Whistler, B.C. who befriends a stinky little marmot.”

I thought the request had stumped the robot at first, but then it slowly began to compose:

“In the mountains of Whistler, so grand,” it began.

Good start.

“Lived a big, fat bear, mighty and grand.”

Brilliant, ChatGPT. Real timeless verse you’re generating here.

Rhyming “grand” with “grand” was just the start. Other genius turns of phrase penned by the inimitable AI: rhyming “full” with “dull”; “marmot” with “solid”; and “harm” with “warm.”

Also, there was only one reference to the marmot being stinky, and it didn’t factor into the poem’s narrative at all. Real bush league stuff.

Let’s just say I would not be comfortable submitting ChatGPT’s first effort to the mayor’s annual Poet’s Pause Competition.

But then, the outputs you get from ChatGPT are a direct result of the inputs—and the related context—you provide.

As such, my dumb little poem prompt barely scratched the surface. Dig a little deeper, and you start to see the true potential of ChatGPT.

Inside of half an hour, I had prompted the program to produce mostly accurate, entirely accessible summaries of Whistler’s history, its Official Community Plan, and the logic behind the bed cap.

It made me helpful lists of things to do in the resort, and even told me the best times to find a table at a restaurant in Whistler Village.

When you really learn how to speak to it, ChatGPT can generate discussion papers, planning materials, and project outlines; create story templates; fact-check copy; and even suggest alternate headlines.

It all seems great, at first glance, and the tool will have obvious applications for journalism and news creation moving forward. But on closer examination, you start to see where the tech is still lacking in its current iteration.

One query about housing under construction in the resort prompted an impressive-looking response—that contained within it some certified, unverifiable nonsense.

The response began by accurately describing the Resort Municipality of Whistler’s target, set in 2017, of creating 1,000 new beds in five years. The accuracy ended there, as it then told me the municipality approved construction of “the Northlands project” in 2020, “which will provide 177 new units.” Northlands is still a long way from construction, and I have yet to determine where exactly ChatGPT got the 177 figure.

Even more perplexing was its next example of Whistler’s housing progress: a 185-unit development it called “Rainbow Crossing.”

As far as I can tell, this is an entirely fictional development: an amalgamation of reports and terms ChatGPT compiled and spat back at me.

Intrigued, I asked it to tell me more about Rainbow Crossing.

It is a 185-unit development project in Whistler’s Rainbow neighbourhood, “which is situated in the southern end of Whistler,” ChatGPT told me confidently, adding that the project began in 2021, will be complete by 2024, and is being developed in conjunction with the Whistler Housing Authority (WHA).

To be clear: there is no 185-unit “Rainbow Crossing” project being developed by the WHA south of Whistler. ChatGPT couldn’t tell me where it got the bad info, but my best guess is it’s confusing work underway in Cheakamus Crossing with project(s) recently completed in Rainbow.

This kind of confident-but-factually-incorrect response is common among language models, and is referred to as “artificial intelligence hallucination.”

AI—it’s just like us! Laughably confident even in the face of embarrassing ineptitude and incompetence.

Either way, the tech clearly needs some fine-tuning before it’s revolutionizing our workflows (or stealing our jobs) in earnest.

As confident and convincing as ChatGPT’s responses are, it is still just a computer program regurgitating our own knowledge back at us—for now, at least.

Since it launched in November, there has been no shortage of think pieces on ChatGPT—and there are clearly many moral, ethical and professional knots to untie as we work towards the proper implementation of artificial intelligence in society.

In the meantime, ChatGPT can serve as a helpful research assistant. Just make sure you’re double-checking its facts—and not setting expectations too high when it comes to the fine art of stinky-marmot poetry.

***But not the GPT-4 plunge, which was released after this article was written—if you haven't heard about it, you will soon.
