CTC8 – Chatbots and AI – final presentations

After two days of intense activity and a whole heap of learning for all of us, Code The City #8, our Chatbots and AI weekend came to an end at tea time on Sunday.

It couldn’t have happened without the generous support of our two sponsors, The Health Alliance and Fifth Ring, for which we are very grateful.

The weekend rounded off with presentations of each project, four of which we’ve captured on video (see below).

Each of the projects has its own GitHub repo. Links are included at the end of each project description. And, two days later, the projects are still being worked on!

Team: ALISS

Team ALISS worked on providing a chatbot interface onto healthcare and social data provided via the ALISS system.

ALISS bot project: Code the City 8 from Andrew Sage on Vimeo.

You can find Project ALISS’s code here on GitHub.

You can also watch this video of Douglas Maxwell from the Alliance being interviewed about the weekend (although at the time of writing the video is offline due to an AWS problem).

Team: City-consult

This team aimed to improve the quality of consultations by using intelligent chatbot interfaces to guide users through the process – and to provide challenge by prompting citizens to comment on previous consultees’ input.

City-Consult bot project: Code the City 8 from Andrew Sage on Vimeo.

You can find the code for City-Consult at this GitHub repo.

Team: NoBot

The concept for NoBot came from an initial idea for a bot that would make scheduling meetings easier. That spawned the question: what if the bot’s purpose was instead to make you have fewer meetings by challenging you at every turn? In the process, the bot’s personality as a sarcastic gatekeeper was born.

NoBot project: Code the City 8 from Andrew Sage on Vimeo.

The code for NoBot lives here on GitHub.

Team: Seymour

Sadly there is no video of the wind-up talk for Seymour. In short, the purpose of Seymour is to help you keep your houseplants alive. (More details to come.)

You can find the code for Seymour at this repo on GitHub.

Team: Stuff Happens

We started this project with the aim of helping citizens find out what is happening among the myriad of local events which we each often seem to miss. Many local authorities have a What’s On calendar, sometimes with an RSS feed; unfortunately, none that we found had an API.

We identified that by pulling multiple RSS feeds into a single database, then putting a bot in front of it – either scripted or with some AI applied – it should be possible to put potential audiences in touch with what is happening.
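As a rough illustration of that approach – the feed URLs and table layout here are invented, not taken from the project code – pulling feeds into SQLite with feedparser might look like:

```python
import sqlite3

import feedparser  # third-party: pip install feedparser

# Hypothetical What's On feeds -- real council URLs would go here.
FEEDS = [
    "https://www.example-council.gov.uk/whats-on/rss",
    "https://www.another-council.gov.uk/events.rss",
]

db = sqlite3.connect("events.db")
db.execute(
    """CREATE TABLE IF NOT EXISTS events (
           id INTEGER PRIMARY KEY,
           source TEXT,
           title TEXT,
           link TEXT UNIQUE,
           published TEXT,
           summary TEXT
       )"""
)

for url in FEEDS:
    feed = feedparser.parse(url)
    for entry in feed.entries:
        # INSERT OR IGNORE skips events we've already collected,
        # thanks to the UNIQUE constraint on the link column.
        db.execute(
            "INSERT OR IGNORE INTO events (source, title, link, published, summary) "
            "VALUES (?, ?, ?, ?, ?)",
            (url, entry.get("title"), entry.get("link"),
             entry.get("published"), entry.get("summary")),
        )

db.commit()
```

A bot front-end would then query this single table rather than hitting the feeds directly.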

Further, by enhancing the collected data – enriching it either manually or by applying machine logic – we could make it more easily navigable and intelligible.

Expect a full write-up of the challenges of this project, and what progress was made, on Ian’s blog.

There is no video, but you can find the project code here on GitHub.

Team: W[oa]nder

This project set out to solve the problem of checking whether a shop or business is still open for the day, through a Facebook bot interface – for use as you wander around, wondering about the question, as it were.

W[oa]nder bot project: Code the City 8 from Andrew Sage on Vimeo.

You can find their code here.

And finally, we were joined on day two by Rory, who set out to assist team Stuff Happens by developing some of the AI around terminologies and categories. That became the:

Word Association Scorer

This is now on GitHub – not a bot, but a set of Python functions that score a given text against a set of categories.
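The repo holds the real implementation; as a minimal sketch of the idea, with invented category seed words, scoring by simple keyword overlap might look like:

```python
import re
from collections import Counter

# Invented seed words per category, purely for illustration.
CATEGORIES = {
    "music": {"gig", "concert", "band", "choir", "orchestra"},
    "sport": {"match", "race", "football", "swim", "marathon"},
    "family": {"children", "kids", "family", "workshop", "storytime"},
}

def score_text(text):
    """Score a text against each category as the share of its
    words that appear in that category's seed list."""
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    total = sum(words.values()) or 1
    return {
        name: sum(n for word, n in words.items() if word in seeds) / total
        for name, seeds in CATEGORIES.items()
    }

print(score_text("A family-friendly choir concert with a kids workshop"))
# Highest score here is 'family', followed by 'music'.
```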

And Finally

We had loads of positive feedback from those who attended the weekend (both old hands and newbies) and from those who watched from afar, following progress on Twitter.

We’ve published the dates for CTC9 and subsequent workshops on our front page. We hope you can join us for more creative fun.

Ian, Andrew, Steve and Bruce
@codethecity

Scraping Goes Off The Rails

This post was originally published on 10ml.com by Ian Watt

The art of scraping websites is one beset by difficulties, as I was reminded this week when re-testing a scraper that I built recently.

Schienenbruch – German for “broken rail”

Railway performance

As part of my participation in 100 Days of Code I’ve been working on a few projects.

The first one that I tackled was a scraper to gather data from the PDF performance reports which are published on a four-weekly cycle on ScotRail’s website. On the face of it, this is a straightforward thing to do – a rough sketch of the first three steps follows the list:

  1. Find the link to the latest PDF on the performance page using the label “Download Monthly Performance Results”.
  2. Grab that PDF to archive it. (ScotRail don’t do that – they vanish each one and replace it with a new one every four weeks, so there is no archive.)
  3. Use a service such as PDFTables, which has an API, uploading the PDF and getting a CSV file in return (XLSX and XML versions are also available but less useful in this project).
  4. Parse the CSV file and extract a number of values, including headline figures, and four monthly measures for each of the 73 stations in Scotland.
  5. Store those values somewhere. I decided on clean monthly CSV output files as a failsafe, and a relational SQLite database as an additional, better solution.
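Here is a rough Python sketch of steps 1–3. The performance-page URL is a placeholder rather than the real address, the link label is the one given above, and the PDFTables call follows that service’s documented pattern of POSTing the file and getting CSV back – treat it as illustrative, not as the actual scraper:

```python
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

PERFORMANCE_PAGE = "https://www.scotrail.co.uk/performance"  # placeholder URL
PDFTABLES_KEY = "your-api-key"

# Step 1: find the link labelled "Download Monthly Performance Results".
page = requests.get(PERFORMANCE_PAGE)
soup = BeautifulSoup(page.text, "html.parser")
link = soup.find("a", string=lambda s: s and "Download Monthly Performance Results" in s)
pdf_url = urljoin(PERFORMANCE_PAGE, link["href"])  # resolve a relative href

# Step 2: grab the PDF and keep a local copy, since no archive exists upstream.
pdf_name = pdf_url.rsplit("/", 1)[-1]
pdf_bytes = requests.get(pdf_url).content
with open(pdf_name, "wb") as f:
    f.write(pdf_bytes)

# Step 3: upload the PDF to PDFTables and get CSV back.
resp = requests.post(
    "https://pdftables.com/api?key=%s&format=csv" % PDFTABLES_KEY,
    files={"f": (pdf_name, pdf_bytes)},
)
resp.raise_for_status()
with open(pdf_name.replace(".pdf", ".csv"), "w") as f:
    f.write(resp.text)
```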

Creating the scraper

So, I built the bones of the scraper in a few hours over the first couple of days of the year. I tested it on the then-current PDF, which was for period nine of 2016-17. That worked – first creating the clean CSV, then later adding the DB-write routines.
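The DB-write side (steps 4 and 5) might look roughly like the sketch below. The CSV column and measure names are invented for illustration – the real ones come from the report itself:

```python
import csv
import sqlite3

db = sqlite3.connect("scotrail.db")
db.execute(
    """CREATE TABLE IF NOT EXISTS station_performance (
           year TEXT,          -- e.g. '1617'
           period INTEGER,     -- four-weekly reporting period
           station TEXT,
           measure TEXT,
           value REAL,
           UNIQUE (year, period, station, measure)
       )"""
)

# Hypothetical cleaned CSV: one row per station, four monthly measures.
MEASURES = ("ppm", "mtd_ppm", "cancellations", "lateness")

with open("performance-p1617-09-clean.csv") as f:
    for row in csv.DictReader(f):
        for measure in MEASURES:
            # INSERT OR IGNORE makes re-runs of the same period harmless.
            db.execute(
                "INSERT OR IGNORE INTO station_performance VALUES (?, ?, ?, ?, ?)",
                ("1617", 9, row["station"], measure, float(row[measure])),
            )
db.commit()
```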

Boom – number 1

I then remembered that I had downloaded the previous period’s PDF. So I modified the code (to omit the downloading routine) and ran it to test the scraping routine on that file – and it blew up my code. The format of the table structure in the PDF had changed, with an extra blank column to the right of the first list of station names.
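One way to make the parsing step tolerant of that sort of layout change – a sketch of the general idea, not the fix I actually made – is to drop columns that are blank in every row before reading values:

```python
def drop_blank_columns(rows):
    """Remove columns that are empty in every row, so a stray blank
    column in the PDF layout doesn't shift the station data."""
    width = max(len(row) for row in rows)
    keep = [
        i for i in range(width)
        if any(i < len(row) and row[i].strip() for row in rows)
    ]
    return [[row[i] if i < len(row) else "" for i in keep] for row in rows]
```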

After creating a new version and publishing that, I sat back and waited for the publication of period 10 data. That was published in the middle of this week.

Boom – number 2

I re-ran the scraper to add that new PDF to my database – and guess what? It blew up the scraper again. What had happened? ScotRail had changed the structure of the PDF’s filename – from using dashes (as in ‘performance-display-p1617-09.pdf’) to underscores (‘performance_display_p1617_10.pdf’).

That change meant that my routine for picking out the year and period, which is used to identify database records, broke, so I had to rewrite it. Not a major hassle – but it means that each new publication has necessitated a tweak to the code. Hopefully in time the code will be flexible enough to accommodate minor deviations from what is expected without manual changes. We’ll see.
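Both filenames fit a single pattern if the separator is allowed to be either a dash or an underscore, so a more forgiving extraction routine might look like this (a sketch along the lines of the rewrite, not the exact code):

```python
import re

# Accept '-' or '_' as the separator, covering both
# 'performance-display-p1617-09.pdf' and 'performance_display_p1617_10.pdf'.
FILENAME = re.compile(r"performance[-_]display[-_]p(\d{4})[-_](\d{2})\.pdf")

def year_and_period(filename):
    match = FILENAME.search(filename)
    if not match:
        raise ValueError("Unrecognised filename: %s" % filename)
    return match.group(1), int(match.group(2))

assert year_and_period("performance-display-p1617-09.pdf") == ("1617", 9)
assert year_and_period("performance_display_p1617_10.pdf") == ("1617", 10)
```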

We’re ‘doing the wrong thing righter’ – Drucker

Of course, none of this should be necessary.

In a perfect world ScotRail would publish well-structured, machine-readable open data for performance. I did email them on 26th November 2016, long before I started the scraper, both asking for past periods’ data and asking if they wanted assistance in creating Open Data. I got a customer service reply on 7th December saying that a manager would be in touch. To date (15 Jan 2017) I’ve had no further response.

The right thing

Abellio operates the ScotRail franchise under contract to the Scottish Government.

Should the terms of such contracts not oblige the companies not only to put the monthly data into the public domain, but also to make it available as good open data – following the Scottish Government’s own strategy for Open Data? Extending the government’s open data obligation to those performing contracts for government would be a welcome step forward for Scotland.