You can also watch this video of Douglas Maxwell from the Alliance being interviewed about the weekend (although at the time of writing the video is offline due to an AWS problem).
This team aimed to make the quality of consultations better through using intelligent chatbot interfaces to guide users through the process – and to provide challenge by prompting citizens to comment on previous consultees’ input.
The concept for NoBot came from an initial idea of a bot that would make scheduling meetings easier. That spawned the question: what if the bot’s purpose was to make you have fewer meetings by challenging you at every turn? In the process the bot’s personality as a sarcastic gatekeeper was born.
We started this project with the aim of helping citizens find out what is happening among the myriad of local events which we each often seem to miss. Many local authorities have a What’s On calendar, sometimes with an RSS feed. Unfortunately, none that we found had an API.
We identified that by pulling multiple RSS feeds into a single database then putting a bot in front of it, and either through scripting or applying some AI, it should be possible to put potential audiences in touch with what is happening.
Further, by enhancing the collected data – enriching it either manually or by applying machine logic, we could make it more easily navigable and intelligible.
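As a minimal sketch of that aggregation step – with illustrative feed content and an assumed table layout, not any real council’s data – pulling parsed RSS items into a single SQLite table might look like this:

```python
# Sketch: pull items from several What's On RSS feeds into one SQLite
# table that a bot could then query. The feed XML and table layout are
# illustrative assumptions.
import sqlite3
import xml.etree.ElementTree as ET

def parse_rss(xml_text, source):
    """Extract (source, title, link, pubDate) tuples from one RSS feed."""
    root = ET.fromstring(xml_text)
    events = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        date = item.findtext("pubDate", default="")
        events.append((source, title, link, date))
    return events

def store_events(conn, events):
    """Insert parsed events, ignoring duplicates on (source, link)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events ("
        "source TEXT, title TEXT, link TEXT, pub_date TEXT, "
        "UNIQUE(source, link))"
    )
    conn.executemany(
        "INSERT OR IGNORE INTO events VALUES (?, ?, ?, ?)", events
    )
    conn.commit()

SAMPLE_FEED = """<rss version="2.0"><channel>
  <item><title>Library craft morning</title>
        <link>http://example.org/craft</link>
        <pubDate>Sat, 25 Feb 2017 10:00:00 GMT</pubDate></item>
  <item><title>Community choir</title>
        <link>http://example.org/choir</link>
        <pubDate>Sun, 26 Feb 2017 14:00:00 GMT</pubDate></item>
</channel></rss>"""

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    store_events(conn, parse_rss(SAMPLE_FEED, "example-council"))
    print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])
```

The `UNIQUE(source, link)` constraint means re-running the same feed doesn’t duplicate events, which matters when polling feeds on a schedule.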
Expect a full write-up of the challenges of this project, and what progress was made, on Ian’s blog.
This project set out to solve the problem of checking whether a shop or business was still open for the day, through a Facebook bot interface – answering the question as you wander around wondering about it, as it were.
Code The City Weekends would not happen were it not for the generosity of our sponsors.
As we approach CodeTheCity #8 we must recognise two organisations who have backed this event.
The first is The Health and Social Care Alliance Scotland (the ALLIANCE), which is sponsoring the event through its ALISS Programme.
The ALISS Programme is excited to be sponsoring and attending the Code the City, AI and Chatbots hack weekend. This area of our work allows us to test concepts and prototype real-world solutions to problems such as: how does a person who is living with sight loss access the great local resources available on ALISS? Or how does a person who finds it challenging to use normal desk-based computers access the local support available through ALISS? We will hopefully test these types of problems at the hack weekend and follow this up with a blog on our work.
Please follow the Alliance and their work on Twitter at @ALLIANCEScot and @ALISSProgramme, or search the hashtag #ALISS. If you are attending the weekend please make sure you hook up with @DouglasMaxw3ll and have a chat with him about the great work that his organisation does.
Our other main sponsor is Fifth Ring.
Fifth Ring is a marketing and communications agency in Aberdeen, Houston and Singapore with a big focus on digital and inbound marketing. Fifth Ring is already experimenting with conversational interfaces for some of their client work.
If you are attending the weekend please make sure to say hello to Steve Milne or Alan Stobie to discuss some of the great work they do.
This post was originally published on 10ml.com by Ian Watt
The art of scraping websites is one beset by difficulties, as I was reminded this week when re-testing a scraper that I built recently.
As part of my participation in 100 Days of Code I’ve been working on a few projects.
The first one that I tackled was a scraper to gather data from the PDF performance reports which are published on a four-weekly cycle on Scotrail’s website. On the face of it this is a straightforward thing to do.
Find the link to the latest PDF on the performance page using the label “Download Monthly Performance Results”.
Grab that PDF to archive it. (Scotrail don’t do that – they remove each one and replace it with a new one every four weeks, so there is no archive).
Use a service such as PDFTables, which has an API: upload the PDF and get a CSV file in return (XLSX and XML versions are also available but less useful in this project).
Parse the CSV file and extract a number of values, including headline figures, and four monthly measures for each of the 73 stations in Scotland.
Store those values somewhere. I decided on clean monthly CSV output files as a failsafe, and a relational SQLite database as an additional, better solution.
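Steps 1 and 5 above can be sketched roughly as follows. This is a hedged outline, not the actual scraper: it assumes the page markup puts the label text inside the `<a>` tag (the real Scotrail markup may differ), and the download/conversion/parsing steps are shown only as comments because they need network access and a PDFTables API key.

```python
# Sketch of steps 1 (find the PDF link) and 5 (store values in SQLite).
# The HTML pattern and table schema are illustrative assumptions.
import re
import sqlite3

def find_pdf_link(html, label="Download Monthly Performance Results"):
    """Return the href of the first PDF anchor whose text contains the label."""
    pattern = r'<a[^>]+href="([^"]+\.pdf)"[^>]*>[^<]*' + re.escape(label)
    match = re.search(pattern, html)
    return match.group(1) if match else None

def store_measures(conn, year, period, rows):
    """Write one (station, measure, value) row per record for a period."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS measures ("
        "year TEXT, period TEXT, station TEXT, measure TEXT, value REAL, "
        "UNIQUE(year, period, station, measure))"
    )
    conn.executemany(
        "INSERT OR IGNORE INTO measures VALUES (?, ?, ?, ?, ?)",
        [(year, period, s, m, v) for s, m, v in rows],
    )
    conn.commit()

# Steps 2-4, in outline only:
#   pdf_bytes = urllib.request.urlopen(pdf_url).read()   # archive the PDF
#   POST the file to the PDFTables API to get CSV back   # needs an API key
#   parse the returned CSV with csv.reader and pull out the values

if __name__ == "__main__":
    page = ('<p><a href="/files/performance-display-p1617-09.pdf">'
            'Download Monthly Performance Results</a></p>')
    print(find_pdf_link(page))
```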
Creating the scraper
So, I built the bones of the scraper in a few hours over the first couple of days of the year. I tested it on the then current PDF which was for period nine of 2016-17. That worked, first creating the clean CSV, then later adding the DB-write routines.
Boom – number 1
I then remembered that I had downloaded the previous period’s PDF. So I modified the code (to omit the downloading routine) and ran it to test the scraping routine on that file – and it blew up my code. The format of the table structure in the PDF had changed, with an extra blank column to the right of the first list of station names.
After creating a new version and publishing that, I sat back and waited for the publication of period 10 data. That was published in the middle of this week.
Boom – number 2
I re-ran the scraper to add that new PDF to my database – and guess what? It blew up the scraper again. What had happened? Scotrail had changed the structure of the filename of the PDF – from using dashes (as in ‘performance-display-p1617-09.pdf’) to underscores (‘performance_display_p1617_10.pdf’).
That change meant that my routine for picking out the year and period, which is used to identify database records, broke. So I had to rewrite it. Not a major hassle – but it means that each new publication has necessitated a tweaking of the code. Hopefully in time the code will be flexible enough to accommodate minor deviations from what is expected without manual changes. We’ll see.
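One way to make the year-and-period extraction tolerant of that separator change is a regex that accepts either dashes or underscores – a small sketch, not the exact routine from the scraper:

```python
# Accept either '-' or '_' as the separator in the Scotrail PDF filename.
import re

def year_and_period(filename):
    """Return (year, period) from either separator style, or None."""
    match = re.search(
        r"performance[-_]display[-_]p(\d{4})[-_](\d{2})\.pdf", filename
    )
    return match.groups() if match else None

print(year_and_period("performance-display-p1617-09.pdf"))  # ('1617', '09')
print(year_and_period("performance_display_p1617_10.pdf"))  # ('1617', '10')
```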
We’re ‘doing the wrong thing righter’ – Drucker
Of course, none of this should be necessary.
In a perfect world Scotrail would publish well structured, machine-readable open data for performance. I did email them on 26th November 2016, long before I started the scraper, both asking for past periods’ data and asking if they wanted assistance in creating Open Data. I got a customer service reply on 7th December saying that a manager would be in touch. To date (15 Jan 2017) I’ve had no further response.
The right thing
Abellio operates the Scotrail franchise under contract to the Scottish Government.
Should the terms of such contracts not put an obligation on the companies not only to put the monthly data into the public domain, but also to make it available as good open data – following the Scottish Government’s own strategy for Open Data? Extending the government’s open data obligation to those performing contracts for governments would be a welcome step forward for Scotland.
These are the six presentations made by the teams at the conclusion of Code The City 7, Health Hack, captured on periscope.tv.
Team Float My Boat
An enhanced prototype has been created, with plans for a more complete version. Using postcodes and mapping, it would be straightforward to consume good data from elsewhere if available.
Some community centres and churches have over 100 groups operating at some point in the month. They can be hugely valuable, but somewhat invisible to the internet. Just making the existence of many of these groups visible can be a big step.
There was also discussion of the importance of occupational therapists, librarians, dog walkers – the many different individuals in the community who can feed valuable information into this kind of platform. It’s important to remember that it’s not just primary care data that matters.
Some interesting visualisations of the underlying data were also created, and led to some interesting discussions around assumptions that are made about data. Again, the value of having the experts in the room at a hack event was demonstrated, as assumptions were challenged, and analysis changed based on feedback. Such feedback can often take weeks to acquire – but was available during the presentation. A snapshot of the data is available on github, and you can see the visualisation here.
The team have a working prototype, with functioning logic to query the ALISS dataset and return three results via SMS. It pulls JSON data from ALISS based on a query generated from the SMS exchange, and sends those results back.
The team say that there is still work to do to make this production ready, and some of the language processing and logic could be improved – but getting a working prototype over the course of the weekend is a real achievement. You can see elements of the code on github.
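The final step of that exchange – squeezing three results into one text message – can be sketched like this. The field name (`"name"`) and the example results are illustrative assumptions about the JSON shape, not the actual ALISS schema; the 160-character limit is the classic single-SMS length.

```python
# Sketch: format the top three results from a JSON query into one
# SMS-sized reply. Field names and sample data are hypothetical.
def format_sms(results, limit=160):
    """Join up to three result names into one message, truncated to fit."""
    lines = [r["name"] for r in results[:3]]
    message = "; ".join(lines)
    return message if len(message) <= limit else message[: limit - 1] + "…"

reply = format_sms([
    {"name": "Walking group, Rosemount"},
    {"name": "Lunch club, Torry"},
    {"name": "Befriending service, Mastrick"},
    {"name": "A fourth result that should be dropped"},
])
print(reply)
```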
The limiting factor for this team has been the size of the datasets that they are working with, and the speed at which these can be moved around. Despite early setbacks with port access through the wifi (something we’re working on for the next weekend) the team were able to show some real results for the final presentation.
There were some interesting findings around the geotagging, and the inconsistencies that can arise. Some really interesting possible extensions to the project were also discussed. The plan is to take this project ‘back to the office’ as the prototype for a full roll-out to help optimise the use of lab support for GPs.
This team found that overlaps between their objectives and those of other teams were significant, so concentrated on some of the more ‘marketing’ aspects of service delivery – identity, and some thoughts around messaging to bring people into the service.
A good example of a service that could be rolled out quickly on top of the kind of datasets being used by the Float your boat project.
In the lead up to Code the City 7 we sent attendees some blank Barrier and Opportunity cards. We asked them to complete and bring them – with a single suggestion or idea per sheet.
On arrival people were to stick them to the wall. The response was great – with an enormous display of creativity quickly assembled. Many of these suggestions grouped well together. As we got started, five volunteers stepped forward to be the champion for one idea each, which formed the starting point of each of the projects taken forward during the weekend. You can read more about these from this blogpost onwards. Even the drawings accompanying the ideas were great – see the montage above!
But what of the remaining ideas – of which there were dozens? I read each of them and have summarised some of them – often grouping several together – below. Each of these has merit as a potential area to explore further (perhaps at a future event).
Find out how busy a GP practice is, before you register
It is suggested that there is no consistency across the NHS Grampian area – with some good examples of websites and some poor.
Waiting times for appointments at GPs’ surgeries?
Where is the data to show which days are busier than others? How could that help patients?
Live Tracking of referrals to consultants
Patients, on being referred to a consultant, are often left in the dark for weeks or months until a letter arrives. How could that be made transparent? Could we have ‘track my referral’ as you would ‘track my parcel’? How or when will you get an appointment with a consultant? Could you self-select from a calendar rather than be given an appointment which doesn’t suit and has to be changed?
Lack of data interoperability between elements of health service / Health and Social Care etc.
Assist GPs to do more online – self-service, online calendars for appointments – meaning that they can spend longer with patients or reduce waiting times for appointments
Citizen / Patient digital literacy
How could we assist patients to use digital services as these are developed? This also raises the issue of health literacy – how could we help people to understand their own health, e.g. cause and effect?
Persuade / help GPs to get citizens to use informal / community-based support
A shared calendar across NHS Grampian to share training opportunities. Much training is common but is delivered on a siloed basis.
Develop a common organogram showing remits and areas of operation across the formal and informal H&SC landscape
Address the challenges of patients being treated in parallel between two specialists, so that they don’t feel that they are being passed from pillar to post.
These ideas alone would feed another three hack weekends! If you are interested in working on these – or in sponsoring a further weekend such as this – please let us know!
Saturday 10th December from 0900 – 1600 at Bridge of Don Academy, Aberdeen, AB22 8RR
Sport Aberdeen and Code the City are inviting people from across Bridge of Don and the wider Aberdeen area to take part in a full day community workshop looking at active travel ideas. The day will consider ideas to develop an Active Travel Hub in Bridge of Don which can promote and support cycling and walking in the community.
The event will be structured across a whole day, allowing for drop-in attendance throughout.
You can choose to drop in either morning or afternoon – or even stay for the full day if you like.
The day will involve:
Identification of potential opportunities or problems relating to the siting and functionality of an active travel hub in the Bridge of Don area.
Group idea generation session to address each of these areas employing a variety of appropriate techniques in order to generate the best ideas possible.
Team and group work to explore each idea – developing these to envision what future states might be.
Iterative development of prototype ‘solutions’.
Catering (teas, coffee, juices, snacks during the morning and afternoon and a sandwich / pizza lunch) for all participants.
Code the City #8, which will take place on Saturday 25th to Sunday 26th February 2017, will be an exploration of the world of chatbots and AI (or Artificial Intelligence), identifying problems to tackle and quickly prototyping solutions.
A chatbot is a piece of software that interacts with a customer or user to directly answer their questions. It uses existing data or information coupled with artificial intelligence to respond in a human-like way, guiding the user to a solution.
There are many examples of live chatbots in this exciting, emerging field. A chatbot could give you travel directions, tell you when it’s next going to rain in your area, or help you contest parking tickets. It could book you a flight and hotel, or act as a free lawyer to help the homeless get housing. The HBO series Westworld has even launched a bot to help you interact with the (fictional) holiday park!
Bring together a diverse range of people from various backgrounds, to form teams.
Identify problems that we’d like to apply chatbots to solve.
Identify approaches, information and data, to guide how we develop the bots and train them
Mix academic thinking, and user need, with open source technology and open data to develop new services
Iterate quickly through approaches, testing ideas, failing quickly and refining our approaches.
Prototype and demonstrate solutions to an interested audience
Who should attend?
Service owners – and service providers
Academics and students in the field of chatbots and artificial intelligence
Front-end and UX designers
Bloggers and social media practitioners
Anyone with an interest in getting involved in creating bots even for fun!
What will you do?
You will create mixed teams to workshop chatbot solutions to real-world issues. Maybe these will build on the outputs of previous work we’ve done at CodeTheCity. Through rapid prototyping you will create new applications and have some fun in the process.
We’ll show you new techniques for service design, idea generation, prototyping, and rapid iterative application development – and you will show other participants some tricks and approaches, too. We’ll share knowledge and learning.
You might even get a Tshirt, and we can guarantee the best catering of any weekend workshop in the city!
To book a free ticket visit our Eventbrite page. But be quick – tickets will go swiftly!
All attendees will get a year’s free membership of the Open Data Institute.
PLEASE NOTE – Due to low take-up this event has been postponed. We are sorry for any inconvenience this will cause.
Perth wants to boost its tourism offer and wants some help! They want to see whether some well-developed apps could help the city and its wider area bring attractions, trails, events, culture, accommodation, eateries and activities to life.
They are also interested in bringing the quirky and interesting aspects of the city together, using great images and interesting user generated content through social media.
Data sources added on GitHub
They have developed the website http://www.perthcity.co.uk/ and there is an app (http://www.mi-perthshire.co.uk/), but they want some creative minds to take a fresh look at the city and surrounding area and generate new ideas that they could then develop into new apps, open data or other projects.
As always we’re looking for coders, designers, data wranglers, service users and providers, bloggers – in fact anyone with an interest – to join us for a weekend of ideation, creation, open data and rapid prototyping.
We’ll feed you, keep you stimulated, and provide good wifi. You will leave with a sense of accomplishment, new skills and potentially new friends.
In addition there is a cluster of B&Bs on Dunkeld Road.
Also, just outside the city itself, The Lodge at the Perth Racecourse are offering a flat rate of £90 per night in a Double or Twin bedded room (£45 per person), which also includes a full breakfast. See http://perthlodge.co.uk/dining
On 19th and 20th March we found ourselves back at Aberdeen Uni with 35 or so eager hackers looking to bring to life a 3D Virtual Reality historic model of Aberdeen city centre using new open data. So how did it go?
This time we were more prescriptive than at any previous Code The City event. In the run up to the weekend we’d identified several sub-team roles.
Locating, identifying and curating historic content
Transcribing, formatting and creating valid open data
Building the 3D model, fixing and importing images and
Integrating and visualising the new data in the model.
After some breakfast, an intro and a quick tutorial on Open Data, delivered by Andrew Sage, we got stuck in to the work in teams.
Old Books into Open Data
We were lucky to have a bunch (or should that be a shelf-ful?) of city librarians, an archivist and a gaggle of other volunteers working on sourcing and transcribing data into some templates we’d set up in Google Sheets.
Given that we’d been given scanned photos of all the shop frontages of Union Street, starting in 1937, of which more below, we settled on that as the main period to work from.
The librarians and helpers quickly got stuck into transcribing the records they’d identified – particularly the 1937-38 Post Office Directory of Aberdeen. If my arithmetic is correct they completely captured the details of 1,100+ businesses in the area around Union Street.
At present these are sitting in a Google Spreadsheet – and we will be working out with the librarians how we present this as well structured, licensed Open Data. It is also a work in progress. So there are decisions to be made – do we complete the transcription of the whole of Aberdeen – or do we move onto another year? e.g. 1953 which is when we have the next set of shopfront photos.
Music, pictures and sound
At the same time as this transcription was ongoing, we had someone sourcing and capturing music such as might have been around in 1937, and sounds that you might have heard on the street – including various tram sounds – which could be imported into the model.
And three of us did some work on beginning an open list of gigs for Aberdeen since the city had both the Capitol Theatre (Queen, AC/DC, Hawkwind) and the Music Hall (Led Zeppelin, David Bowie, Elton John) on Union Street. This currently stands at 735 gigs and growing. Again, we need to figure out when to make it live and how.
The 3D Model
At CTC5 back in November 2015, Andrew Sage had started to build a 3D model of the city centre in Unity. That relied heavily on manually creating the buildings. Andrew’s idea for CTC6 was to use OpenStreetMap data as a base for the model, and to use some scripting to pull the buildings’ footprints into the model.
This proved to be more challenging than expected. Steven Milne has written a great post on his site. I suggest that you read that then come back to this article.
As you’ve hopefully just read, Steve has identified the challenge of using OpenStreetMap data for a project such as this: the data just isn’t complete enough or accurate enough to be the sole source for the model.
While we could update data – and push it back to OSM, that isn’t necessarily the best use of time at a workshop such as this.
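For anyone who does want to experiment with pulling building footprints straight from OpenStreetMap, one common route is an Overpass query. This is a sketch under assumptions: the bounding box below only roughly covers Aberdeen city centre, and the public overpass-api.de endpoint is shown in a comment but not called here.

```python
# Sketch: build an Overpass QL query for building footprints inside a
# bounding box. The coordinates are a rough, assumed box around
# Aberdeen city centre.
def building_query(south, west, north, east):
    """Build an Overpass QL query for building ways with their geometry."""
    bbox = f"{south},{west},{north},{east}"
    return f'[out:json][timeout:60];way["building"]({bbox});out geom;'

query = building_query(57.143, -2.110, 57.152, -2.090)
print(query)

# To actually run it (network required):
#   import urllib.request, urllib.parse
#   data = urllib.parse.urlencode({"data": query}).encode()
#   resp = urllib.request.urlopen(
#       "https://overpass-api.de/api/interpreter", data)
```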
There is an alternative to some of that. All 32 local authorities in Scotland maintain a gazetteer of all properties in their area. These are highly accurate, constantly updated, and have Unique Property Reference Numbers (UPRNs) and co-ordinates for all buildings. This data (if it were open) would make projects such as this so much easier. While we would still need building shapes to be created in the 3D model, we would have accurate geo-location of all addresses, and so could tie the transcribed data to the 3D map very easily.
By using UPRNs as the master data across each transcribed year’s data we could match the change in use of individual buildings through time much more easily. There is a real need to get this data released by authorities as open data, or at least with a licence allowing generous re-use. ODI Aberdeen are exploring this with Aberdeen City Council and the Scottish Government.
We were given, by the city’s Planning Service, scans of photos of the shopfronts of Union Street from a number of decades: 1937, 1953 and on to the present. Generally the photos are very good, but there are issues: we have seams between photos which run down the centre of buildings, binding tape showing through, etc.
These issues are not so very difficult to fix – but they do need someone with competence in Photoshop, some standard guidance, and workflow to follow.
We started fixing some photos so that they could provide the textures for the buildings of Union Street in the model. But given the problems we were having with the model, and a lack of dedicated Photoshop resource, we parked this for now.
Taking this project forward, while still posing some challenges, is far from impossible. We’ve shown that the data for the entire city centre for any year can be crowd-transcribed in just 36 hours. But there are some decisions to be made.
Picking up on the points above, these can be broken down as follows.
Licensing model to be agreed
Publishing platform to be identified
Do we widen geographically (across the city as a whole) or temporally (same area, different years)?
Creating volunteer transcribing teams, with guidance, supervision and perhaps a physical space to carry out the work.
Identify new data sources (e.g. the Archives were able to offer valuation roll data for the same period – would these add extra data for buildings, addresses, businesses?)
Set up a means for the general public to get involved – gamifying the transcription process, perhaps?
For the photographs, the decisions are similar to those for the data above.
We need clear CC licences to be generated for the pictures
Crowdsource the fixing of the photos
Create workflow, identify places for the pictures to be stored
Look at how we gamify or induce skilled Photoshop users to get involved
Set up a repository of republished, fixed pictures, licensed for reuse, with proper addressing system and naming – so that individual pictures can be tied to the map and data sources
The 3D Model
Build the model
Extend the coverage (geographically and through time)
Establish how best to display the transcribed data – and to allow someone in the 3D environment to move forward and back in time.
Look at how we can import other data such as a forthcoming 3D scan of the city centre to shortcut some development work
Look at how we can reuse the data in other formats and platforms (such as Minecraft) with minimum rework.
Speed up the 3D modelling by identifying funding streams that could be used to progress this more quickly. If you have suggestions please let us know as a comment below.
Taking all of this forward is quite an undertaking, but it is also achievable if we break the work down into streams and work on those. Some aspects would benefit from CTC’s involvement – but some could be done without us. So, libraries could use the experience gained here to set up transcribing teams of volunteers – and be creating proper open data with real re-use value. That data could then easily be used by anyone who wants to reuse it – e.g. to create a city centre mobile app which allows you to see any premises on Union Street, call up photos from different periods, find out which businesses operated there, etc.
As the model takes shape and we experiment with how we present the data we can hopefully get more attention and interest (and funding?) to support its development. It would be good to get some students on placements working on some aspects of this too.
Aberdeen City Council is working with the Scottish Cities Alliance to replace and improve the Open Data platforms for all seven Scottish cities later this year – and that will provide a robust means of presenting and storing all this open data once in place, but in the meantime we will need to find some temporary alternatives (perhaps on GitHub) until we are ready.
We welcome your input on this – how could you or your organisation help, what is your interest, how could you assist with taking this forward? Please leave comments below.
Code The City 6 – The History Jam was funded by Aberdeen City Council’s Libraries service and generously supported by Eventifier, who provided us with free use of their Social Media platform and its LiveWall for the sixth consecutive time!