Goodness, is that the time?!
Our teams have been working away for a few hours, so it’s time for an update. If you missed the intro to this weekend’s event, read it here first.
We kicked off the ‘Code The City 9 – Health Signposting’ weekend this morning bright-eyed and bushy-tailed. There are just under 20 attendees from mixed backgrounds.
We have volunteered to help solve issues around health care data. One problem is that health care data are currently maintained in (at least) three unconnected systems run by different organisations. These are ALISS, GCD (Grampian CareData) and MILO. The ultimate goal of this project is to create an open data source that provides accessible up-to-date information to the public and professionals.
After two days of intense activity and a whole heap of learning for all of us, Code The City #8, our Chatbots and AI weekend came to an end at tea time on Sunday.
It couldn’t have happened without the generous support of our two sponsors, The Health Alliance and Fifth Ring, for which we are very grateful.
The weekend rounded off with presentations of each project, four of which we’ve captured on video (see below).
Each of the projects has its own Github repo. Links are included at the end of each project description. And, two days later, the projects are still being worked on!
Team ALISS worked on providing a chatbot interface onto healthcare and social data provided via the ALISS system.
You can find Project ALISS’s code here on Github.
You can also watch this video of Douglas Maxwell from the Alliance being interviewed about the weekend (although at the time of writing the video is offline due to an AWS problem).
This team aimed to improve the quality of consultations by using intelligent chatbot interfaces to guide users through the process – and to provide challenge by prompting citizens to comment on previous consultees’ input.
You can find the code for City-Consult at this Github repo.
The concept for NoBot grew from an initial idea for a bot which would make scheduling meetings easier. That spawned a new question: what if the bot’s purpose was to make you have fewer meetings by challenging you at every turn? In the process, the bot’s personality as a sarcastic gatekeeper was born.
The code for Nobot lives here on Github.
Sadly there is no video of the wind-up talk for Seymour. In short the purpose of Seymour is to help you keep your houseplants alive. (More details to come).
You can find the code for Seymour at this repo on Github.
We started this project with the aim of helping citizens find out what is happening in the myriad local events which we each often seem to miss. Many local authorities have a What’s On calendar, sometimes with an RSS feed. Unfortunately, none that we found had an API.
We identified that by pulling multiple RSS feeds into a single database then putting a bot in front of it, and either through scripting or applying some AI, it should be possible to put potential audiences in touch with what is happening.
Further, by enhancing the collected data – enriching it either manually or by applying machine logic, we could make it more easily navigable and intelligible.
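The feed-aggregation idea above can be sketched in a few lines. This is not the project’s actual code – just a minimal illustration using only the standard library, where events are read from RSS XML that has already been fetched (the URLs a real version would poll are the councils’ own):

```python
# A minimal sketch of pulling multiple RSS feeds into a single database.
# Not the project's code; uses only the standard library for illustration.
import sqlite3
import xml.etree.ElementTree as ET

def store_events(rss_documents, db_path="events.db"):
    """Parse one or more RSS documents and merge their items into one table.

    Returns the total number of distinct events stored.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS events
           (link TEXT PRIMARY KEY, title TEXT, published TEXT)"""
    )
    for xml_text in rss_documents:
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            def text(tag):
                node = item.find(tag)
                return node.text if node is not None else ""
            # INSERT OR IGNORE de-duplicates events listed in several feeds
            conn.execute(
                "INSERT OR IGNORE INTO events VALUES (?, ?, ?)",
                (text("link"), text("title"), text("pubDate")),
            )
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
    conn.close()
    return count
```

With the items keyed on their link, pulling the same event in from two different feeds stores it only once – which is the point of merging the feeds into one database before putting a bot in front of it.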
Expect a full write-up of the challenges of this project, and the progress made, on Ian’s blog.
There is no video, but you can find the project code here on Github.
This project set out to solve the problem of checking whether a shop or business was still open for the day, through a Facebook bot interface – for when you are wandering around town wondering exactly that.
You can find their code here.
And finally we were joined on day two by Rory, who set out to assist team Stuff-Happens by developing some of the AI around terminologies and categories. That became the:
This is now on Github – not a bot, but a set of Python functions that scores a given text against a set of categories.
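To give a flavour of the idea – this is not the code in the repo, just a hedged sketch with made-up example categories – scoring a text against a set of categories can be as simple as counting keyword hits:

```python
# A sketch of scoring a text against a set of categories, not the actual
# repo code. The categories and keywords below are illustrative only.
import re

CATEGORIES = {
    "health": {"doctor", "gp", "clinic", "nurse"},
    "sport": {"football", "swim", "gym", "match"},
}

def score_text(text, categories=CATEGORIES):
    """Return {category: score}, counting keyword occurrences in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return {
        name: sum(1 for w in words if w in keywords)
        for name, keywords in categories.items()
    }

def best_category(text, categories=CATEGORIES):
    """Return the highest-scoring category for the text."""
    scores = score_text(text, categories)
    return max(scores, key=scores.get)
```

For example, `best_category("See the doctor at the clinic")` would score two hits for “health” and pick that category.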
We had loads of positive feedback from those who attended the weekend (both old hands and newbies) and from those who watched from afar, following progress on Twitter.
We’ve published the dates for CTC9 and subsequent workshops on our front page. We hope you can join us for more creative fun.
Ian, Andrew, Steve and Bruce
Code The City Weekends would not happen were it not for the generosity of our sponsors.
As we approach CodeTheCity #8 we must recognise two organisations who have backed this event.
The first is The Health and Social Care Alliance Scotland (the ALLIANCE) who is sponsoring the event through its ALISS Programme.
The ALISS Programme is excited to be sponsoring and attending the Code the City AI and Chatbots hack weekend. This area of our work allows us to test concepts and prototype real-world solutions to problems like: how does a person who is living with sight loss access the great local resources available on ALISS? Or, how does a person who finds it challenging to use normal desk-based computers access the local support available through ALISS? We will hopefully test these types of problems at the hack weekend and follow this up with a blog on our work.
Please follow the Alliance and their work on Twitter at @ALLIANCEScot and @ALISSProgramme, or search for the hashtag #ALISS. If you are attending the weekend please make sure you hook up with @DouglasMaxw3ll and have a chat with him about the great work that his organisation does.
Our other main sponsor is Fifth Ring.
Fifth Ring is a marketing and communications agency in Aberdeen, Houston and Singapore with a big focus on digital and inbound marketing. Fifth Ring is already experimenting with conversational interfaces for some of their client work.
If you are attending the weekend please make sure to say hello to Steve Milne or Alan Stobie to discuss some of the great work they do.
This post was originally published on 10ml.com by Ian Watt
The art of scraping websites is one beset by difficulties, as I was reminded this week when re-testing a scraper that I built recently.
As part of my participation in 100 Days of Code I’ve been working on a few projects.
The first one that I tackled was a scraper to gather data from the PDF performance reports which are published on a four-weekly cycle on Scotrail’s website. On the face of it this is a straightforward thing to do.
So, I built the bones of the scraper in a few hours over the first couple of days of the year. I tested it on the then current PDF which was for period nine of 2016-17. That worked, first creating the clean CSV, then later adding the DB-write routines.
I then remembered that I had downloaded the previous period’s PDF. So I modified the code (to omit the downloading routine) and ran it to test the scraping routine on that file – and it blew up my code. The format of the table structure in the PDF had changed, with an extra blank column to the right of the first list of station names.
After creating a new version and publishing that, I sat back and waited for the publication of period 10 data. That was published in the middle of this week.
I re-ran the scraper to add that new PDF to my database – and guess what? It blew up the scraper again. What had happened? Scotrail had changed the structure of the filename of the PDF – from using dashes (as in ‘performance-display-p1617-09.pdf’) to underscores (‘performance_display_p1617_10.pdf’).
That change meant that my routine for picking out the year and period, which is used to identify database records, broke. So I had to rewrite it. Not a major hassle – but it means that each new publication has necessitated a tweaking of the code. Hopefully in time the code will be flexible enough to accommodate minor deviations from what is expected without manual changes. We’ll see.
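One way to make that routine more tolerant – a sketch, not my actual scraper code – is to treat dashes and underscores interchangeably in the filename pattern, so both naming styles Scotrail has used so far are accepted:

```python
# A sketch of a separator-tolerant filename parser: '-' and '_' are
# treated as interchangeable, covering both styles Scotrail has used.
import re

PATTERN = re.compile(r"performance[-_]display[-_]p(\d{4})[-_](\d{2})\.pdf")

def year_and_period(filename):
    """Extract (year, period) from either filename style, or None."""
    match = PATTERN.search(filename)
    if not match:
        return None
    return match.group(1), match.group(2)
```

Both `performance-display-p1617-09.pdf` and `performance_display_p1617_10.pdf` then parse to the same `(year, period)` shape, so the database keys stay consistent across naming changes.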
Of course, none of this should be necessary.
In a perfect world Scotrail would publish well structured, machine-readable open data for performance. I did email them on 26th November 2016, long before I started the scraper, both asking for past periods’ data and asking if they wanted assistance in creating Open Data. I got a customer service reply on 7th December saying that a manager would be in touch. To date (15 Jan 2017) I’ve had no further response.
Abellio operates the Scotrail franchise under contract to the Scottish Government.
Should the terms of such contracts not oblige the companies not only to put the monthly data into the public domain, but also to make it available as good open data, following the Scottish Government’s own strategy for Open Data? Extending the government’s open data obligations to those performing contracts for government would be a welcome step forward for Scotland.
These are the six presentations made by the teams at the conclusion of Code The City 7, Health Hack, captured on periscope.tv.
Team Float My Boat
An enhanced prototype has been created, with plans to create a more complete version. Using postcodes and mapping, it would be straightforward to consume good data from elsewhere if available.
Some community centres and churches have over 100 groups operating at some point in the month. They can be hugely valuable, but somewhat invisible to the internet. Just making the existence of many of these groups visible can be a big step.
There was also discussion of the importance of occupational therapists, librarians, dog walkers – the many different individuals in the community who can feed valuable information into this kind of platform. It’s important to remember that it’s not just primary care data that matters.
Some interesting visualisations of the underlying data were also created, and led to some interesting discussions around assumptions that are made about data. Again, the value of having the experts in the room at a hack event was demonstrated, as assumptions were challenged, and analysis changed based on feedback. Such feedback can often take weeks to acquire – but was available during the presentation. A snapshot of the data is available on github, and you can see the visualisation here.
The team have a working prototype, with functioning logic to query the ALISS dataset and return three results via SMS: pulling JSON data from ALISS based on a query generated from the SMS exchange, and sending back those results.
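The query-and-reply step can be sketched roughly as below. This is not the team’s code, and the ALISS endpoint and JSON field names here are assumptions for illustration; the fetch function is injectable so the flow can be exercised without a live connection:

```python
# A sketch of the query-then-reply flow, not the team's code. The endpoint
# URL and the "data"/"name" JSON fields are assumptions for illustration.
import json
import urllib.parse
import urllib.request

ALISS_SEARCH = "https://www.aliss.org/api/v4/services/"  # assumed endpoint

def top_three(query, fetch=None):
    """Return up to three matching service names, ready to send by SMS."""
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url) as resp:
                return resp.read().decode("utf-8")
    raw = fetch(ALISS_SEARCH + "?q=" + urllib.parse.quote(query))
    data = json.loads(raw)
    names = [item["name"] for item in data.get("data", [])][:3]
    return "\n".join(names) if names else "Sorry, nothing found."
```

The SMS gateway then simply relays the returned string back to the sender.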
The team say that there is still work to do to make this production ready, and some of the language processing and logic could be improved – but getting a working prototype over the course of the weekend is a real achievement. You can see elements of the code on github.
The team have created a video prototype, which looks great. The full Polish translation is complete, and will be added to the video using YouTube closed captions, as well as an audio overlay later.
The project is to be presented to a group of GPs later this week for feedback as to usefulness and likely impact. Code, and scripts, are posted to the team github page.
Team Delta Test
The limiting factor for this team has been the size of the datasets that they are working with, and the speed at which these can be moved around. Despite early setbacks with port access through the wifi (something we’re working on for the next weekend) the team were able to show some real results for the final presentation.
There were some interesting findings around geotagging and the inconsistencies that can arise, and some really interesting possible extensions to the project were discussed. The plan is to take this project ‘back to the office’ as the prototype for a full roll-out to help optimise the use of lab support for GPs.
Team Friend Tree
This team found that overlaps between their objectives and those of other teams were significant, so concentrated on some of the more ‘marketing’ aspects of service delivery – identity, and some thoughts around messaging to bring people into the service.
A good example of a service that could be rolled out quickly on top of the kind of datasets being used by the Float your boat project.
Codethecity 12 (3rd & 4th February 2018) will be on the topic of Tourism.
More details here: http://codethecity.org/2018/01/code-the-city-12-tourism/
Ticket bookings are now open at Eventbrite.
In the lead up to Code the City 7 we sent attendees some blank Barrier and Opportunity cards. We asked them to complete and bring them – with a single suggestion or idea per sheet.
On arrival people were to stick them to the wall. The response was great – with an enormous display of creativity quickly assembled. Many of these suggestions grouped well together. As we got started, five volunteers stepped forward to be the champion for one idea each, which formed the starting point of each of the projects taken forward during the weekend. You can read more about these from this blogpost onwards. Even the drawings accompanying the ideas were great – see the montage above!
But what of the remaining ideas – of which there were dozens? I read each of them and have summarised some of them – often grouping several together – below. Each of these has merit as a potential area to explore further (perhaps at a future event).
This links to a blog post I wrote recently about the ratio of GPs to patients at Scottish surgeries.
It is suggested that there is no consistency across the NHS Grampian area – with some good examples of websites and some poor.
Where is the data to show which days are busier than others? How could that help patients?
Patients, on being referred to a consultant, are often left in the dark for weeks or months until a letter arrives. How could that be made transparent? Could we have ‘track my referral’, as you would ‘track my parcel’? How or when will you get an appointment with a consultant? Could you self-select from a calendar rather than get an appointment which doesn’t suit and has to be changed?
How could we assist patients to use digital services as these are developed? This also raises the issue of health literacy – how could we help people to understand their own health, e.g. cause and effect?
These ideas alone would feed another three hack weekends! If you are interested in working on these – or in sponsoring a further weekend such as this – please let us know!
Great progress overnight and through the morning. Very few drop outs overnight – keeping a real energy in the room.
Float your boat
The team have created a prototype website focusing on helping people find events and services locally. Includes stories about people improving their lives through accessing services.
Currently acting as a central hub for finding further information.
Have discussed turning it into an app, but clearly a web first approach seems the most appropriate at this stage. Discussion about the potential for local community ownership, or for a body like Health Partnership Development to take the lead.
A key observation was that the scope of ambition for the project has jumped from very ambitious and broad, to much tighter, and back again multiple times. Deciding on the scope to tackle took significant time, and was acknowledged as key to making progress.
Worth noting that the team is treating the sourcing and management of high quality data to be a parallel problem, likely tackled elsewhere.
The team has a paper prototype app aiming to guide people towards independently finding a way to take part in the local community.
Similar to the ‘Float My Boat’ project, but focussed more tightly on social isolation issues and solutions.
They have been looking at the scoring, rating and categorisation of services and activities to aid in selection and guide people towards appropriate choices.
This team agreed with the importance of selecting a specific objective for the project – and to focus on that. Very easy to get distracted by related issues.
The unique element identified in the group discussion was the potential to allow the creation of some small groups. A fascinating example was the creation of a ‘take the bins out’ agreement among neighbours – helping people find help if they are away from home, and easing a ‘first contact’ event with new neighbours when you move home.
While the team ‘have nothing fancy to show’ they have made substantial progress since the last update, and are confident of having a well progressed project by the end of the day. Work has progressed on three fronts:
Data collection and insertion to new database.
Reporting layer, where work is focussed on generating mean values for overview presentation.
The Geo team have been translating postcodes into coordinates, and creating workflows and automation so that this can keep happening as new data arrives and boundaries change.
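The postcode-to-coordinate step can be sketched with a simple lookup table. This is not the team’s code; it assumes a local CSV of postcode centroids (for example an extract of the ONS Postcode Directory) with hypothetical column names `postcode`, `latitude`, `longitude`:

```python
# A sketch of translating postcodes to coordinates via a local CSV of
# centroids. Not the team's code; the file and column names are assumed.
import csv

def load_centroids(csv_path):
    """Build a {postcode: (lat, lon)} lookup, normalising spacing and case."""
    lookup = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            key = row["postcode"].replace(" ", "").upper()
            lookup[key] = (float(row["latitude"]), float(row["longitude"]))
    return lookup

def to_coordinates(postcode, lookup):
    """Return (lat, lon) for a postcode in any spacing/case, or None."""
    return lookup.get(postcode.replace(" ", "").upper())
```

Normalising spacing and case on both sides means ‘AB10 1AB’, ‘ab101ab’ and ‘AB101AB’ all resolve to the same centroid, which matters when postcodes arrive from free-text user input.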
An interesting discussion about availability of data about GP practices, (there is more than you might think, much of which can be reviewed here) and what can be done with it.
The language barrier project has focussed on refining the story told in the video and literature that it is creating. The discussion touched on the existing use of mobile phones as a primary translation tool for many people with English as a second language, especially when confronted with technical or medical terms.
Also discussed options to not only offer better translation access, but to offer language skills development services as a preferred approach.
This team have met a couple of technical barriers when tying their various elements together, but have achieved a number of key elements.
SMS messages are being relayed successfully.
A prototype of the service has been created in Java to simulate the interaction, on screen for now rather than by SMS.
Discussion has been primarily around the importance of marketing and communications around the service. Targeting of publicity thought to centre on food banks, shelters, pubs, chemists, community centres – all places with high footfall from the demographic groups the service would be most appropriate for.
The wider group identified this as a key tool in self management of long term issues, and something that would have a genuine impact.
Finally, a demonstration of some visualisation options using off the shelf visualisation tools to gain insights into the quality, coverage and usefulness of a data set.
Discussion around the demonstration identified the usefulness of the geographical visualisations in identifying differences and gaps in service levels from area to area.
Pre-pizza updates from the teams:
Since lunchtime the team have grabbed more coffee, created a big list of tasks, and been working on pulling Aliss data into their project. Work still to be done on the SMS layer.
Also discussing interesting natural language processing element to improve ease of use for the app.
Watch team Team Text on Periscope.
The team have created a script and video prototype in English with Polish translation underway. Web based version is in progress and likely to be complete early tomorrow.
Looking at options for animation of the video tomorrow.
Watch team Pomoc on Periscope.
Since lunch the group have wrangled some network issues which held up progress, but have completed initial database design, and are working on the data and reporting layers in parallel.
Watch Team Delta Test on Periscope.
Since lunch the team have worked on a web prototype of the front end of the service: lowering barriers to participation through better data access, easier navigation and quality curation.
Watch team Friends Tree on Periscope.
Float My Boat
Since lunch the team have eaten sweets and drunk cola. The community layer is vital to many health issues – a service discovery app.
They have created a number of user personas.
Four sections to the envisaged service:
Watch Float My Boat on Periscope.