In this close-out post I shall hand over to the teams themselves to walk you through their CTC9 weekend. Check out the videos using the links below. Use the ‘ctc9’ tag to find all other blog posts about the amazing volunteering experience this weekend.
I am so glad I joined the CTC9 project as a volunteer. Blogging about this project was a tremendous experience. There are two aspects of this weekend that amazed me beyond the teams’ achievements.
The idea funnel
It was fascinating to witness the journey we all embarked on – from random ideas on post-its to distilling them into structured approaches.
The teams seemed to develop naturally based on people’s interests. It is remarkable how smoothly people from different sectors and backgrounds worked together in a very productive way. The Code the City staff did a great job in keeping us all on track.
Here’s a quick update before the big show-and-tell later on.
The team has developed a draft version of the website, tucked away on a test server. They have established the first functional search, using the category ‘social isolation’: it returns a list of service providers in the area, drawn from the three source databases. This is a big step forward, as we now know how to program a search and can deliver visible results on a user interface.
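For the curious, a category search of this kind can start out as a simple filter over the combined records. Here is a minimal sketch – the field names and records are invented for illustration, standing in for entries from the three databases:

```python
def search_by_category(records, category):
    """Return providers tagged with the requested category (case-insensitive)."""
    wanted = category.lower()
    return [r for r in records if wanted in (c.lower() for c in r["categories"])]

# Invented records standing in for entries from ALISS, GCD and MILO.
records = [
    {"name": "Friendship Cafe", "categories": ["Social Isolation", "Older People"]},
    {"name": "Sports Club", "categories": ["Physical Activity"]},
]

print(search_by_category(records, "social isolation"))  # matches only the Friendship Cafe
```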
The team is also working on searches based on location by postcode or radius.
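A radius search usually comes down to a great-circle distance check against each provider’s coordinates. A rough sketch, with coordinates and field names invented for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def within_radius(providers, lat, lon, radius_km):
    """Return providers inside the radius, nearest first."""
    hits = [(distance_km(lat, lon, p["lat"], p["lon"]), p) for p in providers]
    return [p for d, p in sorted(hits, key=lambda t: t[0]) if d <= radius_km]

# Illustrative records only – real data would come from ALISS, GCD and MILO.
providers = [
    {"name": "Community Hub", "lat": 57.1497, "lon": -2.0943},  # central Aberdeen
    {"name": "Village Group", "lat": 57.4778, "lon": -4.2247},  # Inverness, well outside
]
print(within_radius(providers, 57.15, -2.09, 25))  # only the Aberdeen entry
```

A postcode search would need one extra step first – a lookup from postcode to coordinates (e.g. from an open postcode dataset) – before applying the same distance check.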
One expected challenge is the extraction of information from differently formatted data sources. For example, one source database does not provide contact details in dedicated address fields but in a more general description box.
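One pragmatic way to cope with that is to scrape likely contact details out of the free text. A hedged sketch – the patterns below are deliberately rough and would miss plenty of real-world UK postcode and phone formats:

```python
import re

# Rough patterns only; real UK postcodes and phone numbers have more edge cases.
POSTCODE = re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b")
PHONE = re.compile(r"\b(?:0\d{4}\s?\d{6}|0\d{3}\s?\d{3}\s?\d{4})\b")

def extract_contact(description):
    """Pull a postcode and phone number out of a free-text description box."""
    postcode = POSTCODE.search(description)
    phone = PHONE.search(description)
    return {
        "postcode": postcode.group(0) if postcode else None,
        "phone": phone.group(0) if phone else None,
    }

text = "Drop-in group, Rosemount Centre, Aberdeen AB25 2XH. Call 01224 123456."
print(extract_contact(text))  # {'postcode': 'AB25 2XH', 'phone': '01224 123456'}
```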
Team: Soul Cats
This group went back to focusing on the public end users. They came up with various names for the new website that would make it easy to find, playing with words from Scots dialect and proper King’s English. All suggestions were googled to see whether they already exist or would be buried amongst a ton of other results. Ideally, we want something unique!
The team suggested submitting a selection of words to a public forum in order to collect opinions or votes.
Team: The Professionals
The Professionals are a spin-off group from the Soul Cats. It’s a rollercoaster with those Cats! They went back to focusing on the value of this website for health care professionals. In a structured approach they answered four key questions:
- Who are key stakeholders?
- What are key relationships?
- What are key challenges?
- What are the gains right now if this project went live?
What a beautiful sunny morning for making my way over to CTC9 HQ. It’s a slow start today. Hey, it’s Sunday…
Since we didn’t have a close-out meeting last night, we caught up with everybody’s progress in a kick-off meeting this morning. Make sure to read the update from yesterday afternoon beforehand.
Team: ALISS API
The data is flowing! We now have access to all three data sources: ALISS, GCD and MILO. MILO too? Yes! As it turns out, computing student Mikko has been working on hooking up MILO to the project as part of Team ALISS API.
Linking up GCD hit a stumbling block after the initial success, because the WiFi network ended up blocking the website used for our API. By the sounds of it, this is in hand though.
Now that we are connected to all three databases, they are being combined by matching titles, identifying duplicates and so on. The result will provide access to searchable data from all sources via one URL. James has already launched a temporary live demo page that connects to the databases. The first rough draft is based on storyboards James designed with input from the user-focused teams last night. The website is at an early stage, so some buttons will work and some won’t. Feel free to rummage around.
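The title-matching itself can start very simply: normalise each title and use it as a key when folding the sources together. A sketch with invented records (real matching would need fuzzier comparison than this):

```python
import re

def normalise(title):
    """Lower-case, strip punctuation and collapse whitespace so that
    near-identical titles from different databases compare equal."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", title.lower())).strip()

def merge_sources(*sources):
    """Merge records from several databases, keeping one entry per title
    and remembering which sources it came from."""
    merged = {}
    for source_name, records in sources:
        for rec in records:
            key = normalise(rec["title"])
            entry = merged.setdefault(key, {"title": rec["title"], "sources": []})
            entry["sources"].append(source_name)
    return list(merged.values())

# Illustrative records only; real data would come from the three databases.
aliss = ("ALISS", [{"title": "Befriending Service (Aberdeen)"}])
gcd = ("GCD", [{"title": "befriending service aberdeen"}])
milo = ("MILO", [{"title": "Walking Group"}])
print(merge_sources(aliss, gcd, milo))
```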
There is also a shared file repository on GitHub. It holds the user interface code, the backend REST API and photos from our brainstorming sessions.
The next big goal is to develop the visual interface further so that search results are visible to the website user; at the moment results appear only in code. The team also noted that location-based search and the prioritising of search results will require more development.
Team: Soul Cats
Teams Stripy Tops and Access All Areas have merged under the new name ‘Soul Cats’ (inspired by a T-shirt). This move made sense because both have been targeting user groups – the professional user (Stripy Tops) and the public (Access All Areas) – and now felt that their paths were converging.
The teams have drawn up more specific suggestions on user requirements based on the needs of different target groups. It’s quite impressive how yesterday’s wide-roaming discussions are now funnelling into concrete scenarios and solutions. The obvious conclusion is to make the web interface simple – clear language, natural keywords, self-evident icons, a sensible menu structure and so on. Specific suggestions included:
- options for geo-location of service providers relative to user addresses
- including info on mobility/access issues e.g. stairs
- including info on parking, public and community transport connections
- including photos of the service location, exteriors and interiors, so that people easily recognise the place once there
The next steps will involve working more closely with our coders and coming up with names for the page, the categories and so on.
We kicked off the ‘Code The City 9 – Health Signposting’ weekend this morning bright-eyed and bushy-tailed. There are just under 20 attendees from mixed backgrounds.
We have volunteered to help solve issues around health care data. One problem is that health care data are currently maintained in (at least) three unconnected systems run by different organisations. These are ALISS, GCD (Grampian CareData) and MILO. The ultimate goal of this project is to create an open data source that provides accessible up-to-date information to the public and professionals.
In the lead-up to Code the City 7 we sent attendees some blank Barrier and Opportunity cards. We asked them to complete and bring them – with a single suggestion or idea per sheet.
On arrival people were to stick them to the wall. The response was great – with an enormous display of creativity quickly assembled. Many of these suggestions grouped well together. As we got started, five volunteers stepped forward to be the champion for one idea each, which formed the starting point of each of the projects taken forward during the weekend. You can read more about these from this blogpost onwards. Even the drawings accompanying the ideas were great – see the montage above!
But what of the remaining ideas – of which there were dozens? I read each of them and have summarised some of them – often grouping several together – below. Each of these has merit as a potential area to explore further (perhaps at a future event).
- Find out how busy a GP practice is, before you register
This links to a blog post I wrote recently about the ratio of GPs to patients at Scottish surgeries.
- Information on GP practices
It is suggested that there is no consistency across the NHS Grampian area – with some good examples of practice websites and some poor ones.
- Waiting times for appointments at GPs’ surgeries?
Where is the data to show which days are busier than others? How could that help patients?
- Live Tracking of referrals to consultants
Patients, on being referred to a consultant, are often left in the dark for weeks or months until a letter arrives. How could that be made transparent? Could we have a ‘track my referral’ service, as you would ‘track my parcel’? How or when will you get an appointment with a consultant? Could you self-select from a calendar rather than be given a slot which doesn’t suit and has to be changed?
- Lack of data interoperability between elements of health service / Health and Social Care etc.
- Assist GPs to do more online – self-service online calendars for appointments – meaning that they can spend longer with patients or reduce waiting times for appointments
- Citizen / Patient digital literacy
How could we assist patients to use digital services as these are developed? This also raises the issue of health literacy – how could we help people to understand their own health, e.g. cause and effect?
- Persuade / help GPs to get citizens to use informal / community-based support
- A shared calendar across NHS Grampian to share training opportunities. Much training is common but is delivered on a siloed basis.
- Develop a common organogram showing remits and areas of operation across the formal and informal H&SC landscape
- Address the challenges of patients being treated in parallel between two specialists, so that they don’t feel that they are being passed from pillar to post.
These ideas alone would feed another three hack weekends! If you are interested in working on these – or in sponsoring a further weekend such as this – please let us know!
On 19th and 20th March we found ourselves back at Aberdeen Uni with 35 or so eager hackers looking to bring to life a 3D Virtual Reality historic model of Aberdeen city centre using new open data. So how did it go?
This time we were more prescriptive than at any previous Code The City event. In the run up to the weekend we’d identified several sub-team roles.
- Locating, identifying and curating historic content
- Transcribing, formatting and creating valid open data
- Building the 3D model, fixing and importing images and
- Integrating and visualising the new data in the model.
After some breakfast, an intro and a quick tutorial on Open Data, delivered by Andrew Sage, we got stuck in to the work in teams.
Old Books into Open Data
We were lucky to have a bunch (or should that be a shelf-ful?) of city librarians, an archivist and a gaggle of other volunteers working on sourcing and transcribing data into templates we’d set up in Google Sheets.
Given that we’d been given scanned photos of all the shop frontages of Union Street, starting in 1937 (of which more below), we settled on that as the main period to work from.
The librarians and helpers quickly got stuck into transcribing the records they’d identified – particularly the 1937–38 Post Office Directory of Aberdeen. If my arithmetic is correct, they captured the details of more than 1,100 businesses in the area around Union Street.
At present these are sitting in a Google spreadsheet, and we will be working out with the librarians how to present this as well-structured, licensed Open Data. It is also a work in progress, so there are decisions to be made: do we complete the transcription of the whole of Aberdeen, or do we move on to another year – e.g. 1953, when we have the next set of shopfront photos?
Music, pictures and sound
At the same time as this transcription was ongoing, we had someone sourcing and capturing music such as might have been around in 1937, and sounds that you might have heard on the street – including various tram sounds – which could be imported into the model.
And three of us did some work on beginning an open list of gigs for Aberdeen since the city had both the Capitol Theatre (Queen, AC/DC, Hawkwind) and the Music Hall (Led Zeppelin, David Bowie, Elton John) on Union Street. This currently stands at 735 gigs and growing. Again, we need to figure out when to make it live and how.
The 3D Model
At CTC5 back in November 2015, Andrew Sage had started to build a 3D model of the city centre in Unity. That relied heavily on manually creating the buildings. Andrew’s idea for CTC6 was to use OpenStreetMap data as a base for the model, and to use some scripting to pull the buildings’ footprints into the model.
This proved to be more challenging than expected. Steven Milne has written a great post on his site. I suggest that you read that then come back to this article.
As you’ve hopefully just read, Steve has identified the challenge of using OpenStreetMap data for a project such as this: the data just isn’t complete or accurate enough to be the sole source.
While we could update data – and push it back to OSM, that isn’t necessarily the best use of time at a workshop such as this.
There is an alternative to some of that. All 32 local authorities in Scotland maintain a gazetteer of all properties in their area. These are highly accurate, constantly updated, and hold Unique Property Reference Numbers (UPRNs) and coordinates for all buildings. This data (if it were open) would make projects such as this so much easier. While we would still need building shapes to be created in the 3D model, we would have accurate geo-location of all addresses, and so could tie the transcribed data to the 3D map very easily.
By using UPRNs as the master data across each transcribed year’s data, we could match the change in use of individual buildings through time much more easily. There is a real need to get this data released by authorities as open data, or at least under a licence allowing generous re-use. ODI Aberdeen are exploring this with Aberdeen City Council and the Scottish Government.
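To make the idea concrete, here is a toy sketch of what matching across years by UPRN could look like – the UPRNs, addresses and businesses below are entirely invented:

```python
# Invented directory entries keyed by (made-up) UPRN, one dict per transcribed year.
entries_1937 = {
    "9051000001": {"address": "1 Union Street", "business": "Draper"},
    "9051000002": {"address": "3 Union Street", "business": "Tobacconist"},
}
entries_1953 = {
    "9051000001": {"address": "1 Union Street", "business": "Draper"},
    "9051000002": {"address": "3 Union Street", "business": "Cafe"},
}

def changes_of_use(earlier, later):
    """List buildings (by UPRN) whose recorded use changed between two years."""
    return {
        uprn: (earlier[uprn]["business"], later[uprn]["business"])
        for uprn in earlier.keys() & later.keys()
        if earlier[uprn]["business"] != later[uprn]["business"]
    }

print(changes_of_use(entries_1937, entries_1953))
# {'9051000002': ('Tobacconist', 'Cafe')}
```

Because the UPRN is stable while addresses and occupiers change, the same join works for any pair of years without fuzzy address matching.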
We were given, by the city’s Planning Service, scans of photos of the shopfronts of Union Street from a number of decades: 1937, 1953 and on to the present. Generally the photos are very good, but there are issues: seams between photos run down the centre of buildings, binding tape shows through, and so on.
These issues are not so very difficult to fix – but they do need someone competent in Photoshop, some standard guidance, and a workflow to follow.
We started fixing some photos so that they could provide the textures for the buildings of Union Street in the model. But given the problems we were having with the model, and a lack of dedicated Photoshop resource, we parked this for now.
Taking this project forward, while still posing some challenges, is far from impossible. We’ve shown that the data for the entire city centre for any year can be crowd-transcribed in just 36 hours. But there are some decisions to be made.
Picking up on the points above, these can be broken down as follows.
- Licensing model to be agreed
- Publishing platform to be identified
- Do we widen geographically (across the city as a whole) or temporally (same area, different years)?
- Creating volunteer transcribing teams, with guidance, supervision and perhaps a physical space to carry out the work.
- Identify new data sources (e.g. the Archives were able to offer valuation roll data for the same period – would these add extra data for buildings, addresses, businesses?)
- Set up a means for the general public to get involved – gamifying the transcription process, perhaps?
The Photos
- Similar decisions to the data above
- We need clear CC licences to be generated for the pictures
- Crowdsource the fixing of the photos
- Create workflow, identify places for the pictures to be stored
- Look at how we gamify or induce skilled Photoshop users to get involved
- Set up a repository of republished, fixed pictures, licensed for reuse, with proper addressing system and naming – so that individual pictures can be tied to the map and data sources
The 3D Model
- Build the model
- Extend the coverage (geographically and through time)
- Establish how best to display the transcribed data – and to allow someone in the 3D environment to move forward and back in time.
- Look at how we can import other data such as a forthcoming 3D scan of the city centre to shortcut some development work
- Look at how we can reuse the data in other formats and platforms (such as Minecraft) with minimum rework.
- Speed up the 3D modelling by identifying funding streams that could be used to progress this more quickly. If you have suggestions please let us know as a comment below.
Taking all of this forward is quite an undertaking, but it is also achievable if we break the work down into streams and work on those. Some aspects would benefit from CTC’s involvement – but some could be done without us. So, libraries could use the experience gained here to set up transcribing teams of volunteers – and create proper open data with real re-use value. That data could then easily be used by anyone who wants to reuse it – e.g. to create a city centre mobile app which lets you see any premises on Union Street, call up photos from different periods, find out which businesses operated there, etc.
As the model takes shape and we experiment with how we present the data we can hopefully get more attention and interest (and funding?) to support its development. It would be good to get some students on placements working on some aspects of this too.
Aberdeen City Council is working with the Scottish Cities Alliance to replace and improve the Open Data platforms for all seven Scottish cities later this year. That will provide a robust means of presenting and storing all this open data once in place, but in the meantime we will need to find some temporary alternatives (perhaps on GitHub) until we are ready.
We welcome your input on this – how could you or your organisation help, what is your interest, how could you assist with taking this forward? Please leave comments below.
Code The City 6 – The History Jam was funded by Aberdeen City Council’s Libraries service and generously supported by Eventifier, who provided us with free use of their social media platform and its LiveWall for the sixth consecutive time!
Code The City 5 is over. It took place over the weekend of 24–25 October 2015, and was themed around culture. Around 40 volunteers worked on some great projects. You can catch up with events under the #CTC5 tag here.