An open letter to Aberdeen City Council

It has been well documented that there is a problem with Aberdeen City Council and their approach to Smart City and Open Data in particular. See these posts, these requests and this GitHub page from a project at CTC11, where we tried to help fix things. Today, a Finnish researcher on Smart Cities posted this on Reddit! International reputation? What international reputation!

Now it appears that, in last week’s relaunch of the Aberdeen City Council website, the council has ditched masses of content. This includes the city-wide What’s On, which was until recently the most heavily used part of the council website and which provided an extremely useful community resource.

More digging – well, Googling some popular terms for council website content and functions – returns nothing but 404 errors. See the list below for some examples.

When the site last underwent a major update, in 2006, the small team took just six months on the transition, beginning to end. No content was lost or broken, and with URL rewriting and redirects they ensured that everything worked on day one.

The council have been working on the current relaunch – on and off as managers were swapped around or were dispensed with – for two years! And the mess of the site, with massive holes in content and functionality, far outweighs the much-improved look and feel.

So, what is the plan to restore content, much of which is a matter of public record?

We, as tax-payers, have paid for the creation of functionality and information which is of significant public use. So, where has it gone?

For example, where is:

Don’t the citizens of Aberdeen deserve better than this?

Maybe someone would care to make an FOI request to the city council – to ask what data the decision-making on the transfer of content and functionality was based on, and to get a copy of the website stats for the last three months? I think they are fed up with me.

Ian

Final presentations at CTC10 – Perth

We had four presentations at the final pitch session at CTC10.

We have uploaded these to Vimeo below (trimmed for time to just the core presentations, with intros and questions removed):

Team one: https://vimeo.com/236647324

Team two: https://vimeo.com/236648125

Team three: https://vimeo.com/236649327

Team four: https://vimeo.com/236650501

These brought an enjoyable and productive couple of days to a close.

Well done to all participants!

CTC9 – Near the finish line

Here’s a quick update before the big show-and-tell later on.

Team: ALISS API database

The team has developed a draft version of the website tucked away on a test server. They have established the first functional search using the category ‘social isolation’. It returns a list of service providers in the area that is drawn from the three source databases. This is a big step forward, as we now know how to program a search and are able to deliver visible results on a user interface.
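
To give a flavour of what that first search might involve – this is only a sketch, and the field names ("title", "categories") and the idea of a simple in-memory merge are assumptions for illustration rather than the team’s actual code – a category search across the three sources could look like this:

```python
# Sketch only: search three merged source lists for a category match.
# Field names and data shapes are assumptions, not the real schemas.

def search_by_category(category, *sources):
    """Return providers from any source whose categories include `category`."""
    wanted = category.lower()
    results = []
    for source in sources:
        for provider in source:
            if wanted in (c.lower() for c in provider.get("categories", [])):
                results.append(provider)
    return results

# Hypothetical records standing in for the three source databases
aliss = [{"title": "Friendly Cafe", "categories": ["Social Isolation"]}]
gcd = [{"title": "Community Hub", "categories": ["social isolation", "mental health"]}]
milo = [{"title": "Walking Group", "categories": ["exercise"]}]

print(search_by_category("social isolation", aliss, gcd, milo))
```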

The team is also working on searches based on location by postcode or radius.
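
For the radius part, one simple approach – again just a sketch, which assumes each record carries a latitude and longitude, and which may not match what the team ends up building – is a haversine distance filter; a postcode search would first resolve the postcode to coordinates and then apply the same filter:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def within_radius(providers, lat, lon, radius_km):
    """Keep only providers whose (assumed) lat/lon fields fall inside the radius."""
    return [p for p in providers
            if "lat" in p and "lon" in p
            and haversine_km(lat, lon, p["lat"], p["lon"]) <= radius_km]
```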

One expected challenge is the extraction of information from differently formatted data sources. For example, one source database does not provide contact details in dedicated address fields but in a more general description box.
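
As an illustration of the kind of extraction that might be needed – the patterns below are rough and the field names are assumptions, not the actual source schema – a first pass could pull phone numbers and postcodes out of the free-text description:

```python
import re

# Very rough UK-style phone and postcode patterns; assumptions for illustration only.
PHONE_RE = re.compile(r"(?:\+44\s?|0)\d{3,4}[\s-]?\d{3}[\s-]?\d{3,4}")
POSTCODE_RE = re.compile(r"\b[A-Z]{1,2}\d{1,2}[A-Z]?\s?\d[A-Z]{2}\b", re.I)

def extract_contacts(description):
    """Pull likely phone numbers and postcodes from a free-text description box."""
    return {
        "phones": PHONE_RE.findall(description),
        "postcodes": POSTCODE_RE.findall(description),
    }

print(extract_contacts("Drop-in centre, 12 Main Street, Aberdeen AB11 5XY. Call 01224 123 456."))
```

A real extractor would need more care (extensions, odd punctuation, international formats), but even a rough pass like this shows how much structure can be recovered from a description box.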

Team: Soul Cats

This group went back to focusing on the public end users. They came up with various names for the new website that would make it easy to find. They played with words from Scots dialect and proper King’s English. All suggestions were googled to see whether they already exist or are buried amongst a ton of other results. Ideally, we want something unique!

The team suggested submitting a selection of words to a public forum in order to collect opinions or votes.

Team: The Professionals

The Professionals are a spin-off group from the Soul Cats. It’s a rollercoaster with those Cats! They went back to focusing on the value of this website for health care professionals. In a structured approach they answered four key questions:

  1. Who are key stakeholders?
  2. What are key relationships?
  3. What are key challenges?
  4. What are the gains right now if this project went live?

Team gathering

CTC9 – Sunday Morning

What a beautiful sunny morning for making my way over to CTC9 HQ. It’s a slow start today. Hey, it’s Sunday…

Since we didn’t have a close-out meeting last night, we caught up with everybody’s progress in a kick-off meeting this morning. Make sure to read the update from yesterday afternoon beforehand.

Team: ALISS API

The data is flowing! We now have access to all three data sources: ALISS, GCD and MILO. MILO too? Yes! As it turns out, computing student Mikko has been working on hooking up MILO to the project as part of Team ALISS API.

Linking up GCD encountered a stumbling block after the initial success because the WiFi network ended up blocking the website used for our API. By the sounds of it, this is in hand though.

Now that we are connected to all the databases, the records are being combined by matching titles, identifying duplicates and so on. The result will provide access to searchable data from all sources via one URL. James has already launched a temporary live demo page that connects to the databases. The first rough draft is based on storyboards James designed with input from the user-focused teams last night. The website is currently at an early stage, so some buttons will work and some won’t. Feel free to rummage around.
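
To give a feel for that matching step – this is only a sketch, and the real merge logic in the team’s code will differ – duplicates can be spotted by normalising titles before combining the sources:

```python
# Sketch of combining records from several sources by normalised title.
def normalise(title):
    """Lower-case a title and strip punctuation so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum() or ch.isspace()).strip()

def merge_sources(*sources):
    """Combine records from several sources, keeping the first of any duplicates."""
    merged = {}
    for source in sources:
        for record in source:
            merged.setdefault(normalise(record["title"]), record)  # first occurrence wins
    return list(merged.values())
```

Title matching on its own is crude – two different services can share a name – so in practice it would be combined with other fields such as address or postcode.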

There is also a shared file repository on GitHub. It harbours the user interface code, the backend REST API and photos from our brainstorming sessions.

The next big goal is to develop the visual interface further to make search results visible to the website user. At the moment results appear only in code. The team also suggested that functionalities for location-based search and prioritising search results will require more development.

Sunday team photo

Team: Soul Cats

Teams Stripy Tops and Access All Areas have merged under the new name ‘Soul Cats’ (inspired by a T-shirt). This move made sense because both had been targeting user groups – the professional user (Stripy Tops) and the public (Access All Areas) – and both now felt that their paths were converging.

The teams have drawn up more specific suggestions on user requirements based on the needs of different target groups. It’s quite impressive how yesterday’s wide-roaming discussions are now funneling into concrete scenarios and solutions. The obvious conclusion is to make the web interface simple – clear language, natural keywords, self-evident icons, sensible menu structure etc.

There was some discussion around use cases:

  • options for geo-location of service providers relative to user addresses
  • including info on mobility/access issues e.g. stairs
  • including info on parking, public and community transport connections
  • including photos of the service location, exteriors and interiors, so that people easily recognise the place once there

The next steps will involve working closer with our coders and coming up with names for the page, categories etc.

Tourism Hack – Perth – TBC

PLEASE NOTE – Due to low take-up this event has been postponed. We are sorry for any inconvenience this will cause. 

Perth wants to boost its tourism offer and wants some help! They want to see whether some well-developed apps could help the city and its wider area bring attractions, trails, events, culture, accommodation, eateries and activities to life.

They are also interested in bringing the quirky and interesting aspects of the city together, using great images and interesting user generated content through social media.

Update: Data sources added on GitHub.

They have developed the website http://www.perthcity.co.uk/ and there is an app (http://www.mi-perthshire.co.uk/), but they want some creative minds to take a fresh look at the city and surrounding area and generate new ideas that they could then develop into new apps, open data or other projects.

As always we’re looking for coders, designers, data wranglers, service users and providers, bloggers – in fact anyone with an interest – to join us for a weekend of ideation, creation, open data and rapid prototyping.

We’ll feed you, keep you stimulated, and provide good wifi. You will leave with a sense of accomplishment, new skills and potentially new friends.

Accommodation

We’ve uploaded a list of hotels in this Perth City Accommodation List.

In addition there is a cluster of B&Bs on Dunkeld Road.

Also, just outside the city itself, The Lodge at the Perth Racecourse is offering a flat rate of £90 per night for a double or twin-bedded room (£45 per person), which also includes a full breakfast. See http://perthlodge.co.uk/dining

So, how did CTC6 – The History Jam go?

Intro

On 19th and 20th March we found ourselves back at Aberdeen Uni with 35 or so eager hackers looking to bring to life a 3D Virtual Reality historic model of Aberdeen city centre using new open data. So how did it go?

This time we were more prescriptive than at any previous Code The City event. In the run-up to the weekend we’d identified several sub-team roles:

  • Locating, identifying and curating historic content
  • Transcribing, formatting and creating valid open data
  • Building the 3D model, fixing and importing images and
  • Integrating and visualising the new data in the model.
Andrew Gives us an Open Data Briefing

After some breakfast, an intro and a quick tutorial on Open Data, delivered by Andrew Sage, we got stuck into the work in teams.

Old Books into Open Data

We were lucky to have a bunch (or should that be a shelf-ful?) of city librarians, an archivist and a gaggle of other volunteers working on sourcing and transcribing data into some templates we’d set up in Google Sheets.

Since we’d been given scanned photos of all the shop frontages of Union Street, starting in 1937 (of which more below), we settled on that as the main period to work from.

The Transcribers

The librarians and helpers quickly got stuck into transcribing the records they’d identified – particularly the 1937-38 Post Office Directory of Aberdeen. If my arithmetic is correct they completely captured the details of 1,100+ businesses in the area around Union Street.

At present these are sitting in a Google Spreadsheet – and we will be working out with the librarians how we present this as well-structured, licensed Open Data. It is also a work in progress. So there are decisions to be made – do we complete the transcription of the whole of Aberdeen, or do we move on to another year, e.g. 1953, which is when we have the next set of shopfront photos?

We have a plan

Music, pictures and sound

At the same time as this transcription was ongoing, we had someone sourcing and capturing music such as might have been around in 1937, and sounds that you might have heard on the street – including various tram sounds – which could be imported into the model.

Sounds of the city

And three of us did some work on beginning an open list of gigs for Aberdeen, since the city had both the Capitol Theatre (Queen, AC/DC, Hawkwind) and the Music Hall (Led Zeppelin, David Bowie, Elton John) on Union Street. This currently stands at 735 gigs and growing. Again, we need to figure out when and how to make it live.

The 3D Model

At CTC5 back in November 2015, Andrew Sage had started to build a 3D model of the city centre in Unity. That relied heavily on manually creating the buildings. Andrew’s idea for CTC6 was to use OpenStreetMap data as a base for the model, and to use some scripting to pull the building footprints into the model.
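
To give a flavour of what that scripting can look like – this is just a sketch using the public Overpass API, not the script Andrew wrote, and the bounding box for the city centre is an approximation I’ve assumed – building outlines can be fetched like so:

```python
import requests

# Approximate bounding box for Aberdeen city centre (south, west, north, east);
# the coordinates here are an assumption for illustration.
QUERY = """
[out:json];
way["building"](57.143,-2.105,57.150,-2.090);
out geom;
"""

resp = requests.post("https://overpass-api.de/api/interpreter",
                     data={"data": QUERY}, timeout=60)
resp.raise_for_status()

footprints = []
for way in resp.json().get("elements", []):
    # Each way's geometry is the list of lat/lon points outlining one building.
    footprints.append([(pt["lat"], pt["lon"]) for pt in way.get("geometry", [])])

print(f"Fetched {len(footprints)} building footprints")
```

The footprints would then still need to be extruded into building shapes inside Unity; the sketch above only covers fetching the outlines.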

Oculus Rift Headset and a 1937 Post Office Directory

This proved to be more challenging than expected. Steven Milne has written a great post on his site. I suggest that you read that then come back to this article.

As you’ve hopefully just read, Steve has identified the challenge of using OpenStreetMap data for a project such as this: the data just isn’t complete enough or accurate enough to be the sole source for the model.

While we could update the data and push it back to OSM, that isn’t necessarily the best use of time at a workshop such as this.

An alternative

There is an alternative to some of that. All 32 local authorities in Scotland maintain a gazetteer of all properties in their area. These are highly accurate, constantly updated, and have Unique Property Reference Numbers (UPRNs) and coordinates for all buildings. This data (if it was open) would make projects such as this so much easier. While we would still need building shapes to be created in the 3D model, we would have accurate geo-location of all addresses, and so could tie the transcribed data to the 3D map very easily.

By using UPRNs as the master data across each transcribed year’s data we could match the change in use of individual buildings through time much more easily. There is a real need to get this data released by authorities as open data, or at least under a licence allowing generous re-use. ODI Aberdeen are exploring this with Aberdeen City Council and the Scottish Government.
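
As a sketch of how that would work – the UPRN values and directory entries below are hypothetical, and the real data would come from the transcribed spreadsheets – matching a building’s occupants across years becomes a simple join on the UPRN:

```python
# Hypothetical directories keyed by UPRN; in reality each transcribed address
# would be resolved to its UPRN from the gazetteer.
directory_1937 = {906700123456: "J. Smith, Grocer"}
directory_1953 = {906700123456: "Smith & Son, Grocers", 906700654321: "City Shoe Repairs"}

def changes_by_uprn(earlier, later):
    """Pair up the occupant of each UPRN across two years' directories."""
    all_uprns = set(earlier) | set(later)
    return {uprn: (earlier.get(uprn), later.get(uprn)) for uprn in sorted(all_uprns)}

for uprn, (was, now) in changes_by_uprn(directory_1937, directory_1953).items():
    print(uprn, was, "->", now)
```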

Fixing photos

We were given, by the city’s Planning Service, scans of photos of the shopfronts of Union Street from a number of decades: 1937, 1953 and on to the present. Generally the photos are very good but there are issues: we have seams between photos which run down the centre of buildings, we have binding tape showing through, etc.

A split building on Castle Street.

These issues are not so very difficult to fix – but they do need someone with competence in Photoshop, some standard guidance, and a workflow to follow.

We started fixing some photos so that they could provide the textures for the buildings of Union Street in the model. But given the problems we were having with the model, and a lack of dedicated Photoshop resource, we parked this for now.

Next steps

Taking this project forward, while still posing some challenges, is far from impossible. We’ve shown that the data for the entire city centre for any year can be crowd-transcribed in just 36 hours. But there are some decisions to be made.

Picking up on the points above, these can be broken down as follows.

Historical Data

  • Licensing model to be agreed
  • Publishing platform to be identified
  • Do we widen geographically (across the city as a whole) or temporally (same area, different years)?
  • Creating volunteer transcribing teams, with guidance, supervision and perhaps a physical space to carry out the work.
  • Identify new data sources (e.g. the Archives were able to offer valuation roll data for the same period – would these add extra data for buildings, addresses, businesses?)
  • Set up a means for the general public to get involved – gamifying the transcription process, perhaps?

Photos

  • Similar to the data above.
  • We need clear CC licences to be generated for the pictures
  • Crowdsource the fixing of the photos
  • Create workflow, identify places for the pictures to be stored
  • Look at how we gamify or induce skilled Photoshop users to get involved
  • Set up a repository of republished, fixed pictures, licensed for reuse, with a proper addressing system and naming – so that individual pictures can be tied to the map and data sources

The 3D Model

  • Build the model
  • Extend the coverage (geographically and through time)
  • Establish how best to display the transcribed data – and to allow someone in the 3D environment to move forward and back in time.
  • Look at how we can import other data such as a forthcoming 3D scan of the city centre to shortcut some development work
  • Look at how we can reuse the data in other formats and platforms (such as Minecraft) with minimum rework.
  • Speed up the 3D modelling by identifying funding streams that could be used to progress this more quickly. If you have suggestions please let us know as a comment below.

Taking all of this forward is quite an undertaking, but it is also achievable if we break the work down into streams and work on those. Some aspects would benefit from CTC’s involvement – but some could be done without us. So, libraries could use the experience gained here to set up transcribing teams of volunteers – and be creating proper open data with real re-use value. That data could then easily be used by anyone who wants to reuse it – e.g. to create a city centre mobile app which allows you to see any premises on Union Street, call up photos from different periods, find out which businesses operated there, etc.

As the model takes shape and we experiment with how we present the data we can hopefully get more attention and interest (and funding?) to support its development. It would be good to get some students on placements working on some aspects of this too.

Aberdeen City Council is working with the Scottish Cities Alliance to replace and improve the Open Data platforms for all seven Scottish cities later this year – and that will provide a robust means of presenting and storing all this open data once in place. In the meantime we will need to find some temporary alternatives (perhaps on GitHub) until we are ready.

We welcome your input on this – how could you or your organisation help, what is your interest, how could you assist with taking this forward? Please leave comments below.

Code The City 6 – The History Jam was funded by Aberdeen City Council’s Libraries service and generously supported by Eventifier, who provided us with free use of their social media platform and its LiveWall for the sixth consecutive time!

History Jam – #CTC6

The History Jam (or Code The City #6 if you are counting) will take place on 19-20 March 2016 at Aberdeen University. You can get one of the remaining tickets here.

As a participant, you’ll be bringing history to life, creating a 3D virtual reality map of a square mile of Aberdeen’s city centre. You’ll be gathering data from a variety of historical sources, transcribing that and creating new open data. You’ll import that into the 3D model.
And there will also be the opportunity to re-use that data in imaginative new ways. So, if you are a MineCraft fan, why not use the data to start building Minecraft Aberdeen?
This is not one of our usual hacks, whatever that is! This time around, instead of you proposing problems to be worked on, we’ve set the agenda; we’ll help form the teams and provide you with more guidance and support.
If you come along you’ll learn open data skills. And you’ll get a year’s free membership of the Open Data Institute!

Saturday’s Running Order

09:00 Arrive in time for fruit juices, coffee, pastries, or a rowie.

09:30 Introduction to the day
09:45 Briefing of teams and, if you are new to Open Data, a quick training session

10:15 Split into three streams:

  • Sourcing and curation of data, and structuring capture mechanisms
  • Transcribing, cleaning, and publishing open data
  • Creating the 3D map, importing and visualising the data


Throughout the day we’ll have feedback sessions, presenting back to the room on progress. We’ll write blog posts, create videos, photograph progress.

13:00 Lunch (the best sandwiches in Aberdeen)

More workstream sessions with feedback and questions.

17:30 (or so) Pizza and a drink

We’ll wind up about 8pm or so, if you can stay until then.

Sunday’s Agenda

09:30 arrive for breakfast

10:00 kick off

Morning sessions

12:30 Lunch

Afternoon sessions

16:00 Show and Tell sessions – demonstrate to the room, and a wider audience, and preserve for posterity what you’ve produced in less than 36 hours. You’ll be amazed!