Monday 29 February 2016

History Bakers: Marchpane

For this month’s History Bakers, I wanted to try something different to my previous bakes, and as I was looking through our various recipes I became intrigued by this recipe for marchpane, or ‘marshpain’, an early version of marzipan.

Original recipe from U DDHO/19/1

The recipe is taken from our Hotham collection (item ref. U DDHO/19/1) and, in its original form, is a little confusing for the modern baker. I wasn't quite sure what it meant by ‘wafors’ and I couldn't quite picture how it was meant to look. As such, I scoured the internet for further advice and found that the ‘wafors’ performed the same job as greaseproof paper does for us today, and that marchpane could be moulded into any shape desired, but that in this case it was meant to be a circular ‘slab’ decorated with sweets.

The end result is essentially an almond biscuit, crispy on the outside and soft in the middle. Because it is baked, it is not soft all the way through like modern marzipan, and I’m not sure it would be that easy to cover a cake with marchpane! It was, however, another interesting foray into historical baking, providing insight into what used to be a very popular sweet treat.

If you would like to have a go yourself, please see below for a modernised recipe:
Marchpane
200g ground almonds
100g icing sugar
Approx. 6 tsp rosewater

1. Mix together the almonds and icing sugar, making sure any lumps of icing sugar are smoothed out.
2. Add the rosewater a teaspoon at a time. Start by mixing with a spoon and then with your hands. Keep adding the rosewater until it becomes a smooth paste. The dough should not be sticky and should not be overworked, as it will become oily.
3. Dust your work surface with icing sugar and roll out the marchpane to a thickness of approximately 1cm.
4. Cut around a plate to create a circle of marchpane and place on a baking tray covered with greaseproof paper. Any leftover marchpane can be used to create other shapes for decoration.
5. Bake in the oven at 150°C (300°F) for about 25 minutes, until it is starting to brown.
6. Allow to cool before decorating with icing (mix of icing sugar and rosewater) and your decorations of choice.

The finished marchpane!

Let us know what you think if you have a go at making it. Here are some of our thoughts...

Dave - Very sweet and surprisingly crunchy
Elaine - Can taste the almonds, biscuit consistency
Claire - Chewy, almondy and sweet, lovely and surprising!
Christine - Lovely almond flavour and pleasantly chewy!
Mrs West - Very chewy, crunchy and sweet
Martin - Crunchy and almondy, although not to the extent I was expecting
Alex - Really delicate almond and rose flavour, sort of marzipan in biscuit form!
Pete - Very good, had three pieces...
Laura - Chewy, sweet and lovely almond flavour, very tasty

Verity Minniti, Archives Assistant

Friday 26 February 2016

Image Visualisation

Something I have found quite surprising over the last few months is just how varied archives can be. I was recently given a large set of images to work with, which is not so usual, but the task at hand was to try to look at the items in a new way. This was only really possible in this case because the dataset I was working with was a group of digital images, which gives a lot of freedom when it comes to quickly arranging and presenting large quantities of data. Providing our users with a different way of viewing data is in many ways just like supplying another means of access: it allows them to see things differently and draw their own conclusions without any personal bias or inference being imparted on our part.

Tasked with finding free software or tools to visualise the material, I set about some serious desktop research. Whilst the work is ongoing, here are my initial thoughts on using two different pieces of software.


ImageSorterV4 dataset arranged via colour
ImageSorter
The first, ImageSorterV4 by Pixolution, offers a very quick and easy way to sort and arrange large volumes of images. It is limited in the sorting options available, since it can only arrange by colour, name, date and size. Of the choices available, colour and date are the two most useful for our purposes, with the colour option being very visually striking. The advantage of this program is just how quick it is: within moments of being pointed at the image folder it will process and display the images.

Dataset arranged by Brightness vs Saturation 
ImagePlot
The second program, ImagePlot by the Software Studies Initiative, builds upon ImageJ, a Java-based open source image processor. ImagePlot is a far more versatile program, allowing a comparison of any two elements in relation to the image, provided you already have access to the information in advance. At its base it is simply a scatter plot program, comparing two different values, but where it differs is that once it has plotted a position it will place the relevant image at that location. This doesn’t sound groundbreaking, but it turns a very static and uninspiring-looking scatter graph into a very visual experience. It is still just a scatter plot, though, which means that as long as you have two pieces of data worth comparing you can visually display them, so the focus shifts from looking for the right program to display your data to finding the right way of pulling data from the images themselves.
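The same general idea can be sketched in a few lines of Python. The snippet below is not ImagePlot itself, just an illustration using matplotlib and Pillow; the folder name and the two measured values are assumptions for the example.

```python
# A rough illustration of the ImagePlot idea (this is NOT ImagePlot itself):
# place a thumbnail of each image at a position given by two measured values,
# here mean saturation vs mean brightness. The folder name is hypothetical.
from pathlib import Path

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.offsetbox import AnnotationBbox, OffsetImage
from PIL import Image

folder = Path("images")                      # hypothetical folder of images

fig, ax = plt.subplots(figsize=(10, 8))
for path in sorted(folder.glob("*.jpg")):
    img = Image.open(path).convert("RGB")
    hsv = np.asarray(img.convert("HSV")) / 255.0
    saturation = hsv[..., 1].mean()          # x-axis value
    brightness = hsv[..., 2].mean()          # y-axis value

    thumb = OffsetImage(np.asarray(img.resize((40, 40))))
    ax.add_artist(AnnotationBbox(thumb, (saturation, brightness), frameon=False))

ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.set_xlabel("Mean saturation")
ax.set_ylabel("Mean brightness")
plt.show()
```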

Metadata
Since the only information available in this instance is the image itself, we have to see what metadata we can extract from the file, such as the file size or the date it was created. Most digital cameras also record information at the moment the shutter-release button is pressed: time and date, camera model, focus settings, field of view (FOV), whether a flash was used and, in some cameras, even the GPS location where the photo was taken.
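This camera metadata (EXIF) can also be read programmatically. As a rough illustration, a short Python sketch using the Pillow library might look like the following; the file name is hypothetical and which tags appear depends entirely on the camera.

```python
# Read basic EXIF metadata from a digital photograph using Pillow.
from PIL import ExifTags, Image

img = Image.open("photo.jpg")                  # hypothetical file name
exif = img.getexif()

for tag_id, value in exif.items():
    tag_name = ExifTags.TAGS.get(tag_id, tag_id)   # translate numeric tag IDs
    print(f"{tag_name}: {value}")
# Typical output includes tags such as DateTime, Model and Flash.
```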

For my testing with the ImagePlot software I used some easily accessible metadata: the name, date last modified and size of each file, all of which I was able to collect in moments using an internet browser. With files in a folder you can simply copy and paste the folder’s address into a browser window to get access to a simplified file listing; Google Chrome in particular works well for this.
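The same name, date-modified and size information could equally be gathered with a short script. The sketch below is just one possible approach in Python, with a hypothetical folder and output file name.

```python
# Collect basic metadata (name, date last modified, size) for every file
# in a folder; an alternative to reading the listing in a browser.
import csv
import datetime
from pathlib import Path

folder = Path("images")                      # hypothetical folder

with open("file_metadata.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["filename", "modified", "size_bytes"])
    for path in sorted(folder.iterdir()):
        if path.is_file():
            stat = path.stat()
            modified = datetime.datetime.fromtimestamp(stat.st_mtime)
            writer.writerow([path.name, modified.isoformat(), stat.st_size])
```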

Image Analysis
Dataset showing the count of objects vs the percentage of area covered
If we want to do some useful comparisons we are going to need some more interesting data. The content of the images holds the most potential for relevant data: we can already view the images, but pulling quantifiable statistics for the purpose of comparison is a little more difficult. Thankfully ImagePlot comes with three additional macros that allow the user to pull a limited amount of this information from the images themselves. In this case I used these macros to pull information about the brightness, saturation, hue, count (number of objects greater than 10px) and the percentage of area covered (by the same objects). The count macro allows for a certain amount of variation in the way it collects its values. By default it is set up to search for ‘objects’ that are 10 pixels or larger, so some complex images, like people’s faces or textured backgrounds, can return a value in the thousands; this setting can be raised to a higher number (say 100px) so that it only detects larger objects, at the cost of any small objects being overlooked. There is also a setting to change whether it looks for more circular or rectangular sections when determining what is classed as an ‘object’ for the purposes of counting.
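By way of illustration, the sketch below shows one crude way of measuring similar values in Python: mean hue, saturation and brightness, plus a simple count of bright regions above a size threshold. It is only a stand-in for the ImagePlot/ImageJ macros, not a reimplementation of them, and the threshold and file name are assumptions.

```python
# A crude stand-in for the ImagePlot measurement macros: mean hue, saturation
# and brightness, plus a simple count of bright "objects" of at least 10 pixels.
# The thresholding here is only illustrative; ImageJ's own macros are more subtle.
import numpy as np
from PIL import Image
from scipy import ndimage

def measure(path, min_object_px=10, threshold=128):
    hsv = np.asarray(Image.open(path).convert("RGB").convert("HSV"))
    hue, sat, val = hsv[..., 0], hsv[..., 1], hsv[..., 2]

    # Label connected regions brighter than the threshold, then keep only
    # those containing at least min_object_px pixels.
    mask = val > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    objects = int((sizes >= min_object_px).sum())
    area_pct = 100.0 * sizes[sizes >= min_object_px].sum() / val.size

    return {
        "hue": hue.mean(),
        "saturation": sat.mean(),
        "brightness": val.mean(),
        "object_count": objects,
        "area_percent": area_pct,
    }

print(measure("photo.jpg"))                  # hypothetical file
```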

File Format
Once all this information has been gathered it simply needs to be placed into an Excel spreadsheet, with each row relating to a particular image. All of the methods I have mentioned so far allow the information to be retrieved in text form and placed into a spreadsheet very easily. Once the spreadsheet has all the data that you wish to compare, it is saved as a tab-delimited text file (.txt), which is the format that ImagePlot uses for running its scatter plot algorithm. This freedom of simply saving data into an Excel spreadsheet and then turning that into a text file really opens up what you can do with ImagePlot: as long as you can get information and values, you have something to compare, and you don’t have to deal with a third-party file type you’ve never encountered before.
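The same tab-delimited file can also be written directly from a script rather than via the spreadsheet’s “Save As” option. A minimal Python sketch, with made-up measurements, might look like this:

```python
# Write measurements straight to a tab-delimited text file, the format
# ImagePlot expects; this mirrors Excel's "Save As > Text (Tab delimited)".
import csv

rows = [
    {"filename": "photo1.jpg", "brightness": 142.3, "saturation": 61.8},
    {"filename": "photo2.jpg", "brightness": 98.0, "saturation": 120.5},
]                                            # hypothetical measurements

with open("imageplot_data.txt", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()), delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
```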

Conversion of Excel spreadsheet to tab-delimited text file
Conclusion
Programs such as these are only limited by the data that can be extracted from the images themselves, by the fact that I can only compare two statistics at a time, and by the fact that the data will always be displayed as a direct comparison of one element against another. It is also far easier to compare information that is purely numerical as opposed to text, for example the file name or any string/boolean value, although that only applies to the software I have found so far. If there is software that could group images by textual information I would love to hear about it.

With what I have so far, I have a significant amount of information to display. Some comparisons are more relevant than others and some are far more visually appealing, but it is the fact that I can show this data at all that is interesting. Being able to get a visual representation of data makes it easier to consider it as a whole and allows other people to draw conclusions about the entire dataset in a way that would otherwise be almost impossible when only viewing small sections of the data at a time.

David Heelas
Transforming Archives Trainee

Wednesday 17 February 2016

History Bakers: Quaker Oat Biscuits


January's History Bakers...slightly later than usual!
Last month's recipe comes from a souvenir recipe booklet produced for the Kingston Wesley Methodist Church Bazaar in 1962 (Ref No. C DCE/848/9). As it was January I was looking for something to bake that was a little healthier after the rich foods of Christmas and the New Year, and this recipe caught my eye, particularly as the ingredients are readily available and the biscuits are quick and easy to make.
C DCE/848/9
At the beginning of the recipe booklet is a page of useful information including oven settings and measures. The electric oven temperatures are given in Fahrenheit, so I needed to convert these to centigrade and then reduce them accordingly for my fan-assisted electric oven. The recipe instructions themselves are very brief, so there was some experimenting and keeping a close eye on the biscuits whilst they were cooking, as no times are given. The recipe title indicates that Quaker Oats are to be used, but any porridge oats are suitable, and the same goes for the corn flakes: any brand can be used.
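For anyone wanting to do the same conversion, the standard formula is °C = (°F − 32) × 5/9, so 350°F, for example, works out at roughly 177°C; for a fan-assisted oven it is usual to knock a further 10 to 20 degrees off the conventional figure.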
Recipe and ingredients
For electric ovens the mixture needs to be cooked initially at 180 degrees centigrade for four minutes and then lowered to 170 degrees; for fan-assisted ovens, start at 170 degrees and lower it to 160 degrees. The mixture did not spread as much as I expected whilst it was cooking, and a teaspoonful of the mixture gave very small biscuits. I found that a good tablespoonful, or even more, was better and produced a good crisp, chewy biscuit. The baking time is 12 to 15 minutes, depending on the size, and the biscuits will be quite soft when they come out of the oven but will harden and crisp up as they cool on the cooling rack, so don’t be tempted to leave them in the oven to crisp.
The finished biscuits
Some comments from the staff after tasting them:
Claire - Crispie, oaty yummyness!
Elaine - Scrumptious, very tasty
Alex - Really nice! Very moreish
Laura - Lovely, crispy and tasty! Yum!
Verity - Just the right balance between crispy and chewy
Carol - Lovely and moist but crunchy in the right places
Sarah - Chewy, oaty and buttery. Lovely!
Caoimhe - Delicious – crunchy and yummy
Elspeth - Extremely oaty, pleasant taste, and great crunch
Angela - Very crunchy, creamy and not too sweet
Neil - Oaty, light and crunchy. Very nice!
Pete - Jolly good.

Christine, Conservator at Hull History Centre

Friday 12 February 2016

Digital Isn’t Different Conference

People's History Museum, Manchester
Last week I attended the Digital Isn’t Different conference, run by the Collections Trust and held at the People's History Museum in Manchester. The event was primarily about digital strategy for museums and how to approach the rising amount of digital material now being used. It covered both digital collections and digital asset management.

Interestingly, it reinforced a point I took away from the DPC Student Conference: the digital collections process is largely the same as it is for non-digital items, and there is plenty of transference of skills and strategy. At the same time, what you decide to accept still needs to be in line with your policy and mission statement, as well as the resources at your disposal.

The secondary focus of the conference was to provide more information on what Digital Asset Management is and how to utilise Digital Asset Management Systems to help with your collection. One quote from an Extensis webcast that stuck with me was this:

“You can find a recipe for Norwegian Apple Cake in seconds – but can you find your own records that were created yesterday”

I think this really reinforces that it is not always just about gathering information; there needs to be an effective way to filter through it and actually find what you are looking for, and this is especially true in a digital environment. There have been plenty of anecdotes over the last few years of organisations going completely paperless and then promptly being unable to actually find anything anymore.
The conference about to begin!

Digital Asset Management (DAM)

Digital Asset Management is simply the process that an organisation uses in order to be able to correctly create, store, retrieve and use digital information. A DAM has several important aspects in how it interacts with your collection: planning, findability, connectivity, security, usability and preservation.

Business continuity planning - If you only have one staff member who can use the software or hardware, is it worth investing in? Consider how your current system is used: is it still working? Does it need to be changed?

Findability and connectivity - File naming conventions are important for the digital assets you create and can help significantly, rather than just relying on the default camera naming settings. Keyword tagging can also feed into this to aid identification; for example, a picture of a ruined castle could be tagged with “castle, ruins, building”, and this metadata can then be searched to bring up relevant pictures more easily.
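As a tiny illustration of the idea (with entirely made-up file names and tags), keyword tagging can be as simple as a lookup from file name to tags, which can then be searched:

```python
# A minimal sketch of keyword tagging: a simple mapping from image file names
# to descriptive tags, and a search that returns every image carrying a tag.
# The file names and tags here are purely hypothetical.
tags = {
    "DSC_0412.jpg": {"castle", "ruins", "building"},
    "DSC_0413.jpg": {"harbour", "boats"},
    "DSC_0500.jpg": {"castle", "portrait"},
}

def find_by_tag(keyword):
    """Return the file names of all images tagged with the given keyword."""
    return [name for name, kw in tags.items() if keyword in kw]

print(find_by_tag("castle"))   # -> ['DSC_0412.jpg', 'DSC_0500.jpg']
```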

Security, usability and preservation - Security is not just about protecting the actual data from unauthorised access; it also covers aspects of copyright and intellectual property rights. Preservation is another incredibly important aspect: although this could be covered by a DAM, there is dedicated software available for digital preservation (Archivematica, Preservica, etc.), and it is important to have the DAM take this into account and provide ways of integrating it into your overall process.

Integrity is vital for born-digital assets, and checksums can be created to confirm that data has not been altered. But this means that time needs to be set aside in the workflow specifically for creating these checksums and then, later on in the workflow, for checking them against the originals to confirm that no alterations have been made.
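As an illustration, creating and later verifying a checksum can be done in a few lines of Python using the standard hashlib module; the file name below is hypothetical.

```python
# Create a SHA-256 checksum for a file at the point of ingest, then verify it
# later in the workflow to confirm the file has not been altered.
import hashlib

def sha256_of(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

original = sha256_of("record_0001.tif")     # hypothetical file, stored at ingest
# ... later in the workflow ...
assert sha256_of("record_0001.tif") == original, "File has been altered!"
```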

Ultimately, when dealing with digital files there are several things to keep in mind: file sizes are getting larger and larger, there are always going to be more digital files tomorrow than there are today and, finally, people expect to be able to access and find files instantly, and this includes colleagues as much as the public!

Case Studies
Several museums shared their own experiences of using digital assets, ranging from full-blown apps to large-scale digitisation projects. It was very interesting to see how other institutions have tackled this problem and what issues they encountered along the way.

One incredibly interesting talk by Kendal Museum covered their experience of an HLF-funded digitisation project and how they went about choosing which collections to digitise, resulting in a digital catalogue that provides high-quality images of minerals and herbarium specimens. It was also interesting to see that for the purposes of digitisation they chose to use the Metamorfoze Imaging Standard, an internationally recognised preservation imaging guideline from the Netherlands; it is actually intended for the digitisation of paper documents but can be used for other projects as well. Using this standard guarantees that, regardless of the lighting conditions in which an image is viewed, the original colour can be determined accurately, giving the best idea of what the artefact or record actually looks like. They have a blog post on the subject that is well worth reading.


Group discussion on the process of accepting a digital record
Resources
Perhaps the greatest thing I will take away from the conference is the sheer amount and variety of resources available when looking into digital assets, such as:

Strategy and Planning:
Collections Trust ‘Going Digital’ – An information package, including tools, about how to start thinking about the digital side of things and how to assess your organisation's current procedures.

The National Archives ‘How to manage your information’ – This covers a number of different guides and tools to help plan your digital strategy and understand how to manage all the information.

Digitisation/Digital Assets information:
JISC Guides – JISC has a number of detailed guides on a variety of subjects such as Creating, Finding, Managing, Digitisation and Delivering and Using. There is also the JISC Toolkit which relates to the equipment needed for digitisation projects.

Extensis Webcast – Recordings of webcasts from the company Extensis who provide Digital Asset Management Systems.

DPC Digital Preservation Handbook – Something I was already familiar with but nonetheless an excellent and well written guide for all things digital preservation.

British Library ‘Fail to prepare for digitisation, prepare to fail at digitising!’ – The British Library has a number of different posts and discussions about their own process in Digitising collections, definitely a site worth looking at.

Social Media/Blogging:
Thirty8 Digital ’Blogging Ideas’ – Inspiration for blogs, primarily aimed at museums but still usable for archives. Thirty8 Digital also has a number of other resources on various social media platforms such as Twitter, as well as some digital strategy worksheets, all available on their resource page.

#CultureThemes – A blog that posts a new hashtag every few weeks, in order to create some cohesion but also to highlight some of the more interesting artefacts/records that places may have. Once again it is largely aimed at museums but is still relevant to other heritage sectors!

Technology:
Museum in a box – A 3D printing project aimed at getting pieces from smaller museums scanned and then using the replicas as learning resources.

Conclusion
Overall, having a Digital Asset Management plan in place does seem like the obvious and clear choice, but as was mentioned throughout the conference, it is important to look at your current systems to see what you already have in place and what you may need; ideally your new system should feed into your existing systems as much as possible. The conference also one hundred percent reinforced something from the DPC Student Conference: you need to know what you have and where it is, and just because something is digital doesn’t mean this becomes less relevant.

To conclude, it was definitely a very interesting two days. There was a much greater focus on the museum side of digital assets, but that is understandable, and everything covered is still very relevant to other heritage institutions. If you want to see what people were tweeting about during the event, we were using the hashtag #DigitalIsntDifferent. I think this event has really helped to supplement what I learned and covered in my previous blog post about the DPC Student Conference, this time covering more about how to deal with the sheer amount of information that an organisation could potentially be taking in.

David Heelas
Transforming Archives Trainee

Wednesday 3 February 2016

New Cataloguing Project at Hull History Centre: Introducing ACPO

Hello, I’m Alex and I’ve just taken up a position as a Project Archivist at Hull History Centre. Over the next few months I will be working with the papers of the now defunct national body, the Association of Chief Police Officers (ACPO).
This is me, Alex!
ACPO was formed in 1948 from the combination of the County Chief Constables' Club and the Chief Constables' Association of England and Wales, and was a critical component in the development of policing and police forces. The association sought to ensure policing operations and services were coordinated at a national level, and also to provide a united professional voice for the police service. An independent review of ACPO was commissioned by the recently introduced Association of Police and Crime Commissioners in 2013. Following publication of the resulting report, the National Police Chiefs' Council (NPCC) was established to replace ACPO. This took effect on 1 April 2015.
Selection of items from the ACPO collection
During its existence ACPO influenced not just the creation of police policy, but also successfully lobbied Parliament and, as a result, influenced the creation of national legislation. Although I’ve only just started to look at the collection of ACPO records here at Hull, it’s already clear that they span from the 1940s right up until at least 2003. The period of history in which ACPO operated was a turbulent one for the police in England and Wales, with public attitudes towards them shifting dramatically. It also saw vast changes in the social and cultural landscape. A quick peek in the boxes promises records which reflect these shifts. In addition to reports of inquiries into high-profile police controversies, such as the handling of the murder of Stephen Lawrence and the response to the miners’ strike of 1984-1985, there are also investigations into the use of video technology (now itself outdated) and the use of DNA databases (now well established).
ACPO logo
I’m really looking forward to ensuring that the records of this highly influential and sometimes controversial organisation are fully accessible to any researcher. I’ve already taken the first step by writing a brief overview of the collection for our online catalogue. Follow this link to take a look. I hope that both this blog and the initial entry in our catalogue give an idea of what to expect from the collection, and also encourage you to come and find out more once the project is complete and the records are available to consult in our search room. I’ll be writing more entries for the blog as we progress through the project, so do keep an eye out.

Alex, Project Archivist, Hull History Centre