Thursday, 13 September 2018

It's really not about the ducks

I received some very strange looks from my colleagues when I stated that I was no longer going to allow students to put their hands up in my IT workshops to ask for help. Nope, not even once.

What is this nonsense, you might ask? Or, to use the name given to the initiative: What the Duck?

Let's make one thing very clear at the outset: students can ask for help during my classes, but they have to do it in a specific way. I will elaborate, but first some background.

There is a very successful initiative for improving the programming and data management skills of primarily postgraduate and postdoctoral scientists who are not computing specialists. This initiative, Software Carpentry (and its allied friends), takes a strongly evidence-based pedagogy and uses some smart methods in the classroom. One of those is the use of red and green Post-it notes. When a student requires help they put the red sticky up on their monitor. When they have completed the task at hand, the green sticky indicates visually to the instructor that they are ready to move on. This is an effective method for high-intensity classes with highly motivated, high-calibre participants.

I have translated this to my classes. I don't use stickies (though they are usable in this context) but instead simplify the approach to this:
If you want help, talk to the duck!
The duck is anything that you can stick on the top of your monitor. It is surprisingly easy to detect any perturbation in the smooth flat line of a monitor from across the classroom.

Whilst the duck is there, explain the problem, talk it through. Often the student solves the problem and takes down the duck before the instructor can reach them.  But while the duck is doing the attracting, the student can concentrate on the task, eyes are on the screen, hands are on the keyboard.

How well does it work? It is only the start of semester and only one class in, but the students got the idea very quickly and really liked the ducks/sheep/penguins/swans/hedgehogs etc. that I distributed. This is a small class in an elective module - the large classes will have to earn their ducks through interaction in lectures/workshops. The response was great - ducks that had gone up were taken down before I could attend to that student. And while students had ducks up on their monitors they were actively engaged in attempting to solve the problems rather than in attracting the instructor's attention.

So a positive response so far from the students, and the desired behaviour in the classroom. We'll see how it plays out through the rest of semester and in larger classes.
Yes, all of these are ducks, in the right context.
Edit: We are now three weeks in and the ducks are extremely popular. The class is less stressed and students are keen to bring in their own personal ducks (or win some for answering questions in class). It is surprising how easy it is to see almost anything on top of a monitor screen, even in a room of 150 (our largest IT lab).

Has this inspired you to try ducks in your classroom? If so, please let me know how you get on via a comment or email.

How hard can it be? Part one of a saga.

It started, as do many things, with an idle conversation in front of a whiteboard. (I'm starting to realise that whiteboards are dangerous places.) One of my colleagues, a talented immunologist who spearheads our undergraduate cell culture modules, had seen the laser pen microscope and commented 'It would be great to adapt this to use as a cell counter'. She has been trying to get one into the lab but it isn't economically viable at the moment. How hard can it be to build a fluorescent cell counter/analyser on a minimalist budget (i.e. a few tens of pounds)?

Lasers

We have lasers. For a few pounds you can buy a laser pen that gives a reasonably coherent beam. I bought several for a fiver from eBay and now have wavelengths of 405 nm, 535 nm and 620 nm (blue, green and red respectively). So we have a source of light.

Filters

Filters for a commercial system are typically precision optical glass, carefully manufactured and priced at a premium - way more than we could afford on the shoestring budget allocated (I rummaged down the back of the sofa and found a few bits of spare change, a button and some random electronic components). However, there is a very cheap source of filters. The theatre industry uses many colours of lightweight plastic filter. Rosco provide spectrograms for all their filters, so we can spend many hours scanning their web site to identify filters that will act as a long-pass (to capture the emitted fluorescence while excluding the incident laser light) and a short-pass (to capture the incident light without any contribution from the emitted light).

This gets us two components - the challenge now is putting them together in a way we can work with. One aspect is detecting and measuring the light; the other is scaling this down to the level of a single cell. The former will be dealt with later.

Optics
To focus a laser beam requires a lens. Identifying suitable lenses (i.e. very cheap ones) is challenging, so it is time to think creatively. An optically clear acrylic rod will focus the round dot of the laser pen to a line. A quick Google search identifies a suitable equation for calculating the focal length of a cylinder of known refractive index and diameter. In short, this tells us that a laser beam will be focused to a line 7.6 mm from the centre of a 10 mm acrylic rod.
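For the curious, that figure can be checked with the standard ball-lens formula (which applies equally to a cylinder in the plane across its axis): effective focal length from the centre is nD/4(n-1). A quick sketch, assuming a refractive index of 1.49 for acrylic:

```python
# Paraxial focal length of a transparent cylinder, measured from its centre,
# using the ball-lens formula EFL = n*D / (4*(n - 1)).
# Assumptions: acrylic refractive index n = 1.49, rod diameter D = 10 mm.
def cylinder_focal_length(n, diameter_mm):
    """Effective focal length from the rod centre, in mm."""
    return n * diameter_mm / (4 * (n - 1))

f = cylinder_focal_length(1.49, 10.0)
print(f"{f:.1f} mm")  # agrees with the 7.6 mm quoted above
```

A thicker rod or a higher-index material pushes the focus further out, which is worth remembering if a different rod turns up cheap.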

Having now obtained a suitable rod, a quick play around indicates that this will be about right, so this problem is solved. We can cut discs from the rod: one before the flow cell to focus the light, one after to gather the transmitted light, and one at right angles to catch the emitted fluorescent light. In principle we could use two different discs, one on each side, to capture two different incident beams, but that proves difficult to design easily. So we will stick with one fluorescent label.

Electronics

This is not too difficult (I always seem to say that at the start of a project). Phototransistors are cheap and have a broad sensitivity; light causes the resistance they exhibit to drop. We can wire one into a voltage divider and turn the change in resistance into a change in voltage. To boost detection we can feed this voltage into an op-amp in differential mode, the other input being an adjustable voltage that sets a zero. We are then measuring the change in voltage rather than the absolute voltage (signals can go up and down. You may not get back the electrons you invest).
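As a rough sketch of the divider arithmetic (the component values here are illustrative assumptions, not the ones on my bench): with the phototransistor in the top leg, a drop in its effective resistance pushes the divider output up, and the differential stage sees only the change from the zero-set baseline.

```python
# Voltage divider: phototransistor (top leg) over a fixed resistor (bottom leg).
# Assumed values for illustration: 5 V supply, 10 kohm fixed resistor,
# phototransistor resistance falling from 100 kohm (baseline) to 80 kohm (lit).
V_SUPPLY = 5.0
R_FIXED = 10_000.0

def divider_out(r_photo):
    """Voltage across the fixed resistor in a simple divider."""
    return V_SUPPLY * R_FIXED / (r_photo + R_FIXED)

v_dark = divider_out(100_000)  # baseline, nulled by the op-amp's zero-set input
v_lit = divider_out(80_000)
delta = v_lit - v_dark         # this small difference is what gets amplified
print(round(v_dark, 3), round(v_lit, 3), round(delta, 3))
```

The point of the differential stage is clear from the numbers: the change is a fraction of the standing voltage, so amplifying the difference rather than the absolute signal keeps the gain stage out of saturation.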

This signal can then be fed into a second op-amp used as an amplifier, with a second variable resistor to adjust the gain. The output is then read through an analogue-to-digital converter by a Raspberry Pi. That at least seems a sensible plan to start with.

And now the big challenge.

The Flow Cell 

How big are the cells we want to look at? Different cell types come in different sizes, but ultimately our flow cell size will be determined by what is feasible to manufacture. The narrowest hole I think I can drill is 0.3 mm, so the plan is to take a 4 mm square perspex bar and drill a 0.3 mm diameter hole down the absolute centre, then glue some larger tube to each end of the cell. I can find 5 mm OD, 3 mm ID clear perspex tube that should do the trick. I had considered using a laser to cut the channel into a triangular piece and then construct the flow cell from two pieces, but the width of the laser cut is probably in the order of 0.5-0.6 mm and any interface will have optical challenges, so careful use of a drill press it will be. There is a plan B if this doesn't work, which gives us the potential for even smaller holes, but more on that later if it is needed.

This should give us a channel of 0.3 mm diameter. To limit the effective flow cell size, the laser can be pointed through a narrow (0.1-0.2 mm) slit across the flow cell, so we have a sensing region of at most about 0.2 x 0.3 mm. If a cell is 25 microns in diameter then we are looking for a maximum perturbation of the signal of about 1% for a single cell. That should be detectable if we use differential signal analysis with a suitable gain.
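The ~1% figure is easy to sanity-check by comparing the cross-sectional area of one cell with the area of the illuminated window (a purely geometric sketch, ignoring scattering and refraction):

```python
import math

# Fraction of a 0.2 mm x 0.3 mm sensing window blocked by a single
# 25 micron diameter cell, treating the cell as an opaque disc.
window_area = 0.2e-3 * 0.3e-3          # m^2
cell_area = math.pi * (12.5e-6) ** 2   # m^2, cross-section of a 25 um cell
fraction = cell_area / window_area
print(f"{100 * fraction:.2f}%")        # a bit under 1%
```

So a single cell shades roughly 0.8% of the beam, consistent with the "about 1%" estimate above.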

Other considerations

Holding a 4 mm rod in a vice under a drill press will be challenging. Instead of holding it directly I will make a jig with a 3D print that has a 4mm receptacle in the centre. The jig can be tightly clamped in the right position and then multiple flow cells drilled without having to constantly reset alignment. Likewise, gluing 5mm and 4mm pieces together end on will be challenging. A jig will be 3D printed to allow pieces to be held in place without risk of gluing them to the substrate.

Enabling flow in the optical cell is the final challenge. We had thought to use a peristaltic pump, or just a hand-held syringe, but one of my colleagues has instead suggested a venturi pump on the outflow (a water pump, to those used to school chemistry lessons). This will keep a constant pressure differential without overstressing the components.

Design in CAD

I've used TinkerCad (from Autodesk) to design the pieces. It is freely available and cloud based.



The blue element is the main part. The laser pen comes in on the right-hand side, firing through the first lens and the slit. The flow cell (clear) will be inserted in the slot and run front to back (or back to front; there is no real difference).
The green element locks the lenses in place. There is a slight gap between the lenses and the flow cell of about 0.6mm which allows for the insertion of a sliver of filter gel.
The yellow element locks the flow cell in place.
The purple element is the jig for constructing the flow cell.
The red element is the jig to hold the central part of the flow cell for drilling.

Overall size is about 60x40x40mm.

These parts have now been 3D printed, perspex has arrived and needs cutting to size and drilling. Electronics components are ready and waiting.
Testing - the laser pen fits. Other parts require a bit of finishing.
The 3D print straight off the machine. The flow cell dummy is to the left.

Parts ready to play. A phototransistor sits on the right hand foreground.


Perspex (acrylic) bar and tube ready to turn into finished parts.

Will this actually work as intended? I don't know but I'm going to have fun finding out. More posts will follow detailing construction, testing, coding and more.




Monday, 6 August 2018

Fold it, suck it, blast it, show it



Public engagement isn't to my mind so much about educating the public, though that is of course one of the aims, but of enthusing folk to educate themselves. One key way to do this is with hands-on discovery - when you find something with your own hands, or see it with your own eyes it has so much more of an impact.

I have been impressed with the work of Cristina Cipriano (@crizipri on Twitter), who is keen to bring low-cost equipment to the general public. She has produced a 3D printed laser pen microscope which is pretty cool, but 3D printing isn't really my thing, and the pieces are very specific in size. A different brand of laser or syringe could lead to small alignment issues and not work as desired.

I put together a design to hold the laser pen and a syringe in a rigid folded cardboard frame that can be cut from a single A4 sheet (plans here). The holes for the pen and syringe act as springs which hold them in place and can be tweaked to get perfect alignment.

How this works: the drop acts like a lens and will project whatever is inside it as an image onto the surface behind the drop. I used a sample of dirty water from the bottom of the water butt in my garden.
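For a feel of the magnification involved, the drop can be treated as a tiny ball lens. All the numbers below are assumptions for illustration (drop size and wall distance vary a lot in practice):

```python
# Back-of-envelope magnification for the drop-as-projector.
# Assumptions: roughly spherical water drop (n = 1.33), ~3 mm across,
# projecting onto a wall 1 m away. Ball lens: f = n*D / (4*(n - 1)).
n_water = 1.33
drop_d = 3e-3   # m, assumed drop diameter
screen = 1.0    # m, assumed distance to the wall

f = n_water * drop_d / (4 * (n_water - 1))
magnification = screen / f
print(f"f = {f * 1000:.1f} mm, magnification ~ {magnification:.0f}x")
```

A few hundred times magnification from a drop of pond water and a £1.50 laser is not bad going, which is why the wriggling specks are so easy to see on the wall.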

I used single layer corrugated card. With careful attention to the laser cutter settings you can cut through just one wall at the folds making a very smart hinge.

And now for the health and safety. Lasers are dangerous. Laser pens can cause damage so treat them with respect. Look at what the laser is pointing at, not the laser itself.

Parts list:
1 A4 sheet of single wall corrugated card about 1.5mm thick
1 laser pen (about £1.50 on eBay)
1 10 ml syringe with a Luer tip (not Luer lock or it won't fit down the hole)
You will need a laser cutter or be very good at cutting out with a craft knife.


The first cut out. This used cardboard from an old shoe box. 

Ready to go. The cut cardboard. 

Folding the cardboard is easy with the part through cuts along the fold lines.

Before assembling the box, push the laser pen through the pen holder holes. The inner one has the cuts 'backward', so ensure you push it through in the right direction. Having a star cut allows for some 'wriggle' to get the laser nicely aligned.


One side folded. The tabs slot through then bend to lock in place. It is quite robust. I've made some slight adjustments to the design shown here to make it easier to insert the tabs (rounded corners etc.).

A 10 ml syringe fits nicely. If you want to use a needle, this can just pierce the cardboard directly. The design includes a hole for the tapered tip.

In action. The peg is keeping the laser pen on and there is a scrap of cardboard rolled up to hold it in the right place (just off the bottom of the photo).

Please do not look in this end. This was viewed with the camera, not my eye. There are windows both sides for aligning the drop and the laser. I will repeat 
DO NOT LOOK INTO THE LASER WITH YOUR REMAINING EYE.

And here is a picture - sludge from the bottom of the garden water butt. There is some diffraction, but movement of the particles can be clearly seen.  

This will happily project onto a wall or large piece of card. Remember to not look into the laser.

It's the little things that grind you down.

I am a big fan of rubrics. They help to structure assessments and direct the activity towards the learning outcome being assessed. We, like many higher education institutions, are big users of TurnItIn, the online assessment workbench. This has embedded rubrics and a suite of rubric management tools. We use rubrics to ensure clear marking and assessment guidelines, and consistency between different markers.

Unfortunately the rubric management tools are not the most user friendly, and many of my colleagues (and I) prefer to use a spreadsheet to edit and update our rubrics. Doing this online through the Rubric Manager is tedious, and often frustrating when it times out and loses all your work.

TurnItIn has spreadsheet upload facilities but not download. The export from the rubric manager is in a JSON-like format and not particularly accessible for the lay person to amend.

I have a need to edit and share rubrics, so I have put together a simple Python script that converts the .rbc file you export from TurnItIn into a .csv that can be viewed/edited in Excel. Please note that this is not quite the format used for spreadsheet uploading, but amending the spreadsheet to the required format is quite straightforward.
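The overall shape of such a conversion is simple, since the .rbc export is JSON underneath. The sketch below is not the linked script, and the field names ('criteria', 'cells', 'description') are assumptions for illustration; a real export may nest things differently, so inspect your own file first.

```python
import csv
import json

# Hypothetical sketch of an rbc-to-csv conversion. The keys used here
# ('criteria', 'cells', 'description') are assumed, not documented:
# open a real .rbc export in a text editor to find the actual structure.
def rbc_to_csv(rbc_path, csv_path):
    with open(rbc_path) as f:
        rubric = json.load(f)  # .rbc exports are JSON-like text
    with open(csv_path, 'w', newline='') as f:
        writer = csv.writer(f)
        # One row per criterion: its description, then each scale cell.
        for criterion in rubric.get('criteria', []):
            writer.writerow(
                [criterion.get('description', '')]
                + [cell.get('description', '')
                   for cell in criterion.get('cells', [])])
```

From there it is a short step in Excel to rearrange the columns into whatever layout the spreadsheet uploader wants.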

The script is a Jupyter notebook (runnable on any of our uni computers through the Jupyter notebook app) and can be found here. Sufficient instructions are included. Edit: we have noticed some issues with the Tk library on some computers, where the code hangs.

Jupyter notebooks can be run through the Anaconda Python distribution downloadable from http://anaconda.org

Let me know if you find it useful.

Tuesday, 31 July 2018

Oh no, not again!

It is the summer break in academia, and we are either taking our well-earned holidays or preparing material for the next session. A publisher has sent us a bunch of textbooks to look at - they hope we like them and will recommend them to the students on our courses.
Disclaimer: whilst we are provided with the textbooks at no personal cost, there are no inducements to promote a particular book or publisher. Each book stands (or falls) on its own merits.

I have an interest in improving numeracy and basic data literacy amongst our students, so one of the titles appealed to me. It covered basic arithmetic through various applications in the lab, and one of those specific applications is enzyme kinetics.

The Michaelis-Menten equation is one of the foundations of kinetic analysis, and the results can be represented in various ways. One of the classical methods was developed by Lineweaver and Burk in 1934. It is superficially attractive in that the data is transformed from a hyperbolic curve to a straight line, but it suffers from reciprocal scaling of the errors. As it is a double reciprocal plot (plotting the inverse of one value against the inverse of the other), small errors in measurement at small values in the test tube become huge errors at huge values in the analysis. The plot has many uses for visualising the data, though - identification of the mechanism of inhibition is one of the classical examples given in introductory undergraduate lectures.

The problem comes when the plot is used for the determination of the constants (Vmax and Km) rather than their visualisation. Because of the reciprocal scaling of the errors, the estimates can be extremely unsafe (how much so we will get to in a moment). This was recognised by Lineweaver and Burk but this is typically left as an unquantified warning to students to be careful about their errors.

Since 1934 there have been other representations of the Michaelis-Menten equation that transform the data to linear forms. The Eadie-Hofstee and Hanes-Woolf plots both provide linear transformations, though interpretation and extraction of the parameters is marginally more complex than a simple reciprocal of the intercepts on the x and y axes.

More recently the advent of computational methods of direct fitting to the equation removes the issue of error scaling completely and there is no real excuse for not using the direct method in any serious experimental study.

I wanted to see just how well each method reconstructs the correct parameters, so I constructed a small simulation. Taking a fixed Km and Vmax I can calculate the expected values of V at given substrate concentrations. Needless to say, with ideal data every method retrieves the expected values perfectly.
In the lab, however, the world is not perfect. I therefore add fixed errors corresponding to measurement errors as one might find on reading a typical UV/Vis spectrometer, and proportional errors such as might be expected from pipetting. I repeat each experiment 10 times and calculate the mean/SD for the parameters. The calculations for the Lineweaver-Burk plot are performed with no error weighting, as might be done in a typical undergrad classroom.
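My simulation (linked below) is written in R, but the idea is simple enough to sketch in a few lines of Python. The constants, substrate range and noise level here are illustrative, not the values I used, and for brevity this sketch adds only a fixed error and uses a crude grid search in place of a proper nonlinear solver:

```python
import numpy as np

# Generate noisy Michaelis-Menten data, then recover Vmax and Km by
# (a) an unweighted Lineweaver-Burk line and (b) direct fitting.
# Km = 2, Vmax = 10 and noise sd = 0.2 are illustrative assumptions.
rng = np.random.default_rng(1)
Km_true, Vmax_true = 2.0, 10.0
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])  # substrate concentrations

def mm(S, Vmax, Km):
    return Vmax * S / (Km + S)

v = mm(S, Vmax_true, Km_true) + rng.normal(0, 0.2, S.size)  # fixed 'reading' error

# (a) Lineweaver-Burk: fit 1/v = (Km/Vmax)*(1/S) + 1/Vmax, unweighted
slope, intercept = np.polyfit(1 / S, 1 / v, 1)
lb_Vmax, lb_Km = 1 / intercept, slope / intercept

# (b) Direct fit on untransformed data: crude grid search minimising SSE
Vs = np.linspace(5, 15, 201)
Ks = np.linspace(0.5, 5, 181)
sse = [(np.sum((v - mm(S, V, K)) ** 2), V, K) for V in Vs for K in Ks]
_, fit_Vmax, fit_Km = min(sse)

print(f"LB:     Vmax={lb_Vmax:.2f}, Km={lb_Km:.2f}")
print(f"Direct: Vmax={fit_Vmax:.2f}, Km={fit_Km:.2f}")
```

Run this over many repeats, and add a proportional error term, and the spread of the Lineweaver-Burk estimates relative to the direct fit tells the whole story.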

The results are clear: Lineweaver-Burk gives extremely poor results - so much so that one would never consider it for any serious analysis.

The simulation can be found here (PDF) (R Source [latex]) if you want to work through it yourself. I recommend the RStudio IDE and you will need appropriate software for Sweave/Latex.

So back to the book. It provides a suitable treatment of the Michaelis-Menten equation, but the only experimental treatment for determining the kinetic constants is the Lineweaver-Burk plot, with a cursory half-sentence hinting that it may not be suitable. There is no consideration of alternative approaches. It is a pretty damning omission - the book is otherwise quite approachable and could be recommended, but serious errors that we would have to correct render it unacceptable. I'm not going to name the book or publisher, but I will be contacting the authors directly.

Friday, 24 June 2016

Using Map data in R

Maps are useful things and R is a useful thing, so it would seem natural to get them to work together. However, maps are far more complex than we might think, and dealing with them in R can be a bit interesting.

I started looking for map data to follow up a discussion on the correlation between votes for a far-right wing party (UKIP) and lack of exposure to immigrants. The recent UK elections have provided a wealth of data, and figures like these suggest that there may be a correlation.




Data for voting patterns in May 2014 can be obtained from the Electoral Commission at http://www.electoralcommission.org.uk/our-work/our-research/electoral-data
Data for the recent census can be retrieved from http://www.nomisweb.co.uk/census/2011/KS204EW which gives the country of birth at a local authority level.

The electoral data is an Excel file that is split and formatted in a way that makes it difficult to read directly into R. However, by spending a few minutes editing out the subheadings and copying the regional names to a new column, I have a tab-delimited file that gives me the electoral votes by local authority.


origins<-read.csv('bulk.csv',stringsAsFactors=F)
origins$geography <- toupper(origins$geography)
names(origins)[c(5:16)]<-c('All','UK','England','Northern.Ireland','Scotland','Wales','UK.Other','Ireland','EU','EU2001','EU.New','Other')

The Nomis data gives a good CSV file with appropriate headings, though these needed some rewriting to be more friendly in R. All the Local Authority names (origins$geography) have been capitalised to match the electoral data.

votes<-read.table('EPE-2014-Electoral-data.txt', header=T, sep='\t')
votes$Local.Authority<-toupper(votes$Local.Authority)
regions<-merge(votes,origins, by.x='Local.Authority', by.y='geography')

So this gives us the votes data and this can then be merged with the regional census data.
A plot is straightforward, as is a regression analysis.
UKIPvote<-100*regions$UKIP/regions[,4]
NonUK<-(regions$All.y-regions$UK)*100/regions$All.y
plot(UKIPvote~NonUK, xlim=c(0,60), ylim=c(0,60), xlab='% non-UK born',
       ylab='UKIP vote (%)', main='Anti-immigrant vote vs population composition')
abline(lm(UKIPvote~NonUK), col=2)

summary(lm(UKIPvote~NonUK))

Call:
lm(formula = UKIPvote ~ NonUK)

Residuals:
     Min       1Q   Median       3Q      Max 
-12.0187  -3.6868  -0.1728   3.3297  14.5318 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) 34.17057    0.67626   50.53   <2e-16 ***
NonUK       -0.49421    0.03117  -15.86   <2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 4.956 on 116 degrees of freedom
Multiple R-squared:  0.6843, Adjusted R-squared:  0.6816 
F-statistic: 251.4 on 1 and 116 DF,  p-value: < 0.00000000000000022

Looks good to me - fear of strangers leads to reactionary politics.
Now to recreate the maps. This is where it starts to get interesting.

Map data for the UK at local authority boundary level is not readily available. Datasets are at ward level or higher level regions. So I will need to read in the ward data and merge the local authorities so I can plot the electoral data. The frame of reference is latitude and longitude for the polygon coordinates.

ONS census ward (merged wards) data is available from http://www.ons.gov.uk/ons/guide-method/geography/products/census/spatial/2011/index.html. This downloads as a zip archive that unpacks into a folder containing multiple files with the same root filename. These are ARCGIS SHAPEfile format. Get the full resolution file as the generalised one fails to merge wards properly.
The R package maptools is used to read them. Whilst you are at it also install the libraries rgeos, rgdal and gpclib to process the data. Set your session to the directory containing the data files and read them in.
library(maptools)
library(rgeos)
library(rgdal)
library(gpclib)

GPClib is proprietary and licensed for non-commercial use. To enable it type

gpclibPermit()

wards <-readShapePoly("CMWD_2011_EW_BGC")

This gives the ward boundaries, and these can be plotted readily using the shapefile object's plot command.

plot(wards) # this could take a while, so try plotting a selection first
plot(wards[1:20, ])
This datafile contains the local authority name. However, we need to merge the polygons up to local authority level. That can be done by unifying individual polygons (dissolving, in shape-speak) into a new boundary list where the id is the local authority name.

lamap <- unionSpatialPolygons(wards, wards$LAD11NM)
But this gives an error. So I process the data local authority by local authority until I find the culprits:
for(x in levels(wards$LAD11NM)){
  try( {
    cat(x);
    lamap<-unionSpatialPolygons(wards[wards$LAD11NM==x,],    
        wards$LAD11NM[wards$LAD11NM==x]);
  })
}

This lists all the LA names as we go and the errors so it is clear to see which ones are providing problems. It does take a long time to process Wales - the coastline is quite challenging.

In total there were 5 local authorities that showed errors. I now have two options:

1. Fix the errors
2. Work round them by plotting everything except the problem ones and plotting them as wards.

I gave up on option 1.
For option 2, I can use the R subset command to select the good wards and merge those, leaving the bad wards to be plotted on top afterwards.

goodwards<-subset(wards, !(wards$LAD11NM== 'Bridgend'| wards$LAD11NM=='Gwynedd'| wards$LAD11NM=='Isle of Anglesey'| wards$LAD11NM=='Newcastle-under-Lyme'| wards$LAD11NM=='Oxford'))
badwards<-subset(wards, wards$LAD11NM== 'Bridgend'| wards$LAD11NM=='Gwynedd'| wards$LAD11NM=='Isle of Anglesey'| wards$LAD11NM=='Newcastle-under-Lyme'| wards$LAD11NM=='Oxford')

Now I can merge the good ones.

lamap<-unionSpatialPolygons(goodwards, goodwards$LAD11NM)

And plot

plot(lamap)
plot(badwards, add=T)

Now we'll work on colouring them by the data.

And I should update this for the EU referendum...

Monday, 1 December 2014

Moving House

This blog has moved from inside the University of Dundee VLE to outside, as accessing it was proving to be, shall we say, suboptimal. Now the whole world can see.

I will move most of the posts over in due course.