Searching for a module

When installing software on a High Performance Computing cluster, additional packages are often handled by the module system.

To list all the available modules, it is sufficient to type


module avail

Often one then retrieves a very long list of available modules, in alphabetical order. This is not very convenient if one is looking for a particular feature and does not really know how it has been categorised.

One may think that grep would suffice to filter the results. This is almost true: in order to use grep, one first needs to reformat the output of module avail into a single column with the -t option, redirect the standard error (file descriptor 2 in Bash) to the standard output (file descriptor 1, hence the redirection 2>&1), and then pipe the result into grep.

For example, if we want to search for all the modules containing “python” in their name we would type:


module avail -t 2>&1 | grep -i python

and we can even write a convenient script named modsearch in our ~/bin:


#!/bin/bash
module avail -t 2>&1 | grep -i "$1"

so that in the future we will just have to type


modsearch python

Experimental Evidence for a Structural-Dynamical Transition in Trajectory Space

Rattachai Pinchaipat is a crafty experimentalist (a PhD student in Bristol) with whom I have had the pleasure of working, within a Bristol-Mainz collaboration aimed at demonstrating experimentally the existence of phase transitions in trajectory space for supercooled liquids. Our work is going to appear in Physical Review Letters. Here is the preprint.

In Mainz, preliminary simulations of hard spheres in trajectory space (by Matteo Campo) sampled the tails of the probability distribution of time-integrated structural observables and predicted long non-Gaussian tails (the signature of a phase transition). In experiments, Rattachai managed to find an analogous signature by subsampling the trajectories of a rather large system.

The result demonstrates that the dynamical heterogeneities that characterise fragile glass-forming liquids can be read as the coexistence, in trajectory space, of different long-lived (metastable) stationary states: some are structureless, while others show the presence of extraordinarily extended and long-lived motifs. Crucially, the phase transition between the two states is not accessible in current experiments, and may never be accessible if it always remains buried in the tails of the probability distribution.

But this is another story… (actually, this story).

From Glass Formation to Icosahedral Ordering by Curving Three-Dimensional Space

Our work on curved space and frustration with Gilles Tarjus at the Université Pierre et Marie Curie in Paris is going to be published in Physical Review Letters 118, 215501.

In it, we provide a test of one of the competing theories for the origin of the glass transition: geometric frustration, i.e. the idea that the slowing down observed in glass-forming liquids goes hand in hand with the formation of particular non-crystalline geometric motifs that increase in size as the liquid is cooled.

We test this on the most favourable ground for the theory: a curved manifold. We do this for the first time in three dimensions, observing the structural evolution of a glass former on the surface of a sphere embedded in four dimensions. (This is a funny space to work in: a beautiful way to visualise such a hypersurface is the so-called two-ball construction, see the image above, which nicely matches the vision of the universe held by Dante and his teacher Brunetto Latini.)

What we find is that geometrical motifs become gradually unfrustrated as the curvature increases (which is compatible with the basic assumptions of geometric frustration), and ordered phases (with some tricky defects, which we discuss in the Supplemental Material) spontaneously form at low enough temperatures. However, the size of the domains formed by such motifs is tightly coupled to the slowing down only at very strong curvatures, making geometric frustration just one of the mechanisms that may play a role in realistic glass-forming fluids (which exist in ordinary Euclidean space).

Binary Crystals and Kinetic Traps

During an invited talk at the University of Bath on 3 May 2017, I had the chance to discuss the work we have done in Bristol on binary crystals and binary mixtures.

The story, which you can find in the following slides, covers experimental and numerical aspects of the formation and dissolution of binary crystals:

  • The routes to the formation of interstitial solid solutions (work with I. Rios de Anda).
  • The role of compositional frustration (work with P. Crowther), published here.
  • The emergence of a dynamical transition under mechanical deformation of the crystals (work with E. Brillaux).

 

Effects of vertical confinement on gelation and sedimentation of colloids

Disordered systems under confinement may show very specific properties, such as enhanced density fluctuations or flow instabilities.

Azaima Razali (Bristol) and Christopher Fullerton (Bath, now in Montpellier) have performed experiments and simulations on the effect of extreme confinement on colloidal gels, and their work (to which I had the pleasure of contributing) has just been published in Soft Matter.

The notable result is that while gelation is often employed in bulk systems in order to slow down sedimentation, in strongly confined systems the opposite appears to be true, with sedimentation facilitated by the formation of a percolating network.

The full article can be found here:

A. Razali, C. J. Fullerton, F. Turci, J. E. Hallett, R. L. Jack and C. P. Royall, Effects of vertical confinement on gelation and sedimentation of colloids, Soft Matter (2017), doi:10.1039/C6SM02221A

 

Local structure of percolating gels at very low volume fractions

We have recently published in The Journal of Chemical Physics a study resulting from the work of a Master's student in Bristol Chemistry: via numerical simulations, we explore the very low volume fraction regime of a colloidal gel and find striking structural signatures related to the compactness of the gel arms. Moreover, we find that the only true limit to gel formation is the accessible observation time.

Full reference: S. Griffiths, F. Turci and C. P. Royall, The Journal of Chemical Physics 146, 014905 (2017); doi: http://dx.doi.org/10.1063/1.4973351

 

Box Counting in Numpy

The fractal dimension of an object is a single scalar number that quantifies how compact the object is, i.e. how wiggly a curve is, how wrinkled a surface is, or how porous a complex volume is.

However, estimating the fractal dimension of an object is not an easy task, and many methods exist. The simplest method is box counting: the idea is to fully cover the object with many boxes of a given size, count how many boxes are needed to cover the object and repeat the process for many box sizes. The scaling of the number of boxes covering the object with the size of the boxes gives an estimate for the fractal dimension of the object.

The algorithm has many limitations, but in its simplest form it can be easily implemented in Python. The idea is simply to bin the object in a histogram with variable bin sizes. This can be easily generalised to any number of dimensions, thanks to NumPy's histogramdd function.

In the following code, I test the idea on a known fractal, the Sierpinski triangle, which has an exact (Hausdorff) fractal dimension of log(3)/log(2) ≈ 1.585.

import numpy as np
import pylab as pl

def rgb2gray(rgb):
    # standard luminance-weighted conversion from RGB to grayscale
    r, g, b = rgb[:,:,0], rgb[:,:,1], rgb[:,:,2]
    gray = 0.2989 * r + 0.5870 * g + 0.1140 * b
    return gray

image=rgb2gray(pl.imread("Sierpinski.png"))

# finding all the non-zero pixels
pixels=[]
for i in range(image.shape[0]):
    for j in range(image.shape[1]):
        if image[i,j]>0:
            pixels.append((i,j))

Lx=image.shape[1]
Ly=image.shape[0]
print(Lx, Ly)
pixels=pl.array(pixels)
# pl.plot(pixels[:,1], pixels[:,0], '.', ms=0.01)
# pl.show()
print(pixels.shape)

# computing the fractal dimension
# considering only scales in a logarithmic list
scales=np.logspace(0.01, 10, num=10, endpoint=False, base=2)
Ns=[]
# looping over several scales
for scale in scales:
    print("======= Scale:", scale)
    # computing the histogram: pixels[:,0] runs over the rows (0..Ly) and
    # pixels[:,1] over the columns (0..Lx); the edges extend one bin past
    # the image so that the boxes fully cover it
    H, edges=np.histogramdd(pixels, bins=(np.arange(0,Ly+scale,scale),np.arange(0,Lx+scale,scale)))
    Ns.append(np.sum(H>0))

# linear fit, polynomial of degree 1
coeffs=np.polyfit(np.log(scales), np.log(Ns), 1)

pl.plot(np.log(scales),np.log(Ns), 'o', mfc='none')
pl.plot(np.log(scales), np.polyval(coeffs,np.log(scales)))
pl.xlabel(r'log $\epsilon$')
pl.ylabel('log N')
pl.savefig('sierpinski_dimension.pdf')

# the fractal dimension is the OPPOSITE of the fitting coefficient
print("The Hausdorff dimension is", -coeffs[0])
np.savetxt("scaling.txt", np.column_stack((scales, Ns)))
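
Since histogramdd accepts data in any number of dimensions, the same recipe carries over directly to three-dimensional objects. The snippet below is only a minimal sketch of the idea applied to a synthetic point cloud (the array points and the chosen box sizes are placeholders, not data from this post); a cloud uniformly filling the unit cube should return a dimension close to 3.

import numpy as np

# placeholder data: a dense 3D point cloud filling the unit cube
rng = np.random.default_rng(0)
points = rng.random((50000, 3))

eps_values = 1.0/2**np.arange(1, 5)   # box edge lengths: 1/2, 1/4, 1/8, 1/16
Ns = []
for eps in eps_values:
    # identical binning along x, y and z, covering the whole unit cube
    bins = [np.arange(0, 1 + eps, eps)]*3
    H, _ = np.histogramdd(points, bins=bins)
    Ns.append((H > 0).sum())

# the box-counting dimension is minus the slope of log N versus log eps
slope, intercept = np.polyfit(np.log(eps_values), np.log(Ns), 1)
print("estimated box-counting dimension:", -slope)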

Clustering and periodic boundaries

Clustering in Python can be nicely done using the statistical tools provided by the sklearn library.

For example, the DBSCAN method implements a clustering algorithm that detects connected regions, given a maximum distance between two elements of a cluster.

However, the library does not natively support periodic boundaries, which can sometimes be annoying. An easy workaround can be found precisely by exploiting the power of the library: methods like DBSCAN can take a precomputed distance matrix as input, and the clustering is then computed on it.

The workaround is therefore to compute a distance matrix that incorporates the periodic boundaries. The easiest way I have found is to use the scipy function pdist on each coordinate separately, correct the one-dimensional distances for the periodic boundaries, and then combine the results into a distance matrix (in square form) that can be digested by DBSCAN.

The following example may give you a better feeling of how it works.

import pylab as pl
from sklearn.cluster import DBSCAN
from scipy.spatial.distance import pdist,squareform

# box size
L=5.
threshold=0.3
# create data
X=pl.uniform(-1,1, size=(500,2))
# create four corners: shift the blob towards a corner of the box
# and wrap it back inside [-L/2, L/2)
X+=L*0.5
X[X>L*0.5]-=L

# finding clusters, no periodic boundaries
db=DBSCAN(eps=threshold).fit(X)

pl.scatter(X[:,0], X[:,1],c=db.labels_, s=3,edgecolors='None')
pl.figure()

# 1) find the correct distance matrix
for d in range(X.shape[1]):
    # find all 1-d distances
    pd=pdist(X[:,d].reshape(X.shape[0],1))
    # apply boundary conditions
    pd[pd>L*0.5]-=L
    
    try:
        # sum
        total+=pd**2
    except NameError:
        # or define the sum if not previously defined
        total=pd**2
# transform the condensed distance matrix...
total=pl.sqrt(total)
# ...into a square distance matrix
square=squareform(total)
db=DBSCAN(eps=threshold, metric='precomputed').fit(square)
pl.scatter(X[:,0], X[:,1],c=db.labels_,s=3, edgecolors='None')
pl.show()

[Figure: clusters found before applying the periodic boundaries (Lx=Ly=5)]

[Figure: clusters found after applying the periodic boundaries (Lx=Ly=5)]
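
For reference, the same kind of periodic distance matrix can also be built in a single pass with NumPy broadcasting, instead of looping over the coordinates with pdist. The following is only a sketch of this alternative (it is not the code used above), assuming the points sit in an (N, d) array X inside a cubic box of side L:

import numpy as np
from sklearn.cluster import DBSCAN

def periodic_distance_matrix(X, L):
    # pairwise coordinate differences, shape (N, N, d)
    diff = X[:, None, :] - X[None, :, :]
    # minimum image convention, applied to every dimension at once
    diff -= L*np.round(diff/L)
    # full (N, N) Euclidean distance matrix
    return np.sqrt((diff**2).sum(axis=-1))

# usage with the same box size and threshold as above, on a single random blob
L, threshold = 5.0, 0.3
X = np.random.uniform(-1, 1, size=(500, 2))
db = DBSCAN(eps=threshold, metric='precomputed').fit(periodic_distance_matrix(X, L))
print("clusters found:", len(set(db.labels_)) - (1 if -1 in db.labels_ else 0))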

Concatenate pdfs from the Terminal

Oftentimes it can be convenient to merge different PDF documents in order to get a single, continuous document that can be easily sent via mail for review or correction.

If one has just a few documents, this can be done directly through the Preview.app application on the Mac, but for more documents (or when we want to repeat the merge many times) a command-line application can be very convenient.

On Linux, or on the Mac, poppler is the set of tools that does the trick (on the Mac you can install it with Homebrew).

In particular, you will find that the package includes a program called pdfunite. Its usage is straightforward: the input files are listed first, and the last argument is the name of the merged output file:

pdfunite file_in_a.pdf file_in_b.pdf file_in_c.pdf fileout.pdf

 

brew update --force

Homebrew is a very convenient package manager for Mac OS X. It makes the installation of numerous utilities and programs incredibly easy. It is based on a database of instructions (Ruby formulae) that is kept up to date using Git.

Keeping the database up-to-date is normally done with

brew update

Sometimes, however, it can fail. It has already happened to me a few times that I was unable to retrieve the latest version of the database, making it impossible to install new software.

If the internal diagnostic tool

brew doctor

is not sufficient for identifying and solving the issue, there is a way to force the update. As indicated on these pages, one can use Git directly and recover the database:

cd `brew --prefix`
git remote add origin https://github.com/mxcl/homebrew.git
git fetch origin
git reset --hard origin/master