Smartphones and sensors have produced a treasure trove of pictures, many tagged with labels identifying their content. Using this vast database of cross-referenced images, convolutional neural networks and other machine learning methods have revolutionized the ability to quickly identify natural images that look like ones previously seen and catalogued.

These methods “learn” by tuning a stunningly large set of hidden internal parameters, a process guided by millions of tagged images that requires large amounts of supercomputer time. In many fields, however, such a database is an unachievable luxury. Biologists record cell images and painstakingly outline the borders and structure by hand; it’s not unusual for one person to spend weeks creating a single, fully three-dimensional image.

Researchers developed an efficient set of mathematical “operators” that can greatly reduce the number of parameters. These operators can naturally incorporate key constraints that aid identification, such as requirements that shapes and patterns be scientifically plausible. Rather than relying on the tens or hundreds of thousands of images used by typical machine learning methods, this new approach learns much more quickly from far fewer images.

The Mixed-Scale Dense Convolution Neural Network (MS-D) requires far fewer parameters than traditional methods, converges quickly, and has the ability to “learn” from a remarkably small training set. To build a small training set for an MS-D network, images were reconstructed from 1,024 acquired X-ray projections, yielding images with relatively low noise. Noisy images of the same object were then obtained by reconstructing from only 128 projections. During training, the noisy images served as input and the corresponding low-noise images as target output. The trained network was then able to take noisy input data and effectively remove the noise.
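The supervised setup described above can be sketched in a few lines. The MS-D network itself is far too heavy for a snippet, so a one-parameter linear denoiser stands in for it here; the image, the noise model (noise grows as the projection count shrinks), and all sizes are illustrative assumptions, not the published experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "ground truth": a smooth image standing in for a
# 1,024-projection, low-noise reconstruction (made-up values).
n = 64
xx, yy = np.meshgrid(np.linspace(0, np.pi, n), np.linspace(0, np.pi, n))
clean = np.sin(xx) * np.sin(yy)

# Stand-in for the 128-projection reconstruction of the same object:
# identical structure, much more noise.
noisy = clean + 0.3 * rng.standard_normal((n, n))

def box_blur(img):
    """3x3 box filter with edge padding."""
    p = np.pad(img, 1, mode="edge")
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

# One-parameter "network": subtract a learned fraction of the
# high-frequency residual.  Fitting alpha by least squares against
# the clean target mimics training on (noisy input, low-noise target)
# pairs, as in the text above.
d = noisy - box_blur(noisy)          # high-pass residual
r = noisy - clean                    # error to the target
alpha = np.sum(r * d) / np.sum(d * d)
denoised = noisy - alpha * d

mse_in = np.mean((noisy - clean) ** 2)
mse_out = np.mean((denoised - clean) ** 2)
print(f"alpha={alpha:.3f}  mse noisy={mse_in:.4f}  denoised={mse_out:.4f}")
```

Even this trivial model reduces the error on its training pair; the MS-D network plays the same role with vastly more expressive power, which is why it can generalize from such a small set of paired images.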

For more information, contact Jon Bashor, 510-486-5849.