Title: Large Scale Sketch-based Image Retrieval
Speaker: Sarthak Parui (IITM)
Details: Tue, 17 Feb, 2015 4:30 PM @ BSB 361
Abstract: The explosive growth of publicly available digital images highlights the need for accurate, efficient and user-friendly techniques to index and retrieve images from large-scale multimedia databases. With the proliferation of touch-based smart computing devices, and the consequent ease of querying images via hand-drawn sketches on touch screens, sketch-based image retrieval has emerged as an interesting problem. The standard way of querying images via text can be imprecise and ambiguous due to the presence of synonymous, polysemous, spam and social tags. Sketch-based image retrieval, on the other hand, being an expressive and interactive way of searching for images, may yield better results either alone or in conjunction with other retrieval mechanisms such as text. However, currently available systems are quite rigid in their sketch-to-image matching and cannot retrieve images in which the sketched object appears at a different location, scale and/or orientation.
In this talk, we address the above issue by proposing an efficient and “similarity-invariant” approach for sketch-based image retrieval from millions of images. To make online retrieval fast, each database image is preprocessed and represented as a variable-length descriptor. These descriptors are indexed in a hierarchical k-medoids tree structure and efficiently searched for matches using a Dynamic Programming-based approximate substring matching algorithm. Furthermore, a geometric verification mechanism is devised for on-the-fly elimination of false positives. Extensive experiments performed on a 1.2-million-image database demonstrate significant performance improvement over existing methods.
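The talk does not spell out the exact matching algorithm, but the general idea of Dynamic Programming-based approximate substring matching it names can be illustrated with a minimal sketch. The function below (a textbook formulation, not the authors' implementation, and shown on character strings rather than image descriptors) computes the minimum edit distance between a query pattern and any substring of a longer sequence, by initializing the first DP row to zero so the match may begin anywhere and taking the minimum over the last row so it may end anywhere:

```python
def approx_substring_distance(pattern, text):
    """Minimum edit distance between `pattern` and any substring of `text`.

    Hypothetical illustration of DP-based approximate substring matching:
    insertions, deletions and substitutions each cost 1. The first DP row
    is all zeros (a match may start at any position in `text`) and the
    answer is the minimum over the last row (it may end anywhere).
    """
    m, n = len(pattern), len(text)
    prev = [0] * (n + 1)              # row i = 0: match may start anywhere
    for i in range(1, m + 1):
        curr = [i] + [0] * n          # column j = 0: i pattern chars unmatched
        for j in range(1, n + 1):
            cost = 0 if pattern[i - 1] == text[j - 1] else 1
            curr[j] = min(prev[j - 1] + cost,  # match / substitute
                          prev[j] + 1,         # skip a pattern element
                          curr[j - 1] + 1)     # skip a text element
        prev = curr
    return min(prev)                  # best match ending at any position
```

For example, `approx_substring_distance("abc", "xxabcxx")` is 0 (an exact substring match), while a one-character mismatch in the pattern yields a distance of 1. In the retrieval setting described above, the "characters" would instead be elements of the variable-length image descriptors, compared under a suitable similarity measure.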