I-SEARCH - A unified framework for multimodal content search


About

Recent advances in ICT have made seamless computing and advanced networking technologies affordable, paving the way for the emergence of an Ambient Intelligence environment in which the user is unobtrusively and transparently immersed. As a result, users can interact with such an environment anywhere and at any time, and the distinction between real and virtual artefacts no longer matters.

The I-SEARCH project aims to provide a novel unified framework for multimodal content indexing, sharing, search and retrieval. The I-SEARCH framework will handle several types of multimedia and multimodal content (text, 2D images, sketches, video, 3D objects and audio) alongside real-world information; any of these can be used as a query to retrieve relevant content of any of the aforementioned types.
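As an illustration only, the sketch below shows one possible way such a mixed query could be represented in code. The class and field names (Modality, QueryItem, MultimodalQuery) are assumptions made for this example and are not part of the I-SEARCH specification.

```python
# Hypothetical sketch of a multimodal query; all names here are illustrative
# assumptions, not the I-SEARCH project's actual API.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List


class Modality(Enum):
    TEXT = auto()
    IMAGE_2D = auto()
    SKETCH = auto()
    VIDEO = auto()
    OBJECT_3D = auto()
    AUDIO = auto()
    REAL_WORLD = auto()  # e.g. a GPS position or sensor reading


@dataclass
class QueryItem:
    modality: Modality
    payload: bytes        # raw content or a precomputed feature descriptor
    weight: float = 1.0   # relative importance within the combined query


@dataclass
class MultimodalQuery:
    """A query mixing several modalities, matched against content of any indexed type."""
    items: List[QueryItem] = field(default_factory=list)

    def add(self, modality: Modality, payload: bytes, weight: float = 1.0) -> None:
        self.items.append(QueryItem(modality, payload, weight))


# Example: combine a text keyword, a 2D sketch and a location reading in one query.
query = MultimodalQuery()
query.add(Modality.TEXT, b"baroque church")
query.add(Modality.SKETCH, b"<sketch bytes>", weight=0.5)
query.add(Modality.REAL_WORLD, b"41.9028,12.4964")
```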

The search engine will be highly user-centric: only content of interest will be delivered to end-users, satisfying their information needs and preferences. Furthermore, novel visualisation schemes will present the retrieved results in an optimal way, an approach expected to significantly improve the end-user experience. Finally, the search engine will adapt dynamically to the end-user's device, which can range from a simple mobile phone to a high-performance PC.
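A minimal sketch of what such device adaptation could look like is given below; the capability fields, thresholds and visualisation names are assumptions for illustration and do not reflect the project's actual adaptation logic.

```python
# Hypothetical sketch of device-adaptive result presentation; profile fields
# and thresholds are assumptions, not part of the I-SEARCH specification.
from dataclasses import dataclass


@dataclass
class DeviceProfile:
    screen_width_px: int
    supports_3d: bool


def choose_visualisation(device: DeviceProfile) -> str:
    """Pick a result-visualisation scheme that matches the device's capabilities."""
    if device.supports_3d and device.screen_width_px >= 1280:
        return "3d-cluster-view"   # rich view for high-performance PCs
    if device.screen_width_px >= 600:
        return "2d-grid-view"      # tablets and mid-range devices
    return "ranked-list-view"      # simple mobile phones


print(choose_visualisation(DeviceProfile(screen_width_px=360, supports_3d=False)))
```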


© 2010 I-SEARCH Project Consortium