
dc.contributor.author: Rababaah, Aaron
dc.date.accessioned: 2017-10-02T07:39:06Z
dc.date.available: 2017-10-02T07:39:06Z
dc.date.issued: 2017
dc.identifier.uri: http://hdl.handle.net/11675/2989
dc.description.abstract: Data are collected by multi-modality sensors to detect and characterize the behavior of entities and events in a given situation. Transforming multi-modality sensor data into useful, actionable information requires a robust data fusion model. Such a model should be able to acquire data from multi-agent sensors and exploit the spatio-temporal characteristics of multi-modality sensors to improve situational awareness, in particular by assisting with soft fusion of multi-threaded information from a variety of sensors under task uncertainties. This book presents a novel image-based model for multi-modality data fusion. The concept of this fusion model is biologically inspired by the human brain's perceptual model. Just as the human brain has designated regions that map immediate sensory experiences and fuse heterogeneous sensory perceptions into a situational understanding for decision-making, the proposed image-based fusion model follows an analogous data-to-information fusion scheme for actionable decision-making in intelligent surveillance systems.
dc.publisher: Saarbrücken, Germany: Scholar's Press
dc.title: A Novel Image-based Model for Data Fusion in Intelligent Surveillance Systems
dc.type: Book
dc.identifier.url: https://www.amazon.com/Novel-Image-based-Fusion-Surveillance-Systems/dp/3330651539

