A Showcase of Taught MSc Dissertations
The O’Reilly Institute, Trinity College, Dublin 2
Friday 2nd September from 4 p.m. to 6 p.m.
This event provides an opportunity for business, industry, the public sector and academia to link with graduates who are showcasing their research on five MSc programmes in the School of Computer Science and Statistics: Interactive Entertainment Technology; Mobile and Ubiquitous Computing; Networks and Distributed Systems; Interactive Digital Media and Health Informatics. It also offers an opportunity for industry to interact with academics and researchers identifying possible future collaborations.
Graduates on these programmes have gained experience with the latest tools and technologies and have been exposed to cutting-edge research being conducted within the School. The nature of the work undertaken equips these graduates with transferable skills relevant to careers in the global technology industry. As part of their MSc studies, students undertake an individual research project leading to the submission of a dissertation of publishable standard. This dissertation reflects an independent piece of exploratory research which contributes significantly to the advancement of technology and involves design, development and evaluation.
The schedule and a selection of the student projects are outlined below:
4:00 p.m. - 4:40 p.m.
- Opening: Dr. Siobhan Clarke
- Professor Veronica Campbell, Dean of Graduate Studies
- Outline of MSc programmes: Dr. Siobhan Clarke, outgoing Director of Postgraduate Teaching and Learning and Director of MSc in Computer Science (Networks & Distributed Systems)
- Keynote speaker: John Healy, Citi: Director & Head of Technology, Belfast
- Student "fast-forward" comprising one minute presentations of selected student posters
4:45 p.m. - 6:00 p.m.
- Poster presentation/interactive audio-visual installation and reception, at which students, academics and industry interact.
Selected Dissertation Topics 2010-11
Format: Poster Display & Interactive Audio-visual installation
Poster Display (Foyer, O’Reilly Institute)
A Driving Assistance System: Real time Speed Limit Signs Recognition System
Biased Mobile Auctions in Urban Environments
Collaborative Video Surveillance
College View: Android Participatory Sensing Framework
Context-Aware Prediction of Arrival Times using Sensor and Smartphone Data
Correlating Semantics and Expertise to Enhance Social Network Exploration
Crowdsourced Translation for Emerging Technologies
Cyber Foraging for Multimedia Processing Tasks
Evaluating the use of Volumetric Billboards in Games
Exploring the Educational Potential of Modern Mobile Games
FrisVee: A Gesture-Based Mobile Virtual Frisbee Game
Harvesting Home Area Network Knowledge from the Web
Icarus: Towards a lightweight mobile cloud
Improving User Cognitive Support for Interlinking Linked Data
Investigating Octree Generation for Interactive Animated Volume Rendering
Lightweight Wired Communication for Sensor and Actuator Arrays Using I2C Bus
Mobile Phone Paintball: A New Multilayer Gaming Concept
Monitoring Of Web Services Compositions Running On Different Cloud Infrastructures
Ontology-based asynchronous monitoring for urban-scale application adaptation support
Proactive Service Provider for the Smart Traveller Information System
Procedural Content Generation of Indoor Environments for WebGL
Procedural Generation of Large Scale Game-Worlds
Progressive Volume Rendering using WebGL and HTML5
Real Time Rendering of Interactive Raining Scenes on the GPU
Real-Time Wrinkles for Expressive Virtual Faces
Request/Response Protocol for MANETs
Semantically-aware Annotation of Historic and Literary Texts
Sensory Congruence in Augmented Reality
Slice-Oriented Programming for Resource-Constrained Target Environments
Smart Traffic Lights
Software as Service for Smarter Cities: an Approach to Cross-Layer Adaptation for Service-Oriented Applications
Synthesizing Realistic Human Motions Using Motion Graphs
TotTemp: A wireless data reporting system with environmental integration for infant health monitoring
Utilising Augmented Reality to create a Brand Interaction Application on Mobile Devices
Viewer-Aware Dynamic Advertising
Web standards-based collaboration software for the classroom
Interactive audio-visual installations (Large Conference Room, O’Reilly Institute)
Dublin Dental University Hospital
The students from the MSc Interactive Digital Media programme 2010/11 will present four projects around the theme of the Dublin Dental University Hospital, which will link to the website www.thedentalproject.org being launched on 6 September 2011.
Within this overall theme, four project groups have explored different aspects of the dental hospital, which is located on the TCD campus and is also open to the general public.
Bringing touch- and swipe-style interaction to the mouse, D Stories presents the experience of the Dublin Dental University Hospital from the perspective of the patient. Explore the patient's journey through panoramic environments and transition video.
Laughing Gas is a smartphone application which encourages users to confront their fears of dentistry by tailoring individual narrative experiences. Via the interface, users can choose a narrative path in which they hold control over the mood of the story, with a choice of 100 different possibilities.
The narratives take inspiration from the darker history of dentistry, and the audience is drawn into the project through QR codes discovered on the streets surrounding the Dublin Dental University Hospital. These access points also allow users to travel through the narratives geographically, discovering the content by navigating via the QR codes in the area.
Colony is an audio-visual project which samples overlooked elements from the Dublin Dental University Hospital in order to produce a web application that lies between generative and bio art.
These elements include light, humidity, temperature, sound vibrations, and bacteria.
Sampled elements are mutated through a process modelled on bacterial growth to produce audio-visual creations, which allow participants to become immersed in an abstract, playful environment and to experience new perceptions of the physical space.
Meaning in the Mouth
Meaning in the Mouth is an interactive audiovisual exploration of how the mouth can communicate meaning non-verbally. The Kinect motion-sensing device allows users to generate their own musical pieces via physical movement.
An event run in collaboration with Trinity Research & Innovation and Careers Advisory Service at Trinity College