Browsing by Subject "Human computer interaction"
Now showing 1 - 13 of 13
Item (Open Access): A clustering-based method to estimate saliency in 3D animated meshes (Elsevier Ltd, 2014). Bulbul, A.; Arpa, S.; Capin, T.
We present a model to determine the perceptually significant elements in animated 3D scenes using a motion-saliency method. Our model clusters vertices with similar motion-related behaviors. To find these similarities, for each frame of an animated mesh sequence, the vertices' motion properties are analyzed and clustered using a Gestalt approach. Each cluster is analyzed as a single unit, and representative vertices of each cluster are used to extract the motion-saliency values of each group. We evaluate our method with an eye-tracker-based user study in which we analyze observers' reactions to vertices with high and low saliency. The experimental results verify that our proposed model correctly detects the regions of interest in each frame of an animated mesh. © 2014 Elsevier Ltd.

Item (Open Access): Collaborative workspaces for pathway curation (CEUR-WS, 2016-08). Durupınar-Babur, F.; Siper, Metin Can; Doğrusöz, Uğur; Bahceci, İstemi; Babur, O.; Demir, E.
We present a web-based visual biocuration workspace focused on curating detailed mechanistic pathways. It was designed as a flexible platform where multiple humans, NLP agents, and AI agents can collaborate in real time on a common model using an event-driven API. We will use this platform to explore disruptive technologies that can scale up biocuration, such as NLP, human-computer collaboration, crowd-sourcing, alternative publishing, and gamification. As a first step, we are designing a pilot that adds an author-curation step to scientific publishing, in which the authors of an article create formal pathway fragments representing their discovery, heavily assisted by computer agents. We envision that this "microcuration" use case will create an excellent opportunity to integrate multiple NLP approaches and semi-automated curation. © 2016, CEUR-WS.
All rights reserved.

Item (Open Access): A color-based face tracking algorithm for enhancing interaction with mobile devices (Springer, 2010-05). Bulbul, A.; Cipiloglu, Z.; Capin, T.
A color-based face tracking algorithm is proposed for use as a human-computer interaction tool on mobile devices. The solution provides a natural means of interaction, enabling a motion-parallax effect in applications. The algorithm considers the characteristics of mobile use: constrained computational resources and varying environmental conditions. The solution is based on color comparisons and works on images gathered from the front camera of a device. In addition to color comparisons, the coherency of the facial pixels is considered in the algorithm. Several applications are also demonstrated in this work, which use the face position to determine the viewpoint in a virtual scene or to browse large images. The accuracy of the system is tested under different environmental conditions, such as lighting and background, and the performance of the system is measured on different types of mobile devices. According to these measurements, the system allows for accurate (7% RMS error) face tracking in real time (20-100 fps). © Springer-Verlag 2010.

Item (Open Access): Eye tracking using Markov models (IEEE, 2004). Bağcı, A. M.; Ansari, R.; Khokhar, A.; Çetin, A. Enis
We propose an eye detection and tracking method based on color and geometrical features of the human face using a monocular camera. In this method, a decision is made on whether the eyes are closed or not, and the subject's gaze is determined using a Markov chain framework to model temporal evolution. The method can successfully track facial features even while the head assumes various poses, as long as the nostrils are visible to the camera. We compare our method with recently proposed techniques, and the results show that it provides more accurate tracking and robustness to variations in the view of the face.
A procedure for detecting tracking errors is employed to recover lost feature points in case of occlusion or very fast head movement. The method may be used for monitoring a driver's alertness and detecting drowsiness, and in applications requiring non-contact human-computer interaction.

Item (Open Access): A face tracking algorithm for user interaction in mobile devices (IEEE, 2009-09). Bülbül, Abdullah; Çipiloğlu, Zeynep; Çapin, Tolga
A new face tracking algorithm, and a human-computer interaction technique based on it, are proposed for use on mobile devices. The face tracking algorithm considers the limitations of the mobile use case: constrained computational resources and varying environmental conditions. The solution is based on color comparisons and works on images gathered from the front camera of a device. The face tracking system outputs a 2D face position that can be used to control different applications. Two such applications are also presented in this work: the first uses the face position to determine the viewpoint, and the second enables an intuitive way of browsing large images. © 2009 IEEE.

Item (Open Access): Fine arts perspective in user interface design (ACM, 2009). Kültür, Can; Veryeri Alaca, I.
In this poster, we first aim to explain an interdisciplinary approach and to examine the idea behind this attempt. Second, we aim to underline the challenges and enablers of such an attempt. The idea can be briefly summarized as follows: "inclusion of learning activities and assessments that are applied in coordination with the Department of Fine Arts might be necessary in terms of developing visual design skills".
The target of this approach is improving courses such as 'human-computer interaction' or 'user-interface design'.

Item (Open Access): HandVR: a hand-gesture-based interface to a video retrieval system (Springer UK, 2015). Genç, S.; Baştan, M.; Güdükbay, Uğur; Atalay, V.; Ulusoy, Özgür
Using one's hands in human-computer interaction increases both the effectiveness of computer usage and the speed of interaction. One way of accomplishing this goal is to use computer vision techniques to develop hand-gesture-based interfaces. A video database system is one application where a hand-gesture-based interface is useful, because it provides a way to specify certain queries more easily. We present a hand-gesture-based interface for a video database system to specify motion and spatiotemporal object queries. We use a regular, low-cost camera to monitor the movements and configurations of the user's hands and translate them into video queries. We conducted a user study to compare our gesture-based interface with a mouse-based interface on various types of video queries. The users evaluated the two interfaces in terms of different usability parameters, including ease of learning, ease of use, ease of remembering (memory), naturalness, comfort, satisfaction, and enjoyment. The user study showed that querying video databases is a promising application area for hand-gesture-based interfaces, especially for queries involving motion and spatiotemporal relations.

Item (Open Access): Mean-shift tracking of moving objects using multi-dimensional histograms (Society of Photo-Optical Instrumentation Engineers (SPIE), 2004-04). Cüce, Halil I.; Çetin, A. Enis
In this paper, a moving-object tracking algorithm for infrared image sequences is presented. The tracking algorithm is based on the mean-shift method, which compares the histograms of moving objects in consecutive image frames.
In video obtained under visible light, the color histogram of the object is used for tracking. In forward-looking infrared image sequences, the histogram is constructed not only from the pixel values but also from a highpass-filtered version of the original image. Highpass filter outputs are used in histogram construction to capture the structural nature of the moving object. Simulation examples are presented.

Item (Open Access): Real time hand gesture recognition for computer interaction (IEEE, 2014-04). Farooq, J.; Ali, Muhaddisa Barat
Hand gesture recognition is a natural and intuitive way to interact with the computer, since interactions with the computer can be enriched through the multidimensional use of hand gestures compared to other input methods. The purpose of this paper is to explore three different techniques for HGR (hand gesture recognition) using fingertip detection. A new approach called 'Curvature of Perimeter' is presented, with its application as a virtual mouse. The system presented uses only a webcam and algorithms developed with the computer vision, image, and video processing toolboxes of Matlab. © 2014 IEEE.

Item (Open Access): Real-time virtual fitting with body measurement and motion smoothing (Pergamon Press, 2014). Gültepe, U.; Güdükbay, Uğur
We present a novel virtual fitting room framework using a depth sensor, which provides a realistic fitting experience with customized motion filters, size adjustments, and physical simulation. The proposed scaling method adjusts the avatar, determines a standardized apparel size according to the user's measurements, and prepares the collision mesh and the physics simulation, with a total preprocessing time of 1 s. The real-time motion filters prevent unnatural artifacts due to noise from the depth sensor or self-occluded body parts. We apply bone splitting to realistically render the body parts near the joints.
All components are integrated efficiently to keep the frame rate higher than in previous works without sacrificing realism.

Item (Open Access): Smart computing for large scale visual data sensing and processing (Elsevier, 2016). Zhang, L.; Duygulu, P.; Zuo, W.; Shan, S.; Hauptmann, A.

Item (Open Access): Vision-based continuous Graffiti™-like text entry system (SPIE, 2004). Erdem, İ. A.; Erdem, M. E.; Atalay, V.; Çetin, A. Enis
Recent advances in electronics and the computer industry make it possible to build real-time, low-cost computer vision systems even on personal computers. For this reason, it is feasible to develop computer-vision-based human-computer interaction systems. A vision-based continuous Graffiti™-like text entry system is presented. The user sketches characters from a Graffiti™-like alphabet in a continuous manner on a flat surface using a laser pointer. The beam of the laser pointer is tracked in the image sequences captured by a camera, and the written word is recognized from the extracted trace of the laser beam. © 2004 Society of Photo-Optical Instrumentation Engineers.

Item (Open Access): Which shape representation is the best for real-time hand interface system? (Springer, Berlin, Heidelberg, 2009). Genç, Serkan; Atalay, V.
The hand is a very convenient interface for immersive human-computer interaction. Users can give commands to a computer by hand signs (hand postures, hand shapes) or hand movements (hand gestures). Such a hand interface system can be realized by using cameras as input devices and software for analyzing the images. In this hand interface system, commands are recognized by analyzing hand shapes and their trajectories in the images. Therefore, successful recognition of hand shape is vital and depends on the discriminative power of the hand shape representation. There are many shape representation techniques in the literature; however, none of them works properly for all shapes.
While a representation leads to good results for one set of shapes, it may fail on another. Therefore, our aim is to find the most appropriate shape representation technique for hand shapes to be used in hand interfaces. Our candidate representations are Fourier Descriptors, Hu Moment Invariants, Shape Descriptors, and Orientation Histograms. Based on widely used hand shapes for an interface, we compared the representations in terms of their discriminative power and speed. © 2009 Springer-Verlag.
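Hu Moment Invariants, one of the candidate shape representations named in the last abstract above, are simple enough to sketch in a few lines. The following is an illustrative NumPy sketch, not code from any of the listed papers: it computes the first two Hu invariants of a binary silhouette mask, which stay (approximately) constant under translation, scaling, and rotation of the shape. The function name and the square test mask are assumptions made for demonstration.

```python
import numpy as np

def hu_moments(mask):
    """First two Hu moment invariants of a binary shape mask.

    mask: 2D array of 0/1 values (e.g., a hand silhouette).
    Returns (phi1, phi2), invariant to translation, scale, and rotation.
    """
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                    # area (zeroth-order moment)
    x = xs - xs.mean()               # coordinates centered on the
    y = ys - ys.mean()               # centroid -> translation invariance

    # Normalized central moments: eta_pq = mu_pq / mu_00 ** (1 + (p+q)/2)
    # (the normalization gives scale invariance)
    def eta(p, q):
        return (x**p * y**q).sum() / m00 ** (1 + (p + q) / 2)

    phi1 = eta(2, 0) + eta(0, 2)                       # rotation invariant
    phi2 = (eta(2, 0) - eta(0, 2))**2 + 4 * eta(1, 1)**2
    return phi1, phi2

# A filled square and the same square rotated by 90 degrees
# should yield (numerically) identical invariants.
square = np.zeros((32, 32))
square[8:24, 8:24] = 1
rotated = np.rot90(square)
print(np.allclose(hu_moments(square), hu_moments(rotated)))  # True
```

In a hand-interface setting, such invariants would be computed on the segmented hand silhouette and fed to a classifier; the paper above compares this family of descriptors against Fourier Descriptors and Orientation Histograms on exactly that task.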