College of Science and Engineering
Permanent URI for this collection: https://hdl.handle.net/10877/17053
Browsing College of Science and Engineering by Type "Technical Report"
Now showing 1 - 20 of 26
ActionItem: Collaboration through Commitment and Social Tasking (2009-09-10)
Ngu, Anne H. H.; Gu, Qijun; Peng, Wuxu; Roberts, Mark

The relentless growth of global organizations and businesses requires collaboration among virtual teams that can be formed on demand and that cross institutional, geographical, and cultural boundaries. In this paper, we propose ActionItem, a Web 2.0 collaboration tool that fosters cooperation by leveraging the ideas of commitment, social tasking, and parallel blogging. We describe the prototype implementation of ActionItem and give a quantitative and qualitative evaluation of this collaboration management tool in terms of collaboration provenance, efficiency, and quality through case studies. The concept of social tasking for collaboration has been used successfully in many social networking sites. However, social networking tools do not manage the collaboration in a team, nor do they provide a collaborative model for objective measurement of how people work together. Workflow-based collaboration management tools have extensive management capability, but typically are built only for collaboration among a static group of participants with clearly designated roles within a fixed organizational structure, oblivious to any form of social network. Our results show that ActionItem is a nimble, inexpensive, and effective tool to support the collaboration required for loosely coupled virtual teams who do not share the same time and space.

An Effort-Based Approach to Measuring Software Usability (2008-10-21)
Mueller, Carl J.; Komogortsev, Oleg; Tamir, Dan; Feldman, Liam

An Objective Measure of Usability Using Effort Estimation. Design and implementation of usable human computer interface (HCI) systems involves expensive, primarily cognitive-based, usability testing and evaluation techniques.
This complicates the development process and may cause software companies and software engineers who are more familiar with objective testing methodologies to reduce or completely skip the usability testing stage, reverting to best-practice techniques and producing HCI systems that lack usability. This research is based on the assumption that the usability of HCI systems is directly related to the amount of mental and physical effort expended by the user throughout the interaction. It explores and exploits the utility of an objective, relatively easy to measure, engineering-oriented usability metric. A mathematical model of interaction effort is formulated. The model transforms data related to primitive interaction events, such as keyboard keystrokes, mouse clicks, and Mickeys traversed by the mouse, along with eye tracking data, into an effort metric. A carefully crafted set of user interaction goals employing scenario-based test design techniques is implemented. Data is collected using logging programs that record goal completion time along with keyboard, mouse, and eye interaction events. The recorded information is reduced to a statistically meaningful data set that is used to evaluate the validity of the research assumptions. Experimental results support the hypothesis. Furthermore, they prompt several interesting findings that merit further research and investigation. This is the first research that carries the intuitive idea of a relation between effort and usability all the way to the "field" by recording and processing effort-based metrics obtained from subjects while interacting with real, complex systems.

An Effort-based Framework for Evaluating Software Usability Design (2010-03-12)
Tamir, Dan; Mueller, Carl J.; Komogortsev, Oleg

One of the major stakeholder complaints is the usability of software applications.
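The effort-to-usability idea in the two items above can be illustrated with a small sketch. The field names, weights, and the split into "physical" and "mental" components below are hypothetical illustrations, not the authors' actual model.

```python
# Hypothetical sketch of an interaction-effort metric in the spirit of the
# effort-based usability work above. Weights and structure are assumptions.

from dataclasses import dataclass

@dataclass
class InteractionLog:
    keystrokes: int        # keyboard key presses
    mouse_clicks: int      # mouse button presses
    mickeys: int           # units of mouse travel (1 Mickey ~ 1/200 inch)
    gaze_path_deg: float   # total eye-gaze path length, degrees of visual angle
    duration_s: float      # goal completion time, seconds

def physical_effort(log: InteractionLog,
                    w_key=1.0, w_click=1.0, w_mickey=0.005) -> float:
    """Weighted count of manual interaction events (weights assumed)."""
    return (w_key * log.keystrokes
            + w_click * log.mouse_clicks
            + w_mickey * log.mickeys)

def mental_effort(log: InteractionLog, w_gaze=0.1) -> float:
    """Proxy for cognitive load from eye-movement activity (assumed)."""
    return w_gaze * log.gaze_path_deg

def effort(log: InteractionLog) -> float:
    """Total effort; lower values suggest a more usable interface."""
    return physical_effort(log) + mental_effort(log)

log = InteractionLog(keystrokes=42, mouse_clicks=7, mickeys=3000,
                     gaze_path_deg=250.0, duration_s=95.0)
print(round(effort(log), 2))
```

Comparing the same goal across two interface designs with such a metric gives the kind of objective, engineering-oriented comparison the abstract argues for.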
Although there is a rich body of material on good usability design and evaluation practice, software engineers may need an integrated framework facilitating effective quality assessment. A novel element of the framework presented in this paper is its effort-based measure of usability, providing developers with an informative model to evaluate software quality, validate usability requirements, and identify missing functionality. Another innovative aspect of this framework is its focus on learning in the process of assessing usability measurements and on building the evaluation process around Unified Modeling Language (UML) Use Cases. The framework also provides additional developer feedback through the notion of designer's and expert's effort, representing the effort necessary to complete a task. In this paper, we present an effort-based usability model in conjunction with a framework for designing and conducting the evaluation. Experimental results provide evidence of the framework's utility.

Automated Classification of Complex Oculomotor Behavior (2012-06-10)
Komogortsev, Oleg V.; Dai, Zanxun; Gobert, Denise V.

Complex oculomotor behavior in response to a simple step stimulus can include a variety of saccadic patterns, including combinations of normal saccades, simple/corrected/multi-corrected overshoots and undershoots, express saccades, dynamic overshoots, and compound saccades, depending on the state of the oculomotor plant and the neuronal control signal supplied by the brain. This paper presents an algorithmic framework that allows automated classification of such behavior. Automated classification results were compared to manually classified data used as a reference baseline. In addition, this work investigates the impact of various filtering methods and basic eye movement classification algorithms on the accuracy of classification of complex oculomotor behavior.
The proposed framework can be used in clinical examination of normal and abnormal visual systems.

Automatic Test Case Generation for Web Service Processes Using a SAT Solver (2009-02-16)
Radhakrishnan, Karthikeyann; Podorozhny, Rodion

Useful properties of web services, such as access from any platform, great interoperability with other web services, and the ability to combine several web services into a larger application relatively quickly, have made them an important category of software systems. One of the techniques used to increase the quality of software is testing. The adequacy of test cases and possible automation of the testing process greatly influence the quality of the produced software and the timeliness of the software development process. Even though a great deal of work has been done in adapting test case generation techniques to the peculiarities of web services (e.g. [11][12][13]), we believe our work makes a useful contribution in this area. This paper proposes a novel approach to generating test cases based on the process definition model of a web service. A process definition model defines a sequence of activities that can be performed by orchestrating the capabilities of a web service. A SAT solver (such as Alloy [10]) is used to extract the paths from the process definition model. These paths are used to generate test case specifications that will test all web service capabilities involved in a process. In our opinion, the main contribution of the work is an application of a static analysis method for generation of test cases for a web service, guided by a goodness metric of process coverage.

Biometric Identification via an Oculomotor Plant Mathematical Model (2009-11-19)
Komogortsev, Oleg; Jayarathna, Sampath; Aragon, Cecilia R.; Mahmoud, Mechehoul

There has been increased interest in reliable, non-intrusive methods of biometric identification due to the growing emphasis on security and the increasing prevalence of identity theft.
This paper presents a new biometric approach that involves estimation of the unique oculomotor plant (OP), or eye globe muscle, parameters from an eye movement trace. These parameters model individual properties of the human eye, including neuronal control signal, series elasticity, length tension, force velocity, and active tension. These properties can be estimated for each extraocular muscle and have been shown to differ between individuals. We describe the algorithms used in our approach and the results of an experiment with 41 human subjects tracking a jumping dot on a screen. Our results show improvement over existing eye movement biometric identification methods. The technique of using Oculomotor Plant Mathematical Model (OPMM) parameters to model the individual eye provides a number of advantages for biometric identification: it includes both behavioral and physiological human attributes, is difficult to counterfeit and non-intrusive, and could easily be incorporated into existing biometric systems to provide an extra layer of security.

Classification Algorithm for Saccadic Oculomotor Behavior (2010-05-04)
Komogortsev, Oleg; Gobert, Denise V.; Dai, Zanxun

This paper presents a detection algorithm that allows automatic classification of hypermetric and hypometric oculomotor plant behavior in cases when saccadic behavior of the oculomotor plant is assessed during the course of the step stimulus. Such behavior can be classified with a number of oculomotor plant metrics represented by the number of overshoots, undershoots, corrected undershoots/overshoots, and multi-corrected overshoots/undershoots. The algorithm presented in this paper allows for the automated classification of nine oculomotor plant metrics, including dynamic overshoots and express saccades. Data from sixty-five human subjects were used to support this experimental study.
The performance of the proposed algorithm was tested and compared to manual classification methods, resulting in a detection accuracy of up to 72% for several of the oculomotor plant metrics.

Clustering in the Cloud: Clustering Algorithms to Hadoop Map/Reduce Framework (2010-05-04)
Wang, Xuan

Cloud computing has gained increasing popularity over the years for its great potential. It is a logical and forward-thinking solution for addressing key business demands. Cloud computing truly represents what enterprise IT has always needed: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT's existing capabilities. This study investigates how clustering algorithms in data mining can benefit from running in the "Cloud".

Coverings of Finite Sets by Random Covers, with Applications to the HELP Protocol (2011-01-08)
Ogden, Robert D.

If we choose subsets of a finite set S at random, according to some specified distribution, how many subsets have to be chosen until S is covered? We solve this problem for distributions invariant under permutations of S by a Markov chain model and derive a useful estimate for a sufficient number of subsets to form a cover with a specified probability. These results are applied to the HELP network protocol [GGO] to estimate the number of times the server must send a set of file fragments to clients, not all of them helpful, in order to ensure that all the fragments are shared.

daspps: A Distributed Implementation of the aspps System (2005-05-11)
East, Deborah; High, Jason

We introduce daspps, a distributed general constraint system based on the aspps system, which uses a master-client paradigm for distributing independent theory segments.
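Returning to the random-cover question in the Coverings item above, the setup lends itself to a quick Monte Carlo sketch. The Bernoulli(p) subset distribution below (each element included independently with probability p) is one illustrative permutation-invariant choice, not the paper's general model, and the simulation stands in for the paper's Markov chain analysis.

```python
# Monte Carlo sketch: how many independently drawn random subsets of a
# finite set S are needed before their union covers S? Illustrative only.

import random

def draws_until_cover(n: int, p: float, rng: random.Random) -> int:
    """Draw subsets of {0..n-1} (each element kept with prob p) until covered."""
    uncovered = set(range(n))
    draws = 0
    while uncovered:
        draws += 1
        # an element becomes covered this draw if it lands in the subset
        uncovered = {x for x in uncovered if rng.random() >= p}
    return draws

def estimate_quantile(n: int, p: float, prob: float,
                      trials: int = 2000, seed: int = 1) -> int:
    """Estimate the smallest k with Pr[S covered within k draws] >= prob."""
    rng = random.Random(seed)
    samples = sorted(draws_until_cover(n, p, rng) for _ in range(trials))
    return samples[min(trials - 1, int(prob * trials))]

# e.g. 16 file fragments, each reaching a given client with probability 0.5:
k = estimate_quantile(n=16, p=0.5, prob=0.95)
print(k)  # estimated number of sends for 95% coverage probability
```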
We describe the advantages of basing a distributed system on the aspps system and of distributing only independent segments. The logic PS+ facilitates the modeling and modification of search problems. The grounder for the aspps system, psgrnd, instantiates a problem with the data for a specific instance. During instantiation, many of the constructs of the logic PS+ are maintained; thus the resulting aspps theory is smaller than the corresponding satisfiability theory. The aspps solver still has difficulties with scalability. As the size of problem instances increases, single-processor machines take too long to find solutions or exhaust their resources. The distributed implementation, daspps, increases the resources available, thus allowing us to solve problems we otherwise would not be able to solve. We describe how the daspps implementation minimizes communication overhead and increases robustness while minimizing redundant work by clients. We demonstrate the effectiveness of daspps by comparing results of executions by both aspps and daspps.

Enron Dataset Research: E-mail Relevance Classification (2009-09-25)
VanBuren, Victoria; Villarreal, David; McMillen, Thomas A.; Minnicks, Andrew L.

This paper discusses a probabilistic approach to the problem of searching through large amounts of data to find case-relevant documents. Using a valuable collection of data, the e-mail communications from Enron, an actual corporation, we train a Bayes-based text classifier to distinguish e-mails known to be case-relevant from those known to be case-irrelevant.

Facet-based Tetrahedralization Software (1995-04-12)
Hazlewood, Carol

Software for computing Delaunay tetrahedralizations is described. The software is designed to be used as part of a larger system and has a simple user interface. Numerically stable Householder transformations are used to implement orientation and insphere tests in floating-point arithmetic.
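The 3D orientation test just mentioned decides on which side of the plane through points a, b, c a fourth point d lies. The software described here implements it with numerically stable Householder transformations; the plain determinant formulation below is the common textbook sketch, not the author's implementation.

```python
# Textbook 3D orientation predicate: sign of det[[b-a],[c-a],[d-a]].
# Naive floating-point evaluation shown here; the report's point is that
# a Householder-based formulation gives better numerical stability.

def orient3d(a, b, c, d):
    """Return 1, -1, or 0: side of plane(a, b, c) on which d lies."""
    m = [[b[i] - a[i] for i in range(3)],
         [c[i] - a[i] for i in range(3)],
         [d[i] - a[i] for i in range(3)]]
    det = (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
         - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
         + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    return (det > 0) - (det < 0)

# d above the plane of the unit triangle:
print(orient3d((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)))  # 1
```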
A backward error analysis is done for the orientation test.

Fast Target Selection via Saccade-driven Methods (2012-06-10)
Komogortsev, Oleg V.; Ryu, Young Sam; Koh, Do Hyong

Three fast, saccade-driven target selection methods are explored. The first method selects a target at the beginning of a saccade, with the objective of providing target selection in a nearly constant amount of time regardless of the distance to the target. The second method selects a target at the end of a saccade. The third is a hybrid method combining the speed of saccade-driven selection with the accuracy of conventional dwell-time selection. Theoretical evaluation of the proposed methods, conducted via characteristics of the Human Visual System and a mathematical model of the human eye, indicates that the objective is tenable. Practical evaluation of the proposed methods is conducted with the multi-directional Fitts' law task and with a real-time eye-gaze-guided video game designed to simulate gaming environments where the selection speed of a target is of utmost importance. The results indicate that the proposed methods show increased throughput and task completion performance compared to the conventional dwell-time target selection method.

Flexible Scientific Workflows Using Frames and Dynamic Embedding (2007-04-01)
Ngu, Anne H. H.; Haasch, Nicholas; McPhilips, Timothy; Bowers, Shawn M.; Ludaescher, Bertram; Critchlow, Terence

The current approach to scientific workflow design in the popular open-source Kepler system is based on the actor-oriented framework, where concrete actors can be hierarchically composed and orchestrated by different directors (schedulers). A common assumption in this design framework is that the workflow is static and must be completely specified before orchestration. Such a static and monolithic workflow cannot respond to changing runtime conditions.
We present a flexible scientific workflow design that allows some tasks to be partially specified via abstract actors called Frames. The behavior of a frame is determined at runtime by the embedded concrete actor. We implemented a process of dynamic embedding that can be tailored to different selection policies and enables automatic construction of a subworkflow to execute the embedded component at runtime. Frames and dynamic embedding provide high-level abstractions for specifying workflows that enable flexible execution nested to any level.

Generating Large Prime Numbers Using the Perrin Sequence (2007-11-05)
Tamir, Dan

A new, computationally efficient, strong primality test procedure is proposed.

Introduction to Quantum Message Space (2005-11-02)
Ogden, Robert D.

Quantum Message Space is the quantum state space necessary to allow messages of arbitrary length unknown beforehand. In order for serial bit generation and deletion to be a unitary operation, it is necessary to extend the notion of the qubit in the context of the free group on two elements. A version of harmonic analysis on FG(2) is presented, and the possibility of quantum computation based on unitary bit creation and annihilation, together with averaging over the first bit, is considered.

Liberating TCP: The Free and the Stunts (2008-10-24)
Valdez, Jason; Guirguis, Mina

The performance of a TCP connection is typically dictated by what the network can provide rather than what the application would like to achieve. In particular, the Additive-Increase Multiplicative-Decrease (AIMD) mechanism employed by TCP hinders its ability to meet specific throughput requirements, since it has to respond to congestion signals promptly by decreasing its sending rate. The level and the timing of congestion signals impose strict limitations on the achievable throughput over short time-scales.
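The AIMD dynamics just described can be sketched in a few lines: roughly one segment of additive increase per round trip, and a halving of the congestion window on each congestion signal. This is a highly simplified model (no slow start, timeouts, or byte counting), included only to make the throughput limitation concrete.

```python
# Minimal sketch of TCP's AIMD congestion-window dynamics: additive
# increase per loss-free RTT round, multiplicative decrease on a loss.

def aimd(rounds_with_loss, cwnd=1.0, incr=1.0, decr=0.5):
    """Evolve the congestion window over RTT rounds; True marks a loss."""
    trace = []
    for loss in rounds_with_loss:
        if loss:
            cwnd = max(1.0, cwnd * decr)  # multiplicative decrease
        else:
            cwnd += incr                  # additive increase per RTT
        trace.append(cwnd)
    return trace

# 8 loss-free rounds, one congestion signal, then 2 more rounds:
print(aimd([False] * 8 + [True] + [False] * 2))
```

The sawtooth this produces is exactly why a single congestion signal caps short-time-scale throughput, and why delegating signals to "Stunt" connections can free up the "Free" connection.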
To that end, this paper presents a new architecture whereby a set of TCP connections (we refer to them as the Stunts) sacrifice their performance on behalf of another TCP connection (we refer to it as the Free) by picking up a delegated subset of the congestion signals and reacting to them in lieu of the Free connection. This gives the Free connection just enough freedom to meet specific throughput requirements as requested by the application, without affecting the level of congestion in the network. We present a numerical analysis, which we validate through extensive simulation experiments.

Metric-based Dynamic Process Adaptation for Crisis Mitigation (2008-05-16)
Podorozhny, Rodion; Georgakopoulos, Dimitrios; Ngu, Anne H. H.

Business processes emphasize the capture of best practices and the improvement of business activities. A dynamic process aims to deal with a dynamic environment where processes have to change and adapt in response to unexpected events and situations that have not been anticipated. Dynamic process change currently focuses on mechanisms and engines for dynamic process specification and execution. However, existing approaches do not ensure that dynamically adapted processes achieve the desirable performance and optimality goals set by the target application. In this paper, we suggest an approach that assembles a process customized for a particular crisis and dynamically modifies it, if necessary, to maintain an optimal execution.

Modeling Contract Bridge Bid Opening Strategies Using the aspps System (2006-06-30)
East, Deborah

Contract bridge is a popular and complex card game with two distinct phases: the auction (bidding) and play. The result of the bidding is a contract (the number of tricks to be taken during play and the trump suit). Success depends as much on the contract as on the play. The difficulty of bidding has resulted in the development of strategies for bidding.
In this paper, we model opening strategies based on variations of Precision and Standard American. The language of the logic PS+ is used for encoding data and defining the rules for the bridge opening program. We take advantage of the nonmonotonic properties of the language of the logic PS+ to model contract bridge openings. The aspps system consists of two modules: psgrnd (for grounding a problem definition with the data of an instance of the problem) and the aspps solver. We show that the aspps system generates answer sets for our bridge opening program, where each answer set is an opening bid.

Qualitative and Quantitative Scoring and Evaluation of the Eye Movement Classification Algorithms (2009-09-16)
Komogortsev, Oleg; Jayarathna, Sampath; Koh, Do Hyong; Gowda, Sandeep Munikrishne

This paper presents a set of qualitative and quantitative scores designed to assess the performance of various eye movement classification algorithms. The scores are designed to provide a foundation for eye tracking researchers to communicate about the performance validity of various eye movement classification algorithms. The paper concentrates on five algorithms in particular: Velocity Threshold Identification (I-VT), Dispersion Threshold Identification (I-DT), Minimum Spanning Tree Identification (MST), Hidden Markov Model Identification (I-HMM), and Kalman Filter Identification (I-KF). The paper presents an evaluation of the classification performance of each algorithm when the values of the input parameters are varied. Advantages provided by the new scores are discussed, together with the question of what the "best" classification algorithm is.
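Of the five algorithms compared in the last item, I-VT is the simplest: a gaze sample is labeled a saccade when its point-to-point velocity exceeds a threshold, and a fixation otherwise. The sketch below uses a 70 deg/s threshold as an illustrative value; the actual input parameters are what the paper's scores are designed to evaluate.

```python
# Sketch of Velocity Threshold Identification (I-VT) for a 1D gaze trace.
# Threshold and trace values are illustrative, not from the paper.

def ivt(positions_deg, sampling_hz, threshold_deg_s=70.0):
    """Label each inter-sample interval 'saccade' or 'fixation'."""
    labels = []
    for p0, p1 in zip(positions_deg, positions_deg[1:]):
        velocity = abs(p1 - p0) * sampling_hz  # point-to-point speed, deg/s
        labels.append("saccade" if velocity > threshold_deg_s else "fixation")
    return labels

# 1000 Hz trace (degrees): steady fixation, a 2-degree jump, fixation again
trace = [0.0, 0.01, 0.02, 2.0, 2.01, 2.02]
print(ivt(trace, sampling_hz=1000))
# ['fixation', 'fixation', 'saccade', 'fixation', 'fixation']
```

The sensitivity of such a classifier to its single threshold parameter is precisely the kind of behavior the proposed scores are meant to quantify.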