CIS PhD Dissertations
Author: Laura Welsh
According to the World Tourism Organization, there were 935 million international tourists in 2010. International tourists collectively spent $852 billion on tourism products in 2009 (World Tourism Organization, 2011). A tourist’s awareness, selection, and choice of tourism products depend heavily on the information the tourist uses (Bieger & Laesser, 2001; Fodness & Murray, 1997). Today, millions of tourists obtain tourism information through the Internet with a fraction of the time and inconvenience required in the past (Buhalis & Law, 2008). Given the worldwide economic importance of the tourism industry, it is imperative for scholars and practitioners alike to determine what impacts the use of tourism websites, as websites are the primary source of information for travel planning and the purchase of travel products.
This dissertation’s objective was to examine whether espoused national cultural values, the five-factor model of personality, or the context of travel impact the acceptance and use of tourism websites, and whether these effects vary with differing levels of information structure on the websites. Espoused national cultural values, the five-factor model personality factors, and travel preferences addressing the context of travel all had small but significant impacts on the use of tourism websites. Their overall effects are greater, however, because these cultural, personality, and context-of-travel variables also indirectly affected intention to use the tourism websites through perceived usefulness, perceived ease of use, and subjective norms. Some of these effects varied by the differing levels of information structure on the city-specific and airline reservation websites examined in this study, while other effects remained constant across both levels of structured information tourism websites.
Uncertainty avoidance was also explored to see if it is a cultural or universal construct. As it had a significant but modest correlation with the universal construct of Rotter’s (1966) locus of control, this lends support to the claim that uncertainty avoidance may be a universal construct in the use of tourism websites.
A combined espoused national cultural values and five-factor model of personality theoretical model was created to better predict tourism website use than either model alone.
This study was conducted within the framework of the Technology Acceptance Model (TAM) (Davis, 1989) widely used to predict and explain IT usage. This study replicated the work of Srite and Karahanna (2006) and Devaraj, Easley, and Crant (2008) studying the effects of espoused national cultural values and the five-factor model of personality on IT usage in a different context. This study brought Alvarez and Asugman’s (2006) travel preferences addressing the context of travel from the field of tourism to the field of IS using different subjects and incorporated them into the TAM model for the first time.
Chairperson: Martha E. Crosby
Author: Zach Tomaszewski
In an interactive drama, a human player assumes the role of a character in a story. Through this character, the player interacts with objects and other characters in a virtual story world. The interactive drama system then responds to the player by making changes in that world in order to produce a well-formed story shaped by the actions of the player. Thus, an interactive drama experience is much like that of a roleplaying computer game. The difference is that, rather than providing only an open world for the player to explore or else a fairly rigid preset storyline, the story is generated at runtime in response to the player.
Marlinspike is such an interactive drama system. Its design is based on a neo-Aristotelian poetics of interactive narrative developed from the work of Aristotle, Sam Smiley (1971), Brenda Laurel (1991), and Michael Mateas (2004). Marlinspike generates a story by responding to player actions using small pre-authored story components called scenes. It selects scenes so as to narratively build upon or reincorporate earlier player actions into later story events. This serves to make player actions narratively necessary to the finished story structure.
A prototype implementation of Marlinspike was used to produce the text-based game Demeter: Blood in the Sky. Although Marlinspike's reincorporation feature did not lead to a significant difference in the experience of end users, it did produce a solid interactive drama architecture and better-formed internal story structures. With the lessons learned from the implementation process, Marlinspike provides a solid foundation for future interactive drama development.
Chairperson: Kimberly A. Binsted
Author: Patrick Gilbert
Technology is clawing its way into nearly all organizations, and higher education is not exempt. Advocacy organizations press for change to reduce costs, broaden delivery, and equalize access for those who are often denied it due to financial, cultural, or geographic limitations. Change agents within the organization advocate for increased technology to deliver instruction. The question for an organization is not whether technology is going to change higher education. Two choices seem to remain for futures work: what will the shape of the organization be (predict parameters), or what do the members of the organization want the organization to be? When dealing with futures work, absent formal models of the future, the only alternative is the gathering of expert data. Delphi is a recognized tool for gathering expert thought. Recently, thought leaders proposed modifications that broaden the definition of expertise to include stakeholders. This project developed a web-based implementation of Delphi whose functionality demands come from the literature. The data were collected in Fall 2010. Some interesting policy recommendations emerged. Additionally, the research led to a considerable number of possibilities for future research.
Chairperson: Raymond R. Panko
Author: Kim Chi Diep
Information Literacy (IL) competencies are defined as “the ability to locate, evaluate, and use information effectively” and are considered essential for students in their academic lives and future careers (ALA, 1989). IL plays an important role in developing critical thinking and problem solving skills, and improving academic achievement through active learning, information problem solving, and evaluation of information. In Vietnam, the focus on developing student IL skills has recently received the attention of academic libraries. Vietnamese higher education has been influenced by a tightly structured subject-based model in which pedagogy relies on rote memorization and objective testing, rather than problem solving and critical thinking (Kelly, 2000). The recent shift to and implementation of a credit system requires critical changes in the curriculum and in teacher roles (Zjra, 2008).
This case study explored the perceptions of stakeholders about the development and delivery of information literacy instruction (ILI) to students at four universities, identified perceived challenges of including IL as a credit course in the curriculum, and resulted in a conceptual framework of best practices based on the findings. Concepts from change theory, learning theory, leadership theory and collaboration theory served as lenses to interpret the results.
The findings showed that IL is primarily a concern of librarians and has not yet had an impact on Vietnamese campus culture. IL activities at these four libraries mostly take the form of lectures, workshops, and modules on basic IL skills designed and delivered by instruction librarians, and attended at the discretion of students. Few ILI activities are subject discipline-related and target the information needs of students in a particular area. Assessment has been formative and provides minimal feedback to students and instruction librarians. Respondents reported challenges of including ILI as a credit course in the curriculum, including the impact of the credit system, the lasting impact of teacher-centered instruction and rote learning, misperceptions of stakeholders about the effect of IL on student learning outcomes, degree of support of academic stakeholders, degree of faculty-librarian collaboration, and scarcity of resources. The study provides ample evidence that all stakeholder groups recognize the value of ILI and support progress in the area. IL practitioners and researchers argue that instruction librarians and library administrators should be leaders in IL initiatives, and act as initiators of IL change through disseminating the mission and values of IL to the campus community.
The creation of the best practice framework comes at a propitious time for Vietnam, when the government’s IT initiative, learner-centered instruction reform, a credit system, and the assessment of SLOs have become of interest to those in the educational field, ranging from ministerial leaders and campus leaders to faculty, librarians, and other constituencies. What makes this study unique is that, for the first time, a framework of best practices of ILI for academic libraries in Vietnam was developed by synthesizing the perceptions of campus stakeholders with key components of ILI that have been reviewed in the Western literature but scarcely discussed in the literature from Vietnam. Another unique aspect is that the data touch many facets of ILI and involve all related stakeholders on campus, including library administrators, instruction librarians, discipline faculty, and students.
A key contribution of this research is a best practice framework that validates the body of literature on IL in the West by showing that, no matter what one’s social, cultural, or educational background, IL-related concepts are universally agreed upon and relevant to developing critical thinking about information. This study has the potential to provide crucial information to library administrators and librarians in academic libraries in Vietnam by offering a better understanding of the potential and challenges of implementing ILI programs. In addition, the findings will be useful to decision makers in colleges and universities in issuing appropriate policies related to the adoption and implementation of IL in the academic environment in Vietnam.
Chairperson: Diane M. Nahl
Author: Maureen A. MacLeod
This dissertation examines the influence of professional identity on educators' understanding of technology innovation. The study draws on research on narrative sensemaking (Bruner 1990; Czarniawska 1997), storytelling (Boje 1991a; Brown et al. 2009; Clandinin and Connelly 1996), communities of practice (Wenger 1999) and identity (Ashforth et al. 1989; Wenger 1999). Interviews were conducted with teachers, administrators and technology specialists in the middle school (grades 6-8) at an independent school, chosen because of its recent investment in and commitment to transformational technology innovation.
Building on Mishler's (1986) and Riessman's (1993) narrative analysis methods, 20 in-depth interviews were analyzed using a whole story narrative analysis method. Story themes were identified that highlighted how educators made sense of the school's efforts to promote technology innovation in the classroom and their own experiences with it. Four distinct identity perspectives ("identity lenses") were identified. This analysis illustrated how an identity lens draws together aspects of professional work, interactions with colleagues, perceptions of organizational events and perspectives about technology in the classroom, as individuals make sense of technology innovation in their professional lives. Professional identity is transitional and negotiated constantly (Wenger 1999), particularly during periods of technology innovation (Barrett and Walsham 1999; Lamb and Davidson 2005). Four organizational anchor points were identified as significant organizational exchanges through which individuals negotiated their professional identity related to the school's technology initiatives.
This study contributes to our understanding of how professional identity influences individuals' interpretations of and participation in technology innovations. It demonstrates how narrative analysis of stories of technology innovation can be employed to understand how individuals make sense of technology changes in their professional lives. Implications for practice include the recognition of diverse perspectives ("identity lenses") related to technology innovation, which influence how individuals interpret and relate to technology innovation projects. Studying participants' stories highlighted how opportunities for "low-risk" experimentation allowed educators to find success with technology innovation. The support of knowledgeable technology professionals, who have teaching experience themselves, also emerged as an important enabler for such experimentation.
Chairperson: Elizabeth J. Davidson
Author: Jeng-Her Chen
The study aimed to answer the compelling question: “What makes an efficient Web searcher?” Based on Bandura’s self-efficacy theory, Research Question No. 1 asks “How do self-efficacy, problem-solving confidence, and the use of Google’s Advanced Search affect timely successful Web searching?” Based on Newell and Simon’s problem space theory, Research Question No. 2 asks “Do efficient searchers share the same mental organization of keyword importance as the non-efficient searchers?” And Research Question No. 3 asks “Is a higher level of search performance characterized by increasingly consistent mental organizations of keyword importance?”
Participants searched the Web using Google. Each had up to 30 minutes to find the answer to the task: “How did Taiwan’s native (aboriginal) people communicate in writing from roughly 200 to 400 years ago?” Data analyses involved survival analysis for RQ1 with 86 subjects, and TRICIR, ANOVA, and Kendall’s coefficient of concordance for RQ2 and RQ3 with 88 subjects.
I tested five hypotheses. For RQ1, I found that Google’s Advanced Search hurts timely successful Web searching, that self-efficacy helps, and that confidence does not help. For RQ2, I found a significant difference in the mental organization of keyword importance between the two levels of search performance. For RQ3, I found that efficient searchers have higher consensus in the mental organization of keyword importance than non-efficient searchers. In summary, I suggest what not to do (do not use Google’s Advanced Search) and what to do (form what I call a “Decisive Problem Space” prior to searching with Google).
Chairperson: Rebecca Knuth
Author: Louis Tomsic
This study examined the effects of four post-crisis responses on five different variables using a blog tool. The four post-crisis responses are (a) information only, (b) compensation, (c) apology, and (d) sympathy. The five dependent variables are reputation, anger (negative emotion), negative word of mouth, account acceptance and state of the publics based on involvement and knowledge.
Coombs and Holladay’s (2002; Coombs 2007) situational crisis communication theory suggests that the effects of a crisis can be minimized by formulating an appropriate response to the public following a crisis.
Furthermore, Hallahan’s (2001) five-publics model is used to categorize the participants into active, aroused, aware, inactive and non-publics. In the experimental study, participants were classified as active if they scored fifty percent or higher on a knowledge test (showing high knowledge) and responded to a crisis blog (showing high involvement) for the given crisis.
This study found that sympathy protected the organization’s reputation, lowered negative word of mouth and raised account acceptance when minimal attributions of crisis responsibility and a moderate reputation threat existed. It was the best crisis response when compared with information only. Furthermore, any crisis response was better than no response at reducing the segment of participants in the active public.
Chairperson: Tom Kelleher
Author: Hanae K. Kramer
In the spring of 1932 the state of Manchoukuo was born. Its figurehead, Henry P’u-Yi, the last emperor of China, dethroned in 1911, lent the state a feeling of historical continuity. Its borders roughly corresponded to those controlled by the Manchu tribes of the early seventeenth century, and like the tribes of old, it was separated from China by the Great Wall. It was billed by Japan as an independent country. Yet, from the moment of its inauguration it was derided as “the puppet state” or “occupied Manchuria” by many governments. Neither position captured the unvarnished truth, which lay somewhere between the propaganda and the contemptuous mirth. It was certainly not a true country, but it was never an extension of Tokyo either. Many of the strings of power never left the continent. Manchoukuo was a state born out of compromise, the child of competing interests: Tokyo, the South Manchuria Railway Company (SMR), and the Kwantung Army. These interests, as well as others, found that they could co-exist within the framework of a polity that was completely separate from China and comfortably distant from Japan. Much of the scholarship to date, however, views Manchoukuo merely as a constituent part of Japan's imperialistic enterprise. Manchoukuo, therefore, is seldom studied on its own terms.
In the 1930s, a motley crew of Japanese thinkers were tasked to create a country from scratch. The challenges before them were great since continental East Asia was in a state of chaos. Furthermore, it was a land of many peoples, none of whom the Japanese had any sort of camaraderie with. As a group, these thinkers agreed that their tool of choice for state-building was to be mass media. Since illiteracy is an effective defense against the printed word, radio and film became the brick and mortar of the Manchoukuo propaganda machine. Slogans such as minzoku kyōwa (concord of races) and Wang-tao (prosperity through beneficent rule) sought to foster a sense of unity and obedience. The belief that a viable state would emerge from the ether, enticed out by words and images, was a strong one. Using historical and descriptive analyses, this dissertation studies film and related documents to gain insight into the Manchoukuo government's attitudes towards media as well as their policies.
Dissertation Committee: Dr. Dan Wedemeyer (Chairperson), Dr. Andrew Arno, Dr. Edoardo Biagioni, Dr. Gary Fontaine, Dr. Ellen Hoffman, and Dr. Jenifer Winter
Chairperson: Dan J. Wedemeyer
Author: Kumiko Hachiya
This qualitative research investigates the meaning of keitai (mobile phone) for older Japanese adults between the ages of 59 and 79. Participants’ emails from keitai, handwritten daily logs, and audio and video recordings from meetings and interviews were collected during my 7-8 month stay in one of the largest cities in Japan. Latour’s Actor-Network Theory, Garfinkel’s ethnomethodology, the process theories of A. N. Whitehead and Kitaro Nishida, and the “embodied interaction” of Paul Dourish were used for data analysis. All these theorists take similar, nonpositivist positions that support the phenomenologist or constructivist view that social reality is mutually constructed.
The aging of Japan’s population is advancing as Japanese baby boomers grow older and the birthrate remains low, and the government is concerned about how to bear the financial burden of increased pension payments and rising medical costs for aging retirees with a much smaller workforce as a tax base. The government has been building a high-speed broadband infrastructure since 2000 to streamline its services through the Internet; yet the majority of the aging population is not online.
Against this backdrop, Internet access via keitai started in 1999 and took the youth by storm, as Internet access from PCs was not generally available; however, people over 65 were not part of this trend.
The participants were busy but enthusiastic about learning digital technologies. When they started to use keitai, they moved into a different world. Keitai connects them to the world of younger generations, creates new interactions within existing close relationships, and expands digital expression by connecting keitai’s data-capturing capabilities to a PC or the Internet. For young people keitai is no longer a telephone but a media tool, and it is written in westernized katakana, ケータイ, which suggests something modern and foreign. Translating another culture’s tools into one’s own cultural understanding takes time and the participation of co-members from one’s own culture. If this translation is done well, perhaps the feeling of troublesomeness associated with new communication technologies such as keitai can be overcome among the older population.
Chairperson: Andrew R. Arno
Author: Dan Smith
While opinions were divided, employees tended to be more trusting of coworkers and top management if they had used a variety of social media recently at work. They also scored higher on other organizational climate measures of cooperation and information sharing. The results are correlational; they represent associations, so one cannot claim the relationships are causal. However, there was a modest, statistically significant correlation between a favorable organizational climate and the years since the company began encouraging the use of social media.
Employees who had some social media use were more likely to recognize the potential benefits of social media for building social capital in conjunction with work. They recognized the informational and affective value of social media in strengthening ties within the work group and building new ties outside the immediate group and the company.
The research gathered data from a sample of 235 employees from a national pool on their social media practices and the social media policies of their employers. It investigated how social media added to a model of organizational climate that promotes knowledge sharing and cooperation, and trust in peers and management. The research integrated theories of social capital, trust, organizational climate, and knowledge sharing to test claims that social media add value to firms in social dimensions above and beyond knowledge sharing. Statistically significant associations of social media use were found with trustworthiness of employees and management, cooperation, and knowledge sharing. A hypothesis that social media use would fit a specific model incorporating organizational climate and knowledge sharing and combination was not supported. Modest associations were shown by multiple regression. The dominant effect of trust both of coworkers and management in organizational climate was reaffirmed.
The sample of respondents came from a wide range of industries and not specifically from social media-active firms so the findings may be robust. The research replicated a commitment-based HR theory linked to increased productivity. It extended the theory by adding trust in top management and social media use.
Some evidence was found that the number of years an organization has had social media correlates with better organizational climate ratings. Moreover, stronger correlations were found for trust in coworkers and trust in management with more recent social media actions.
Employees also tended to use social media for work-related matters more at home than on the job. Furthermore, for work-related use in the study sample, Facebook held a strong plurality over other social media sites.
Chairperson: Tom Kelleher
Author: Donna Bair-Mundy
The task of this dissertation has been to construct a theoretical model for the development of laws relating to telecommunication privacy vis-à-vis law enforcement surveillance over the past hundred years. Both statutory and case laws relating to telecommunication privacy were examined, as well as the historical context of such legislation and rulings.
The model presented draws upon the work of legal theorists such as Thomas Cooley, Roscoe Pound, H.L.A. Hart, R.M. Dworkin, William Banks, M.E. Bowman, and Marc Rotenberg; surveillance theorists such as Michel Foucault, Anthony Giddens, and David Lyon; and privacy theorists such as Alan Westin, Irwin Altman, and Sandra Petronio. It focuses on three competing fears: fear of external threat, fear of social chaos, and fear of the tyrant. Shifts in emphasis among these three fears throw the nation into periods of boundary turbulence, which require the re-negotiation of privacy boundaries. This re-negotiation has happened repeatedly during U.S. history.
The model presented was then tested in a case study that examined the inception, debate, and passage of the USA PATRIOT Act.
Chairperson: Rebecca Knuth
Author: Laurel King
Understanding the user and customizing the interface to augment cognition and usability are goals of human-computer interaction research and design. Yet, little is known about the influence of individual visual-verbal information presentation preferences on visual navigation and screen element usage. If consistent differences in visual navigation can be detected and measured, these differences could be used to augment cognition or customize views appropriately as eye tracking and other monitoring devices improve. This dissertation research investigates: (1) the relationship between measured visual-verbal preferences and the participant’s eye movements during different types of problem-solving tasks; (2) performance on text-only, text-plus-diagram, and diagram-only reasoning problems and the selection of problem representation; and (3) whether different levels of cognitive load are observed in eye movement patterns while solving reasoning problems of differing difficulty.
A visual-verbal preference questionnaire adapted from several established instruments was administered to 140 university students in a variety of fields. The responses to this questionnaire were analyzed to understand overall tendencies toward visual and verbal preferences by field of study, gender and other factors. Twelve participants (six verbal and six visual, balanced by gender) were recruited from those scoring in the extreme 20% of the pool, either more visual than verbal or more verbal than visual, to complete an eye tracking experiment. Each participant completed 3 practice problems and 15 reasoning problem tasks (6 text-plus-diagram, 6 text-only, and 6 diagram-only).
The results showed a strong trend for the verbal group to perform better on problems with diagrams than without, while the visual group performed only slightly better with a diagram. The visual group performed better than the verbal group on the text-only and diagram-only problems. The visual group spent more time on blank areas of the screen than the verbal group, possibly indicating internal visualization. Different strategies were found between the two groups and among individuals. These differences are analyzed in terms of participants’ awareness of their own visual processing and the importance of specific task requirements. The results are important to the use and customization of representations in interface design, education, marketing and diagrammatic communication for problem solving.
Chairperson: Martha E. Crosby
Author: Patricia Donohue
Small group learning continues to increase as an instructional practice in the K-16 classroom. Currently, this practice meets the need to address a growing population of students occupying larger class sizes; but also, small groups have been shown to be an effective strategy for improving individual learning. Recent research has shown that small groups can be more effective than individual learning when groups receive preparation in the content and practice in the interactions expected as part of cooperative work. Researchers in small cooperative group learning have provided instructors with guidelines on how to structure successful cooperative group learning experiences. Although the guidelines help students enter their groups ready to engage in complex problem solving, the instructions do not prepare students for how to successfully conduct their cooperative group experience. Students are often not prepared to manage poor cooperative behaviors, or to research needed information for the problems they are given. Researchers in affect have told us that emotions play a crucial role in the progress and results of cognitive activity. Affect influences where we focus attention and whether we are satisfied with the results of our efforts. We also know that groups that exhibit positive affect think more deeply, more creatively, and more thoroughly. The present research examined the conversations of groups engaged in a cooperative mathematics project for five days. Group interactions were examined to identify the role of affect and the influences that promoted or hindered cooperative productivity. Findings showed that affect played a decisive role in promoting cooperation and productivity and that its influence accumulated, accentuating the positive or negative effect. The development of a set of meta-affective tools was suggested to promote positive group interactions.
Chairperson: Martha E. Crosby
Author: Claire Hitosugi
This is an exploratory work on the relationship between online initial trust and culture. Little work has been done on how culture influences one's online trust perceptions. In IS research, culture is mostly studied either at the national or at the organizational level. This study captures culture at the individual level on a website. Four culture dimensions (masculinity/femininity, individualism/collectivism, power distance and uncertainty avoidance) proposed by Hofstede (1980) are investigated. The McKnight et al. (2002) trust model is used as the basis of this study. Subjective norm (SN) is also integrated into the trust and culture model. Structural equation modeling was used in the model analysis.
First, the initial online trust model of McKnight et al. was successfully replicated in a tourism context. Then, the McKnight et al. trust model was augmented with subjective norm. Drawing on the Theory of Reasoned Action (Fishbein and Ajzen 1975), I proposed that SN is a critical variable in trust formation and trust intention. My data showed that SN directly impacted all four trust constructs (disposition to trust, institutional trust, trusting beliefs, and intention to trust). Furthermore, SN was found to be a positive covariate of all culture variables; thus, all culture variables indirectly affect trust formation and intention through SN. Two culture dimensions (power distance and uncertainty avoidance) also directly affected three trust constructs, but not intention to trust. The dimensions of masculinity/femininity and individualism/collectivism had no direct effects on trust formation.
My results showed that SN, in particular peer perception, has the most significant effects on initial online trust formation. Furthermore, a person high in uncertainty avoidance (UA) has the strongest association with SN. Thus, not only does s/he take cues from others more, but s/he also has a more trusting disposition and forms trusting beliefs more easily than a person low in UA.
The equivocal properties of the UA construct were also discussed. Two types of UA are proposed: "UA need for structure" and "UA need for avoiding uncertainty". The UA construct that most of the literature refers to is analyzed as "UA need for structure". Further investigation of the UA construct is suggested.
Chairperson: William E. Remus
Author: Marc Le Pape
Failure to address extreme environment constraints at the human-computer interaction level may lead to the commission of critical and potentially fatal errors. This dissertation addresses gaps in our current theoretical understanding of the combined impact of an extreme environment stressor and perceptual style on task performance in human-computer interaction. A controlled experimental study investigates the effects of altered ±Gz accelerations and field dependency-independency on human performance in the completion of perceptual-motor tasks on a personal digital assistant (PDA). Results of the experiment, conducted in an aerobatic aircraft at multiple ±Gz acceleration levels, show that in altered ±Gz environments perceptual style significantly impacts perceptual-motor task performance in target acquisition. Based on the results, the argument is made that acknowledging individual cognitive differences in design will help end-users in extreme environments execute perceptual-motor tasks efficiently, without unnecessarily increasing cognitive load and the probability of critical errors. Design guidelines are proposed toward this end.
Chairperson: Daniel Suthers
Author: Miwa Yamazaki
Healthcare marketers have continuously battled for ways to effectively frame disease prevention messages that help empower consumers to adopt healthier lifestyles. However, marketers have not yet examined the effect of message framing on Diabetes Mellitus Type II (DM2) prevention. In this dissertation, the author investigates (1) the effect of message framing (advantages of preventing DM2 vs. consequences of ignoring DM2 prevention) on people's attitudes and immediate intentions toward DM2 prevention; (2) the effect of messages that highlight why, rather than how (e.g., regular exercise, healthy diet), to prevent DM2, emphasizing sex-related consequences (e.g., pregnancy complications, erectile dysfunction); (3) people who have not yet developed diabetes, thus examining the effect of message framing on DM2 prevention, rather than on DM2 complication prevention (e.g., blindness); and (4) potential gender differences in terms of participants' attitudes and intentions when the messages targeted their own versus the opposite gender.
Results revealed that, contrary to what was predicted, the message that highlighted the consequences of ignoring DM2 prevention, such as pregnancy complications or sexual dysfunction, was more effective than the message that highlighted the advantages of preventing DM2 in eliciting participants' positive attitudes toward the message and greater intentions to prevent DM2. Similar findings held true, though also unexpected, in the opposite-gender message condition. Moreover, female participants generally had significantly more positive attitudes and greater intentions than male participants, irrespective of message framing (particularly in the same-gender message condition). Ex post analysis revealed that fear mediated the relationship between message type and participants' attitudes and intentions. In addition, perceived severity also mediated the relationship between message type and intentions in the same-gender conditions.
The findings provide several implications for healthcare marketers promoting DM2 prevention. Specifically, a gain-framed message is not always an effective way to communicate disease prevention. Instead, healthcare marketers may consider using messages that focus on sex-related negative consequences and that arouse fear. Moreover, identifying the message audience remains important in DM2 prevention; in particular, messages that target the audience's own gender are effective.
Chairperson: Dineh Davis
Author: Matthew James Sharritt
Extensive literature has shown that games can provide an engaging, dynamic, and authentic learning context. Many of the studies on the use of games in education indicate that games can support teaching standards and outcomes; however, they do not describe actual uses of video games for learning. Through the analysis of affordances employed by student gamers, an understanding of how learning takes place can inform the design of effective educational games and aid their integration into contemporary classrooms. Informed by ethnomethodology, this study used grounded theory methods to provide a detailed description of the use of video games for learning in educational settings.
Results demonstrate that learning occurs across multiple levels: mastery of the computer interface, followed by mastery of the game interface, upon which students can build advanced strategies aimed at goal achievement. Learning also occurs across multiple granularities: in short episodes, sequences of episodes, or trends. Learning can be triggered by multiple cues, such as failure, game visualizations, or specific representations, as well as by peers or teachers in the social environment.
Students used affordances provided by the game interface and learning environment. Specifically, the visual representations of games afford particular actions; the persistent display of historical context, as well as present and future potentials, motivates learning; specific cues can grab attention, helping to focus efforts on new or underutilized game tasks; consistent and well-organized visualizations encourage learning; and information presented in a plurality of channels is most effective for learning.
The use of social peers in collaborative learning had several effects on the learning process: peers disclosed information to achieve shared meaning of objects' purposes, and negotiated to collaboratively choose game strategies. Peer teams served cooperative roles as information sources while also competing with one another.
Implications for students, educators, and game designers are offered to better play, implement, and design games for learning. A brief comparison of findings with existing theory discusses similarities with collaborative learning and activity theory, and suggests opportunities for future work. Overall, findings indicate great potential for the use of games in education.
Chairperson: Daniel Suthers
Author: Jennifer L Campbell-Meier
The development of an institutional repository (IR) is one of the more complex projects that librarians may undertake. While many librarians have managed large information system projects, IR projects involve a larger stakeholder group and require support from technical services, public services, and administration to succeed. A significant increase in the development of repositories is expected with technology and process improvements for digital collection development. This study investigated the development of repositories at doctoral institutions, identifying factors that influence development and best practices, using a comparative case study approach to gather and analyze data. A detailed account and analysis of academic institutional repositories was developed, providing knowledge of individual IR development as well as a cross-case comparison of developmental factors including adoption, motivating factors, and perceived benefits. The use of a narrative, project management practices beyond technical development, and the inclusion of the campus community are identified as key factors in development. Best practices and recommendations for future developers, such as early involvement of stakeholder groups and the need to educate both librarians and faculty about open-access collections, are also discussed. This study contributes to a more informed understanding of the development of IRs and identifies a model framework for future IR developers.
Chairperson: Rebecca Knuth
Author: JungHyun Nam
The purpose of this research was to study the quality and motivation attributes of information products from the end-users' perspective, and to measure the impact of these attributes on intention-to-use. An information product is defined as a highly interdependent package of information that can be transmitted or distributed in digital form (e.g., a web portal or application software). In the context of Web portal use, the information product generally includes three types of services: personal services (e.g., email), information services (e.g., online news), and search services. The literature suggests that the quality of an information product can be assessed from a number of attributes, such as the accuracy and applicability of the information content, the timeliness and speed of the physical medium, and the reliability and responsiveness of the product provider. The literature also underscores the importance of motivational factors such as social escapism and privacy concerns on the intention to use. Drawing from this theoretical background, an initial set of 21 quality and motivational attributes was identified, and an experimental study using 142 subjects as Web portal users was conducted. Statistical analyses consolidated the quality attributes into four factors as perceived by the subjects: Content relevancy, Communication interactiveness, Information currency, and Instant gratification. As for the impact analyses, social escapism motivation, information motivation, interactive control motivation, and socialization were found to correlate highly with all three types of services and with combined use. When quality factors and motivations were considered together to explain intention-to-use of the Web portal, social escapism, as a motivation factor, was identified as the main determinant.
The findings of this research shed new light on the understanding of Web portal use and suggest that some quality attributes are perceived as particularly relevant to Web portal intention to use. Lessons learned from this study should also help IT professionals design, develop, and deploy more effective general Web portals.
Chairperson: Tung X. Bui
Author: Paulo Maurin
The management of marine resources is undergoing a paradigm shift, away from top-down governance by a central power interacting with a stable, limited, relatively homogeneous, and isolated set of ocean users, to a field populated by dynamic, abundant, networked, and heterogeneous stakeholder groups. These marine stakeholders are playing an increasingly active role in the management and regulation of ocean resources. This shift has been partly assisted by the increased availability of information about marine resources and by new communication and information technologies. Together, these developments allow users to become active players, giving rise to a new trend in co-management of marine resources. This research presents evidence that the term "ocean user" is conceptually limiting and no longer adequate to describe ocean stakeholders' ability to participate in co-management arrangements.
This study employed a qualitative approach across three research sites in Hawaii (Waianae, Hanauma Bay, and West Hawaii) to understand the dynamics of selected marine stakeholders' gathering and use of information, formation of groups and alliances, framing of issues, and effecting of regulatory changes. The West Hawaii case study, via the West Hawaii Fisheries Council, yielded the richest data for the research. The Council exemplifies a successful integration of the local community in the management of local marine resources.
Data were gathered through semi-structured interviews, attendance at meetings, and analysis of documents and other artifacts. The analysis was informed by the Social Actor Model (SAM) of Lamb and Kling (2003), the Actor-Network Theory (ANT) developed by Latour (2005) and Callon (1986), and, to a lesser extent, the Social Movement (SM) literature (McAdam, McCarthy & Zald, 1996). Based on the evidence gathered, this study advances the concept of the emerging Hawaii Marine Stakeholder and offers a description of how marine resource management has accommodated stakeholders. SAM was used to understand the actor, ANT to explain the network, and SM to analyze large-scale changes and mobilizations. The results offer practical implications for the development and implementation of co-management arrangements. Theoretical implications include the analytical integration of diverse approaches to understanding social action situated in the context of environmental management.
Chairperson: Daniel Suthers
Author: Tsui-Chuan Lin
The switch from analog to digital technologies is reshaping the landscape of the TV news industry. Little research has yet been done on the social and organizational implications of TV news digitalization. This study examines the factors that influence organizational decision-making and implementation strategies and investigates the adoption process in four Taiwanese news stations that were early adopters of digital TV news technology (FTV, EBC, TVBS, DA-AI), using in-depth, qualitative field studies at each station. It develops a theoretically informed process model that draws from Rogers's (2003) innovation process model, Orlikowski et al.'s (1995) technology-use mediation activities, and Leonard-Barton's (1988) implementation strategies to explain the organizational adoption process and its variations among the organizations.
This multiple case study has several important findings. First, the adoption process of digital TV news systems encompasses the initiation stage (agenda-setting, matching) and the implementation stage (establishment, reinforcement, adjustment, episodic change). Reinforcement in parallel with establishment is critical for successful implementation, and episodic change, triggered by external or internal forces, injects dynamics into the organizational adoption process. The three stations that adopted total-solution TV systems experienced a more linear adoption process, while the station that developed an integrated system (DA-AI) experienced a more complex, parallel developmental process. Second, among the four early adopters studied, managers' perceptions of relative advantage, compatibility, cost, and sustainability were the factors that most affected the organizations' decisions to adopt this technology. Individual and organizational level factors were more influential on the adoption decision than environmental level factors. Third, the implementation characteristics of digital TV news systems (low transferability, high organizational complexity, high divisibility) set parameters for implementation strategies in the cases. Training and evaluations were essential for assimilating this core production technology successfully. Fourth, digital news technology appears to have a greater impact on the news production process, news workers' roles and tasks, and collaboration than on news representations per se. Finally, a refined model is developed to investigate the adoption and implementation of core production technology in digital broadcasting. This teleological model captures the varied decision events, reciprocal interactions, continual change, and ambiguity in the adoption process.
Chairperson: Elizabeth J. Davidson
Author: Vichianin Yudthaphon
The goal of this study is to understand how Healthcare Information and Communication Technologies (HICTs) are applied to healthcare delivery to improve people's access to healthcare services in a developing country, using Thailand as an example. HICT, in this study, refers to the use of Information and Communication Technologies in clinical healthcare settings related to the delivery of healthcare services. Access in this study is defined as the ability to obtain healthcare when needed, taking into consideration transportation to healthcare facilities, patients' waiting time, referrals, availability of medication, and access to medical and health reference information.
The study design is a qualitative approach using embedded case studies. The study aims at understanding the use of HICT-related interventions across stakeholders, across multiple levels of care, and across the defined five probes of Access, in Saraburi province of Thailand. Five probes of Access in this study are (1) Transportation to Healthcare Facilities, (2) Patients' Waiting Time, (3) Referrals, (4) Availability of Medication, and (5) Access to Medical and Health Reference Information. This study involves multiple levels of healthcare including tertiary care, secondary care, and primary care. A total of 31 healthcare professionals were interviewed across nine healthcare organizations.
HICT use has the potential to improve Access, for example, by providing improved patient appointment scheduling and electronic medical and health reference information for healthcare professionals. The study found that the tertiary care level (the medical center) and the secondary care level (the general hospital) have resources available to support HICT use. However, healthcare professionals vary in their willingness to use HICT in practice. At the secondary care level (community hospitals) and the primary care level (health centers), resources to support continuing HICT use are limited. Evidence from this study supports the conclusion that HICT use at community hospitals and health centers has a limited impact on Access, because computer technical support, computer literacy training, and formal financial funding are limited. Eight policy recommendations are offered based on the study findings to improve Access and use of HICTs in the province studied. These recommendations may provide insights on the use of HICT in other developing countries.
Chairperson: Elizabeth J. Davidson
Author: Stephanie Rolfe
"Many of the assumptions underpinning current thinking on ICTs in development are based on intuition rather than analysis... The danger is that, without better understanding of the real impact of ICTs on both national economies and community development, the pursuit of over-ambitious, unrealistic goals may mean that resources are misapplied and worthwhile objectives missed." -- OECD-DAC, 2004
In the second half of the 20th century, rapid developments in Information and Communication Technologies (ICTs) have seen the evolution of an "information revolution" which supports and drives an increasingly global economy. In this context, the world recognizes a new form of poverty -- "information poverty" -- as developing countries struggle to obtain the infrastructure, skills, and other requisites to participate in that revolution. Increasingly, aid programs to developing countries are focusing on the role that ICTs can play in economic and social development. However, the ongoing debate about this role highlights a need for a greater understanding of how donor and recipient countries conceptualize ICTs and their impact on development so that aid initiatives can be more effectively targeted.
This study fills that need by exploring and comparing how a donor country (New Zealand) and four of its aid partner countries (Cook Islands, Fiji, Niue, and Samoa) separately conceptualize the role of ICTs for economic and social development. The researcher used data gathered from interviews, observation, and archival research in a qualitative study. She then analyzed the data according to a conceptual framework developed by IS scholars Sein & Harindranath (2004) to identify, map, and compare each country's conceptualization and to determine alignment.
This study contributes to the literature on ICTs for Development by informing the discussion on the ICT construct and its role in development; it also critiques and extends the conceptual framework used in the study; finally, it makes recommendations for both donor and partner countries for ways to make ICT aid initiatives more effective.
Chairperson: Dan J. Wedemeyer
Author: Ravi K Vatrapu
This dissertation begins a research program aimed at a systematic investigation of phenomena in the nexus of culture, cognition, and computers. It investigates two specific research questions related to the effects of culture on the appropriation of affordances and on technological intersubjectivity. Affordances are conceptualized as action-taking possibilities and meaning-making opportunities in an environment relative to an actor. Drawing from ecological psychology and by treating meaning-making as ecologically cognitive, formal definitions of technological, social, and socio-technical affordances are offered. Socio-technical affordances are relational properties in actor-environment systems that provide social action possibilities given the cultural-cognitive capabilities of the actors and the technical capabilities of the environment. A tripartite distinction of intersubjectivity as psychological, phenomenological, and technological is made. Technological intersubjectivity (TI) is an emergent phenomenon in socio-technical systems and refers to a technology-supported interactional social relationship between two or more participants.
The basic premise of this research is that social affordances of technologies vary along cultural dimensions. To empirically evaluate this premise, an experimental study was conducted into how culture influences the appropriation of socio-technical affordances and technological intersubjectivity in computer supported collaboration. The experimental study design consisted of three independent groups of dyads from similar or different cultures (Anglo-American, Chinese) doing collaborative problem-solving in a knowledge-mapping learning environment. Participants interacted through an asynchronous computer interface providing multiple tools for interaction (diagrammatic workspace, embedded notes, threaded discussion) as they worked on an intellectually challenging problem of identifying the cause of a disease outbreak.
The analytical focus of the experimental study was to determine the influence of culture on the appropriation of affordances by individual participants in an online learning environment. The theoretical objective of the study was to inform the notion of technological intersubjectivity.
Based on theories of culture and empirical findings in cultural psychology documenting cross-cultural variations in behavior, communication, and cognition, seven a priori research hypotheses were advanced. Empirical data were collected using demographic, culture, and usability instruments; participants' self-perception and collaborative peer-perception instruments; and screen recordings and software logs of experimental sessions. Statistical results showed that members of different cultures appropriated the resources of the interface differently in their interaction and formed differential impressions of each other. For example, on average, Anglo-American participants created more evidential relation links, made more individual contributions, and were more likely to explicitly discuss information-sharing and knowledge-organization strategies than their Chinese counterparts.
The empirical demonstration of a systemic cultural variation in the phenomena of technological intersubjectivity and appropriation of affordances in socio-technical environments is the primary contribution of my dissertation. Other contributions include an empirically informed theory of technological intersubjectivity, a methodological approach for the systematic study of the appropriation of affordances, and a formal definition of socio-technical affordances.
Chairperson: Daniel Suthers
Author: John Lee Reardon
This dissertation examines two sets of factors that influence the assimilation of electronic medical record (EMR) technology by independent physician practices. Policy makers look to information technology (IT) to play a key role in addressing problems afflicting the delivery of healthcare in the United States, such as access, cost, and quality. Although independent physician practices play an essential role in healthcare, they are also among the least likely to adopt health information technology (HIT) such as EMRs. It is thus unclear from a practical perspective, and especially so from a theoretical perspective, why a minority of independent physician practices ultimately does adopt and assimilate EMRs, while the vast majority does not.
Two theoretic concepts were applied, using a mailed survey and data analysis with factor analysis and ordinal regression, to examine the assimilation of EMRs by independent physician practices in the state of Hawai'i. Specifically, using the theoretic concept of an organizing vision (Ramiller & Swanson, 2003; Swanson & Ramiller, 1997), it was hypothesized that practices holding strong organizing vision perceptions of interpretability, plausibility, importance, and discontinuity were more likely to have assimilated EMR technology than practices with weak organizing vision perceptions. As expected, plausibility, importance, and discontinuity were found significant in predicting the level of assimilation of EMRs, whereas interpretability was not.
In addition, using the theoretic concept of organizational learning (Fichman & Kemerer, 1997), it was hypothesized that practices with a higher propensity to innovate with a new complex HIT would demonstrate more learning-related scale, related knowledge, and diversity, and were thus more likely to assimilate EMR technology than practices with less learning-related scale, related knowledge, and diversity. As expected, findings indicate that learning-related scale, related knowledge, and diversity were significant in predicting the level of assimilation of EMRs.
Contribution to information systems (IS) theory includes helping to improve our understanding of HIT assimilation, because EMRs are a type of complex organizational technology addressed by Attewell (1992). Furthermore, although the focus of this research was on the assimilation of a specific artifact, the EMR, the research contributes to general knowledge about the diffusion and assimilation of complex organizational technologies. That is, by applying Fichman and Kemerer's (1997) model of organizational learning barriers and Swanson and Ramiller's (2003) model of organizing visions to a different type of technology (EMR) and a different organizational context (small physician practices), knowledge has been extended into a new research space (Berthon et al., 2002). In doing so, these assimilation models were found to apply to micro-sized organizations. That is, micro-sized organizations apparently can benefit from an organizing vision and organizational learning much as large organizations do.
In addition, the contribution to the medical informatics literature suggests that, from a practical perspective, the results of this dissertation study can provide invaluable support to small independent physician practices when addressing the adoption and use of an EMR. Also, understanding why some small practices are able to overcome learning barriers while others are not may help policy makers formulate effective programs to facilitate and expedite EMR adoption and assimilation among this critical population. Finally, policy makers, third-party payers, charitable organizations, and others interested in promoting EMR use among independent physician practices should consider ways in which learning barriers might effectively be lowered through the development of shared community resources, such as IT support programs, EMR learning labs, and user groups.
Chairperson: Elizabeth J. Davidson
Author: Anthony Wong
In organizations, users of technology are often constrained by organizational structural properties such as norms and resources. Much of the deterministic stream of information systems research focuses merely on factors pertaining to the individual level and largely ignores the organizational context in which technology use is given shape. On the other hand, the social stream of information systems research emphasizes social influence on technology use, but it lacks substantive evidence. In light of these two problems, the literature provides little guidance in determining the extent to which the criterion effects vary among different types of information and communication technology use.
The goal of this research is to investigate several areas where the existing literature leaves questions unanswered. This goal is divided into four objectives: (1) to provide a measure of various types of ICT use, (2) to revisit the roles of individual and organizational factors on ICT use, (3) to validate and extend the social actor model posited by Lamb and Kling (2003), and (4) to bridge the chasm between the deterministic and social streams of information systems research. This study adopts the social actor model as the research framework. Variables are conceptualized into four actor dimensions -- identity, interaction, affiliation, and environment -- that are postulated to impact actors' ICT use.
The study tests hypotheses about ICT use in these four actor dimensions with data from a large study of Japan specialists in North America that included extensive information about the organizational contexts in which they work. The research identifies eight general types of ICT use. The social actor model is supported; the results indicate that individual characteristics play a more crucial role in the use of technology than organizational structural properties do in professional populations such as this one. In addition, through elaboration analyses the study uncovers potential moderators and mediators that influence the findings. The research contributes to the information systems literature by offering methodological and theoretical implications for future studies. The research also has implications that may help managers formulate information systems decisions and justify their value propositions.
Chairperson: Colin G. R. Macdonald
Author: Lotus Elizabeth Y. W Kam
Hawaii, New Zealand, Australia, and similarly isolated regions have a natural barrier against outbreaks of contagious diseases. The state of Hawaii is a protective haven for a variety of agricultural products; however, its biosecurity is compromised by the introduction of invasive species and foreign animal diseases. Viral pathogens threaten the productivity and survival of Hawaii's local shrimp industry. Isolated occurrences of Infectious Hypodermal and Hematopoietic Necrosis Virus (IHHNV) and White Spot Syndrome Virus (WSSV) outbreaks have been reported on Oahu and Kauai, signaling that Hawaii's US$9.7M shrimp and prawn aquaculture industry may be in imminent danger. In order to reduce the risk of WSSV epidemics in Hawaii shrimp aquaculture, a decision-theoretic framework is needed to systematically evaluate the impact of biosecurity decisions.
A "test-action" biosecurity risk framework was developed that translates biosecurity decisions into tests and actions for the purpose of analyzing biosecurity risk. From a decision-theoretic point of view, decisions are viewed as having action aspects that reduce consequences and/or test aspects that gather information. This perspective on decision-making offers an accounting method for biosurveillance measures, particularly the value of information resulting from test decisions. The framework was used to fulfill the research objectives for investigating WSSV import risk associated with frozen commodity shrimp (FCS): (1) development of a Bayesian decision network (BDN) to model WSSV import risk, (2) determining the ''best'' policy networks, and (3) estimating the value of biosurveillance for mitigating WSSV import risk.
A BDN was created based on the test-action biosecurity risk framework to model the impact of WSSV biosecurity policies, including a national movement restriction, biosurveillance, and specific pathogen free (SPF) zoning on FCS retail and shrimp aquaculture industries. The expected combined retail and farm profit was estimated at $60.04M. Based on the results of the WSSV import risk BDN simulation experiments, Hawaii farm loss due to WSSV was estimated at $2.91M. The best biosecurity policy, valued at $12.21M, would be a national movement restriction which limited the import of FCS products to WSSV-negative regions. The best state-level policy, valued at $4.62M, would be the establishment of a statewide SPF zone which required retailers to purchase SPF shrimp products from local Hawaii shrimp farmers. In light of the challenges of implementing a national movement restriction and current Hawaii SPF shrimp production levels, an SPF farm-zone on Oahu could be viewed as an efficient biosecurity policy that would increase the overall impact by $1.31M. Biosurveillance tradeoffs resulted in an increase in farm profit at the expense of retail losses.
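The policy valuations above come from comparing expected profits across the decision alternatives in the BDN. As a minimal illustrative sketch (the two-outcome structure and all probabilities and dollar figures below are hypothetical, not values from the dissertation's model), a policy can be valued by the expected-profit gain it yields over the status quo:

```python
# Hypothetical sketch of valuing a biosecurity policy by expected profit.
# The numbers below are illustrative only, not from the dissertation's BDN.

def expected_profit(p_outbreak, baseline_profit, outbreak_loss):
    """Expected combined profit ($M) given the probability of a WSSV outbreak."""
    return baseline_profit - p_outbreak * outbreak_loss

# Status quo vs. a movement restriction that lowers the outbreak probability.
status_quo = expected_profit(p_outbreak=0.25, baseline_profit=63.0, outbreak_loss=12.0)
with_restriction = expected_profit(p_outbreak=0.125, baseline_profit=63.0, outbreak_loss=12.0)

# A policy's value is its expected-profit gain over the status quo.
policy_value = with_restriction - status_quo
print(status_quo, with_restriction, policy_value)  # → 60.0 61.5 1.5
```

In the full BDN, a test (biosurveillance) decision is valued in the same spirit: by the expected gain from acting on the information the test provides.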
While the preliminary estimates based on the BDN were of interest, the main contribution of the research was a decision-theoretic framework for modeling and analyzing WSSV import risk. Several insights were gained throughout the WSSV import risk BDN model development. For example, the modeling process provided transparency into the issues affecting WSSV import risk, including tradeoffs and the pathways of farm WSSV exposure. A model risk analysis was conducted based on the results and insights drawn from the BDN simulation experiments. This first attempt toward developing a quantitative model of Hawaii's WSSV import risk has practical implications, since a scientifically based risk analysis is required by the World Organization for Animal Health (Office International des Epizooties, OIE) to justify biosecurity policies that could infringe on World Trade Organization Sanitary-Phytosanitary (WTO-SPS) agreements. The test-action biosecurity risk framework is a general approach to analyzing biosecurity problems that can be broken down into test and action decisions and where the pathways of exposure are known.
Chairperson: PingSun Leung
Author: Su-Chin Wu
The purpose of this study was to empirically develop and test a hypothesized theoretical causal model of differences in individuals' preference for asynchronous distributed learning (ADL). This individual differences learning model (IDL) explicates the causal relationships among gender, cultural backgrounds, learning styles, and attitudes toward collaborative learning that determine college students' ADL preferences. Previous studies have suggested that distributed learning provides a more positive environment for instruction designed to accommodate individual differences compared to traditional face-to-face (FtF) instruction (Ahuja, Caeley, & Galletta, 1997; Bates, 1994; Berge, 1999; Dede, 1995; Dede, 1996; Silvan, 1999). Two hundred and ninety-six college students from four universities in the United States and four universities in Japan and Taiwan participated in this study. The instrument was created in English and translated into Japanese and Chinese using the translation/back-translation method. Kolb's Learning Style Inventory (Version 3) was used as part of the survey, along with the ADL Preference Scale and Collaborative Learning (CL) Attitude Scale developed by the researcher. The final IDL model suggests that culture explains both collaborative learning and asynchronous distributed learning, and that the association between collaborative learning and asynchronous distributed learning may be spurious. Gender in this study proved not to be a factor in the IDL model and had no impact on ADL. Path analysis results showed that learning styles do not have statistically significant relationships with ADL preferences, although they do have moderate effects on ADL preferences. This study suggests that ADL environments need to be designed in ways that accommodate the cultural backgrounds and learning styles of students.
In conclusion, this study was the first to investigate individual learning differences in the ADL environment and made a significant contribution to learning by advancing our knowledge and understanding of how gender, culture, and learning styles impact ADL.
Chairperson: Dan J. Wedemeyer
Author: Gregory H Carlton
Computer forensics is a relatively new and rapidly growing field that addresses the use of computer data as evidence in legal proceedings. As a relatively new field of study, little empirical research has been conducted pertaining to computer forensics. This lack of empirical research contributes to problems for practitioners and academics alike.
For the community of practitioners, problems arise from the dilemma of applying scientific methods to legal matters based on anecdotal training methods, and the academic community is hampered by a lack of theory in this evolving field. This research study is designed to provide benefits to both communities by utilizing a multi-method approach to identify a protocol for practitioners and lay a foundation for academic theory development.
This research addresses the initial and most frequently performed phase of computer forensic examinations, data acquisition. Within the data acquisition phase, this research specifically studies the data acquisition of personal computers, the most frequently encountered target of forensic data acquisitions. A multi-method approach is utilized to identify, classify, and evaluate the tasks forensic examiners perform during forensic data acquisitions of personal computer workstations by building upon the framework of Nute's (1996) dissertation that established a scientific basis for forensic science.
The first phase of this study utilizes inductive research and is largely based on Grounded Theory (Glaser and Strauss, 1967) to empirically identify and classify tasks performed during forensic data acquisitions. The second phase of this study uses a discursive analytic strategy to evaluate the identified tasks by two review panels of experts. One review panel consists of technical experts and the other consists of legal experts.
A protocol is provided for the forensic data acquisition of personal computer workstations based on 103 tasks identified by practitioners and evaluated by experts. Each task is presented with expert panel merit ratings, examiner performance measures, and conditional performance measures. Eight constraints were identified that influence the degree to which practitioners perform the identified tasks.
The protocol provides measures not previously available to practitioners, and this study demonstrates the use of Grounded Theory for forensic protocol development.
Chairperson: Raymond R. Panko
Author: Peter Leong
Interaction, a ubiquitous term in the technology-mediated distance education literature, has been identified as a major construct in distance education research. The basic premise is that learners learn most effectively when actively engaged, as opposed to passively reading or listening. However, a major obstacle facing researchers studying interaction and interactivity is that these terms have not been clearly or functionally defined in ways that make the concept of interaction measurable and useful.
Researchers studying interactions in computer-mediated communication systems have used social presence theory to analyze interaction, communication, and collaborative learning. Social presence is defined as "the degree of salience of the other person in the interaction" (Short, Williams, & Christie, 1976, p. 65) and has been determined to be a strong predictor of satisfaction within a CMC environment. Another vein of research that may shed some light on interaction and satisfaction in online learning environments is cognitive absorption, which is derived from flow theory. Cognitive absorption is defined as "a state of deep involvement with software" (Agarwal & Karahanna, 2000).
This study empirically investigated the role of social presence and cognitive absorption in online learning environments. Specifically, it developed a hypothesized structural equation model (SEM) to explain the relationships among social presence, cognitive absorption, interest, and student satisfaction with online learning. Contrary to expectations, the study determined that social presence does not impact satisfaction directly; rather, its influence on student satisfaction is mediated by cognitive absorption. The study also clarified the impact of students' interest on social presence, cognitive absorption, and satisfaction: interest appears to influence satisfaction indirectly through social presence and cognitive absorption, although, contrary to expectations, the study did not reveal a significant relationship between interest and cognitive absorption.
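In path models of this kind, a mediated effect is conventionally quantified as the product of the path coefficients along the indirect route. The sketch below illustrates that arithmetic with invented coefficients; these numbers are not the study's estimates, only an illustration of how mediation through cognitive absorption would be computed.

```python
# Toy mediation arithmetic for a path model like the one described above.
# In SEM, the indirect effect of X on Y through a mediator M is the product
# of the X->M and M->Y path coefficients. All coefficients here are invented.

path_sp_to_ca  = 0.50  # social presence -> cognitive absorption
path_ca_to_sat = 0.60  # cognitive absorption -> satisfaction
path_sp_to_sat = 0.05  # direct path, near zero (mirroring the finding above)

indirect_effect = path_sp_to_ca * path_ca_to_sat  # the mediated influence
total_effect = path_sp_to_sat + indirect_effect   # direct + indirect
```

A near-zero direct path combined with a sizable indirect product is exactly the pattern summarized as "its impact is not direct but rather mediated."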
Findings of this study increase our understanding of online students' experiences and provide insights into the internal behaviors and psychological processes that underlie their perceptions of interactivity in online learning environments. Most significantly, this study advocates the need to consider both social presence and cognitive absorption simultaneously to better understand what constitutes an interactive, compelling online learning environment. Several general but practical recommendations are provided for online course designers to facilitate the occurrence of social presence and cognitive absorption and thereby increase student satisfaction with online learning environments.
Chairperson: Curtis P. Ho
Author: Sophea Chea
While emotion has been recognized to play a crucial role in decision making in marketing, organizational/industrial management, and psychology, in IS the roles of emotion are understudied. In particular, research on the acceptance and use of technology is dominated by the traditional cognitive paradigm. This study proposes and tests an affective extension of UTAUT (Unified Theory of Acceptance and Use of Technology). Two affective constructs, core affective experience (emotional state) and negative affectivity (emotional trait), were introduced to UTAUT to form an integrated model of acceptance and use of technology.
The objectives of this study are to validate UTAUT and to determine the roles of emotional state and emotional trait in technology acceptance and use. Specifically, what is the role of core affective experience in determining users' intention to use a technology? And what are the relationships between negative affectivity and other constructs in the model?
The proposed integrated model was tested in a pilot study and validated with two survey studies of the acceptance and use of an end-user computing technology in a classroom context. Most aspects of the model were confirmed for both experienced and inexperienced users. Core affective experience does have an impact on behavioral intention to use a technology beyond the effects of the cognitive variables in UTAUT. In particular, activation has a significant impact on inexperienced users' behavioral intention, while pleasantness is a significant determinant of experienced users' behavioral intention. The trait of negative affectivity is a significant antecedent to pleasantness: the higher the level of negative affectivity, the lower the reported experience of pleasantness. Furthermore, individuals with a higher negative affectivity trait may feel less supported by their organization in the use of a technology.
Thus, introducing core affective experience and negative affectivity into UTAUT to form an integrated model advances the theory and provides a useful tool for IS researchers and practitioners to better understand the likelihood of a new technology being accepted. Furthermore, understanding both cognitive and affective drivers of acceptance and use of a technology can help practitioners in designing a system, providing training intervention, and marketing of the system in a more effective way.
Chairperson: William E. Remus
Author: Michael Cress
Trust is an important element in business relationships. For e-commerce consumers, lack of trust has been identified as a major impediment to purchase intentions. This research replicates and extends Gefen, Karahanna and Straub's (2003a) article, "Trust and TAM in Online Shopping: An Integrated Model." Gefen et al. (2003a) combined the Technology Acceptance Model (TAM) described by Davis (1989) with trust in an attempt to model the factors that influence consumer e-commerce intentions. In their study, trust was theorized to be influenced by four distinct trust antecedents: calculative-based, structural assurances, situational normality, and familiarity. The main objectives of the present study are to extend the use of structural assurances to include the consumer's personal computer, and to increase the understanding of trust within an e-commerce context. Structural assurance is defined as the perception of success a person has in a situation based upon safety nets, guarantees, and other regulations that exist within that situation. Structural assurances have been shown to build trust levels in individuals (McKnight, Cummings and Chervany, 1998; Shapiro, 1987; Zucker, 1986). While many studies have shown the effectiveness of structural assurances used by web vendors, no studies have examined the importance of structural assurances on the consumer's personal computer.
This research aims to make several important practical and theoretical contributions. Practically, this study demonstrates that structural assurances play a greater role in determining trust than previously recognized, and that they can be used to assure all the parts of an e-commerce system about which consumers have uncertainty. Knowing this, web merchants and other third-party software providers can begin providing a wider variety of assurances to build consumer trust in e-commerce. Theoretical contributions include extending Gefen et al. (2003a) in several areas. First, a greater understanding of trust and its antecedents is gained by including e-commerce system aspects. Second, the variable perceived risk is introduced into the model as a mediator between trust and behavioral intention. This research provides a more thorough understanding of the variables a consumer faces in an e-commerce transaction.
Chairperson: Raymond R. Panko
Author: Siong Meng Tay
As corporations create geographically distributed operation centers (offices, warehouses, manufacturing facilities, etc.), multinational corporations maintain IT departments that span the globe to support distributed business ventures. IT (Information Technology) staff must provide support for the information systems that these centers of operations depend on, around the clock and around the globe. This provides an opportunity to study the phenomenon of the distributed IT support team.
This study examined how a distributed IT support team coordinates an IT support event among the team members in different locations, separated by multiple time zones and oceans. The IT characteristics and practices of two organizations as well as the communication pattern of IT support events are described and discussed.
The researcher found that the composition of a distributed IT support team extends beyond traditional boundaries to include the organization's IT support staff, IT support staff from different locations or offices, employees from other departments within the organization, and IT support staff from outsource vendors or suppliers. Each team member brings to the distributed team the resources associated with his or her own ties and networks. With established relationships and ties, IT support staff may not have to be everywhere; they may invoke the resources associated with those relationships and ties, and the presence of those resources may augment the presence of IT support.
The use of ICT does not lead to drastic changes in boundary-spanning practices (from highly collaborative to mostly objectified and transactive) when moderated by organizational factors such as IT leadership and policies.
An organization's creativity and resourcefulness may ease the hiring and retention of good staff in a tight labor market. Organizations have to monitor the welfare of their staff, especially those who work in a support role on a 24-by-7 schedule. Staff who are unhappy and dissatisfied with their work or environment will tend to be less productive and, in the end, will separate themselves from the organization.
Chairperson: Elizabeth J. Davidson
Author: David Pai
One of the most frequently used models for determining an individual's intention to use a technology has been the Technology Acceptance Model (TAM) (Davis, 1989). Research studies utilizing TAM have focused on cognitive and social-influence explanations of perceived ease of use and perceived usefulness. Research in individual psychology suggests that a holistic view may better define the mechanisms surrounding how an individual comes to perceive and accept a technological innovation.
Holistic research in information systems has revolved around studying the experiential state of "flow" or "cognitive absorption." These studies conflict in their definitions of this state, which leads to the questions: which, if any, of these dimensions are truly reflective of this state, and how do these dimensions affect technology perceptions? This study furthers our understanding of technology acceptance by defining the role that the dimensions and experiential state of cognitive absorption play in an individual's acceptance of a technological innovation.
Chairperson: William E. Remus
Author: Joyce Yukawa
This case study examines the learning processes in an online action research course facilitated by the researcher. Two graduate students in the Library and Information Science Program, University of Hawaii at Manoa, studied action research and applied their knowledge to independent research projects. The study's purpose was to examine the co-construction of knowledge and how affect and interaction influence participant understanding of action research. The online workspace was created using wiki-style collaborative software, with added email and chat programs.
Three key narratives were used to explicate learning as a holistic process: (a) a primary narrative focused on course learning objectives; (b) a reflection sub-narrative focused on unique learner outcomes within the course framework; and (c) a co-reflection sub-narrative focused on the co-construction of knowledge. Reflection (an individual critical thinking process) and co-reflection (an intersubjective critical thinking process) played key roles in learning action research. Co-reflection, an emerging concept, draws on individual reflection and involves four interactional characteristics. The different co-reflection narratives provide detailed records of the evolution of socially constructed knowledge and collaborative meaning making with affective, cognitive, and interactional dimensions.
The findings of this study indicate that online learning of action research is effectively supported by: (1) field-based, inquiry learning; (2) instructor understanding of the learners' backgrounds; (3) a learning philosophy that values constructivist learning, affect, relationship building, the development of self-efficacy, and empowerment; (4) online facilitation and mentoring skills; and (5) social software. The simple, flexible software tools effectively supported complex learning processes by allowing novice users to focus their learning efforts on course content rather than software features, and to adapt and augment learning and communication strategies from their face-to-face experiences.
Narrative analysis was used to interpret the data for three reasons: (1) the narrative is a basic form for making meaning from human experience; (2) the individual learners were unique in background, learning style, and goals; and (3) the social software encouraged users to adapt and innovate. Social software provided a record of the evolution of socially constructed knowledge. Narrative analysis offered a theoretical framework for elucidating the processes underlying that evolution.
Chairperson: Violet H. Harada
Author: Margaret Luo
The goal of this research is to develop and test a theoretical model of the effects of intrinsic and extrinsic motivations on user acceptance of Internet-based information services. The model, referred to as the integrated model of technology acceptance, is being developed with two major objectives. First, it should improve our understanding of user acceptance behavior, providing new theoretical insights into the successful design and implementation of Internet-based information services. Second, the integrated model should provide the theoretical basis for a practical system design and analysis approach that would enable practitioners to develop new information services or modify their current services.
For user acceptance to be viable, the model of user acceptance must be valid. The present research takes several steps toward establishing a valid motivational model of the user, and aims to provide the foundation for future research that will lead toward this end. Research steps taken in the present research include: (1) choosing U&G (Uses and Gratifications) theory, a well-studied theoretical approach from mass communication, to formulate an integrated technology acceptance model with TAM (Technology Acceptance Model); (2) developing and pre-testing the measures for the model’s factors in two pilot studies; (3) conducting two rounds of data collection and analyzing the data to show that the integrated model is applicable to the present context; (4) reviewing literature in both information systems and mass communication to demonstrate that empirical support exists for various elements of the proposed model; and (5) using an advanced statistical technique, structural equation modeling (SEM), to test the model’s structure.
The results confirm the proposed integrated model. The model posits that entertainment motivation is an important factor in determining the use of online services, in addition to behavioral intention as postulated by TAM. The integrated model also confirms that TAM’s belief constructs, perceived ease of use and perceived usefulness, are predictors of behavioral intention. It further argues that the level of use influences the degree of satisfaction. Satisfaction is a heavily studied construct due to its important role as an indicator of system success.
Chairperson: William E. Remus
Author: Randall Larsen
This abstract describes a case-study research project that investigates methods of conflict resolution between artists and copyright owners over artist rights issues in the production and distribution of motion pictures by Hollywood studios. The principal research methodology the project uses is the historical method. In order to more fully understand the history of conflicts involving artists' rights, the project traces concepts of intellectual property from conjectured prehistoric beginnings through recorded history to the present era. The research suggests that current Anglo-American intellectual property law is based on the romantic notion of the individual author and the individual copyright owner. As an incentive to cultural production, copyright law grants the individual author both an author's moral rights and an owner's property rights in his or her cultural production. The financing of a Hollywood motion picture generally requires that the cultural production be owned by a corporate owner instead of its artist creator. This research finds that the interests of the two parties often do not coincide; thus, conflicts arise between artist creators and the corporate copyright owner. The research is based on the view that culture is socially constructed. The research is also informed by three other principal theoretical bases. The first is a theory of articulation. The second is Gramsci's theory of hegemony, which suggests that a dominant group exercises power over another group by shaping the world-view of that group with a particular ideology (in this case, the ideology inherent in intellectual property law). The third theoretical basis of this research is the theory of intertextuality, which suggests that artists in constructing motion pictures must refer directly or indirectly to other motion pictures and to other texts or meaning systems.
This theory-building research explores the broad research question "how are conflicts resolved between artists and copyright owners surrounding the issue of artist's rights?" The research suggests that the "author" ideology inherent in intellectual property law is used as a hegemonic discourse to resolve conflicts in favor of the commercial interests of dominant corporate intellectual property rights holders in preference to the rights of artists and the information-seeking public.
Chairperson: Majid Tehranian
Author: Steve Takaki
Spreadsheet programs are widely used in business and government. Unfortunately, there is strong evidence that many spreadsheets contain errors. In spite of the importance of spreadsheets in decision-making, studies have shown consistently that end-user spreadsheet developers rarely test their models thoroughly after development in the manner that professional programmers test software.
One contributing factor to both error rates and the lack of post development testing may be that spreadsheet developers are overconfident in the correctness of their spreadsheets. Overconfidence is a widespread human tendency, and it has been demonstrated among spreadsheet developers. When people are overconfident, their "stopping rules" for error detection during and after development may be premature, causing them to stop checking before they should. This may contribute to the number of errors.
At the same time, a closely related research construct is self-efficacy; high self-efficacy has been shown to be positively related to computer task performance, including spreadsheet performance (although not specifically to error-reduction performance).
This research found that people with high self-efficacy and high confidence make fewer errors than those with low self-efficacy and high confidence. Also, a "think-aloud" protocol analysis of a subset of subjects revealed a lack of system design and analysis effort and a minimal amount of testing during the development of spreadsheet tasks.
Chairperson: Raymond R. Panko
Author: Soussan D Djamasbi
Recent developments in information and communication technology have made it possible to provide managers with large amounts of information. Although information technology has been instrumental in improving the access and flow of information, it has also been instrumental in creating an overload of this same information for businesses and organizations. Consequently, the problem of information overload and ways to manage it have been the focus of a great number of studies in the MIS literature. A large body of the studies that examine the effects of information overload view decision makers as rational actors who process information inputs into decision outputs and whose performance is constrained by their cognitive structure. Recent psychological investigations have shown that affective states such as positive mood can regularly and significantly influence and enhance one's cognitive structure and flexibility. In light of these psychological studies, it is reasonable to believe that mood may influence the effects of information overload. That is, it is reasonable to expect that the performance of people in a positive mood will be better than that of their control counterparts under conditions of information overload. Thus, this study extends prior work on information overload by establishing mood as an important variable in the existing models.
This study is conducted in two parts. In Part I, the effects of positive mood on cue utilization and accuracy of the judgments using a Decision Support System (DSS) are investigated. Part II of this study extends the investigations of Part I (i.e. the effects of positive mood on cue utilization and judgmental accuracy) to include information overload. To do this, the impact of positive mood on cue utilization and judgment accuracy using a DSS under a baseline load level and an overload level is examined.
Chairperson: William E. Remus
Author: Jenifer Sunrise Winter
Ubiquitous Networked Computing (UNC) is an emerging environment encompassing future developments in the areas of Pervasive Computing, Mobile Computing, and Ubiquitous Computing (e.g., Weiser, 1991). This research sought to enhance policy decision making by identifying and assessing emerging problems related to UNC in Hawaii over a twenty-year time frame. This study also investigated differences in problem assessment between information technology specialists and non-specialists.
A six-phase methodological process employing scenario planning, electronic focus groups, and problem assessment surveys was developed to investigate perceptions about emerging problems. Specialists and non-specialists generated eighty unique problem statements and additional members from each group assessed the relative importance of these statements. Specialists further assessed a subset of 24 statements according to four problem criteria adapted from previous research by the Center for the Study of Social Policy (1977).
Non-specialists participating in the electronic focus groups expressed distinct and different concerns from the specialists. Further, both groups found the statements generated by non-specialists to be valuable contributions, arguing for their inclusion in the process of problem identification. Mann-Whitney U tests (p < .05) identified significant between-group differences in assessment for 41 of the 80 problem statements. Analysis of between-group differences suggests that specialists share a frame of reference focused on addressing near-term obstacles to the growth of high-technology industries within Hawaii. Non-specialists expressed greater concern for longer-term human-centered issues, particularly those related to control of the process of technological development.
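The Mann-Whitney U statistic behind these between-group comparisons is a rank-based measure: pool both groups' ratings, rank them, and compare rank sums. A minimal pure-Python sketch follows; the rating data are invented for illustration and are not the study's responses (a statistics package would also supply the p-value, which this sketch omits).

```python
# Minimal Mann-Whitney U statistic on invented importance ratings (1-5),
# using midranks for tied values. Illustrative only; not the study's data.

def mann_whitney_u(sample_a, sample_b):
    """Return the U statistic for sample_a versus sample_b."""
    combined = sorted(sample_a + sample_b)

    def midrank(v):
        # Average of the 1-based positions a tied value occupies.
        first = combined.index(v) + 1
        count = combined.count(v)
        return first + (count - 1) / 2

    rank_sum_a = sum(midrank(v) for v in sample_a)
    n_a = len(sample_a)
    return rank_sum_a - n_a * (n_a + 1) / 2

# Hypothetical ratings of one problem statement by each group:
specialists     = [2, 3, 3, 4, 2]
non_specialists = [4, 5, 4, 5, 3]
u = mann_whitney_u(specialists, non_specialists)  # small U -> groups differ
```

A U near zero (or near its maximum, the product of the two sample sizes) indicates little overlap between the groups' ratings, which is the pattern that produced the significant differences reported above.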
This research contributes a framework that extends current knowledge of potential emerging problems related to UNC. The methodological process can be applied in other content areas or regions. Ranked lists of problem criticality assessments by each group and by individual problem criteria were created. Analysis yielded three policy indices intended to assist decision-makers in directing limited resources toward problems that may yield the most substantial long-term return on investment. Further, this research contributes to an understanding of the opinions of diverse stakeholders and how identifying differences between them may be an effective means of recognizing emerging problems. Examination of these differences can initiate future-oriented social negotiation involving multiple perspectives, leading to a more human-centered implementation of technology.
Chairperson: Dan J. Wedemeyer
Author: Rita Michele Vick
Decision making is an inherent part of everyday work and learning processes. Superior decision outcomes can be achieved by structuring decision processes, encouraging domain experts to work collaboratively, providing visualization of decisions as they develop, and providing decision makers with time and flexibility to better understand problems and to project outcomes. Evaluation of distributed synchronous virtual teamwork environments has eluded researchers. The theoretical foundation of this study was Adaptive Structuration Theory (AST) enhanced by a distributed cognition framework. Discourse analysis was used to explore ways to evaluate effectiveness of newly-formed time-constrained self-directed virtual teams using computer-mediated communication (CMC) to solve ill-defined problems. Measures of work process performance were percentages of meeting time devoted to Situation Assessment, Resource Coordination, Idea Generation, and Model Building. Ten measures of work outcome for each of six teams were taken to assess change in decision model quality over time. The data informing this study were obtained during an elective computer science course. The author's course design focused on human-computer interaction (HCI) aspects of use, design, and deployment of computer-supported collaborative work (CSCW) and computer-supported collaborative learning (CSCL) systems. Participants were randomly assigned to teams that remained intact throughout the semester. Teams assumed various roles during policy and software-design scenarios. Networked TeamEC(TM) decision-modeling software enabled team problem solving. NetMeeting provided connectivity, application sharing, and text chat for intra-team communication to simulate distributed virtual meetings. Discourse analysis revealed process performance patterns and development of shared mental models of problem solutions. 
The outcome variable (Model Score) improved over time for all teams, but degree of improvement varied greatly among teams. Qualitative analysis of group process variables indicated variance was due to how well teams understood scenario-role requirements and managed available resources. Time usage by process variable was analyzed to measure critical resource use to discover "best practice" guidelines for distributed synchronous teamwork. A Naturalistic Decision Making (NDM) approach extended collaborative experiential learning to complex applied knowledge domains in order to improve problem solving and critical thinking skills. Constructivist learner-centered course design facilitated a clear task focus enabling participants to learn new work practices applicable to classroom and workplace.
Chairperson: Martha E. Crosby
Author: Yun Du
Language education offered via distance education faces increasing demand from potential learners as well as from institutions that offer language courses. Due to the recent trend of globalization, more people need to learn foreign languages or improve their language skills in order to work in an international environment. Resources for learning languages, however, are often limited.
One solution for potential students who cannot find language classes in their institutions or who are not able to take regular classroom-based classes is to take distance language classes offered by other institutions via computer. Prior to the recent growth of the Internet and the World Wide Web (WWW), computers were not easily accessible. Recently, however, the WWW has provided a convenient way for institutions to reach considerably more potential language learners. Although the WWW is becoming a valuable resource for language learning, research on developing Web-based environments for teaching and learning language has been rare. Current practices are limited from both instructional and technical perspectives.
This dissertation introduces a study on developing a Web-based language learning environment for distance language education. The study proposes a model for developing a Web-based environment for distance language teaching and learning. The model combines the different factors that influence the development of such an environment and lists the basic components needed to support Web-based teaching and learning activities. This model provides a common platform for experts from different fields to collaborate on the development of language learning environments. The language learning environment developed in this study provides functions that not only support regular classroom operations but also take advantage of the WWW to offer more efficient teaching and learning opportunities. Integrated with a specific instructional model for language learning, this learning environment demonstrates a unique approach to building learning systems on pedagogical foundations. This dissertation describes a field study of a Chinese class that used the developed system. Results from this study not only provide insights on how students and instructors used the Web-based language learning environment but also offer suggestions on the future development of Web-based collaborative environments.
Chairperson: Martha E. Crosby
Author: Justin M. W. Goo
One of the problems documented in behavioral research and the cognitive sciences is user overconfidence. Overconfidence occurs when a user's predicted degree of success exceeds the degree of success the user actually achieves. Some studies suggest that systematic feedback would decrease users' overconfidence. This study was designed to determine if direct feedback on spreadsheet developers' previous performance would have an effect on their subsequent performance. The hypothesis was that such feedback would increase both their performance and their confidence calibration.
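The overconfidence measure described above can be sketched in a few lines (a hypothetical illustration; the names and scoring scale are assumptions, not the instrument used in the study):

```python
def overconfidence(predicted_score, actual_score):
    """Overconfidence: predicted degree of success minus the degree actually achieved.

    Positive values indicate overconfidence; zero indicates perfect calibration.
    Scores are assumed to be on a common scale, e.g. percentage of tasks correct.
    """
    return predicted_score - actual_score
```

For example, a developer who predicts a score of 90 but achieves 60 has an overconfidence of 30, while matching prediction and outcome yields zero.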
In this dissertation, an experiment has been conducted, utilizing performance feedback to study overconfidence and performance changes in spreadsheets. In the first two weeks of November 2001, 193 subjects were given three spreadsheet tasks and questionnaires that measured their confidence levels before and after each task. An experiment group was given feedback on their performance after each task, while the control group was given no such treatment.
At the conclusion of this experiment, the results indicated that there was not enough statistical evidence to reject the main null hypotheses. However, while the experiment did not provide the anticipated results, it was not without relevant findings. It was found that: (1) statistically insignificant changes did occur in the predicted direction, (2) subpopulations respond differently to treatment and some do respond as hypothesized, (3) while certain stereotypical gender-based differences are dispelled, others are supported, and (4) confidence calibration in the spreadsheet development subject domain may be affected with appropriate stimuli.
The results indicate that feedback of this form is not sufficiently effective to improve overall spreadsheet development. Because breakout groups reacted differently to the treatment, different training and development techniques may be warranted for different cross sections of the population. The few noteworthy observations suggest avenues of further research.
Chairperson: Raymond R. Panko
Author: Sonja Wiley-Patton
Information technology (IT) has become pervasive in the healthcare industry. Many view the Internet as a strategic healthcare tool. The Medical Records Institute suggests that Internet-based health applications (IHA), for example, electronic health records, e-prescribing, and mobile health, are the goals of most healthcare organizations (2002). The use of the Internet for electronic medical records, e-billing, and patient scheduling can enable the healthcare industry to reduce its inefficiencies and errors in care delivery (HIMSS/IBM Leadership Survey, 2000). While the use of IT in healthcare has increased tremendously, key players, specifically physicians, still have not fully embraced the valuable resource of the Internet.
Despite the purported advantages of IT investments in healthcare, many doctors do not widely use Internet-based health applications in their clinical practices. Physicians often misunderstand the functions and full potential of the Internet (Wang & Song, 1997). The Health & Health Care 2010 report states that less than 5% of physicians use computers to record all clinical information for an average patient.
The present study examined physicians' intentions to adopt Internet-based health applications for use in their clinical practices. This research reports on the test-retest reliability of the extended Technology Acceptance Model (TAM2; Venkatesh & Davis, 2000).
Data were collected from a survey of pediatricians to evaluate the effectiveness and appropriateness of the model in the medical environment. Results from the study indicate that TAM2 is appropriate but not completely applicable to the unique characteristics of physicians. The test-retest indicated reliable results with the exception of the result demonstrability construct. The results of multiple regression analyses indicated that perceived ease of use was not significant in predicting physicians' behavioral intentions in this study. As theorized, the primary predictor variable, perceived usefulness, was a strong determinant of intention to use. Results indicate that physicians tend to be pragmatic in their IT acceptance decisions, focusing more on the technology's usefulness than on its ease of use.
This dissertation discusses the implications, limitations and presents possible explanations for the inconsistencies within the extended technology acceptance model when it is applied to a professional group not commonly examined in IS research.
Chairperson: William G. Chismar
Author: Mauricio Sanchez Featherman
The dissertation extended the Technology Acceptance Model 2 (TAM2; Venkatesh & Davis, 2000) to include a measure of negative utility, perceived risk of service usage. It applied TAM2 to the e-payments research context for the first time. TAM2 is an evaluation model based on previous attitude-intentions literature, tuned for information systems research. It has been successfully utilized to measure the perceived positive gain in utility possible from adoption of an information system.
Valence models (Lewin, 1944; Fishbein, 1967; Peter & Tarpey, 1976) base evaluations on both desirable benefits (with positive valence) and undesirable costs (with negative valence). These models propose that if the benefits outweigh the costs, resulting in a net positive valence, the purchase will tend to be made.
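The net-valence evaluation described above can be sketched as follows (a minimal illustration; the function and variable names are assumptions, not taken from the cited works):

```python
def net_valence(benefit_valences, cost_valences):
    """Net valence: sum of positively valued benefits minus sum of costs."""
    return sum(benefit_valences) - sum(cost_valences)

def purchase_likely(benefit_valences, cost_valences):
    # Valence models propose the purchase tends to be made
    # when the net valence is positive.
    return net_valence(benefit_valences, cost_valences) > 0
```

For instance, benefits valued at 3 and 2 against costs of 1 and 1 yield a net valence of 3, so the purchase would tend to be made.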
The dissertation provided initial evidence for the validity of including a measure of negative valence (a potential cost of system adoption) into TAM2. Several operationalizations of the construct product category inherent perceived risk were tested for construct validity within the TAM nomological net.
Results indicated that perceived risk inhibited consumers' adoption intentions as well as their perceptions of the usability and usefulness of an e-billpay software service. Perceived usage risk also proved useful when used as a categorical variable to segment the sample. Different antecedents and inhibitors of perceived usefulness and adoption intention were found for low- and high-risk perceivers.
Linear regression, structural modeling, and ANOVA results were used to investigate the research questions and fit of perceived risk within the research model.
Chairperson: Raymond R. Panko
Author: Candace Chien-Tzu Chou
Interaction research provides important information on student behaviors in distance-learning environments to educators, researchers, and instructional designers. The current state of interaction research has focused mostly on the quantitative results of interconnected messages in a teacher-centered learning environment. This study examines interaction patterns at both interpersonal and system levels in a learner-centered distance-learning environment. The research focuses on factors that affect interaction from three areas: learning activities, technology attributes, and learner differences. At the system level, student perceptions of both synchronous and asynchronous computer-mediated communication (CMC) systems and their relationship with interaction are investigated. At the interpersonal level, patterns of learner-learner interaction over both communication modes are compared and contrasted. Furthermore, the overall effects of various theory-based instructional activities on learner interaction are also scrutinized. Results of the research suggest that constructivist-based instructional activities, such as student-moderated discussion and small-group cooperative learning, are conducive to interaction. The appropriate employment of a synchronous online seminar can enhance interpersonal relations and community building. In general, 79% of the computer-mediated discussions were devoted to task-oriented interaction and 21% to social-emotional-oriented interaction. Nevertheless, a higher percentage of social-emotional interactions occurs in synchronous mode than in asynchronous mode. Furthermore, positive student ratings of a CMC system rise as the frequency of use increases. Student perceptions of technology affect the rate of adoption of the CMC systems only at the beginning stage. Four stages of student adoption of technology are observed. The connection between learner differences and interaction is observed in gender differences.
Female students tend to spend more time in social-emotional-oriented interaction than male students. A model of learner-centered computer-mediated interaction for collaborative learning is proposed to explain the factors that could affect interaction. Recommendations on the design of instructional activities and interactive interfaces are also made for the improvement of distance-learning environments.
Chairperson: Martha E. Crosby
Author: Sungwon Cho
Electronic marketplaces are one of the fastest growing areas in electronic commerce (EC). However, current electronic marketplaces support only limited negotiation functions using a one-dimensional variable of price, leaving complex business negotiations to be done in traditional ways.
To provide automated negotiation services in EC, this dissertation introduces an innovative trading mechanism for electronic marketplaces and presents a formal methodology to evaluate it. It presents an evaluation framework to analyze the current status of electronic marketplaces. In addition, it introduces an online Multidimensional Auction Mechanism with Enhanced Negotiation Support (MAMENS) that provides automated one-to-many negotiation support in EC. The MAMENS system is designed with two innovative features to improve negotiation support: the use of sellers' feedback and a post-utility scoring method.
To test the efficacy of the proposed mechanism, a computational platform for computer simulation was developed. The computer simulation results demonstrate that the two main features of the MAMENS system lead to a better bargaining outcome than a mechanism without them. The simulation also shows that buyers can achieve faster convergence as well as higher utility with the use of sellers' feedback. The findings of this dissertation will facilitate an understanding of the effectiveness of online multidimensional auction systems among practitioners and academics.
Chairperson: Martha E. Crosby
Author: Jeng-Chung Chen
Computer and Internet use in organizations has grown exponentially in recent years, as has the installation of software that monitors this use. The monitoring of employee computer use is controversial and raises important issues of trust and privacy. This study investigates the effects of personal psychological preferences, organizational ethical climates, and personal monitoring software knowledge (exogenous variables) on trust and privacy (endogenous variables) and proposes a causal model of these effects.
Attendees of the Pacific Telecommunications Council 2001, held in Honolulu, Hawaii in January 2001, were asked to fill out a survey. There were a total of 247 acceptable responses. The questionnaire was composed of seven constructs: independent and interdependent self-construals, benevolence and self-interest work climates, trust in the supervisor, monitoring software knowledge, and privacy concern. Five of the seven constructs had been developed and tested in previous research; two were developed by the researcher.
Through confirmatory factor analyses, the number of questionnaire items was reduced from thirty-three to twenty-one. An initial theoretical model with seven latent factors and twenty-one manifest variables went through a series of model modifications that resulted in a final model with six latent factors and eighteen manifest variables; the factor monitoring software knowledge was deleted. Results show that independent self-construal, interdependent self-construal, benevolence work climate, and self-interest work climate have direct effects on trust in the supervisor and indirect effects on privacy concern through trust in the supervisor. Both independent self-construal and trust in the supervisor have direct effects on privacy concern.
Chairperson: Rebecca Knuth
Author: Carleton Allen Moore
Software developers work too hard and yet do not get enough done. Developing high quality software efficiently and consistently is a very difficult problem. Developers and managers have tried many different solutions to address this problem. Recently their focus has shifted from the software organization to the individual software developer. For example, the Personal Software Process incorporates many of the previous solutions while focusing on the individual software developer.
This thesis presents the Leap toolkit, which combines ideas from prior research on the Personal Software Process, Formal Technical Review and my experiences building automated support for software engineering activities. The Leap toolkit is intended to help individuals in their efforts to improve their development capabilities. Since it is a light-weight, flexible, powerful, and private tool, it provides a novel way for developers to gain valuable insight into their own development process. The Leap toolkit also addresses many measurement and data issues involved with recording any software development process.
The main thesis of this work is that the Leap toolkit provides a novel tool that allows developers and researchers to collect and analyze software engineering data. To investigate some of the issues of data collection and analysis, I conducted a case study of 16 graduate students in an advanced software engineering course at the University of Hawaii, Manoa. The case study investigated: (1) the relationship between the Leap toolkit's time collection tools and "collection stage" errors; and (2) different time estimation techniques supported by the Leap toolkit.
The major contributions of this research include (1) the LEAP design philosophy; (2) the Leap toolkit, which is a novel tool for individual developer improvement and software engineering research; and (3) the insights from the case study about collection overhead, collection error, and project estimation.
Chairperson: Phillip M. Johnson
Author: Sunyeen Pai
The use of business-to-business (B2B) information technology is one of the fastest growing areas in electronic commerce. There is tremendous optimism regarding Internet B2B and its advantages. Millions are being invested in development.
B2B technologies have not always been successful, however. Despite the tremendous confidence placed in traditional electronic data interchange (EDI), the technology did not penetrate the small and medium enterprise (SME) sector. This was a critical failure, as SMEs are the essential suppliers and/or distributors in all industries.
This study looks at small business supplier adoption of Internet B2B. It uses EDI adoption research and examines Internet B2B from the small business user's point of view. It examines adoption in Hawaii, a geographically isolated economy, and develops policy implications regarding IT adoption.
Working from the theories of diffusion of innovation, network externalities, critical mass, inter-organizational relationships, and general systems, a dynamic simulation model is developed. Several experts examine and validate the model. Scenarios are used to forecast adoption behavior under different circumstances.
The analysis reveals that small suppliers will adopt Internet B2B technologies more rapidly and at a higher rate than traditional EDI due to improved expected intangible and tangible benefit to cost comparisons. Small suppliers in Hawaii will adopt at lower rates than small suppliers nationally. The model tests the effects of two trends, outside online competition and prime contracting. Outside online competition will boost adoption in Hawaii, whereas an increase in prime contracting will result in lower adoption rates.
One of the critical insights the model produces is that independent variables have different effects at different times during the innovation decision process. Partner pressure is important for awareness. Expected tangible and intangible benefit-to-cost comparisons fuel the adoption decision. Post-adoption tangible and intangible benefit-to-cost comparisons determine whether or not the pool of adopters remains constant or drops off.
This study incorporates major diffusion theories in a dynamic model that reveals new key insights regarding the diffusion process. An understanding of variable interaction, the critical timing of variable influence, and the importance of adopter feedback will enable researchers and policy makers to understand, facilitate, and forecast technology adoption behaviors.
Chairperson: William G. Chismar
Author: Asanga Porage
Customers are increasingly demanding products and services that satisfy their specific needs. Mass customization is the mass production of individually tailored products that satisfy the specific needs of each customer. It involves the acquisition and satisfaction of customer requirements. Product customization is costly because each customer requires individualized attention. There is a need for intelligent software that identifies possible customer preferences and then assembles products based on the identified preferences. This study provides a framework and an algorithm for the interactive customization of products and services (Iona).
The framework consists of the following functions: acquisition, assessment, elimination, selection, and explanation/description. The following information must be acquired: (1) absolute/preferred constraints (product component specifications considered most important), (2) categorical preferences, (3) stereotype (customer type), and (4) context (purchase situation). Assessment consists of the following: (1) deciding which choices violate the absolute/preferred constraints, the categorical constraints, and the constraints inherent in the product (binary constraints between the different parts of the product and unary constraints such as availability), and (2) calculating the multi-attribute utility (usefulness to the customer based on the attribute levels) of the remaining choices. Choices that violate the constraints are eliminated to arrive at a basic solution. A selection is made for each part of the product using a binary integer programming model after choice utilities are estimated at an acceptable level of certainty. Questions regarding stereotypes/contexts are asked of the user to refine the utility estimates of the choices. The query generated is based on the user's possible membership in the stereotype/context and the probable improvement in the utility estimates. A choice is selected for each part of the product when the usefulness of asking a query is less than the cost of querying. When providing categorical preferences, descriptions of the discriminating attributes are given for users unfamiliar with the product. The final solution is explained, emphasizing the attributes considered important based on the user model. The algorithm is applied to the domain of travel planning. A prototype (Travel Planner) demonstrates the feasibility of the framework and the algorithm.
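The eliminate-then-select step described above can be sketched roughly as follows (a hypothetical simplification: a greedy per-part maximization stands in for the binary integer programming model, and all names are assumptions, not the Iona implementation):

```python
def customize(parts, choices, constraints, utility):
    """For each part of the product, eliminate choices that violate any
    constraint, then select the highest-utility remaining choice."""
    solution = {}
    for part in parts:
        # Elimination: keep only choices satisfying every constraint predicate.
        feasible = [c for c in choices[part]
                    if all(satisfies(part, c) for satisfies in constraints)]
        if feasible:  # a basic solution exists for this part
            # Selection: maximize the (estimated) multi-attribute utility.
            solution[part] = max(feasible, key=lambda c: utility(part, c))
    return solution
```

In a travel-planning illustration, a constraint ruling out luxury hotels would eliminate that choice before the utility-maximizing selection is made for the hotel part.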
Chairperson: David N. Chin
Author: Koji Takeda
MERA is a diagrammatic language system designed to enhance client-designer communication in software development. It realizes comprehensive support by providing a uniform language scheme, client support functions, and software engineering tools. The language scheme uses a uniform syntax to reduce the learning burden for diverse diagram notations. The same language scheme is also used to define new diagram types. Therefore, the same diagram tools used to create and analyze an end diagram are also used to create and analyze the language definition itself. MERA has several language features designed to enhance client-designer communication, including separation of view data, animation of diagrams, stereotype models, methodology models, and a common data format. The separation of view data from definitional data increases the flexibility of diagram display so that a diagram can be adapted to suit a client's preferences. The animation of diagrams enhances the understandability of abstract models. The stereotype model serves as a template diagram for particular classes of software and implements storage and reuse of domain knowledge. The methodology model guides the process of constructing the diagram to assure completeness and consistency. The common data format enables exchange of diagrams across heterogeneous computer environments. These client support features help to increase the usability of the diagrams and reduce some of the communication problems when using MERA diagrams. Additional software engineering tools support the use of MERA diagrams in the context of software development. The design of MERA and associated tools was developed incrementally and tested through actual use in more than seven major software development projects over a period of ten years.
Chairperson: Isao Miyamoto
Author: David K. Lassner
Global telecommunications standards have become much more important in the technologically advancing, increasingly internationalized, and structurally evolving telecommunications environment of the world today. During the past two decades, as telecommunications has transformed technically and structurally, the traditional standardization regime has been stressed by new expectations and pressures while still encumbered by traditional working methods and policies. Organizations such as the International Telecommunication Union and the International Organization for Standardization have been increasingly perceived as unable to create the standards needed with the speed required by the marketplace.
The restructuring of the telecommunications sectors within Europe, Japan and the U.S. brought with it the emergence of regional telecommunications standardization organizations. The rising demand for market-based standards is also being increasingly met by consortia of providers concerned with the interoperability of the new products and services they bring to market. And the explosion of the Internet upon the scene has brought the TCP/IP standards, created by the grass-roots Internet Engineering Task Force, into the forefront of standardization.
There has been a widespread recognition within the standardization regime that more needs to be done to involve users, who create the markets for products and services based on the standards developed. But participation by users, including the non-dominant economies of the world which represent huge emerging markets, has been minimal. These non-participating countries hold formal membership in the organizations responsible for standardization but have not been active in the process. The movement of standardization to fora outside the formal global arena has the potential to further institutionalize the limited involvement of these non-dominant economies, as they do not even hold membership in the regional standards organizations or industrial consortia.
This research reconciles the actual concerns within three non-participating nations of Southeast Asia with the literature and activities in progress. It concludes with a recommendation for greater regional cooperation in telecommunications standardization within the Asia-Pacific region.
Chairperson: Dan J. Wedemeyer
Author: John Locke
This dissertation sets forth the foundations of a communication-based paradigm for the study and augmentation of collaboration in creative ecosystems. The objective of this dissertation was to observe what participants actually do in creative collaboration practice as a basis for understanding their activity and defining appropriate design requirements for the development of augmentation environments (support systems) supporting those activities.
The dissertation develops and presents:
* an organic theoretical model of information ecologies;
* an original observational methodology;
* a multilayered, non-linear research model for tracking and analyzing the organizational and orientational dynamics of collaborative workgroups;
* an analytical framework to structure the observation of activity in multiple information spaces (e.g. personal space, personal media space, shared space, shared media space, etc.);
* a representational technique for synchronously and contextually orienting observational data;
* an original computer tool, medium and environment for collaborative interaction analysis research.
A sample study was conducted using Digitally Integrated Videographic Analysis techniques, an observational methodology developed by the author for this dissertation. Media design tasks in support of pro-social, entertainment education communication campaigns were utilized to explore creative, social, and interactional dynamics in contrast to decision-making studies. Videographic observational data were collected for eight collaborative workgroups, each consisting of four participants. Workgroups were organized around four fundamental media types: text, audio, video, and graphics. Multimedia data (digital text, audio, video, graphics) from one session (graphic design) were analyzed in detail utilizing the prototype MediaLink Analytical and Archival Environment.
Chairperson: Dan J. Wedemeyer
Author: LeAnn Garrett
Authority control and its influence on information retrieval, as measured by recall and precision, was studied. Three online bibliographic catalogs were constructed. The first catalog used only medical subject headings (MeSH), the second used a 50% combination of MeSH and Library of Congress subject headings (LCSH), and the third used only LCSH. Subject word and keyword searches (in all indexed fields) were performed with the medical subject headings used to develop the catalogs. The average values for recall and precision were then calculated. Using Chebychev's Inequality, 96% confidence intervals were computed to carry out the required hypothesis testing. Results indicate that recall is greater in the MeSH authority controlled catalog for both subject word and keyword searches. Precision is also greater in the MeSH authority controlled catalog for subject word searches. Keyword searches, however, introduce a decrease in precision at the .08 significance level and are no better than searches in the catalogs not using MeSH authority control. The theoretically optimal search of recall = 1.0 and precision = 1.0 was demonstrated in the subject word search in the MeSH authority controlled catalog. When comparing the two non-MeSH authority controlled catalogs, recall is greater at the .08 significance level in the catalog that used a 50% combination of MeSH and LCSH than in the catalog that was indexed only with LCSH. There are no statistically significant results in recall between subject word searches and keyword searches in either of the catalogs not using MeSH authority control. Keyword searching, therefore, cannot be relied on to increase recall. There are no statistically significant results in precision between the two non-MeSH authority controlled catalogs or between subject word and keyword searches. Authority control in online bibliographic catalogs is recommended.
In addition to subject word searches, searches in authority controlled fields using authorized headings are recommended as the primary search option.
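The recall and precision measures used throughout this study can be sketched as follows (a minimal illustration with assumed names; the actual computation over the three catalogs is as described above):

```python
def recall_precision(retrieved, relevant):
    """Recall: fraction of the relevant documents that were retrieved.
    Precision: fraction of the retrieved documents that are relevant."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant  # relevant documents actually retrieved
    recall = len(hits) / len(relevant) if relevant else 0.0
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    return recall, precision
```

The theoretically optimal search mentioned above corresponds to retrieving exactly the relevant set, which yields recall = 1.0 and precision = 1.0.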
Chairperson: Larry N. Osborne
Author: Lai Kuen So
This dissertation describes the design and implementation of an automatic code generation system, named SPNACG, that produces simulation programs from Petri net models of discrete event systems. The high-level Simulation Based Petri Nets (SBPNs) we adopted are general enough to allow systems from a broad range of application areas to be modeled. These SBPNs adhere to common conventions but in addition are labeled in a fashion that supports transition localization.
Transition localization refers to the property of a Petri net whereby its operation can be modeled by a set of independent transition rules, each transition being independent of all other transitions, with any interrelationships necessitating the use of intermediary place nodes. It is this property that permits the representation of transitions as independent columns in a table, and then, from a tabular representation of the Petri net, the generation of code one column at a time. We have also developed a tabular data structure representation which, along with an adaptation of decision table processing techniques, makes it possible to generate simulation code in a variety of target languages automatically without the restrictive assumptions typical of other approaches.
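Transition localization as described above can be sketched as an independent firing rule per transition (a hypothetical illustration; the names and marking representation are assumptions, not SPNACG's internals):

```python
def fire(marking, transition):
    """Attempt to fire one localized transition.

    A transition is a pair (inputs, outputs) of {place: token_count} dicts.
    Because it depends only on its own input and output places, each
    transition can be treated as an independent column of a table and
    translated to code one column at a time.
    """
    inputs, outputs = transition
    if any(marking.get(p, 0) < n for p, n in inputs.items()):
        return None  # transition not enabled under this marking
    new_marking = dict(marking)
    for p, n in inputs.items():
        new_marking[p] -= n          # consume tokens from input places
    for p, n in outputs.items():
        new_marking[p] = new_marking.get(p, 0) + n  # deposit to outputs
    return new_marking
```

Any coupling between two such rules must flow through a shared place in the marking, which is what makes the per-column code generation possible.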
Contrasts between existing simulation tools and SPNACG are made with particular focus on automatic code generation. Benefits offered by SPNACG include high generality, ease of modeling and applicability to both transaction-based and control-based systems. Contributions of our work to the areas of decision systems, Petri net modeling, and discrete-event simulation are discussed.
Two specific target languages (GPSS and SIMSCRIPT) were chosen for actual implementations of SPNACG. These two languages are representative of the main classes of simulation languages, process-oriented and event-scheduling. Programs generated from SPNACG were validated by a comparison of simulation output with analytical results. The problems associated with handling both transaction-based and control-based applications, and how different target language facilities affect their resolution, are discussed.
Chairperson: Art Y. Lew
Author: Jerome B. Heath
The theories that relate to the acceptance of technology tend to discuss the issues of media, leadership and networking. Although these processes are important in the acceptance of technology, it is proposed that epistemologies as measured by belief systems or mindscapes also play an important part in the acceptance of technology.
To determine these relationships I administered a survey to local students that compared their claimed use of technology with the results of the Harvey/Gore Belief System test. For comparison with other factors I included the Inkeles and Smith Modernity Scale, which measures interest in media, acceptance of newness and new people, and concern about public issues. I also asked for opinions about a group of pictorial questions that, it was hoped, would demonstrate a relationship to mindscapes or belief systems. Maruyama has been developing pictorial tests in order to represent mindscapes.
The survey also asked a series of demographic questions, especially ones that were considered to be related to the acceptance of technology. This part of the survey was extremely successful as most subjects answered all of the demographic questions.
Although the pictorial test produced a distribution of answers, those answers were not strongly correlated with any of the other results of the survey, including the acceptance of technology, belief systems, modernity, or demographics. The acceptance of technology was more related to particular demographic factors than to either the Modernity of Inkeles and Smith or the Belief Systems of Harvey and Gore. In particular, different types of technology showed that different demographic factors were important. The most interesting demographic effects were those of gender, father's education, and area of national/cultural origin.
A very interesting result was that the belief systems of students who originate in the United States (mostly Hawaii) and are attending school in Hawaii were more commonly in System 3 and System 4 than those of respondents in the mainland United States, as found by Rowley in a recent test. This indicates that mindscapes vary with culture.
Chairperson: Andrew R. Arno
Author: John F. Morton
The effects of electronic mail and class discussion lists on faculty-student interaction outside the classroom were examined through a field experiment. The electronic media were introduced into twelve different undergraduate courses in a community college while a second section of the same course taught by the same faculty member served as a control. As part of the experiment faculty maintained a log of the number, type, length, and purposes of all student contacts over the semester.
The introduction of the electronic media increased both the amount of faculty-student interactions and the percentage of students who had some contact with the faculty member. However, expected gains in student retention, student satisfaction with the interaction, and student satisfaction with the course did not materialize. This may be related to a continued low level of interaction between faculty and students that occurs in college courses, even with the introduction of the new medium.
The students who did choose to use electronic mail were significantly younger, carried more credits, were more likely to complete the course, and were more likely to be active contributors to in-class discussions. There were no significant differences between the users and non-users of e-mail with respect to gender, student attitudes towards computers, student attitudes toward faculty interaction, and student satisfaction. A regression analysis revealed five variables contributing to a prediction of the extent of e-mail use--prior use of electronic mail, faculty attitudes towards computers, faculty attitudes toward student interaction, student involvement with in-class discussion, and the cumulative credits students had earned. As predicted, students tended to employ electronic mail for short, logistical communications to a greater degree than in face-to-face meetings, but unexpectedly electronic mail was also more likely than face-to-face meetings to be used for non-course-related contacts.
There was also support for theories related to the duality of structure and action in the adaptation of information technology. Concepts such as the degree of access to electronic mail seemed to be the result of social construction rather than a purely physical consideration. Additionally, different approaches to faculty-student interaction were identified, each of which had a different relationship to the effects and use of electronic mail and the resulting adaptation strategies of the faculty member.
Chairperson: Laku Chidambaram
Author: Xiangdong Ke
Computer graphics is an important area in multimedia technologies and applications. Due to their popularity, graphic development and manipulation environments are receiving much attention. One problem with computer graphics is managing the resulting information, along with the related issues of data sharing and efficient reuse of these resources. All of these issues ultimately come down to retrieving graphic data efficiently.
Computer graphics is closely related to geometric properties of objects. From an object-oriented view, the graphic objects in an image or animation can be considered as objects, and geometric relationships between graphic objects as their behavior. The behavior can be represented by spatial and spatial-temporal relationships.
A traditional certainty-based approach exists for graphics database retrieval, but its limitations are obvious: using certainty retrieval, one cannot control the range of a query with regard to objects and their behavior, and cannot make precise queries about object behavior.
To address the above issues in graphics database retrieval, I propose object-oriented notions with uncertainty features for graphics database retrieval. My approach has the following unique features: (1) Fuzzy Object Retrieval. This feature provides similar object retrieval, and enables users to control the search range for the objects to be queried. (2) Uncertainty Retrieval on Spatial and Spatial-Temporal Relationships. This feature enables users to control the range of the query on object behavior; i.e., spatial and spatial-temporal relationships, and provides a method for precise queries on them.
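As a rough sketch of feature (1), fuzzy object retrieval can be thought of as ranking objects by a membership score and letting a threshold control the search range. The objects, features, weights, and scoring function below are invented for illustration and are not GOURD's actual model.

```python
# Toy graphic objects with a couple of comparable features.
objects = {
    "circle_a": {"shape": "circle", "size": 10},
    "circle_b": {"shape": "circle", "size": 40},
    "square_c": {"shape": "square", "size": 12},
}

def similarity(query, obj):
    """Crude membership score in [0, 1]: shape match plus closeness in size."""
    shape_score = 1.0 if query["shape"] == obj["shape"] else 0.0
    size_score = max(0.0, 1.0 - abs(query["size"] - obj["size"]) / 50.0)
    return 0.6 * shape_score + 0.4 * size_score

def fuzzy_retrieve(query, alpha):
    """Return objects whose membership is at least alpha, best match first."""
    scored = [(name, similarity(query, o)) for name, o in objects.items()]
    return sorted((s for s in scored if s[1] >= alpha),
                  key=lambda s: s[1], reverse=True)

# A high alpha narrows the search range to near-exact matches;
# a low alpha widens it to include looser ones.
print(fuzzy_retrieve({"shape": "circle", "size": 12}, alpha=0.7))
```

The threshold `alpha` is the user's handle on the search range: raising it excludes weakly similar objects, which is exactly the control that a certainty-only approach cannot offer.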
A fuzzy object retrieval model and an uncertainty retrieval model for spatial and spatial-temporal relationships have been developed for the proposed approach. To verify the proposed approach, a prototype system called GOURD (Graphical Object-oriented Uncertainty Retrieval in Databases) has been built. In addition to GOURD, a client/server system, GOURD Server, has been developed so that the proposed approach can be used through the Internet. A detailed theoretical analysis and a discussion on these models will be introduced in subsequent chapters, which form the theoretical basis of the prototype systems--GOURD and GOURD Server. The related analysis and evaluation of GOURD and GOURD Server are also given in this document.
Chairperson: Stephen Y. Itoga
Author: Hsiao-Hui Wang
The last decade witnessed sustained growth in the use of information technology (IT). The use of IT revolutionized the structure of management and the nature of competition in the emerging global economy. Newly industrialized economies (NIEs) have shifted their policy emphasis from IT production to IT use in order to encourage pervasive IT applications throughout government agencies and private industries, and thus transform themselves into information-intensive societies. Companies in NIEs are investing heavily in information technology in order to seize global opportunities and counter competitive threats. Although investment in IT has been ever-increasing over the last decade, the existing literature provides inconsistent evidence on the payoff of IT investments, and only a small number of studies concentrate on newly industrialized economies. This dissertation aims to more fully examine IT value in an East Asian NIE (Taiwan), and to offer new insights to policy leaders, business managers, and researchers in other Asian newly industrialized economies and less developed countries.
In particular, this dissertation explores the following questions:
1) What are the influential national IT capability factors for an economy? Are those factors effective in leading to increased investment in IT?
2) Does investment in IT have a positive impact on economic growth?
3) To what extent do individual organizations assess the effectiveness of national IT policies? Are there any key firm-level IT capability factors which affect the implementation of IT infrastructures?
4) Can a firm that implements strategic IT infrastructures expect to improve its overall performance? What is the role of IT capability factors in the relationship between IT implementation and a firm's performance?
This dissertation presents an IT-capability-enhancing approach, empirically examining the impact of IT capabilities and IT use on economic growth and on business performance. The study was conducted in two phases. The first phase was a longitudinal analysis using macroeconomic data; the second phase focused on firm-level analysis using cross-sectional data.
First, a time series analysis characterized the national IT-capability factors, and investigated the joint effects of national IT capabilities and IT investment on economic growth over a 16-year period from 1980 to 1995. Second, a cross-sectional field survey was performed. One hundred and forty-eight questionnaires were returned by top management of Taiwan's manufacturing and service enterprises.
The study represents an initial step in understanding how to derive desirable benefits from investment in IT activities. It is one of the first attempts to link two different levels of analysis for a comprehensive understanding of IT value. This is also the first empirical study to treat national IT policy as a research variable in explaining the problem domain of IT value.
Findings from macroeconomic analysis suggest that Taiwan's economic development can be best explained by "S&T human resources" and "basic telecommunications services" over time. While the study's longitudinal data does not provide support for the positive direct contribution of IT investment to economic growth, it implies that the strategy of IT-led development in Taiwan will be more effective in the future if policy leaders focus on investing in telecommunication infrastructure and human training rather than on incentives for increased industry spending in IT.
Findings from the firm-level analysis reveal that, generally, IT use in networking, top management involvement, IT investment, and IT use in primary activities appear to be influential indicators in predicting Taiwanese firms' performance. This dissertation supports the view that using multiple measures may be the most effective way of fully capturing the multiple dimensions of a firm's performance and the sophistication of IT implementation. It is worth noting that managerial support can be critical in accruing the most benefits from IT, particularly when the firm has limited financial and policy resources and has little experience with IT.
Chairperson: Meheroo F. Jussawalla
Author: Hyosun Kwon
Since commercial cellular telephone services began in 1983 in the United States, cellular telephones have proliferated worldwide; however, few academic researchers have studied why and how cellular telephones are adopted and used by the general public. Thus, the objective of this study is to gain a more complete understanding of people's acceptance of cellular telephones. Two hundred and ninety-three cellular telephone users from the United States (Hawaii) and South Korea were surveyed through questionnaires that examined individuals' demographic and socio-economic characteristics, perceptions about cellular telephones (the perceived ease of and apprehensiveness about use of telephones), motivations to use cellular telephones (extrinsic motivations, intrinsic motivations and social pressure), and extent of cellular telephone use.
In particular, this study explored the following questions: (1) What are the impacts of users' demographic and socio-economic factors, including gender, age, occupation and income, on their perceptions and social pressure? (2) What are the impacts of users' perceptions on their extrinsic and intrinsic motivations to use cellular telephones? (3) What are the impacts of the three motivational factors on use of cellular telephones? (4) Are cultural differences discernible between cellular telephone users in South Korea and the United States?
These questions were tested using an integrated theoretical model. The model was developed in this study based on existing theories of motivation and explained the relationships between individual characteristics, users' perceptions, motivations, and the usage of cellular telephones. This study is one of the first to analyze these issues theoretically in relation to cellular telephones.
The results of this study confirm that users' perceptions are significantly associated with their motivations, and that extrinsic motivations are among the most influential factors affecting cellular telephone usage. There were significant differences in perceptions, motivations, and extent of cellular telephone usage between the U.S. and South Korean samples. In this regard, culture plays a key role in technology adoption and use. Managerial and theoretical implications of this and other results are examined.
Chairperson: Laku Chidambaram
Author: Ryota Ono
As more countries move toward information-centered economic systems and societies, the advancement of telecommunications capability worldwide becomes more critical. Few would accept the current conditions of telecommunications in developing countries (LDCs) as they are; so why do LDCs still lag in telecommunications infrastructure development, and what should be done to remedy these unfavorable conditions? In order to address these questions, the current study solicited a variety of perspectives from telecommunications professionals in approximately 70 countries by using an iterative international survey.
The study clarified that the problem of the telecommunications development gap had both a quantitative and a qualitative nature. The qualitative nature was further broken into a consequence factor and an impediment factor, of which the impediment factor was found to be the most critical. The study identified a full range of 127 obstacles and formed a comprehensive framework of obstacles consisting of 12 categories, such as policy and regulation, finance, politics, and human resources.
Further, the study analyzed each of the 12 categories and identified 23 obstacle factors in seven categories. The study then analyzed LDCs' and developed countries' (DCs') assessments of the degree of seriousness of the individual obstacles, obstacle factors, and categories of obstacles, and found both agreement and disagreement between LDCs and DCs. Previously, very little had been discovered about how differently or similarly LDCs and DCs view the problem of telecommunications underdevelopment in LDCs.
Finally, the study identified 78 strategies to overcome many of the obstacle factors. It was found that some of the strategies had already been implemented in some countries. Although many of the strategies were narrow and would need to be adjusted on a country-by-country basis, the study has provided a basis for more extensive elaboration of strategies.
Unless telecommunications professionals understand the real mix of problems, they cannot prescribe appropriate solutions that would further telecommunications development. The 12 categories, the 23 obstacle factors, and the 78 potential strategies elicited in the present study provide telecommunications professionals with a valuable framework for diagnosing the problems inherent in telecommunications development and for prescribing more appropriate actions to improve the conditions of the telecommunications "haves" and "have-nots."
Chairperson: Dan J. Wedemeyer
Author: Danurahardjo Tjahjono
Formal Technical Review (FTR) plays an important role in modern software development. It can improve the quality of software products and the quality and productivity of their developmental processes. However, the effectiveness of current FTR practice is hampered by uncertainty and ambiguity. This research investigated two issues. First, what differences exist among current FTR methods? Second, what are potential review factors that impact upon the effectiveness of these methods?
The approach taken by this research was to first develop an FTR framework, based on a review of the literature in the field. The framework allows one to determine the similarities and differences between the review processes of FTR methods, as well as to identify potential review factors. Specifically, it describes a review method in terms of seven components of a review process: phase, objective, degree of collaboration, synchronicity, role, technique, and entry/exit criteria. By looking at the values of individual components, one can compare and contrast different FTR methods. Furthermore, by investigating these values empirically, one can methodically improve the practice of FTR.
Second, a computer-based review system, called CSRS, was developed to implement the framework. The system provides a set of declarative modeling languages that allow one to create a wide variety of FTR methods, to design experiments comparing the performance of two or more review methods, or to evaluate a set of review factors within a method.
Finally, this research involved an empirical study using CSRS to investigate the effectiveness of a group process versus an individual process in finding program faults. Two review methods/systems were implemented using CSRS: EGSM (used by real groups) and EIAM (used by nominal groups). The experiment involved 24 groups of students (3 students per group), each reviewing two sets of source code, once using EGSM and once using EIAM. The experiment found that there were no significant differences in detection effectiveness between the two methods, that synergy was observed in EGSM but did not contribute significantly to the total faults found, and that EGSM incurred higher cost than EIAM, but was significantly more effective in filtering out false positives.
Chairperson: Phillip M. Johnson
Author: Ya Liu
This dissertation presents a design for graphic databases and its implementation, the GDB system. The research explores new directions for spatial database systems and addresses the problem of how to represent graphical data by their essential visual features, not the text, nor the code used to describe graphical data. Our goal in this research is to find a way that best supports the understanding of spatial data, i.e., the images, pictures, and animation projects.
Differing from other systems and research, the GDB system focuses on a special group of spatial data: computer-generated graphics, which have object structures that need to be maintained for future reuse. Another concern of this research is animation projects, in which spatial properties are integrated with time factors. The design minimizes the graphical indexing process for higher efficiency, and hides the details of binary graphic data from users for simplicity's sake. The query language of the system is based on visual reasoning models. The language concentrates on describing graphical data by the spatial and spatial-temporal relations of the objects inside the data. The query resolving process is a deductive procedure: visual relations in graphical data and predicates in queries are defined and interpreted by deductive rules.
The GDB system is developed in the Windows NT environment. It provides a flexible graphical interface. The system architecture is object oriented. A graphics editing module simply helps the user to edit graphical data in a way that is similar to record editing in traditional relational databases. A preprocessing module does the tasks of finding and calculating graphic properties and building index items for the graphical data. A graphics retrieving environment consists of a set of windows that enable users to specify the definitions of visual queries. The GDB system also provides the following views of graphic data: summary, object hierarchy, animation play, and image display.
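The deductive resolution of spatial predicates can be illustrated with a toy example: a stored base relation is extended by an inference rule at query time. GDB's actual rules, relations, and query language are far richer; the relation name and facts below are invented.

```python
# Stored base facts: (a, b) means "a is left of b" in some image.
facts = {("sun", "tree"), ("tree", "house")}

def left_of(a, b, seen=None):
    """Deduce left_of(a, b) from stored facts plus a transitivity rule."""
    seen = seen if seen is not None else set()
    if (a, b) in facts:
        return True                      # directly stored
    for x, y in facts:                   # transitivity: a left_of y, y left_of b
        if x == a and y not in seen:
            if left_of(y, b, seen | {y}):
                return True
    return False

print(left_of("sun", "house"))   # True, deduced via transitivity
print(left_of("house", "sun"))   # False, no derivation exists
```

The point of the deductive style is that only base relations need to be indexed for each image; derived relations are reconstructed by rules when a query is resolved.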
The research is compared to other theories and implementations. The theoretical properties of the design are also discussed. The query language is shown to be temporally safe and complete, and spatially expandable. The power and flexibility of the interface are also presented.
Chairperson: Stephen Y. Itoga
Author: Kumiko Aoki
In the information age, where societies are becoming increasingly interdependent and global, we frequently have to work with people at a distance through communication and information technologies (telecollaboration). The most common form of such collaboration has been collaborative writing, in which two or more people create a document together through computer-mediated communication. Drawing from the literature on computer-mediated communication, computer-supported cooperative work, virtual reality, collaborative learning, and cross-cultural communication, a heuristic model of telecollaboration was proposed, and based on this model a field experiment was designed and conducted from September 27, 1994, to December 9, 1994.
In the field experiment, participants at the University of Hawaii and at Nanzan University, Japan, formed groups of four, consisting of two students in Hawaii and two students in Japan. Each group was assigned to write a research paper on a topic the group selected. There were two sessions during the course; each session lasted five weeks and produced one collaborative research paper.
The study examined the effects of variables such as individual typing skill, English composition skill, independent/interdependent self-construal, cultural orientation, and the use of videoconferencing on communication frequency among group members and the perceived quality of group performance, as well as on the quality of the final paper produced by the group. The results showed that those who had videoconferencing exchanged fewer e-mail messages while showing more group cohesiveness. Individual skills such as typing and writing were found to affect the number of e-mail messages sent and the quality of the final group papers. Implications of the research findings for the model were discussed and recommendations for future research were made.
Chairperson: Dan J. Wedemeyer
Author: Beverly G. Hope
Services play a dominant role in post-industrial economies. The strength of those economies depends upon the competitiveness of both their manufacturing and service sectors. In this research we address the challenge of improving service competitiveness through data-driven quality improvement systems.
Measuring, monitoring, and controlling service quality is an elusive task. Members of quality improvement teams frequently lack a detailed understanding of quality improvement techniques and data collection requirements. Some progress has been made toward understanding data needs in manufacturing industries, but many people believe that service organizations are different. What is needed is (a) an improved understanding of service quality data needs, and (b) a way of supporting workers in collecting relevant and valid data.
This research used a field study in the banking industry to develop a model of data needs for service quality improvement. The model describes the data needs at three levels of quality planning and implementation uncovered by our research: strategic, tactical, and operational. The preliminary model developed in the banking industry was divided into two sections for validation. The first section was validated by a series of structured interviews and a survey of service providers in a broad range of service industries. The second, more prescriptive section was validated by a panel of experts.
The validated model provided the basis for a logical model, which was subsequently implemented as a demonstration prototype expert support system (ESS). The ESS uses procedural cuing to guide users through a data-driven quality improvement process. Emphasis is placed on problem-focused data needs and the selection of appropriate tools and techniques to analyze data. Computerized support at the operational level can provide on-the-job training to teams charged with implementing quality improvement projects.
The research provides both theoretical and practical contributions. These include an improved understanding of quality-related data needs in service industries, a strategy for tying data needs to the service quality improvement process, and demonstration of computerized support to a new problem domain.
Chairperson: Rosemary H. Wild
Author: Xiaobo Wang
Most existing automatic layout techniques are designed to generate layouts that look pleasant to the eye by improving the aesthetics of graphs. Aesthetics, however, do not reflect layout requirements derived from semantics, preferences, or individual situations. It is important for an automatic layout technique to generate customized layouts according to specific requirements given by the user or by applications.
This thesis investigates how to generate customized layouts using selected layout algorithms. A key to this problem is to improve the expressive power of existing algorithms and integrate different techniques to deal with various layout requirements.
LYCA is a graph tool that uses incremental optimization algorithms to draw directed and undirected graphs. It integrates a constraint solver to process constraints. Compared with other work, LYCA has several distinctive features:
* The force-directed placement algorithm is improved to generate compact layouts for graphs with large vertices.
* A novel usage of the divide-and-conquer approach is introduced to generate structured layouts.
* The constraint solver and the layout algorithms are integrated in a simple and efficient way. In addition, the solver and layout algorithms cooperate to ensure layout quality.
* Different interface techniques are used to help the user diagnose layout problems and interact with the layout algorithms directly.
Those features provide a tight coupling of the user and the layout tool. Users can generate customized layouts with LYCA easily and flexibly.
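For readers unfamiliar with force-directed placement, the sketch below shows the basic iteration that such algorithms build on: vertices repel one another, edges act as springs, and positions are nudged along the net force. It is a generic illustration with made-up positions and constants, not LYCA's improved algorithm or its constraint handling.

```python
import math

# Toy graph: three vertices, two edges; coordinates are arbitrary seeds.
pos = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (0.5, 1.0)}
edges = [("a", "b"), ("b", "c")]
K = 0.5  # ideal edge length (an illustrative tuning constant)

def step(step_size=0.05):
    """One iteration: accumulate repulsive and attractive forces, then move."""
    force = {v: [0.0, 0.0] for v in pos}
    for u in pos:                              # every pair of vertices repels
        for v in pos:
            if u == v:
                continue
            dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = K * K / d                      # repulsion falls off with distance
            force[u][0] += f * dx / d
            force[u][1] += f * dy / d
    for u, v in edges:                         # each edge pulls its endpoints together
        dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
        d = math.hypot(dx, dy) or 1e-9
        f = d * d / K                          # attraction grows with distance
        force[u][0] += f * dx / d
        force[u][1] += f * dy / d
        force[v][0] -= f * dx / d
        force[v][1] -= f * dy / d
    for v in pos:
        pos[v] = (pos[v][0] + step_size * force[v][0],
                  pos[v][1] + step_size * force[v][1])

for _ in range(100):
    step()
# pos now holds relaxed coordinates with more balanced edge lengths
```

Improvements of the kind the thesis describes, such as handling large vertices or injecting solved constraints, modify how these forces are computed and applied rather than the overall iterate-until-stable loop.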
Chairperson: Isao Miyamoto
Author: Dadong Wan
This dissertation presents a computer-based collaborative learning environment, called CLARE, that is based on the theory of learning as collaborative knowledge building. It addresses the question, "What can a computer do for a group of learners beyond helping them share information?" CLARE differs from virtual classrooms and hypermedia systems in three ways. First, CLARE is grounded on the theory of meaningful learning, which focuses on the role of meta-knowledge in human learning. Instead of merely allowing learners to share information, CLARE provides an explicit meta-cognitive framework, called RESRA, to help learners interpret information and build knowledge. Second, CLARE defines a new group process, called SECAI, that guides learners to systematically analyze, relate, and discuss scientific text through a set of structured steps: summarization, evaluation, comparison, argumentation, and integration. Third, CLARE provides a fine-grained, non-obtrusive instrumentation mechanism that keeps track of its users' usage processes. Such data forms an important source of feedback for enhancing the system and a basis for rigorously studying the collaborative learning behaviors of CLARE users.
CLARE was evaluated through sixteen usage sessions involving six groups of students from two classes. The experiments comprised a total of about 300 hours of usage and over 80,000 timestamps. The survey shows that about 70% of learners think that CLARE provides a novel way of understanding scientific text, and about 80% think that CLARE provides a novel way of understanding their peers' perspectives. The analysis of the CLARE database and the process data also reveals that learners differ greatly in their interpretations of RESRA, their strategies for comprehending the online text, and their understanding of the selected artifact. It is also found that, despite the large amount of time spent on summarization (up to 66%), these learners often fail to correctly represent important features of scientific text and the relationships between those features. Implications of these findings at the design, empirical, and pedagogical levels are discussed.
Chairperson: Phillip M. Johnson
Author: Kelly Burke
Many factors in the current business environment compel organizations to seek global ventures and alliances. Such globalization often results in new organizational structures and behaviors. For example, workteams must be able to perform tasks while members are distributed across geographic and temporal boundaries.
Technology presents opportunities for teams to collaborate in novel ways. However, researchers understand very little of the effect these technologies have on groups. This study employed a controlled laboratory experiment to examine group development and communication processes in the context of electronically supported meetings.
In particular, the study explored the following questions: (1) Are behavioral and development dynamics different in groups meeting in structurally different environments? (2) Are communication behaviors and effects different in groups meeting in structurally different environments? (3) Does performance differ between groups meeting in structurally different environments?
These questions were tested using a single-factor, three-level research design with repeated observations. This study is one of the first to analyze these issues in teams meeting in different places at different times over a period of time. The variables examined are described below.
Independent variable: Meeting environment. Environment was manipulated across three levels: face-to-face, distributed synchronous, and distributed asynchronous.
Dependent variables: Group development, communication effectiveness, and performance. Development was measured by the level of perceived cohesiveness, conflict management, and process satisfaction. Communication effectiveness was measured by perceptions of social presence, communication effectiveness and satisfaction with the communication interface. Performance was assessed by the quality of the documents produced.
Controlled variables: Technological support, task type, group size, individual differences (through randomization) and time spent on task were controlled across and within treatments.
Thirty-three groups of four members each were evenly divided among and randomly assigned to the three conditions. Results of the study indicate that the structure of the meeting environment may not exhibit a significant impact on group development. On the other hand, environment may affect performance. Some support is shown for the theoretical argument that, over time, groups adapt to and appropriate the structures within which they operate. Further, evidence suggests that structures vary in the extent to which they can be appropriated.
Chairperson: Laku Chidambaram
Author: Zhengfu Liu
One of the current trends in the field of information retrieval (IR) is to apply artificial intelligence techniques, especially natural language processing (NLP) and knowledge representation techniques, to the problem of information retrieval. Although this approach is appealing, it is unlikely that the problem can be solved once and for all by relying entirely on semantic processing and knowledge representation techniques and attempting direct retrieval of information from a knowledge base constructed out of a collection of natural language texts.
A feasible approach is to use the well-developed IR techniques as the backbone and incorporate some of the NLP techniques to increase the power of content representation without involving sophisticated processes of semantic interpretation and knowledge representation.
In this dissertation research, a text representation and searching technique, called "the Semantic Vector Space Model" (SVSM), was developed by combining Salton's Vector Space Model (VSM) with heuristic syntax parsing and distributed representation of semantic case structures. In this model, both documents and queries are represented as semantic matrices. A search mechanism was designed to compute the similarity between two semantic matrices and the similarity value was interpreted as the predictor of relevancy.
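The abstract does not spell out how the similarity between two semantic matrices is computed. A minimal sketch of one plausible measure, a cosine-style score over whole matrices (the function name, the interpretation of rows as semantic case roles, and all weights below are invented for illustration, not taken from the SVSM definition), is:

```python
# Illustrative sketch only: treats a "semantic matrix" as rows of term
# weights per semantic case role and scores two matrices with a
# cosine-style measure (Frobenius inner product normalized by matrix
# norms). This is one plausible reading of matching semantic matrices,
# not the dissertation's actual search mechanism.
import math

def matrix_cosine(a, b):
    """Cosine similarity between two equally shaped matrices."""
    dot = sum(x * y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    na = math.sqrt(sum(x * x for row in a for x in row))
    nb = math.sqrt(sum(x * x for row in b for x in row))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

# Rows: hypothetical semantic case roles (agent, action, object);
# columns: term weights over a tiny vocabulary.
doc   = [[0.9, 0.0, 0.3],
         [0.0, 0.8, 0.0],
         [0.2, 0.0, 0.7]]
query = [[0.8, 0.1, 0.2],
         [0.0, 0.9, 0.0],
         [0.1, 0.0, 0.6]]

score = matrix_cosine(doc, query)  # higher score => predicted more relevant
```

A higher score would then be interpreted, as in the abstract, as a predictor of relevancy.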
A prototype system was built to implement this model by modifying the SMART system and using the Xerox P-O-S tagger as the pre-processor of the indexing process. The prototype system, called "SMART++", was used in a series of experiments designed to evaluate the proposed text representation and searching technique in terms of precision, recall, and effectiveness of relevance ranking. The original SMART system was used as the benchmark. Three experimental collections acquired from Cornell University were used in the experiments.
The results of these experiments showed that when documents and queries were very short (typically fewer than two lines long), our technique was less effective than the Vector Space Model. But with longer documents and queries, especially when original documents were used as queries, the system based on our technique performed significantly better than the SMART system. This suggests that a significant improvement in system performance can be achieved by combining semantic case structure information with the weighted term representation of texts when longer queries are available.
Chairperson: Larry N. Osborne
Author: Richard Peyton Halverson
This dissertation describes a software system and related hardware architecture in which high level language programs are compiled into gate level logic circuitry that is configured specifically to execute the compiled program. A system whose processor can be dynamically reconfigured to suit different applications is known as a custom computing machine (CCM). We have designed a new class of CCMs based on the concept of functional memory (FM), which we construct by connecting field programmable gate arrays (FPGAs) in parallel with conventional random access memory (RAM). FM is used by the processor for computing the (possibly multi-operand) expressions of the high level language program in the combinational logic provided by the FPGAs. When all program expressions are computed in FM, the necessary processor instruction set reduces to a minimal number of moves and jumps.
Our functional memory computer (FMC) is a four-FPGA FM prototype with a fifth FPGA programmed as the minimal processor. The language we adopted as the high-level source language for programming the FMC is a decision-table (DT) variation of standard Pascal. DT programs for a shortest-path and two sorting algorithms were translated, executed, and analyzed on the FMC. The second sorting program demonstrated a nondeterministic array selection function. An analysis of the shortest-path program showed that memory load/store counts remained comparable for the FMC and von Neumann implementations. However, the FMC achieved a 35% reduction in total execution steps because all computation steps are performed in parallel.
The problem of compiling high-level DTs to low-level FMC object code is more complex than for conventional machines because each single expression in the source program can translate into several tens of lines of FPGA circuit definition code. The Windows-based system developed for this purpose includes a compiler that translates source programs into intermediate assembly-language modules, and an operating system that invokes system routines for assembling, linking, placing and routing, and loading the FPGA machine-level object code into the minimal processor and functional memory.
Chairperson: Art Y. Lew
Author: Jingxiang He
Increasing system complexity has necessitated the development of software engineering methods and CASE (Computer-Aided Software Engineering) tools. Many software developers and businesses have adopted engineering principles and computer-aided tools to cope with the growing demands of software development and maintenance. In practice, most software projects are initiated by the information needs of end users. Precise description and understanding of these information needs is critical to information systems. It is believed that increasing end-user involvement and doing things right in the early stages of the software development process are the most effective ways to improve software quality.
This dissertation presents a research project to develop a tool, HAT (Hyper Analysis Toolkit), to help end users understand and use structured analysis techniques. HAT provides a hypertext linkage of graphical models, such as DFDs (Data Flow Diagrams) and ERDs (Entity-Relationship Diagrams), with system description narratives and other documents created during system analysis. Hyperlinks placed in the diagrams and documents provide an easy way for end users and system analysts to navigate and cross-reference the system models.
Model evaluation is as important as model description. In addition to the hypertext-based user interface for model description, this research incorporates a simulation package and a rule-based expert system to estimate the dynamic features of a DFD model. Dynamic evaluation of models at early stages will help system developers and end users to have better control over software development processes.
Chairperson: Kenneth A. Griggs
Author: Dara Lee Howard
This dissertation explores one aspect of solving information problems: the problem solver's transformation of information into personal knowledge. The primary goal of this work is to move toward describing this information problem solving interaction.
Verbal and action protocols provide the data to describe the activity of personal knowledge construction as executed in the context of a student using public written literature to develop a short written text in response to an externally generated information problem. The constant comparative method of data analysis is used to uncover the categories and transformation operators that comprise the activity. A model that depicts both events and operations is presented using the framework of problem solving and schema theories.
Interacting with information is placed within its encompassing environment of information problems and information problem solving. Using the simultaneous verbal reports and the action reports of the participants, the structures and operators of the problem solvers are identified and described in a frame model with three major branches. Two of the three branches represent the problem solvers' knowledge structures relating to the available information and to the problem solver's personal knowledge base. The structures in these branches were developed to show the various aspects and types of structure that occurred in the data. The third branch represents operators, which are used to bring about changes to the structures. Eleven operators and eighteen knowledge structures were developed; the structures related either to the information made available in the documents or to the problem solver's previous or developing knowledge base about the problem.
The interaction of the branches is demonstrated in three extended examples drawn from the data. The interactions which were found in the verbal and action reports are discussed and are used to demonstrate the model. The interactions, depicted graphically in a display grid, show the available information the problem solvers used, the behaviors that implemented changes, and the kinds of transformations that were made.
Chairperson: Carol Tenopir
Author: Shakti S. Rana
This study addresses the necessary relationships (competition and cooperation) between and among (1) the manufacturers, (2) the customers, (3) the research organizations, and (4) the government agencies, who are involved in the development of a high technology product as the product traverses successfully through its product development life-cycle (PDLC). The PDLC is composed of three phases (invention, development, and integration) and each of these phases consists of three stages (idea generation and assessment, development and testing, and standardization and launch).
The literature review identified the problem-delay in HDTV development; the case study analyzed the history of television to produce a product development model (PDM) which considers the phases, stages, entities and their relationships; and the field survey validated the PDM using convergence analysis. Monochrome television (MTV) represented the invention phase, color television (CTV) the development phase, and high definition television (HDTV) the integration phase.
The PDM illustrates the following: the relationship in the between-entities category changes from competition to cooperation as the product traverses the stages, while the among-entities relationship remains the same; and the relationship in the among-entities category changes from competition to cooperation as the product traverses the phases, while the between-entities relationship remains the same.
Chairperson: Meheroo F. Jussawalla
Author: Hai Huang
Programmers who maintain software systems face one central problem: it is difficult to get accurate and relevant information about the target system. Reverse engineering can be a solution, since it obtains information about the target system from the most reliable source: the source code itself.
This dissertation presents an integrated, intelligent reverse engineering system, Proud (Program Understanding System). Proud is a component for reverse engineering and program analysis in the SMA (Software Maintenance Assistant) project. Its objective is to provide accurate and relevant information about the target system for other components in SMA by extracting from and analyzing the source code of the target system. Proud has the following unique features: (1) It uses a graphical knowledge representation language that incorporates many advanced artificial intelligence features for representing different aspects and properties of software. (2) It provides a graphical, non-procedural query language to access and manipulate information extracted and abstracted from the source code of the target system. (3) It adopts a flexible, adaptable approach and hence is easily customized to fit the requirements of a particular software maintenance project. (4) It uses a rule-based approach in the design and implementation of most of its extraction tools, so that the tools become language independent and can easily provide extra information on demand.
The system has been tested with source code files from Fujitsu Limited. It is also used as the front-end processor for other projects. Proud demonstrated its capabilities in these tests and has proven to be a useful tool for software maintenance.
Chairperson: Isao Miyamoto
Author: Diane M. Nahl
A review of the literature in technical documentation shows that there is a paradigm shift occurring. The emphasis in writing computer manuals is changing from a system-centered focus to a user-centered focus. Affective and cognitive speech acts were extracted from point-of-use instructions written by academic librarians for the H. W. Wilson CD-ROM databases. A taxonomy of affective and cognitive speech acts was constructed, and instructional elaborations based on it were written and added to the original instructions. The affective elaborations consisted of providing orientation, advice, and reassurances to searchers. The two types of instructions were tested with 62 novice searchers from a college population using a two-by-two factorial ANOVA, with task complexity as the second factor. During one-hour search sessions in the Readers' Guide Abstracts database, 11 dependent variables were assessed using transaction logs and measures designed to assess success, satisfaction, and search style. Results showed that the affectively elaborated instructions were rated as more helpful, and searchers were more satisfied with their results, but success was not affected. Searchers who expected to succeed were more successful, were more satisfied, had faster search times and lower interactivity scores, and reported less frustration during searching than those who reported lower self-confidence as searchers. Interpretation of the results yielded a theory of the searcher's world in which seven environmental layers are described: information need, indexing language, search mode options, point-of-use instructions, knowledge of commands, search strategies, and motivation or self-confidence. Individual differences in search literacy are attributed to acquired search role types, i.e., self-verbalizations that mimic either positive and effective search models or negative and ineffective ones.
It is suggested that writers of point-of-use instructions and other online help facilities for novices use the taxonomic approach as a guide to writing affective elaborations that not only give advice and reassurance, but model positive search literacy role types and self-efficacy as a searcher.
Chairperson: Carol Tenopir
Author: David Stone
The research explores and develops a new strategy for multichannel (multivariate) autoregressive (MCAR) time series modeling of multichannel stationary and nonstationary time series. The multichannel modeling is achieved one channel at a time, using only scalar computations on instantaneous data. Under the one-channel-at-a-time modeling paradigm, three long-standing and important problems in multichannel time series modeling are studied. First, one-channel-at-a-time scalar autoregressive (AR) time series modeling, in combination with subset selection and a subsequent linear transformation, achieves a relatively parsimonious multichannel autoregressive model of stationary time series and reduced one-step-ahead prediction variance compared with conventional MCAR model fitting. Second, enhanced power spectral density estimation for multichannel stationary time series may be achieved with one-channel-at-a-time multichannel AR modeling in combination with a smoothness priors distribution on the scalar AR model parameters. Third, estimates of the time-varying power spectral density matrix for multichannel nonstationary covariance time series are achieved using the one-channel-at-a-time paradigm in conjunction with a Bayesian smoothness priors stochastic linear regression model of the partial correlation coefficients (PARCORs) of a scalar lattice AR model. In this case, only a small number of hyperparameters are fitted for the multichannel time-varying AR model, which has many more parameters than data.
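The building block of the one-channel-at-a-time strategy is a scalar AR fit on a single channel. A minimal sketch of that building block, an AR(1) coefficient estimated by least squares on synthetic data (the function, model order, and coefficient value are invented for illustration and are not the dissertation's method), is:

```python
# Illustrative sketch only: fitting a scalar AR(1) model to one channel
# by least squares. The one-channel-at-a-time strategy repeats scalar
# fits like this per channel instead of estimating a full multichannel
# model in one step; the subset selection, smoothness priors, and
# lattice machinery of the dissertation are not reproduced here.
import random

def fit_ar1(series):
    """Least-squares AR(1) coefficient a in x[t] ~ a * x[t-1]."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

# Synthetic channel generated from x[t] = 0.6 * x[t-1] + noise.
random.seed(0)
x = [1.0]
for _ in range(500):
    x.append(0.6 * x[-1] + random.gauss(0, 0.1))

a_hat = fit_ar1(x)  # estimate should be close to the true 0.6
```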
Chairperson: Will Gersch
Author: Karen Rebecca Clark-Kraut
Libraries operated for centuries with two philosophies: the services they provided were of value to society, and the value of those services was immeasurable. Today, those philosophies are being challenged as libraries are asked to measure the costs and benefits of the services they provide. Academic libraries in the United States spend large sums on journal subscriptions, since most research is made available to others when it is published in journals. Researchers use indexing publications supplied by libraries to identify relevant journal articles. Indexing publications are available through electronic delivery options including networked CD-ROMs, locally loaded tapes, and flat-fee online systems. Selecting the best, most cost-effective option for access to indexing and abstracting data is a complex task, so a cost-benefit decision model is needed.
This study comprises three parts. First, a prioritized list of costs and benefits that can affect a library's choice of access option was developed using the Delphi method. Second, usage data for each of the three access options was collected from academic institutions that had adopted one of these options. Data was collected on the use of indexing publications from The H.W. Wilson Company. The data was analyzed to determine whether institutional characteristics, such as difficulty of admission or funding source, affected the amount of usage. The data suggests that all three options considered for this project can provide a reasonable level of access for students and faculty regardless of the enrollment of the institution. Furthermore, institutional characteristics had a minimal effect on usage.
Finally, a decision model was developed using the costs and benefits identified by the Delphi participants and the usage data collected for this study. The model, developed with standard spreadsheet software, prompts the user to enter dollar amounts for the costs and benefits associated with each delivery option. Once all amounts are entered, the user can compare the total costs and benefits for each of the three options.
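A spreadsheet-style model of this kind can be sketched in a few lines. In the sketch below, all cost and benefit categories, dollar amounts, and option labels are invented placeholders, not the study's Delphi items or data:

```python
# Illustrative sketch only: a spreadsheet-style cost-benefit comparison
# of the three delivery options. Every category and dollar figure here
# is a made-up placeholder standing in for user-entered amounts.
options = {
    "networked CD-ROM":     {"costs":    {"hardware": 8000,  "subscription": 12000},
                             "benefits": {"staff time saved": 6000, "user access": 15000}},
    "locally loaded tapes": {"costs":    {"hardware": 15000, "subscription": 10000},
                             "benefits": {"staff time saved": 9000, "user access": 18000}},
    "flat-fee online":      {"costs":    {"hardware": 1000,  "subscription": 21000},
                             "benefits": {"staff time saved": 7000, "user access": 16000}},
}

def net_benefit(option):
    """Total benefits minus total costs for one delivery option."""
    return sum(option["benefits"].values()) - sum(option["costs"].values())

totals = {name: net_benefit(opt) for name, opt in options.items()}
best = max(totals, key=totals.get)  # option with the highest net benefit
```

As in the study's model, once all amounts are entered, the totals for the three options can be compared side by side.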
Chairperson: Carol Tenopir
Author: Leorey O. Marquez
This dissertation analyzes the effect of different characteristics of data on the training and estimation accuracy of neural networks. The literature on the universal approximation property of neural networks is reviewed. An examination of the relationship between the neural network approach and traditional statistical methods of approximation led to proposed enhancements to the neural network training procedure.
The study generated data samples characterized by different functional forms, levels of random noise, number and magnitude of outliers, and strength of multicollinearity. These samples were then used to train a neural network. The accuracy of the neural network estimate was tested and compared with the accuracy of the estimates obtained from the true model and those from Specht's GRNN model. Statistics on the length of training and the complexity of the neural network estimate were also collected and analyzed.
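The kind of controlled sample generation described above can be sketched briefly. In the sketch, the functional form, noise level, and outlier mechanism are invented for illustration; they are not the study's actual experimental design:

```python
# Illustrative sketch only: generating a data sample with a known
# functional form, additive random noise, and a few injected outliers,
# i.e., the kinds of data characteristics the study varied. The true
# model y = x^2 and all parameter values are invented placeholders.
import random

def make_sample(n=200, noise_sd=0.5, n_outliers=5, outlier_scale=10.0):
    """Return (xs, ys) drawn from y = x^2 plus noise, with outliers."""
    xs = [random.uniform(-3, 3) for _ in range(n)]
    ys = [x ** 2 + random.gauss(0, noise_sd) for x in xs]
    for i in random.sample(range(n), n_outliers):  # corrupt a few points
        ys[i] += random.choice([-1, 1]) * outlier_scale
    return xs, ys

random.seed(42)
xs, ys = make_sample()
```

A neural network trained on such samples can then be scored against the known true model, as the study does when comparing against GRNN estimates.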
Chairperson: William E. Remus
Author: Zhi Cheng Li
The resource allocation problem usually involves optimization on uniform resources. This thesis solves the problem of finding matchings between several resources and activities which are optimal for a given linear objective function and are stable with regard to preferences of the resources and activities. The concept of a stable matching comes from the stable marriage problem. Given an equal number of men and women, and, for each person, a strictly ordered preference list containing all the members of the opposite sex, a stable marriage is a one-to-one matching of men and women in which there is no man and woman who prefer each other to their partners. Although this stable marriage problem with its strictly ordered preference lists (no ties or indifference) has been studied for three decades in computer science, there are few applications. When arbitrary indifference is allowed, the stable marriage problem has more applications but fails to have some valuable properties such as Pareto Efficiency and majority assignment. We suggest using group preference lists to allow limited indifference in applications. That is, each person belongs to a group that has a preference list for all groups of the opposite sex. We show that with this approach, stability guarantees Pareto Efficiency. We have constructed a correct definition of majority assignment for our case, and show that stability also guarantees majority assignment. For our approach, there is also a polynomial-size representation of all the stable matchings for a given problem. Thus polynomial-time complexity algorithms are possible for optimization over all such matchings. We give algorithms to find the maximal elements in the solution lattice, to find the polynomial-size representation of the solution lattice using rotations, and to find the partial order on the set of rotations. The first two algorithms generalize those of Gale-Shapley and Gusfield. The third one is new. 
All three algorithms run in O(Np) time, where N is the number of men (or women) and p is the number of groups of the opposite sex. We also develop the lattice structure theory supporting the algorithms.
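For readers unfamiliar with the baseline being generalized, the classic Gale-Shapley algorithm for the strict-preference stable marriage problem can be sketched as below. This is only the standard algorithm on a toy instance; the dissertation's group-preference variant and rotation-based algorithms are not reproduced here:

```python
# Classic Gale-Shapley (men-proposing) for strictly ordered preference
# lists -- the baseline problem described in the text, not the
# dissertation's group-preference generalization.
def gale_shapley(men_prefs, women_prefs):
    """Return a stable matching {man: woman}."""
    free = list(men_prefs)                      # men not yet matched
    next_prop = {m: 0 for m in men_prefs}       # index of next proposal
    rank = {w: {m: i for i, m in enumerate(p)}  # each woman's ranking of men
            for w, p in women_prefs.items()}
    engaged = {}                                # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_prop[m]]          # m's best not-yet-tried woman
        next_prop[m] += 1
        if w not in engaged:
            engaged[w] = m                      # w accepts her first proposal
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])             # w trades up; old partner freed
            engaged[w] = m
        else:
            free.append(m)                      # w rejects m
    return {m: w for w, m in engaged.items()}

# Toy instance with two couples whose first choices are compatible.
men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["a", "b"], "y": ["b", "a"]}
match = gale_shapley(men, women)  # {"a": "x", "b": "y"}
```

In the resulting matching, no man and woman prefer each other to their assigned partners, which is exactly the stability condition defined in the text.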
Chairperson: Stephen Y. Itoga
Author: T. N. Kamala
Individual differences in the use of technology, and of computer-based systems in particular, have been studied by many researchers. The literature has seen contributions from the fields of computer science, communication, psychology, management information science, and library and information science. Most of the work in the Information Retrieval (IR) area has, however, been related to online database systems. Some have studied the effect of different training methods on users with varying cognitive traits, varying experience, and so on. On the other hand, the complexity of use of different systems and system interfaces has been the focus of other studies. The proposed research would explore the effect of individual differences on the performance of novice users of a new technology, CD-ROM database systems, which are gaining popularity in academic libraries and elsewhere. In addition to extending the outcomes of previous findings, the proposed research would attempt to isolate variables that contribute to individual differences by studying them in greater detail. The need to introduce variables hitherto ignored in the IR context would also be examined.
Chairperson: Carol Tenopir
Author: Johannes Meier
Since the interchange of information forms the basis of all organizational activity, it is not surprising that automated information systems that connect different organizations have become very important in today's business environment. The literature abounds with anecdotal evidence of how these interorganizational systems (IOS) have had strategic impact. However, there is a clear lack of rigorous assessment of costs and benefits of interorganizational systems.
In contrast to intraorganizational systems, interorganizational systems involve more than one organization, thus raising issues of control and cooperation. This makes industrial organization theory an appropriate reference discipline for an attempt to develop normative models highlighting the economics of IOS. Non-cooperative game theory provides a formalism for analyzing the competitive strategies of participants in IOS.
We focus on three areas: (1) Shifts in bargaining positions between a manufacturer and his suppliers are shown to result from the introduction of a vertical IOS. Key determinants are the transaction volume of suppliers and the possibility of credible threats by the manufacturer. (2) Competition between proprietary IOS that are already established in a market is analyzed. Switching costs and network externalities induced by IOS result in a stable coexistence of competing systems. A framework of competitive moves provides insight into the competitive use of IOS. (3) The problem of competitive advantage vs. strategic necessity in the context of IOS is studied, leading to the issue of cooperation among IOS providers.
We use published data on the airline reservation systems industry, automated teller machine networks, and electronic data interchange (EDI) use, to justify the assumptions of the models.
Based on the results of the dissertation, a new framework for the evolution of IOS is introduced by drawing a parallel to the evolution of internal information systems.
Chairperson: William G. Chismar
Author: Mark Alan Hukill
Telematics is the growing convergence of computer and information systems technologies with telecommunications and broadcast systems.
In the ASEAN (Association of South East Asian Nations), telematics development is part and parcel of the rapid economic expansion of the region. Trends in policies for the development of telematics in the ASEAN include increasing liberalization of markets, moves toward the deregulation and privatization of the telecommunications authorities, and increasing private participation in telematics development.
In order to effect institutionally recognized telematics policy and planning in the region, a base-line data taxonomy of infrastructure and investment for development planning is proposed. Due to the complex, problem-oriented nature of telematics studies, an interdisciplinary approach to understanding theory and methods of research is taken.
A description of telematics policies, infrastructure, investments, and markets with regard to social, political, economic, cultural, and technical development in five of the six ASEAN countries is presented. From this, key developments in telematics in the ASEAN lead to the formulation of a draft base-line data taxonomy as an indicator of telematics development and for use in regional planning. A methodology to develop and refine the data taxonomy is proposed and executed, including open interviews in a survey of key policy makers in three of the six ASEAN countries: Indonesia, Malaysia, and Singapore. Initial feedback on use of the taxonomy from Malaysia and the Philippines confirms the viability of the base-line data taxonomy for policy and planning purposes in the region.
Data gathered as a result of the operationalization of the base-line data taxonomy could be used with numerous communication planning methods. A formal adoption of the taxonomy and its subsequent implementation in the ASEAN region on an official level is recommended. The process of modifying and updating the description and taxonomy should continue in an effort to provide a meaningful set of tools for policy and planning. The base-line data taxonomy is but a beginning to the operationalization and use of data gathered under its rubrics for policy and planning purposes in the ASEAN region.
Chairperson: Dan J. Wedemeyer