Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

8-2005

Abstract

The evaluation of information systems development methodologies is becoming increasingly important. The lack of a commonly accepted set of evaluation criteria, however, hinders the aggregation of knowledge from field studies. Our study attempts to fill this gap. To reduce the effects of subjectivity, we surveyed the opinions of a group of 28 experienced IS researchers. Fifty-one criteria were generated by the research participants. A systematic content analysis technique was applied to group the unique criteria into general categories. Our study can make both academic and practical contributions to the evaluation of ISD methodologies.

Keywords

Brainstorming, Content analysis, Evaluation criteria, Field studies, Information systems development methodologies

Discipline

Databases and Information Systems | Numerical Analysis and Scientific Computing

Research Areas

Data Science and Engineering; Information Systems and Management

Publication

Proceedings of the 11th Americas Conference on Information Systems, Nebraska, USA, August 11-15, 2005

Volume

7

First Page

3487

Last Page

3492

ISBN

9781604235531

Identifier

aisel.aisnet.org/amcis2005/509

Publisher

Association for Information Systems

City or Country

Atlanta

Additional URL

https://aisel.aisnet.org/amcis2005/509