Award Abstract # 0916081
SHF: Small: RUI: Making Sense of Source Code: Improving Software through Information Retrieval

NSF Org: CCF, Division of Computing and Communication Foundations
Recipient: LOYOLA UNIVERSITY MARYLAND, INC.
Initial Amendment Date: August 2, 2009
Latest Amendment Date: August 2, 2009
Award Number: 0916081
Award Instrument: Standard Grant
Program Manager: Sol Greenspan
  sgreensp@nsf.gov
  (703) 292-7841
  CCF, Division of Computing and Communication Foundations
  CSE, Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2009
End Date: August 31, 2013 (Estimated)
Total Intended Award Amount: $309,757.00
Total Awarded Amount to Date: $309,757.00
Funds Obligated to Date: FY 2009 = $309,757.00
History of Investigator:
  • Dawn Lawrie (Principal Investigator)
    lawrie@cs.loyola.edu
  • David Binkley (Co-Principal Investigator)
Recipient Sponsored Research Office: Loyola University Maryland, Inc.
4501 N CHARLES ST
BALTIMORE
MD  US  21210-2601
(410)617-2561
Sponsor Congressional District: 02
Primary Place of Performance: Loyola University Maryland, Inc.
4501 N CHARLES ST
BALTIMORE
MD  US  21210-2601
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): FV5AVEGVTUE4
Parent UEI:
NSF Program(s): SOFTWARE ENG & FORMAL METHODS
Primary Program Source: 01000910DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 9216, 9217, 9218, 9229, HPCC
Program Element Code(s): 794400
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

The cost-effective construction of software is increasingly important to businesses and consumers. Given software's ever-increasing size and complexity, modern software construction relies on significant tool support. Recent tools complement traditional static-analysis tools by exploiting the natural language found within a program's text through the use of Information Retrieval (IR). Best known for its use by Internet search engines, IR encompasses a growing collection of techniques that apply to large repositories of natural language. New tools using IR have tackled problems that previously required considerable human effort. However, to reap the full benefit of IR techniques, the language used across all software artifacts (e.g., requirements and design documents, test plans, as well as source code) must be normalized. Normalization aligns the vocabulary found in source code with that of the other software artifacts. In addition to improving existing tools, normalization will also encourage the development of new techniques and methodologies useful in future tools. Empirical studies of successful tool improvements will aid technology transfer of the tools, which are expected to improve programmer productivity. Beyond its technical goals, this research promotes discovery in Loyola's undergraduate curriculum through the direct involvement of undergraduate students in scientific research and by integrating research results into classroom learning.
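One concrete normalization step studied in the publications below is identifier splitting: decomposing a multi-word identifier such as getXMLParser into the words "get", "XML", and "Parser" so that source-code vocabulary can be matched against requirements, design documents, and test plans. The following Python sketch illustrates the idea under the simplest assumptions (explicit underscore, camel-case, acronym, and digit boundaries only); it is an illustrative example, not the project's actual tool.

import re

# Assumed split rules for this sketch: separators (underscores,
# punctuation), camel-case boundaries, acronym-to-word boundaries,
# and digit runs.
WORD_PATTERN = re.compile(r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+")

def split_identifier(identifier):
    # Break on explicit separators first (underscores, punctuation).
    parts = re.split(r"[\W_]+", identifier)
    words = []
    for part in parts:
        # Then break each part on case and digit boundaries:
        # "XMLParser" -> ["XML", "Parser"], "rect2" -> ["rect", "2"].
        words.extend(WORD_PATTERN.findall(part))
    return words

def normalize(identifier):
    # Lowercase the parts so they are directly comparable with the
    # vocabulary of natural-language artifacts.
    return [w.lower() for w in split_identifier(identifier)]

if __name__ == "__main__":
    for ident in ("getXMLParser", "draw_rect2", "ASCIIToUTF8"):
        print(ident, "->", normalize(ident))

Real splitters must also handle same-case concatenations (e.g., DRAWRECT or thisis), where dictionary lookups or frequency mining are needed; evaluating such techniques is the subject of the Hill et al. publication listed below.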

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


D. Binkley, M. Davis, D. Lawrie, J. I. Maletic, C. Morrell, and B. Sharif, "The impact of identifier style on effort and comprehension," Empirical Software Engineering, 2012. DOI: 10.1007/s10664-012-9201-4
E. Hill, D. Binkley, D. Lawrie, L. Pollock, and K. Vijay-Shanker, "Identifier splitting for IR applications in software engineering," Empirical Software Engineering, 2013. DOI: 10.1007/s10664-013-9261-0
