Research development guide

The R&E Foundation is here to assist you in developing your research idea. This guide will help you clarify your initial idea and build an actual project for an R&E Foundation grant application. 
If you already know where you want to go on this page, you can use the links below to jump to that section. Otherwise, start at the beginning.

Develop your idea

When it comes to developing a topic for research, we have some recommended guidelines to make the process easier.

First, determine your area of curiosity.

What are you interested in? What excites you? What questions about the radiologic sciences keep you up at night? Where does your passion lie?

Once you know that, develop research questions related to that curiosity. For each of those research questions, you need to answer the following:

  • What does this have to do with anything? If you can’t answer this question, you need to reevaluate the research question, maybe even your topic.
  • Who cares about the result? If you're the only one who’s curious about this topic, then find out if it could benefit anyone else. That might mean exploring additional resources or reaching out to people working in the fields you wish to benefit.
  • Do we already know this? If the answer is yes, try to modify your idea to address elements that haven’t yet been explored. Otherwise, you’ll need to find a new idea.
  • Are you playing with black boxes? Are the inputs and outputs the only known components of your idea, while the actual mechanism for the system is unknown?
  • Is this research or evaluation? Research is based on the development of a hypothesis and has specific aims; it contributes to the greater knowledge of an area of study. Evaluation assesses an individual’s or a program’s progress; results are often submitted back to a supporting or governing body, and publication is less likely.

Steps to success

Once you’ve determined your research topic, you'll need to create a project. We've put together a list of steps and questions to guide you through this process. Your project might only address one or two questions for each step. It all depends on your project scope.

  1. Identify a problem and form a research question
    • What is currently known in this area within the literature?
    • Are there resources outside of radiology that will help project development?
    • What is the need for or benefit of this area of investigation?
    • How will this investigation add to the literature?
    • What is the significance?
    • If this educational project is to replace current educational programming, how is this an improvement? Any goals, objectives and outcomes should back up this claim.

  2. Write a hypothesis
    • Is the hypothesis (or hypotheses) clear?
    • Is the hypothesis concise and does it have a limited number of confounding factors?
    • Does the study have practical or theoretical value?
    • Does the hypothesis lend itself to empirical testing?
    • Can data be obtained?

  3. Develop objectives and goals
    • Does the research promote quality?
    • Does the research build from the existing knowledge base?
    • Does the research enhance professional development?

  4. Define your target audience or determine your participants
    • Who is going to be studied or taught?
    • Who will immediately benefit from this investigation?
    • Other than the immediate beneficiaries, are there larger groups that will benefit as well?

  5. Design the research
    • From a learning design perspective:
      • Learning design requires you to fully understand the innovative forms of education you want to create before you can build them.
    • From a technology perspective:
      • You address complex problems in real contexts.
      • Then integrate known and hypothetical design principles with technological affordances to render plausible solutions to those complex problems.
      • Lastly, you’ll conduct rigorous and reflective inquiries to test and refine innovative learning environments as well as to define new design principles.
    • From a curriculum perspective:
      • Create a curriculum that addresses competency issues, gaps in knowledge or skill, or whatever your research topic requires.

  6. Address objectives and goals
    • If your project includes a competency assessment, define how competency is to be evaluated.
    • Are there national standards currently in place?
    • Should you be using a rubric?
      • Is there one available or do you need to create your own?
    • Identify meaningful measures that are reproducible and are as objective as possible.

  7. Create a method or procedure
    • Is the method appropriate to the hypothesis? Explain why.
    • Do procedures follow an orderly, logical sequence? Explain.
    • Does your review of previous studies establish the context of this study within the existing body of knowledge?

  8. Collect and analyze data
    • How will the data be collected? Surveys, interviews, competency evaluations, etc.
    • How will the data be recorded?
    • How will the reliability and validity of the results be evaluated?
    • What statistical evaluation will be used? Explain.
    • There are also participant-specific data questions:
      • How many subjects will participate in the study?
      • Are the studied participants a representative case sample?
      • Is there a sufficient number of participants for observation?
      • Was approval to conduct the study obtained? From whom?
      • Do participants need to sign an informed consent?
      • Do you have IRB evaluation and approval?

  9. Outcomes and conclusions
    • What outcomes are expected from the results?
    • If the expected results are not attained, are there contingency plans in place?
    • Are there plans to evaluate the reasons why the goals were not attained?
    • Can the reasons for success or failure be predicted?
    • Does the program fulfill ACGME competency assessments?
    • What important conclusions are expected?
    • What are the positive aspects of this research?
    • What are the negative aspects of this research?
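The reliability and statistics questions in step 8 can be made concrete with a small example. The following is a minimal Python sketch with hypothetical data, not a required method: it computes Cohen’s kappa, a common statistic for inter-rater agreement, for two raters scoring the same ten learners on a one-to-three rubric scale.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores of the same subjects."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of subjects both raters scored identically
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal score frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical scores: two raters, ten learners, 1-3 scale
# (novice, competent, exemplary)
a = [1, 2, 2, 3, 3, 1, 2, 3, 2, 1]
b = [1, 2, 3, 3, 3, 1, 2, 2, 2, 1]
print(round(cohens_kappa(a, b), 3))
```

A kappa near 1 indicates strong agreement beyond chance; values near 0 suggest the measure, or rater training, needs revision.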

Helpful resources

We know you might want or need a few more resources for your research plan. We’ve organized key topics for the different kinds of projects you might pursue into categories below, with additional resources and information on each.

However, these resources are supplementary. If you need help with the development of your initial idea, we suggest you use the following tools.

Rubrics

Rubrics are a means of assessing a learner based on individual progress toward previously set standards. This avoids comparing one learner to another, while still ensuring each learner meets the milestones required for progression.

You can use previously developed rubrics for your project. However, sometimes a project requires something more specific. Developing your own rubric isn’t as daunting as it may sound. Simply follow these steps:

  1. Identify what you’ll be assessing — a concept or a goal.
  2. Identify the important dimensions or characteristics of that concept or goal.
  3. Clearly define your expectations of your learners.
    • Provide a description of the highest possible level of performance.
    • Provide a description of the most basic level of performance.
    • Provide a description of an unacceptable or the lowest level of performance.
  4. Develop a scale to reflect a learner’s competency, similar to a Likert scale. This scale can look however you want. Some run it from one to five, meaning unacceptable, marginal, acceptable, good and outstanding. Others run it from one to three for novice, competent and exemplary.

Once you’ve completed your rubric, show it to your colleagues for feedback. If it adequately assesses your concept or goal and is achievable, then you’re good to go. Otherwise, revise accordingly.
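As an illustration, the four rubric-building steps above can be sketched as a simple data structure. Everything here, the concept, dimensions and level descriptions, is hypothetical:

```python
# A hypothetical rubric, following the four steps above.
rubric = {
    "concept": "Radiographic positioning",       # Step 1: what is assessed
    "dimensions": ["accuracy", "efficiency"],    # Step 2: key characteristics
    "levels": {                                  # Steps 3-4: descriptions on a 1-3 scale
        3: "Exemplary: positions correctly on the first attempt",
        2: "Competent: positions correctly with minor adjustments",
        1: "Novice: requires substantial correction",
    },
}

def describe(score):
    """Return the performance description for a given rubric score."""
    return rubric["levels"][score]

print(describe(2))
```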

For more information on rubrics, we suggest you look into the following:

Curriculum development
If you'd like more information on curriculum development, we suggest the following resources:

  • Dent, J., & Harden, R. M. (2013). "A Practical Guide for Medical Teachers." Churchill Livingstone. Section 3: Educational Strategies
  • Diamond, R. M. (2011). "Designing and Assessing Courses and Curricula." John Wiley & Sons. Chapters 1 (A Learning-Centered Approach to Course and Curriculum Design), 2 (Expanding Role of Faculty in Accreditation and Accountability), 4 (Scholarship and Faculty Rewards), 5 (Introduction to the Model and Its Benefits), 9 (Linking Goals, Courses and Curricula), 10 (Gathering and Analyzing Essential Data), 20 (Meeting the Needs of Adult Learners)
  • Kern, D. E., Thomas, P. A., & Hughes, M. T. (2009). "Curriculum Development for Medical Education." Johns Hopkins University Press.
  • Southgate, L., Hays, R. B., Norcini, J., Mulholland, H., Ayers, B., Woolliscroft, J., et al. (2001). "Setting Performance Standards for Medical Practice: A Theoretical Framework." Medical Education, 35(5), pp. 474–481. doi:10.1046/j.1365-2923.2001.00897.x
  • Heirich, M. (1980). "The People We Teach: Aids to Course Planning." Teaching Sociology, pp. 281–302.

Assessment
To learn more about assessment practices and their effects on student learning, we recommend the following resources:

  • Dent, J., & Harden, R. M. (2013). "A Practical Guide for Medical Teachers." Churchill Livingstone. Section 6: Assessment
  • Suskie, L. (2010). "Assessing Student Learning." John Wiley & Sons. [a good all-around general resource]
  • Walvoord, B. E. (2010). "Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education." [a good basic resource]
  • Ewell, P. T. (2005). "Can Assessment Serve Accountability? It Depends on the Question." In J. Wergin (Ed.), "Achieving Accountability in Higher Education; Balancing Public, Academic, and Market Demands" (pp. 104–124). San Francisco: Jossey-Bass.
  • Palomba, C. A., & Banta, T. W. (1999a). "The Essentials of Successful Assessment." In "Assessment Essentials" (p. 405). Jossey-Bass.
  • Palomba, C. A., & Banta, T. W. (1999b). "Selecting Methods and Approaches." In "Assessment Essentials" (pp. 85–113). Jossey-Bass.
  • Pike, G. R. (2002). "Measurement Issues in Outcomes Assessment." In "Building a Scholarship of Assessment" (pp. 131–164). Jossey-Bass.
  • Schuh, J. H. (2009). "Assessment Methods for Student Affairs." Jossey-Bass. Chapter 3: Planning for and implementing data collection; Chapter 4: Selecting, sampling and soliciting subjects
  • Walvoord, B. E. (2003). "Assessment in Accelerated Learning Programs: A Practical Guide." New Directions For Adult & Continuing Education, 2003(97), 39–50. doi:10.1002/ace.87
  • Wang, X., & Hurley, S. (2012). "Assessment as a Scholarly Activity?: Faculty Perceptions of and Willingness to Engage in Student Learning Assessment." The Journal of General Education, 1–15. doi:10.1353/jge.2012.0005
Validity and reliability
For more information on validity and reliability best practices, we recommend the following articles:

  • Moskal, B. M., & Leydens, J. A. (2000). "Scoring Rubric Development: Validity and Reliability." Practical Assessment, Research & Evaluation, 7(10), 1–11.
  • Shavelson, R. J., & Huang, L. (2003). "Responding Responsibly." Change (Abstracts), 35(1), 10–19.
Clinical education and assessment
We recommend the following articles for further research on assessing learners and effective teaching:

  • Farmer, E. A., & Page, G. (2005). "A Practical Guide to Assessing Clinical Decision-making Skills Using the Key Features Approach." Medical Education, 39(12), 1188–1194. doi:10.1111/j.1365-2929.2005.02339.x
  • Friedman, A. J., Cosby, R., Boyko, S., Hatton-Bauer, J., & Turnbull, G. (2010). "Effective Teaching Strategies and Methods of Delivery for Patient Education: A Systematic Review and Practice Guideline Recommendations." Journal of Cancer Education, 26(1), 12–21. doi:10.1007/s13187-010-0183-x

If your project is focused on ACGME-based competencies, review the resources at the ACGME website. Each residency program director should have access to the department’s specific competency requirements. Contact the residency program director to ensure your project is in alignment with the residency goals.

For more information on competency, please refer to the resources below:

  • Gunderman, R. B. (2009). "Competency-Based Training: Conformity and the Pursuit of Educational Excellence." Radiology, 252(2), 324–326. doi:10.1148/radiol.2522082183
  • Leung, W.-C. (2002). "Competency Based Medical Training: Review." The BMJ (Clinical research ed.), 325(7366), 693–696.
  • Morag, E., Lieberman, G., Volkan, K., Shaffer, K., Novelline, R., & Lang, E. V. (2001). "Clinical Competence Assessment in Radiology: Introduction of an Objective Structured Clinical Examination in the Medical School Curriculum." Academic Radiology, 8(1), 74–81. doi:10.1016/S1076-6332(03)80746-8
  • Newble, D. (2004). "Techniques for Measuring Clinical Competence: Objective Structured Clinical Examinations." Medical Education, 38(2), 199–203. doi:10.1046/j.1365-2923.2004.01755.x
  • Rothwell, W. J., & Graber, J. M. (2010). "Competency-Based Training Basics." American Society for Training and Development.
  • Smee, S. (2003). "ABC of Learning and Teaching in Medicine: Skill Based Assessment." The BMJ, 326(7391), 703.
  • Swanson, D. B., Norman, G. R., & Linn, R. L. (1995). "Performance-Based Assessment: Lessons From the Health Professions." Educational Researcher, 24(5), 5–11. doi:10.3102/0013189X024005005
  • Williamson, K. B., Steele, J. L., Gunderman, R. B., Wilkin, T. D., Tarver, R. D., Jackson, V. P., & Kreipke, D. L. (2002). "Assessing Radiology Resident Reporting Skills." Radiology, 225(3), 719–722. doi:10.1148/radiol.2253011335
Accreditation in higher education
If you’d like to know more about accreditation in higher education, feel free to refer to any of these recommended resources:

  • Wergin, J. (2005a). "Higher Education: Waking Up to the Importance of Accreditation." Change (Abstracts), 35–41.
  • Wergin, J. (2005b). "Taking Responsibility for Student Learning: The Role of Accreditation." Change (Abstracts), 37(1), 30–33. doi:10.3200/CHNG.37.1.30-33
If you’re working with resident education programs, reference the ACGME website for any new developments.
Learning theory
For more information on learning theory and teacher competence, we recommend the resources below:

  • Bransford, J., Vye, N., Stevens, R., Kuhl, P., Schwartz, D., Bell, P., et al. (2005). "Learning Theories and Education: Toward a Decade of Synergy." Handbook of Educational Psychology (2nd edition).
  • Brookfield, S. (1995). "Adult Learning: An Overview." International Encyclopedia of Education, 1–16.
  • Pratt, D. D. (2006). "Three Stages of Teacher Competence: A Developmental Perspective." New Directions For Adult & Continuing Education, 1989(43), 77–87. doi:10.1002/ace.36719894309
Educational research

We recommend the following articles on improving educational research:

  • Burkhardt, H., & Schoenfeld, A. H. (2003). "Improving Educational Research: Toward a More Useful, More Influential, and Better-Funded Enterprise." Educational Researcher, 32(9), 3–14.
  • Committee on Research in Education, National Research Council. (2004). "Advancing Scientific Research in Education." National Academies Press.
  • van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). "Introducing Educational Design Research." New York, NY: Routledge.
Grant writing

RSNA provides numerous grant writing workshops, including half-day sessions and weekend courses. Find which workshop is right for you.

For other grant writing resources, we suggest the following:

Another great resource when writing grant proposals is reviewing past submissions. Here are a few high-quality grant applications that did well at study section. Keep in mind, however, that not every section of every application is perfect.

About the R&E Foundation


Our Research & Education Foundation provides a critical source of support for investigators. Since the Foundation’s inception in 1984, we’ve awarded more than 1,600 grants. That’s $70 million in funding for radiology research and improved patient care.