Surveys, Tests, and Decision Support Systems
a complete solution based on free open-source tools, by Robert A. Dennis (buddy@ucla.edu), Khy Huang (khy@arsdigita.com), Avni Khatri (avni@arsdigita.com)
The Webster's New International Dictionary (2nd Edition) defines a survey as a "... looking over or upon ... with a purpose of reporting the results."
In this broad sense of the term "survey" any HTML form that solicits input from respondents can be considered a survey.
Assuming this definition, the list of possible applications of Web-based surveys is extremely broad.
Nearly every vertical market at some time or another needs to systematically collect information from its members (customers, students, users, etc.)
and report these data in some fashion. Reporting can be for internal management or decision making, or it can be public.
Some examples of Web-based surveys include: employer/employee satisfaction questionnaires, student applications, end of chapter tests,
product registrations, personnel performance reviews, patient demographic information forms, product order forms, compatibility and profile matching
(dating services), customer satisfaction forms, etc. The list of purposes to which HTML forms can be put is very large,
and there is a clear need for a flexible system that can manage the construction, data collection, and reporting of HTML forms; we call these surveys.
In this article of the ASJ we present an overview of a general and flexible database-backed Web-based system for
constructing, administering, and managing surveys. We call this system the survey builder. The survey builder is
unique in its ability to define a survey that changes its line of questioning based upon the previous responses of the survey taker.
This feature, referred to as dynamic branching, puts the survey builder ahead of its contemporaries.
This software is a component of the open-source ArsDigita Community System (ACS) version 3.4 and uses Oracle 8.1.6 and AOLserver 3.0.
The survey builder is available for download from ArsDigita at http://www.arsdigita.com/download/.
We begin with a review of the context in which this work was started.
We briefly present two specific examples of the use of the survey builder,
and how these projects shaped our work. These two case studies predate the integration of the survey builder with the ACS.
While both projects are clear examples of the value of a survey building tool, particularly in the area of research,
they both point to the need for a core system to manage users and groups of users and to provide a solid community foundation.
We continue with details of the data model, design, and functionality of survey builder.
We conclude with some remarks regarding the role this system can play within the ACS.
Background
Computer-based surveys are not new. There have been many different products that allow one to develop a survey and then print it out,
conduct a phone interview, or send it to respondents via email. More recent efforts have moved survey-building applications
towards the World Wide Web. The extent to which these products take advantage of the Internet varies from simple exporting to hypertext markup language
(HTML) format, to more sophisticated applications that address complexities like skip patterns based on survey responses.
However, most products lack tight integration with a back-end relational database, and at best they are awkward fits with the Web.
Perhaps the biggest limitation of most commercially available survey software is the lack of integration with a larger community infrastructure, and
this makes the programmatic administration of surveys difficult. Indeed, our own experiences with utilizing the survey builder ran
headlong into the issue of managing and maintaining a community of users, and it is this issue that has brought us to the ArsDigita Community System.
The survey builder makes data collection much more efficient and organized.
Not only can someone quickly create a survey or test,
but he can also see the results of the survey instantly.
By providing an easy interface to a library of stored questions, an author can quickly publish a new survey
and with equal ease specify any number of special reports of data collected by the survey. The ease-of-use and
capable features of the survey builder have attracted a considerable number of users to the system. Initially, the survey builder was
used as a test builder, and later it became the foundation of more programmatic research uses.
For the last three years the survey builder has been used regularly in the UCLA undergraduate Life Science Department as a test building tool.
Every quarter, more than 500 students taking classes in genetics and other introductory courses in life science respond to online quizzes during their weekly lab discussion section.
These quizzes are assembled using the survey builder, and
each teaching assistant (TA) has easy access to aggregate and individual reports of student performances.
Quizzes are given throughout the quarter, and from the reports generated TAs can identify individual and group weaknesses and strengths.
Since the results are available instantly, the TAs can immediately utilize these data to direct discussions and the focus of the time students spend
in labs.
In this way, the survey builder is serving an important role in bringing learning and instruction more closely together.
This system has proved effective not only in helping students get more out of an important component of their education
but also in helping TAs become better facilitators of learning and more responsive providers of information.
(Screen Shots, Survey Builder ACS-standard UI)
The survey builder has found application beyond test and survey authoring and reporting.
One of the first systems that we built that utilized the survey builder in a more integrated manner
was a Web-based prostate cancer patient/physician decision support system (PCDSS).
The PCDSS is meant to help patients better understand their options in the management of newly diagnosed prostate cancer.
For this project the survey builder was enhanced to provide support for dynamic branching through sets of questions in order to provide
decision support functionality.
Case study 1- Prostate Cancer Decision Support System
The purpose of the decision support component of the prostate cancer Web site
is to gather important information from a patient and make those data available to
both the patient and his physician, so they can approach the decision-making process together.
The patient fills out nine short surveys (listed below) from which an assessment
report is prepared. The site is structured so that patients do not need to fill out all the surveys at once.
In fact, patients are encouraged to complete the first four surveys in one sitting,
and the remaining five surveys in another sitting.
The set of surveys that make up the PCDSS are as follows:
- Demographics
- Clinical Values
- General Health Status Questionnaire
- UCLA Prostate Cancer Index
- Putting a value on Life with Blindness (Practice Scenario)
- Putting a value on Life with Impotence
- Putting a value on Life with Incontinence
- Putting a value on Life with Radiation Proctitis
- Putting a value on Life with the Anxiety of Watchful Waiting
In the last five surveys, the goal is to take the patient through a process of "trade-off" questions
so as to come up with a relative value for what a year of life is worth to the person when lived
with a particular health complication. A patient's responses to these questions are used to factor in his own
personal feelings and preferences about living life with some possible problems that can result
from the different prostate cancer treatments. The following is an example of the wording of one of the questions that
is presented to patients as they proceed through the first (a practice session) of these "trade-off" surveys:
Remember, just look at Choice A and Choice B and choose the one that seems better to
you. You will then be presented with the next choice. When you reach the point where the
two choices seem about the same to you i.e. when it's just too hard to choose between
Choice A and Choice B, then pick Choice C at the bottom. And this will complete the
survey.
Reminder: Giving up 1 year in this tradeoff process is the smallest time option that will be
offered. If as you continue to answer, your series of responses should bring you to a 1 year
tradeoff and you pick Choice B at this point, you will be cycled back to the beginning to go
over your series of choices again. So if you feel good about your choices should you reach
that point, then just go ahead and pick Choice C.
(Screen Shots, montage of PCDSS survey pages)
While the first four surveys were easily produced with the survey builder, the "trade-off" question sets were a different matter.
Since the survey builder was originally built to allow teachers to quickly build an online test,
the system had no support for accepting data from a respondent and then using those data to determine what to return to that
person. Until the PCDSS project, the survey builder was a form builder and static data collector. The PCDSS project required that
we add back-end business logic that would allow us to branch between question sets contingent upon a survey taker's responses.
The key capability that was added to the survey builder for the PCDSS project was support for dynamic
branching, or contingency. We use the word 'contingency' in the context of a survey or questionnaire to refer
to the dynamic ordering of the questions that are presented to respondents.
The set of questions that are presented to a respondent can be dynamically determined based upon responses to
previous questions. By incorporating contingency into the authoring and back-end portions (the data-model) of the survey builder,
we can accommodate the operational requirements of decision support systems and also take simple online tests to a more
sophisticated "interactive" level, sometimes referred to as adaptive testing.
The PCDSS project presented us with a difficult set of issues that were never part of the original
survey builder design. Previously we were concerned only with tracking users within a survey.
The PCDSS required that we maintain information regarding users
across several surveys. The next case study exemplifies these same issues:
namely, a set of problems associated with maintaining users and groups, and
a schedule for administration of surveys.
Case study 2- Quality of Life study
Another example of the application of our survey builder also comes from a medical research setting.
The Quality-of-Life (QoL) study is an ongoing longitudinal
study that seeks to fill a gap in our knowledge of cancer patients' quality of life following treatment for prostate cancer.
Using a repeated-measures research design, the study examines how the treatment of prostate cancer affects patients' self-reported
perceptions of their own and their families' quality of life. The most important physical effects of prostate cancer treatment are on urinary, sexual,
and bowel functions. However, prostate cancer and its various treatments also have a tremendous effect on a patient's emotions, stress level,
family relationships, and social interactions. In order to uncover this important information,
patients treated for early stage prostate cancer are invited to participate in the QoL study.
Once a patient has consented to participate and has chosen his treatment (in consultation with his doctors),
he is asked to fill out a questionnaire (baseline QoL survey). This is done on a computer, with the help of a research assistant (RA),
using an offline-published version of a survey builder survey.
The same questionnaire is then presented to participants at specific time intervals over a two-year period to see how their quality of life changes,
and how quickly they return to their pre-treatment baseline quality of life.
Patients are encouraged to participate by completing the questionnaires online.
Those who agree to participate online are sent automated email reminders each time they are due to respond to a questionnaire.
Most patients participate in the study using coded paper copies of the questionnaire, which are mailed to the patients and then mailed back
to the research office. One of the RAs then enters the data into the QoL research management system.
We had to extend our survey builder to meet the needs of this research effort.
Specifically, we needed a mechanism to support longitudinal research that relies on a string of published surveys.
A string of surveys can either be repeated administrations of a single questionnaire, or it can be any arbitrary sequence of different tests or surveys.
In the QoL study, the survey builder had to manage the repeated administration of the same surveys (baseline, 1st month,
2nd month, 4th month, etc.) at relatively regular time intervals. In addition to general support for longitudinal data collection,
we added a host of management features for tracking compliance and scheduling administration.
(Screen Shots, montage of QoL administration pages)
We added an alert manager to handle automated email reminders. To support the RAs' role in research,
we added a field notes component that manages all study-related notes and events, such as phone conversations, and relates each event to both
a patient and an RA. In short, the QoL research project forced us to address issues associated with supporting a community of users.
In this case, the community comprised patients who had consented to participate and the RAs who were charged with managing the study.
The QoL project is another good example of the use of surveys in a research setting.
The survey builder was a valuable tool in this regard.
However, once again we were faced with a set of similar problems.
The challenges that the QoL project presented centered on the programmatic use of surveys.
The survey builder was ill-equipped to step outside of itself and oversee the administration of surveys.
What we learned was that in addition
to a survey builder we needed facilities to manage a project or even a community.
There was a clear need for a suite of tools to manage the programmatic administration of surveys to
assigned groups, subgroups, or even specific individuals.
Our work to add user group and task management capabilities to the survey builder moved us towards issues that the ArsDigita Community System
had been focusing on for a number of years. Instead of continuing to pursue separate solutions to the same problems,
we embraced the ACS and rewrote the code to make the survey builder an ACS module.
The ACS is the foundation for maintaining a community of users where surveys are one
mechanism that is used to facilitate communication and collaboration.
Over a period of several years the survey builder has evolved and grown.
The survey builder has been through many iterative improvements, redesigns, and overhauls.
From its beginnings as a test builder, through reworking to address the needs of a decision support system, over several migrations between RDBMSs,
and finally to its new home in the ACS and Oracle,
the development of this application has been driven by the recurring need to systematically collect information from people on the Web.
In the following section, we present some of the inner details of this flexible and powerful application.
The Details
The basic components of a survey are questions, pages of questions, and the survey itself.
Surveys. The sb_surveys table has the following basic columns:
- survey_id
- author_id
- folder_id (The library folder in which a record of the survey is stored).
- survey_title
- survey_description
- survey_type (test or questionnaire)
- survey_published_p (has the survey been published out to HTML?)
Sections. A section is an abstraction of a page of questions. The sb_sections table has the following basic columns:
- section_id
- author_id
- folder_id (The library folder in which a record of this page is stored).
- section_name
- section_description
Items. Items are the abstraction of questions. The sb_items table stores questions and has the following basic structure:
- item_id
- author_id
- folder_id (The library folder in which a record of the question is stored).
- item_name
- item_description
- item_prompt_text (The stem part of the question)
- item_type (radio button, check box, select list, text box, or text area)
- item_html (the actual HTML that gets rendered by a browser)
- answer_required_p (is a response required for this question?)
To allow survey authors to browse through private collections of questions, pages, and surveys for possible reuse,
each of these entities is organized into a hierarchical folder structure (a browsing sketch follows the column list):
- sb_folders
- folder_id
- folder_name
- parent_folder_id
- author_id
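Because folders nest via parent_folder_id, an author's library can be walked as a tree. The following is a minimal sketch of such a browsing query
using Oracle's hierarchical query syntax; the :author_id bind variable is our own illustration, not part of the module's code:

-- list an author's folders as an indented tree
select lpad(' ', 2 * (level - 1)) || folder_name as folder_tree
  from sb_folders
 where author_id = :author_id
 start with parent_folder_id is null
connect by prior folder_id = parent_folder_id;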
To map which questions (items) belong to which sections (pages), and which sections belong to which surveys (either a test or a questionnaire), we needed the following tables (a page-assembly sketch follows the list):
- sb_section_use
- survey_id
- section_id
- order_in_survey (What number is this page in the survey?)
- next_section_id
- sb_item_use
- section_id
- item_id
- order_in_section (What number is this question on this page?)
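As a sketch of how these mapping tables drive page assembly, the following query gathers the questions for one page of one survey in display
order (the bind variables are illustrative):

-- fetch the items for page :page_number of survey :survey_id, in order
select i.item_prompt_text, i.item_html
  from sb_items i, sb_item_use iu, sb_section_use su
 where su.survey_id = :survey_id
   and su.order_in_survey = :page_number
   and iu.section_id = su.section_id
   and iu.item_id = i.item_id
 order by iu.order_in_section;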
When a respondent replies to a test or survey, we store the responses in a single table, sb_item_response_data, which has the following basic structure (a reporting sketch follows the list):
- survey_taker_id (user_id for the person taking the test or survey).
- survey_id
- section_id
- item_id
- item_response (the answer or reply that the respondent provided to this question).
- survey_task_id
- session_id
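Because every response lands in this one table, reporting reduces to straightforward aggregate queries. Here is a minimal sketch of the kind of
per-question tally a TA might review after a quiz (the bind variables are illustrative):

-- count how many respondents chose each answer to one question
select item_response, count(distinct survey_taker_id) as n_respondents
  from sb_item_response_data
 where survey_id = :survey_id
   and item_id = :item_id
 group by item_response
 order by n_respondents desc;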
In order to allow for anonymous survey takers, we have an sb_users table.
We map survey takers to ACS community members where applicable.
The data model is graphically represented in the following ER diagrams:
(Entire survey builder data model diagram)
(Simplified survey builder data model diagram)
The survey builder has two separate interfaces:
one interface that is a plain-vanilla ACS standard interface, and
one interface that takes advantage of JavaScript, ACS's own pull-down menus module, and a persistent listing of library objects.
Part of the reason for providing two interfaces is the preference for a richer UI where the browser supports it.
The JavaScript version allows more information to be presented to the survey author in a more compact fashion,
and it generally provides a "cooler" interface.
However, there is a good reason for providing a non-JavaScript UI as well:
a plain HTML-only UI will always work, even on the rare occasions when JavaScript is not enabled.
(Screen Shots, Survey Builder cool UI)
Dynamic Branching Details
One of the strengths of the survey builder is its support for dynamic branching.
This is one of the more important advantages this module has over the simple survey ACS module.
Since each set of answered questions is submitted back to the server, and the server communicates with the
database, adding contingent branching to a survey is a matter of the proper abstractions in the data model
and some additional supporting (Tcl) business logic. Responsibility for managing data collection in the survey builder rests with a Tcl script called
change-page. The change-page script is called when a survey taker submits a page of questions.
Its function is to accept data from a page, insert those data into the database,
and return the next page of questions to the respondent by querying the table sb_section_use.
In addition to mapping pages of questions to a survey, the table sb_section_use holds information on the order_in_survey of each page.
The following is the SQL for the sb_section_use table in the current survey builder (a next-page query sketch follows it).
create table sb_section_use (
    survey_id number
        constraint sb_sectionuse_survey_id_nn not null
        constraint sb_sectionuse_survey_id_fk references sb_surveys,
    section_id number
        constraint sb_sectionsuse_section_id_nn not null
        constraint sb_sectionsuse_section_id_fk references sb_sections,
    order_in_survey number
        constraint sb_sectionsuse_ordr_n_srvy_nn not null,
    next_section_id number,
    constraint section_use_sur_sec_id_pk primary key (survey_id, section_id)
);
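As promised above, here is a sketch of the linear (non-branching) step: given the page a survey taker has just submitted, change-page can locate
the next page with a query along these lines. (The real logic lives in Tcl and also consults next_section_id and the branching tables described
below; the bind variables are illustrative.)

-- find the page that follows :current_section_id in linear order
select section_id
  from sb_section_use
 where survey_id = :survey_id
   and order_in_survey = (select order_in_survey + 1
                            from sb_section_use
                           where survey_id = :survey_id
                             and section_id = :current_section_id);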
When a survey taker submits a page of questions, the change-page Tcl script queries the database. If a survey has been authored with contingency,
then a flag (branch_p) is set and the normal linear order of pages in the survey can be overridden.
Instead of simply presenting the next page of questions according to the order_in_survey value in the sb_section_use table,
change-page will also look to a different table (sb_survey_branch_items) for information
to determine which page should be returned to the current survey taker based upon the responses received.
Here is the structure of our data model that adds support for contingency:
create table sb_survey_branches (
    survey_branch_id number
        constraint survey_branches_sur_br_id_pk primary key,
    branch_name varchar(50),
    survey_id number
        constraint sb_surbr_survey_id_nn not null
        constraint sb_surbr_survey_id_fk references sb_surveys(survey_id),
    -- where the branch leads: another survey and/or a specific section
    next_survey_id number
        constraint sb_surbr_next_survey_fk references sb_surveys,
    next_section_id number
        constraint sb_surbr_next_section_fk references sb_sections,
    after_section_id number
        constraint sb_surveybranches_bsec_id_fk references sb_sections,
    before_section_id number
        constraint sb_surveybranches_sec_id_fk references sb_sections,
    datetime_created date
        constraint sb_surbr_datetime_created_nn not null,
    datetime_last_updated date
        constraint sb_surbr_dttime_lst_updtd_nn not null
);
create table sb_survey_branch_items (
    survey_branch_id number
        constraint sb_surveybranchitems_bra_id_nn not null
        constraint sb_surveybranchitems_bra_id_fk references sb_survey_branches,
    survey_logical_item_id number
        constraint sb_srvybrnchitms_srvy_lg_id_nn not null,
    condition_id number
        constraint sb_surveybranchitems_con_id_fk references sb_conditions,
    item_id number,
    section_id number,
    item_response_value varchar(4000)
        constraint sb_srvbrnchitms_itm_rsp_vlu_nn not null,
    constraint sb_surveybranchitems_braitm_pk primary key (survey_logical_item_id, survey_branch_id),
    constraint sb_surveybranchitems_secitm_fk foreign key (section_id, item_id)
        references sb_item_use(section_id, item_id)
);
A branch in a survey must key on a certain predetermined response option (or options).
For example, if a survey taker answers "male" to the question "What is your gender?",
and we wish the survey to branch to a set of questions appropriate to males (e.g., questions about his erectile functions)
then this question needs to be a branching item.
To make a branching item we would add the item_response_value associated with the response option "male" for this question
to the table sb_survey_branch_items.
The Tcl script change-page would then see that for this page of questions (identified by section_id) of this survey
(identified by survey_id), the "What is your gender?" question (identified by item_id) is represented in sb_survey_branch_items.
Additionally, the response value associated with the option "male" is linked to the table sb_survey_branches by the key survey_branch_id.
The table sb_survey_branches holds information regarding the appropriate next page of questions that are to be returned to the survey taker.
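For the single-question case, a sketch of the lookup that change-page might perform after storing the page's responses (the bind variables are
illustrative) looks like this:

-- did the response just received match a branching item on this page?
select b.next_survey_id, b.next_section_id
  from sb_survey_branch_items bi, sb_survey_branches b
 where bi.survey_branch_id = b.survey_branch_id
   and b.survey_id = :survey_id
   and bi.section_id = :section_id
   and bi.item_id = :item_id
   and bi.item_response_value = :item_response;

If the query returns a row, the normal order_in_survey sequence is overridden and the indicated page (or survey) is returned instead.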
The reason for splitting sb_survey_branch_items out from sb_survey_branches is to allow complex response patterns to be associated with a branch.
For example, maybe we want to branch to the question set regarding erectile functions only if the respondent is male and has indicated that he
has been diagnosed with prostate cancer. Since there is no need to collect such information from normal healthy males,
the survey author can create a multi-question branching pattern.
(Screen Shots, montage of user interface for contingency)
Our solution to accommodating multi-question contingent branching is the abstraction of a logical item,
which allows a survey author to specify branching patterns that incorporate multiple questions.
A logical item can be considered the union of several individual items. The restriction is that each constituent item must be a selected-response type of
question (i.e., radio button, check box, select list). Currently, open-ended question items (i.e., text boxes and text areas) cannot be part of a branching item.
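Under one plausible reading of the data model (our illustration, not code shipped with the module), the male-and-diagnosed branch described above
would be stored as two rows of sb_survey_branch_items grouped under one survey_branch_id, and change-page would take the branch only when every
row matches a stored response:

-- hypothetical two-question branch: male AND diagnosed with prostate cancer
insert into sb_survey_branch_items
    (survey_branch_id, survey_logical_item_id, section_id, item_id, item_response_value)
values (:branch_id, :logical_item_id_1, :section_id, :gender_item_id, 'male');

insert into sb_survey_branch_items
    (survey_branch_id, survey_logical_item_id, section_id, item_id, item_response_value)
values (:branch_id, :logical_item_id_2, :section_id, :diagnosis_item_id, 'yes');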
A narrow world view: Everything is a survey. Just publish it!
We suffer from a narrow world view. Like someone who has just learned to use a hammer,
everything in the world looks like a survey to us. We would like to see the survey builder utilized widely in the ACS, and
we have designed the publishing component so that it is very flexible. There are three aspects of publishing a survey:
- physical format
- administration policy
- task assignment
The options for physical format of a published survey are:
- HTML pages -- boiler-plate HTML templates,
- a procedure call -- for embedding a survey in any existing page,
- adp tags -- a set of adp tags that encapsulates a survey's components.
Publishing a survey to boilerplate HTML was our original publishing solution.
This approach entails parsing a template file and replacing special markers
with the item_html from the sb_items table for each question that belongs to a page in the survey.
Once published, these survey pages can be altered to change formatting or other cosmetic aspects.
While this approach to publishing has certain advantages, namely that it is simple and fast, it is rather limited.
Surveys published in this fashion almost always require work to make them visually appealing.
For this and other reasons, we are adding two other, more flexible publishing options:
publishing to a procedure call, and publishing to adp tags.
Publishing a survey as a procedure call means that a survey can be embedded into nearly any page.
This publishing option allows a programmer to place a survey in any page by simply adding a single procedure call.
Places where this may be useful include: adding a survey to a news item,
adding a multi-question survey to a page (similar to how the polls module works in the ACS), and
adding a survey to an events page to collect additional information from community members.
Publishing a survey out to a template using the survey builder adp tags allows a designer to come in and easily remake the look of the survey pages.
Publishing a survey in this manner produces a page (e.g., a Tcl page) that is sourced by AOLserver when requested by a client.
Each page in a published survey is given code to validate the current user, check the policies that the author assigned to the survey,
and query the database to produce valid HTML for a set of tags that represent components of the questions that belong to the page.
The adp tags that are produced look something like the following:
...
...
<table><tr><td>
<%=$q1_error_tag%> <%=$q1_prompt_text%>
<%=$q1_response_req%> <%=$q1_attachment_1%>
<%=$q1_attachment_2%> ...
</td></tr></table>
<table><tr><td>
<%=$q2_error_tag%> <%=$q2_prompt_text%>
<%=$q2_response_req%> ...
</td></tr></table>...
(One table is generated for each question that is part of this page).
When a survey is published using the adp template option, each question on each page (section) is prepared so that the question stem,
the response options, a placeholder for a custom error message, and placeholders for any attachments to the question are placed in an HTML table.
Once these pages have been generated a designer can easily alter the look of the survey pages by manipulating
these tags and updating the survey header and footer values.
Survey administration
There are three abstract components to survey administration: a session managing component, a policy managing component, and a data checking component.
The role of the session manager is to control access to a survey.
We call each completion of a survey a survey session.
A new session is initiated when a survey taker hits the welcome page of the survey.
If there is no welcome page, as is the case with embedded surveys, a new session is triggered by the first page of the survey.
If a survey is published to allow multiple sessions, then a person taking the survey is allowed to retake that survey repeatedly.
When publishing a survey the author has several session options:
- survey taker can retake the survey and can move back and forth making changes during a survey session,
- survey taker can take the survey only once but can move back and forth making changes during a single survey session,
- survey taker can retake the survey but can only go forward through the survey,
- survey taker can take the survey only once and can only go forward through the survey.
(Screenshot of survey policy properties)
The role of the policy manager is to enforce the specific policies that were established for the survey when it was published.
The session manager checks with the policy manager on whether to generate a new session.
Depending on the policy associated with the survey, the policy manager replies appropriately.
Before data are passed to the data checker, they must pass all the policy checks.
The role of the data checker is to validate the survey data fields and make sure they satisfy any restrictions associated with each question.
If they pass, the data are inserted into the database.
The following are the tables in the survey builder data model that contain policy information (a policy-check sketch follows them).
-- General survey policy table
create table sb_policy (
    policy_id integer
        constraint sb_pol_policy_id_pk primary key,
    policy_name varchar2(50)
        constraint sb_pol_policy_name_nn not null
        constraint sb_pol_policy_name_un unique,
    policy_desc varchar(500)
        constraint sb_pol_policy_desc_nn not null,
    author_id integer
        constraint sb_pol_author_id_fk references sb_users
        constraint sb_pol_author_id_nn not null,
    datetime_created date
        constraint sb_pol_datetime_created_nn not null,
    -- absolute date range
    date_start date,
    date_end date,
    -- number of times the survey may be taken
    n_times integer
        constraint sb_pol_n_times_ck check (n_times >= 0),
    -- periodic administration
    periodic_p char(1)
        constraint sb_pol_periodic_p_ck check (periodic_p in ('t','f')),
    -- periodic type is day (d), week (w), month (m), or year (y)
    periodic_type char(1)
        constraint sb_pol_periodic_type_ck check (periodic_type in ('d','m','w','y')),
    periodic_interval integer,
    -- editable properties
    editable_p char(1)
        constraint sb_pol_editable_p_ck check (editable_p in ('t','f')),
    editable_until date,
    editable_session_p char(1) default 'f'
        constraint sb_pol_editable_session_p_ck check (editable_session_p in ('t','f')),
    editable_restart_p char(1) default 'f'
        constraint sb_pol_editable_restart_p_ck check (editable_restart_p in ('t','f')),
    -- number of times to allow edits
    edit_count integer
        constraint sb_pol_edit_count_ck check (edit_count >= 0),
    -- saving properties
    -- number of minutes allowed to take the survey
    time_to_complete integer default 0
        constraint sb_pol_time_to_comp_ck check (time_to_complete >= 0),
    -- do we audit the user's entries?
    audit_p char(1) default 'f'
        constraint sb_pol_audit_p check (audit_p in ('t','f'))
);
-- maps policies to surveys
create table sb_survey_tasks (
    survey_task_id integer
        constraint sb_survey_task_id_pk primary key,
    survey_task_name varchar(50)
        constraint sb_survey_task_name_nn not null,
    survey_id integer
        constraint sb_survey_task_survey_id_fk references sb_surveys,
    policy_id integer
        constraint sb_survey_task_policy_id_fk references sb_policy,
    -- map to another module's row
    on_which_table varchar(30),
    on_what_id integer,
    -- which group to assign the survey to:
    -- group_id of -1 means public; 0 means members
    group_id integer,
    datetime_created date
        constraint sb_survey_task_date_nn not null,
    datetime_last_updated date
        constraint sb_survey_task_dt_lt_nn not null,
    user_id integer
        constraint sb_survey_task_user_id_nn not null
        constraint sb_survey_task_user_id_fk references sb_users
);
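As promised above, here is a sketch of one check the policy manager might perform when the session manager asks whether a survey taker may start
a new session: fetch the task's policy limits, then count the sessions already recorded for that person. (The real checks are Tcl logic; these
queries show only the flavor, and the bind variables are illustrative.)

-- the policy limits attached to this survey task
select p.n_times, p.date_start, p.date_end, p.periodic_p
  from sb_survey_tasks t, sb_policy p
 where t.survey_task_id = :survey_task_id
   and t.policy_id = p.policy_id;

-- how many sessions has this survey taker already used?
select count(distinct session_id)
  from sb_item_response_data
 where survey_task_id = :survey_task_id
   and survey_taker_id = :survey_taker_id;

If the session count has already reached n_times, or if sysdate falls outside the date_start/date_end window, no new session is generated.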
The Future
The survey builder is a general and flexible database-backed Web-based application for
constructing, administering, and managing surveys. It has been successfully applied in dozens of different
applications. The code for the survey builder is released and fully supported by ArsDigita.
In this article we have presented an overview
and a brief look at some of the internal design of this powerful tool.
Although our work has been primarily in the areas of online tests, questionnaires, and decision support systems,
we expect to see the survey builder used in more and different ways in the future.
By capitalizing on the robust and flexible capabilities of the ACS, the survey builder can be exploited in a number of areas.
We expect to use the survey builder in the future as a rapid prototyping tool for complex decision support
systems, project management systems, and in numerous other applications.
Another area to which the survey builder is well suited is adaptive testing.
The dynamic branching capabilities of the survey builder make it easy to lay out
the complex flow of screens that might be part of such new applications.
We look forward to making progress in this exciting area of interactive instruction.
In the immediate future our own development work will focus on building more powerful reporting capabilities
to include in the next release.
More
asj-editors@arsdigita.com