
Special Considerations when Running Standalone Rice

When upgrading, you need to take your Rice server into account (if you are running a separate instance). All the database upgrade scripts are broken up on the assumption that you could have separate databases for these components of the Kuali infrastructure.

The Rice team manages their own set of upgrade scripts and instructions. We have not attempted to include those in our distribution instructions explicitly. However, our distribution does include the version of the Rice project which is incorporated with KFS. Within that project you will find the needed scripts under rice/scripts/upgrades.

KFS 3.0.1 was released with an earlier version of Rice, so you will need to perform all the upgrade steps for the Rice versions between it and Rice 1.0.3. You should visit the Rice documentation home and review the release notes for each release. Instructions and considerations for performing the upgrades are documented there.

Rice/KFS Version Compatibility

As of the current releases, Rice has not reached the state of being client/server version compatible between releases. This means that your KFS and Rice servers must be upgraded in lock-step. The KFS application will always ship with an embedded copy of the needed version of Rice. However, until Rice reaches version 1.1 (and there is a release of KFS which uses 1.1), they must be upgraded in parallel.

KFS Version

Compatible Rice Server Version
The upgrade processes are separate, but for the most part, Rice upgrades are a simple drop-in of the new code plus applying the database updates provided by the Rice team. Obviously, that is an over-simplification if you have customizations to the Rice application at your institution. Once you have a test server running the new Rice version, you can start upgrading KFS.

Source Code Upgrades

Merge the updated code into your source tree. You should check in the KFS 4.0 distribution in the same manner in which you checked the prior version. Your version control system should then be able to generate a diff containing the changes and merge that into your current code line.

Our recommendation is that you first merge the KFS 4.0 changes into a vanilla copy of the 3.0.1 distribution in a branch of your version control repository. (We assume that your repository has at least the capabilities of SVN.) If you do not have a vanilla copy, use a copy of your main code line from before any customizations were made. Once that is checked in, your VC system should be able to merge the changes between the distributions into your main line of code.
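Assuming an SVN repository, that branch-and-merge flow might look like the sketch below. All repository URLs and paths here are hypothetical; substitute your own.

```
# 1. Branch the vanilla 3.0.1 vendor copy as the base for the merge:
svn copy https://svn.example.edu/kfs/vendor/kfs-3.0.1 \
         https://svn.example.edu/kfs/branches/kfs-4.0-upgrade \
         -m "Branch vanilla 3.0.1 as the base for the 4.0 merge"

# 2. Check out the branch, overlay the KFS 4.0 distribution on the
#    working copy (svn add/rm files as needed), and commit:
svn checkout https://svn.example.edu/kfs/branches/kfs-4.0-upgrade kfs-4.0-upgrade
svn commit kfs-4.0-upgrade -m "Import KFS 4.0 distribution"

# 3. Merge the 3.0.1 -> 4.0 delta into your customized main line:
cd kfs-trunk-working-copy
svn merge https://svn.example.edu/kfs/vendor/kfs-3.0.1 \
          https://svn.example.edu/kfs/branches/kfs-4.0-upgrade .
```

Resolve any conflicts in the working copy before committing the merge result to your main line.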

You can preview the files changed with this release at KFS 3.0.1 to KFS 4.0 File Change Summary.

Special Considerations

If you have overridden beans (either Spring or data dictionary) in your institutional directories, you should review the files that contain the original (delivered) versions for any changes. (E.g., if you override an inquiry section and change the field contents, then depending on the method of your change, you may not automatically pick up new fields added to the baseline configuration. Compare the 3.0.1 and 4.0 versions for changes which may affect your customizations.)

Library Location Change

If you made any changes to the work/web-root/WEB-INF/lib directory, you will need to manually move any changed libraries to the new build/kfs-lib directory. Starting with this release, the contents of work/web-root/WEB-INF/lib are generated by copying the jars from the build/rice-lib and build/kfs-lib directories. The build/rice-lib directory is automatically extracted from the Rice WAR file delivered with the project, but the files are checked in so that they are immediately available to Eclipse's classpath. (Otherwise, you would have build path errors immediately upon importing the project.)

Review XML File Overrides

Data Dictionary

You should carefully review the updated files for any data dictionary files which you may have overridden to be sure that your local overrides/extensions do not cover up any additions or deletions made to the base. (For example, if you redefined the section list on a maintenance document without the merge="true" attribute, you would not automatically get a new section if it were added to the baseline code.)
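As an illustration of the merge="true" behavior (the bean and section names below are hypothetical, not taken from the actual KFS data dictionary), Spring's collection merging is what preserves baseline additions in an override:

```xml
<!-- Institutional override of a delivered maintenance document definition.
     Without merge="true", this list completely replaces the parent's list,
     so a new section added to the 4.0 baseline would silently disappear.
     With merge="true", the institutional section is appended to whatever
     sections the parent bean defines. -->
<bean id="ExampleMaintenanceDocument" parent="ExampleMaintenanceDocument-parentBean">
  <property name="maintainableSections">
    <list merge="true">
      <ref bean="ExampleMaintenanceDocument-InstitutionalSection"/>
    </list>
  </property>
</bean>
```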

Spring Service Configuration

If you have overridden any services, you should check their parent definitions to make sure that no properties were added or deleted in this release. Since most service definitions do not use the "-parentBean" structure of the data dictionary, your overridden definitions will become incorrect if properties were changed.

Review Workflow Upgrade Scripts

In general, we will not include upgrade scripts for existing documents. We assume that if you have an existing implementation, the previous workflow definitions either met your needs or you customized them. In either case, an upgrade process should not change your production workflows. Mostly, the scripts will be for new documents. However, if you have significantly customized the document hierarchy, you should review the parent documents used to ensure that they are placed in the appropriate location in your hierarchy.

You should review each workflow modification individually to see if it is suitable for your environment. If it is a document which you already have in production, you need to compare the foundation's offering with what you have in place in terms of labels and route nodes. Ingesting any workflow XML will completely replace the definition for the affected document type. You should be especially careful in ingesting any parent document types, as they can have effects which ripple down to their child documents.

Build Process Changes

Rebuilding the Eclipse .classpath File

A new build feature is the creation of the .classpath file automatically from the contents of the following directories:


Rice libraries extracted from the included Rice server WAR file. Some exclusions apply.


Libraries needed by KFS in addition to the Rice libraries


JEE Application API jars needed for compilation only.


JDBC Drivers used during testing


Libraries which should be installed in the application server rather than within the web application.


Libraries used by unit tests.

Simply run the build-eclipse-classpath Ant target to rebuild .classpath. Be sure to refresh the project in Eclipse afterwards.

New - Classpath Restrictions

A number of libraries are now identified as "runtime only" (similar to the Maven concept) and are marked in the classpath as forbidden. Any attempt to use classes from those libraries will result in a compilation error within Eclipse. You can alter this list by changing the value of the runtime.only.jars property defined in build/properties/

Built-in Tomcat Server / Tomcat 6 Support

To aid in getting started with development, the project now includes a Tomcat 5.5.28 server in the build/tomcat directory. This location is the new default of the appserver.home property. If you are already setting that property in your local properties file, it will continue to work as before.

Also, the build process now supports Tomcat 6 for development. Set tomcat.version to "6" and point appserver.home to your tomcat installation. Because of this, the appserver.lib.dir and appserver.classes.dir are now dynamic properties set within the build file and have been removed from the property files.
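For example, a local build properties override to use an external Tomcat 6 installation might look like the following (the installation path is hypothetical; tomcat.version and appserver.home are the property names described above):

```
# Use an external Tomcat 6 installation instead of the bundled Tomcat 5.5.28
tomcat.version=6
appserver.home=/usr/local/apache-tomcat-6.0.29
```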


Property Name

This property simply adds the <distributable /> tag to the web.xml file to tell Tomcat that the session data is serializable and can be sent across the network to other servers you have registered through the Tomcat configuration.

tomcat.version: Must be "5" or "6". Only controls the locations to which appserver files are copied.

This property now defaults to build/drivers. Due to licensing, we cannot formally distribute database drivers. You can drop whatever driver files you need into this directory and run the build-eclipse-classpath Ant target to incorporate them into the .classpath file.

Initial Start Help

The first time you attempt to run an Ant target, the build script will look for the presence of a properties file in your home directory. (Technically ${user.home}, which you can redefine if needed.) If it is not present, the build script will prompt you for information on the location of your files and database and build a standard one for you.

Database Upgrades

When upgrading, you need to take your Rice server into account (if you are running a separate instance). All the database upgrade scripts are broken up on the assumption that you could have separate databases for these components of the Kuali infrastructure.

Rice Database

Rice Database Upgrade Scripts

These should be run before any KFS upgrade scripts. Please see the Rice release notes as referenced above for instructions on running these.

However, in general, the scripts will be available in the distribution at: rice/scripts/upgrades

See the KFS section below for scripts which update the rice database with new/changed KFS data.

KFS Database

Note: The KFS upgrade scripts assume that the Rice database has already been updated with any additional data or structural changes.

KFS Upgrades are delivered via Liquibase files. This allows for the updates to be given in a database-agnostic fashion. They have been primarily tested on Oracle, but have the needed adjustments to allow them to run under MySQL as well.

The easiest way to run the scripts is to copy the provided template properties file, fill it in with the appropriate DB information, and then run the provided script (or the command within it), passing in the location of the changelog file you want to execute.

Run Changes Directly into Database
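A sketch of the direct-update invocation, assuming the Liquibase command-line client is on your path; the driver jar, connection URL, and credentials shown are placeholders for an Oracle setup:

```
liquibase --driver=oracle.jdbc.driver.OracleDriver \
          --classpath=build/drivers/ojdbc14.jar \
          --url=jdbc:oracle:thin:@dbhost:1521:KFS \
          --username=kfsuser --password=secret \
          --changeLogFile=work/db/upgrades/4.0/db/master-structure-script.xml \
          update
```

The update command applies the changesets directly to the target database, recording each in Liquibase's change log table so re-runs skip already-applied changes.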

Another nice feature of Liquibase is that it can simply generate the SQL for you if you do not have a DBA who will run Liquibase scripts against your databases. The command below runs the same script but dumps the SQL commands to the console instead of applying them. (Note that you still need a database to connect to, but it will not attempt to run any of the updates.)

Run Changes to SQL Script
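A sketch of the SQL-generation invocation, again assuming the Liquibase command-line client with placeholder Oracle connection settings; updateSQL writes the SQL to standard output instead of executing it, so it can be redirected to a file for a DBA to review and run:

```
liquibase --driver=oracle.jdbc.driver.OracleDriver \
          --classpath=build/drivers/ojdbc14.jar \
          --url=jdbc:oracle:thin:@dbhost:1521:KFS \
          --username=kfsuser --password=secret \
          --changeLogFile=work/db/upgrades/4.0/db/master-structure-script.xml \
          updateSQL > upgrade.sql
```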

KFS Rice Data Upgrade Scripts

With KFS 4.0, there are updates to core Rice data tables. The scripts below need to be run against your Rice server database. If you are running with Rice embedded (no separate Rice server) then this will be your KFS database.

The scripts present in release 4.0 (all in work/db/upgrades/4.0/rice) are listed below. You should review the contents of each to see what they are doing before executing them. There are some statements in the scripts which, depending on your environment, you may want to alter or omit.




Contains new and updated parameters for KFS release 4.0. Note: Parameter updates will usually be very minimal, only correcting values which are in error and are not likely to have been updated as part of an implementation. Values which are likely to have been changed will not be included in update scripts, since we do not want to overwrite values which are working at your institution.


Updates to KNS tables for changes since KFS 3.0.1. Mostly cleanup.


Updated KIM data (new/removed permissions/responsibilities)


KNS updates related to the new Endowment module. Only needs to be executed if you will be implementing this module.


KIM Roles/Permissions/Responsibilities related to the new Endowment module. Only needs to be executed if you will be implementing this module.

KFS DB Structure Updates

These scripts update the KFS database structure for KFS 4.0. This is the most important set of scripts, as KFS will simply fail to work if these are not executed.

The master file is: work/db/upgrades/4.0/db/master-structure-script.xml. It simply executes all the scripts under 01_structure as below:



Core Modules



New columns for object code table for KC integration.


New columns on the DV document.


Expansions of a number of column lengths.

Optional Modules



Addition of country code to org options table.


Updates to name column lengths to match other parts of the system.


Updates to name column lengths to match other parts of the system.


Added active indicator to a couple tables and added PREQ_ID to elec. invoice reject doc.

Kuali Coeus Integration Module



Added table to hold defaults for auto-account creation from Kuali Coeus.

Endowment Module



All new database objects (except FK constraints) for the endowment module. The line including this script can be omitted if you will not be implementing this module.

KFS DB Data Updates

The master file is: work/db/upgrades/4.0/db/master-data-script.xml. It executes all the scripts under 02_data as below:



Core Modules



Fixes the state code referenced in one description on the DV.

Endowment Module



The bootstrap data for the endowment module. You can review the data in the CSV files in the same directory to see what will be loaded.

KFS DB Constraint Updates

These are done last to avoid any parent/child record issues.

The master file is: work/db/upgrades/4.0/db/master-constraint-script.xml. It executes all the scripts under 03_constraints as below:



Endowment Module



All FK constraints for the endowment module.

Rice Country Code Update

As part of Rice 1.0.3, the main Rice database will be updated to use the two-character ISO country codes. For many countries (like the US), the code is the same. However, since the master table is changing, all tables within KFS that hold a country code will need their data updated if they use any of the affected country codes.

Based on code provided by the Rice team, we have included scripts in kfs/work/db/upgrades/4.0/country_code_updates/country_code_sql which can be executed against the KFS database to update any references to the obsolete country codes. The convertAll.sql script is simply the concatenation of all the other table-specific scripts. This script should be safe to run against any KFS database, as it only updates records whose country codes changed between the ANSI and ISO standards.
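For example, the concatenated script could be applied with SQL*Plus (the username, password, and TNS name below are placeholders; use your KFS database's connection details):

```
cd work/db/upgrades/4.0/country_code_updates/country_code_sql
sqlplus kfsuser/secret@KFSDB @convertAll.sql
```

MySQL implementers could instead feed the script to the mysql client against their KFS schema.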

Optional Conversion


Note: The country code upgrade is optional. If you have already implemented KFS and Rice, you have a set of ANSI country codes, and while that standard has been deprecated in favor of the new ISO standard, there is no reason you cannot continue to use the old codes. The main reason to upgrade is if you share country/address data with systems outside of your institution, as many of them use the newer standard; staying with the old codes would leave you having to make the translation in your communications with them.
