DfOperations Sample Code

In case you missed it — it was buried at the end of the last post of the DfOperations Class series — source code for all of the examples discussed in the posts is here.

DFC DfOperations Classes – Part 8

In this final post on DfOperation classes, I will touch on a few advanced topics.

IDfOperation Steps

If you want a little more control over the execution of an operation, you can execute each operation one step at a time and check for errors along the way.  To execute operations step-wise, replace the DfOperation.execute() method call with the following code snippet.

IDfList steps = opObj.getSteps();
int stepCount = steps.getCount();
boolean result = true;
for (int i = 0; i < stepCount; i++) {
  IDfOperationStep step = (IDfOperationStep) steps.get(i);
  System.out.println("\t\texecuting step " + i + ". - " + step.getName());
  boolean stepResult = step.execute();

  if (!stepResult)
    result = false;
}

The result of this code for the DfCopyOperation is:

executing step 0. - copy_post_population
executing step 1. - copy_pre_processing
executing step 2. - copy_object_processing
executing step 3. - copy_container_processing
executing step 4. - copy_post_processing
executing step 5. - copy_cleanup

Now you are acquainted with the actual steps of the DfCopyOperation.

Operations Monitor

A cool thing you can do with all of the DfOperation classes is attach a monitor to them.  This is, for example, how Webtop displays the progress bar while objects are being copied/moved/imported/exported/deleted. Unfortunately, the DFC DfOperationMonitor class does not offer much to work with (the Webtop operations monitor class is much better). Here is an example of how to use monitoring in a DfOperation and an example of a class to monitor progress.

To enable monitoring, simply set the operation monitor to a DfOperationMonitor class, like this:

// setup monitor
CopyMonitor cm = new CopyMonitor();
copyOpObj.setOperationMonitor(cm);

Here is the CopyMonitor class:

private static class CopyMonitor extends DfOperationMonitor implements IDfOperationMonitor {

  @Override
  public int getYesNoAnswer(IDfOperationError error) throws DfException {
    System.out.println("[ERROR: " + error.getMessage() + "] - continuing");
    return IDfOperationMonitor.YES;
  }

  @Override
  public int progressReport(IDfOperation opObj, int opPercentDone,
      IDfOperationStep opStepObj, int stepPercentDone, IDfOperationNode opNodeObj)
      throws DfException {

    IDfProperties props = opNodeObj.getPersistentProperties();
    String objName = props.getString("object_name");
    String objType = props.getString("r_object_type");

    System.out.println("[MONITOR: operation=" + opObj.getName() +
                       " operation%=" + opPercentDone +
                       " step=" + opStepObj.getName() +
                       " step%=" + stepPercentDone +
                       " object=" + objName + " (" + objType + ")");

    return IDfOperationMonitor.CONTINUE;
  }

  @Override
  public int reportError(IDfOperationError error) throws DfException {
    System.out.println("[ERROR: " + error.getMessage() + "] - aborting");
    return IDfOperationMonitor.ABORT;
  }
}

The result of this code for the DfCopyOperation is as follows:

[MONITOR: operation=Copy operation%=4 step=copy_pre_processing step%=25 object=Nested (dm_folder)
[MONITOR: operation=Copy operation%=6 step=copy_pre_processing step%=37 object=Nested (dm_folder)
[MONITOR: operation=Copy operation%=8 step=copy_pre_processing step%=50 object=Document5 (dm_document)
[MONITOR: operation=Copy operation%=10 step=copy_pre_processing step%=62 object=Document1 (dm_document)
[MONITOR: operation=Copy operation%=12 step=copy_pre_processing step%=75 object=Document2 (dm_document)
[MONITOR: operation=Copy operation%=14 step=copy_pre_processing step%=87 object=Document3 (dm_document)
[MONITOR: operation=Copy operation%=16 step=copy_pre_processing step%=100 object=Document4 (dm_document)
[MONITOR: operation=Copy operation%=18 step=copy_pre_processing step%=112 object=VirtualDoc (dm_document)
[MONITOR: operation=Copy operation%=4 step=copy_object_processing step%=25 object=Document5 (dm_document)
[MONITOR: operation=Copy operation%=6 step=copy_object_processing step%=37 object=Document1 (dm_document)
. . .

I suspect this is not the output you expected from the CopyMonitor class. Look how the operation and step completion percentages jump around. I included a better monitor class with the code archive mentioned at the end of this post.
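The jumpy numbers can also be smoothed on the display side. As an illustration only (plain Java, no DFC types; ProgressSmoother is a hypothetical helper), one approach is to clamp each reported value to the 0-100 range and never let the displayed percentage move backwards:

```java
// Illustrative sketch (not part of the DFC): keep a displayed progress
// value in [0, 100] and monotonically non-decreasing, so a progress bar
// never jumps backwards when raw reports fluctuate.
class ProgressSmoother {
  private int lastPercent = 0;

  public int report(int rawPercent) {
    int clamped = Math.max(0, Math.min(100, rawPercent));
    if (clamped > lastPercent) {
      lastPercent = clamped;
    }
    return lastPercent;
  }
}
```

A progressReport() implementation could feed opPercentDone through such a smoother before updating the UI.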

Aborted Operations

Most operations can be rolled back after they complete, but before the operation object is destroyed, or if an error occurs during processing. This is a really handy feature to help clean up after an error, but also to implement the notion of “cancelling” an operation. The code below augments the error checking we have used previously to implement an abort() and roll back the operation.

// check for errors
if (!result) {
  IDfList errors = copyOpObj.getErrors();
  for (int i = 0; i < errors.getCount(); i++) {
    IDfOperationError err = (IDfOperationError) errors.get(i);
    System.out.println("Error in Copy with Abort operation: " + err.getErrorCode() + " - " + err.getMessage());
  }

  // process abort
  if (copyOpObj.canUndo()) {
    System.out.println("\t\taborting operation...");
    copyOpObj.abort();
  }
}

That’s all there is to it. The DfOperation class will take care of undoing all of the steps of the operation. Pretty cool, eh?
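Conceptually, the rollback works like a command pattern: each completed step is recorded, and aborting replays the recorded steps in reverse. Here is a minimal DFC-free sketch of that idea (Step and UndoableOperation are hypothetical stand-ins, not DFC classes):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical stand-in for an operation step that knows how to undo itself.
interface Step {
  void execute();
  void undo();
}

// Sketch of an operation that records completed steps so abort() can
// roll them back in reverse order.
class UndoableOperation {
  private final Deque<Step> completed = new ArrayDeque<>();

  public void execute(Step step) {
    step.execute();
    completed.push(step); // remember for possible rollback
  }

  public int abort() {
    int undone = 0;
    while (!completed.isEmpty()) {
      completed.pop().undo(); // last-executed step is undone first
      undone++;
    }
    return undone;
  }
}
```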

Wrap Up

In general, I like the DfOperation classes and use them whenever I can. As mentioned previously, there are some great benefits to using DfOperation classes instead of coding these operations yourself.  In addition to those benefits, I hope you have seen how easy they are to implement, with the added bonus of built-in undo methods and monitoring classes.

As cool and useful as DfOperations are, there are a few shortcomings, in my opinion:

  • You cannot create and insert steps into an operation.  For example, I would like to add a step to the Copy operation so that before the copy is done, the operation checks a value in a registered table.
  • You cannot extend the existing operations.  I would love to extend the DfExportOperation class to do deep folder exports.
  • You cannot write your own DfOperation classes.  I think it would be useful to create some custom operations like synchronizing metadata with an external data source.

Finally, working examples of all of the operations I have presented in this series are available here.

DFC DfOperations Classes – Part 7

In this post we will examine the Import and Export operations. These functions tend to be very common in practice. The respective DfOperations for these functions are similar to those previously discussed with a few exceptions noted below.

Import Operation

The Import operation performs a complete import. It creates necessary objects, cleans up, and patches links to XML files if necessary.


private void doImportOp(ArrayList<String> fileList, IDfId importFolderId) {

  try {

    // #1 - manufacture an operation
    IDfImportOperation importOpObj = cx.getImportOperation();

    // #2 - add objects to the operation for processing
    for (String file : fileList) {
      IDfImportNode node = (IDfImportNode) importOpObj.add(file);
      node.setDocbaseObjectType("dm_document");
      node.setFormat("crtext");
    }

    // #3 - set operation params
    // interesting no ACL specified
    importOpObj.setSession(session);
    importOpObj.setDestinationFolderId(importFolderId);
    importOpObj.setKeepLocalFile(true);

    // #4 - execute the operation
    boolean result = importOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = importOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) { 
         IDfOperationError err = (IDfOperationError) errors.get(i);
         System.out.println("Error in Import operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    } else {

      // #6 - get new obj ids
      IDfList newObjs = importOpObj.getNewObjects();
      for (int i=0; i<newObjs.getCount(); i++) { 
         IDfSysObject sObj = (IDfSysObject) newObjs.get(i);
         System.out.println("\timported " + sObj.getObjectId().toString());
        // set ACL here?
      }
    }

  } catch(Exception e) {
    System.out.println("Exception in Import operation: " + e.getMessage());
    e.printStackTrace();
  }

}

  • #2 – instead of adding sysobjects to the operation’s node tree, for Import, we add strings that represent the files (complete paths) to import.  The add() method creates IDfImportNodes which we must further update to include the object type and format for each file being imported.
  • #3 – the Import operation is the only DfOperation that requires you to explicitly set the session.  The other operation parameters set here are self-explanatory.
  • #6 – I find it interesting that there is no accommodation for setting an ACL on the DfImportNode.  After the import completes, ACLs, lifecycles, etc. can be set on the newly created objects here.

Export Operation

The export operation does a reasonable job of exporting content from the Docbase.  For virtual documents and XML documents it will export all of the referenced children of the parent document.  The most obvious drawback to the Export operation is that it does not perform a deep export of a folder tree.  You can add dm_folder objects to the operation’s node tree for export and it will export them, but none of their contents.  It would be nice to extend this DfOperation class to perform deep exports but there are limitations there too.


private void doExportOp(ArrayList<IDfSysObject> objList, String dir) {

  try {

    // #1 - manufacture an operation
    IDfExportOperation exportOpObj = cx.getExportOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      exportOpObj.add(sObj);
    }

    // #3 - set operation params
    exportOpObj.setDestinationDirectory(dir);

    // #4 - execute the operation
    boolean result = exportOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = exportOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Export operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    } else {

      // #6 - get new obj ids
      IDfList newObjs = exportOpObj.getObjects();
      for (int i=0; i<newObjs.getCount(); i++) {
        IDfSysObject sObj = (IDfSysObject) newObjs.get(i);
        System.out.println("\texported " + sObj.getObjectId().toString());
      }
    }

  } catch(Exception e) {
    System.out.println("Exception in Export operation: " + e.getMessage());
    e.printStackTrace();
  }

}

Really, other than the caveats mentioned above, the Export operation is pretty straightforward. In the next post, I will briefly touch on some advanced topics and wrap up this series on the DfOperations classes.

DFC DfOperations Classes – Part 6

The delete operation is one of my favorites. Perhaps because I use it so often, or because I find people like to write this one themselves (me included). It must be the allure of writing a recursive method to do deep deletes that entices people to write it. Or maybe it is an old habit held over from the days before the operations classes, when we had to implement all of this functionality ourselves. Anyway, whatever the reason, I encourage you to use the DfDeleteOperation class instead.

Delete Operation

The delete operation does everything you expect it to: it destroys objects in the Docbase, and if the object is a folder, virtual document, or XML document, it will destroy all of its substructures as well.


private static void doDeleteOp(ArrayList<IDfSysObject> objList) {
  try {

    // #1 - manufacture an operation
    IDfDeleteOperation deleteOpObj = cx.getDeleteOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      deleteOpObj.add(sObj);
    }

    // #3 - set op parameters
    deleteOpObj.enableDeepDeleteFolderChildren(true);
    deleteOpObj.enableDeepDeleteVirtualDocumentsInFolders(true);
    deleteOpObj.setDeepFolders(true);

    // #4 - execute the operation
    System.out.println("\tdeleting... ");
    boolean result = deleteOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = deleteOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {  
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Delete operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    } else {
      IDfList deletedObjs = deleteOpObj.getObjects();
      for (int i=0; i<deletedObjs.getCount(); i++) {
         IDfSysObject sObj = (IDfSysObject) deletedObjs.get(i);
         System.out.println("\tdeleted object " + sObj.getObjectId().toString());
      }
    }

  } catch(Exception e) {
    System.out.println("Exception in Delete operation: " + e.getMessage());
    e.printStackTrace();
  }
}

There is not much remarkable about this code. The most interesting bits take place at #3 where the operation parameters are set. All of these parameters are true by default, but I set them just to highlight their existence. See the DFC Javadocs for specifics on what each parameter does.

The DfDeleteOperation is by far the best and most robust delete operation I have seen. Beyond the operation parameters, you can customize each node (DfDeleteNode) added to the operation’s node tree at #2 to behave differently, such as deleting only certain versions. I showed you an example of using these operation-specific nodes last week with the Move operation.

In the next post I will present the Import and Export operations.

DFC DfOperations Classes – Part 5

In this post I will show you two related functions: copy and move.

Copy Operation

The copy operation copies a folder, document, virtual document or XML document from its current location, to the location specified. If a folder, virtual document or XML document is added to the operation’s node list, the copy is by default “deep”, meaning it will copy sub-folders and children along with the parent object.


private void doCopyOp(ArrayList<IDfSysObject> objList, IDfFolder toFolder) {

  try {

    // #1 - manufacture an operation
    IDfCopyOperation copyOpObj = cx.getCopyOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      copyOpObj.add(sObj);
    }

    // #3 - set copy params
    copyOpObj.setCopyPreference(DfCopyOperation.COPY_COPY);
    copyOpObj.setDestinationFolderId(toFolder.getObjectId());

    // #4 - execute the operation
    boolean result = copyOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = copyOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Copy operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    } else {
      // #6 - get new obj ids
      IDfList newObjs = copyOpObj.getNewObjects();
      for (int i=0; i<newObjs.getCount(); i++) {
        IDfSysObject sObj = (IDfSysObject) newObjs.get(i);
        System.out.println("\tnew object is " + sObj.getObjectId().toString());
        newSysObjs.add(sObj); // newSysObjs is a class-level list (see the full source)
      }
    }

  } catch(Exception e) {
    System.out.println("Exception in Copy operation: " + e.getMessage());
    e.printStackTrace();
  }

}

Again, the only thing of real note here is the use of the operation-specific parameters at #3.

  • setCopyPreference takes an integer constant defined in the DfCopyOperation class.  This parameter dictates the kind of copy to perform (make copies of children objects, or reference existing children objects)
  • setDestinationFolderId indicates where the copies should be made.  Note this parameter is an IDfId object.

Move Operation

The move operation will move objects from one location to another in the repository.  It performs all necessary linking and unlinking of objects.  If the object to be moved is a virtual document or a folder, all of the object’s substructures will be moved also.


private void doMoveOp(ArrayList<IDfSysObject> objList, IDfFolder fromFolder, IDfFolder toFolder) {

  try {

    // #1 - manufacture an operation
    IDfMoveOperation moveOpObj = cx.getMoveOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      moveOpObj.add(sObj);
    }

    // #3 - set the source and target folder
    moveOpObj.setDestinationFolderId(toFolder.getObjectId());
    moveOpObj.setSourceFolderId(fromFolder.getObjectId());

    // #4 - execute the operation
    boolean result = moveOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = moveOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Move operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    } else {
      // #6 - get obj ids
      IDfList objs = moveOpObj.getObjects();
      for (int i=0; i<objs.getCount(); i++) {
        IDfSysObject sObj = (IDfSysObject) objs.get(i);
        System.out.println("\tmoved object " + sObj.getObjectId().toString());
      }
    }

  } catch(Exception e) {
    System.out.println("Exception in Move operation: " + e.getMessage());
    e.printStackTrace();
  }

}

Note that with the move operation you must provide the object id of the source folder for the object you are moving, in addition to the destination; this is necessary for the unlink to occur. If the objects you add to the operation’s node tree live in different folders, you will need to indicate the source folder for each object as it is added to the tree. In that case, you would not call setSourceFolderId on the operation itself, and would change the code that adds objects to the operation’s node tree (#2) to look like this:


// #2 - add objects to the operation for processing
for (IDfSysObject sObj : objList) {
  IDfMoveNode node = (IDfMoveNode) moveOpObj.add(sObj);
  node.setDestinationFolderId(toFolder.getObjectId());
  node.setSourceFolderId(sObj.getFolderId(0));
}

This format for adding objects to the operation is actually valid for all of the operation classes. So, if you need more control over the objects you are adding to an operation, it can be achieved like this.

In the next post I’ll show you the delete operation.

DFC DfOperations Classes – Part 4

This post will build upon the last post and demonstrate how to reverse the Checkout operation with either a Checkin operation or a Cancel Checkout operation.

Checkin Operation

The checkin operation does all the things you would expect it to:  versions the content file appropriately, transfers content, unlocks objects, patches XML files, updates the registry, and cleans up local files.


private void doCheckinOp(ArrayList<IDfSysObject> objList) {

  try {

    // #1 - manufacture an operation
    IDfCheckinOperation checkinOpObj = cx.getCheckinOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      checkinOpObj.add(sObj);
    }

    // #3 - set operation params
    checkinOpObj.setCheckinVersion(DfCheckinOperation.NEXT_MINOR);
    checkinOpObj.setKeepLocalFile(false);

    // #4 - execute the operation
    boolean result = checkinOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = checkinOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Checkin operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    }

    // #6 - get new obj ids
    IDfList newObjs = checkinOpObj.getNewObjects();
    for (int i=0; i<newObjs.getCount(); i++) {
      IDfSysObject sObj = (IDfSysObject) newObjs.get(i);
      System.out.println("\tchecked in " + sObj.getObjectId().toString());
    }

  } catch(Exception e) {
    System.out.println("Exception in Checkin operation: " + e.getMessage());
    e.printStackTrace();
  }

}

The Checkin operation does not depart from the basic form of the operation code discussed previously, but look at its power and simplicity. Only two notes to make:

  • #3 – set the Checkin operation-specific parameters.  In this case, indicate how to handle versioning. (Versioning behavior is defined by an integer constant;  see the DFC Javadocs.)  And second, indicate what to do with the local content, keep it or delete it.
  • #6 – Notice here that I use the getNewObjects() method to get the object ids of the newly checked in objects.  To get the original object ids, use the getObjects() method.

Take a look back at some of the code you have written for doing checkins and see how it compares with the compactness and clarity of these 40 lines of code. The DfCheckinOperation offers a ton of function and capability in a compact space.

CancelCheckout Operation

The Cancel Checkout operation completely nullifies a check out by unlocking objects in the Docbase (including virtual document children and XML nodes), removing local content, and updating the registry.


private void doCancelCheckoutOp(ArrayList<IDfSysObject> objList) {

  try {

    // #1 - manufacture an operation
    IDfCancelCheckoutOperation cancelCheckoutOpObj = cx.getCancelCheckoutOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      cancelCheckoutOpObj.add(sObj);
    }

    // #3 - set operation params
    cancelCheckoutOpObj.setKeepLocalFile(false);

    // #4 - execute the operation
    boolean result = cancelCheckoutOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = cancelCheckoutOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Cancel Checkout operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    }

    // #6 - get obj ids
    IDfList objs = cancelCheckoutOpObj.getObjects();
    for (int i=0; i<objs.getCount(); i++) {
      IDfSysObject sObj = (IDfSysObject) objs.get(i);
      System.out.println("\tcancelled checkout " + sObj.getObjectId().toString());
    }

  } catch(Exception e) {
    System.out.println("Exception in Cancel Checkout operation: " + e.getMessage());
    e.printStackTrace();
  }

}

Again, the only thing of note about this operation is the operation-specific parameter set at #3.

The next post will look at copying and moving objects in the Docbase.

DFC DfOperations Classes – Part 3

In this post I will show you how to take the general implementation of the DfOperation class described in the last post, and turn it into a concrete implementation for the Checkout operation.

I recently discovered that DfOperation code written on a 32-bit Windows machine would not run on a 64-bit Windows machine.  The operation classes make heavy use of the Registry, and the 32-bit Registry code does not run properly on a 64-bit machine.  The simple solution was to tell the DFC to use a file-based Registry instead of the system Registry.  Add this line to your dfc.properties file and you will be fine.

dfc.registry.mode=file

Checkout Operation

The Checkout operation will lock and download content for all sysobjects passed to it. It also creates registry entries (so they can be checked in or cancelled), and will patch XML files if needed.


private void doCheckoutOp(ArrayList<IDfSysObject> objList, String checkoutDir) {

  try {

    // #1 - manufacture a specific operation
    IDfCheckoutOperation checkoutOpObj = cx.getCheckoutOperation();

    // #2 - add objects to the operation for processing
    for (IDfSysObject sObj : objList) {
      checkoutOpObj.add(sObj);
    }

    // #3 - set operation params
    checkoutOpObj.setDestinationDirectory(checkoutDir);
    checkoutOpObj.setDownloadContent(true);

    // #4 - execute the operation
    boolean result = checkoutOpObj.execute();

    // #5 - check for errors
    if (!result) {
      IDfList errors = checkoutOpObj.getErrors();
      for (int i=0; i<errors.getCount(); i++) {
        IDfOperationError err = (IDfOperationError) errors.get(i);
        System.out.println("Error in Checkout operation: " + err.getErrorCode() + " - " + err.getMessage());
      }
    } else {

      // #6 - get obj ids
      IDfList objs = checkoutOpObj.getObjects();
      for (int i=0; i<objs.getCount(); i++) {
        IDfSysObject sObj = (IDfSysObject) objs.get(i);
        System.out.println("\tchecked out " + sObj.getObjectId().toString());
      }

      // #7 - open checked out files
      IDfList checkedOutNodes = checkoutOpObj.getRootNodes();
      for (int i=0; i<checkedOutNodes.getCount(); i++) {
        IDfCheckoutNode nodeObj = (IDfCheckoutNode) checkedOutNodes.get(i);
        String path = nodeObj.getFilePath();
        if (path != null && path.length() > 0) {
          Runtime.getRuntime().exec("rundll32 SHELL32.DLL,ShellExec_RunDLL " + path);
        }
      }
    }

  } catch(Exception e) {
    System.out.println("Exception in Checkout operation: " + e.getMessage());
    e.printStackTrace();
  }
}

Most of the details in this code were covered in the previous post, but there are a few areas specific to the Checkout operation I want to point out.

  • #3 – sets two operation parameters that are specific to the DfCheckoutOperation. The setDestinationDirectory parameter sets the location the content files are downloaded to, and the setDownloadContent parameter tells the operation to download the content files. It is possible to check out files without downloading their content by setting this parameter to false.
  • #6 – simply gets a list of all the objects that were checked out.
  • #7 – if you want to manipulate the objects that were checked out, use the getRootNodes() method. This method returns each checked out object as an IDfCheckoutNode object that includes information such as where the object’s content was checked out to. The next few lines of code demonstrate how to automatically have Windows open the checked out files.
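The rundll32 call in #7 is Windows-only. As a hedged, DFC-free sketch, the open command could instead be chosen per platform (FileOpener is a hypothetical helper; java.awt.Desktop.getDesktop().open(File) is another portable option):

```java
// Hypothetical helper: build a platform-appropriate command for opening
// a file with its default application. The caller would hand the result
// to ProcessBuilder or Runtime.exec.
class FileOpener {
  static String[] openCommand(String path) {
    String os = System.getProperty("os.name").toLowerCase();
    if (os.contains("win")) {
      return new String[] { "rundll32", "SHELL32.DLL,ShellExec_RunDLL", path };
    } else if (os.contains("mac")) {
      return new String[] { "open", path };
    } else {
      return new String[] { "xdg-open", path }; // most Linux desktops
    }
  }
}
```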

Next post we’ll take a look at checking these objects back in and cancelling the checkout operation.

DFC DfOperations Classes – Part 2

Before I get into specific implementations of DfOperation classes (next post), I want to give you a general overview of how operations are implemented.  Each operation class contains methods and attributes specific to its particular operation (e.g., checkout is different from move). However, they also share a lot of commonality (inherited from the DfOperation class).  Thus, the invocation of each operation class is basically the same:

  • Instantiate the class – instantiate an interface class for the operation you want to implement.  Operation classes are manufactured from the DfClientX factory classes (e.g., DfClientX.getXXXOperation() where XXX denotes a specific operation name).
  • Populate the operation class – populate the operation with the necessary objects, properties, and execution options.
  • Execute – run the operation.
  • Check for errors – check for errors that occurred during the execution of the operation.  Because operations can be run on multiple objects and not all objects might fail, errors are caught and handled internally as opposed to being thrown to the caller.  Errors should be checked and processed accordingly.
  • Process results – each operation returns different results that might require additional processing.  For example, the Checkin operation returns object ids for newly created objects.

The following pseudocode demonstrates the generic setup and execution of a DfOperation.  Note the use of XXX where specific operation names should be used.


try {
  // #1 - create a clientX factory object
  IDfClientX cx = new DfClientX();

  // #2 - manufacture a specific operation
  IDfXXXOperation opObj = cx.getXXXOperation();

  // #3 - add an object to the operation
  opObj.add(sObj);

  // #4 - execute the operation
  boolean result = opObj.execute();

  // #5 - check for errors
  if (!result) {
    IDfList errors = opObj.getErrors();
    for (int i=0; i<errors.getCount(); i++) {
      IDfOperationError err = (IDfOperationError) errors.get(i);
      System.out.println("Error in operation: " + err.getErrorCode() + " - " + err.getMessage());
    }
  } else {

    // #6 - get new obj ids
    IDfList newObjs = opObj.getNewObjects();
    for (int i=0; i<newObjs.getCount(); i++) {
      IDfSysObject sObj = (IDfSysObject) newObjs.get(i);
      System.out.println("\tnew object is " + sObj.getObjectId().toString());
    }
  }

// #7 - exceptions
} catch(Exception e) {
  System.out.println("Exception in operation: " + e.getMessage());
  e.printStackTrace();
}

  1. Get an IDfClientX object.
  2. Get the specific operation object from the IDfClientX class.  The XXX represents the name of a real operation (found here).
  3. Add an object to operate on. In this example, I assume sObj (an IDfSysObject) was passed to this method. The object itself could represent a document, a folder, a virtual document, or an XML document depending upon the operation. The add() method must be called for each individual object, so if you pass this method an IDfList of objects, loop through them and add each one individually. The exception to this rule is if you add the root of a virtual document (as an IDfVirtualDocument), the add() method is smart enough to add all of its children also. The same is true for an XML document. Notice that the add() method wants an actual IDfSysObject and not an IDfId or String.
  4. Execute the operation.
  5. If an error occurred, the result of the execute() method will be false.  Errors are contained in an IDfList object. Remember that operations do not throw exceptions except for fatal errors.  All other exceptions are caught internally and stored as IDfOperationError objects in the IDfOperation object itself.  An error while processing one object in the operation does not necessarily terminate the operation for all the remaining objects.
  6. If the operation created new objects in the repository (e.g., checkin or copy), these objects are also stored in the operation object.  If the operation did not create new objects (e.g., move or delete), the method call is getObjects() (as opposed to getNewObjects() above) and returns the objects that the operation processed.
  7. Catch any fatal operation errors just in case.
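The accumulate-rather-than-throw error contract described in steps 5-7 can be sketched without any DFC classes. In this illustration, SketchOperation and its process() method are hypothetical stand-ins: a failure on one object is recorded as an error string, and the remaining objects are still processed.

```java
import java.util.ArrayList;
import java.util.List;

// DFC-free sketch of the operation error contract: per-object failures
// are collected instead of thrown, and execute() returns false if any
// object failed.
class SketchOperation {
  private final List<String> objects = new ArrayList<>();
  private final List<String> errors = new ArrayList<>();

  public void add(String obj) { objects.add(obj); }

  public boolean execute() {
    for (String obj : objects) {
      try {
        process(obj);
      } catch (Exception e) {
        errors.add(obj + ": " + e.getMessage()); // accumulate, keep going
      }
    }
    return errors.isEmpty();
  }

  public List<String> getErrors() { return errors; }

  // Hypothetical per-object work; fails for names starting with "bad".
  private void process(String obj) throws Exception {
    if (obj.startsWith("bad")) {
      throw new Exception("cannot process");
    }
  }
}
```

Only a fatal error escapes as an exception; everything else ends up in getErrors(), mirroring step 5.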

Next week I’ll show you a sample implementation for the Checkout operation.

DFC DfOperations Classes – Part 1

The operation classes (DfOperation) in the DFC offer huge benefits to developers (and ultimately end users), but seem to get little use or notice. Nearly every application I have supported in the past few years has contained custom implementations of basic library functions (e.g., checkin, checkout, etc.). How many of you have written code to implement one or more of these functions? Perhaps you have a library of these functions that you have written and hardened over time and now tote around with you from project to project. Or worse, rewrite these functions for every project. I know, we’re all guilty of doing it.

However, there is a better way. Documentum, since the beginning of the DFC, has provided the DfOperation classes to implement all of these core library functions:

  • Cancel Checkout (IDfCancelCheckoutOperation) – Releases locks on checked out objects and cleans up local resources allocated to them.
  • Checkin (IDfCheckinOperation) – Checks in new content, creates necessary versions, releases locks, and cleans up locally allocated resources.
  • Checkout (IDfCheckoutOperation) – Locks the object and exports its content for editing.
  • Copy (IDfCopyOperation) – Copies objects to other locations in the repository, including deep folder structures and virtual documents.
  • Delete (IDfDeleteOperation) – Deletes objects from the repository, including deep folder structures and virtual documents.
  • Export (IDfExportOperation) – Exports content from the repository.
  • Import (IDfImportOperation) – Imports content into the repository.
  • Move (IDfMoveOperation) – Moves objects in the repository, including deep folder structures and virtual documents.
  • Transform (IDfTransformOperation) – Performs an XSL transformation on XML content.
  • Validation (IDfValidationOperation) – Validates XML documents against an XML schema.

Note: there are no operations for creating or viewing objects.

The advantages of using these operation classes over your own are numerous. Here are a few:

  • Take advantage of the years of thought and testing Documentum has invested in these classes. Documentum uses these classes  internally in its applications (e.g., WDK) so you can be confident they are solid.
  • Insulate your code against underlying changes to Documentum and the DFC. Since Documentum uses these classes internally, any such changes will be absorbed by the classes themselves.
  • You can do more with less code. When you see what these classes can do and how they can be used, you’ll wish you had been using them all along.
  • The classes are full featured and provide a consistent methodology for handling errors and even rolling back aborted operations.
  • The classes are all XML- and virtual document-aware in case you are dealing with XML content or virtual documents.
  • The classes can operate on objects distributed across multiple repositories with no additional work or code.
  • These classes are naturally ACS and BOCS-aware.

In the following few posts I will dig into the DfOperation classes and show you how to use them, demonstrate their advantage over custom code, and hopefully convince you of their utility. In the next few weeks, look for these topics:

  • basic use of DfOperation classes;
  • examples of checkout, checkin, and cancel checkout operation classes;
  • examples of copy, move, delete operation classes;
  • how to handle errors and aborted operations;
  • advanced topics like running operation steps  and using operation monitors.