Monday, 19 November 2012
Open Source Development in an Ideal World
What should the ideal large open source project offer potential developers to help them to get started as quickly and easily as possible? These are my ideas on what would make my life easier when I'm thinking about producing some code for a project. I wouldn't expect any project to manage to get all of these right, but the more that can be managed the better! As a project grows larger, with more developers involved and more code to understand for even the smallest piece of development, these issues become ever more important.
- There should be one major release at a time, or it should be clear which release is to be developed against for best integration with the current wishes of the software project community (whether it is the current stable release, existing alpha and beta releases, nightly build versions, or the code available at the moment from the repository). It should be easy to download the right version of the source code at any particular time (including the build which is recommended for development).
- There should be clear links from the main project website to instructions for developers.
- There should be a mailing list, and it should support a helpful and understanding development culture.
- Code conventions and architectural assumptions (e.g. which Java technologies should be used for specific purposes) should be clearly documented, either on the Internet or in the code itself. It should not be necessary to use file search tools or the search interface of an integrated development environment (IDE) to find the source code for a major element of the product.
- It should be clear how back-end code and interface interact.
- It should be clear how authorisation to carry out actions is handled.
- There should be a general architectural philosophy which allows the use of plug-in code where this would be useful. An example of where this would make life easier is adding small features (often known as "widgets") to the interface: a widget can then consist of just a fragment of HTML template plus supporting code which talks to the main project code through an API, rather than requiring modification of the main code itself. (A minimal sketch of what I mean follows this list.)
- Java and other modern languages are associated with a wide range of supporting technologies (e.g. compilers, HTML template engines, and development frameworks), and the specific choices made for an individual open source project should be clearly documented, as it is unlikely that developers new to the project will be familiar with all of them.
- The source code should compile without too much difficulty most of the time (daily builds may be expected to be broken sometimes, and in alpha and beta releases, some parts of the code may fail to compile if components are not yet updated to match new requirements).
- There should be Javadoc style documentation for every class in the source code to at least make it easy to determine the structure of each class, especially for large classes with many methods. A lot of information can often be obtained about how the code works just from the names of the methods, even if there is no extra detail available. The location of Javadocs should be central and clearly signposted. The Javadocs should be indexed by search engines (if they are automatically generated frequently, this might not happen, so it makes sense to permanently store static versions associated with stable releases, at least).
- Code should have at least some comments, and Java classes should not be unnecessarily complex and long – one of the points of modern programming languages is that developers do not need to put all the code into one huge source file.
- If a development tool such as an IDE is recommended, this should be made clear and it should be possible to import the code into the IDE either as a whole or in sections without much difficulty, even when it does not compile. This should be possible with the current version of the IDE (or, if not, the version to use should be clear) on a computer with a reasonable amount of memory.
- Where there are requirements to install third party code, these should be clear and properly maintained (i.e., links to download libraries should not produce “not found” errors). This includes, for example, extensions which need to be installed for the code to work with an IDE of choice, as well as the libraries used by the project itself, for which technologies such as maven make the management of the required libraries much easier.
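To make the plug-in point above concrete (the sketch promised in the widget item), here is the kind of minimal contract I have in mind - entirely hypothetical, not taken from any existing project, with ProjectApi standing in for whatever API the host project exposes:
import java.util.Map;

// Hypothetical widget plug-in contract: the host application discovers
// implementations and renders each one from its HTML template fragment
// plus the context data it supplies through the public API.
public interface Widget {
    /** Name of the HTML template fragment this widget supplies. */
    String getTemplateName();

    /** Build the data the template needs, using only the project's public API. */
    Map<String, Object> getContext(ProjectApi api); // ProjectApi is invented for illustration
}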
Wednesday, 10 October 2012
Sakai Development: Post Nine
Before actually starting to write the code to do the deposit itself, I need to set up and include the SWORD2 Java client libraries. If you're not used to github, it might take you a while to spot the button which lets you download the library as a zip file. Unzip it, cd to the created directory, and run
mvn clean package
to compile (and download a large number of new library files). This should hopefully end up with:
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/simon/work/swordapp-JavaClient2.0-420485d/target/sword2-client-0.9.2.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3 minutes 24 seconds
[INFO] Finished at: Wed Sep 26 15:14:32 BST 2012
[INFO] Final Memory: 27M/121M
[INFO] ------------------------------------------------------------------------
and then it's a question of copying the jar file to somewhere where it can be picked up by Sakai. This requires two things (assuming my understanding of how maven POM files work is correct):
- Add a dependency to the relevant pom.xml file, which will be picked up on compilation, so that maven will attempt to download the relevant jar file and, if it can't, will ask for it to be installed locally by hand. The relevant file is in the content/content-tool/tool directory, and needs a dependency block like the following added:
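Something like this should do it - the groupId is my assumption from the client's package name (org.swordapp.client), and the version matches the jar built above; check against the POM in the client source if it doesn't resolve:
<dependency>
  <groupId>org.swordapp</groupId>
  <artifactId>sword2-client</artifactId>
  <version>0.9.2</version>
</dependency>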
- Import the necessary classes into the ResourcesAction.java file so that the library can be used in this script. This is a simple pair
import org.swordapp.client.SWORDClient;
import org.swordapp.client.*;
at line 135 of the file. The code which will use this is based on doFinalizeDelete (lines 6545-6675), which follows doDeleteConfirm when the confirmation is selected. I haven't yet worked out where the actual display of the confirmation request happens, so this is by no means the last change. The confirmation step could also include the opportunity to select from a list of repositories, and from the collections which exist at the chosen repository (as obtained from the repository's SWORD2 service document). But that is a complication I don't really want to get into at this stage; I just want to be able to carry out a simple deposit. So I'm going to have both the repository and the collection set in the configuration file for the servlet.
The process is relatively simple; however, I should point out that I have just noticed in the SWORD2 client documentation that multi-part deposit is not supported, and the way I have been thinking has assumed that it is. So I will have to make a zip file or something of the items to be deposited (as their nature as a collection is important). java.util.zip is a core package, but not one I've ever used before; I'll start by adding the import for the package at the top of the file (new line 52).
The steps for producing a SWORD2 deposit from the selected files are:
- Get archive details from configuration (some will always need to be obtained from configuration, but a service description could be downloaded on the fly to get information about collections etc. - just not in this version as I'm already overrunning the schedule);
- Prepare metadata, using data from the confirmation form, which should basically be a subset of DC terms for use in the deposit (bearing in mind that it's possible for the depositor to go to the repository and change the metadata later if necessary);
- Prepare files - create a zip as a file stream;
- Authenticate to repository using userid/password provided in confirmation form;
- Make deposit and (hopefully) get back a receipt.
While the information given in the swordapp documentation at first looks pretty complete, it is missing some details which I need, as I discover on starting to put the code together. I'll need to look at the source code for the app to get them.
The first issue is with the metadata. Dublin Core is not the only metadata used in SWORD2; there is some basic information which is part of the Atom profile: title, author, last modification date, and so on, as seen in the example here. The documentation gives no information about how to set this, and in fact I can't find anything useful in the source code (the string "updated", which is the element name for the last modification date, does not appear anywhere in the client code). I'm not particularly familiar with Atom, so it is possible that these are optional. I'll ignore this for the moment and see what happens. I'm also going to assume that in order to give multiple entries, I just repeatedly add a term; this needs to be supported for DC contributor. I think this should work, but I haven't actually gone through the apache Abdera library, which the swordapp code uses, to check this.
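Putting that into code, and following the pattern in the client README, building the metadata part (the ep object which appears in the deposit code below) should look something like this - the form field names are placeholders, and the repeated contributor lines are exactly the unverified assumption just mentioned:
import org.swordapp.client.EntryPart;

EntryPart ep = new EntryPart();
ep.addDublinCore("title", params.getString("title"));
ep.addDublinCore("abstract", params.getString("description"));
// assumption: repeating a term gives multiple values - not checked against Abdera yet
ep.addDublinCore("contributor", params.getString("contributor1"));
ep.addDublinCore("contributor", params.getString("contributor2"));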
Just at this point Andrew Martin put up a useful blog post which details his journey to working with the SWORD2 java library. He's not doing exactly the same thing, though we have already been in contact. While I need to go deeper into the coding, his post is probably a very useful resource for anyone reading this one.
The next thing to sort out is creating a ZIP file (virtually) to contain the items selected for archiving. I've not done this before, and the ZIP creation stuff I can find online, as well as in my ancient Java book, concentrates on making a ZIP from files (this looks pretty useful for that, and is likely to form the framework I'll use) rather than from the Sakai content platform, where the content may not even be stored in a filesystem. So I need to work out how to get the content out of the files as a Java input stream. I'll start by looking through the ResourcesAction.java code, and then move on to other files in the same directory if I can't find anything. All the input streams in ResourcesAction are for making changes to content rather than reading it - which makes sense, as reading is not an action which affects the resource. But this code from FilePickerAction.java (lines 1710-13) makes it look very simple:
InputStream contentStream = resource.streamContent();
String contentType = resource.getContentType();
String filename = Validator.getFileName(itemId);
String resourceId = Validator.escapeResourceName(filename);
I just need to work back through the context to be sure that this code is doing what it appears to be doing. Although it doesn't appear to be (because it's in a method for updating the resource), this is what it is in fact doing, as I eventually discover when I find the relevant javadocs (ContentHostingService, BaseContentService, and ContentResource - not from the current release, though). To re-use this code, the ContentResource class needs to be loaded, which it already is, and the content service needs to be set up (outside the loop which runs through the selected items):
ContentHostingService contentService = (ContentHostingService) toolSession.getAttribute (STATE_CONTENT_SERVICE);
The first problem, then, is that what I have is a ListItem object, when what I want is the itemID (which is a String); this is simple, as id is a property of the ListItem object, so I can just get it. I'll also need to protect against the itemid being null, which I don't think should happen. I'm not quite sure what the correct thing would be to do if it does, so I'll just log a warning if that happens. So the code I add is (lines 6741-4):
String itemid = item.id;
ContentResource resource = null;
if (itemid != null) {
and then in the conditional block (6752-6774),
resource = contentService.getResource(itemid);
InputStream contentStream = resource.streamContent();
byte[] buf = new byte[1024];
// get filename and add to zip entry
String fileName = item.getName();
if (fileName != null) {
    zip.putNextEntry(new ZipEntry(fileName));
} else {
    zip.putNextEntry(new ZipEntry("Unnamed resource with ID " + itemid));
}
int len;
while ((len = contentStream.read(buf)) > 0) {
    zip.write(buf, 0, len);
}
zip.closeEntry();
contentStream.close();
Some preparation has to happen before this, using piped streams to turn the zip output into the input for the SWORD library, calculating the MD5 hash we want for the deposit stage on the way:
MessageDigest md = MessageDigest.getInstance("MD5");
DigestInputStream dpins = new DigestInputStream(pins, md);
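The piped-stream plumbing itself isn't shown above, so for completeness, here is the setup which that fragment assumes (pouts, pins and zip all reappear in the deposit code below). One caveat: piped streams are designed for a writer and a reader on separate threads, so filling the whole zip before anything reads from pins risks blocking once the pipe buffer fills - something to watch when this gets tested:
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.security.DigestInputStream;
import java.security.MessageDigest;
import java.util.zip.ZipOutputStream;

PipedOutputStream pouts = new PipedOutputStream();
PipedInputStream pins = new PipedInputStream(pouts);
// zip entries written to zip come back out of pins as the bytes of the zip file
ZipOutputStream zip = new ZipOutputStream(pouts);
MessageDigest md = MessageDigest.getInstance("MD5");
// anything the SWORD client reads through dpins is MD5-hashed on the way past
DigestInputStream dpins = new DigestInputStream(pins, md);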
And now we should be in a position to set up the deposit with the SWORD2 library.
It's also occurred to me that the solution to the problem of multiple archives is to embed them in the confirmation web form - the user selects the archive there from a drop-down list, and the script picks up the URL used for deposit. So the URL to use is then just a form parameter. Except - the sword client readme file suggests that a collection object (derived from the collection description in a service document) is needed for deposit, so I need to check in the source code to see if there's a method which takes a deposit URL as an alternative. It turns out that there is, so I'll use that. So we have (ignoring a whole load of exceptions which will surely need to be caught, for the moment):
// Set up authentication
AuthCredentials auth = new AuthCredentials(params.getString("userid"), params.getString("password"));
// Make deposit and (hopefully) get back a receipt
Deposit deposit = new org.swordapp.client.Deposit();
deposit.setFile(dpins);
dpins.close();
pins.close();
pouts.close();
byte[] fileMD5 = md.digest();
deposit.setMimeType("application/zip");
deposit.setFilename(params.getString("title") + ".zip");
deposit.setPackaging(UriRegistry.PACKAGE_SIMPLE_ZIP);
deposit.setMd5(fileMD5);
deposit.setEntryPart(ep);
deposit.setInProgress(true);
// And now make the deposit
DepositReceipt receipt = client.deposit(params.getString("depositurl"), deposit, auth);
The next issue is what to do with the receipt, and how to alert the user to the success or failure of the deposit. The confirmation web page should still be available (especially if it has an indication that the status of the archiving will be displayed there), so the result can be displayed there, dynamically. For the moment, I'll just hold on to the receipt and revisit this code when I've written the appropriate Velocity file.
There's just one final bit of code to add to this file, which is to add a call to the confirm delete method as a new state resolution function (lines 7083-6):
else if (ResourceToolAction.ARCHIVE.equals(actionId)) {
    doArchiveconfirm(data);
}
Then I can start work on the Velocity file. I should say at this point that I don't expect this code to compile without error. I'm absolutely certain there will be exceptions which haven't been caught, and I may well have confused some of the variable names in the course of this. But I want to get on to the next stage before coming back here.
Wednesday, 3 October 2012
Sakai Development: Post Eight
This post follows straight on from the last, especially as I've missed a constant setting which goes with the ones listed at its end, a new line 857:
ACTIONS_ON_MULTIPLE_ITEMS.add(ActionType.ARCHIVE);
Looking at the next section I might need to change, where permissions are sorted out at lines 1721-1801, I don't think anything needs to be altered, because it draws on constant values which I have already changed. However, things do need to be changed where permissions are set for items. This is done with new lines 2203 and 2209:
boolean canArchive = ContentHostingService.allowArchiveResource(id);
item.setCanArchive(canArchive);
Of course, there will need to be corresponding alterations in the ContentHostingService class. I just need to find it - there is no ContentHostingService.java file in the source code, and searching online doesn't find anything useful. It's time to email those in the know, but meanwhile I can carry on with other bits of code. In fact I got a really quick answer, before I even had a chance to send the email, which tells me that I'm on the right lines:
"Yeah this one is in the kernel. The CHS Api lives there.
Sounds like what you are doing is correct. You'll need to add a method there that, most likely, checks some permission and returns true or false. If you want it set via a permission of course. You could just make it always available if that is what you wanted to, then you may not need to mod the kernel.
Then you could do item.canaddarchive(true)"
Thinking a bit more about this, I feel that perhaps I don't need a new kernel method, but I can be more sophisticated than making the service always available. To archive an item is basically making a copy, so what I really should do is to tie the archiving permission to the copy permission. So instead of the lines 2203 and 2209 above, I'll just check the canRead permission which is already there. (It strikes me, though, that in this world of DRM, read and copy are not necessarily the same thing, but never mind.) At least, I'm now at the end of the setting of permission booleans - the next bit of code should actually do something. (And no, I still have no idea why none of the kernel appears to be in the source code I downloaded, but this difficulty is one of the factors in my decision.)
What I want to work on now is to build the three archiving intermediate pages, which should be almost precisely like existing pages. The code for this starts at line 3961, which is where the delete confirmation page building code begins. To remind you, the three intermediate pages are the confirmation, collection metadata entry, and finish; the first and third will basically be copies of the equivalent delete pages and will therefore be built much in the same way. (I expected that most of the work for this little project would consist of copying and then amending existing code, nothing terribly difficult, and this is exactly what is being done here.) Since the code being copied is quite long, I'm not going to quote it all here. The two routines are very similar, so it looks as though copy and paste has already been used. I don't think I have time to look into the lines which are commented "//%%%% FIXME". I'll also need a third copy for MODE_ADD_ARCHIVE_METADATA. The new code becomes lines 4063 to 4207.
The code to call one of these new routines comes next, also a simple copy and modification of existing code. This comes in the context for the main page being re-displayed, as it will be on MODE_ARCHIVE_FINISH: lines 4823-7:
else if (mode.equals(MODE_ARCHIVE_FINISH))
{
    // build the context for the archive finish page
    template = buildArchiveFinishContext(portlet, context, data, state);
}
This gives a total of about 200 lines of boilerplate code copied, modified slightly, and inserted back into the class. From this point onwards, we start getting into more exciting development (though there is still a little bit more to add, to call the code to create the context for the metadata form and for the confirmation page - which, it has just occurred to me, could sensibly be the same thing...I may want to revisit some of the above changes to do this, but for the moment I'll just leave things as they are, as it shouldn't do any harm to create constants but not use them).
The model I have used so far for these changes is the existing code for item deletion. Now, there is a slight problem: the actual deletion code appears to delete items one at a time when more than one is selected for deletion, and we can't do this for the archiving; we want one SWORD2 transaction whether there are one or more items. We now need to copy and modify two routines - doCopy, and doDeleteconfirm - which set the application state to doCopy or Deleteconfirm respectively. The first will set the doArchive state, and the second will set Archiveconfirm state, in both cases processing the list of items selected for archiving into a vector format, and these states should then be processed to make the actual archiving or the display of the confirmation page happen. This gives another hundred or so lines of code modified which doesn't really do much.
The next place where something needs to be added is now line 6288. The doDispatchItem method, of which this switch block forms part, is, like most of the rest of this class, sparsely commented, but appears to be the part which determines what to do - hence the switch block, with cases corresponding to the different actions. The question is whether the ARCHIVE case should be like the COPY case or the DELETE case. It's hard to tell without more documentation. The COPY case basically adds the ID of a selected item to a list of items to copy, while the DELETE case actually calls deleteItem - a method we have already decided isn't appropriate to copy directly (as we don't want to break down the archiving of a collection of items into a sequence of archiving actions on the individual items). So the ARCHIVE case needs to be something in between, something like this (I hope):
case ARCHIVE:
    List items_to_be_archived = new ArrayList();
    if (selectedItemId != null)
    {
        items_to_be_archived.add(selectedItemId);
    }
    state.removeAttribute(STATE_ITEMS_TO_BE_MOVED);
    state.setAttribute(STATE_ITEMS_TO_BE_ARCHIVED, items_to_be_archived);
    if (state.getAttribute(STATE_MESSAGE) == null)
    {
        // need new context
        state.setAttribute(STATE_MODE, MODE_ARCHIVE_FINISH);
    }
    break;
which forms the new lines 6288 to 6301.
Monday, 24 September 2012
Sakai Development: Post Seven
The plan
How exactly do I expect the SWORD2 application to work? From the point of view of a Sakai user, I think that the first version should do the following:
- add a new action to the drop down menu which appears in a resources page, which would have a configurable name but basically be "Submit to archive";
- when files and/or directories are selected, this action may display a metadata form which would need to be completed to continue to submit;
- the items chosen are submitted using SWORD2.
So the development tasks consist of:
- work out how to add a new action and give it configuration information;
- work out how to pick up pointers to the selected items and display an error if none are selected (and a list of what has been selected and a confirmation that the list is correct);
- work out how to create a new form for the metadata and configure its contents and when it is displayed;
- integrate with the SWORD2 java library for deposit using configured information about the archive (and obtain any other information about the submitted items which is required by SWORD2 for deposit);
- install patch on server as well as on laptop;
- test against at least EPrints and DSpace.
Figuring Things Out and the First Code Updates
The first thing to do is to figure out where the specific action drop down is actually set up. The easiest way I can think of to do this is to carry out a very simple search of the source code. There are 15,826 files in the source code, according to eclipse, so there may well be lots of hits for words likely to be common in the source code of any application, such as "action". Eclipse does provide a comprehensive interface to search the contents of the files, but it doesn't appear to do anything really useful for a large number like this, such as indexing them. (It's probably possible to set this up, maybe with an eclipse interface to a tool such as apache's solr, but I don't know how and I'm not sure it would be worth it for a small task like this.) Running a search goes through about 8,000 files and then creates a pop-up which tells me "Problem occurred with search", with a button "Show details" which freezes eclipse when pressed. So this is less useful than it appears to be.
Linux has a large number of command line tools which can be used to search files. There are file indexers around, but I don't run one on my desktop as the indexing process is usually slow and memory intensive. Other tools are rather complicated to use - the find command, for example, has a syntax I can never remember how to use properly, even though I first came across it in 1990, in a Unix version close to that still available in Linux in 2012, and it appears in several shell scripts I have written over the years. I do know the name of the graphics file used to display the button which expands into the drop-down menu I want to change (because I've found it in the web page source code): icon-dropdn.gif. So I can search for that in the source code, as it's likely to appear near the place I want to work on. (I actually did this before, when trying to get the resources tool to work properly.) Then a simple series of greps finds the file I want in tomcat:
% cd $CATALINA_HOME/webapps
% grep "icon-dropdn.gif" */vm/*
% grep "icon-dropdn.gif" */vm/*/*
sakai-content-tool/vm/content/sakai_filepicker_select.vm: [img tag using icon-dropdn.gif]
sakai-content-tool/vm/content/sakai_filepicker_select.vm: [img tag using icon-dropdn.gif]
sakai-content-tool/vm/content/sakai_resources_list.vm: [img tag using icon-dropdn.gif]
sakai-content-tool/vm/content/sakai_resources_list.vm: [img tag using icon-dropdn.gif]
sakai-content-tool/vm/content/sakai_resources_list.vm: [img tag using icon-dropdn.gif]
sakai-content-tool/vm/content/sakai_resources_list.vm: [img tag using icon-dropdn.gif]
(I searched the vm directories because they contain the files which generate the interface for Sakai; I've removed some of the HTML markup from the output lines as a quick way to get the results to display properly in blogger.) The results make it clear that the file we want ends up as sakai-content-tool/vm/content/sakai_resources_list.vm. But what is the file which is used to generate this one in the source code? There is no sakai-content-tool directory there. For this, a very simple find command will do the job perfectly:
% find work/sakai-src -name "sakai_resources_list.vm"
./work/sakai-src/content/content-tool/tool/target/sakai-content-tool-2.8-SNAPSHOT/vm/content/sakai_resources_list.vm
./work/sakai-src/content/content-tool/tool/src/webapp/vm/content/sakai_resources_list.vm
and there we have it: work/sakai-src/content/content-tool/tool/src/webapp/vm/content/sakai_resources_list.vm. Now to actually look at this file in eclipse - but it turns out that the listing in eclipse doesn't follow the directory/file structure of the source code itself and doesn't contain the vm file at all (I should perhaps have expected this, but it's about five years since I last used the IDE). And even on a fairly new laptop bought specifically to have as much RAM as financially possible (3 GB), it's using 50% of the available memory just sitting there open but doing nothing. For reading the source code files, it's definitely going to be easier to use a normal text editor, of which my choice tends to be gedit. So that's where I now open the file. (Note: I will want to create a patch eventually, so I'm going to need to have two copies of the source code, as well as the one in the Eclipse workspace so that the changes can be extracted.)
Now, vm files are not a type of HTML generating script I've used before (the older JSPs being more familiar in my Java development experience), so I want to do a little work understanding the format before going on with this. A quick search on filesuffix.com tells me that the program used to parse them is Apache Velocity, and on that site is a user guide.
While the graphic was useful to find the file, and indeed the part of the file which generates the drop down list which I want to alter, it doesn't do any work of itself. The graphic appears several times on each row of the table containing the directories and files managed by the resource tool, with the various menus being different types of actions, such as ones which are only applicable to folders (e.g. create new folders) and some to both files and folders (copy/paste, etc). Two of the drop down lists apply to single items at a time (Add, Actions) and appear next to each item in the list, rather than letting the user select a group of items and then carry out an action on all of them at once. However, what I want is more an action which is applied to every selected item in the table, whether it is a file or a folder, and these appear at the top of the table, greyed out unless at least one item is selected. (This set of buttons consists of text, rather than a drop down list.) The buttons are "Remove", "Move" and "Copy", by default.
A bit of fiddling with "Inspect Element" in the web browser (a really useful feature for understanding complex web pages), and I see that the relevant lines of code are those near the HTML class definition "navIntraToolLink viewNav" (line 178), which displays a button for each member of an array called listActions. This is not itself set in this vm file, but comes from the relevant Java class which is listed at the top of the file in a useful comment: "sakai_resources_list.vm, use with org.sakaiproject.content.tool.ResourcesAction.java". So that is the next file to look at, and here it is possible to see the array being created and populated (lines 4315-4336). It's a little difficult to work out what these lines do without a significant amount of context, as they basically turn one type of list into another, and the file is really too big to be readily understood without a great deal more commentary (I've removed much of the spacing to make the code display better in a fairly narrow style of blog post):
ContentCollection collection = ContentHostingService.getCollection(collectionId);
ListItem item = ListItem.getListItem(collection, null, registry, need_to_expand_all,
        expandedCollections, items_to_be_moved, items_to_be_copied, 0,
        userSelectedSort, false, null);
Map listActions = new HashMap();
List items = item.convert2list();
for (ListItem lItem : items) {
    if (lItem.hasMultipleItemActions()) {
        for (String listActionId : lItem.getMultipleItemActions().keySet()) {
            ServiceLevelAction listAction = registry.getMultiItemAction(listActionId);
            if (listAction != null) {
                listActions.put(listActionId, listAction);
            }
        }
    }
}
This then indicates that I need to understand the ListItem class to work out how this is populated, because this is what is used to create the initial list which is then manipulated. This class is defined in the sakaiproject/content/tool/ListItem.java file, another 4000+ lines of code with fairly minimal commenting. I worked through the relevant code (the getListItem method starts on line 151), adding temporary comments to ensure I remembered what I had already worked out. The first important bit is to work out what permissions the user has over the items in the collection, which in fact has an extremely useful comment:
* calculate permissions for this entity. If its access mode is
* GROUPED, we need to calculate permissions based on current user's
* role in group. Otherwise, we inherit from containing collection
* and check to see if additional permissions are set on this entity
* that weren't set on containing collection...
and it's a sensible way to do it, dynamically changing the list of actions depending on what the user is allowed to do, but isn't something that I'd thought of. Stupid of me, especially as access control is my main field of expertise... So the place to look is not here at all, but the code which lists the permissions available to be set. The form which does this looks like this in a default setup (presumably the creation of custom groups of users would add new columns):
[screenshot: the default permissions form - one row per function, one column per role or group]
So to find the vm file which generates the table, I need to find a fairly distinctive looking piece of the HTML source for this table. The name attribute of the form tag, "permissionForm", is an obvious one, and finds me the authz/authz-tool/tool/src/webapp/vm/helper/chef_permissions.vm VM file. The line which generates the rows of the table, which correspond to actions and which we want to add to, is line 65:
#foreach($lock in $abilities)
so next I need to find out where the list $abilities is set. Unlike the VM I looked at earlier, this one doesn't have a useful comment indicating which java file is responsible for its processing, but I think I can assume it will be authz/authz-tool/tool/src/java/org/sakaiproject/authz/tool/PermissionsHelperAction.java. Having looked, I can see that it is, from line 120:
private static final String TEMPLATE_MAIN = "helper/chef_permissions";
The abilities variable is exported to the VM at line 385; it is set between lines 368 and 382 from a list of functions (potentially filtered, which is why it takes 14 lines of code). The list itself is populated by lines 357-363:
// in state is the list of abilities we will present
List functions = (List) state.getAttribute(STATE_ABILITIES);
if (functions == null)
{
    // get all functions prefixed with our prefix
    functions = FunctionManager.getRegisteredFunctions(prefix);
}
which will first get data from the class attribute STATE_ABILITIES, and, if this is empty, it will ask the FunctionManager for a list of registered functions. I suspect that it is the latter which is needed, as the script then goes on to write the list of functions back into STATE_ABILITIES. The function manager is defined in the class org.sakaiproject.authz.cover.FunctionManager, so that is where I need to look next. However, finding it is more problematic. There is no file in the source code named FunctionManager.java, and no file containing a class definition for FunctionManager. However, I did eventually find some documentation on the use of the FunctionManager which seems to explain how I should add new functions (it appears when searching using google for "sakai FunctionManager" but not when searching the Sakai WIKI through Confluence's own search box...). It's clearly out of date (linking to non-existent files), but I hope still helpful. I also found a document in the reference documentation section of the source code, reference/docs/architecture/sakai_security_reg.doc, which explains how the function manager works. However, what I really need is an overview of how the security in Sakai works, and I eventually found a useful collection of WIKI pages, https://confluence.sakaiproject.org/display/~markjnorton/Sakai+Security+Model and the four linked to from it.
My understanding of the Sakai security model is now this. There are four components: users, (permission) groups, functions, and objects. New groups can be created (using the Security Service); users can be placed in groups (using a GroupProvider); functions can be associated with objects (using the function manager); and users or groups can be granted permission to carry out a function (using a Permission Helper). Resolution of a user's permission to carry out a function on an object is done by the AuthzGroup service.
So what I need to do is to add a new function via the ListItem.java script (not the Permissions Helper as I previously thought) and work out how to add that permission to the owner of a collection of files. The function manager service documentation tells me that the preferred method to do this is by using Spring to inject the service, via an XML file; however, I quickly discover (with my old friend grep - in this case, the command "grep -R --include *.xml FunctionManager .") that there are no XML files which include the string FunctionManager in the Sakai source. So: should I be good and follow the documentation (last edited in 2010), or should I do this the way that everyone else seems to have done? In fact, it looks on close inspection as though the content manager tool doesn't do things like this at all, because it inherits its permissions from elsewhere (and I've now basically gone round a circle to ResourcesActions.java again). But this time I have more idea what I'm looking for, and can actually start making changes. I'm not sure how or indeed if I can create a configurable name for it, however.
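For the record, had I gone the FunctionManager route, the pattern everyone else seems to use (a static call on the cover class rather than Spring injection) would be something along these lines - the function name here is invented for illustration:
import org.sakaiproject.authz.cover.FunctionManager;

// register a new security function at component start-up (name invented for illustration)
FunctionManager.registerFunction("content.archive");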
In what follows, line numbers refer to the line number displayed in the editor as the file is changed, so once a line has been added, all those with higher numbers are one greater than they were in the original file. Don't worry: at the end of this process, I'll publish a patch file with all the changes included.
Change line 348 to have ARCHIVE as a member of the list of actions:
CREATE, DELETE, READ, REVISE, SITE_UPDATE, ARCHIVE
Add new line 415 to be similar to the other actions here:
public static final List CONTENT_ARCHIVE_ACTIONS = new ArrayList();
There probably need to be some constants set in this section to govern the behaviour of the archiving process, but at the moment I'm not entirely sure what they should be. Perhaps one to indicate the requirement for metadata, and another to give the status of the archiving process (similar to line 510, "private static final String MODE_DELETE_FINISH = "deleteFinish";" - if that is indeed what this constant indicates!). I'll have to revisit this. I could really do with finding proper javadocs for this part of the code, but Sakai is exceptionally unhelpful for this. For example, https://confluence.sakaiproject.org/display/DOC/Sakai+CLE+2.8+Release+Notes lists some sources for javadocs, but there are separate javadoc locations for each of the projects which make up sakai, and chasing links in from this list ends up at 404 not found errors in many cases. The Sakai project is desperately in need of a tidier documentation collection for developers, but creating one will be an enormous job. Looking at the way that MODE_DELETE_FINISH is used later on, it sets up what is displayed after the deletion has occurred, and I need some sort of equivalent, a message indicating the archive submission has been made, as well as a similar confirmation message. So I add new lines 512 and 514 (with a blank line between them):
private static final String MODE_ARCHIVE_FINISH = "archiveFinish";
private static final String MODE_ARCHIVE_CONFIRM = "archiveConfirm";
There is already a "MODE_REVISE_METADATA" constant, which appears to make Sakai display a metadata form, though I presume that this is for a single resource at a time rather than for a collection. So I'm going to want to have a mode for adding archive metadata, which forms a new line 531:
protected static final String MODE_ADD_ARCHIVE_METADATA = "add_archive_metadata";
Further down, there are more constants to add, with comments similar to those already there (lines 605-11):
/** The name of the state attribute containing a list of ListItem objects corresponding to resources selected for submission to the archive */
private static final String STATE_ARCHIVE_ITEMS = PREFIX + REQUEST + "archive_items";
/** The name of the state attribute containing a list of ListItem objects corresponding to nonempty folders selected for submission to the archive */
private static final String STATE_ARCHIVE_ITEMS_NOT_EMPTY = PREFIX + REQUEST + "archive_items_not_empty";
protected static final String STATE_ARCHIVE_SET = PREFIX + REQUEST + "archive_set";
This is proving quite complicated. I think that, if I had more time, I'd probably want to build a generic method for adding new actions, by configuring the action name and a Java class (implementing some interface, say) to carry it out. The difficulty with that approach would presumably be how to handle actions which need to take control away from the current page, as happens with deletion confirmation and with the archive metadata I'll need.
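To sketch what I mean - purely hypothetical, not code I intend to write now, and ListItem is the only type here taken from the real codebase:
import java.util.List;

// Hypothetical contract for configurable resource actions, so that new actions
// could be registered and configured rather than hard-coded into ResourcesAction:
public interface ConfigurableResourceAction {
    /** The (configurable) label to show in the actions menu. */
    String getActionName();

    /** Carry out the action on the items selected in the resources list. */
    void execute(List<ListItem> selectedItems);

    /** Template to hand control to if the action needs its own page, or null. */
    String getInterruptTemplate();
}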
Next, I will set the VMs to use to handle the deposit confirmation, completion, and the metadata form, at line 785.
private static final String TEMPLATE_ARCHIVE_FINISH = "content/chef_resources_archiveFinish";
private static final String TEMPLATE_ARCHIVE_CONFIRM = "content/chef_resources_archiveConfirm";
private static final String TEMPLATE_ARCHIVE_METADATA = "content/sakai_resources_archiveMetadata";
The need to set these in the code rather than making them configurable details seems rather poor design to me, but never mind.
Still more constants need to be added. Archiving should make no change to the resources themselves, so it is an action which should be in the same category as copy. So at line 816, I add:
CONTENT_READ_ACTIONS.add(ActionType.ARCHIVE);
and at 836:
ACTIONS_ON_FOLDERS.add(ActionType.ARCHIVE);
and at 850:
ACTIONS_ON_RESOURCES.add(ActionType.ARCHIVE);
There's a lot more to do, but that will have to be in the next post, I think.
Wednesday, 5 September 2012
Sakai Development: Post Six
Get Sakai source code and set up Eclipse IDE
There is a really useful guide to setting up a development environment on the Sakai WIKI. I found it by accident when searching for a solution to one of the problems I encountered: if only it had been linked to from the Sakai website's "Getting Started/Technical Contributors" page. I think I would have saved a lot of time and effort over the last few weeks.
To download the source from the repository (rather than a bundled release), use subversion to add the code to the development environment (as opposed to a server to use for testing, which is what most of the preceding work was about). A new enough version of subversion is already installed on Debian:
$ svn --version
svn, version 1.6.12 (r955767)
   compiled May 31 2011, 16:12:12
(etc)
so now download the code this way:
$ svn checkout https://source.sakaiproject.org/svn/sakai/branches/sakai-2.8.x/ sakai-src
This takes a while. If you need to use a web proxy, it should be set up in the global section of /etc/subversion/servers (uncomment the existing http-proxy lines and add appropriate values). Note that HTTP proxies may not enable every subversion function, though this checkout should be fine.
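For example, the uncommented proxy lines in /etc/subversion/servers end up looking something like this (host and port are placeholders for whatever your network uses):
[global]
http-proxy-host = proxy.example.com
http-proxy-port = 8080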
I have a chequered history with IDEs, partly because they are memory intensive applications, and the times I've needed to use them I've been near the end-of-useful-life for the computer I was using, so there was never enough memory to run them properly, and partly because I've never really used one intensively, so my programming habits have remained the way they were before IDEs became popular tools. But this time, there should be no real excuse, as the computer I'm using is just 16 months old, and I deliberately bought one with as much memory as I could afford. Following the instructions at https://confluence.sakaiproject.org/display/BOOT/Install+Eclipse+WTP, I downloaded eclipse (installation just consists of unzipping the packaged archive, basically), then installed Webtools, subclipse (for which http://subclipse.tigris.org/update_1.8.x needs to be added to the download sites), and the maven eclipse plugin (ditto http://download.eclipse.org/technology/m2e/releases) through the Eclipse updater, though the main component was already installed in the base package. Then I set eclipse to ignore "bin" and "target" directories when running svn - from the Window-Preferences-Team-Ignored Resources menu of Eclipse.
Some settings need to be changed. Eclipse doesn't run with memory settings high enough for Sakai (even given what I said about IDEs in the last paragraph). So edit eclipse/eclipse.ini, upgrade -Xms and -Xmx to 128m and 1024m respectively, and add "-XX:+UseParallelGC" as a new line.
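The tail of eclipse.ini (the part after -vmargs) then looks like this; the surrounding lines vary between Eclipse releases:
-vmargs
-Xms128m
-Xmx1024m
-XX:+UseParallelGC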
According to the instructions (but don't do this until you have read the rest of this paragraph!!), to prepare this for use in Eclipse, cd to the sakai-src directory and run:
$ mvn eclipse:clean
$ mvn eclipse:eclipse
which removes any existing eclipse files in the source code (presumably there shouldn't be any anyway) and then creates new ones - which takes a while and has several failed dependencies which have to be resolved manually (the error message helpfully tells you how - the problem is basically that some library files are not found where expected).
Next, create a new workspace for Sakai in Eclipse, using the File-Workspace-Other menu to enter a new workspace name (I used "ws-sakai"); slightly disconcertingly (even when warned this will happen), Eclipse immediately shuts down and restarts when you click OK. Then, add the source code to this workspace. Switch to the Java perspective (Window-Open Perspective-Java), turn off automatic builds (checkbox in Project menu), and import the Sakai source code (File-Import-General-Existing Projects into Workspace, browsing to the Sakai source code directory). This fails, because eclipse thinks that the source code is not a project.
This issue (and the missing dependencies) had been raised on the mailing list before I got round to doing so myself, and the response from Steve again is not to do things this way; you only need to have the code in eclipse when you want to modify it, and even then only the specific project which is to be modified. Missing dependencies are then solved by adding the shared library directory of the tomcat installation to the classpath in Eclipse. With a large project like Sakai, this approach makes sense, but it really needs to be spelt out in the documentation! What's a bit annoying about this is that I now need to install tomcat on my laptop, not something I really want to do - I was hoping to write code on the laptop and test it on the server.
So I carry out the steps for an actual Sakai install which I haven't already done: setting up tomcat, creating the mysql DB, adding the mysql connector to tomcat, editing sakai.properties, and compiling with mvn. Of course, this gives a new error:
[INFO] Failed to resolve artifact.

Missing:
----------
1) com.sun:tools:jar:1.5.0

  Try downloading the file manually from the project website.

  Then, install it using the command:
      mvn install:install-file -DgroupId=com.sun -DartifactId=tools -Dversion=1.5.0 -Dpackaging=jar -Dfile=/path/to/file

  Alternatively, if you host your own repository you can deploy the file there:
      mvn deploy:deploy-file -DgroupId=com.sun -DartifactId=tools -Dversion=1.5.0 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]

  Path to dependency:
      1) org.sakaiproject.kernel:sakai-component-manager:jar:1.4.0-SNAPSHOT
      2) com.sun:tools:jar:1.5.0

----------
1 required artifact is missing.

for artifact:
  org.sakaiproject.kernel:sakai-component-manager:jar:1.4.0-SNAPSHOT

from the specified remote repositories:
  default (http://repo1.maven.org/maven2),
  central (http://repo1.maven.org/maven2),
  sakai-maven (http://source.sakaiproject.org/maven2),
  sonatype-nexus-snapshots (https://oss.sonatype.org/content/repositories/snapshots)
So I found the files which would fix this problem, carried out the mvn commands suggested in the error, and tried again, only to end up with the same missing files. It ended with me realising that I was using the wrong java implementation - I have several installed on the laptop, and /usr/bin/java was pointing to openjdk. So I tried again with the (Oracle) Sun java SDK, and this time the compilation and installation proceeded without error. However, sakai itself was inaccessible to the web browser, which turned out to be caused by missing libraries:
SEVERE: Error configuring application listener of class org.sakaiproject.portal.charon.velocity.PortalRenderEngineContextListener
java.lang.NoClassDefFoundError: org/sakaiproject/portal/api/PortalRenderEngine
        at java.lang.Class.getDeclaredConstructors0(Native Method)
        at java.lang.Class.privateGetDeclaredConstructors(Class.java:2406)
        at java.lang.Class.getConstructor0(Class.java:2716)
        at java.lang.Class.newInstance0(Class.java:343)
        at java.lang.Class.newInstance(Class.java:325)
(and so on)
The right jar file has been created in the source tree, just not deployed to tomcat. So having found it (sakai-src/portal/portal-render-engine-impl/impl/target/sakai-portal-render-engine-impl-2.10-SNAPSHOT.jar - and checked that it has the missing class using jar -tf sakai-portal-render-engine-impl-2.10-SNAPSHOT.jar) and copied it to $CATALINA_HOME/shared/lib, try again - but this does not fix the problem.
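For anyone following along, the two checks mentioned in this and the previous paragraph can be done like this on Debian-style systems (the grep pattern is simply the class named in the error):

$ sudo update-alternatives --config java
$ jar tf sakai-portal-render-engine-impl-2.10-SNAPSHOT.jar | grep PortalRenderEngine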
My thought at this point is that symbolic links may be the cause of a lot of the problems I had earlier, both on the laptop and on the test server. If tomcat is installed from the debian repositories, it is distributed across the filesystem in accordance with Linux standards which the tomcat project itself does not follow (libraries go under /usr/lib, configuration under /etc, log files under /var/log, and so on). This is problematic because many tomcat applications need a single directory, $CATALINA_HOME, which has all the tomcat components in it, and the debian package solution is to set up a directory at /var/lib/tomcat6 which contains symbolic links to the real locations of the distributed files. If bits of Sakai are not clever enough to follow these symbolic links, it is not surprising that there are a large number of inaccessible jar files. Similarly, on the server I followed my usual practice of creating a symbolic link to the actual tomcat installation directory (this makes life much easier when upgrading tomcat, or installing new versions of Sakai, because it can all be done invisibly to the users of the site, who only see something when the symbolic link is repointed at a new tomcat installation), and it is possible that the problem with the missing image files is caused by this too. I'm not going to bother having another go at the source installation on the server, but I will try a tomcat downloaded directly from apache on the laptop.
OK, the next compilation caused some serious laptop problems - it crashed the machine. And this is without updating the code since the last compilation, which went fine. Time to stop for the day.
One re-install later (I was thinking about changing my distro anyway), with maven reset, tomcat replaced and the source re-downloaded, I'm ready to compile. And again I end up with the dreaded missing library error:
[INFO] snapshot org.sakaiproject:sakai-announcement-help:2.8-SNAPSHOT: checking for updates from sakai-maven2-snapshots
Downloading: http://source.sakaiproject.org/maven2-snapshots/org/sakaiproject/sakai-announcement-help/2.8-SNAPSHOT/sakai-announcement-help-2.8-SNAPSHOT.jar
[INFO] Unable to find resource 'org.sakaiproject:sakai-announcement-help:jar:2.8-SNAPSHOT' in repository sakai-maven2-snapshots (http://source.sakaiproject.org/maven2-snapshots)
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Fialed to deploy to container :Unable to download the artifact from any repository
(and at the same time I notice a typo in the mvn output). Now, I have a good source for the missing files in the working installation on the server, so I can download and install each one, as per the instructions in the mvn output. And this is just the first of eight such errors - until the process starts going round in circles.
The solution for me was to change the deployment, and just run
$ mvn -Pcafe sakai:deploy
which installs a cut-down version of Sakai that seems not to include any of the modules with these dependency issues. And it works:
So now back to importing the Sakai code into eclipse. I re-ran the maven eclipse commands above, this time without error. I created a ws-sakai workspace; as before, eclipse then restarts. The .m2/repository directory is already in the classpath in eclipse, so no need to add it (presumably this was done on the installation of the maven eclipse plugin). I thought I'd try just once to import the whole of the sakai-src tree into eclipse. This resulted in the following error, saying that the java libraries which interface to subversion could not be found:
Failed to load JavaHL Library.
These are the errors that were encountered:
no libsvnjavahl-1 in java.library.path
no svnjavahl-1 in java.library.path
no svnjavahl in java.library.path
java.library.path = /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Usefully, eclipse (or more specifically, the subclipse plugin) suggested a site for a fix: http://subclipse.tigris.org/wiki/JavaHL (a page on the subclipse wiki), which also explained why the libraries are not built into the subclipse distribution (too complicated, due to differing installation methods on different operating systems). Finding the instructions for redhat-based linux distributions (after rebuilding my laptop I'm using fedora - temporarily, as it turns out to be too irritating to keep as my main desktop), I downloaded the file I needed from http://www.open.collab.net/downloads. The list of available files doesn't quite match the description on the subclipse wiki page (there doesn't seem to be an rpm available any more, for instance), and CollabNet required me to register before downloading, neither of which seems ideal. However, CollabNet Subversion Edge does include the required library, as the file csvn/lib/libsvnjavahl-1.so.0 (csvn being the name of the directory that the downloaded tar file expands into). It's then probably sensible to update JAVA_OPTS so that the jvm loads the new library each time it is started, by adding the following to the user's .profile file:
JAVA_OPTS=-Djava.library.path=/lib
or amending an existing JAVA_OPTS entry, then adding JAVA_OPTS to the list of exported environment variables. The same path needs to be added to the eclipse configuration as well: shut down eclipse, edit the eclipse.ini file in the eclipse home directory, add the same information (no need to specify JAVA_OPTS this time), and restart. What's annoying is that I now need to switch workspace, which means that eclipse will shut down and restart - I could put the sakai workspace in the shortcut used to start eclipse, I suppose, if I'm going to keep needing to do this. The projects from the source directory now seem to be loaded completely - success!
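For what it's worth, eclipse accepts a workspace on the command line, so the shortcut could be something like the following (assuming eclipse was unzipped into ~/eclipse and the workspace is at ~/ws-sakai):

$ ~/eclipse/eclipse -data ~/ws-sakai

which avoids the shut-down-and-restart dance when the workspace is the first thing to change.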
All this effort required to get the source code into a tool to make it easier to work with. I seriously think I'd have been better off just downloading the source code and working with a simple text editor directly - the old school method of programming, before these time-consuming products were invented to save developer time and effort. But I now feel something of a sense of accomplishment: I have the sakai source code imported into eclipse. Now for the real work to begin...
Tuesday, 4 September 2012
Sakai Development: Post Five
So now there is a (mostly) working Sakai environment. What next? There are three tasks which I want to carry out, two covered in this post and the third in the next.
Create Some Basic Content
Since the project is to develop a tool which acts on material in the resources tool, some test material will be needed stored in Sakai. There is a useful guide for new Sakai administrators at http://www.freesoftwaremagazine.com/articles/create_your_online_project_site_with_sakai which I used to create some basic content (though it was not entirely helpful, as the Programmer's Cafe version of Sakai has far fewer resources available than the default version of Sakai which most people will install). The content uses material from http://www.identity-project.org, which is a website where the content is not only Creative Commons, but mostly written by me, so something I can re-use without any qualms and which saves me having to create some documents, web pages and so on myself. The resources tool is the one I need for the work, so I'll concentrate on adding stuff to that.
Trying to do this reveals a new problem with the built version of Sakai - all the controls for adding new files are missing from the resources tool. This is quite possibly for the same reason that I was unable to access Sakai webdav from my desktop - something is clearly wrong with this area of Sakai. See the email I sent to the sakai-dev mailing list.
The screenshot was taken while logged in as admin, but it makes no difference (except to the webdav access error) when I access Sakai as a non-admin user.
Trying to recompile just makes the problem worse, as tomcat now fails to restart properly, with a very lengthy error, the apposite part of which appears to be:
2012-08-15 12:14:31,306 ERROR main org.sakaiproject.util.NoisierDefaultListableBeanFactory - Failed to preinstantiate the singleton named org.sakaiproject.warehouse.service.ChildWarehouseTask.wizard.support.item. Destroying all Spring beans. org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.sakaiproject.warehouse.service.ChildWarehouseTask.wizard.support.item' defined in file [/home/sjm62/apache-tomcat-5.5.35/components/osp-warehouse-component/WEB-INF/wizard-components.xml]: Cannot resolve reference to bean 'org.sakaiproject.warehouse.service.PropertyAccess.id' while setting bean property 'fields' with key [0]; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'org.sakaiproject.warehouse.service.PropertyAccess.id' is defined
tomcat then proceeds to destroy a long list of Java beans (several hundred of them), before reporting successful start-up, erroneously. And this is with the same codebase, same tomcat, same maven, and same java as previously. This appears to be a seriously retrograde step in the Sakai installation - days wasted, and no actual development work possible so far.
It's beginning to look like time to start over again from scratch, and this is basically the eventual suggestion from Sakai guru Steve Swinsburg, which is to deploy into a clean tomcat installation (which can be done by deleting the tomcat webapps and shared/lib directory). The first reload runs into database issues - producing the following error in the tomcat logs (followed by others):
SEVERE: Exception sending context initialized event to listener instance of class org.sakaiproject.util.ContextLoaderListener org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.sakaiproject.profile2.tool.entityprovider.ProfileEntityProvider#0' defined in ServletContext resource [/WEB-INF/applicationContext.xml]: Cannot resolve reference to bean 'org.sakaiproject.profile2.logic.ProfileLogic' while setting bean property 'profileLogic'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.sakaiproject.profile2.logic.ProfileLogic' defined in file [/usr/local/tomcat/components/profile2-pack/WEB-INF/components.xml]: Cannot resolve reference to bean 'org.sakaiproject.profile2.logic.SakaiProxy' while setting bean property 'sakaiProxy'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.sakaiproject.profile2.logic.SakaiProxy' defined in file [/usr/local/tomcat/components/profile2-pack/WEB-INF/components.xml]: Invocation of init method failed; nested exception is org.springframework.jdbc.UncategorizedSQLException: Hibernate operation: could not insert: [org.sakaiproject.emailtemplateservice.model.EmailTemplate]; uncategorized SQLException for SQL [insert into EMAIL_TEMPLATE_ITEM (LAST_MODIFIED, OWNER, SUBJECT, emailfrom, MESSAGE, HTMLMESSAGE, TEMPLATE_KEY, TEMPLATE_LOCALE, defaultType, VERSION) values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)]; SQL state [HY000]; error code [1366]; Incorrect string value: '\xE4\xBB\x8E${l...' for column 'SUBJECT' at row 1; nested exception is java.sql.SQLException: Incorrect string value: '\xE4\xBB\x8E${l...' for column 'SUBJECT' at row 1
which is, I would guess, caused by not having deleted the existing database in order to start again. Sakai sets up the database automatically, but this is part of the compilation process, so I will need to re-compile - again. But this time, it's my fault for not zapping the database first. This is more successful, at least as far as log files go. But there is nothing in tomcat/webapps/portal, until I recompile just that directory from the source code, and then I just get a page saying that an unknown error has been encountered when I access sakai through a web browser, with no further information being logged. Yet another rethink needed now.
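Zapping the database means something like the following; the explicit UTF-8 character set is my addition rather than anything from the Sakai instructions, but the "Incorrect string value" error above is characteristic of non-Latin template text hitting tables created with mysql's default latin1 character set:

mysql> drop database sakai;
mysql> create database sakai default character set utf8;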
In the end, I just junked everything (starting from an empty tomcat and an empty mysql database, with no .m2/repository directory) and started again, which led to a finished compilation, but with the same problem in Sakai I had at the start of this post with the missing resources hooks. Round in a circle we go. The only thing I can think of is that the documentation specifies tomcat 5.5.33, and the current version of tomcat-5, which I installed, is 5.5.35. So I will try again with 5.5.33.
After another day of fiddling around, I get a working version of Sakai 2.8.2 in tomcat 5.5.33 - but it is still without file manipulation hooks in the resources tool. A suggestion from Andrew Martin is that it could be an invalid file path; looking at a working Sakai installation shows me that the drop-down menu uses a gif image, specifically /library/image/sakai/icon-dropdn.gif - and that file exists on my attempted install. Other files in the same directory are properly displayed - /library/image/sakai/sortascending.gif, for instance - and this one will also display if the path to the file itself is opened in a browser. However, the string "icon-dropdn.gif" does not appear in the source code for the page, so looking at where this is generated is probably the next thing to try. The gif is referenced in two files in the webapps/sakai-content-tool/vm/content/ directory: sakai_filepicker_select.vm and sakai_resources_list.vm.
At this point, I've decided that I'm just wasting my time trying to get a demo site working by compiling code. If I need to have it on my laptop too, I might as well concentrate on that. It should be possible to update a demo site produced using a binary with test code I produce. So that's what I'll do, following the instructions here (but using MySQL). Fifteen minutes later, after forgetting to install a MySQL java connector, we have a working Sakai installation, with resource tool file management hooks.
What have I gained from this extended effort to try to get Sakai to compile and work? A lot of frustration, and the feeling that Sakai is far too fragile as a collection of software considering that it's supposed to be a mature product. Some of this is a consequence of the servlet architecture - it's a trait shared with a number of tomcat-using products I have installed in the past. But Sakai itself has a complex structure, with a large number of separate servlets, and it could do with a re-organisation of its architecture to create more simplicity. After all, it basically just consists of several interfaces to the outside world (HTTP, WebDav, and API, so far as I can tell so far), code to generate the interface as required, back end stuff (database interface, for example), and plugins. I'd like to see a lot more being hidden away in libraries, with webapps to service the various interfaces and a standardised plugin structure; however, I suspect that this would require a major re-write of the code, and is unlikely to happen.
And now I can finally add the material. I noticed a blog post from the OpenExeter project about the limitations of the DSpace interface for SWORD, which is going to help me to think about the requirements for the tool for Sakai.
Add LDAP authentication
Instructions are found at https://confluence.sakaiproject.org/display/~steve.swinsburg/LDAP+in+Sakai+2.5 - this requires recompilation of the providers project (as sketched below), and might perhaps have been better done as part of the initial deployment, had compilation worked out. Overwriting a binary webapp with the new providers code works fine (though it's probably sensible to take a copy of providers.war before overwriting it, to make it possible to go back to the old version if necessary). The instructions were combined with local values supplied by Jez Cope from his existing Sakai installation. Editing XML with vi (as I was doing) requires strict attention to correct XML syntax - failure here results in a failure of the LDAP connector, while leaving the rest of Sakai working (as only one of the webapps is replaced). Setting this up is easily the best experience I've had with Sakai so far: apart from a couple of typos I made which needed fixing, it worked first time, exactly the way I interpreted the documentation. One small downside: this change seems to make Sakai start up even more slowly - 108.789 seconds when I restarted tomcat to test the installation.
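For the record, the redeployment amounts to something like this - a sketch, assuming the same source tree and tomcat paths as in post three, and backing up the original war first:

$ cp /usr/local/tomcat/webapps/providers.war ~/providers.war.orig
$ cd ~/sakai-src-2.8.2/providers
$ mvn clean install sakai:deploy -Dmaven.tomcat.home=/usr/local/tomcat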
Tuesday, 7 August 2012
Sakai Development: Post Four
Time to do some work sorting out the problems. It's fairly clear that at least some of them stem from difficulties with the sakai.properties file, which seems to have been ignored during compilation: hence the title for the page of "LocalSakaiName : Gateway : Welcome" rather than what I set in the properties file. Unfortunately, this suggests that another re-compilation will be needed. So the first investigation is to work out why the properties file was ignored. In fact, this looks fairly simple, and is due to a mis-reading of the installation instructions. I assumed that a sakai.properties file should reside in the directory where the sample properties files are kept, and then be copied to $CATALINA_HOME/sakai after the installation is complete. In fact, I should have started by creating the sakai directory and putting the file in it before beginning the compilation. So here we go again...
OK, forty five minutes later, we have recreated Sakai. Unfortunately, putting the file in the sakai directory makes absolutely no difference to the web page which is seen at the end of the process. However, touching the file
$ touch /usr/local/tomcat/sakai/sakai.properties
(which changes its last update time to the present), then stopping and starting tomcat fixes the name of the web page. In the process, I learn that restarting Sakai takes a very long time - it's a couple of minutes before it becomes available again through apache and tomcat. Hopefully, though, we can continue to fix problems in this way. The first issue to fix is the location of the stylesheet, which appears in the page as a relative link "/portal/styles/portalstyles.css" when it should be "/sakai/portal/styles/portalstyles.css". Clearly, the value put in the sakai.properties file for portalPath, which I expected to sort this out, is not having the intended effect. So I'll need to experiment.
Changing skin.repo to /sakai/library/skin means that the stylesheet is found, and the page suddenly looks like a modern web page rather than a relic from 1994. There are still missing graphics, so I went back to the properties file and changed every URL to add /sakai at the beginning - with the result that just about everything was found OK. On the main page, the following are still wrong: /portal/styles/portalstyles.css, /library/js/headscripts.js, /library/js/jquery.js, portal/styles/images/transMin.png, /portal/site/!gateway/page/!gateway-200 (and the same ending 300, 400, 500, 600 and 700), /library/image/transparent.gif, /library/editor/FCKeditor/fckeditor.js and /library/editor/fckeditor.launch.js. The most serious consequence of this is that all the links down the left hand side, which seem to be the main navigation routes, are broken. Searching the various pieces of documentation for "gateway" does not find anything helpful.
This means that it's probably time to rethink my methodology here. I want to host several applications (and some stand-alone web pages) on this virtual machine. I can't without considerable effort (in getting institutional firewall rules changed) have web services accessible on ports other than 80 and 443, so forwarding based on port number is not an option. This leaves either the use of multiple VMs, which seems a little wasteful, or the use of URL prefixes such as /sakai to distinguish between the applications on the machine. But if Sakai is unwilling to play nicely with others, this is going to be difficult to do, especially as I already know that one of the other applications I want to use (EPrints) has been prone to do likewise in past releases, with an installation process which overwrites and hijacks the global web server configuration. So the question is, can I do something clever in the apache proxy configuration which will sort things out so I can just have / go to Sakai, but have other configurations which deal with (say) /html or /eprints before the global forwarding rule is reached? It's not something I've ever needed to do before, but it is clear from apache's documentation that the sensible processing method is used: rules are checked in the order they appear in the configuration file. The static pages will also need to be handled differently, however. (It might be possible to effectively host them on Sakai, though.) The use of proxying will become complicated if I end up installing a second application which uses tomcat, though I could probably cope by adding a new listener to the tomcat configuration and sending the proxy to that. In the end, this is all sounding like a good idea, so I need to check out the apache configuration again. And think about how best to do it.
(After some thought, and browsing to find possible solutions...)
It turns out that there are various apache modules which are touted online as methods to solve this problem - mod_proxy_html, mod_substitute, and mod_rewrite. Not all of them work in this situation, and there are issues (in particular, they are not keen to change "/" to "/sakai/" globally). In the end, the approach which worked was this. First, enable mod_substitute by creating a link in /etc/apache2/mods-enabled to the appropriate file in /etc/apache2/mods-available:
$ cd /etc/apache2/mods-enabled
$ sudo ln -s ../mods-available/substitute.load .
Then configure it. In the apache configuration, replace the previous proxy configuration with:
ProxyPass /sakai/ ajp://localhost:8009/
ProxyHTMLURLMap /sakai/ ajp://localhost:8009/
ProxyPassReverse http://localhost:8009/ /sakai/
AddOutputFilterByType SUBSTITUTE text/html
Substitute s#([^i])/portal/tool/#$1/sakai/portal/tool/#i
Substitute s#([^i])/portal/#$1/sakai/portal/#i
Substitute s#([^i])/library/#$1/sakai/library/#i
Substitute s#([^i])/access/#$1/sakai/access/#i
The rules basically put /sakai/ before the three prefixes used by sakai URLs unless they already follow /sakai (in a slightly simple-minded way, but I can make the rules more sophisticated later if it turns out to be necessary). The extra rule for /portal/tool turns out to be needed because those URLs aren't correctly handled by apache otherwise, for some reason. Note that because apache now scans every http response it proxies from tomcat for instances of these URLs, there will be a performance hit, which may make these changes inappropriate in a production environment - though a production environment should be able to run on a server of its own, or at least in a separate apache virtual host with its own domain name, anyway.
Restart apache.
$ sudo /etc/init.d/apache2 reload
Then undo all the changes to sakai.properties which added /sakai to the beginning of URLs, and restart tomcat. And we have:
And now it is possible to register as a user, and start getting something useful done.
Tuesday, 31 July 2012
Sakai Development: Post Three
At this point we can start to install Sakai itself. I decided to start with the source archive, mainly because I don't really want to have to debug any serious issues with the development source code. Because of the restrictions on my VM, I saved http://source.sakaiproject.org/release/2.8.2/artifacts/sakai-src-2.8.2.tar.gz to my desktop and then used scp to copy it to the server. I now have sakai-src-2.8.2.tar.gz in my home directory.
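The copy itself is nothing special - something like the following, with "myserver" standing in for whatever the VM actually answers to:

$ scp sakai-src-2.8.2.tar.gz sjm62@myserver:~/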
Once the source is downloaded, the first new step is to set up some environment variables. The instructions say that the java executables (java, javac, etc.) need to be in the path; in fact, the way that Java is set up on Debian makes this unnecessary, as /usr/bin/java (etc.) are symbolic links to the java executables, which do not need themselves to be in the path. Some work is needed, though; to increase the memory available to Sakai, create the file /usr/local/tomcat/bin/setenv.sh with the contents:
export JAVA_OPTS='-server -Xms512m -Xmx1024m -XX:PermSize=128m -XX:MaxPermSize=512m -XX:NewSize=192m -XX:MaxNewSize=384m -Djava.awt.headless=true -Dhttp.agent=Sakai -Dorg.apache.jasper.compiler.Parser.STRICT_QUOTE_ESCAPING=false -Dsun.lang.ClassLoader.allowArraySyntax=true'
Make the new file executable:
$ sudo chmod a+x /usr/local/tomcat/bin/setenv.sh
The required version of subversion is already installed.
Next, install maven 2.2. This is available in the Debian software repositories as the maven2 package, but that package has a very large number of dependencies, some of which clash with the already installed Sun java (openjdk, for one). Although java binaries and compilation are supposed to be compatible across the different implementations, it feels wrong to use this maven - and having multiple java implementations on the server when it's not necessary seems silly, a recipe for confusion later on. It makes me feel happier to install maven by downloading from apache. (Though in the end, for the sake of time, I chose to download the binary, with no idea which java was used to compile it...) The installation process is basically the same as that for tomcat.
$ sudo mv apache-maven-2.2.1-bin.tar.gz /usr/local
$ cd /usr/local
$ sudo tar zxvf apache-maven-2.2.1-bin.tar.gz
$ sudo ln -s apache-maven-2.2.1 maven
maven needs some setting up, and I choose to do this globally, so that every new session picks up the home and options for maven. In /etc/bash.bashrc, add the lines:
MAVEN_HOME=/usr/local/maven
export MAVEN_OPTS='-Xms512m -Xmx1024m -XX:PermSize=64m -XX:MaxPermSize=128m'
PATH=$PATH:$JAVA_HOME/bin:$CATALINA_HOME/bin:$MAVEN_HOME/bin
export JAVA_HOME CATALINA_HOME MAVEN_HOME PATH
at the end of the file (the last two lines replacing the ones given in the previous post). To do the same things for a single user, add settings to .profile in the home directory. When this has been done, run
$ source /etc/bash.bashrc
to import these settings into the current session and then test to be sure that this all worked:
$ mvn --version
Apache Maven 2.2.1 (r801777; 2009-08-06 20:16:01+0100)
Java version: 1.6.0_26
Java home: /usr/lib/jvm/java-6-sun-1.6.0.26/jre
Default locale: en_GB, platform encoding: UTF-8
OS name: "linux" version: "2.6.32-5-amd64" arch: "amd64" Family: "unix"
Create a local maven repository directory:
$ cd $HOME
$ mkdir -p .m2/repository
and add a settings file to the .m2 directory also created by this command, as .m2/settings.xml.
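The contents of that file didn't survive into this post; a minimal version (an assumption on my part - just pointing maven at the repository directory created above, since the tomcat location is passed on the mvn command line later) could be created like this:

$ cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <!-- point maven at the local repository created above -->
  <localRepository>${user.home}/.m2/repository</localRepository>
</settings>
EOF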
Additional information may need to be added if the server uses a proxy to communicate to the Internet, as maven doesn't check the http_proxy system variable (but go here rather than trying to follow the instructions in the Sakai guide to add the instructions to the settings.xml file). The instructions now tell you to configure Sakai, which is confusing as Sakai has not yet been installed (and the instructions tell the installer to edit as yet non-existent files). There is a minor step missing from the instructions, which is to open the archive of the Sakai source code already downloaded. In the user's home directory,
$ tar zxvf sakai-src-2.8.2.tar.gz
$ cd sakai-src-2.8.2

Next, set up the mysql database to use. The right JDBC driver for the database you intend to use needs to be downloaded and put into /usr/local/tomcat/shared/lib; for mysql, this is available from http://www.mysql.com/products/connector/. The database will also need to be set up as per the configuration entries you set up. For mysql this entails the following activities (with some parts of the server response omitted, and the actual password I used replaced by the word "password"):
$ mysql -u root --password
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Copyright (c) 2000, 2011, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
mysql> create database sakai;
Query OK, 1 row affected (0.00 sec)
mysql> grant usage on *.* to sakai@localhost identified by 'password';
Query OK, 0 rows affected (0.00 sec)
mysql> grant all privileges on sakai.* to sakai@localhost;
Query OK, 0 rows affected (0.00 sec)
mysql> exit;
Bye
Note that I had already set a root password; often, new installations of mysql do not have the password set. To test this worked, I then ran:
$ mysql -u sakai -p'password' sakai
mysql> exit
Bye

Now it's possible to carry out some pre-compilation configuration. First, make a new properties file to configure the compilation-time information, such as the server domain name; I felt that the best start was with a copy of the existing exemplar, default.sakai.properties. There is also a file, sample.sakai.properties, which contains all the many possible items it is possible to configure at this point - since the default.sakai.properties file is already 924 lines long, working through the longer file was a job I hoped would be unnecessary.
$ cd config/configuration/bundles/src/bundle/org/sakaiproject/config/bundle
$ cp default.sakai.properties sakai.properties

Then edit the new sakai.properties. Most of my changes from the default were made because I decided to forward only access requests for /portal from apache to the tomcat listener, so the URLs for navigation all need to start /portal. My expectation at this point is that the changes I make won't matter much - they all seem to be in the nature of local customisation which should easily be changeable later (and, if not, I can always change the properties and compile again). The contents of a sakai.properties file are documented in a 90 page Microsoft Word file found at sakai-src-2.8.2/reference/docs/architecture/sakai_properties.doc - useful, if a bizarre choice of documentation format for an open source project. To summarise the main points, as relevant to the compilation I want to carry out:
- The edited file should end up at /usr/local/tomcat/sakai/sakai.properties, and further changes can indeed be made later. The compilation process does not copy this file across, so it will need to be sorted out by hand. One of the consequences of not doing this is that Sakai will attempt to use the default hsqldb database, which has not been installed, and this will prevent the installation from working (the error in this case, which appears in catalina.out, is "java.sql.SQLException: Table not found in statement [select COUNT(*) from SAKAI_SESSION]").
- The connection details for the database to use need to be set up, and the existing hsqldb values commented out (see the sketch after this list).
- The property portalPath is used for adding a pathway to the URL; it should not be set to the full path to the portal where some extra prefix is handled elsewhere, as here, where URLs starting /sakai are forwarded to tomcat. This property is not in the default properties file.
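For the database item above, the relevant block of sakai.properties ends up looking roughly like this (property names as used in the 2.8 default file; username, password and database name from the mysql setup earlier):

vendor@org.sakaiproject.db.api.SqlService=mysql
driverClassName@javax.sql.BaseDataSource=com.mysql.jdbc.Driver
url@javax.sql.BaseDataSource=jdbc:mysql://127.0.0.1:3306/sakai?useUnicode=true&characterEncoding=UTF-8
username@javax.sql.BaseDataSource=sakai
password@javax.sql.BaseDataSource=password

With the properties in place, the first compile is the master project: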
$ cd ~/sakai-src-2.8.2/master
$ mvn clean install

This is followed by a lengthy list of missing libraries, then the download and installation of those libraries - which is far nicer than having to fulfil all the dependencies by hand. Eventually, there is the magic message:
[INFO] [install:install {execution: default-install}]
[INFO] Installing /home/sjm62/sakai-src-2.8.2/master/pom.xml to /home/sjm62/.m2/repository/org/sakaiproject/master/2.8.2/master-2.8.2.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1 minute 14 seconds
[INFO] Finished at: Wed Jul 18 14:15:09 BST 2012
[INFO] Final Memory: 16M/495M
[INFO] ------------------------------------------------------------------------
Then, the actual compilation and deployment to tomcat of Sakai itself. I'm aiming to compile the cafe build, recommended for new developers.
$ cd ~/sakai-src-2.8.2
$ mvn -Pcafe clean install sakai:deploy -Dmaven.tomcat.home=/usr/local/tomcat
I set this going and then immediately realised that I don't have write permission to /usr/local/tomcat. As expected, this ends with
[INFO] [sakai:deploy {execution: default-cli}]
[INFO] Copy /home/sjm62/.m2/repository/javax/servlet/jstl/1.1.2/jstl-1.1.2.jar to /usr/local/tomcat/shared/lib/jstl-1.1.2.jar
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Fialed to deploy to container :/usr/local/tomcat/shared/lib/jstl-1.1.2.jar (Permission denied)
[INFO] ------------------------------------------------------------------------
[INFO] For more information, run Maven with the -e switch
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6 minutes 36 seconds
[INFO] Finished at: Wed Jul 18 14:26:33 BST 2012
[INFO] Final Memory: 76M/495M
[INFO] ------------------------------------------------------------------------
What doesn't occur to me at this point is that the quick way to fix this problem is to give myself write permission to /usr/local/tomcat and its sub-directories, at least temporarily. Instead, I try using sudo to allow the installation to occur.
$ sudo mvn clean install sakai:deploy -Dmaven.tomcat.home=/usr/local/tomcat
Predictably, this also fails (the reason being that I created the maven settings in /home/sjm62/.m2, so when the command is run via sudo they are not found). So now try:
$ sudo mvn -s /home/sjm62/.m2/settings.xml clean install sakai:deploy -Dmaven.tomcat.home=/usr/local/tomcat
This seems to work better, though I'm not at all sure what it might be doing to the ownership and permissions in ~/.m2. Download checksums failed for a fair number of the required resources, with no obvious effect. (It looks from this as though the checksum of the downloaded file is compared to the stored checksum, but if they differ, all that happens is a warning - the downloaded file is used anyway.) After twenty minutes of downloading and compilation, this happens:
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 9 source files to /home/sjm62/sakai-src-2.8.2/osp/wizard/tool/target/classes
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Compilation failure
Failure executing javac, but could not parse the error:
The system is out of resources.
Consult the following stack trace for details.
java.lang.OutOfMemoryError: PermGen space
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
... (tedious details omitted) ...
[INFO] ------------------------------------------------------------------------
[INFO] For more information, run Maven with the -e switch
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18 minutes 30 seconds
[INFO] Finished at: Wed Jul 18 15:01:39 BST 2012
[INFO] Final Memory: 102M/243M
[INFO] ------------------------------------------------------------------------
This suggests that the maven options set the memory size too low, even though I used the values from the documentation. I doubled the values, and ran the compilation again, with the same result. (It's very frustrating to have the compilation run for almost twenty minutes, then run out of memory.) It looks as though more thought is needed - perhaps the virtual machine has too little memory. Indeed, with no interactive processes running, the machine has 584 Mb used out of a total of 1002 Mb available of virtual memory: so setting the maximum memory usage of maven above 512 Mb will be pointless (the information comes from running free -m). According to top, the most memory intensive process running is tomcat, so I now propose to run the compilation once more, with tomcat shut down (must remember to restart it later...). This time I will also monitor the memory usage in another shell, using top and also running tail -f /var/log/messages, which should indicate problems if the machine's memory is full (the page at http://rimuhosting.com/howto/memory.jsp is a very helpful introduction to the diagnosis of linux memory issues, BTW). There is no obvious message, though it is possible that
Jul 19 09:56:48 float mpt-statusd: detected non-optimal RAID status

is connected, as Linux memory usage is increased by the caching of disk blocks in spare memory. (I would expect Linux memory management to be sophisticated enough to stop caching the blocks at pretty short notice when an interactive process wants the memory, however.) So my conclusion now is: I should increase the memory allocation for maven and try again. This time, MAVEN_OPTS is set to "-Xms1024m -Xmx2048m -XX:PermSize=1024m -XX:MaxPermSize=2048m" - and the build still fails.

It's time to hit the Sakai dev mailing list. Posting at 1.45AM Pacific time unfortunately makes it likely I'll need to wait for an answer - so I'll do something else for a while. The answer comes fairly quickly, but apart from fixing obvious typos, leads to circling round whether or not I should be using root or sudo or neither. (Note: I can't just create new users on this VM in the obvious way, because it's set up to use an LDAP server for authentication, requiring userIDs other than root to match ones on the LDAP. There's probably a way round this, but it's not worth the time to sort it out, I don't think.) Also, the suggestion that tomcat needs to be run as a non-root user is something I thought about, but then didn't do, as apache's tomcat documentation says that there shouldn't be issues. (And applications writing to files outside the tomcat tree seems to me to be the application's security problem, not tomcat's - especially as it's not likely to be possible if selinux is enabled...)

Anyway, self-justification apart, the solution is to get more memory for the virtual machine. Even with 2Gb virtual memory, the compilation process takes up 90% of it, so it is clear that the maven options suggested in the documentation are rather on the low side of what is needed, at least on Debian. But with the upgrade and write permission to /usr/local/tomcat, the compilation now runs to a successful conclusion:
$ ls /usr/local/tomcat/webapps/
access.war                     profile2-tool.war                sakai-rutgers-linktool.war
accountvalidator               profile-tool                     sakai-rwiki-tool.war
accountvalidator.war           profile-tool.war                 sakai-search-tool
authn.war                      providers                        sakai-search-tool.war
balancer                       providers.war                    sakai-sections-tool.war
courier.war                    ROOT                             sakai-siteassociation-tool.war
dav.war                        sakai-alias-tool.war             sakai-site-manage-group-helper.war
direct                         sakai-announcement-tool.war      sakai-site-manage-group-section-role-helper.war
direct.war                     sakai-archive-tool.war           sakai-site-manage-link-helper.war
emailtemplateservice-tool      sakai-assignment-tool.war        sakai-site-manage-participant-helper.war
emailtemplateservice-tool.war  sakai-authz-tool.war             sakai-site-manage-tool.war
imsblis                        sakai-axis                       sakai-site-pageorder-helper.war
imsblis.war                    sakai-axis.war                   sakai-site-tool.war
imsblti                        sakai-calendar-summary-tool.war  sakai-syllabus-tool.war
imsblti.war                    sakai-calendar-tool.war          sakai-tool-tool-su.war
jsf-resource                   sakai-chat-tool.war              sakai-usermembership-tool.war
jsf-resource.war               sakai-citations-tool.war         sakai-user-tool-admin-prefs.war
jsp-examples                   sakai-content-tool.war           sakai-user-tool-prefs.war
library.war                    sakai-editor.war                 sakai-user-tool.war
login-render.war               sakai-fck-connector.war          sakai-web-tool.war
messageforums-tool             sakai-gradebook-testservice.war  samigo-app
messageforums-tool.war         sakai-gradebook-tool.war         samigo-app.war
osp-common-tool.war            sakai-help-tool.war              savecite.war
osp-glossary-tool.war          sakai-login-tool.war             scheduler-tool
osp-jsf-example.war            sakai-mailarchive-james.war      scheduler-tool.war
osp-jsf-resource.war           sakai-mailarchive-tool.war       servlets-examples
osp-matrix-tool.war            sakai-memory-tool.war            sitestats-tool
osp-portal-tool.war            sakai-message-tool.war           sitestats-tool.war
osp-portal.war                 sakai-metaobj-tool.war           tomcat-docs
osp-presentation-tool.war      sakai-news-tool.war              tool.war
osp-wizard-tool.war            sakai-podcasts.war               webdav
podcasts.war                   sakai-postem-tool.war            web.war
polls-tool                     sakai-presence-tool.war          wiki.war
polls-tool.war                 sakai-reset-pass                 x
portal-render.war              sakai-reset-pass.war             xsl-portal.war
portal.war                     sakai-rights-tool.war            x.war
profile2-tool                  sakai-roster-tool.war
Now on to the final stage of getting Sakai up and running. Restart tomcat:
$ sudo /usr/local/tomcat/bin/startup.sh
[sudo] password for sjm62:
Using CATALINA_BASE:   /usr/local/tomcat
Using CATALINA_HOME:   /usr/local/tomcat
Using CATALINA_TMPDIR: /usr/local/tomcat/temp
Using JRE_HOME:        /usr/lib/jvm/java-6-sun/
Using CLASSPATH:       /usr/local/tomcat/bin/bootstrap.jar
and then it should be visible in the web browser. However, it isn't, and there are large numbers of errors in the tomcat logs.
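For reference, the logs in question live under the tomcat installation directory (paths as per the /usr/local/tomcat install used throughout):

$ tail -f /usr/local/tomcat/logs/catalina.out         # terse, often truncated error lines
$ less /usr/local/tomcat/logs/localhost.[date].log    # the full stack traces turn out to be here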
I found that recompiling and installing over an existing, failed sakai build caused huge numbers of problems in tomcat: the re-installed version ends up missing libraries that deployed fine the first time around, even though it now has some that were originally missing. This seems to be the problem discussed at http://collab.sakaiproject.org/pipermail/sakai-dev/2009-August/002909.html, and the solution is to re-install tomcat completely and to remove the existing maven artifacts in the .m2/repository/org/sakaiproject directory. However, even then I was still getting errors like the following (note that the full error only appears in the localhost.[date].log file, not in catalina.out, which carries only an exceptionally unhelpful short version of the first line):
SEVERE: Error configuring application listener of class org.sakaiproject.util.ToolListener
java.lang.ClassNotFoundException: org.sakaiproject.util.ToolListener
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1438)
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1284)
        (etc.)
(This particular error is repeated a large number of times, presumably once for each servlet that tries to load the class.) I'm not sure what the cause is - permissions, or not removing enough of the failed installation, perhaps - but it is fixed by getting rid of the entire sakai and tomcat installation and starting again from scratch (a consolidated sketch of the sequence follows the list below). Once I remember to:
- ensure that CATALINA_HOME is correctly set;
- copy the setenv.sh file to $CATALINA_HOME/bin;
- copy the mysql JDBC driver JAR file to $CATALINA_HOME/shared/lib;
- copy the sakai.settings file to $CATALINA_HOME/sakai;
then I end up with an installation I can see from a web browser. There are obviously still some fairly serious problems, as can be seen from the screenshot to the right, but I feel that I'm starting to get somewhere.
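For completeness, here is the whole recovery sequence in one place. This is a sketch only, assuming the paths used above: the tomcat tarball name is a placeholder for whichever version you originally installed from, and setenv.sh, the mysql driver JAR, and sakai.settings are the files prepared earlier in the install.

# throw away the cached Sakai artifacts so maven rebuilds everything
rm -rf ~/.m2/repository/org/sakaiproject

# replace the broken tomcat with a clean copy (archive name is a placeholder)
sudo rm -rf /usr/local/tomcat
sudo tar -C /usr/local -xzf apache-tomcat-x.y.z.tar.gz
sudo mv /usr/local/apache-tomcat-x.y.z /usr/local/tomcat
sudo chown -R $USER /usr/local/tomcat        # the build needs write permission here (see above)

# the things to remember before rebuilding and redeploying
export CATALINA_HOME=/usr/local/tomcat
cp setenv.sh $CATALINA_HOME/bin/
cp mysql-connector-java-*.jar $CATALINA_HOME/shared/lib/   # the mysql JDBC driver
mkdir -p $CATALINA_HOME/sakai && cp sakai.settings $CATALINA_HOME/sakai/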