Mirror of https://github.com/DSpace/DSpace.git
Synced 2025-10-07 10:04:21 +00:00

Comparing commits: dspace-1.7 ... cvs_final (2 commits)

| Author | SHA1 | Date |
|---|---|---|
| | d61258ec54 | |
| | e2fc84a18b | |
LICENSE (40 lines removed)
@@ -1,40 +0,0 @@

DSpace source code license:


Copyright (c) 2002-2011, DuraSpace. All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.

- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.

- Neither the name DuraSpace nor the name of the DSpace Foundation
nor the names of its contributors may be used to endorse or promote
products derived from this software without specific prior written
permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDERS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
DAMAGE.


DSpace uses third-party libraries which may be distributed under
different licenses to the above. These licenses are located in
the lib/licenses directory. You must agree to the terms of these
licenses, in addition to the above DSpace source code license, in
order to use this software.
NOTICE (15 lines removed)
@@ -1,15 +0,0 @@

Licensing Notice

Fedora Commons joined with the DSpace Foundation and began operating under
the new name DuraSpace in July 2009. DuraSpace holds the copyrights of
the DSpace Foundation, Inc.

The DSpace Foundation, Inc. is a 501(c)(3) corporation established in July 2007
with a mission to promote and advance the DSpace platform, enabling management,
access and preservation of digital works. The Foundation was able to transfer
the legal copyright from Hewlett-Packard Company (HP) and Massachusetts
Institute of Technology (MIT) to the DSpace Foundation in October 2007. Many
of the files in the source code may contain a copyright statement stating that
HP and MIT possess the copyright. In these instances, please note that the
copyright has transferred to the DSpace Foundation, and subsequently to DuraSpace.
README (51 lines removed)
@@ -1,51 +0,0 @@

Installation instructions are included in this release package under

  - dspace/docs/html/index.html
or
  - dspace/docs/pdf/DSpace-Manual.pdf

DSpace version information can be found in this release package under
  - dspace/docs/html/History.html
or viewed online at
  - https://wiki.duraspace.org/display/DSDOC/History

Documentation for the most recent stable release(s) may be downloaded
or viewed online at
  - http://www.dspace.org/latest-release/
  - http://wiki.duraspace.org/display/DSDOC/

Installation instructions for other versions may differ, so you
are encouraged to obtain the appropriate version of the documentation
(from the links above or from SVN).

To obtain files from the SVN repository and build, please see:

  - https://scm.dspace.org/svn/repo/dspace/tags/


Please refer any further problems to the dspace-tech@lists.sourceforge.net
mailing list.

  - http://sourceforge.net/mail/?group_id=19984


Detailed issue tracking for DSpace is done on our JIRA issue tracker:

  - https://jira.duraspace.org/browse/DS


To contribute to DSpace, please see:

  - https://wiki.duraspace.org/display/DSPACE/How+to+Contribute+to+DSpace


For more details about DSpace, including a list of service providers,
places to seek help, news articles and lists of other users, please see:

  - http://www.dspace.org/


DSpace source code licensing information is available online at:
  - http://www.dspace.org/license/

Copyright (c) 2002-2011, DuraSpace. All rights reserved.
pom.xml (dspace-api, 282 lines removed)
@@ -1,282 +0,0 @@

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <groupId>org.dspace</groupId>
   <artifactId>dspace-api</artifactId>
   <name>DSpace Kernel :: API and Implementation</name>
   <description>DSpace core data model and service APIs.</description>
   <url>http://projects.dspace.org/dspace-api</url>

   <!--
      The parent POM from which Maven inherits DSpace default
      POM attributes.
   -->
   <parent>
      <groupId>org.dspace</groupId>
      <artifactId>dspace-parent</artifactId>
      <version>1.7.2</version>
   </parent>

   <!--
      The Subversion repository location is used by Continuum to update against
      when changes have occurred; this spawns a new build cycle and releases
      snapshots into the snapshot repository below.
   -->
   <scm>
      <connection>scm:svn:http://scm.dspace.org/svn/repo/dspace/tags/dspace-1.7.2</connection>
      <developerConnection>scm:svn:https://scm.dspace.org/svn/repo/dspace/tags/dspace-1.7.2</developerConnection>
      <url>http://scm.dspace.org/svn/repo/dspace/tags/dspace-1.7.2</url>
   </scm>

   <build>
      <plugins>
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.0.2</version>
            <configuration>
               <debug>true</debug>
               <showDeprecation>true</showDeprecation>
            </configuration>
         </plugin>
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-jar-plugin</artifactId>
            <configuration>
               <archive>
                  <manifest>
                     <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
                     <addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
                  </manifest>
               </archive>
            </configuration>
            <executions>
               <execution>
                  <goals>
                     <goal>test-jar</goal>
                  </goals>
               </execution>
            </executions>
         </plugin>
         <plugin>
            <groupId>com.google.code.maven-license-plugin</groupId>
            <artifactId>maven-license-plugin</artifactId>
            <version>1.4.0</version>
            <configuration>
               <header>http://scm.dspace.org/svn/repo/licenses/LICENSE_HEADER</header>
               <includes>
                  <include>src/**</include>
               </includes>
               <excludes>
                  <exclude>src/test/resources/dspaceFolder/**</exclude>
               </excludes>
               <properties />
               <encoding>UTF-8</encoding>
            </configuration>
            <executions>
               <execution>
                  <goals>
                     <goal>check</goal>
                  </goals>
               </execution>
            </executions>
         </plugin>
      </plugins>
   </build>

   <!--
      Runtime and compile-time dependencies for DSpace.
   -->
   <dependencies>
      <dependency>
         <groupId>org.springframework</groupId>
         <artifactId>spring-jdbc</artifactId>
         <version>2.5.6</version>
      </dependency>
      <dependency>
         <groupId>org.dspace</groupId>
         <artifactId>handle</artifactId>
         <version>6.2</version>
      </dependency>
      <dependency>
         <groupId>org.dspace</groupId>
         <artifactId>jargon</artifactId>
      </dependency>
      <dependency>
         <groupId>org.dspace</groupId>
         <artifactId>mets</artifactId>
      </dependency>
      <dependency>
         <groupId>org.dspace.dependencies</groupId>
         <artifactId>dspace-tm-extractors</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.lucene</groupId>
         <artifactId>lucene-core</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.lucene</groupId>
         <artifactId>lucene-analyzers</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-cli</groupId>
         <artifactId>commons-cli</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-codec</groupId>
         <artifactId>commons-codec</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-collections</groupId>
         <artifactId>commons-collections</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-dbcp</groupId>
         <artifactId>commons-dbcp</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-fileupload</groupId>
         <artifactId>commons-fileupload</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-io</groupId>
         <artifactId>commons-io</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-lang</groupId>
         <artifactId>commons-lang</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-pool</groupId>
         <artifactId>commons-pool</artifactId>
      </dependency>
      <dependency>
         <groupId>javax.mail</groupId>
         <artifactId>mail</artifactId>
      </dependency>
      <dependency>
         <groupId>javax.servlet</groupId>
         <artifactId>servlet-api</artifactId>
         <scope>provided</scope>
      </dependency>
      <dependency>
         <groupId>jaxen</groupId>
         <artifactId>jaxen</artifactId>
         <exclusions>
            <exclusion>
               <artifactId>xom</artifactId>
               <groupId>xom</groupId>
            </exclusion>
         </exclusions>
      </dependency>
      <dependency>
         <groupId>jdom</groupId>
         <artifactId>jdom</artifactId>
      </dependency>
      <dependency>
         <groupId>log4j</groupId>
         <artifactId>log4j</artifactId>
      </dependency>
      <dependency>
         <groupId>oro</groupId>
         <artifactId>oro</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.pdfbox</groupId>
         <artifactId>pdfbox</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.pdfbox</groupId>
         <artifactId>fontbox</artifactId>
         <version>1.2.1</version>
      </dependency>
      <dependency>
         <groupId>org.apache.pdfbox</groupId>
         <artifactId>jempbox</artifactId>
         <version>1.2.1</version>
      </dependency>
      <dependency>
         <groupId>org.bouncycastle</groupId>
         <artifactId>bcprov-jdk15</artifactId>
      </dependency>
      <dependency>
         <groupId>org.bouncycastle</groupId>
         <artifactId>bcmail-jdk15</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.poi</groupId>
         <artifactId>poi</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.poi</groupId>
         <artifactId>poi-scratchpad</artifactId>
      </dependency>
      <dependency>
         <groupId>org.apache.poi</groupId>
         <artifactId>poi-ooxml</artifactId>
      </dependency>
      <dependency>
         <groupId>net.java.dev.rome</groupId>
         <artifactId>rome</artifactId>
      </dependency>
      <dependency>
         <groupId>rome</groupId>
         <artifactId>opensearch</artifactId>
      </dependency>
      <dependency>
         <groupId>xalan</groupId>
         <artifactId>xalan</artifactId>
      </dependency>
      <dependency>
         <groupId>xerces</groupId>
         <artifactId>xercesImpl</artifactId>
      </dependency>
      <dependency>
         <groupId>xml-apis</groupId>
         <artifactId>xmlParserAPIs</artifactId>
      </dependency>
      <dependency>
         <groupId>javax.activation</groupId>
         <artifactId>activation</artifactId>
      </dependency>
      <dependency>
         <groupId>com.ibm.icu</groupId>
         <artifactId>icu4j</artifactId>
      </dependency>
      <dependency>
         <groupId>org.dspace</groupId>
         <artifactId>oclc-harvester2</artifactId>
      </dependency>
      <dependency>
         <groupId>org.dspace</groupId>
         <artifactId>dspace-services-api</artifactId>
      </dependency>
      <dependency>
         <groupId>org.dspace</groupId>
         <artifactId>dspace-services-impl</artifactId>
      </dependency>
      <dependency>
         <groupId>commons-httpclient</groupId>
         <artifactId>commons-httpclient</artifactId>
      </dependency>
      <dependency>
         <groupId>junit</groupId>
         <artifactId>junit</artifactId>
         <scope>test</scope>
      </dependency>
      <dependency>
         <groupId>org.dspace.dependencies.jmockit</groupId>
         <artifactId>dspace-jmockit</artifactId>
         <scope>test</scope>
      </dependency>
      <dependency>
         <groupId>com.h2database</groupId>
         <artifactId>h2</artifactId>
         <scope>test</scope>
      </dependency>
      <dependency>
         <groupId>org.databene</groupId>
         <artifactId>contiperf</artifactId>
         <scope>test</scope>
      </dependency>
   </dependencies>

</project>
CreateAdministrator.java (268 lines removed)
@@ -1,268 +0,0 @@

/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.administer;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Locale;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;

import org.apache.commons.lang.StringUtils;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.dspace.core.I18nUtil;
import org.dspace.eperson.EPerson;
import org.dspace.eperson.Group;

/**
 * A command-line tool for creating an initial administrator when setting up a
 * DSpace site. Prompts for an e-mail address, last name, first name and
 * password from standard input. An administrator group is then created, and
 * the data passed in is used to create an e-person in that group.
 * <P>
 * Alternatively, it can take the email, first name, last name, language and
 * desired password as arguments, thus:
 *
 * CreateAdministrator -e [email] -f [first name] -l [last name] -c [language] -p [password]
 *
 * This is particularly convenient for automated deploy scripts that require an
 * initial administrator, for example, before deployment can be completed.
 *
 * @author Robert Tansley
 * @author Richard Jones
 *
 * @version $Revision$
 */
public final class CreateAdministrator
{
    /** DSpace Context object */
    private Context context;

    /**
     * For invoking via the command line. If called with no command-line
     * arguments, it will negotiate with the user for the administrator details.
     *
     * @param argv
     *            command-line arguments
     */
    public static void main(String[] argv)
        throws Exception
    {
        CommandLineParser parser = new PosixParser();
        Options options = new Options();

        CreateAdministrator ca = new CreateAdministrator();

        options.addOption("e", "email", true, "administrator email address");
        options.addOption("f", "first", true, "administrator first name");
        options.addOption("l", "last", true, "administrator last name");
        options.addOption("c", "language", true, "administrator language");
        options.addOption("p", "password", true, "administrator password");

        CommandLine line = parser.parse(options, argv);

        if (line.hasOption("e") && line.hasOption("f") && line.hasOption("l") &&
            line.hasOption("c") && line.hasOption("p"))
        {
            ca.createAdministrator(line.getOptionValue("e"),
                    line.getOptionValue("f"), line.getOptionValue("l"),
                    line.getOptionValue("c"), line.getOptionValue("p"));
        }
        else
        {
            ca.negotiateAdministratorDetails();
        }
    }

    /**
     * Constructor, which just creates an object with a ready context.
     *
     * @throws Exception
     */
    private CreateAdministrator()
        throws Exception
    {
        context = new Context();
    }

    /**
     * Method which will negotiate with the user via the command line to
     * obtain the administrator's details.
     *
     * @throws Exception
     */
    private void negotiateAdministratorDetails()
        throws Exception
    {
        // For easier reading of typed input
        BufferedReader input = new BufferedReader(new InputStreamReader(System.in));

        System.out.println("Creating an initial administrator account");

        boolean dataOK = false;

        String email = null;
        String firstName = null;
        String lastName = null;
        String password1 = null;
        String password2 = null;
        String language = I18nUtil.DEFAULTLOCALE.getLanguage();

        while (!dataOK)
        {
            System.out.print("E-mail address: ");
            System.out.flush();

            email = input.readLine();
            if (email != null)
            {
                email = email.trim();
            }

            System.out.print("First name: ");
            System.out.flush();

            firstName = input.readLine();

            if (firstName != null)
            {
                firstName = firstName.trim();
            }

            System.out.print("Last name: ");
            System.out.flush();

            lastName = input.readLine();

            if (lastName != null)
            {
                lastName = lastName.trim();
            }

            if (ConfigurationManager.getProperty("webui.supported.locales") != null)
            {
                System.out.println("Select one of the following languages: "
                        + ConfigurationManager.getProperty("webui.supported.locales"));
                System.out.print("Language: ");
                System.out.flush();

                language = input.readLine();

                if (language != null)
                {
                    language = language.trim();
                    language = I18nUtil.getSupportedLocale(new Locale(language)).getLanguage();
                }
            }

            System.out.println("WARNING: Password will appear on-screen.");
            System.out.print("Password: ");
            System.out.flush();

            password1 = input.readLine();

            if (password1 != null)
            {
                password1 = password1.trim();
            }

            System.out.print("Again to confirm: ");
            System.out.flush();

            password2 = input.readLine();

            if (password2 != null)
            {
                password2 = password2.trim();
            }

            if (!StringUtils.isEmpty(password1) && StringUtils.equals(password1, password2))
            {
                // password OK
                System.out.print("Is the above data correct? (y or n): ");
                System.out.flush();

                String s = input.readLine();

                if (s != null)
                {
                    s = s.trim();
                    if (s.toLowerCase().startsWith("y"))
                    {
                        dataOK = true;
                    }
                }
            }
            else
            {
                System.out.println("Passwords don't match");
            }
        }

        // If we make it to here, we are ready to create an administrator
        createAdministrator(email, firstName, lastName, language, password1);
    }

    /**
     * Create the administrator with the given details. If the user already
     * exists then they are simply upped to administrator status.
     *
     * @param email the email for the user
     * @param first user's first name
     * @param last user's last name
     * @param language user's preferred language
     * @param pw desired password
     *
     * @throws Exception
     */
    private void createAdministrator(String email, String first, String last,
            String language, String pw)
        throws Exception
    {
        // Of course we aren't an administrator yet, so we need to
        // circumvent authorisation
        context.setIgnoreAuthorization(true);

        // Find administrator group
        Group admins = Group.find(context, 1);

        if (admins == null)
        {
            throw new IllegalStateException("Error, no admin group (group 1) found");
        }

        // Create the administrator e-person
        EPerson eperson = EPerson.findByEmail(context, email);

        // Check if the email belongs to a registered user;
        // if not, create a new user with this email
        if (eperson == null)
        {
            eperson = EPerson.create(context);
            eperson.setEmail(email);
            eperson.setCanLogIn(true);
            eperson.setRequireCertificate(false);
            eperson.setSelfRegistered(false);
        }

        eperson.setLastName(last);
        eperson.setFirstName(first);
        eperson.setLanguage(language);
        eperson.setPassword(pw);
        eperson.update();

        admins.addMember(eperson);
        admins.update();

        context.complete();

        System.out.println("Administrator account created");
    }
}
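As a hedged illustration (not part of the DSpace source), the password-confirmation rule used in negotiateAdministratorDetails() above -- trim both entries, then require them to be non-empty and equal -- can be sketched as a self-contained method; the class and method names here are invented for the example:

```java
import java.util.Objects;

/**
 * Sketch of the password-confirmation rule used when negotiating
 * administrator details: both entries are trimmed, and the pair is
 * accepted only if the result is non-empty and the entries match.
 */
public class PasswordCheck {
    static boolean passwordsMatch(String first, String second) {
        if (first == null || second == null) {
            return false;                 // no input counts as a mismatch
        }
        String a = first.trim();
        String b = second.trim();
        return !a.isEmpty() && Objects.equals(a, b);
    }

    public static void main(String[] args) {
        System.out.println(passwordsMatch("secret ", "secret"));  // true
        System.out.println(passwordsMatch("", "   "));            // false
    }
}
```

Note that, as in the original loop, whitespace-only passwords are rejected rather than stored.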
@@ -1,325 +0,0 @@
|
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.administer;
|
||||
|
||||
import java.io.BufferedWriter;
|
||||
import java.io.FileWriter;
|
||||
import java.io.IOException;
|
||||
import java.sql.SQLException;
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
|
||||
import org.apache.commons.cli.CommandLine;
|
||||
import org.apache.commons.cli.CommandLineParser;
|
||||
import org.apache.commons.cli.Options;
|
||||
import org.apache.commons.cli.ParseException;
|
||||
import org.apache.commons.cli.PosixParser;
|
||||
import org.apache.xml.serialize.Method;
|
||||
import org.apache.xml.serialize.OutputFormat;
|
||||
import org.apache.xml.serialize.XMLSerializer;
|
||||
import org.dspace.content.MetadataField;
|
||||
import org.dspace.content.MetadataSchema;
|
||||
import org.dspace.core.Context;
|
||||
import org.xml.sax.SAXException;
|
||||
|
||||
|
||||
/**
|
||||
* @author Graham Triggs
|
||||
*
|
||||
* This class creates an xml document as passed in the arguments and
|
||||
* from the metadata schemas for the repository.
|
||||
*
|
||||
* The form of the XML is as follows
|
||||
*
|
||||
* <metadata-schemas>
|
||||
* <schema>
|
||||
* <name>dc</name>
|
||||
* <namespace>http://dublincore.org/documents/dcmi-terms/</namespace>
|
||||
* </schema>
|
||||
* </metadata-schemas>
|
||||
*/
|
||||
public class MetadataExporter
|
||||
{
|
||||
|
||||
/**
|
||||
* @param args
|
||||
* @throws ParseException
|
||||
* @throws SAXException
|
||||
* @throws IOException
|
||||
* @throws SQLException
|
||||
* @throws RegistryExportException
|
||||
*/
|
||||
public static void main(String[] args) throws ParseException, SQLException, IOException, SAXException, RegistryExportException
|
||||
{
|
||||
// create an options object and populate it
|
||||
CommandLineParser parser = new PosixParser();
|
||||
Options options = new Options();
|
||||
options.addOption("f", "file", true, "output xml file for registry");
|
||||
options.addOption("s", "schema", true, "the name of the schema to export");
|
||||
CommandLine line = parser.parse(options, args);
|
||||
|
||||
String file = null;
|
||||
String schema = null;
|
||||
|
||||
if (line.hasOption('f'))
|
||||
{
|
||||
file = line.getOptionValue('f');
|
||||
}
|
||||
else
|
||||
{
|
||||
usage();
|
||||
System.exit(0);
|
||||
}
|
||||
|
||||
if (line.hasOption('s'))
|
||||
{
|
||||
schema = line.getOptionValue('s');
|
||||
}
|
||||
|
||||
saveRegistry(file, schema);
|
||||
}
|
||||
|
||||
public static void saveRegistry(String file, String schema) throws SQLException, IOException, SAXException, RegistryExportException
|
||||
{
|
||||
// create a context
|
||||
Context context = new Context();
|
||||
context.setIgnoreAuthorization(true);
|
||||
|
||||
OutputFormat xmlFormat = new OutputFormat(Method.XML, "UTF-8", true);
|
||||
xmlFormat.setLineWidth(120);
|
||||
xmlFormat.setIndent(4);
|
||||
|
||||
XMLSerializer xmlSerializer = new XMLSerializer(new BufferedWriter(new FileWriter(file)), xmlFormat);
|
||||
// XMLSerializer xmlSerializer = new XMLSerializer(System.out, xmlFormat);
|
||||
xmlSerializer.startDocument();
|
||||
xmlSerializer.startElement("dspace-dc-types", null);
|
||||
|
||||
// Save the schema definition(s)
|
||||
saveSchema(context, xmlSerializer, schema);
|
||||
|
||||
MetadataField[] mdFields = null;
|
||||
|
||||
// If a single schema has been specified
|
||||
if (schema != null && !"".equals(schema))
|
||||
{
|
||||
// Get the id of that schema
|
||||
MetadataSchema mdSchema = MetadataSchema.find(context, schema);
|
||||
if (mdSchema == null)
|
||||
{
|
||||
throw new RegistryExportException("no schema to export");
|
||||
}
|
||||
|
||||
// Get the metadata fields only for the specified schema
|
||||
mdFields = MetadataField.findAllInSchema(context, mdSchema.getSchemaID());
|
||||
}
|
||||
else
|
||||
{
|
||||
// Get the metadata fields for all the schemas
|
||||
mdFields = MetadataField.findAll(context);
|
||||
}
|
||||
|
||||
// Output the metadata fields
|
||||
for (MetadataField mdField : mdFields)
|
||||
{
|
||||
saveType(context, xmlSerializer, mdField);
|
||||
}
|
||||
|
||||
xmlSerializer.endElement("dspace-dc-types");
|
||||
xmlSerializer.endDocument();
|
||||
|
||||
// abort the context, as we shouldn't have changed it!!
|
||||
context.abort();
|
||||
}
|
||||
|
||||
/**
|
||||
* Serialize the schema registry. If the parameter 'schema' is null or empty, save all schemas
|
||||
* @param context
|
||||
* @param xmlSerializer
|
||||
* @param schema
|
||||
* @throws SQLException
|
||||
* @throws SAXException
|
||||
* @throws RegistryExportException
|
||||
*/
|
||||
public static void saveSchema(Context context, XMLSerializer xmlSerializer, String schema) throws SQLException, SAXException, RegistryExportException
|
||||
{
|
||||
if (schema != null && !"".equals(schema))
|
||||
{
|
||||
// Find a single named schema
|
||||
MetadataSchema mdSchema = MetadataSchema.find(context, schema);
|
||||
|
||||
saveSchema(xmlSerializer, mdSchema);
|
||||
}
|
||||
else
|
||||
{
|
||||
// Find all schemas
|
||||
MetadataSchema[] mdSchemas = MetadataSchema.findAll(context);
|
||||
|
||||
for (MetadataSchema mdSchema : mdSchemas)
|
||||
{
|
||||
saveSchema(xmlSerializer, mdSchema);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Serialize a single schema (namespace) registry entry
|
||||
*
|
||||
* @param xmlSerializer
|
||||
* @param mdSchema
|
||||
* @throws SAXException
|
||||
* @throws RegistryExportException
|
||||
*/
|
||||
private static void saveSchema(XMLSerializer xmlSerializer, MetadataSchema mdSchema) throws SAXException, RegistryExportException
|
||||
{
|
||||
// If we haven't got a schema, it's an error
|
||||
if (mdSchema == null)
|
||||
{
|
||||
throw new RegistryExportException("no schema to export");
|
||||
}
|
||||
|
||||
String name = mdSchema.getName();
|
||||
String namespace = mdSchema.getNamespace();
|
||||
|
||||
if (name == null || "".equals(name))
|
||||
{
|
||||
System.out.println("name is null, skipping");
|
||||
return;
|
||||
}
|
||||
|
||||
if (namespace == null || "".equals(namespace))
|
||||
{
|
||||
System.out.println("namespace is null, skipping");
|
||||
return;
|
||||
}
|
||||
|
||||
// Output the parent tag
|
||||
xmlSerializer.startElement("dc-schema", null);
|
||||
|
||||
// Output the schema name
|
||||
xmlSerializer.startElement("name", null);
|
||||
xmlSerializer.characters(name.toCharArray(), 0, name.length());
|
||||
xmlSerializer.endElement("name");
|
||||
|
||||
// Output the schema namespace
|
||||
xmlSerializer.startElement("namespace", null);
|
||||
xmlSerializer.characters(namespace.toCharArray(), 0, namespace.length());
|
||||
xmlSerializer.endElement("namespace");
|
||||
|
||||
xmlSerializer.endElement("dc-schema");
|
||||
}
|
||||
|
||||
/**
|
||||
* Serialize a single metadata field registry entry to xml
|
||||
*
|
||||
* @param context
|
||||
* @param xmlSerializer
|
||||
* @param mdField
|
||||
* @throws SAXException
|
||||
* @throws RegistryExportException
|
||||
* @throws SQLException
|
||||
* @throws IOException
|
||||
*/
|
||||
private static void saveType(Context context, XMLSerializer xmlSerializer, MetadataField mdField) throws SAXException, RegistryExportException, SQLException, IOException
|
||||
{
|
||||
// If we haven't been given a field, it's an error
|
||||
if (mdField == null)
|
||||
{
|
||||
            throw new RegistryExportException("no field to export");
        }

        // Get the data from the metadata field
        String schemaName = getSchemaName(context, mdField);
        String element = mdField.getElement();
        String qualifier = mdField.getQualifier();
        String scopeNote = mdField.getScopeNote();

        // We must have a schema and element
        if (schemaName == null || element == null)
        {
            throw new RegistryExportException("incomplete field information");
        }

        // Output the parent tag
        xmlSerializer.startElement("dc-type", null);

        // Output the schema name
        xmlSerializer.startElement("schema", null);
        xmlSerializer.characters(schemaName.toCharArray(), 0, schemaName.length());
        xmlSerializer.endElement("schema");

        // Output the element
        xmlSerializer.startElement("element", null);
        xmlSerializer.characters(element.toCharArray(), 0, element.length());
        xmlSerializer.endElement("element");

        // Output the qualifier, if present
        if (qualifier != null)
        {
            xmlSerializer.startElement("qualifier", null);
            xmlSerializer.characters(qualifier.toCharArray(), 0, qualifier.length());
            xmlSerializer.endElement("qualifier");
        }
        else
        {
            xmlSerializer.comment("unqualified");
        }

        // Output the scope note, if present
        if (scopeNote != null)
        {
            xmlSerializer.startElement("scope_note", null);
            xmlSerializer.characters(scopeNote.toCharArray(), 0, scopeNote.length());
            xmlSerializer.endElement("scope_note");
        }
        else
        {
            xmlSerializer.comment("no scope note");
        }

        xmlSerializer.endElement("dc-type");
    }

    /**
     * Helper method to retrieve a schema name for the field.
     * Caches the name after looking up the id.
     */
    static Map<Integer, String> schemaMap = new HashMap<Integer, String>();

    private static String getSchemaName(Context context, MetadataField mdField) throws SQLException, RegistryExportException
    {
        // Get name from cache
        String name = schemaMap.get(Integer.valueOf(mdField.getSchemaID()));

        if (name == null)
        {
            // Name not retrieved before, so get the schema now
            MetadataSchema mdSchema = MetadataSchema.find(context, mdField.getSchemaID());
            if (mdSchema != null)
            {
                name = mdSchema.getName();
                schemaMap.put(Integer.valueOf(mdSchema.getSchemaID()), name);
            }
            else
            {
                // Can't find the schema
                throw new RegistryExportException("Can't get schema name for field");
            }
        }
        return name;
    }

    /**
     * Print the usage message to stdout
     */
    public static void usage()
    {
        String usage = "Use this class with the following options:\n" +
                " -f <xml output file> : specify the output file for the schemas\n" +
                " -s <schema> : name of the schema to export\n";
        System.out.println(usage);
    }
}
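The getSchemaName helper above memoizes schema names in a static map so that repeated fields in the same schema only hit the database once. A minimal standalone sketch of that caching idiom, with a hypothetical `SchemaNameCache` class and a `lookupFromDatabase` stand-in for `MetadataSchema.find(...)` (neither is part of DSpace):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the memoized lookup pattern: resolve an id once,
// then serve repeat requests from a static cache.
public class SchemaNameCache
{
    static Map<Integer, String> schemaMap = new HashMap<Integer, String>();
    static int lookups = 0; // counts the expensive calls, for illustration

    // Hypothetical stand-in for MetadataSchema.find(context, schemaID)
    static String lookupFromDatabase(int schemaID)
    {
        lookups++;
        return schemaID == 1 ? "dc" : null;
    }

    public static String getName(int schemaID)
    {
        // Get name from cache first
        String name = schemaMap.get(Integer.valueOf(schemaID));
        if (name == null)
        {
            name = lookupFromDatabase(schemaID);
            if (name == null)
            {
                throw new IllegalStateException("Can't get schema name for field");
            }
            schemaMap.put(Integer.valueOf(schemaID), name);
        }
        return name;
    }

    public static void main(String[] args)
    {
        System.out.println(getName(1)); // first call does the lookup
        System.out.println(getName(1)); // second call is served from the cache
        System.out.println(lookups);
    }
}
```

Note the cache is static and never invalidated, which is fine for a one-shot command-line export but would be stale in a long-running process.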
@@ -1,264 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.administer;

import java.io.IOException;
import java.sql.SQLException;

import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;

import org.apache.xpath.XPathAPI;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.NonUniqueMetadataException;
import org.dspace.core.Context;

import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

import org.xml.sax.SAXException;

/**
 * @author Richard Jones
 *
 * This class takes an XML document as passed in the arguments and
 * uses it to create metadata elements in the Metadata Registry if
 * they do not already exist.
 *
 * The format of the XML file is as follows:
 *
 * <dspace-dc-types>
 *   <dc-type>
 *     <schema>icadmin</schema>
 *     <element>status</element>
 *     <qualifier>dateset</qualifier>
 *     <scope_note>the workflow status of an item</scope_note>
 *   </dc-type>
 *
 *   [....]
 *
 * </dspace-dc-types>
 */
public class MetadataImporter
{
    /**
     * main method for reading user input from the command line
     */
    public static void main(String[] args)
        throws ParseException, SQLException, IOException, TransformerException,
               ParserConfigurationException, AuthorizeException, SAXException,
               NonUniqueMetadataException, RegistryImportException
    {
        boolean forceUpdate = false;

        // create an options object and populate it
        CommandLineParser parser = new PosixParser();
        Options options = new Options();
        options.addOption("f", "file", true, "source xml file for DC fields");
        options.addOption("u", "update", false, "update an existing schema");
        CommandLine line = parser.parse(options, args);

        String file = null;
        if (line.hasOption('f'))
        {
            file = line.getOptionValue('f');
        }
        else
        {
            usage();
            System.exit(0);
        }

        forceUpdate = line.hasOption('u');
        loadRegistry(file, forceUpdate);
    }

    /**
     * Load the data from the specified file path into the database
     *
     * @param file the file path containing the source data
     */
    public static void loadRegistry(String file, boolean forceUpdate)
        throws SQLException, IOException, TransformerException, ParserConfigurationException,
               AuthorizeException, SAXException, NonUniqueMetadataException, RegistryImportException
    {
        // create a context
        Context context = new Context();
        context.setIgnoreAuthorization(true);

        // read the XML
        Document document = RegistryImporter.loadXML(file);

        // Get the nodes corresponding to schemas
        NodeList schemaNodes = XPathAPI.selectNodeList(document, "/dspace-dc-types/dc-schema");

        // Add each one as a new schema to the registry
        for (int i = 0; i < schemaNodes.getLength(); i++)
        {
            Node n = schemaNodes.item(i);
            loadSchema(context, n, forceUpdate);
        }

        // Get the nodes corresponding to types
        NodeList typeNodes = XPathAPI.selectNodeList(document, "/dspace-dc-types/dc-type");

        // Add each one as a new field to the registry
        for (int i = 0; i < typeNodes.getLength(); i++)
        {
            Node n = typeNodes.item(i);
            loadType(context, n);
        }

        context.complete();
    }

    /**
     * Process a node in the metadata registry XML file. If the
     * schema already exists, it will not be recreated.
     *
     * @param context
     *            DSpace context object
     * @param node
     *            the node in the DOM tree
     * @throws NonUniqueMetadataException
     */
    private static void loadSchema(Context context, Node node, boolean updateExisting)
        throws SQLException, IOException, TransformerException,
               AuthorizeException, NonUniqueMetadataException, RegistryImportException
    {
        // Get the values
        String name = RegistryImporter.getElementData(node, "name");
        String namespace = RegistryImporter.getElementData(node, "namespace");

        if (name == null || "".equals(name))
        {
            throw new RegistryImportException("Name of schema must be supplied");
        }

        if (namespace == null || "".equals(namespace))
        {
            throw new RegistryImportException("Namespace of schema must be supplied");
        }

        System.out.print("Registering Schema: " + name + " - " + namespace + " ... ");

        // check to see if the schema already exists
        MetadataSchema s = MetadataSchema.find(context, name);

        if (s == null)
        {
            // Schema does not exist - create
            MetadataSchema schema = new MetadataSchema(namespace, name);
            schema.create(context);
            System.out.println("created");
        }
        else
        {
            // Schema exists - if it's the same namespace, allow the type imports to continue
            if (s.getNamespace().equals(namespace))
            {
                System.out.println("already exists, skipping to type import");
                return;
            }

            // It's a different namespace - have we been told to update?
            if (updateExisting)
            {
                // Update the existing schema namespace and continue to type import
                s.setNamespace(namespace);
                s.update(context);
                System.out.println("namespace updated (" + name + " = " + namespace + ")");
            }
            else
            {
                // Don't update the existing namespace - abort
                System.out.println("schema exists, but with different namespace");
                System.out.println("was: " + s.getNamespace());
                System.out.println("xml: " + namespace);
                System.out.println("aborting - use -u to force the update");

                throw new RegistryImportException("schema already registered with different namespace - use -u to update");
            }
        }
    }

    /**
     * Process a node in the metadata registry XML file. The node must
     * be a "dc-type" node. If the type already exists, it
     * will not be reimported.
     *
     * @param context
     *            DSpace context object
     * @param node
     *            the node in the DOM tree
     * @throws NonUniqueMetadataException
     */
    private static void loadType(Context context, Node node)
        throws SQLException, IOException, TransformerException,
               AuthorizeException, NonUniqueMetadataException, RegistryImportException
    {
        // Get the values
        String schema = RegistryImporter.getElementData(node, "schema");
        String element = RegistryImporter.getElementData(node, "element");
        String qualifier = RegistryImporter.getElementData(node, "qualifier");
        String scopeNote = RegistryImporter.getElementData(node, "scope_note");

        // If the schema is not provided default to DC
        if (schema == null)
        {
            schema = MetadataSchema.DC_SCHEMA;
        }

        System.out.print("Registering Metadata: " + schema + "." + element + "." + qualifier + " ... ");

        // Find the matching schema object
        MetadataSchema schemaObj = MetadataSchema.find(context, schema);

        if (schemaObj == null)
        {
            throw new RegistryImportException("Schema '" + schema + "' is not registered");
        }

        MetadataField mf = MetadataField.findByElement(context, schemaObj.getSchemaID(), element, qualifier);
        if (mf != null)
        {
            System.out.println("already exists, skipping");
            return;
        }

        MetadataField field = new MetadataField();
        field.setSchemaID(schemaObj.getSchemaID());
        field.setElement(element);
        field.setQualifier(qualifier);
        field.setScopeNote(scopeNote);
        field.create(context);
        System.out.println("created");
    }

    /**
     * Print the usage message to stdout
     */
    public static void usage()
    {
        String usage = "Use this class with the following option:\n" +
                " -f <xml source file> : specify which xml source file " +
                "contains the DC fields to import.\n";
        System.out.println(usage);
    }
}
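loadType() builds each field from the text of a dc-type node's children, treating an absent qualifier or scope_note as null. A standalone sketch of that extraction, parsing a dc-type entry in the format documented above. The importer itself uses Xalan's XPathAPI; this sketch swaps in the JDK's javax.xml.xpath so it runs with no extra dependencies, and `DcTypeParse` is a hypothetical demo class, not part of DSpace:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.xml.sax.InputSource;

// Select a named child of a dc-type node and return its trimmed #text,
// or null when the element is absent.
public class DcTypeParse
{
    public static String getElementData(Node parent, String childName) throws Exception
    {
        XPath xpath = XPathFactory.newInstance().newXPath();
        Node child = (Node) xpath.evaluate(childName, parent, XPathConstants.NODE);
        if (child == null)
        {
            return null; // no such element: treated as "not supplied"
        }
        Node data = child.getFirstChild();
        return data == null ? null : data.getNodeValue().trim();
    }

    public static void main(String[] args) throws Exception
    {
        // A single dc-type entry in the documented format
        String xml = "<dspace-dc-types><dc-type>"
                + "<schema>icadmin</schema>"
                + "<element>status</element>"
                + "<qualifier>dateset</qualifier>"
                + "</dc-type></dspace-dc-types>";

        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        Node type = doc.getDocumentElement().getFirstChild();

        System.out.println(getElementData(type, "schema"));     // icadmin
        System.out.println(getElementData(type, "element"));    // status
        System.out.println(getElementData(type, "scope_note")); // null
    }
}
```

The null return for a missing scope_note is what lets loadType() register unqualified or note-less fields without special-casing them.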
@@ -1,56 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.administer;

/**
 * @author Graham Triggs
 *
 * An exception to report any problems with registry exports
 */
public class RegistryExportException extends Exception
{
    /**
     * Create an empty registry export exception
     */
    public RegistryExportException()
    {
        super();
    }

    /**
     * Create an exception with only a message
     *
     * @param message
     */
    public RegistryExportException(String message)
    {
        super(message);
    }

    /**
     * Create an exception with an inner exception and a message
     *
     * @param message
     * @param e
     */
    public RegistryExportException(String message, Throwable e)
    {
        super(message, e);
    }

    /**
     * Create an exception with an inner exception
     *
     * @param e
     */
    public RegistryExportException(Throwable e)
    {
        super(e);
    }
}
@@ -1,56 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.administer;

/**
 * @author Richard Jones
 *
 * An exception to report any problems with registry imports
 */
public class RegistryImportException extends Exception
{
    /**
     * Create an empty registry import exception
     */
    public RegistryImportException()
    {
        super();
    }

    /**
     * Create an exception with only a message
     *
     * @param message
     */
    public RegistryImportException(String message)
    {
        super(message);
    }

    /**
     * Create an exception with an inner exception and a message
     *
     * @param message
     * @param e
     */
    public RegistryImportException(String message, Throwable e)
    {
        super(message, e);
    }

    /**
     * Create an exception with an inner exception
     *
     * @param e
     */
    public RegistryImportException(Throwable e)
    {
        super(e);
    }
}
@@ -1,141 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.administer;

import java.io.File;
import java.io.IOException;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;

import org.apache.xpath.XPathAPI;

import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

import org.xml.sax.SAXException;

/**
 * @author Richard Jones
 *
 * This class provides the utility methods that registry importers need.
 * (Although it credits me as the author, these methods were largely
 * lifted from other classes.)
 */
public class RegistryImporter
{
    /**
     * Load in the XML from file.
     *
     * @param filename
     *            the filename to load from
     *
     * @return the DOM representation of the XML file
     */
    public static Document loadXML(String filename)
        throws IOException, ParserConfigurationException, SAXException
    {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder();

        Document document = builder.parse(new File(filename));

        return document;
    }

    /**
     * Get the CDATA of a particular element. For example, if the XML document
     * contains:
     * <P>
     * <code>
     * <foo><mimetype>application/pdf</mimetype></foo>
     * </code>
     * passing this the <code>foo</code> node and <code>mimetype</code> will
     * return <code>application/pdf</code>.
     * </P>
     *
     * @param parentElement
     *            the element whose child element you want the CDATA from
     * @param childName
     *            the name of the element you want the CDATA from
     *
     * @return the CDATA as a <code>String</code>
     */
    public static String getElementData(Node parentElement, String childName)
        throws TransformerException
    {
        // Grab the child node
        Node childNode = XPathAPI.selectSingleNode(parentElement, childName);

        if (childNode == null)
        {
            // No child node, so no values
            return null;
        }

        // Get the #text
        Node dataNode = childNode.getFirstChild();

        if (dataNode == null)
        {
            return null;
        }

        // Get the data
        String value = dataNode.getNodeValue().trim();

        return value;
    }

    /**
     * Get repeated CDATA for a particular element. For example, if the XML
     * document contains:
     * <P>
     * <code>
     * <foo>
     *   <bar>val1</bar>
     *   <bar>val2</bar>
     * </foo>
     * </code>
     * passing this the <code>foo</code> node and <code>bar</code> will
     * return <code>val1</code> and <code>val2</code>.
     * </P>
     *
     * @param parentElement
     *            the element whose child elements you want the CDATA from
     * @param childName
     *            the name of the element you want the CDATA from
     *
     * @return the CDATA as a <code>String</code> array
     */
    public static String[] getRepeatedElementData(Node parentElement,
            String childName) throws TransformerException
    {
        // Grab the child nodes
        NodeList childNodes = XPathAPI.selectNodeList(parentElement, childName);

        String[] data = new String[childNodes.getLength()];

        for (int i = 0; i < childNodes.getLength(); i++)
        {
            // Get the #text node
            Node dataNode = childNodes.item(i).getFirstChild();

            // Get the data
            data[i] = dataNode.getNodeValue().trim();
        }

        return data;
    }
}
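getRepeatedElementData() above collects the trimmed #text of every matching child. A standalone sketch of the same loop, using the `<foo>`/`<bar>` example from the javadoc. Again, the original relies on Xalan's XPathAPI; this sketch swaps in the JDK's javax.xml.xpath to stay dependency-free, and `RepeatedElementDemo` is a hypothetical demo class:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

// Select every <childName> child of a parent node and gather the
// trimmed #text of each into a String array.
public class RepeatedElementDemo
{
    public static String[] getRepeatedElementData(Node parent, String childName) throws Exception
    {
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList childNodes = (NodeList) xpath.evaluate(childName, parent, XPathConstants.NODESET);

        String[] data = new String[childNodes.getLength()];
        for (int i = 0; i < childNodes.getLength(); i++)
        {
            // Get the #text node of each repeated element
            data[i] = childNodes.item(i).getFirstChild().getNodeValue().trim();
        }
        return data;
    }

    public static void main(String[] args) throws Exception
    {
        // The <foo>/<bar> sample from the javadoc
        String xml = "<foo><bar>val1</bar><bar>val2</bar></foo>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));

        String[] values = getRepeatedElementData(doc.getDocumentElement(), "bar");
        for (String v : values)
        {
            System.out.println(v); // val1, then val2
        }
    }
}
```

Like the original, this would throw a NullPointerException on an empty element such as `<bar/>`, since getFirstChild() returns null when there is no #text node.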
@@ -1,581 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.administer;

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.xpath.XPathAPI;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;
import org.jdom.Element;
import org.jdom.output.XMLOutputter;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;

/**
 * This class deals with importing community and collection structures from
 * an XML file.
 *
 * The XML file structure needs to be:
 *
 * <import_structure>
 *   <community>
 *     <name>....</name>
 *     <community>...</community>
 *     <collection>
 *       <name>....</name>
 *     </collection>
 *   </community>
 * </import_structure>
 *
 * It can be arbitrarily deep, and supports all the metadata elements
 * that make up the community and collection metadata. See the system
 * documentation for more details.
 *
 * @author Richard Jones
 */
public class StructBuilder
{
    /** the output xml document which will contain updated information about the
     * imported structure
     */
    private static org.jdom.Document xmlOutput = new org.jdom.Document(new Element("imported_structure"));

    /** a hashtable to hold metadata for the collection being worked on */
    private static Map<String, String> collectionMap = new HashMap<String, String>();

    /** a hashtable to hold metadata for the community being worked on */
    private static Map<String, String> communityMap = new HashMap<String, String>();

    /**
     * Main method to be run from the command line to import a structure into
     * DSpace. This is of the form:
     *
     * StructBuilder -f [xml source] -e [administrator email] -o [output file]
     *
     * The output file will contain exactly the same as the source xml document, but
     * with the handle for each imported item added as an attribute.
     */
    public static void main(String[] argv)
        throws Exception
    {
        CommandLineParser parser = new PosixParser();

        Options options = new Options();

        options.addOption("f", "file", true, "file");
        options.addOption("e", "eperson", true, "eperson");
        options.addOption("o", "output", true, "output");

        CommandLine line = parser.parse(options, argv);

        String file = null;
        String eperson = null;
        String output = null;

        if (line.hasOption('f'))
        {
            file = line.getOptionValue('f');
        }

        if (line.hasOption('e'))
        {
            eperson = line.getOptionValue('e');
        }

        if (line.hasOption('o'))
        {
            output = line.getOptionValue('o');
        }

        if (output == null || eperson == null || file == null)
        {
            usage();
            System.exit(0);
        }

        // create a context
        Context context = new Context();

        // set the context
        context.setCurrentUser(EPerson.findByEmail(context, eperson));

        // load the XML
        Document document = loadXML(file);

        // run the preliminary validation, to be sure that the XML document
        // is properly structured
        validate(document);

        // load the mappings into the member variable hashmaps
        communityMap.put("name", "name");
        communityMap.put("description", "short_description");
        communityMap.put("intro", "introductory_text");
        communityMap.put("copyright", "copyright_text");
        communityMap.put("sidebar", "side_bar_text");

        collectionMap.put("name", "name");
        collectionMap.put("description", "short_description");
        collectionMap.put("intro", "introductory_text");
        collectionMap.put("copyright", "copyright_text");
        collectionMap.put("sidebar", "side_bar_text");
        collectionMap.put("license", "license");
        collectionMap.put("provenance", "provenance_description");

        // get the top level community list
        NodeList first = XPathAPI.selectNodeList(document, "/import_structure/community");

        // run the import starting with the top level communities
        Element[] elements = handleCommunities(context, first, null);

        // generate the output
        Element root = xmlOutput.getRootElement();
        for (int i = 0; i < elements.length; i++)
        {
            root.addContent(elements[i]);
        }

        // finally write the string into the output file
        try
        {
            BufferedWriter out = new BufferedWriter(new FileWriter(output));
            out.write(new XMLOutputter().outputString(xmlOutput));
            out.close();
        }
        catch (IOException e)
        {
            System.out.println("Unable to write to output file " + output);
            System.exit(0);
        }

        context.complete();
    }

    /**
     * Output the usage information
     */
    private static void usage()
    {
        System.out.println("Usage: java StructBuilder -f <source XML file> -o <output file> -e <eperson email>");
        System.out.println("Communities will be created from the top level, and a map of communities to handles will be returned in the output file");
        return;
    }

    /**
     * Validate the XML document. This method does not return, but if validation
     * fails it generates an error and ceases execution
     *
     * @param document the XML document object
     * @throws TransformerException
     */
    private static void validate(org.w3c.dom.Document document)
        throws TransformerException
    {
        StringBuffer err = new StringBuffer();
        boolean trip = false;

        err.append("The following errors were encountered parsing the source XML\n");
        err.append("No changes have been made to the DSpace instance\n\n");

        NodeList first = XPathAPI.selectNodeList(document, "/import_structure/community");
        if (first.getLength() == 0)
        {
            err.append("-There are no top level communities in the source document");
            System.out.println(err.toString());
            System.exit(0);
        }

        String errs = validateCommunities(first, 1);
        if (errs != null)
        {
            err.append(errs);
            trip = true;
        }

        if (trip)
        {
            System.out.println(err.toString());
            System.exit(0);
        }
    }

    /**
     * Validate the communities section of the XML document. This returns a string
     * containing any errors encountered, or null if there were no errors
     *
     * @param communities the NodeList of communities to validate
     * @param level the level in the XML document that we are at, for the purposes
     * of error reporting
     *
     * @return the errors that need to be generated by the calling method, or null if
     * no errors.
     */
    private static String validateCommunities(NodeList communities, int level)
        throws TransformerException
    {
        StringBuffer err = new StringBuffer();
        boolean trip = false;
        String errs = null;

        for (int i = 0; i < communities.getLength(); i++)
        {
            Node n = communities.item(i);
            NodeList name = XPathAPI.selectNodeList(n, "name");
            if (name.getLength() != 1)
            {
                String pos = Integer.toString(i + 1);
                err.append("-The level " + level + " community in position " + pos);
                err.append(" does not contain exactly one name field\n");
                trip = true;
            }

            // validate sub communities
            NodeList subCommunities = XPathAPI.selectNodeList(n, "community");
            String comErrs = validateCommunities(subCommunities, level + 1);
            if (comErrs != null)
            {
                err.append(comErrs);
                trip = true;
            }

            // validate collections
            NodeList collections = XPathAPI.selectNodeList(n, "collection");
            String colErrs = validateCollections(collections, level + 1);
            if (colErrs != null)
            {
                err.append(colErrs);
                trip = true;
            }
        }

        if (trip)
        {
            errs = err.toString();
        }

        return errs;
    }

    /**
     * Validate the collection section of the XML document. This generates a
     * string containing any errors encountered, or returns null if no errors
     *
     * @param collections a NodeList of collections to validate
     * @param level the level in the XML document for the purposes of error reporting
     *
     * @return the errors to be generated by the calling method, or null if none
     */
    private static String validateCollections(NodeList collections, int level)
        throws TransformerException
    {
        StringBuffer err = new StringBuffer();
        boolean trip = false;
        String errs = null;

        for (int i = 0; i < collections.getLength(); i++)
        {
            Node n = collections.item(i);
            NodeList name = XPathAPI.selectNodeList(n, "name");
            if (name.getLength() != 1)
            {
                String pos = Integer.toString(i + 1);
                err.append("-The level " + level + " collection in position " + pos);
                err.append(" does not contain exactly one name field\n");
                trip = true;
            }
        }

        if (trip)
        {
            errs = err.toString();
        }

        return errs;
    }

    /**
     * Load in the XML from file.
     *
     * @param filename
     *            the filename to load from
     *
     * @return the DOM representation of the XML file
     */
    private static org.w3c.dom.Document loadXML(String filename)
        throws IOException, ParserConfigurationException, SAXException
    {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder();

        org.w3c.dom.Document document = builder.parse(new File(filename));

        return document;
    }

    /**
     * Return the String value of a Node
     *
     * @param node the node from which we want to extract the string value
     *
     * @return the string value of the node
     */
    public static String getStringValue(Node node)
    {
        String value = node.getNodeValue();

        if (node.hasChildNodes())
        {
            Node first = node.getFirstChild();

            if (first.getNodeType() == Node.TEXT_NODE)
            {
                return first.getNodeValue().trim();
            }
        }

        return value;
    }

    /**
     * Take a node list of communities and build the structure from them, delegating
     * to the relevant methods in this class for sub-communities and collections
     *
     * @param context the context of the request
     * @param communities a nodelist of communities to create along with their sub-structures
     * @param parent the parent community of the nodelist of communities to create
     *
     * @return an element array containing additional information regarding the
     * created communities (e.g. the handles they have been assigned)
     */
    private static Element[] handleCommunities(Context context, NodeList communities, Community parent)
        throws TransformerException, SQLException, Exception
    {
        Element[] elements = new Element[communities.getLength()];

        for (int i = 0; i < communities.getLength(); i++)
        {
            Community community;
            Element element = new Element("community");

            // create the community or sub community
            if (parent != null)
            {
                community = parent.createSubcommunity();
            }
            else
            {
                community = Community.create(null, context);
            }

            // default the short description to be an empty string
            community.setMetadata("short_description", " ");

            // now update the metadata
            Node tn = communities.item(i);
            for (Map.Entry<String, String> entry : communityMap.entrySet())
            {
                NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey());
                if (nl.getLength() == 1)
                {
                    community.setMetadata(entry.getValue(), getStringValue(nl.item(0)));
                }
            }

            // FIXME: at the moment, if the community already exists by name
            // then this will throw a PSQLException on a duplicate key violation.
            // Ideally we'd skip this row and continue to create sub communities
            // and so forth where they don't exist, but it's proving difficult
            // to isolate the community that already exists without hitting
            // the database directly.
            community.update();

            // build the element with the handle that identifies the new
            // community, along with all the information that we imported here.
            // This looks like a lot of repetition of getting information from
            // above, but it's here to keep it separate from the create process
            // in case we want to move it or make it switchable later
            element.setAttribute("identifier", community.getHandle());

            Element nameElement = new Element("name");
            nameElement.setText(community.getMetadata("name"));
            element.addContent(nameElement);

            if (community.getMetadata("short_description") != null)
            {
                Element descriptionElement = new Element("description");
                descriptionElement.setText(community.getMetadata("short_description"));
                element.addContent(descriptionElement);
            }

            if (community.getMetadata("introductory_text") != null)
            {
                Element introElement = new Element("intro");
                introElement.setText(community.getMetadata("introductory_text"));
                element.addContent(introElement);
            }

            if (community.getMetadata("copyright_text") != null)
            {
                Element copyrightElement = new Element("copyright");
                copyrightElement.setText(community.getMetadata("copyright_text"));
                element.addContent(copyrightElement);
            }

            if (community.getMetadata("side_bar_text") != null)
            {
                Element sidebarElement = new Element("sidebar");
                sidebarElement.setText(community.getMetadata("side_bar_text"));
                element.addContent(sidebarElement);
            }

            // handle sub communities
            NodeList subCommunities = XPathAPI.selectNodeList(tn, "community");
            Element[] subCommunityElements = handleCommunities(context, subCommunities, community);

            // handle collections
            NodeList collections = XPathAPI.selectNodeList(tn, "collection");
            Element[] collectionElements = handleCollections(context, collections, community);

            int j;
            for (j = 0; j < subCommunityElements.length; j++)
            {
                element.addContent(subCommunityElements[j]);
            }
            for (j = 0; j < collectionElements.length; j++)
            {
                element.addContent(collectionElements[j]);
            }

            elements[i] = element;
        }

        return elements;
    }

    /**
     * Take a node list of collections and create the structure from them
|
||||
*
|
||||
* @param context the context of the request
|
||||
* @param collections the node list of collections to be created
|
||||
* @param parent the parent community to whom the collections belong
|
||||
*
|
||||
* @return an Element array containing additional information about the
|
||||
* created collections (e.g. the handle)
|
||||
*/
|
||||
private static Element[] handleCollections(Context context, NodeList collections, Community parent)
|
||||
throws TransformerException, SQLException, AuthorizeException, IOException, Exception
|
||||
{
|
||||
Element[] elements = new Element[collections.getLength()];
|
||||
|
||||
for (int i = 0; i < collections.getLength(); i++)
|
||||
{
|
||||
Element element = new Element("collection");
|
||||
Collection collection = parent.createCollection();
|
||||
|
||||
// default the short description to the empty string
|
||||
collection.setMetadata("short_description", " ");
|
||||
|
||||
// import the rest of the metadata
|
||||
Node tn = collections.item(i);
|
||||
for (Map.Entry<String, String> entry : collectionMap.entrySet())
|
||||
{
|
||||
NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey());
|
||||
if (nl.getLength() == 1)
|
||||
{
|
||||
collection.setMetadata(entry.getValue(), getStringValue(nl.item(0)));
|
||||
}
|
||||
}
|
||||
|
||||
collection.update();
|
||||
|
||||
element.setAttribute("identifier", collection.getHandle());
|
||||
|
||||
Element nameElement = new Element("name");
|
||||
nameElement.setText(collection.getMetadata("name"));
|
||||
element.addContent(nameElement);
|
||||
|
||||
if (collection.getMetadata("short_description") != null)
|
||||
{
|
||||
Element descriptionElement = new Element("description");
|
||||
descriptionElement.setText(collection.getMetadata("short_description"));
|
||||
element.addContent(descriptionElement);
|
||||
}
|
||||
|
||||
if (collection.getMetadata("introductory_text") != null)
|
||||
{
|
||||
Element introElement = new Element("intro");
|
||||
introElement.setText(collection.getMetadata("introductory_text"));
|
||||
element.addContent(introElement);
|
||||
}
|
||||
|
||||
if (collection.getMetadata("copyright_text") != null)
|
||||
{
|
||||
Element copyrightElement = new Element("copyright");
|
||||
copyrightElement.setText(collection.getMetadata("copyright_text"));
|
||||
element.addContent(copyrightElement);
|
||||
}
|
||||
|
||||
if (collection.getMetadata("side_bar_text") != null)
|
||||
{
|
||||
Element sidebarElement = new Element("sidebar");
|
||||
sidebarElement.setText(collection.getMetadata("side_bar_text"));
|
||||
element.addContent(sidebarElement);
|
||||
}
|
||||
|
||||
if (collection.getMetadata("license") != null)
|
||||
{
|
||||
Element sidebarElement = new Element("license");
|
||||
sidebarElement.setText(collection.getMetadata("license"));
|
||||
element.addContent(sidebarElement);
|
||||
}
|
||||
|
||||
if (collection.getMetadata("provenance_description") != null)
|
||||
{
|
||||
Element sidebarElement = new Element("provenance");
|
||||
sidebarElement.setText(collection.getMetadata("provenance_description"));
|
||||
element.addContent(sidebarElement);
|
||||
}
|
||||
|
||||
elements[i] = element;
|
||||
}
|
||||
|
||||
return elements;
|
||||
}
|
||||
|
||||
}
|
||||
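The recursive walk in `handleCommunities()` can be seen in isolation: each node is queried for its `community` children (recursed into) and its `collection` children. The sketch below uses the JDK's built-in `javax.xml.xpath` package as a stand-in for Xalan's `XPathAPI`; the element names match the relative XPath expressions in the import code above, while the sample document content is invented for illustration.

```java
import java.io.ByteArrayInputStream;
import java.util.Arrays;
import java.util.List;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class StructureWalkDemo {
    /**
     * Parse a tiny structure document (hypothetical content) and return the
     * node counts found at each level: top-level communities, sub-communities
     * of the first one, and collections of the first sub-community.
     */
    static List<Integer> demoCounts() throws Exception {
        String xml =
            "<import_structure>"
            + "<community><name>Top</name>"
            +   "<community><name>Sub</name>"
            +     "<collection><name>Theses</name></collection>"
            +   "</community>"
            + "</community>"
            + "</import_structure>";
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        XPath xpath = XPathFactory.newInstance().newXPath();

        // The same relative expressions the import code evaluates against
        // each node: "community" children first, then "collection" children.
        NodeList top = (NodeList) xpath.evaluate("community",
                doc.getDocumentElement(), XPathConstants.NODESET);
        NodeList subs = (NodeList) xpath.evaluate("community",
                top.item(0), XPathConstants.NODESET);
        NodeList colls = (NodeList) xpath.evaluate("collection",
                subs.item(0), XPathConstants.NODESET);
        return Arrays.asList(top.getLength(), subs.getLength(), colls.getLength());
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demoCounts()); // [1, 1, 1]
    }
}
```

Because the expressions are evaluated relative to the current node, only direct children are matched at each step, which is what makes the recursion terminate cleanly.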
@@ -1,338 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.bulkedit;

import org.dspace.content.Item;
import org.dspace.content.DCValue;
import org.dspace.content.Collection;

import java.util.ArrayList;
import java.util.List;

/**
 * Utility class to store changes to an item that may occur during a batch edit.
 *
 * @author Stuart Lewis
 */
public class BulkEditChange
{
    /** The item these changes relate to */
    private Item item;

    /** The list of added metadata values */
    private List<DCValue> adds;

    /** The list of removed metadata values */
    private List<DCValue> removes;

    /** The list of unchanged metadata values */
    private List<DCValue> constant;

    /** The complete set of new values (constant + adds) */
    private List<DCValue> complete;

    /** The list of old collections the item used to be mapped to */
    private List<Collection> oldMappedCollections;

    /** The list of new collections the item has been mapped into */
    private List<Collection> newMappedCollections;

    /** The old owning collection */
    private Collection oldOwningCollection;

    /** The new owning collection */
    private Collection newOwningCollection;

    /** Is this a new item? */
    private boolean newItem;

    /** Have any changes actually been made? */
    private boolean empty;


    /**
     * Initialise a change holder for a new item
     */
    public BulkEditChange()
    {
        // Set the item to be null
        item = null;
        newItem = true;
        empty = true;
        oldOwningCollection = null;
        newOwningCollection = null;

        // Initialise the lists
        adds = new ArrayList<DCValue>();
        removes = new ArrayList<DCValue>();
        constant = new ArrayList<DCValue>();
        complete = new ArrayList<DCValue>();
        oldMappedCollections = new ArrayList<Collection>();
        newMappedCollections = new ArrayList<Collection>();
    }

    /**
     * Initialise a new change holder for an existing item
     *
     * @param i The Item to store
     */
    public BulkEditChange(Item i)
    {
        // Store the item
        item = i;
        newItem = false;
        empty = true;

        // Initialise the lists
        adds = new ArrayList<DCValue>();
        removes = new ArrayList<DCValue>();
        constant = new ArrayList<DCValue>();
        complete = new ArrayList<DCValue>();
        oldMappedCollections = new ArrayList<Collection>();
        newMappedCollections = new ArrayList<Collection>();
    }

    /**
     * Store the item - used when a new item is created
     *
     * @param i The item
     */
    public void setItem(Item i)
    {
        // Store the item
        item = i;
    }

    /**
     * Register an added metadata value
     *
     * @param dcv The value to add
     */
    public void registerAdd(DCValue dcv)
    {
        // Record the added value
        adds.add(dcv);
        complete.add(dcv);
        empty = false;
    }

    /**
     * Register a removed metadata value
     *
     * @param dcv The value to remove
     */
    public void registerRemove(DCValue dcv)
    {
        // Record the removed value
        removes.add(dcv);
        empty = false;
    }

    /**
     * Register an unchanged metadata value
     *
     * @param dcv The value to keep unchanged
     */
    public void registerConstant(DCValue dcv)
    {
        // Record the unchanged value
        constant.add(dcv);
        complete.add(dcv);
    }

    /**
     * Register a new mapped Collection
     *
     * @param c The new mapped Collection
     */
    public void registerNewMappedCollection(Collection c)
    {
        // Record the new mapped Collection
        newMappedCollections.add(c);
        empty = false;
    }

    /**
     * Register an old mapped Collection
     *
     * @param c The old mapped Collection
     */
    public void registerOldMappedCollection(Collection c)
    {
        // Record the old mapped Collection, unless it is the old owning
        // collection or has already been recorded
        boolean found = false;

        if ((this.getOldOwningCollection() != null) &&
            (this.getOldOwningCollection().getHandle().equals(c.getHandle())))
        {
            found = true;
        }

        for (Collection collection : oldMappedCollections)
        {
            if (collection.getHandle().equals(c.getHandle()))
            {
                found = true;
            }
        }

        if (!found)
        {
            oldMappedCollections.add(c);
            empty = false;
        }
    }

    /**
     * Register a change to the owning collection
     *
     * @param oldC The old owning collection
     * @param newC The new owning collection
     */
    public void changeOwningCollection(Collection oldC, Collection newC)
    {
        // Store the old owning collection
        oldOwningCollection = oldC;

        // Store the new owning collection
        newOwningCollection = newC;
        empty = false;
    }

    /**
     * Set the owning collection of an item
     *
     * @param newC The new owning collection
     */
    public void setOwningCollection(Collection newC)
    {
        // Store the new owning collection
        newOwningCollection = newC;
        //empty = false;
    }

    /**
     * Get the DSpace Item that these changes are applicable to.
     *
     * @return The item
     */
    public Item getItem()
    {
        return item;
    }

    /**
     * Get the list of elements and their values that have been added.
     *
     * @return the list of elements and their values that have been added.
     */
    public List<DCValue> getAdds()
    {
        return adds;
    }

    /**
     * Get the list of elements and their values that have been removed.
     *
     * @return the list of elements and their values that have been removed.
     */
    public List<DCValue> getRemoves()
    {
        return removes;
    }

    /**
     * Get the list of unchanged values
     *
     * @return the list of unchanged values
     */
    public List<DCValue> getConstant()
    {
        return constant;
    }

    /**
     * Get the list of all values
     *
     * @return the list of all values
     */
    public List<DCValue> getComplete()
    {
        return complete;
    }

    /**
     * Get the list of new mapped Collections
     *
     * @return the list of new mapped collections
     */
    public List<Collection> getNewMappedCollections()
    {
        return newMappedCollections;
    }

    /**
     * Get the list of old mapped Collections
     *
     * @return the list of old mapped collections
     */
    public List<Collection> getOldMappedCollections()
    {
        return oldMappedCollections;
    }

    /**
     * Get the old owning collection
     *
     * @return the old owning collection
     */
    public Collection getOldOwningCollection()
    {
        return oldOwningCollection;
    }

    /**
     * Get the new owning collection
     *
     * @return the new owning collection
     */
    public Collection getNewOwningCollection()
    {
        return newOwningCollection;
    }

    /**
     * Does this change object represent a new item?
     *
     * @return Whether or not this is for a new item
     */
    public boolean isNewItem()
    {
        return newItem;
    }

    /**
     * Have any changes actually been recorded, or is this empty?
     *
     * @return Whether or not changes have been made
     */
    public boolean hasChanges()
    {
        return !empty;
    }
}
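The invariants that BulkEditChange maintains (the complete list is always constant + adds, and only adds or removes flip the empty flag) can be distilled into a small stand-alone sketch. This is not the DSpace API, just a generic holder that mirrors the pattern, so it can run without `Item`, `DCValue`, or `Collection`:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-alone sketch (not the DSpace API) of the change-holder
// pattern used by BulkEditChange: "complete" is always constant + adds,
// and the holder only reports changes once an add or remove is registered.
public class ChangeSet<T> {
    private final List<T> adds = new ArrayList<T>();
    private final List<T> removes = new ArrayList<T>();
    private final List<T> constant = new ArrayList<T>();
    private final List<T> complete = new ArrayList<T>();
    private boolean empty = true;

    public void registerAdd(T value) {
        adds.add(value);
        complete.add(value);
        empty = false;
    }

    public void registerRemove(T value) {
        removes.add(value);
        empty = false;
    }

    public void registerConstant(T value) {
        constant.add(value);
        complete.add(value);
        // unchanged values alone do not mark the holder as changed
    }

    public boolean hasChanges() {
        return !empty;
    }

    public List<T> getComplete() {
        return complete;
    }

    public static void main(String[] args) {
        ChangeSet<String> cs = new ChangeSet<String>();
        cs.registerConstant("dc.title=Example");
        System.out.println(cs.hasChanges()); // false
        cs.registerAdd("dc.subject=History");
        System.out.println(cs.hasChanges()); // true
        System.out.println(cs.getComplete().size()); // 2
    }
}
```

The same separation lets the metadata-import code report a full "after" state for each item while still knowing whether an update is actually needed.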
@@ -1,607 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.bulkedit;

import org.dspace.content.*;
import org.dspace.content.Collection;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;

import java.util.*;
import java.util.regex.Pattern;
import java.util.regex.Matcher;
import java.io.*;

/**
 * Utility class to read and write CSV files
 *
 * **************
 * Important Note
 * **************
 *
 * This class has been made serializable, as it is stored in a Session.
 * Is it wise to:
 * a) be putting this into a user's session?
 * b) hold an entire CSV upload in memory?
 *
 * @author Stuart Lewis
 */
public class DSpaceCSV implements Serializable
{
    /** The headings of the CSV file */
    private List<String> headings;

    /** The list of CSV lines */
    private List<DSpaceCSVLine> lines;

    /** A counter of how many CSV lines this object holds */
    private int counter;

    /** The value separator (defaults to double pipe '||') */
    protected static String valueSeparator;

    /** The value separator in an escaped form for use in regexes */
    protected static String escapedValueSeparator;

    /** The field separator (defaults to comma) */
    protected static String fieldSeparator;

    /** The field separator in an escaped form for use in regexes */
    protected static String escapedFieldSeparator;

    /** Whether to export all metadata, such as handles and provenance information */
    private boolean exportAll;

    /** A list of metadata elements to ignore */
    private Map<String, String> ignore;


    /**
     * Create a new instance of a CSV line holder
     *
     * @param exportAll Whether to export all metadata, such as handles and provenance information
     */
    public DSpaceCSV(boolean exportAll)
    {
        // Initialise the class
        init();

        // Store the exportAll setting
        this.exportAll = exportAll;
    }

    /**
     * Create a new instance, reading the lines in from file
     *
     * @param f The file to read from
     * @param c The DSpace Context
     *
     * @throws Exception thrown if there is an error reading or processing the file
     */
    public DSpaceCSV(File f, Context c) throws Exception
    {
        // Initialise the class
        init();

        // Open the CSV file
        BufferedReader input = null;
        try
        {
            input = new BufferedReader(new InputStreamReader(new FileInputStream(f), "UTF-8"));

            // Read the heading line
            String head = input.readLine();
            String[] headingElements = head.split(escapedFieldSeparator);
            for (String element : headingElements)
            {
                // Remove surrounding quotes if there are any
                if ((element.startsWith("\"")) && (element.endsWith("\"")))
                {
                    element = element.substring(1, element.length() - 1);
                }

                if ("collection".equals(element))
                {
                    // Store the heading
                    headings.add(element);
                }
                else if (!"id".equals(element))
                {
                    // Verify that the heading is valid in the metadata registry
                    String[] clean = element.split("\\[");
                    String[] parts = clean[0].split("\\.");
                    String metadataSchema = parts[0];
                    String metadataElement = parts[1];
                    String metadataQualifier = null;
                    if (parts.length > 2) {
                        metadataQualifier = parts[2];
                    }

                    // Check that the schema exists
                    MetadataSchema foundSchema = MetadataSchema.find(c, metadataSchema);
                    if (foundSchema == null) {
                        throw new MetadataImportInvalidHeadingException(clean[0],
                                MetadataImportInvalidHeadingException.SCHEMA);
                    }

                    // Check that the metadata element exists in the schema
                    int schemaID = foundSchema.getSchemaID();
                    MetadataField foundField = MetadataField.findByElement(c, schemaID, metadataElement, metadataQualifier);
                    if (foundField == null) {
                        throw new MetadataImportInvalidHeadingException(clean[0],
                                MetadataImportInvalidHeadingException.ELEMENT);
                    }

                    // Store the heading
                    headings.add(element);
                }
            }

            // Read each subsequent line
            StringBuilder lineBuilder = new StringBuilder();
            String lineRead;

            while ((lineRead = input.readLine()) != null)
            {
                if (lineBuilder.length() > 0) {
                    // Already have a previously read value - add this line
                    lineBuilder.append("\n").append(lineRead);

                    // Count the number of quotes in the buffer
                    int quoteCount = 0;
                    for (int pos = 0; pos < lineBuilder.length(); pos++) {
                        if (lineBuilder.charAt(pos) == '"') {
                            quoteCount++;
                        }
                    }

                    if (quoteCount % 2 == 0) {
                        // Number of quotes is a multiple of 2 - add the item
                        addItem(lineBuilder.toString());
                        lineBuilder = new StringBuilder();
                    }
                } else if (lineRead.indexOf('"') > -1) {
                    // Get the number of quotes in the line
                    int quoteCount = 0;
                    for (int pos = 0; pos < lineRead.length(); pos++) {
                        if (lineRead.charAt(pos) == '"') {
                            quoteCount++;
                        }
                    }

                    if (quoteCount % 2 == 0) {
                        // Number of quotes is a multiple of 2 - add the item
                        addItem(lineRead);
                    } else {
                        // Uneven quotes - add to the buffer and leave for later
                        lineBuilder.append(lineRead);
                    }
                } else {
                    // No previously read line, and no quotes in the line - add the item
                    addItem(lineRead);
                }
            }
        }
        finally
        {
            if (input != null)
            {
                input.close();
            }
        }
    }

    /**
     * Initialise this class with values from dspace.cfg
     */
    private void init()
    {
        // Set the value separator
        setValueSeparator();

        // Set the field separator
        setFieldSeparator();

        // Create the headings
        headings = new ArrayList<String>();

        // Create the blank list of items
        lines = new ArrayList<DSpaceCSVLine>();

        // Initialise the counter
        counter = 0;

        // Set the metadata fields to ignore
        ignore = new HashMap<String, String>();
        String toIgnore = ConfigurationManager.getProperty("bulkedit.ignore-on-export");
        if ((toIgnore == null) || ("".equals(toIgnore.trim())))
        {
            // Set a default value
            toIgnore = "dc.date.accessioned, dc.date.available, " +
                       "dc.date.updated, dc.description.provenance";
        }
        String[] toIgnoreArray = toIgnore.split(",");
        for (String toIgnoreString : toIgnoreArray)
        {
            if (!"".equals(toIgnoreString.trim()))
            {
                ignore.put(toIgnoreString.trim(), toIgnoreString.trim());
            }
        }
    }

    /**
     * Set the value separator for multiple values stored in one CSV value.
     *
     * Is set in dspace.cfg as bulkedit.valueseparator
     *
     * If not set, defaults to double pipe '||'
     */
    private void setValueSeparator()
    {
        // Get the value separator
        valueSeparator = ConfigurationManager.getProperty("bulkedit.valueseparator");
        if ((valueSeparator != null) && (!"".equals(valueSeparator.trim())))
        {
            valueSeparator = valueSeparator.trim();
        }
        else
        {
            valueSeparator = "||";
        }

        // Now store the escaped version
        Pattern spchars = Pattern.compile("([\\\\*+\\[\\](){}\\$.?\\^|])");
        Matcher match = spchars.matcher(valueSeparator);
        escapedValueSeparator = match.replaceAll("\\\\$1");
    }

    /**
     * Set the field separator used to separate fields in the CSV.
     *
     * Is set in dspace.cfg as bulkedit.fieldseparator
     *
     * If not set, defaults to comma ','.
     *
     * Special values are 'tab', 'hash' and 'semicolon' which will
     * get substituted from the text to the value.
     */
    private void setFieldSeparator()
    {
        // Get the field separator
        fieldSeparator = ConfigurationManager.getProperty("bulkedit.fieldseparator");
        if ((fieldSeparator != null) && (!"".equals(fieldSeparator.trim())))
        {
            fieldSeparator = fieldSeparator.trim();
            if ("tab".equals(fieldSeparator))
            {
                fieldSeparator = "\t";
            }
            else if ("semicolon".equals(fieldSeparator))
            {
                fieldSeparator = ";";
            }
            else if ("hash".equals(fieldSeparator))
            {
                fieldSeparator = "#";
            }
        }
        else
        {
            fieldSeparator = ",";
        }

        // Now store the escaped version
        Pattern spchars = Pattern.compile("([\\\\*+\\[\\](){}\\$.?\\^|])");
        Matcher match = spchars.matcher(fieldSeparator);
        escapedFieldSeparator = match.replaceAll("\\\\$1");
    }

    /**
     * Add a DSpace item to the CSV file
     *
     * @param i The DSpace item
     *
     * @throws Exception if something goes wrong with adding the Item
     */
    public final void addItem(Item i) throws Exception
    {
        // Create the CSV line
        DSpaceCSVLine line = new DSpaceCSVLine(i.getID());

        // Add in the owning collection
        String owningCollectionHandle = i.getOwningCollection().getHandle();
        line.add("collection", owningCollectionHandle);

        // Add in any mapped collections
        Collection[] collections = i.getCollections();
        for (Collection c : collections)
        {
            // Only add if it is not the owning collection
            if (!c.getHandle().equals(owningCollectionHandle))
            {
                line.add("collection", c.getHandle());
            }
        }

        // Populate it
        DCValue md[] = i.getMetadata(Item.ANY, Item.ANY, Item.ANY, Item.ANY);
        for (DCValue value : md)
        {
            // Get the key (schema.element)
            String key = value.schema + "." + value.element;

            // Add the qualifier if there is one (schema.element.qualifier)
            if (value.qualifier != null)
            {
                key = key + "." + value.qualifier;
            }

            // Add the language if there is one (schema.element.qualifier[language])
            //if ((value.language != null) && (!"".equals(value.language)))
            if (value.language != null)
            {
                key = key + "[" + value.language + "]";
            }

            // Store the value
            if (exportAll || okToExport(value))
            {
                line.add(key, value.value);
                if (!headings.contains(key))
                {
                    headings.add(key);
                }
            }
        }
        lines.add(line);
        counter++;
    }

    /**
     * Add an item to the CSV file, from a CSV line of elements
     *
     * @param line The line of elements
     * @throws Exception Thrown if an error occurs when adding the item
     */
    public final void addItem(String line) throws Exception
    {
        // Check to see if the last character is a field separator, which hides the last empty column
        boolean last = false;
        if (line.endsWith(fieldSeparator))
        {
            // Add a space to the end, then remove it later
            last = true;
            line += " ";
        }

        // Split up on the field separator
        String[] parts = line.split(escapedFieldSeparator);
        ArrayList<String> bits = new ArrayList<String>();
        bits.addAll(Arrays.asList(parts));

        // Merge parts with embedded separators
        boolean alldone = false;
        while (!alldone)
        {
            boolean found = false;
            int i = 0;
            for (String part : bits)
            {
                int bitcounter = part.length() - part.replaceAll("\"", "").length();
                if ((part.startsWith("\"")) && ((!part.endsWith("\"")) || ((bitcounter & 1) == 1)))
                {
                    found = true;
                    String add = bits.get(i) + fieldSeparator + bits.get(i + 1);
                    bits.remove(i);
                    bits.add(i, add);
                    bits.remove(i + 1);
                    break;
                }
                i++;
            }
            alldone = !found;
        }

        // Deal with quotes around the elements
        int i = 0;
        for (String part : bits)
        {
            if ((part.startsWith("\"")) && (part.endsWith("\"")))
            {
                part = part.substring(1, part.length() - 1);
                bits.set(i, part);
            }
            i++;
        }

        // Remove embedded quotes
        i = 0;
        for (String part : bits)
        {
            if (part.contains("\"\""))
            {
                part = part.replaceAll("\"\"", "\"");
                bits.set(i, part);
            }
            i++;
        }

        // Add the elements to a DSpaceCSVLine
        String id = parts[0].replaceAll("\"", "");
        DSpaceCSVLine csvLine;

        // Is this an existing item, or a new item (where id = '+')?
        if ("+".equals(id))
        {
            csvLine = new DSpaceCSVLine();
        }
        else
        {
            try
            {
                csvLine = new DSpaceCSVLine(Integer.parseInt(id));
            }
            catch (NumberFormatException nfe)
            {
                System.err.println("Invalid item identifier: " + id);
                System.err.println("Please check your CSV file. " +
                                   "Item id must be numeric, or a '+' to add a new item");
                throw(nfe);
            }
        }

        // Add the rest of the parts
        i = 0;
        for (String part : bits)
        {
            if (i > 0)
            {
                // Is this the last (empty) item?
                if ((last) && (i == headings.size()))
                {
                    part = "";
                }

                // Make sure we register that this column was there
                csvLine.add(headings.get(i - 1), null);
                String[] elements = part.split(escapedValueSeparator);
                for (String element : elements)
                {
                    if ((element != null) && (!"".equals(element)))
                    {
                        csvLine.add(headings.get(i - 1), element);
                    }
                }
            }
            i++;
        }
        lines.add(csvLine);
        counter++;
    }

    /**
     * Get the lines in CSV holders
     *
     * @return The lines
     */
    public final List<DSpaceCSVLine> getCSVLines()
    {
        // Return the lines
        return lines;
    }

    /**
     * Get the CSV lines as an array of CSV formatted strings
     *
     * @return the array of CSV formatted Strings
     */
    public final String[] getCSVLinesAsStringArray()
    {
        // Create the headings line
        String[] csvLines = new String[counter + 1];
        csvLines[0] = "id" + fieldSeparator + "collection";
        Collections.sort(headings);
        for (String value : headings)
        {
            csvLines[0] = csvLines[0] + fieldSeparator + value;
        }

        Iterator<DSpaceCSVLine> i = lines.iterator();
        int c = 1;
        while (i.hasNext())
        {
            csvLines[c++] = i.next().toCSV(headings);
        }

        return csvLines;
    }

    /**
     * Save the CSV file to the given filename
     *
     * @param filename The filename to save the CSV file to
     *
     * @throws IOException Thrown if an error occurs when writing the file
     */
    public final void save(String filename) throws IOException
    {
        // Save the file
        BufferedWriter out = new BufferedWriter(
                             new OutputStreamWriter(
                             new FileOutputStream(filename), "UTF-8"));
        for (String csvLine : getCSVLinesAsStringArray()) {
            out.write(csvLine + "\n");
        }
        out.flush();
        out.close();
    }

    /**
     * Is it OK to export this value? When exportAll is set to false, we don't export
     * some of the metadata elements.
     *
     * The list can be configured via the key bulkedit.ignore-on-export in dspace.cfg
     *
     * @param md The DCValue to examine
     * @return Whether or not it is OK to export this element
     */
    private final boolean okToExport(DCValue md)
    {
        // First check the metadata schema, and allow all non-DC elements
        if (!"dc".equals(md.schema))
        {
            return true;
        }

        // Now compare with the list to ignore
        String key = md.schema + "." + md.element;
        if (md.qualifier != null)
        {
            key += "." + md.qualifier;
        }
        if (ignore.get(key) != null) {
            return false;
        }

        // Must be OK, so don't ignore
        return true;
    }

    /**
     * Get the headings used in this CSV file
     *
     * @return The headings
     */
    public List<String> getHeadings()
    {
        return headings;
    }

    /**
     * Return the CSV file as one long formatted string
     *
     * @return The formatted String as a CSV
     */
    public final String toString()
    {
        // Return the CSV as one long string
        StringBuffer csvLines = new StringBuffer();
        String[] lines = this.getCSVLinesAsStringArray();
        for (String line : lines)
        {
            csvLines.append(line).append("\n");
        }
        return csvLines.toString();
    }
}
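The separator escaping that `setValueSeparator()` and `setFieldSeparator()` perform is worth seeing on its own: because `String.split()` takes a regex, any regex metacharacter in the configured separator (including the default `||`) must be backslash-escaped before splitting. The helper below reuses the exact pattern and replacement from the class above, wrapped in a small demo class for illustration:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SeparatorEscapeDemo {
    // Mirrors the escaping in DSpaceCSV.setValueSeparator(): any regex
    // metacharacter in the configured separator is backslash-escaped so the
    // separator can be passed safely to String.split().
    static String escapeForRegex(String separator) {
        Pattern spchars = Pattern.compile("([\\\\*+\\[\\](){}\\$.?\\^|])");
        Matcher match = spchars.matcher(separator);
        return match.replaceAll("\\\\$1");
    }

    public static void main(String[] args) {
        // The default value separator '||' is two regex alternation chars,
        // so unescaped it would split between every character.
        String escaped = escapeForRegex("||");
        System.out.println(escaped); // \|\|

        String[] parts = "Smith, A.||Jones, B.".split(escaped);
        System.out.println(parts.length); // 2
        System.out.println(parts[0]);     // Smith, A.
    }
}
```

Without the escaping, `"a||b".split("||")` would treat `|` as alternation with the empty pattern and split at every position, which is why the escaped form is cached separately from the display form.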
@@ -1,176 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.bulkedit;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

/**
 * Utility class to store a line from a CSV file
 *
 * @author Stuart Lewis
 */
public class DSpaceCSVLine
{
    /** The item id of the item represented by this line. -1 is for a new item */
    private int id;

    /** The elements in this line in a hashtable, keyed by the metadata type */
    private Map<String, ArrayList<String>> items;

    /**
     * Create a new CSV line
     *
     * @param itemId The item ID of the line
     */
    public DSpaceCSVLine(int itemId)
    {
        // Store the ID, and initialise the hashtable
        this.id = itemId;
        items = new HashMap<String, ArrayList<String>>();
    }

    /**
     * Create a new CSV line for a new item
     */
    public DSpaceCSVLine()
    {
        // Set the ID to be -1, and initialise the hashtable
        this.id = -1;
        this.items = new HashMap<String, ArrayList<String>>();
    }

    /**
     * Get the item ID that this line represents
     *
     * @return The item ID
     */
    public int getID()
    {
        // Return the ID
        return id;
    }

    /**
     * Add a new metadata value to this line
     *
     * @param key The metadata key (e.g. dc.contributor.author)
     * @param value The metadata value
     */
    public void add(String key, String value)
    {
        // Create the array list if we need to
        if (items.get(key) == null)
        {
            items.put(key, new ArrayList<String>());
        }

        // Store the value if it is not null
        if (value != null)
        {
            items.get(key).add(value);
        }
    }

    /**
     * Get all the values that match the given metadata key. Will be null if none exist.
     *
     * @param key The metadata key
     * @return All the elements that match
     */
    public List<String> get(String key)
    {
        // Return any relevant values
        return items.get(key);
    }

    /**
     * Get all the metadata keys that are represented in this line
     *
     * @return A set of all the keys
     */
    public Set<String> keys()
    {
        // Return the keys
        return items.keySet();
    }

    /**
     * Write this line out as a CSV formatted string, in the order given by the headings provided
     *
     * @param headings The headings which define the order the elements must be presented in
     * @return The CSV formatted String
     */
    protected String toCSV(List<String> headings)
    {
        StringBuilder bits = new StringBuilder();

        // Add the id
        bits.append("\"").append(id).append("\"").append(DSpaceCSV.fieldSeparator);
        bits.append(valueToCSV(items.get("collection")));

        // Add the rest of the elements
        for (String heading : headings)
        {
            bits.append(DSpaceCSV.fieldSeparator);
            List<String> values = items.get(heading);
            if (values != null && !"collection".equals(heading))
            {
                bits.append(valueToCSV(values));
            }
        }

        return bits.toString();
    }

    /**
     * Internal method to create a CSV formatted String joining a given set of elements
     *
     * @param values The values to create the string from
     * @return The line as a CSV formatted String
     */
    protected String valueToCSV(List<String> values)
    {
        // Check there is some content
        if (values == null)
        {
            return "";
        }

        // Get on with the work
        String s;
        if (values.size() == 1)
        {
            s = values.get(0);
        }
        else
        {
            // Concatenate any fields together
            StringBuilder str = new StringBuilder();

            for (String value : values)
            {
                if (str.length() > 0)
                {
                    str.append(DSpaceCSV.valueSeparator);
                }

                str.append(value);
            }

            s = str.toString();
        }

        // Replace internal quotes with two sets of quotes
        return "\"" + s.replaceAll("\"", "\"\"") + "\"";
    }
}
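The DSpaceCSVLine class above is essentially a multi-valued map with CSV quoting. A minimal self-contained sketch of the same pattern follows; it uses only the JDK, the class and method names are illustrative rather than DSpace API, and `"||"` is assumed as the value separator (DSpace reads the real separator from configuration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CsvLineSketch {
    // Values keyed by metadata field name; each key may hold several values
    private final Map<String, List<String>> items = new HashMap<>();

    public void add(String key, String value) {
        // Create the list on first use, then store only non-null values
        items.computeIfAbsent(key, k -> new ArrayList<>());
        if (value != null) {
            items.get(key).add(value);
        }
    }

    // Join multiple values with the assumed "||" separator and escape
    // embedded quotes by doubling them, as valueToCSV does above
    public String valueToCSV(String key) {
        List<String> values = items.get(key);
        if (values == null) {
            return "";
        }
        String joined = String.join("||", values);
        return "\"" + joined.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        CsvLineSketch line = new CsvLineSketch();
        line.add("dc.contributor.author", "Lewis, Stuart");
        line.add("dc.contributor.author", "Smith, \"Doc\"");
        // Prints: "Lewis, Stuart||Smith, ""Doc"""
        System.out.println(line.valueToCSV("dc.contributor.author"));
    }
}
```

The doubling of embedded quotes matches the common CSV convention, so a multi-valued field survives a round trip through a spreadsheet.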
@@ -1,266 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.bulkedit;

import org.apache.commons.cli.*;

import org.dspace.content.*;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.handle.HandleManager;

import java.util.ArrayList;
import java.sql.SQLException;
import java.util.List;

/**
 * Metadata exporter to allow the batch export of metadata into a file
 *
 * @author Stuart Lewis
 */
public class MetadataExport
{
    /** The items to export */
    private ItemIterator toExport;

    /** Whether to export all metadata, or just normally edited metadata */
    private boolean exportAll;

    /**
     * Set up a new metadata export
     *
     * @param c The Context
     * @param toExport The ItemIterator of items to export
     * @param exportAll whether to export all metadata or not (include handle, provenance etc)
     */
    public MetadataExport(Context c, ItemIterator toExport, boolean exportAll)
    {
        // Store the export settings
        this.toExport = toExport;
        this.exportAll = exportAll;
    }

    /**
     * Method to export a community (and sub-communities and collections)
     *
     * @param c The Context
     * @param toExport The Community to export
     * @param exportAll whether to export all metadata or not (include handle, provenance etc)
     */
    public MetadataExport(Context c, Community toExport, boolean exportAll)
    {
        try
        {
            // Try to export the community
            this.toExport = new ItemIterator(c, buildFromCommunity(toExport, new ArrayList<Integer>(), 0));
            this.exportAll = exportAll;
        }
        catch (SQLException sqle)
        {
            // Something went wrong...
            System.err.println("Error running exporter:");
            sqle.printStackTrace(System.err);
            System.exit(1);
        }
    }

    /**
     * Build an array list of item ids that are in a community (including sub-communities and collections)
     *
     * @param community The community to build from
     * @param itemIDs The item IDs collected so far (used for recursion - pass in an empty ArrayList)
     * @param indent How many spaces to use when writing out the names of items added
     * @return The list of item ids
     * @throws SQLException
     */
    private List<Integer> buildFromCommunity(Community community, List<Integer> itemIDs, int indent)
                                             throws SQLException
    {
        // Add all the collections
        Collection[] collections = community.getCollections();
        for (Collection collection : collections)
        {
            for (int i = 0; i < indent; i++)
            {
                System.out.print(" ");
            }

            ItemIterator items = collection.getAllItems();
            while (items.hasNext())
            {
                int id = items.next().getID();
                // Only add if not already included (so mapped items only appear once)
                if (!itemIDs.contains(id))
                {
                    itemIDs.add(id);
                }
            }
        }

        // Add all the sub-communities
        Community[] communities = community.getSubcommunities();
        for (Community subCommunity : communities)
        {
            for (int i = 0; i < indent; i++)
            {
                System.out.print(" ");
            }
            buildFromCommunity(subCommunity, itemIDs, indent + 1);
        }

        return itemIDs;
    }

    /**
     * Run the export
     *
     * @return the exported CSV lines
     */
    public DSpaceCSV export()
    {
        try
        {
            // Process each item
            DSpaceCSV csv = new DSpaceCSV(exportAll);
            while (toExport.hasNext())
            {
                csv.addItem(toExport.next());
            }

            // Return the results
            return csv;
        }
        catch (Exception e)
        {
            return null;
        }
    }

    /**
     * Print the help message
     *
     * @param options The command line options the user gave
     * @param exitCode the system exit code to use
     */
    private static void printHelp(Options options, int exitCode)
    {
        // print the help message
        HelpFormatter myhelp = new HelpFormatter();
        myhelp.printHelp("MetadataExport\n", options);
        System.out.println("\nfull export: metadataexport -f filename");
        System.out.println("partial export: metadataexport -i handle -f filename");
        System.exit(exitCode);
    }

    /**
     * main method to run the metadata exporter
     *
     * @param argv the command line arguments given
     */
    public static void main(String[] argv) throws Exception
    {
        // Create an options object and populate it
        CommandLineParser parser = new PosixParser();

        Options options = new Options();

        options.addOption("i", "id", true, "ID or handle of thing to export (item, collection, or community)");
        options.addOption("f", "file", true, "destination where you want file written");
        options.addOption("a", "all", false, "include all metadata fields that are not normally changed (e.g. provenance)");
        options.addOption("h", "help", false, "help");

        CommandLine line = null;

        try
        {
            line = parser.parse(options, argv);
        }
        catch (ParseException pe)
        {
            System.err.println("Error with commands.");
            printHelp(options, 1);
        }

        if (line.hasOption('h'))
        {
            printHelp(options, 0);
        }

        // Check a filename is given
        if (!line.hasOption('f'))
        {
            System.err.println("Required parameter -f missing!");
            printHelp(options, 1);
        }
        String filename = line.getOptionValue('f');

        // Create a context
        Context c = new Context();
        c.turnOffAuthorisationSystem();

        // The things we'll export
        ItemIterator toExport = null;
        MetadataExport exporter = null;

        // Export everything?
        boolean exportAll = line.hasOption('a');

        // Check we have an item OK
        if (!line.hasOption('i'))
        {
            System.out.println("Exporting whole repository WARNING: May take some time!");
            exporter = new MetadataExport(c, Item.findAll(c), exportAll);
        }
        else
        {
            String handle = line.getOptionValue('i');
            DSpaceObject dso = HandleManager.resolveToObject(c, handle);
            if (dso == null)
            {
                System.err.println("'" + handle + "' does not resolve to an object in your repository!");
                printHelp(options, 1);
            }

            if (dso.getType() == Constants.ITEM)
            {
                System.out.println("Exporting item '" + dso.getName() + "' (" + handle + ")");
                List<Integer> item = new ArrayList<Integer>();
                item.add(dso.getID());
                exporter = new MetadataExport(c, new ItemIterator(c, item), exportAll);
            }
            else if (dso.getType() == Constants.COLLECTION)
            {
                System.out.println("Exporting collection '" + dso.getName() + "' (" + handle + ")");
                Collection collection = (Collection)dso;
                toExport = collection.getAllItems();
                exporter = new MetadataExport(c, toExport, exportAll);
            }
            else if (dso.getType() == Constants.COMMUNITY)
            {
                System.out.println("Exporting community '" + dso.getName() + "' (" + handle + ")");
                exporter = new MetadataExport(c, (Community)dso, exportAll);
            }
            else
            {
                System.err.println("Error identifying '" + handle + "'");
                System.exit(1);
            }
        }

        // Perform the export
        DSpaceCSV csv = exporter.export();

        // Save the results to the file
        csv.save(filename);

        // Finish off and tidy up
        c.restoreAuthSystemState();
        c.complete();
    }
}
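buildFromCommunity above walks every collection and sub-community recursively, adding each item ID only once so mapped items appear a single time. The same dedup-while-recursing pattern, sketched with a plain tree structure (the class and field names here are illustrative, not the DSpace API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CommunitySketch {
    // Each community holds collections (lists of item IDs) and sub-communities
    final List<List<Integer>> collections = new ArrayList<>();
    final List<CommunitySketch> subCommunities = new ArrayList<>();

    // Collect item IDs depth-first, skipping IDs already seen so that
    // items mapped into several collections are only added once
    static List<Integer> buildFrom(CommunitySketch community, List<Integer> itemIDs) {
        for (List<Integer> collection : community.collections) {
            for (int id : collection) {
                if (!itemIDs.contains(id)) {
                    itemIDs.add(id);
                }
            }
        }
        for (CommunitySketch sub : community.subCommunities) {
            buildFrom(sub, itemIDs);
        }
        return itemIDs;
    }

    public static void main(String[] args) {
        CommunitySketch top = new CommunitySketch();
        top.collections.add(Arrays.asList(1, 2, 3));
        CommunitySketch sub = new CommunitySketch();
        sub.collections.add(Arrays.asList(3, 4)); // item 3 is mapped into both
        top.subCommunities.add(sub);
        // Prints: [1, 2, 3, 4]
        System.out.println(buildFrom(top, new ArrayList<>()));
    }
}
```

Passing the accumulator list through the recursion, as the original does, keeps insertion order stable; for very large repositories a `LinkedHashSet` would make the membership check cheaper while preserving that order.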
File diff suppressed because it is too large
@@ -1,37 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.bulkedit;

/**
 * Metadata importer exception
 *
 * @author Stuart Lewis
 */
public class MetadataImportException extends Exception
{
    /**
     * Instantiate a new MetadataImportException
     *
     * @param message the error message
     */
    public MetadataImportException(String message)
    {
        super(message);
    }

    /**
     * Instantiate a new MetadataImportException
     *
     * @param message the error message
     * @param exception the root cause
     */
    public MetadataImportException(String message, Exception exception)
    {
        super(message, exception);
    }
}
@@ -1,79 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.bulkedit;

/**
 * Metadata importer invalid heading exception
 *
 * @author Stuart Lewis
 */
public class MetadataImportInvalidHeadingException extends Exception
{
    /** The type of error (schema or element) */
    private int type;

    /** The bad heading */
    private String badHeading;

    /** Error with the schema */
    public static final int SCHEMA = 0;

    /** Error with the element */
    public static final int ELEMENT = 1;

    /**
     * Instantiate a new MetadataImportInvalidHeadingException
     *
     * @param message the error message
     * @param theType the type of the error
     */
    public MetadataImportInvalidHeadingException(String message, int theType)
    {
        super(message);
        badHeading = message;
        type = theType;
    }

    /**
     * Get the type of the exception
     *
     * @return the type of the exception
     */
    public String getType()
    {
        return "" + type;
    }

    /**
     * Get the heading that was invalid
     *
     * @return the invalid heading
     */
    public String getBadHeader()
    {
        return badHeading;
    }

    /**
     * Get the exception message
     *
     * @return The exception message
     */
    public String getMessage()
    {
        if (type == SCHEMA)
        {
            return "Unknown metadata schema in heading: " + badHeading;
        }
        else
        {
            return "Unknown metadata element in heading: " + badHeading;
        }
    }
}
@@ -1,483 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.harvest;

import java.io.IOException;
import java.sql.SQLException;
import java.util.List;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.dspace.authorize.AuthorizeException;
import org.dspace.browse.IndexBrowse;
import org.dspace.content.Collection;
import org.dspace.content.DSpaceObject;
import org.dspace.harvest.HarvestedCollection;
import org.dspace.content.Item;
import org.dspace.content.ItemIterator;
import org.dspace.harvest.OAIHarvester;
import org.dspace.harvest.OAIHarvester.HarvestingException;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;
import org.dspace.handle.HandleManager;

/**
 * Test class for harvested collections.
 *
 * @author Alexey Maslov
 */
public class Harvest
{
    private static Context context;

    public static void main(String[] argv) throws Exception
    {
        // create an options object and populate it
        CommandLineParser parser = new PosixParser();

        Options options = new Options();

        options.addOption("p", "purge", false, "delete all items in the collection");
        options.addOption("r", "run", false, "run the standard harvest procedure");
        options.addOption("g", "ping", false, "test the OAI server and set");
        options.addOption("o", "once", false, "run the harvest procedure with specified parameters");
        options.addOption("s", "setup", false, "Set the collection up for harvesting");
        options.addOption("S", "start", false, "start the harvest loop");
        options.addOption("R", "reset", false, "reset harvest status on all collections");
        // note: the long name must differ from -p's "purge" to avoid a clash
        options.addOption("P", "purge-all", false, "purge all harvestable collections");

        options.addOption("e", "eperson", true, "eperson");
        options.addOption("c", "collection", true, "harvesting collection (handle or id)");
        options.addOption("t", "type", true, "type of harvesting (0 for none)");
        options.addOption("a", "address", true, "address of the OAI-PMH server");
        options.addOption("i", "oai_set_id", true, "id of the PMH set representing the harvested collection");
        options.addOption("m", "metadata_format", true, "the name of the desired metadata format for harvesting, resolved to namespace and crosswalk in dspace.cfg");

        options.addOption("h", "help", false, "help");

        CommandLine line = parser.parse(options, argv);

        String command = null;
        String eperson = null;
        String collection = null;
        String oaiSource = null;
        String oaiSetID = null;
        String metadataKey = null;
        int harvestType = 0;

        if (line.hasOption('h'))
        {
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("Harvest\n", options);
            System.out.println("\nPING OAI server: Harvest -g -a oai_source -i oai_set_id");
            System.out.println("RUNONCE harvest with arbitrary options: Harvest -o -e eperson -c collection -t harvest_type -a oai_source -i oai_set_id -m metadata_format");
            System.out.println("SETUP a collection for harvesting: Harvest -s -c collection -t harvest_type -a oai_source -i oai_set_id -m metadata_format");
            System.out.println("RUN harvest once: Harvest -r -e eperson -c collection");
            System.out.println("START harvest scheduler: Harvest -S");
            System.out.println("RESET all harvest status: Harvest -R");
            System.out.println("PURGE a collection of items and settings: Harvest -p -e eperson -c collection");
            System.out.println("PURGE all harvestable collections: Harvest -P -e eperson");

            System.exit(0);
        }

        if (line.hasOption('s')) {
            command = "config";
        }
        if (line.hasOption('p')) {
            command = "purge";
        }
        if (line.hasOption('r')) {
            command = "run";
        }
        if (line.hasOption('g')) {
            command = "ping";
        }
        if (line.hasOption('o')) {
            command = "runOnce";
        }
        if (line.hasOption('S')) {
            command = "start";
        }
        if (line.hasOption('R')) {
            command = "reset";
        }
        if (line.hasOption('P')) {
            command = "purgeAll";
        }

        if (line.hasOption('e')) {
            eperson = line.getOptionValue('e');
        }
        if (line.hasOption('c')) {
            collection = line.getOptionValue('c');
        }
        if (line.hasOption('t')) {
            harvestType = Integer.parseInt(line.getOptionValue('t'));
        } else {
            harvestType = 0;
        }
        if (line.hasOption('a')) {
            oaiSource = line.getOptionValue('a');
        }
        if (line.hasOption('i')) {
            oaiSetID = line.getOptionValue('i');
        }
        if (line.hasOption('m')) {
            metadataKey = line.getOptionValue('m');
        }

        // Instantiate our class
        Harvest harvester = new Harvest();
        context = new Context();

        // Check our options
        if (command == null)
        {
            System.out.println("Error - no parameters specified (run with -h flag for details)");
            System.exit(1);
        }
        // Run a single harvest cycle on a collection using saved settings.
        else if ("run".equals(command))
        {
            if (collection == null || eperson == null)
            {
                System.out.println("Error - a target collection and eperson must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            harvester.runHarvest(collection, eperson);
        }
        // start the harvest loop
        else if ("start".equals(command))
        {
            startHarvester();
        }
        // reset harvesting status
        else if ("reset".equals(command))
        {
            resetHarvesting();
        }
        // purge all collections that are set up for harvesting (obviously for testing purposes only)
        else if ("purgeAll".equals(command))
        {
            if (eperson == null)
            {
                System.out.println("Error - an eperson must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            List<Integer> cids = HarvestedCollection.findAll(context);
            System.out.println("Purging the following collections (deleting items and resetting harvest status): " + cids.toString());
            for (Integer cid : cids)
            {
                harvester.purgeCollection(cid.toString(), eperson);
            }
            context.complete();
        }
        // Delete all items in a collection. Useful for testing fresh harvests.
        else if ("purge".equals(command))
        {
            if (collection == null || eperson == null)
            {
                System.out.println("Error - a target collection and eperson must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            harvester.purgeCollection(collection, eperson);
            context.complete();

            //TODO: implement this... remove all items and remember to unset "last-harvested" settings
        }
        // Configure a collection with the three main settings
        else if ("config".equals(command))
        {
            if (collection == null)
            {
                System.out.println("Error - a target collection must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }
            if (oaiSource == null || oaiSetID == null)
            {
                System.out.println("Error - both the OAI server address and OAI set id must be specified");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }
            if (metadataKey == null)
            {
                System.out.println("Error - a metadata key (commonly the prefix) must be specified for this collection");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            harvester.configureCollection(collection, harvestType, oaiSource, oaiSetID, metadataKey);
        }
        else if ("ping".equals(command))
        {
            if (oaiSource == null || oaiSetID == null)
            {
                System.out.println("Error - both the OAI server address and OAI set id must be specified");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }
        }
    }

    /*
     * Resolve the ID into a collection and check to see if its harvesting options are set. If so, return
     * the collection; if not, bail out.
     */
    private Collection resolveCollection(String collectionID) {

        DSpaceObject dso;
        Collection targetCollection = null;

        try {
            // is the ID a handle?
            if (collectionID != null)
            {
                if (collectionID.indexOf('/') != -1)
                {
                    // string has a / so it must be a handle - try and resolve it
                    dso = HandleManager.resolveToObject(context, collectionID);

                    // resolved, now make sure it's a collection
                    if (dso == null || dso.getType() != Constants.COLLECTION)
                    {
                        targetCollection = null;
                    }
                    else
                    {
                        targetCollection = (Collection) dso;
                    }
                }
                // not a handle, try and treat it as an integer collection
                // database ID
                else
                {
                    System.out.println("Looking up by id: " + collectionID + ", parsed as '" + Integer.parseInt(collectionID) + "', " + "in context: " + context);
                    targetCollection = Collection.find(context, Integer.parseInt(collectionID));
                }
            }
            // was the collection valid?
            if (targetCollection == null)
            {
                System.out.println("Cannot resolve " + collectionID + " to collection");
                System.exit(1);
            }
        }
        catch (SQLException se) {
            se.printStackTrace();
        }

        return targetCollection;
    }

    private void configureCollection(String collectionID, int type, String oaiSource, String oaiSetId, String mdConfigId) {
        System.out.println("Running: configure collection");

        Collection collection = resolveCollection(collectionID);
        System.out.println(collection.getID());

        try {
            HarvestedCollection hc = HarvestedCollection.find(context, collection.getID());
            if (hc == null) {
                hc = HarvestedCollection.create(context, collection.getID());
            }

            context.turnOffAuthorisationSystem();
            hc.setHarvestParams(type, oaiSource, oaiSetId, mdConfigId);
            hc.setHarvestStatus(HarvestedCollection.STATUS_READY);
            hc.update();
            context.restoreAuthSystemState();
            context.complete();
        }
        catch (Exception e) {
            System.out.println("Changes could not be committed");
            e.printStackTrace();
            System.exit(1);
        }
        finally {
            if (context != null)
            {
                context.restoreAuthSystemState();
            }
        }
    }

    /**
     * Purges a collection of all harvest-related data and settings. All items in the collection will be deleted.
     *
     * @param collectionID the collection to purge
     * @param email the email address of the EPerson to run the purge as
     */
    private void purgeCollection(String collectionID, String email) {
        System.out.println("Purging collection of all items and resetting last_harvested and harvest_message: " + collectionID);
        Collection collection = resolveCollection(collectionID);

        try
        {
            EPerson eperson = EPerson.findByEmail(context, email);
            context.setCurrentUser(eperson);
            context.turnOffAuthorisationSystem();

            ItemIterator it = collection.getAllItems();
            IndexBrowse ib = new IndexBrowse(context);
            int i = 0;
            while (it.hasNext()) {
                i++;
                Item item = it.next();
                System.out.println("Deleting: " + item.getHandle());
                ib.itemRemoved(item);
                collection.removeItem(item);
                // commit every 50 items
                if (i % 50 == 0) {
                    context.commit();
                    i = 0;
                }
            }

            HarvestedCollection hc = HarvestedCollection.find(context, collection.getID());
            if (hc != null) {
                hc.setHarvestResult(null, "");
                hc.setHarvestStatus(HarvestedCollection.STATUS_READY);
                hc.setHarvestStartTime(null);
                hc.update();
            }
            context.restoreAuthSystemState();
            context.commit();
        }
        catch (Exception e) {
            System.out.println("Changes could not be committed");
            e.printStackTrace();
            System.exit(1);
        }
        finally {
            context.restoreAuthSystemState();
        }
    }

    /**
     * Run a single harvest cycle on the specified collection under the authorization of the supplied EPerson
     */
    private void runHarvest(String collectionID, String email) {
        System.out.println("Running: a harvest cycle on " + collectionID);

        System.out.print("Initializing the harvester... ");
        OAIHarvester harvester = null;
        try {
            Collection collection = resolveCollection(collectionID);
            HarvestedCollection hc = HarvestedCollection.find(context, collection.getID());
            harvester = new OAIHarvester(context, collection, hc);
            System.out.println("success. ");
        }
        catch (HarvestingException hex) {
            System.out.print("failed. ");
            System.out.println(hex.getMessage());
            throw new IllegalStateException("Unable to harvest", hex);
        } catch (SQLException se) {
            System.out.print("failed. ");
            System.out.println(se.getMessage());
            throw new IllegalStateException("Unable to access database", se);
        }

        try {
            // Harvest will not work for an anonymous user
            EPerson eperson = EPerson.findByEmail(context, email);
            System.out.println("Harvest started... ");
            context.setCurrentUser(eperson);
            harvester.runHarvest();
            context.complete();
        }
        catch (SQLException e) {
            throw new IllegalStateException("Failed to run harvester", e);
        }
        catch (AuthorizeException e) {
            throw new IllegalStateException("Failed to run harvester", e);
        }
        catch (IOException e) {
            throw new IllegalStateException("Failed to run harvester", e);
        }

        System.out.println("Harvest complete. ");
    }

    /**
     * Resets harvest_status and harvest_start_time flags for all collections that have a row in the harvested_collections table
     */
    private static void resetHarvesting() {
        System.out.print("Resetting harvest status flag on all collections... ");

        try
        {
            List<Integer> cids = HarvestedCollection.findAll(context);
            for (Integer cid : cids)
            {
                HarvestedCollection hc = HarvestedCollection.find(context, cid);
                //hc.setHarvestResult(null,"");
                hc.setHarvestStartTime(null);
                hc.setHarvestStatus(HarvestedCollection.STATUS_READY);
                hc.update();
            }
            context.commit();
            System.out.println("success. ");
        }
        catch (Exception ex) {
            System.out.println("failed. ");
            ex.printStackTrace();
        }
    }

    /**
     * Starts up the harvest scheduler. Terminating this process will stop the scheduler.
     */
    private static void startHarvester()
    {
        try
        {
            System.out.print("Starting harvest loop... ");
            OAIHarvester.startNewScheduler();
            System.out.println("running. ");
        }
        catch (Exception ex) {
            ex.printStackTrace();
        }
    }

}
File diff suppressed because it is too large
@@ -1,29 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemexport;

/**
 * An exception that can be thrown when errors occur during item export
 */
public class ItemExportException extends Exception
{
    public static final int EXPORT_TOO_LARGE = 0;

    private int reason;

    public ItemExportException(int r, String message)
    {
        super(message);
        reason = r;
    }

    public int getReason()
    {
        return reason;
    }
}
File diff suppressed because it is too large
@@ -1,80 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Container for UpdateActions
 *
 * Order of actions is very important for correct processing. This implementation
 * supports an iterator that returns the actions in the order in which they are
 * put in. Adding the same action a second time has no effect on this order.
 */
public class ActionManager implements Iterable<UpdateAction> {

    private Map<Class<? extends UpdateAction>, UpdateAction> registry
            = new LinkedHashMap<Class<? extends UpdateAction>, UpdateAction>();

    public UpdateAction getUpdateAction(Class<? extends UpdateAction> actionClass)
            throws InstantiationException, IllegalAccessException
    {
        UpdateAction action = registry.get(actionClass);

        if (action == null)
        {
            action = actionClass.newInstance();
            registry.put(actionClass, action);
        }

        return action;
    }

    /**
     * @return whether any actions have been registered with this manager
     */
    public boolean hasActions()
    {
        return !registry.isEmpty();
    }

    /**
     * This implementation guarantees the iterator order is the same as the order
     * in which updateActions have been added
     *
     * @return iterator for UpdateActions
     */
    public Iterator<UpdateAction> iterator()
    {
        return new Iterator<UpdateAction>()
        {
            private Iterator<Class<? extends UpdateAction>> itr = registry.keySet().iterator();

            public boolean hasNext()
            {
                return itr.hasNext();
            }

            public UpdateAction next()
            {
                return registry.get(itr.next());
            }

            //not supported
            public void remove()
            {
                throw new UnsupportedOperationException();
            }
        };
    }
}
@@ -1,203 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.sql.SQLException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.authorize.AuthorizeManager;
import org.dspace.authorize.ResourcePolicy;
import org.dspace.content.Bitstream;
import org.dspace.content.BitstreamFormat;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.FormatIdentifier;
import org.dspace.content.InstallItem;
import org.dspace.content.Item;
import org.dspace.core.Context;
import org.dspace.eperson.Group;

/**
 * Action to add bitstreams listed in the item contents file to the item in DSpace
 */
public class AddBitstreamsAction extends UpdateBitstreamsAction {

    public AddBitstreamsAction()
    {
        //empty
    }

    /**
     * Adds bitstreams from the archive as listed in the contents file.
     *
     * @param context
     * @param itarch
     * @param isTest
     * @param suppressUndo
     * @throws IllegalArgumentException
     * @throws ParseException
     * @throws IOException
     * @throws AuthorizeException
     * @throws SQLException
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest,
            boolean suppressUndo) throws IllegalArgumentException,
            ParseException, IOException, AuthorizeException, SQLException
    {
        Item item = itarch.getItem();
        File dir = itarch.getDirectory();

        List<ContentsEntry> contents = MetadataUtilities.readContentsFile(new File(dir, ItemUpdate.CONTENTS_FILE));

        if (contents.isEmpty())
        {
            ItemUpdate.pr("Contents is empty - no bitstreams to add");
            return;
        }

        ItemUpdate.pr("Contents bitstream count: " + contents.size());

        String[] files = dir.list(ItemUpdate.fileFilter);
        List<String> fileList = new ArrayList<String>();
        for (String filename : files)
        {
            fileList.add(filename);
            ItemUpdate.pr("file: " + filename);
        }

        for (ContentsEntry ce : contents)
        {
            //validate match to existing file in archive
            if (!fileList.contains(ce.filename))
            {
                throw new IllegalArgumentException("File listed in contents is missing: " + ce.filename);
            }
        }

        //now okay to add
        for (ContentsEntry ce : contents)
        {
            addBitstream(context, itarch, item, dir, ce, suppressUndo, isTest);
        }
    }

    private void addBitstream(Context context, ItemArchive itarch, Item item, File dir,
            ContentsEntry ce, boolean suppressUndo, boolean isTest)
            throws IOException, IllegalArgumentException, SQLException, AuthorizeException, ParseException
    {
        ItemUpdate.pr("contents entry for bitstream: " + ce.toString());
        File f = new File(dir, ce.filename);

        // get an input stream
        BufferedInputStream bis = new BufferedInputStream(new FileInputStream(f));

        Bitstream bs = null;
        String newBundleName = ce.bundlename;

        if (ce.bundlename == null) // should be required but default convention established
        {
            if (ce.filename.equals("license.txt"))
            {
                newBundleName = "LICENSE";
            }
            else
            {
                newBundleName = "ORIGINAL";
            }
        }
        ItemUpdate.pr("  Bitstream " + ce.filename + " to be added to bundle: " + newBundleName);

        if (!isTest)
        {
            // find the bundle
            Bundle[] bundles = item.getBundles(newBundleName);
            Bundle targetBundle = null;

            if (bundles.length < 1)
            {
                // not found, create a new one
                targetBundle = item.createBundle(newBundleName);
            }
            else
            {
                //verify bundle + name are not duplicates
                for (Bundle b : bundles)
                {
                    Bitstream[] bitstreams = b.getBitstreams();
                    for (Bitstream bsm : bitstreams)
                    {
                        if (bsm.getName().equals(ce.filename))
                        {
                            throw new IllegalArgumentException("Duplicate bundle + filename cannot be added: "
                                    + b.getName() + " + " + bsm.getName());
                        }
                    }
                }

                // select first bundle
                targetBundle = bundles[0];
            }

            bs = targetBundle.createBitstream(bis);
            bs.setName(ce.filename);

            // Identify the format
            // FIXME - guessing format guesses license.txt incorrectly as a text file format!
            BitstreamFormat fmt = FormatIdentifier.guessFormat(context, bs);
            bs.setFormat(fmt);

            if (ce.description != null)
            {
                bs.setDescription(ce.description);
            }

            if ((ce.permissionsActionId != -1) && (ce.permissionsGroupName != null))
            {
                Group group = Group.findByName(context, ce.permissionsGroupName);

                if (group != null)
                {
                    AuthorizeManager.removeAllPolicies(context, bs); // remove the default policy
                    ResourcePolicy rp = ResourcePolicy.create(context);
                    rp.setResource(bs);
                    rp.setAction(ce.permissionsActionId);
                    rp.setGroup(group);
                    rp.update();
                }
            }

            if (alterProvenance && !targetBundle.getName().equals("THUMBNAIL")
                    && !targetBundle.getName().equals("TEXT"))
            {
                DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");

                String append = "Bitstream added on " + DCDate.getCurrent() + " : "
                        + InstallItem.getBitstreamProvenanceMessage(item);
                MetadataUtilities.appendMetadata(item, dtom, false, append);
            }

            //update after all changes are applied, even metadata ones
            bs.update();

            if (!suppressUndo)
            {
                itarch.addUndoDeleteContents(bs.getID());
            }
        }
    }

}
@@ -1,120 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.sql.SQLException;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.DCValue;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.core.Context;

/**
 * Action to add metadata to item
 */
public class AddMetadataAction extends UpdateMetadataAction {

    /**
     * Adds metadata specified in the source archive
     *
     * @param context
     * @param itarch
     * @param isTest
     * @param suppressUndo
     * @throws AuthorizeException
     * @throws SQLException
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest,
            boolean suppressUndo) throws AuthorizeException, SQLException
    {
        Item item = itarch.getItem();
        String dirname = itarch.getDirectoryName();

        for (DtoMetadata dtom : itarch.getMetadataFields())
        {
            for (String f : targetFields)
            {
                if (dtom.matches(f, false))
                {
                    // match against metadata for this field/value in repository
                    // qualifier must be strictly matched, possibly null
                    DCValue[] ardcv = null;
                    ardcv = item.getMetadata(dtom.schema, dtom.element, dtom.qualifier, Item.ANY);

                    boolean found = false;
                    for (DCValue dcv : ardcv)
                    {
                        if (dcv.value.equals(dtom.value))
                        {
                            found = true;
                            break;
                        }
                    }

                    if (found)
                    {
                        ItemUpdate.pr("Warning: No new metadata found to add to item " + dirname
                                + " for element " + f);
                    }
                    else
                    {
                        if (isTest)
                        {
                            ItemUpdate.pr("Metadata to add: " + dtom.toString());
                            //validity tests that would occur in actual processing
                            // If we're just testing the import, check that the metadata field actually exists.
                            MetadataSchema foundSchema = MetadataSchema.find(context, dtom.schema);

                            if (foundSchema == null)
                            {
                                ItemUpdate.pr("ERROR: schema '"
                                        + dtom.schema + "' was not found in the registry; found on item " + dirname);
                            }
                            else
                            {
                                int schemaID = foundSchema.getSchemaID();
                                MetadataField foundField = MetadataField.findByElement(context, schemaID, dtom.element, dtom.qualifier);

                                if (foundField == null)
                                {
                                    ItemUpdate.pr("ERROR: Metadata field: '" + dtom.schema + "." + dtom.element + "."
                                            + dtom.qualifier + "' not found in registry; found on item " + dirname);
                                }
                            }
                        }
                        else
                        {
                            item.addMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language, dtom.value);
                            ItemUpdate.pr("Metadata added: " + dtom.toString());

                            if (!suppressUndo)
                            {
                                //itarch.addUndoDtom(dtom);
                                //ItemUpdate.pr("Undo metadata: " + dtom);

                                // add all as a replace record to be preceded by delete
                                for (DCValue dcval : ardcv)
                                {
                                    itarch.addUndoMetadataField(DtoMetadata.create(dcval.schema, dcval.element,
                                            dcval.qualifier, dcval.language, dcval.value));
                                }
                            }
                        }
                    }
                    break;  // don't need to check if this field matches any other target fields
                }
            }
        }
    }

}
@@ -1,61 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import org.dspace.content.Bitstream;

/**
 * Filter interface to be used by ItemUpdate
 * to determine which bitstreams in an Item
 * are acceptable for removal.
 */
public abstract class BitstreamFilter {

    protected Properties props = null;

    /**
     * The filter method
     *
     * @param bitstream
     * @return whether the bitstream matches the criteria
     * @throws BitstreamFilterException
     */
    public abstract boolean accept(Bitstream bitstream) throws BitstreamFilterException;

    /**
     * @param filepath - The complete path for the properties file
     * @throws IOException
     */
    public void initProperties(String filepath)
            throws IOException
    {
        props = new Properties();

        InputStream in = null;

        try
        {
            in = new FileInputStream(filepath);
            props.load(in);
        }
        finally
        {
            if (in != null)
            {
                in.close();
            }
        }
    }

}
@@ -1,66 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.sql.SQLException;

import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;

/**
 * BitstreamFilter implementation to filter by bundle name
 */
public class BitstreamFilterByBundleName extends BitstreamFilter {

    protected String bundleName;

    public BitstreamFilterByBundleName()
    {
        //empty
    }

    /**
     * Filter bitstream based on bundle name found in properties file
     *
     * @param bitstream
     * @throws BitstreamFilterException
     * @return whether bitstream is in bundle
     */
    public boolean accept(Bitstream bitstream)
            throws BitstreamFilterException
    {
        if (bundleName == null)
        {
            bundleName = props.getProperty("bundle");
            if (bundleName == null)
            {
                throw new BitstreamFilterException("Property 'bundle' not found.");
            }
        }

        try
        {
            Bundle[] bundles = bitstream.getBundles();
            for (Bundle b : bundles)
            {
                if (b.getName().equals(bundleName))
                {
                    return true;
                }
            }
        }
        catch (SQLException e)
        {
            throw new BitstreamFilterException(e);
        }
        return false;
    }

}
@@ -1,50 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.dspace.content.Bitstream;

/**
 * BitstreamFilter implementation to filter by filename pattern
 */
public class BitstreamFilterByFilename extends BitstreamFilter {

    private Pattern pattern;
    private String filenameRegex;

    public BitstreamFilterByFilename()
    {
        //empty
    }

    /**
     * Tests bitstream by matching the regular expression in the
     * properties against the bitstream name
     *
     * @return whether bitstream name matches the regular expression
     */
    public boolean accept(Bitstream bitstream) throws BitstreamFilterException
    {
        if (filenameRegex == null)
        {
            filenameRegex = props.getProperty("filename");
            if (filenameRegex == null)
            {
                throw new BitstreamFilterException("BitstreamFilter property 'filename' not found.");
            }
            pattern = Pattern.compile(filenameRegex);
        }

        Matcher m = pattern.matcher(bitstream.getName());
        return m.matches();
    }

}
@@ -1,29 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

/**
 * Exception class for BitstreamFilters
 */
public class BitstreamFilterException extends Exception
{
    private static final long serialVersionUID = 1L;

    public BitstreamFilterException() {}

    public BitstreamFilterException(String msg)
    {
        super(msg);
    }

    public BitstreamFilterException(Exception e)
    {
        super(e);
    }

}
@@ -1,153 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.text.ParseException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.dspace.core.Constants;

/**
 * Holds the elements of a line in the Contents Entry file
 *
 * Based on private methods in ItemImport
 *
 * Lacking a spec or full documentation for the file format,
 * it appears from the source code that the ordering of elements is not fixed
 *
 * e.g.:
 * 48217870-MIT.pdf\tbundle: bundlename\tpermissions: -r 'MIT Users'\tdescription: Full printable version (MIT only)
 * permissions: -[r|w] ['group name']
 * description: <the description of the file>
 */
public class ContentsEntry
{
    public static final String HDR_BUNDLE = "bundle:";
    public static final String HDR_PERMISSIONS = "permissions:";
    public static final String HDR_DESCRIPTION = "description:";

    public static final Pattern permissionsPattern = Pattern.compile("-([rw])\\s*'?([^']+)'?");

    final String filename;
    final String bundlename;
    final String permissionsGroupName;
    final int permissionsActionId;
    final String description;

    private ContentsEntry(String filename,
                          String bundlename,
                          int permissionsActionId,
                          String permissionsGroupName,
                          String description)
    {
        this.filename = filename;
        this.bundlename = bundlename;
        this.permissionsActionId = permissionsActionId;
        this.permissionsGroupName = permissionsGroupName;
        this.description = description;
    }

    /**
     * Factory method parses a line from the Contents Entry file
     *
     * @param line
     * @return the parsed ContentsEntry object
     * @throws ParseException
     */
    public static ContentsEntry parse(String line)
            throws ParseException
    {
        String[] ar = line.split("\t");
        ItemUpdate.pr("ce line split: " + ar.length);

        String[] arp = new String[4];
        arp[0] = ar[0];  //bitstream name doesn't have header and is always first

        String groupName = null;
        int actionId = -1;

        if (ar.length > 1)
        {
            for (int i = 1; i < ar.length; i++)
            {
                ItemUpdate.pr("ce " + i + " : " + ar[i]);
                if (ar[i].startsWith(HDR_BUNDLE))
                {
                    arp[1] = ar[i].substring(HDR_BUNDLE.length()).trim();
                }
                else if (ar[i].startsWith(HDR_PERMISSIONS))
                {
                    arp[2] = ar[i].substring(HDR_PERMISSIONS.length()).trim();

                    // parse into actionId and group name
                    Matcher m = permissionsPattern.matcher(arp[2]);
                    if (m.matches())
                    {
                        String action = m.group(1);
                        if (action.equals("r"))
                        {
                            actionId = Constants.READ;
                        }
                        else if (action.equals("w"))
                        {
                            actionId = Constants.WRITE;
                        }

                        groupName = m.group(2).trim();
                    }
                }
                else if (ar[i].startsWith(HDR_DESCRIPTION))
                {
                    arp[3] = ar[i].substring(HDR_DESCRIPTION.length()).trim();
                }
                else
                {
                    throw new ParseException("Unknown text in contents file: " + ar[i], 0);
                }
            }
        }
        return new ContentsEntry(arp[0], arp[1], actionId, groupName, arp[3]);
    }

    public String toString()
    {
        StringBuilder sb = new StringBuilder(filename);
        if (bundlename != null)
        {
            sb.append(HDR_BUNDLE).append(" ").append(bundlename);
        }

        if (permissionsGroupName != null)
        {
            sb.append(HDR_PERMISSIONS);
            if (permissionsActionId == Constants.READ)
            {
                sb.append(" -r ");
            }
            else if (permissionsActionId == Constants.WRITE)
            {
                sb.append(" -w ");
            }
            sb.append(permissionsGroupName);
        }

        if (description != null)
        {
            sb.append(HDR_DESCRIPTION).append(" ").append(description);
        }

        return sb.toString();
    }

}
@@ -1,114 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.File;
import java.io.IOException;
import java.sql.SQLException;
import java.text.ParseException;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Action to delete bitstreams
 *
 * Undo not supported for this UpdateAction
 *
 * Derivatives of the bitstream to be deleted are not also deleted
 */
public class DeleteBitstreamsAction extends UpdateBitstreamsAction
{
    /**
     * Delete bitstream from item
     *
     * @param context
     * @param itarch
     * @param isTest
     * @param suppressUndo
     * @throws IllegalArgumentException
     * @throws ParseException
     * @throws IOException
     * @throws AuthorizeException
     * @throws SQLException
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest,
            boolean suppressUndo) throws IllegalArgumentException, IOException,
            SQLException, AuthorizeException, ParseException
    {
        File f = new File(itarch.getDirectory(), ItemUpdate.DELETE_CONTENTS_FILE);
        if (!f.exists())
        {
            ItemUpdate.pr("Warning: Delete_contents file for item " + itarch.getDirectoryName() + " not found.");
        }
        else
        {
            List<Integer> list = MetadataUtilities.readDeleteContentsFile(f);
            if (list.isEmpty())
            {
                ItemUpdate.pr("Warning: empty delete_contents file for item " + itarch.getDirectoryName());
            }
            else
            {
                for (int id : list)
                {
                    try
                    {
                        Bitstream bs = Bitstream.find(context, id);
                        if (bs == null)
                        {
                            ItemUpdate.pr("Bitstream not found by id: " + id);
                        }
                        else
                        {
                            Bundle[] bundles = bs.getBundles();
                            for (Bundle b : bundles)
                            {
                                if (isTest)
                                {
                                    ItemUpdate.pr("Delete bitstream with id = " + id);
                                }
                                else
                                {
                                    b.removeBitstream(bs);
                                    ItemUpdate.pr("Deleted bitstream with id = " + id);
                                }
                            }

                            if (alterProvenance)
                            {
                                DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");

                                String append = "Bitstream " + bs.getName() + " deleted on " + DCDate.getCurrent() + "; ";
                                Item item = bundles[0].getItems()[0];
                                ItemUpdate.pr("Append provenance with: " + append);

                                if (!isTest)
                                {
                                    MetadataUtilities.appendMetadata(item, dtom, false, append);
                                }
                            }
                        }
                    }
                    catch (SQLException e)
                    {
                        ItemUpdate.pr("Error finding bitstream from id: " + id + " : " + e.toString());
                    }
                }
            }
        }
    }

}
@@ -1,130 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.IOException;
import java.sql.SQLException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Action to delete bitstreams using a specified filter implementing BitstreamFilter
 * Derivatives for the target bitstreams are not deleted.
 *
 * The dc.description.provenance field is amended to reflect the deletions
 *
 * Note: Multiple filters are impractical if trying to manage multiple properties files
 * in a commandline environment
 */
public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {

    private BitstreamFilter filter;

    /**
     * Set filter
     *
     * @param filter
     */
    public void setBitstreamFilter(BitstreamFilter filter)
    {
        this.filter = filter;
    }

    /**
     * Get filter
     * @return filter
     */
    public BitstreamFilter getBitstreamFilter()
    {
        return filter;
    }

    /**
     * Delete bitstream
     *
     * @param context
     * @param itarch
     * @param isTest
     * @param suppressUndo
     * @throws IllegalArgumentException
     * @throws ParseException
     * @throws IOException
     * @throws AuthorizeException
     * @throws SQLException
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest,
            boolean suppressUndo) throws AuthorizeException,
            BitstreamFilterException, IOException, ParseException, SQLException
    {
        List<String> deleted = new ArrayList<String>();

        Item item = itarch.getItem();
        Bundle[] bundles = item.getBundles();

        for (Bundle b : bundles)
        {
            Bitstream[] bitstreams = b.getBitstreams();
            String bundleName = b.getName();

            for (Bitstream bs : bitstreams)
            {
                if (filter.accept(bs))
                {
                    if (isTest)
                    {
                        ItemUpdate.pr("Delete from bundle " + bundleName + " bitstream " + bs.getName()
                                + " with id = " + bs.getID());
                    }
                    else
                    {
                        //provenance is not maintained for derivative bitstreams
                        if (!bundleName.equals("THUMBNAIL") && !bundleName.equals("TEXT"))
                        {
                            deleted.add(bs.getName());
                        }
                        b.removeBitstream(bs);
                        ItemUpdate.pr("Deleted " + bundleName + " bitstream " + bs.getName()
                                + " with id = " + bs.getID());
                    }
                }
            }
        }

        if (alterProvenance && !deleted.isEmpty())
        {
            StringBuilder sb = new StringBuilder(" Bitstreams deleted on ");
            sb.append(DCDate.getCurrent()).append(": ");

            for (String s : deleted)
            {
                sb.append(s).append(", ");
            }

            DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");

            ItemUpdate.pr("Append provenance with: " + sb.toString());

            if (!isTest)
            {
                MetadataUtilities.appendMetadata(item, dtom, false, sb.toString());
            }
        }
    }

}
@@ -1,64 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.text.ParseException;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.DCValue;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Action to delete metadata
 */
public class DeleteMetadataAction extends UpdateMetadataAction {

    /**
     * Delete metadata from item
     *
     * @param context
     * @param itarch
     * @param isTest
     * @param suppressUndo
     * @throws ParseException
     * @throws AuthorizeException
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest,
            boolean suppressUndo) throws AuthorizeException, ParseException
    {
        Item item = itarch.getItem();
        for (String f : targetFields)
        {
            DtoMetadata dummy = DtoMetadata.create(f, Item.ANY, "");
            DCValue[] ardcv = item.getMetadata(f);

            ItemUpdate.pr("Metadata to be deleted: ");
            for (DCValue dcv : ardcv)
            {
                ItemUpdate.pr("  " + MetadataUtilities.getDCValueString(dcv));
            }

            if (!isTest)
            {
                if (!suppressUndo)
                {
                    for (DCValue dcv : ardcv)
                    {
                        itarch.addUndoMetadataField(DtoMetadata.create(dcv.schema, dcv.element,
                                dcv.qualifier, dcv.language, dcv.value));
                    }
                }

                item.clearMetadata(dummy.schema, dummy.element, dummy.qualifier, Item.ANY);
            }
        }
    }
}
@@ -1,24 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.util.Properties;

/**
 * Bitstream filter to delete from TEXT bundle
 *
 */
public class DerivativeTextBitstreamFilter extends BitstreamFilterByBundleName {

    public DerivativeTextBitstreamFilter()
    {
        props = new Properties();
        props.setProperty("bundle", "TEXT");
    }

}
@@ -1,156 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.text.ParseException;
import org.dspace.content.Item;

/**
 * A data transfer object class enhancement of org.dspace.content.DCValue, which is deprecated
 * Name intended to not conflict with DSpace API classes for similar concepts but not usable in this context
 *
 * Adds some utility methods
 *
 * Really not at all general enough but supports Dublin Core and the compound form notation <schema>.<element>[.<qualifier>]
 *
 * Does not support wildcard for qualifier
 *
 */
class DtoMetadata
{
    final String schema;
    final String element;
    final String qualifier;
    final String language;
    final String value;

    private DtoMetadata(String schema, String element, String qualifier, String language, String value)
    {
        this.schema = schema;
        this.element = element;
        this.qualifier = qualifier;
        this.language = language;
        this.value = value;
    }

    /**
     * Factory method
     *
     * @param schema not null, not empty - 'dc' is the standard case
     * @param element not null, not empty
     * @param qualifier null; don't allow empty string or * indicating 'any'
     * @param language null or empty
     * @param value
     * @return DtoMetadata object
     */
    public static DtoMetadata create(String schema,
            String element,
            String qualifier,
            String language,
            String value)
        throws IllegalArgumentException
    {
        if ((qualifier != null) && (qualifier.equals(Item.ANY) || qualifier.equals("")))
        {
            throw new IllegalArgumentException("Invalid qualifier: " + qualifier);
        }
        return new DtoMetadata(schema, element, qualifier, language, value);
    }

    /**
     * Factory method to create metadata object
     *
     * @param compoundForm of the form <schema>.<element>[.<qualifier>]
     * @param language null or empty
     * @param value
     */
    public static DtoMetadata create(String compoundForm, String language, String value)
        throws ParseException, IllegalArgumentException
    {
        String[] ar = MetadataUtilities.parseCompoundForm(compoundForm);

        String qual = null;
        if (ar.length > 2)
        {
            qual = ar[2];
        }

        return create(ar[0], ar[1], qual, language, value);
    }

    /**
     * Determine if this metadata field matches the specified type:
     * schema.element or schema.element.qualifier
     *
     * @param compoundForm of the form <schema>.<element>[.<qualifier>|.*]
     * @param wildcard allow wildcards in compoundForm param
     * @return whether matches
     */
    public boolean matches(String compoundForm, boolean wildcard)
    {
        String[] ar = compoundForm.split("\\s*\\.\\s*"); //MetadataUtilities.parseCompoundForm(compoundForm);

        if ((ar.length < 2) || (ar.length > 3))
        {
            return false;
        }

        if (!this.schema.equals(ar[0]) || !this.element.equals(ar[1]))
        {
            return false;
        }

        if (ar.length == 2)
        {
            if (this.qualifier != null)
            {
                return false;
            }
        }

        if (ar.length == 3)
        {
            if (this.qualifier == null)
            {
                return false;
            }
            if (wildcard && ar[2].equals(Item.ANY))
            {
                return true;
            }
            if (!this.qualifier.equals(ar[2]))
            {
                return false;
            }
        }
        return true;
    }

    public String toString()
    {
        String s = "\tSchema: " + schema + " Element: " + element;
        if (qualifier != null)
        {
            s += " Qualifier: " + qualifier;
        }
        s += " Language: " + ((language == null) ? "[null]" : language);
        s += " Value: " + value;

        return s;
    }

    public String getValue()
    {
        return value;
    }

}
@@ -1,348 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.TransformerConfigurationException;

import org.apache.log4j.Logger;
import org.dspace.content.ItemIterator;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.core.Context;
import org.dspace.handle.HandleManager;

import org.w3c.dom.Document;


/**
 * Encapsulates the Item in the context of the DSpace Archive Format
 *
 */
public class ItemArchive {
    private static final Logger log = Logger.getLogger(ItemArchive.class);

    public static final String DUBLIN_CORE_XML = "dublin_core.xml";

    private static DocumentBuilder builder = null;
    private static Transformer transformer = null;

    private List<DtoMetadata> dtomList = null;
    private List<DtoMetadata> undoDtomList = new ArrayList<DtoMetadata>();

    private List<Integer> undoAddContents = new ArrayList<Integer>(); // for undo of add

    private Item item;
    private File dir; // directory name in source archive for this item
    private String dirname; //convenience

    //constructors
    private ItemArchive()
    {
        // nothing
    }

    /** factory method
     *
     * The minimal requirement on dublin_core.xml for this application
     * is the presence of dc.identifier.uri,
     * which must contain the handle for the item
     *
     * @param context - The DSpace context
     * @param dir - The directory File in the source archive
     * @param itemField - The metadata field in which the Item identifier is located;
     *                    if null, the default is the handle in the dc.identifier.uri field
     *
     */
    public static ItemArchive create(Context context, File dir, String itemField)
        throws Exception
    {
        ItemArchive itarch = new ItemArchive();
        itarch.dir = dir;
        itarch.dirname = dir.getName();
        InputStream is = null;
        try
        {
            is = new FileInputStream(new File(dir, DUBLIN_CORE_XML));
            itarch.dtomList = MetadataUtilities.loadDublinCore(getDocumentBuilder(), is);
        }
        finally
        {
            if (is != null)
            {
                is.close();
            }
        }
        ItemUpdate.pr("Loaded metadata with " + itarch.dtomList.size() + " fields");

        if (itemField == null)
        {
            itarch.item = itarch.itemFromHandleInput(context); // sets the item instance var and seeds the undo list
        }
        else
        {
            itarch.item = itarch.itemFromMetadataField(context, itemField);
        }

        if (itarch.item == null)
        {
            throw new Exception("Item not instantiated: " + itarch.dirname);
        }

        ItemUpdate.prv("item instantiated: " + itarch.item.getHandle());

        return itarch;
    }

    private static DocumentBuilder getDocumentBuilder()
        throws ParserConfigurationException
    {
        if (builder == null)
        {
            builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        }
        return builder;
    }

    private static Transformer getTransformer()
        throws TransformerConfigurationException
    {
        if (transformer == null)
        {
            transformer = TransformerFactory.newInstance().newTransformer();
        }
        return transformer;
    }

    /**
     * Getter for the DSpace item referenced in the archive
     * @return DSpace item
     */
    public Item getItem()
    {
        return item;
    }

    /**
     * Getter for directory in archive on disk
     * @return directory in archive
     */
    public File getDirectory()
    {
        return dir;
    }

    /**
     * Getter for directory name in archive
     * @return directory name in archive
     */
    public String getDirectoryName()
    {
        return dirname;
    }

    /**
     * Add metadata field to undo list
     * @param dtom
     */
    public void addUndoMetadataField(DtoMetadata dtom)
    {
        this.undoDtomList.add(dtom);
    }

    /**
     * Getter for list of metadata fields
     * @return list of metadata fields
     */
    public List<DtoMetadata> getMetadataFields()
    {
        return dtomList;
    }

    /**
     * Add bitstream id to delete contents file
     * @param bitstreamId
     */
    public void addUndoDeleteContents(int bitstreamId)
    {
        this.undoAddContents.add(bitstreamId);
    }


    /**
     * Obtain item from DSpace based on handle
     * This is the default implementation
     * that uses the dc.identifier.uri metadata field,
     * which contains the item handle as its value
     *
     */
    private Item itemFromHandleInput(Context context)
        throws SQLException, Exception
    {
        DtoMetadata dtom = getMetadataField("dc.identifier.uri");
        if (dtom == null)
        {
            throw new Exception("No dc.identifier.uri field found for handle");
        }

        this.addUndoMetadataField(dtom); //seed the undo list with the uri

        String uri = dtom.value;

        if (!uri.startsWith(ItemUpdate.HANDLE_PREFIX))
        {
            throw new Exception("dc.identifier.uri for item " + uri
                    + " does not begin with prefix: " + ItemUpdate.HANDLE_PREFIX);
        }

        String handle = uri.substring(ItemUpdate.HANDLE_PREFIX.length());

        DSpaceObject dso = HandleManager.resolveToObject(context, handle);
        if (dso instanceof Item)
        {
            item = (Item) dso;
        }
        else
        {
            ItemUpdate.pr("Warning: item not instantiated");
            throw new IllegalArgumentException("Item " + handle + " not instantiated.");
        }
        return item;
    }

    /**
     * Find and instantiate Item from the dublin_core.xml based
     * on the specified itemField for the item identifier
     *
     * @param context - the DSpace context
     * @param itemField - the compound form of the metadata element <schema>.<element>.<qualifier>
     * @throws SQLException
     * @throws Exception
     */
    private Item itemFromMetadataField(Context context, String itemField)
        throws SQLException, AuthorizeException, Exception
    {
        DtoMetadata dtom = getMetadataField(itemField);

        Item item = null;

        if (dtom == null)
        {
            throw new IllegalArgumentException("No field found for item identifier field: " + itemField);
        }
        ItemUpdate.prv("Metadata field to match for item: " + dtom.toString());

        this.addUndoMetadataField(dtom); //seed the undo list with the identifier field

        ItemIterator itr = Item.findByMetadataField(context, dtom.schema, dtom.element, dtom.qualifier, dtom.value);
        int count = 0;
        while (itr.hasNext())
        {
            item = itr.next();
            count++;
        }

        itr.close();

        ItemUpdate.prv("items matching = " + count);

        if (count != 1)
        {
            throw new Exception("" + count + " items matching item identifier: " + dtom.value);
        }

        return item;
    }

    private DtoMetadata getMetadataField(String compoundForm)
    {
        for (DtoMetadata dtom : dtomList)
        {
            if (dtom.matches(compoundForm, false))
            {
                return dtom;
            }
        }
        return null;
    }

    /**
     * Write undo directory and files to disk in archive format
     *
     * @param undoDir - the root directory of the undo archive
     */
    public void writeUndo(File undoDir)
        throws IOException, ParserConfigurationException, TransformerConfigurationException,
               TransformerException, FileNotFoundException
    {
        // create directory for item
        File dir = new File(undoDir, dirname);
        if (!dir.exists() && !dir.mkdir())
        {
            log.error("Unable to create undo directory");
        }

        OutputStream out = null;

        try
        {
            out = new FileOutputStream(new File(dir, "dublin_core.xml"));
            Document doc = MetadataUtilities.writeDublinCore(getDocumentBuilder(), undoDtomList);
            MetadataUtilities.writeDocument(doc, getTransformer(), out);

            // if undo has delete bitstream
            if (undoAddContents.size() > 0)
            {
                PrintWriter pw = null;
                try
                {
                    File f = new File(dir, ItemUpdate.DELETE_CONTENTS_FILE);
                    pw = new PrintWriter(new BufferedWriter(new FileWriter(f)));
                    for (Integer i : undoAddContents)
                    {
                        pw.println(i);
                    }
                }
                finally
                {
                    if (pw != null)
                    {
                        pw.close();
                    }
                }
            }
        }
        finally
        {
            if (out != null)
            {
                out.close();
            }
        }
    }

} //end class
@@ -1,609 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.BufferedWriter;
import java.io.File;
import java.io.FilenameFilter;
import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.dspace.content.Item;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;

/**
 *
 * Provides some batch editing capabilities for items in DSpace:
 *   Metadata fields - Add, Delete
 *   Bitstreams - Add, Delete
 *
 * The design has been for compatibility with ItemImporter
 * in the use of the DSpace archive format, which is used to
 * specify changes on a per-item basis. The directory names
 * corresponding to each item are arbitrary and will only be
 * used for logging purposes. The reference to the item is
 * from a required dc.identifier with the item handle to be
 * included in the dublin_core.xml (or similar metadata) file.
 *
 * Any combination of these actions is permitted in a single run of this class.
 * The order of actions is important when used in combination.
 * It is the responsibility of the calling class (here, ItemUpdate)
 * to register UpdateAction classes in the order in which they are
 * to be performed.
 *
 * It is unfortunate that so much code needs to be borrowed
 * from ItemImport as it is not reusable in private methods, etc.
 * Some of this has been placed into the MetadataUtilities class
 * for possible reuse elsewhere.
 *
 * @author W. Hays based on a conceptual design by R. Rodgers
 *
 */
public class ItemUpdate {

    public static final String SUPPRESS_UNDO_FILENAME = "suppress_undo";

    public static final String CONTENTS_FILE = "contents";
    public static final String DELETE_CONTENTS_FILE = "delete_contents";

    public static String HANDLE_PREFIX = null;
    public static final Map<String, String> filterAliases = new HashMap<String, String>();

    public static boolean verbose = false;

    static
    {
        filterAliases.put("ORIGINAL", "org.dspace.app.itemupdate.OriginalBitstreamFilter");
        filterAliases.put("ORIGINAL_AND_DERIVATIVES", "org.dspace.app.itemupdate.OriginalWithDerivativesBitstreamFilter");
        filterAliases.put("TEXT", "org.dspace.app.itemupdate.DerivativeTextBitstreamFilter");
        filterAliases.put("THUMBNAIL", "org.dspace.app.itemupdate.ThumbnailBitstreamFilter");
    }

    // File listing filter to check for folders
    static FilenameFilter directoryFilter = new FilenameFilter()
    {
        public boolean accept(File dir, String n)
        {
            File f = new File(dir.getAbsolutePath() + File.separatorChar + n);
            return f.isDirectory();
        }
    };

    // File listing filter to check for files (not directories)
    static FilenameFilter fileFilter = new FilenameFilter()
    {
        public boolean accept(File dir, String n)
        {
            File f = new File(dir.getAbsolutePath() + File.separatorChar + n);
            return (f.isFile());
        }
    };

    // instance variables
    private ActionManager actionMgr = new ActionManager();
    private List<String> undoActionList = new ArrayList<String>();
    private String eperson;

    /**
     *
     * @param argv
     */
    public static void main(String[] argv)
    {
        // create an options object and populate it
        CommandLineParser parser = new PosixParser();

        Options options = new Options();

        //processing basis for determining items
        //item-specific changes with metadata in source directory with dublin_core.xml files
        options.addOption("s", "source", true, "root directory of source dspace archive ");

        //actions on items
        options.addOption("a", "addmetadata", true, "add metadata specified for each item; multiples separated by semicolon ';'");
        options.addOption("d", "deletemetadata", true, "delete metadata specified for each item");

        options.addOption("A", "addbitstreams", false, "add bitstreams as specified for each item");

        // extra work to get optional argument
        Option delBitstreamOption = new Option("D", "deletebitstreams", true, "delete bitstreams as specified for each item");
        delBitstreamOption.setOptionalArg(true);
        delBitstreamOption.setArgName("BitstreamFilter");
        options.addOption(delBitstreamOption);

        //other params
        options.addOption("e", "eperson", true, "email of eperson doing the update");
        options.addOption("i", "itemfield", true, "optional metadata field containing the item identifier; default is dc.identifier.uri");
        options.addOption("F", "filter-properties", true, "filter class name; only for deleting bitstreams");
        options.addOption("v", "verbose", false, "verbose logging");

        //special run states
        options.addOption("t", "test", false, "test run - do not actually import items");
        options.addOption("P", "provenance", false, "suppress altering provenance field for bitstream changes");
        options.addOption("h", "help", false, "help");

        int status = 0;
        boolean isTest = false;
        boolean alterProvenance = true;
        String itemField = null;
        String metadataIndexName = null;

        Context context = null;
        ItemUpdate iu = new ItemUpdate();

        try
        {
            CommandLine line = parser.parse(options, argv);

            if (line.hasOption('h'))
            {
                HelpFormatter myhelp = new HelpFormatter();
                myhelp.printHelp("ItemUpdate", options);
                pr("");
                pr("Examples:");
                pr(" adding metadata: ItemUpdate -e jsmith@mit.edu -s sourcedir -a dc.contributor -a dc.subject ");
                pr(" deleting metadata: ItemUpdate -e jsmith@mit.edu -s sourcedir -d dc.description.other");
                pr(" adding bitstreams: ItemUpdate -e jsmith@mit.edu -s sourcedir -A -i dc.identifier");
                pr(" deleting bitstreams: ItemUpdate -e jsmith@mit.edu -s sourcedir -D ORIGINAL ");
                pr("");

                System.exit(0);
            }

            if (line.hasOption('v'))
            {
                verbose = true;
            }


            if (line.hasOption('P'))
            {
                alterProvenance = false;
                pr("Suppressing changes to Provenance field option");
            }

            iu.eperson = line.getOptionValue('e'); // db ID or email

            if (!line.hasOption('s')) // item specific changes from archive dir
            {
                pr("Missing source archive option");
                System.exit(1);
            }
            String sourcedir = line.getOptionValue('s');

            if (line.hasOption('t')) //test
            {
                isTest = true;
                pr("**Test Run** - not actually updating items.");

            }

            if (line.hasOption('i'))
            {
                itemField = line.getOptionValue('i');
            }

            if (line.hasOption('d'))
            {
                String[] targetFields = line.getOptionValues('d');

                DeleteMetadataAction delMetadataAction = (DeleteMetadataAction) iu.actionMgr.getUpdateAction(DeleteMetadataAction.class);
                delMetadataAction.addTargetFields(targetFields);

                //undo is an add
                for (String field : targetFields)
                {
                    iu.undoActionList.add(" -a " + field + " ");
                }

                pr("Delete metadata for fields: ");
                for (String s : targetFields)
                {
                    pr(" " + s);
                }
            }

            if (line.hasOption('a'))
            {
                String[] targetFields = line.getOptionValues('a');

                AddMetadataAction addMetadataAction = (AddMetadataAction) iu.actionMgr.getUpdateAction(AddMetadataAction.class);
                addMetadataAction.addTargetFields(targetFields);

                //undo is a delete followed by an add of a replace record for target fields
                for (String field : targetFields)
                {
                    iu.undoActionList.add(" -d " + field + " ");
                }

                for (String field : targetFields)
                {
                    iu.undoActionList.add(" -a " + field + " ");
                }

                pr("Add metadata for fields: ");
                for (String s : targetFields)
                {
                    pr(" " + s);
                }
            }

            if (line.hasOption('D')) // undo not supported
            {
                pr("Delete bitstreams ");

                String[] filterNames = line.getOptionValues('D');
                if ((filterNames != null) && (filterNames.length > 1))
                {
                    pr("Error: Only one filter can be used at a time.");
                    System.exit(1);
                }

                String filterName = line.getOptionValue('D');
                pr("Filter argument: " + filterName);

                if (filterName == null) // indicates using delete_contents files
                {
                    DeleteBitstreamsAction delAction = (DeleteBitstreamsAction) iu.actionMgr.getUpdateAction(DeleteBitstreamsAction.class);
                    delAction.setAlterProvenance(alterProvenance);
                }
                else
                {
                    // check if param is on ALIAS list
                    String filterClassname = filterAliases.get(filterName);

                    if (filterClassname == null)
                    {
                        filterClassname = filterName;
                    }

                    BitstreamFilter filter = null;

                    try
                    {
                        Class<?> cfilter = Class.forName(filterClassname);
                        pr("BitstreamFilter class to instantiate: " + cfilter.toString());

                        filter = (BitstreamFilter) cfilter.newInstance(); //unfortunate cast, an erasure consequence
                    }
                    catch (Exception e)
                    {
                        pr("Error: Failure instantiating bitstream filter class: " + filterClassname);
                        System.exit(1);
                    }

                    String filterPropertiesName = line.getOptionValue('F');
                    if (filterPropertiesName != null) //not always required
                    {
                        try
                        {
                            // TODO try multiple relative locations, e.g. source dir
                            if (!filterPropertiesName.startsWith("/"))
                            {
                                filterPropertiesName = sourcedir + File.separator + filterPropertiesName;
                            }

                            filter.initProperties(filterPropertiesName);
                        }
                        catch (Exception e)
                        {
                            pr("Error: Failure finding properties file for bitstream filter class: " + filterPropertiesName);
                            System.exit(1);
                        }
                    }

                    DeleteBitstreamsByFilterAction delAction =
                        (DeleteBitstreamsByFilterAction) iu.actionMgr.getUpdateAction(DeleteBitstreamsByFilterAction.class);
                    delAction.setAlterProvenance(alterProvenance);
                    delAction.setBitstreamFilter(filter);
                    //undo not supported
                }
            }

            if (line.hasOption('A'))
            {
                pr("Add bitstreams ");
                AddBitstreamsAction addAction = (AddBitstreamsAction) iu.actionMgr.getUpdateAction(AddBitstreamsAction.class);
                addAction.setAlterProvenance(alterProvenance);

                iu.undoActionList.add(" -D "); // delete_contents file will be written, no arg required
            }

            if (!iu.actionMgr.hasActions())
            {
                pr("Error - an action must be specified");
                System.exit(1);
            }
            else
            {
                pr("Actions to be performed: ");

                for (UpdateAction ua : iu.actionMgr)
                {
                    pr(" " + ua.getClass().getName());
                }
            }

            pr("ItemUpdate - initializing run on " + (new Date()).toString());

            context = new Context();
            iu.setEPerson(context, iu.eperson);
            context.setIgnoreAuthorization(true);

            HANDLE_PREFIX = ConfigurationManager.getProperty("handle.canonical.prefix");
            if (HANDLE_PREFIX == null || HANDLE_PREFIX.length() == 0)
            {
                HANDLE_PREFIX = "http://hdl.handle.net/";
            }

            iu.processArchive(context, sourcedir, itemField, metadataIndexName, alterProvenance, isTest);

            context.complete(); // complete all transactions
            context.setIgnoreAuthorization(false);
        }
        catch (Exception e)
        {
            if (context != null && context.isValid())
            {
                context.abort();
                context.setIgnoreAuthorization(false);
            }
            e.printStackTrace();
            pr(e.toString());
            status = 1;
        }

        if (isTest)
        {
            pr("***End of Test Run***");
        }
        else
        {
            pr("End.");

        }
        System.exit(status);
    }

    private void processArchive(Context context, String sourceDirPath, String itemField,
            String metadataIndexName, boolean alterProvenance, boolean isTest)
        throws Exception
    {
        // open and process the source directory
        File sourceDir = new File(sourceDirPath);

        if ((sourceDir == null) || !sourceDir.exists() || !sourceDir.isDirectory())
        {
            pr("Error, cannot open archive source directory " + sourceDirPath);
            throw new Exception("error with archive source directory " + sourceDirPath);
        }

        String[] dircontents = sourceDir.list(directoryFilter); //just the names, not the path
        Arrays.sort(dircontents);

        //Undo is suppressed to prevent undo of undo
        boolean suppressUndo = false;
        File fSuppressUndo = new File(sourceDir, SUPPRESS_UNDO_FILENAME);
        if (fSuppressUndo.exists())
        {
            suppressUndo = true;
        }

        File undoDir = null; //sibling directory of source archive

        if (!suppressUndo && !isTest)
        {
            undoDir = initUndoArchive(sourceDir);
        }

        int itemCount = 0;
        int successItemCount = 0;

        for (String dirname : dircontents)
        {
            itemCount++;
            pr("");
            pr("processing item " + dirname);

            try
            {
                ItemArchive itarch = ItemArchive.create(context, new File(sourceDir, dirname), itemField);

                for (UpdateAction action : actionMgr)
                {
                    pr("action: " + action.getClass().getName());
                    action.execute(context, itarch, isTest, suppressUndo);
                    if (!isTest && !suppressUndo)
                    {
                        itarch.writeUndo(undoDir);
                    }
                }
                if (!isTest)
                {
                    Item item = itarch.getItem();
                    item.update(); //need to update before commit
                    context.commit();
                    item.decache();
                }
                ItemUpdate.pr("Item " + dirname + " completed");
                successItemCount++;
            }
            catch (Exception e)
            {
                pr("Exception processing item " + dirname + ": " + e.toString());
            }
        }

        if (!suppressUndo && !isTest)
        {
            StringBuilder sb = new StringBuilder("dsrun org.dspace.app.itemupdate.ItemUpdate ");
            sb.append(" -e ").append(this.eperson);
            sb.append(" -s ").append(undoDir);

            if (itemField != null)
            {
                sb.append(" -i ").append(itemField);
            }

            if (!alterProvenance)
            {
                sb.append(" -P ");
            }
            if (isTest)
            {
                sb.append(" -t ");
            }

            for (String actionOption : undoActionList)
            {
                sb.append(actionOption);
            }

            PrintWriter pw = null;
            try
            {
                File cmdFile = new File(undoDir.getParent(), undoDir.getName() + "_command.sh");
                pw = new PrintWriter(new BufferedWriter(new FileWriter(cmdFile)));
                pw.println(sb.toString());
            }
            finally
            {
                if (pw != null)
                {
                    pw.close();
                }
            }
        }

        pr("");
        pr("Done processing.  Successful items: " + successItemCount + " of " + itemCount + " items in source archive");
        pr("");
    }


    /**
     *
     * To avoid overwriting the undo source tree on repeated processing,
     * sequence numbers are added and checked
     *
     * @param sourceDir - the original source directory
     * @return the directory of the undo archive
     * @throws FileNotFoundException
     * @throws IOException
     */
    private File initUndoArchive(File sourceDir)
        throws FileNotFoundException, IOException
    {
        File parentDir = sourceDir.getAbsoluteFile().getParentFile();
        if (parentDir == null)
        {
            throw new FileNotFoundException("Parent directory of archive directory not found; unable to write UndoArchive; no processing performed");
        }

        String sourceDirName = sourceDir.getName();
        int seqNo = 1;

        File undoDir = new File(parentDir, "undo_" + sourceDirName + "_" + seqNo);
        while (undoDir.exists())
        {
            undoDir = new File(parentDir, "undo_" + sourceDirName + "_" + ++seqNo); //increment
        }

        // create root directory
||||
if (!undoDir.mkdir())
|
||||
{
|
||||
pr("ERROR creating Undo Archive directory ");
|
||||
throw new IOException("ERROR creating Undo Archive directory ");
|
||||
}
|
||||
|
||||
//Undo is suppressed to prevent undo of undo
|
||||
File fSuppressUndo = new File(undoDir, ItemUpdate.SUPPRESS_UNDO_FILENAME);
|
||||
try
|
||||
{
|
||||
fSuppressUndo.createNewFile();
|
||||
}
|
||||
catch(IOException e)
|
||||
{
|
||||
pr("ERROR creating Suppress Undo File " + e.toString());
|
||||
throw e;
|
||||
}
|
||||
return undoDir;
|
||||
}
|
||||
|
||||
//private void write
|
||||
|
||||
private void setEPerson(Context context, String eperson)
|
||||
throws Exception
|
||||
{
|
||||
if (eperson == null)
|
||||
{
|
||||
pr("Error - an eperson to do the importing must be specified");
|
||||
pr(" (run with -h flag for details)");
|
||||
throw new Exception("EPerson not specified."); }
|
||||
|
||||
EPerson myEPerson = null;
|
||||
|
||||
if (eperson.indexOf('@') != -1)
|
||||
{
|
||||
// @ sign, must be an email
|
||||
myEPerson = EPerson.findByEmail(context, eperson);
|
||||
}
|
||||
else
|
||||
{
|
||||
myEPerson = EPerson.find(context, Integer.parseInt(eperson));
|
||||
}
|
||||
|
||||
if (myEPerson == null)
|
||||
{
|
||||
pr("Error, eperson cannot be found: " + eperson);
|
||||
throw new Exception("Invalid EPerson");
|
||||
}
|
||||
|
||||
context.setCurrentUser(myEPerson);
|
||||
}
|
||||
|
||||
/**
|
||||
* poor man's logging
|
||||
* As with ItemImport, API logging goes through log4j to the DSpace.log files
|
||||
* whereas the batch logging goes to the console to be captured there.
|
||||
* @param s
|
||||
*/
|
||||
static void pr(String s)
|
||||
{
|
||||
System.out.println(s);
|
||||
}
|
||||
|
||||
/**
|
||||
* print if verbose flag is set
|
||||
* @param s
|
||||
*/
|
||||
static void prv(String s)
|
||||
{
|
||||
if (verbose)
|
||||
{
|
||||
System.out.println(s);
|
||||
}
|
||||
}
|
||||
|
||||
} //end of class
|
||||
|
||||
@@ -1,532 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.SQLException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.Result;
import javax.xml.transform.Source;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerConfigurationException;
import javax.xml.transform.TransformerException;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import org.apache.commons.lang.StringUtils;
import org.apache.xpath.XPathAPI;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NamedNodeMap;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.DCValue;
import org.dspace.content.Item;
import org.dspace.content.MetadataSchema;
import org.dspace.core.ConfigurationManager;


/**
 * Miscellaneous methods for metadata handling that build on the API
 * and that might have general utility outside of their specific use
 * in ItemUpdate.
 *
 * The XML methods were based on those in ItemImport.
 */
public class MetadataUtilities {

    /**
     * Works around the Item API to delete a value-specific DCValue.
     * For a given element/qualifier/lang:
     * get all DCValues, clear (i.e. delete) all of them,
     * then add them back, minus the one to actually delete.
     *
     * @param item the item to modify
     * @param dtom the metadata field and value to delete
     * @param isLanguageStrict whether the language must match exactly
     * @return true if a metadata field was found with a matching value and was deleted
     */
    public static boolean deleteMetadataByValue(Item item, DtoMetadata dtom, boolean isLanguageStrict)
    {
        DCValue[] ar = null;

        if (isLanguageStrict)
        {   // get all for given type
            ar = item.getMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language);
        }
        else
        {
            ar = item.getMetadata(dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
        }

        boolean found = false;

        //build new set minus the one to delete
        List<String> vals = new ArrayList<String>();
        for (DCValue dcv : ar)
        {
            if (dcv.value.equals(dtom.value))
            {
                found = true;
            }
            else
            {
                vals.add(dcv.value);
            }
        }

        if (found)  //remove all for given type ??synchronize this block??
        {
            if (isLanguageStrict)
            {
                item.clearMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language);
            }
            else
            {
                item.clearMetadata(dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
            }

            item.addMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language, vals.toArray(new String[vals.size()]));
        }
        return found;
    }

    /**
     * Append text to the value of a metadata field on an item.
     *
     * @param item the item to modify
     * @param dtom the metadata field to append to
     * @param isLanguageStrict whether the language must match exactly
     * @param textToAppend the text to append
     * @throws IllegalArgumentException when the target metadata field is not found
     */
    public static void appendMetadata(Item item, DtoMetadata dtom, boolean isLanguageStrict,
            String textToAppend)
    throws IllegalArgumentException
    {
        DCValue[] ar = null;

        // get all values for given element/qualifier
        if (isLanguageStrict)
        {
            ar = item.getMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language);
        }
        else
        {
            ar = item.getMetadata(dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
        }

        if (ar.length == 0)
        {
            throw new IllegalArgumentException("Metadata to append to not found");
        }

        int idx = 0;  //index of field to change
        if (ar.length > 1)  //need to pick one, can't be sure it's the last one
        {
            // TODO maybe get highest id ?
        }

        //build the new set of values, appending to the one at idx
        List<String> vals = new ArrayList<String>();
        for (int i = 0; i < ar.length; i++)
        {
            if (i == idx)
            {
                vals.add(ar[i].value + textToAppend);
            }
            else
            {
                vals.add(ar[i].value);
            }
        }

        if (isLanguageStrict)
        {
            item.clearMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language);
        }
        else
        {
            item.clearMetadata(dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
        }

        item.addMetadata(dtom.schema, dtom.element, dtom.qualifier, dtom.language, vals.toArray(new String[vals.size()]));
    }

    /**
     * Modification of the method ItemImport.loadDublinCore,
     * recast as a factory method.
     *
     * @param docBuilder the DocumentBuilder used to parse the stream
     * @param is InputStream of dublin_core.xml
     * @return list of DtoMetadata representing the metadata fields relating to an Item
     * @throws SQLException
     * @throws IOException
     * @throws ParserConfigurationException
     * @throws SAXException
     * @throws TransformerException
     * @throws AuthorizeException
     */
    public static List<DtoMetadata> loadDublinCore(DocumentBuilder docBuilder, InputStream is)
    throws SQLException, IOException, ParserConfigurationException,
           SAXException, TransformerException, AuthorizeException
    {
        Document document = docBuilder.parse(is);

        List<DtoMetadata> dtomList = new ArrayList<DtoMetadata>();

        // Get the schema; for backward compatibility we will default to the
        // dublin core schema if the schema name is not available in the import file
        String schema = null;
        NodeList metadata = XPathAPI.selectNodeList(document, "/dublin_core");
        Node schemaAttr = metadata.item(0).getAttributes().getNamedItem("schema");
        if (schemaAttr == null)
        {
            schema = MetadataSchema.DC_SCHEMA;
        }
        else
        {
            schema = schemaAttr.getNodeValue();
        }

        // Get the nodes corresponding to values
        NodeList dcNodes = XPathAPI.selectNodeList(document, "/dublin_core/dcvalue");

        for (int i = 0; i < dcNodes.getLength(); i++)
        {
            Node n = dcNodes.item(i);
            String value = getStringValue(n);
            // compensate for an empty value, which won't display
            if (value == null)
            {
                value = "";
            }
            else
            {
                value = value.trim();
            }
            String element = getAttributeValue(n, "element");
            if (element != null)
            {
                element = element.trim();
            }
            String qualifier = getAttributeValue(n, "qualifier");
            if (qualifier != null)
            {
                qualifier = qualifier.trim();
            }
            String language = getAttributeValue(n, "language");
            if (language != null)
            {
                language = language.trim();
            }

            if ("none".equals(qualifier) || "".equals(qualifier))
            {
                qualifier = null;
            }

            // a goofy default, but consistent with DSpace treatment elsewhere
            if (language == null)
            {
                language = "en";
            }
            else if ("".equals(language))
            {
                language = ConfigurationManager.getProperty("default.language");
            }

            DtoMetadata dtom = DtoMetadata.create(schema, element, qualifier, language, value);
            ItemUpdate.pr(dtom.toString());
            dtomList.add(dtom);
        }
        return dtomList;
    }

    /**
     * Write dublin_core.xml.
     *
     * @param docBuilder the DocumentBuilder used to create the document
     * @param dtomList the metadata fields to write
     * @return xml document
     * @throws ParserConfigurationException
     * @throws TransformerConfigurationException
     * @throws TransformerException
     */
    public static Document writeDublinCore(DocumentBuilder docBuilder, List<DtoMetadata> dtomList)
    throws ParserConfigurationException, TransformerConfigurationException, TransformerException
    {
        Document doc = docBuilder.newDocument();
        Element root = doc.createElement("dublin_core");
        doc.appendChild(root);

        for (DtoMetadata dtom : dtomList)
        {
            Element mel = doc.createElement("dcvalue");
            mel.setAttribute("element", dtom.element);
            if (dtom.qualifier == null)
            {
                mel.setAttribute("qualifier", "none");
            }
            else
            {
                mel.setAttribute("qualifier", dtom.qualifier);
            }

            if (StringUtils.isEmpty(dtom.language))
            {
                mel.setAttribute("language", "en");
            }
            else
            {
                mel.setAttribute("language", dtom.language);
            }
            mel.setTextContent(dtom.value);
            root.appendChild(mel);
        }

        return doc;
    }

    /**
     * Write an xml document to an output stream.
     *
     * @param doc the document to write
     * @param transformer the Transformer to serialize with
     * @param out the target output stream
     * @throws IOException
     * @throws TransformerException
     */
    public static void writeDocument(Document doc, Transformer transformer, OutputStream out)
    throws IOException, TransformerException
    {
        Source src = new DOMSource(doc);
        Result dest = new StreamResult(out);
        transformer.transform(src, dest);
    }


    // XML utility methods

    /**
     * Look up an attribute of a DOM node.
     * @param n the node to inspect
     * @param name the attribute name
     * @return the attribute value, or "" if the attribute is not present
     */
    private static String getAttributeValue(Node n, String name)
    {
        NamedNodeMap nm = n.getAttributes();

        for (int i = 0; i < nm.getLength(); i++)
        {
            Node node = nm.item(i);

            if (name.equals(node.getNodeName()))
            {
                return node.getNodeValue();
            }
        }

        return "";
    }

    /**
     * Return the String value of a Node.
     * @param node the node to inspect
     * @return the value of the node's first text child, if any, else the node value
     */
    private static String getStringValue(Node node)
    {
        String value = node.getNodeValue();

        if (node.hasChildNodes())
        {
            Node first = node.getFirstChild();

            if (first.getNodeType() == Node.TEXT_NODE)
            {
                return first.getNodeValue();
            }
        }

        return value;
    }

    /**
     * Rewrite of ItemImport's functionality,
     * but just the parsing of the contents file, not the processing of its elements.
     *
     * @param f the contents file to parse
     * @return list of parsed ContentsEntry objects
     */
    public static List<ContentsEntry> readContentsFile(File f)
    throws FileNotFoundException, IOException, ParseException
    {
        List<ContentsEntry> list = new ArrayList<ContentsEntry>();

        BufferedReader in = null;

        try
        {
            in = new BufferedReader(new FileReader(f));
            String line = null;

            while ((line = in.readLine()) != null)
            {
                line = line.trim();
                if ("".equals(line))
                {
                    continue;
                }
                ItemUpdate.pr("Contents entry: " + line);
                list.add(ContentsEntry.parse(line));
            }
        }
        finally
        {
            if (in != null)  //the reader may never have been opened
            {
                try
                {
                    in.close();
                }
                catch (IOException e)
                {
                    //skip
                }
            }
        }

        return list;
    }

    /**
     * Read a delete_contents file: one bitstream id per line.
     *
     * @param f the delete_contents file to parse
     * @return list of bitstream ids to delete
     * @throws FileNotFoundException
     * @throws IOException
     */
    public static List<Integer> readDeleteContentsFile(File f)
    throws FileNotFoundException, IOException
    {
        List<Integer> list = new ArrayList<Integer>();

        BufferedReader in = null;

        try
        {
            in = new BufferedReader(new FileReader(f));
            String line = null;

            while ((line = in.readLine()) != null)
            {
                line = line.trim();
                if ("".equals(line))
                {
                    continue;
                }

                try
                {
                    int n = Integer.parseInt(line);
                    list.add(n);
                }
                catch (NumberFormatException e)
                {
                    ItemUpdate.pr("Error reading delete contents line:" + e.toString());
                }
            }
        }
        finally
        {
            if (in != null)  //the reader may never have been opened
            {
                try
                {
                    in.close();
                }
                catch (IOException e)
                {
                    //skip
                }
            }
        }

        return list;
    }

    /**
     * Get a display form of a DCValue.
     *
     * @param dcv the value to display
     * @return string displaying the elements of the DCValue
     */
    public static String getDCValueString(DCValue dcv)
    {
        return "schema: " + dcv.schema + "; element: " + dcv.element + "; qualifier: " + dcv.qualifier +
               "; language: " + dcv.language + "; value: " + dcv.value;
    }

    /**
     * Build the compound form of a metadata element.
     *
     * @param schema the schema name
     * @param element the element name
     * @param qualifier the qualifier, or null for an unqualified element
     * @return a String representation of the two- or three-part form of a metadata element,
     *         e.g. dc.identifier.uri
     */
    public static String getCompoundForm(String schema, String element, String qualifier)
    {
        StringBuilder sb = new StringBuilder();
        sb.append(schema).append(".").append(element);

        if (qualifier != null)
        {
            sb.append(".").append(qualifier);
        }
        return sb.toString();
    }

    /**
     * Parses a metadata field given in the form <schema>.<element>[.<qualifier>|.*];
     * checks for the correct number of elements (2 or 3) and for empty strings.
     *
     * @param compoundForm the compound field name to parse
     * @return String array of the 2 or 3 parts
     * @throws ParseException if validity checks fail
     */
    public static String[] parseCompoundForm(String compoundForm)
    throws ParseException
    {
        String[] ar = compoundForm.split("\\s*\\.\\s*");  //trim ends

        if ("".equals(ar[0]))
        {
            throw new ParseException("schema is empty string: " + compoundForm, 0);
        }

        if ((ar.length < 2) || (ar.length > 3) || "".equals(ar[1]))
        {
            throw new ParseException("element is malformed or empty string: " + compoundForm, 0);
        }

        return ar;
    }

}
@@ -1,55 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.sql.SQLException;

import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;

/**
 * Bitstream filter that selects bitstreams in the ORIGINAL bundle.
 */
public class OriginalBitstreamFilter extends BitstreamFilterByBundleName
{
    public OriginalBitstreamFilter()
    {
        //empty
    }

    /**
     * Tests a bitstream for containment in an ORIGINAL bundle.
     *
     * @param bitstream the bitstream to test
     * @return true if the bitstream is in the ORIGINAL bundle
     * @throws BitstreamFilterException
     */
    public boolean accept(Bitstream bitstream)
        throws BitstreamFilterException
    {
        try
        {
            Bundle[] bundles = bitstream.getBundles();
            for (Bundle b : bundles)
            {
                if (b.getName().equals("ORIGINAL"))
                {
                    return true;
                }
            }
        }
        catch (SQLException e)
        {
            throw new BitstreamFilterException(e);
        }
        return false;
    }

}
@@ -1,59 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.sql.SQLException;

import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;

/**
 * Filter all bitstreams in the ORIGINAL bundle,
 * and also all derivative bitstreams, i.e.
 * all bitstreams in the TEXT and THUMBNAIL bundles.
 */
public class OriginalWithDerivativesBitstreamFilter extends BitstreamFilter
{
    private String[] bundlesToEmpty = { "ORIGINAL", "TEXT", "THUMBNAIL" };

    public OriginalWithDerivativesBitstreamFilter()
    {
        //empty
    }

    /**
     * Tests a bitstream for membership in the specified bundles (ORIGINAL, TEXT, THUMBNAIL).
     *
     * @param bitstream the bitstream to test
     * @return true if the bitstream is in one of the specified bundles
     * @throws BitstreamFilterException
     */
    public boolean accept(Bitstream bitstream)
        throws BitstreamFilterException
    {
        try
        {
            Bundle[] bundles = bitstream.getBundles();
            for (Bundle b : bundles)
            {
                for (String bn : bundlesToEmpty)
                {
                    if (b.getName().equals(bn))
                    {
                        return true;
                    }
                }
            }
        }
        catch (SQLException e)
        {
            throw new BitstreamFilterException(e);
        }
        return false;
    }

}
@@ -1,24 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.util.Properties;

/**
 * Bitstream filter targeting the THUMBNAIL bundle.
 */
public class ThumbnailBitstreamFilter extends BitstreamFilterByBundleName {

    public ThumbnailBitstreamFilter()
    {
        props = new Properties();
        props.setProperty("bundle", "THUMBNAIL");
    }

}
@@ -1,30 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import org.dspace.core.Context;

/**
 * Interface for actions that update an item.
 */
public interface UpdateAction
{
    /**
     * Action to update an item.
     *
     * @param context DSpace context
     * @param itarch the item archive to act on
     * @param isTest true when this is a test run that makes no changes
     * @param suppressUndo true when writing of undo information is suppressed
     * @throws Exception
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest, boolean suppressUndo)
        throws Exception;

}
@@ -1,39 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

/**
 * Base class for Bitstream actions.
 */
public abstract class UpdateBitstreamsAction implements UpdateAction {

    protected boolean alterProvenance = true;

    /**
     * Set whether the dc.description.provenance field may
     * be changed as a result of Bitstream changes by ItemUpdate.
     * @param alterProvenance
     */
    public void setAlterProvenance(boolean alterProvenance)
    {
        this.alterProvenance = alterProvenance;
    }

    /**
     * @return boolean value indicating whether the dc.description.provenance field may
     *         be changed as a result of Bitstream changes by ItemUpdate
     */
    public boolean getAlterProvenance()
    {
        return alterProvenance;
    }

}
@@ -1,70 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.itemupdate;

import java.util.HashSet;
import java.util.Set;

/**
 * Abstract base class for metadata actions.
 * It maintains a collection of the target metadata fields,
 * expressed as strings in the compound notation ( <schema>.<element>.<qualifier> ),
 * on which to apply the action when the execute method is called.
 *
 * Implemented as a Set to avoid problems with duplicates.
 */
public abstract class UpdateMetadataAction implements UpdateAction {

    protected Set<String> targetFields = new HashSet<String>();

    /**
     * Get target fields.
     *
     * @return set of fields to update
     */
    public Set<String> getTargetFields() {
        return targetFields;
    }

    /**
     * Add a collection of target fields to update.
     *
     * @param targetFields
     */
    public void addTargetFields(Set<String> targetFields) {
        this.targetFields.addAll(targetFields);
    }

    /**
     * Add an array of target fields to update.
     * @param targetFields
     */
    public void addTargetFields(String[] targetFields) {
        for (String tf : targetFields)
        {
            this.targetFields.add(tf);
        }
    }

    /**
     * Add a single field to update.
     *
     * @param targetField
     */
    public void addTargetField(String targetField) {
        this.targetFields.add(targetField);
    }

}
@@ -1,273 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.launcher;

import java.lang.reflect.Method;
import java.util.List;

import org.dspace.core.ConfigurationManager;
import org.dspace.servicemanager.DSpaceKernelImpl;
import org.dspace.servicemanager.DSpaceKernelInit;
import org.dspace.services.RequestService;
import org.jdom.Document;
import org.jdom.Element;
import org.jdom.input.SAXBuilder;

/**
 * A DSpace script launcher.
 *
 * @author Stuart Lewis
 * @author Mark Diggory
 */
public class ScriptLauncher
{
    /** The service manager kernel */
    private static transient DSpaceKernelImpl kernelImpl;

    /**
     * Execute the DSpace script launcher.
     *
     * @param args any parameters required to be passed to the scripts it executes
     */
    public static void main(String[] args)
    {
        // Check that there is at least one argument
        if (args.length < 1)
        {
            System.err.println("You must provide at least one command argument");
            display();
            System.exit(1);
        }

        // Initialise the service manager kernel
        try
        {
            kernelImpl = DSpaceKernelInit.getKernel(null);
            if (!kernelImpl.isRunning())
            {
                kernelImpl.start(ConfigurationManager.getProperty("dspace.dir"));
            }
        }
        catch (Exception e)
        {
            // Failed to start, so destroy the kernel, then log and rethrow
            try
            {
                kernelImpl.destroy();
            }
            catch (Exception e1)
            {
                // Nothing to do
            }
            String message = "Failure during kernel init: " + e.getMessage();
            System.err.println(message + ":" + e);
            throw new IllegalStateException(message, e);
        }

        // Parse the configuration file looking for the command entered
        Document doc = getConfig();
        String request = args[0];
        Element root = doc.getRootElement();
        List<Element> commands = root.getChildren("command");
        for (Element command : commands)
        {
            if (request.equalsIgnoreCase(command.getChild("name").getValue()))
            {
                // Run each step
                List<Element> steps = command.getChildren("step");
                for (Element step : steps)
                {
                    // Instantiate the class
                    Class<?> target = null;

                    // Is it the special case 'dsrun' where the user provides the class name?
                    String className;
                    if ("dsrun".equals(request))
                    {
                        if (args.length < 2)
                        {
                            System.err.println("Error in launcher.xml: Missing class name");
                            System.exit(1);
                        }
                        className = args[1];
                    }
                    else
                    {
                        className = step.getChild("class").getValue();
                    }
                    try
                    {
                        target = Class.forName(className,
                                               true,
                                               Thread.currentThread().getContextClassLoader());
                    }
                    catch (ClassNotFoundException e)
                    {
                        System.err.println("Error in launcher.xml: Invalid class name: " + className);
                        System.exit(1);
                    }

                    // Strip the leading argument from the args, and add the arguments
                    // Set <passargs>false</passargs> if the arguments should not be passed on
                    String[] useargs = args.clone();
                    Class<?>[] argTypes = {useargs.getClass()};
                    boolean passargs = true;
                    if ((step.getAttribute("passuserargs") != null) &&
                        ("false".equalsIgnoreCase(step.getAttribute("passuserargs").getValue())))
                    {
                        passargs = false;
                    }
                    if ((args.length == 1) || (("dsrun".equals(request)) && (args.length == 2)) || (!passargs))
                    {
                        useargs = new String[0];
                    }
                    else
                    {
                        // The number of arguments to ignore
                        // If dsrun is the command, ignore the next, as it is the class name not an arg
                        int x = 1;
                        if ("dsrun".equals(request))
                        {
                            x = 2;
                        }
                        String[] argsnew = new String[useargs.length - x];
                        for (int i = x; i < useargs.length; i++)
                        {
                            argsnew[i - x] = useargs[i];
                        }
                        useargs = argsnew;
                    }

                    // Add any extra properties
                    List<Element> bits = step.getChildren("argument");
                    if (step.getChild("argument") != null)
                    {
                        String[] argsnew = new String[useargs.length + bits.size()];
                        int i = 0;
                        for (Element arg : bits)
                        {
                            argsnew[i++] = arg.getValue();
                        }
                        for (; i < bits.size() + useargs.length; i++)
                        {
                            argsnew[i] = useargs[i - bits.size()];
                        }
                        useargs = argsnew;
                    }

                    // Establish the request service startup
                    RequestService requestService = kernelImpl.getServiceManager().getServiceByName(RequestService.class.getName(), RequestService.class);
                    if (requestService == null)
                    {
                        throw new IllegalStateException("Could not get the DSpace RequestService to start the request transaction");
                    }

                    // Establish a request related to the current session
                    // that will trigger the various request listeners
                    requestService.startRequest();

                    // Run the main() method
                    try
                    {
                        Object[] arguments = {useargs};

                        // Useful for debugging, so left in the code...
                        /**System.out.print("About to execute: " + className);
                        for (String param : useargs)
                        {
                            System.out.print(" " + param);
                        }
                        System.out.println("");**/

                        Method main = target.getMethod("main", argTypes);
                        main.invoke(null, arguments);

                        // ensure we close out the request (happy request)
                        requestService.endRequest(null);
                    }
                    catch (Exception e)
                    {
                        // Failure occurred in the request so we destroy it
                        requestService.endRequest(e);

                        if (kernelImpl != null)
                        {
                            kernelImpl.destroy();
                            kernelImpl = null;
                        }

                        // Exceptions from the script are reported as a 'cause';
                        // fall back to the exception itself when no cause is set
                        Throwable cause = (e.getCause() != null) ? e.getCause() : e;
                        System.err.println("Exception: " + cause.getMessage());
                        cause.printStackTrace();
                        System.exit(1);
                    }

                }

                // Destroy the service kernel
                if (kernelImpl != null)
                {
                    kernelImpl.destroy();
                    kernelImpl = null;
                }

                // Everything completed OK
                System.exit(0);
            }
        }

        // Destroy the service kernel if it is still alive
        if (kernelImpl != null)
        {
            kernelImpl.destroy();
            kernelImpl = null;
        }

        // The command wasn't found
        System.err.println("Command not found: " + args[0]);
        display();
        System.exit(1);
    }

    /**
     * Load the launcher configuration file
     *
     * @return The XML configuration file Document
|
||||
*/
|
||||
private static Document getConfig()
|
||||
{
|
||||
// Load the launcher configuration file
|
||||
String config = ConfigurationManager.getProperty("dspace.dir") +
|
||||
System.getProperty("file.separator") + "config" +
|
||||
System.getProperty("file.separator") + "launcher.xml";
|
||||
SAXBuilder saxBuilder = new SAXBuilder();
|
||||
Document doc = null;
|
||||
try
|
||||
{
|
||||
doc = saxBuilder.build(config);
|
||||
}
|
||||
catch (Exception e)
|
||||
{
|
||||
System.err.println("Unable to load the launcher configuration file: [dspace]/config/launcher.xml");
|
||||
System.err.println(e.getMessage());
|
||||
System.exit(1);
|
||||
}
|
||||
return doc;
|
||||
}
|
||||
|
||||
/**
|
||||
* Display the commands that the current launcher config file knows about
|
||||
*/
|
||||
private static void display()
|
||||
{
|
||||
Document doc = getConfig();
|
||||
List<Element> commands = doc.getRootElement().getChildren("command");
|
||||
System.out.println("Usage: dspace [command-name] {parameters}");
|
||||
for (Element command : commands)
|
||||
{
|
||||
System.out.println(" - " + command.getChild("name").getValue() +
|
||||
": " + command.getChild("description").getValue());
|
||||
}
|
||||
}
|
||||
}
|
||||
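The launcher's argument-stripping loop above (drop the command name, and for `dsrun` also the class name) can be expressed more compactly with `Arrays.copyOfRange`. A minimal stand-alone sketch; `ArgStrip` and `stripLeading` are hypothetical names for illustration, not part of ScriptLauncher:

```java
import java.util.Arrays;

public class ArgStrip {
    // Mirrors the launcher's logic: drop the leading command name from the
    // user-supplied args, plus the class name when the command is "dsrun".
    static String[] stripLeading(String[] args, String request) {
        int x = "dsrun".equals(request) ? 2 : 1; // number of leading args to ignore
        if (args.length <= x) {
            return new String[0];
        }
        return Arrays.copyOfRange(args, x, args.length);
    }

    public static void main(String[] argv) {
        String[] out = stripLeading(
                new String[] {"dsrun", "org.example.Tool", "-v", "input.txt"}, "dsrun");
        System.out.println(Arrays.toString(out)); // [-v, input.txt]
    }
}
```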
@@ -1,103 +0,0 @@

/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.InputStream;

import org.dspace.content.Bitstream;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Public interface for any class which transforms or converts content/bitstreams
 * from one format to another. This interface should be implemented by any class
 * which defines a "filter" to be run by the MediaFilterManager.
 */
public interface FormatFilter
{
    /**
     * Get a filename for a newly created filtered bitstream
     *
     * @param sourceName
     *            name of source bitstream
     * @return filename generated by the filter - for example, document.pdf
     *         becomes document.pdf.txt
     */
    public String getFilteredName(String sourceName);

    /**
     * @return name of the bundle into which this filter will place its
     *         generated Bitstreams
     */
    public String getBundleName();

    /**
     * @return name of the bitstream format (say "HTML" or "Microsoft Word")
     *         returned by this filter. Look in the bitstream format registry or
     *         mediafilter.cfg for valid format strings.
     */
    public String getFormatString();

    /**
     * @return string describing the newly-generated Bitstream - noting how it
     *         was produced is a good idea
     */
    public String getDescription();

    /**
     * @param source
     *            input stream
     *
     * @return result of filter's transformation, written out to a bitstream
     */
    public InputStream getDestinationStream(InputStream source)
        throws Exception;

    /**
     * Perform any pre-processing of the source bitstream *before* the actual
     * filtering takes place in MediaFilterManager.processBitstream().
     * <p>
     * Return true if pre-processing is successful (or no pre-processing
     * is necessary). Return false if the bitstream should be skipped
     * for any reason.
     *
     * @param c
     *            context
     * @param item
     *            item containing bitstream to process
     * @param source
     *            source bitstream to be processed
     *
     * @return true if bitstream processing should continue,
     *         false if this bitstream should be skipped
     */
    public boolean preProcessBitstream(Context c, Item item, Bitstream source)
        throws Exception;

    /**
     * Perform any post-processing of the generated bitstream *after* this
     * filter has already been run.
     *
     * @param c
     *            context
     * @param item
     *            item containing bitstream to process
     * @param generatedBitstream
     *            the bitstream which was generated by
     *            this filter.
     */
    public void postProcessBitstream(Context c, Item item, Bitstream generatedBitstream)
        throws Exception;
}

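The FormatFilter contract above can be exercised outside DSpace by cutting it down to just the naming and transform methods. A minimal sketch under that assumption; `SimpleFilter` and `UpperCaseFilter` are illustrative stand-ins, not DSpace API:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

// Cut-down stand-in for FormatFilter: the DSpace Context/Item/Bitstream
// parameters are omitted so this compiles stand-alone.
interface SimpleFilter {
    String getFilteredName(String sourceName);
    InputStream getDestinationStream(InputStream source) throws Exception;
}

public class UpperCaseFilter implements SimpleFilter {
    public String getFilteredName(String sourceName) {
        return sourceName + ".txt"; // convention: append the generated extension
    }

    public InputStream getDestinationStream(InputStream source) throws Exception {
        // read the whole source, transform it, and hand back a fresh stream
        String text = new Scanner(source, "UTF-8").useDelimiter("\\A").next();
        return new ByteArrayInputStream(text.toUpperCase().getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        SimpleFilter f = new UpperCaseFilter();
        System.out.println(f.getFilteredName("document.pdf")); // document.pdf.txt
        InputStream out = f.getDestinationStream(
                new ByteArrayInputStream("hello".getBytes(StandardCharsets.UTF_8)));
        System.out.println(new Scanner(out, "UTF-8").useDelimiter("\\A").next()); // HELLO
    }
}
```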
@@ -1,81 +0,0 @@

/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.InputStream;

import javax.swing.text.Document;
import javax.swing.text.html.HTMLEditorKit;

/*
 * to do: helpful error messages - can't find mediafilter.cfg - can't
 * instantiate filter - bitstream format doesn't exist
 */
public class HTMLFilter extends MediaFilter
{

    public String getFilteredName(String oldFilename)
    {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    public String getBundleName()
    {
        return "TEXT";
    }

    /**
     * @return String bitstream format
     */
    public String getFormatString()
    {
        return "Text";
    }

    /**
     * @return String description
     */
    public String getDescription()
    {
        return "Extracted text";
    }

    /**
     * @param source
     *            source input stream
     *
     * @return InputStream the resulting input stream
     */
    public InputStream getDestinationStream(InputStream source)
        throws Exception
    {
        // try and read the document - set to ignore character set directive,
        // assuming that the input stream is already set properly (I hope)
        HTMLEditorKit kit = new HTMLEditorKit();
        Document doc = kit.createDefaultDocument();

        doc.putProperty("IgnoreCharsetDirective", Boolean.TRUE);

        kit.read(source, doc, 0);

        String extractedText = doc.getText(0, doc.getLength());

        // generate an input stream with the extracted text
        byte[] textBytes = extractedText.getBytes();
        ByteArrayInputStream bais = new ByteArrayInputStream(textBytes);

        return bais; // will this work? or will the byte array be out of scope?
    }
}

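HTMLFilter's extraction step relies only on Swing's built-in HTML parser, so the same `HTMLEditorKit` calls can be tried outside DSpace. A stand-alone sketch; `HtmlTextExtract` and `extract` are hypothetical names wrapping the calls used above:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.swing.text.Document;
import javax.swing.text.html.HTMLEditorKit;

public class HtmlTextExtract {
    // Same sequence as HTMLFilter.getDestinationStream(): parse the HTML into
    // a Swing Document, then pull out the plain text.
    static String extract(InputStream source) throws Exception {
        HTMLEditorKit kit = new HTMLEditorKit();
        Document doc = kit.createDefaultDocument();
        doc.putProperty("IgnoreCharsetDirective", Boolean.TRUE);
        kit.read(source, doc, 0);
        return doc.getText(0, doc.getLength());
    }

    public static void main(String[] args) throws Exception {
        InputStream in = new ByteArrayInputStream(
                "<html><body><p>hello world</p></body></html>"
                        .getBytes(StandardCharsets.UTF_8));
        System.out.println(extract(in).trim());
    }
}
```

Note that HTMLEditorKit tolerates missing tags but is a 1990s-era parser; modern HTML may extract imperfectly.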
@@ -1,70 +0,0 @@

/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import org.dspace.content.Bitstream;
import org.dspace.content.Item;
import org.dspace.core.Context;


/**
 * Abstract class which defines the default settings for a *simple* Media or Format Filter.
 * This class may be extended by any class which wishes to define a simple filter to be run
 * by the MediaFilterManager. More complex filters should likely implement the FormatFilter
 * interface directly, so that they can define their own pre/postProcessing methods.
 */
public abstract class MediaFilter implements FormatFilter
{
    /**
     * Perform any pre-processing of the source bitstream *before* the actual
     * filtering takes place in MediaFilterManager.processBitstream().
     * <p>
     * Return true if pre-processing is successful (or no pre-processing
     * is necessary). Return false if the bitstream should be skipped
     * for any reason.
     *
     * @param c
     *            context
     * @param item
     *            item containing bitstream to process
     * @param source
     *            source bitstream to be processed
     *
     * @return true if bitstream processing should continue,
     *         false if this bitstream should be skipped
     */
    public boolean preProcessBitstream(Context c, Item item, Bitstream source)
        throws Exception
    {
        return true; //default to no pre-processing
    }

    /**
     * Perform any post-processing of the generated bitstream *after* this
     * filter has already been run.
     *
     * @param c
     *            context
     * @param item
     *            item containing bitstream to process
     * @param generatedBitstream
     *            the bitstream which was generated by
     *            this filter.
     */
    public void postProcessBitstream(Context c, Item item, Bitstream generatedBitstream)
        throws Exception
    {
        //default to no post-processing necessary
    }
}

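MediaFilterManager (below) keys its filter-to-formats map by class name alone, or by class name plus a `\034` separator and plugin instance name for SelfNamedPlugins. A stand-alone sketch of that key scheme; `FilterKeyDemo`, `key`, and `org.example.WordFilter` are hypothetical, only the separator and the split pattern come from the source:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FilterKeyDemo {
    // Same separator MediaFilterManager puts between a filter class name and a
    // SelfNamedPlugin instance name (\034 is the "file separator" control char).
    static final String FILTER_PLUGIN_SEPARATOR = "\034";

    // Build the map key: plain class name, or class name + separator + plugin name.
    static String key(String className, String pluginName) {
        return className
                + (pluginName != null ? FILTER_PLUGIN_SEPARATOR + pluginName : "");
    }

    public static void main(String[] args) {
        Map<String, List<String>> filterFormats = new HashMap<String, List<String>>();
        // a plain MediaFilter: keyed by class name only
        filterFormats.put(key("org.dspace.app.mediafilter.HTMLFilter", null),
                Arrays.asList("HTML".split(",[\\s]*")));
        // a SelfNamedPlugin: one entry per named plugin instance
        filterFormats.put(key("org.example.WordFilter", "Word Text Extractor"),
                Arrays.asList("Microsoft Word, RTF".split(",[\\s]*")));

        System.out.println(filterFormats.get(
                key("org.example.WordFilter", "Word Text Extractor"))); // [Microsoft Word, RTF]
    }
}
```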
@@ -1,838 +0,0 @@

/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.InputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.List;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.MissingArgumentException;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.OptionBuilder;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;

import org.apache.commons.lang.ArrayUtils;
import org.dspace.authorize.AuthorizeManager;
import org.dspace.content.Bitstream;
import org.dspace.content.BitstreamFormat;
import org.dspace.content.Bundle;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.DCDate;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.content.ItemIterator;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.core.PluginManager;
import org.dspace.core.SelfNamedPlugin;
import org.dspace.handle.HandleManager;
import org.dspace.search.DSIndexer;

/**
 * MediaFilterManager is the class that invokes the media/format filters over the
 * repository's content. A few command-line flags affect the operation of the
 * MFM: -v verbose outputs all extracted text to STDOUT; -f force forces all
 * bitstreams to be processed, even if they have been before; -n noindex does not
 * recreate the index after processing bitstreams; -i [identifier] limits processing
 * scope to a community, collection or item; and -m [max] limits processing to a
 * maximum number of items.
 */
public class MediaFilterManager
{
    //key (in dspace.cfg) which lists all enabled filters by name
    public static final String MEDIA_FILTER_PLUGINS_KEY = "filter.plugins";

    //prefix (in dspace.cfg) for all filter properties
    public static final String FILTER_PREFIX = "filter";

    //suffix (in dspace.cfg) for input formats supported by each filter
    public static final String INPUT_FORMATS_SUFFIX = "inputFormats";

    static boolean updateIndex = true; // default to updating index

    static boolean isVerbose = false; // default to not verbose

    static boolean isQuiet = false; // default is noisy

    static boolean isForce = false; // default to not forced

    static String identifier = null; // object scope limiter

    static int max2Process = Integer.MAX_VALUE; // maximum number of items to process

    static int processed = 0; // number of items processed

    private static Item currentItem = null; // current item being processed

    private static FormatFilter[] filterClasses = null;

    private static Map<String, List<String>> filterFormats = new HashMap<String, List<String>>();

    private static List<String> skipList = null; //list of identifiers to skip during processing

    //separator in filterFormats Map between a filter class name and a plugin name,
    //for MediaFilters which extend SelfNamedPlugin (\034 is "file separator" char)
    public static final String FILTER_PLUGIN_SEPARATOR = "\034";

    public static void main(String[] argv) throws Exception
    {
        // set headless for non-gui workstations
        System.setProperty("java.awt.headless", "true");

        // create an options object and populate it
        CommandLineParser parser = new PosixParser();

        int status = 0;

        Options options = new Options();

        options.addOption("v", "verbose", false,
                "print all extracted text and other details to STDOUT");
        options.addOption("q", "quiet", false,
                "do not print anything except in the event of errors.");
        options.addOption("f", "force", false,
                "force all bitstreams to be processed");
        options.addOption("n", "noindex", false,
                "do NOT update the search index after filtering bitstreams");
        options.addOption("i", "identifier", true,
                "ONLY process bitstreams belonging to identifier");
        options.addOption("m", "maximum", true,
                "process no more than maximum items");
        options.addOption("h", "help", false, "help");

        //create a "plugin" option (to specify specific MediaFilter plugins to run)
        OptionBuilder.withLongOpt("plugins");
        OptionBuilder.withValueSeparator(',');
        OptionBuilder.withDescription(
                "ONLY run the specified Media Filter plugin(s)\n" +
                "listed from '" + MEDIA_FILTER_PLUGINS_KEY + "' in dspace.cfg.\n" +
                "Separate multiple with a comma (,)\n" +
                "(e.g. MediaFilterManager -p \n\"Word Text Extractor\",\"PDF Text Extractor\")");
        Option pluginOption = OptionBuilder.create('p');
        pluginOption.setArgs(Option.UNLIMITED_VALUES); //unlimited number of args
        options.addOption(pluginOption);

        //create a "skip" option (to specify communities/collections/items to skip)
        OptionBuilder.withLongOpt("skip");
        OptionBuilder.withValueSeparator(',');
        OptionBuilder.withDescription(
                "SKIP the bitstreams belonging to identifier\n" +
                "Separate multiple identifiers with a comma (,)\n" +
                "(e.g. MediaFilterManager -s \n 123456789/34,123456789/323)");
        Option skipOption = OptionBuilder.create('s');
        skipOption.setArgs(Option.UNLIMITED_VALUES); //unlimited number of args
        options.addOption(skipOption);

        CommandLine line = null;
        try
        {
            line = parser.parse(options, argv);
        }
        catch(MissingArgumentException e)
        {
            System.out.println("ERROR: " + e.getMessage());
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("MediaFilterManager\n", options);
            System.exit(1);
        }

        if (line.hasOption('h'))
        {
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("MediaFilterManager\n", options);

            System.exit(0);
        }

        if (line.hasOption('v'))
        {
            isVerbose = true;
        }

        isQuiet = line.hasOption('q');

        if (line.hasOption('n'))
        {
            updateIndex = false;
        }

        if (line.hasOption('f'))
        {
            isForce = true;
        }

        if (line.hasOption('i'))
        {
            identifier = line.getOptionValue('i');
        }

        if (line.hasOption('m'))
        {
            max2Process = Integer.parseInt(line.getOptionValue('m'));
            if (max2Process <= 1)
            {
                System.out.println("Invalid maximum value '" +
                        line.getOptionValue('m') + "' - ignoring");
                max2Process = Integer.MAX_VALUE;
            }
        }

        String filterNames[] = null;
        if(line.hasOption('p'))
        {
            //specified which media filter plugins we are using
            filterNames = line.getOptionValues('p');

            if(filterNames==null || filterNames.length==0)
            {   //display error, since no plugins specified
                System.err.println("\nERROR: -p (-plugin) option requires at least one plugin to be specified.\n" +
                        "(e.g. MediaFilterManager -p \"Word Text Extractor\",\"PDF Text Extractor\")\n");
                HelpFormatter myhelp = new HelpFormatter();
                myhelp.printHelp("MediaFilterManager\n", options);
                System.exit(1);
            }
        }
        else
        {
            //retrieve list of all enabled media filter plugins!
            String enabledPlugins = ConfigurationManager.getProperty(MEDIA_FILTER_PLUGINS_KEY);
            filterNames = enabledPlugins.split(",\\s*");
        }

        //initialize an array of our enabled filters
        List<FormatFilter> filterList = new ArrayList<FormatFilter>();

        //set up each filter
        for(int i=0; i< filterNames.length; i++)
        {
            //get filter of this name & add to list of filters
            FormatFilter filter = (FormatFilter) PluginManager.getNamedPlugin(FormatFilter.class, filterNames[i]);
            if(filter==null)
            {
                System.err.println("\nERROR: Unknown MediaFilter specified (either from command-line or in dspace.cfg): '" + filterNames[i] + "'");
                System.exit(1);
            }
            else
            {
                filterList.add(filter);

                String filterClassName = filter.getClass().getName();

                String pluginName = null;

                //If this filter is a SelfNamedPlugin,
                //then the input formats it accepts may differ for
                //each "named" plugin that it defines.
                //So, we have to look for every key that fits the
                //following format: filter.<class-name>.<plugin-name>.inputFormats
                if( SelfNamedPlugin.class.isAssignableFrom(filter.getClass()) )
                {
                    //Get the plugin instance name for this class
                    pluginName = ((SelfNamedPlugin) filter).getPluginInstanceName();
                }


                //Retrieve our list of supported formats from dspace.cfg
                //For SelfNamedPlugins, format of key is:
                //  filter.<class-name>.<plugin-name>.inputFormats
                //For other MediaFilters, format of key is:
                //  filter.<class-name>.inputFormats
                String formats = ConfigurationManager.getProperty(
                        FILTER_PREFIX + "." + filterClassName +
                        (pluginName!=null ? "." + pluginName : "") +
                        "." + INPUT_FORMATS_SUFFIX);

                //add to internal map of filters to supported formats
                if (formats != null)
                {
                    //For SelfNamedPlugins, map key is:
                    //  <class-name><separator><plugin-name>
                    //For other MediaFilters, map key is just:
                    //  <class-name>
                    filterFormats.put(filterClassName +
                            (pluginName!=null ? FILTER_PLUGIN_SEPARATOR + pluginName : ""),
                            Arrays.asList(formats.split(",[\\s]*")));
                }
            }//end if filter!=null
        }//end for

        //If verbose, print out loaded mediafilter info
        if(isVerbose)
        {
            System.out.println("The following MediaFilters are enabled: ");
            Iterator<String> i = filterFormats.keySet().iterator();
            while(i.hasNext())
            {
                String filterName = i.next();
                System.out.println("Full Filter Name: " + filterName);
                String pluginName = null;
                if(filterName.contains(FILTER_PLUGIN_SEPARATOR))
                {
                    String[] fields = filterName.split(FILTER_PLUGIN_SEPARATOR);
                    filterName=fields[0];
                    pluginName=fields[1];
                }

                System.out.println(filterName +
                        (pluginName!=null? " (Plugin: " + pluginName + ")": ""));
            }
        }

        //store our filter list into an internal array
        filterClasses = (FormatFilter[]) filterList.toArray(new FormatFilter[filterList.size()]);


        //Retrieve list of identifiers to skip (if any)
        String skipIds[] = null;
        if(line.hasOption('s'))
        {
            //specified which identifiers to skip when processing
            skipIds = line.getOptionValues('s');

            if(skipIds==null || skipIds.length==0)
            {   //display error, since no identifiers specified to skip
                System.err.println("\nERROR: -s (-skip) option requires at least one identifier to SKIP.\n" +
                        "Make sure to separate multiple identifiers with a comma!\n" +
                        "(e.g. MediaFilterManager -s 123456789/34,123456789/323)\n");
                HelpFormatter myhelp = new HelpFormatter();
                myhelp.printHelp("MediaFilterManager\n", options);
                System.exit(0);
            }

            //save to a global skip list
            skipList = Arrays.asList(skipIds);
        }

        Context c = null;

        try
        {
            c = new Context();

            // have to be super-user to do the filtering
            c.setIgnoreAuthorization(true);

            // now apply the filters
            if (identifier == null)
            {
                applyFiltersAllItems(c);
            }
            else // restrict application scope to identifier
            {
                DSpaceObject dso = HandleManager.resolveToObject(c, identifier);
                if (dso == null)
                {
                    throw new IllegalArgumentException("Cannot resolve "
                            + identifier + " to a DSpace object");
                }

                switch (dso.getType())
                {
                    case Constants.COMMUNITY:
                        applyFiltersCommunity(c, (Community)dso);
                        break;
                    case Constants.COLLECTION:
                        applyFiltersCollection(c, (Collection)dso);
                        break;
                    case Constants.ITEM:
                        applyFiltersItem(c, (Item)dso);
                        break;
                }
            }

            // update search index?
            if (updateIndex)
            {
                if (!isQuiet)
                {
                    System.out.println("Updating search index:");
                }
                DSIndexer.setBatchProcessingMode(true);
                try
                {
                    DSIndexer.updateIndex(c);
                }
                finally
                {
                    DSIndexer.setBatchProcessingMode(false);
                }
            }

            c.complete();
            c = null;
        }
        catch (Exception e)
        {
            status = 1;
        }
        finally
        {
            if (c != null)
            {
                c.abort();
            }
        }
        System.exit(status);
    }

    public static void applyFiltersAllItems(Context c) throws Exception
    {
        if(skipList!=null)
        {
            //if a skip-list exists, we need to filter community-by-community
            //so we can respect what is in the skip-list
            Community[] topLevelCommunities = Community.findAllTop(c);

            for(int i=0; i<topLevelCommunities.length; i++)
            {
                applyFiltersCommunity(c, topLevelCommunities[i]);
            }
        }
        else
        {
            //otherwise, just find every item and process
            ItemIterator i = Item.findAll(c);
            try
            {
                while (i.hasNext() && processed < max2Process)
                {
                    applyFiltersItem(c, i.next());
                }
            }
            finally
            {
                if (i != null)
                {
                    i.close();
                }
            }
        }
    }

    public static void applyFiltersCommunity(Context c, Community community)
        throws Exception
    {   //only apply filters if community not in skip-list
        if(!inSkipList(community.getHandle()))
        {
            Community[] subcommunities = community.getSubcommunities();
            for (int i = 0; i < subcommunities.length; i++)
            {
                applyFiltersCommunity(c, subcommunities[i]);
            }

            Collection[] collections = community.getCollections();
            for (int j = 0; j < collections.length; j++)
            {
                applyFiltersCollection(c, collections[j]);
            }
        }
    }

    public static void applyFiltersCollection(Context c, Collection collection)
        throws Exception
    {
        //only apply filters if collection not in skip-list
        if(!inSkipList(collection.getHandle()))
        {
            ItemIterator i = collection.getItems();
            try
            {
                while (i.hasNext() && processed < max2Process)
                {
                    applyFiltersItem(c, i.next());
                }
            }
            finally
            {
                if (i != null)
                {
                    i.close();
                }
            }
        }
    }

    public static void applyFiltersItem(Context c, Item item) throws Exception
    {
        //only apply filters if item not in skip-list
        if(!inSkipList(item.getHandle()))
        {
            //cache this item in MediaFilterManager
            //so it can be accessed by MediaFilters as necessary
            currentItem = item;

            if (filterItem(c, item))
            {
                // commit changes after each filtered item
                c.commit();
                // increment processed count
                ++processed;
            }
            // clear item objects from context cache and internal cache
            item.decache();
            currentItem = null;
        }
    }

    /**
     * iterate through the item's bitstreams in the ORIGINAL bundle, applying
     * filters if possible
     *
     * @return true if any bitstreams processed,
     *         false if none
     */
    public static boolean filterItem(Context c, Item myItem) throws Exception
    {
        // get 'original' bundles
        Bundle[] myBundles = myItem.getBundles("ORIGINAL");
        boolean done = false;
        for (int i = 0; i < myBundles.length; i++)
        {
            // now look at all of the bitstreams
            Bitstream[] myBitstreams = myBundles[i].getBitstreams();

            for (int k = 0; k < myBitstreams.length; k++)
            {
                done |= filterBitstream(c, myItem, myBitstreams[k]);
            }
        }
        return done;
    }

    /**
     * Attempt to filter a bitstream.
     *
     * An exception will be thrown if the media filter class cannot be
     * instantiated; exceptions from filtering will be logged to STDOUT and
     * swallowed.
     *
     * @return true if bitstream processed,
     *         false if no applicable filter or already processed
     */
    public static boolean filterBitstream(Context c, Item myItem,
            Bitstream myBitstream) throws Exception
    {
        boolean filtered = false;

        // iterate through filter classes. A single format may be actioned
        // by more than one filter
        for (int i = 0; i < filterClasses.length; i++)
        {
            //List fmts = (List)filterFormats.get(filterClasses[i].getClass().getName());
            String pluginName = null;

            //if this filter class is a SelfNamedPlugin,
            //its list of supported formats is different for
            //each differently named "plugin"
            if( SelfNamedPlugin.class.isAssignableFrom(filterClasses[i].getClass()) )
            {
                //get plugin instance name for this media filter
                pluginName = ((SelfNamedPlugin)filterClasses[i]).getPluginInstanceName();
            }

            //Get list of supported formats for the filter (and possibly named plugin)
            //For SelfNamedPlugins, map key is:
            //  <class-name><separator><plugin-name>
            //For other MediaFilters, map key is just:
            //  <class-name>
            List<String> fmts = filterFormats.get(filterClasses[i].getClass().getName() +
                    (pluginName!=null ? FILTER_PLUGIN_SEPARATOR + pluginName : ""));

            if (fmts.contains(myBitstream.getFormat().getShortDescription()))
            {
                try
                {
                    // only update item if bitstream not skipped
                    if (processBitstream(c, myItem, myBitstream, filterClasses[i]))
                    {
                        myItem.update(); // Make sure new bitstream has a sequence
                                         // number
                        filtered = true;
                    }
                }
                catch (Exception e)
                {
                    String handle = myItem.getHandle();
                    Bundle[] bundles = myBitstream.getBundles();
                    long size = myBitstream.getSize();
                    String checksum = myBitstream.getChecksum() + " ("+myBitstream.getChecksumAlgorithm()+")";
                    int assetstore = myBitstream.getStoreNumber();

                    // Print out helpful information to find the errored bitstream.
                    System.out.println("ERROR filtering, skipping bitstream:\n");
                    System.out.println("\tItem Handle: "+ handle);
                    for (Bundle bundle : bundles)
                    {
                        System.out.println("\tBundle Name: " + bundle.getName());
                    }
                    System.out.println("\tFile Size: " + size);
                    System.out.println("\tChecksum: " + checksum);
                    System.out.println("\tAsset Store: " + assetstore);
                    System.out.println(e);
                    e.printStackTrace();
                }
            }
            else if (filterClasses[i] instanceof SelfRegisterInputFormats)
            {
                // Filter implements self registration, so check to see if it should be applied
                // given the formats it claims to support
                SelfRegisterInputFormats srif = (SelfRegisterInputFormats)filterClasses[i];
                boolean applyFilter = false;

                // Check MIME type
                String[] mimeTypes = srif.getInputMIMETypes();
                if (mimeTypes != null)
                {
                    for (String mimeType : mimeTypes)
                    {
                        if (mimeType.equalsIgnoreCase(myBitstream.getFormat().getMIMEType()))
                        {
                            applyFilter = true;
                        }
                    }
                }

                // Check description
                if (!applyFilter)
                {
                    String[] descriptions = srif.getInputDescriptions();
                    if (descriptions != null)
                    {
                        for (String desc : descriptions)
                        {
                            if (desc.equalsIgnoreCase(myBitstream.getFormat().getShortDescription()))
                            {
                                applyFilter = true;
                            }
                        }
                    }
                }

                // Check extensions
                if (!applyFilter)
                {
                    String[] extensions = srif.getInputExtensions();
                    if (extensions != null)
                    {
                        for (String ext : extensions)
                        {
                            String[] formatExtensions = myBitstream.getFormat().getExtensions();
                            if (formatExtensions != null && ArrayUtils.contains(formatExtensions, ext))
                            {
                                applyFilter = true;
                            }
                        }
                    }
                }

                // Filter claims to handle this type of file, so attempt to apply it
                if (applyFilter)
                {
                    try
                    {
                        // only update item if bitstream not skipped
                        if (processBitstream(c, myItem, myBitstream, filterClasses[i]))
                        {
                            myItem.update(); // Make sure new bitstream has a sequence
                                             // number
                            filtered = true;
                        }
                    }
                    catch (Exception e)
                    {
                        System.out.println("ERROR filtering, skipping bitstream #"
                                + myBitstream.getID() + " " + e);
                        e.printStackTrace();
                    }
                }
            }
        }
        return filtered;
    }

    /**
     * processBitstream is a utility method that calls the virtual methods
     * from the current MediaFilter class.
     * It scans the bitstreams in an item, and decides if a bitstream has
|
||||
* already been filtered, and if not or if overWrite is set, invokes the
|
||||
* filter.
|
||||
*
|
||||
* @param c
|
||||
* context
|
||||
* @param item
|
||||
* item containing bitstream to process
|
||||
* @param source
|
||||
* source bitstream to process
|
||||
* @param formatFilter
|
||||
* FormatFilter to perform filtering
|
||||
*
|
||||
* @return true if new rendition is created, false if rendition already
|
||||
* exists and overWrite is not set
|
||||
*/
|
||||
public static boolean processBitstream(Context c, Item item, Bitstream source, FormatFilter formatFilter)
|
||||
throws Exception
|
||||
{
|
||||
//do pre-processing of this bitstream, and if it fails, skip this bitstream!
|
||||
if(!formatFilter.preProcessBitstream(c, item, source))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
boolean overWrite = MediaFilterManager.isForce;
|
||||
|
||||
// get bitstream filename, calculate destination filename
|
||||
String newName = formatFilter.getFilteredName(source.getName());
|
||||
|
||||
Bitstream existingBitstream = null; // is there an existing rendition?
|
||||
Bundle targetBundle = null; // bundle we're modifying
|
||||
|
||||
Bundle[] bundles = item.getBundles(formatFilter.getBundleName());
|
||||
|
||||
// check if destination bitstream exists
|
||||
if (bundles.length > 0)
|
||||
{
|
||||
// only finds the last match (FIXME?)
|
||||
for (int i = 0; i < bundles.length; i++)
|
||||
{
|
||||
Bitstream[] bitstreams = bundles[i].getBitstreams();
|
||||
|
||||
for (int j = 0; j < bitstreams.length; j++)
|
||||
{
|
||||
if (bitstreams[j].getName().equals(newName))
|
||||
{
|
||||
targetBundle = bundles[i];
|
||||
existingBitstream = bitstreams[j];
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// if exists and overwrite = false, exit
|
||||
if (!overWrite && (existingBitstream != null))
|
||||
{
|
||||
if (!isQuiet)
|
||||
{
|
||||
System.out.println("SKIPPED: bitstream " + source.getID()
|
||||
+ " (item: " + item.getHandle() + ") because '" + newName + "' already exists");
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
|
||||
InputStream destStream = formatFilter.getDestinationStream(source.retrieve());
|
||||
if (destStream == null)
|
||||
{
|
||||
if (!isQuiet)
|
||||
{
|
||||
System.out.println("SKIPPED: bitstream " + source.getID()
|
||||
+ " (item: " + item.getHandle() + ") because filtering was unsuccessful");
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
|
||||
// create new bundle if needed
|
||||
if (bundles.length < 1)
|
||||
{
|
||||
targetBundle = item.createBundle(formatFilter.getBundleName());
|
||||
}
|
||||
else
|
||||
{
|
||||
// take the first match
|
||||
targetBundle = bundles[0];
|
||||
}
|
||||
|
||||
Bitstream b = targetBundle.createBitstream(destStream);
|
||||
|
||||
// Now set the format and name of the bitstream
|
||||
b.setName(newName);
|
||||
b.setSource("Written by FormatFilter " + formatFilter.getClass().getName() +
|
||||
" on " + DCDate.getCurrent() + " (GMT).");
|
||||
b.setDescription(formatFilter.getDescription());
|
||||
|
||||
// Find the proper format
|
||||
BitstreamFormat bf = BitstreamFormat.findByShortDescription(c,
|
||||
formatFilter.getFormatString());
|
||||
b.setFormat(bf);
|
||||
b.update();
|
||||
|
||||
//Inherit policies from the source bitstream
|
||||
//(first remove any existing policies)
|
||||
AuthorizeManager.removeAllPolicies(c, b);
|
||||
AuthorizeManager.inheritPolicies(c, source, b);
|
||||
|
||||
// fixme - set date?
|
||||
// we are overwriting, so remove old bitstream
|
||||
if (existingBitstream != null)
|
||||
{
|
||||
targetBundle.removeBitstream(existingBitstream);
|
||||
}
|
||||
|
||||
if (!isQuiet)
|
||||
{
|
||||
System.out.println("FILTERED: bitstream " + source.getID()
|
||||
+ " (item: " + item.getHandle() + ") and created '" + newName + "'");
|
||||
}
|
||||
|
||||
//do post-processing of the generated bitstream
|
||||
formatFilter.postProcessBitstream(c, item, b);
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
/**
|
||||
* Return the item that is currently being processed/filtered
|
||||
* by the MediaFilterManager
|
||||
* <p>
|
||||
* This allows FormatFilters to retrieve the Item object
|
||||
* in case they need access to item-level information for their format
|
||||
* transformations/conversions.
|
||||
*
|
||||
* @return current Item being processed by MediaFilterManager
|
||||
*/
|
||||
public static Item getCurrentItem()
|
||||
{
|
||||
return currentItem;
|
||||
}
|
||||
|
||||
/**
|
||||
* Check whether or not to skip processing the given identifier
|
||||
*
|
||||
* @param identifier
|
||||
* identifier (handle) of a community, collection or item
|
||||
*
|
||||
* @return true if this community, collection or item should be skipped
|
||||
* during processing. Otherwise, return false.
|
||||
*/
|
||||
public static boolean inSkipList(String identifier)
|
||||
{
|
||||
if(skipList!=null && skipList.contains(identifier))
|
||||
{
|
||||
if (!isQuiet)
|
||||
{
|
||||
System.out.println("SKIP-LIST: skipped bitstreams within identifier " + identifier);
|
||||
}
|
||||
return true;
|
||||
}
|
||||
else
|
||||
{
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
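The skip/overwrite logic in processBitstream above reduces to two small decisions: how the destination rendition is named, and when an existing rendition causes the bitstream to be skipped. A minimal standalone sketch (the class and helper names here are illustrative, not DSpace API):

```java
public class RenditionSkip {
    // Text filters derive the destination name by appending ".txt"
    static String filteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    // An existing rendition is only replaced when the force/overwrite flag is set
    static boolean shouldProcess(boolean renditionExists, boolean overWrite) {
        return overWrite || !renditionExists;
    }

    public static void main(String[] args) {
        System.out.println(filteredName("thesis.pdf"));  // thesis.pdf.txt
        System.out.println(shouldProcess(true, false));  // false -> SKIPPED
        System.out.println(shouldProcess(true, true));   // true  -> re-filter
    }
}
```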
@@ -1,148 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;

import org.apache.log4j.Logger;
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.util.PDFTextStripper;
import org.dspace.core.ConfigurationManager;

/*
 * TODO: helpful error messages - can't find mediafilter.cfg - can't
 * instantiate filter - bitstream format doesn't exist
 */
public class PDFFilter extends MediaFilter
{

    private static Logger log = Logger.getLogger(PDFFilter.class);

    public String getFilteredName(String oldFilename)
    {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    public String getBundleName()
    {
        return "TEXT";
    }

    /**
     * @return String bitstream format
     */
    public String getFormatString()
    {
        return "Text";
    }

    /**
     * @return String description
     */
    public String getDescription()
    {
        return "Extracted text";
    }

    /**
     * @param source
     *            source input stream
     *
     * @return InputStream the resulting input stream
     */
    public InputStream getDestinationStream(InputStream source)
        throws Exception
    {
        try
        {
            boolean useTemporaryFile = ConfigurationManager.getBooleanProperty("pdffilter.largepdfs", false);

            // get input stream from bitstream, pass to filter, get string back
            PDFTextStripper pts = new PDFTextStripper();
            PDDocument pdfDoc = null;
            Writer writer = null;
            File tempTextFile = null;
            ByteArrayOutputStream byteStream = null;

            if (useTemporaryFile)
            {
                tempTextFile = File.createTempFile("dspacepdfextract" + source.hashCode(), ".txt");
                tempTextFile.deleteOnExit();
                writer = new OutputStreamWriter(new FileOutputStream(tempTextFile));
            }
            else
            {
                byteStream = new ByteArrayOutputStream();
                writer = new OutputStreamWriter(byteStream);
            }

            try
            {
                pdfDoc = PDDocument.load(source);
                pts.writeText(pdfDoc, writer);
            }
            finally
            {
                try
                {
                    if (pdfDoc != null)
                    {
                        pdfDoc.close();
                    }
                }
                catch (Exception e)
                {
                    log.error("Error closing PDF file: " + e.getMessage(), e);
                }

                try
                {
                    writer.close();
                }
                catch (Exception e)
                {
                    log.error("Error closing temporary extract file: " + e.getMessage(), e);
                }
            }

            if (useTemporaryFile)
            {
                return new FileInputStream(tempTextFile);
            }
            else
            {
                byte[] bytes = byteStream.toByteArray();
                return new ByteArrayInputStream(bytes);
            }
        }
        catch (OutOfMemoryError oome)
        {
            log.error("Error parsing PDF document " + oome.getMessage(), oome);
            if (!ConfigurationManager.getBooleanProperty("pdffilter.skiponmemoryexception", false))
            {
                throw oome;
            }
        }

        return null;
    }
}
@@ -1,123 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
import org.apache.poi.hslf.extractor.PowerPointExtractor;
import org.apache.poi.xslf.extractor.XSLFPowerPointExtractor;

import org.apache.log4j.Logger;

/*
 * TODO: Allow user to configure extraction of only text or only notes
 */
public class PowerPointFilter extends MediaFilter
{

    private static Logger log = Logger.getLogger(PowerPointFilter.class);

    public String getFilteredName(String oldFilename)
    {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    public String getBundleName()
    {
        return "TEXT";
    }

    /**
     * @return String bitstream format
     *
     * TODO: Check that this is correct
     */
    public String getFormatString()
    {
        return "Text";
    }

    /**
     * @return String description
     */
    public String getDescription()
    {
        return "Extracted text";
    }

    /**
     * @param source
     *            source input stream
     *
     * @return InputStream the resulting input stream
     */
    public InputStream getDestinationStream(InputStream source)
        throws Exception
    {
        try
        {
            String extractedText = null;
            POITextExtractor pptExtractor = ExtractorFactory.createExtractor(source);

            // PowerPoint XML files and legacy format PowerPoint files
            // require different classes and APIs for text extraction

            // If this is a PowerPoint XML file, extract accordingly
            if (pptExtractor instanceof XSLFPowerPointExtractor)
            {
                // The true arguments indicate that text from
                // both the slides and the notes is desired
                extractedText =
                    ((XSLFPowerPointExtractor) pptExtractor).getText(true, true);
            }
            // Legacy PowerPoint files
            else if (pptExtractor instanceof PowerPointExtractor)
            {
                extractedText = ((PowerPointExtractor) pptExtractor).getText()
                        + " " + ((PowerPointExtractor) pptExtractor).getNotes();
            }

            if (extractedText != null)
            {
                // if verbose flag is set, print out extracted text to STDOUT
                if (MediaFilterManager.isVerbose)
                {
                    System.out.println(extractedText);
                }

                // generate an input stream with the extracted text
                byte[] textBytes = extractedText.getBytes();
                return new ByteArrayInputStream(textBytes);
            }
        }
        catch (Exception e)
        {
            log.error("Error filtering bitstream: " + e.getMessage(), e);
        }

        return null;
    }
}
@@ -1,21 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

/**
 * Interface to allow filters to register the input formats they handle
 * (useful for exposing underlying capabilities of libraries used)
 */
public interface SelfRegisterInputFormats
{
    public String[] getInputMIMETypes();

    public String[] getInputDescriptions();

    public String[] getInputExtensions();
}
@@ -1,97 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.log4j.Logger;

import org.textmining.extraction.TextExtractor;
import org.textmining.extraction.word.WordTextExtractorFactory;

/*
 * TODO: helpful error messages - can't find mediafilter.cfg - can't
 * instantiate filter - bitstream format doesn't exist
 */
public class WordFilter extends MediaFilter
{

    private static Logger log = Logger.getLogger(WordFilter.class);

    public String getFilteredName(String oldFilename)
    {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    public String getBundleName()
    {
        return "TEXT";
    }

    /**
     * @return String bitstream format
     */
    public String getFormatString()
    {
        return "Text";
    }

    /**
     * @return String description
     */
    public String getDescription()
    {
        return "Extracted text";
    }

    /**
     * @param source
     *            source input stream
     *
     * @return InputStream the resulting input stream
     */
    public InputStream getDestinationStream(InputStream source)
        throws Exception
    {
        // get input stream from bitstream, pass to filter, get string back
        try
        {
            WordTextExtractorFactory factory = new WordTextExtractorFactory();
            TextExtractor e = factory.textExtractor(source);
            String extractedText = e.getText();

            // if verbose flag is set, print out extracted text to STDOUT
            if (MediaFilterManager.isVerbose)
            {
                System.out.println(extractedText);
            }

            // generate an input stream with the extracted text;
            // the stream holds a reference to the byte array, so it
            // remains valid after this method returns
            byte[] textBytes = extractedText.getBytes();
            return new ByteArrayInputStream(textBytes);
        }
        catch (IOException ioe)
        {
            System.out.println("Invalid Word Format");
            log.error("Error detected - Word file format not recognized: " + ioe.getMessage(), ioe);
        }
        return null;
    }
}
@@ -1,160 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

import org.apache.log4j.Logger;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Utils;

/**
 * Text MediaFilter for PDF sources.
 *
 * This filter produces extracted text suitable for building an index,
 * but not for display to end users. It forks a process running the
 * "pdftotext" program from the XPdf suite -- see http://www.foolabs.com/xpdf/
 * This is a suite of open-source PDF tools that has been widely ported
 * to Unix platforms, and the ones we use (pdftoppm, pdftotext) even
 * run on Win32.
 *
 * This was written for the FACADE project, but it is not directly connected
 * to any of the other FACADE-specific software. The FACADE UI expects
 * to find thumbnail images for 3D PDFs generated by this filter.
 *
 * Requires DSpace config properties keys:
 *
 *   xpdf.path.pdftotext -- path to "pdftotext" executable (required!)
 *
 * @author Larry Stone
 * @see org.dspace.app.mediafilter.MediaFilter
 */
public class XPDF2Text extends MediaFilter
{
    private static Logger log = Logger.getLogger(XPDF2Text.class);

    // Command to get text from PDF; @COMMAND@ and @infile@ are placeholders
    private static final String XPDF_PDFTOTEXT_COMMAND[] =
    {
        "@COMMAND@", "-q", "-enc", "UTF-8", "@infile@", "-"
    };

    // executable path that comes from DSpace config at runtime
    private String pdftotextPath = null;

    public String getFilteredName(String oldFilename)
    {
        return oldFilename + ".txt";
    }

    public String getBundleName()
    {
        return "TEXT";
    }

    public String getFormatString()
    {
        return "Text";
    }

    public String getDescription()
    {
        return "Extracted Text";
    }

    public InputStream getDestinationStream(InputStream sourceStream)
        throws Exception
    {
        // get configured value for path to XPDF command:
        if (pdftotextPath == null)
        {
            pdftotextPath = ConfigurationManager.getProperty("xpdf.path.pdftotext");
            if (pdftotextPath == null)
            {
                throw new IllegalStateException("No value for key \"xpdf.path.pdftotext\" in DSpace configuration! Should be path to XPDF pdftotext executable.");
            }
        }

        File sourceTmp = File.createTempFile("DSfilt", ".pdf");
        sourceTmp.deleteOnExit(); // extra insurance; we also delete it in the finally block
        int status = -1;
        try
        {
            // make local temp copy of source PDF since PDF tools
            // require a file for random access.
            // XXX FIXME: could optimize if we ever get an interface to grab asset *files*
            OutputStream sto = new FileOutputStream(sourceTmp);
            Utils.copy(sourceStream, sto);
            sto.close();
            sourceStream.close();

            String pdfCmd[] = XPDF_PDFTOTEXT_COMMAND.clone();
            pdfCmd[0] = pdftotextPath;
            pdfCmd[4] = sourceTmp.toString();

            log.debug("Running command: " + Arrays.deepToString(pdfCmd));
            Process pdfProc = Runtime.getRuntime().exec(pdfCmd);
            InputStream stdout = pdfProc.getInputStream();
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            Utils.copy(new BufferedInputStream(stdout), baos);
            stdout.close();
            baos.close();

            status = pdfProc.waitFor();
            String msg = null;
            if (status == 1)
            {
                msg = "pdftotext failed opening input: file=" + sourceTmp.toString();
            }
            else if (status == 3)
            {
                msg = "pdftotext permission failure (perhaps copying of text from this document is not allowed - check the PDF file's internal permissions): file=" + sourceTmp.toString();
            }
            else if (status != 0)
            {
                msg = "pdftotext failed, maybe corrupt PDF? status=" + String.valueOf(status);
            }

            if (msg != null)
            {
                log.error(msg);
                throw new IOException(msg);
            }

            return new ByteArrayInputStream(baos.toByteArray());
        }
        catch (InterruptedException e)
        {
            log.error("Failed in pdftotext subprocess: ", e);
            throw e;
        }
        finally
        {
            if (!sourceTmp.delete())
            {
                log.error("Unable to delete temporary file");
            }
            if (status != 0)
            {
                log.error("PDF conversion proc failed, returns=" + status + ", file=" + sourceTmp);
            }
        }
    }
}
@@ -1,313 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.mediafilter;

import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.regex.MatchResult;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import javax.imageio.ImageIO;

import org.apache.log4j.Logger;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Utils;

/**
 * Thumbnail MediaFilter for PDF sources.
 *
 * This filter generates thumbnail images for PDF documents, _including_
 * 3D PDF documents with 2D "poster" images. Since the PDFBox library
 * does not understand these, and fails to render a lot of other PDFs,
 * this filter forks a process running the "pdftoppm" program from the
 * XPdf suite -- see http://www.foolabs.com/xpdf/
 * This is a suite of open-source PDF tools that has been widely ported
 * to Unix platforms, and the ones we use (pdftoppm, pdfinfo) even
 * run on Win32.
 *
 * This was written for the FACADE project, but it is not directly connected
 * to any of the other FACADE-specific software. The FACADE UI expects
 * to find thumbnail images for 3D PDFs generated by this filter.
 *
 * Requires DSpace config properties keys:
 *
 *   xpdf.path.pdftoppm -- absolute path to "pdftoppm" executable (required!)
 *   xpdf.path.pdfinfo  -- absolute path to "pdfinfo" executable (required!)
 *   thumbnail.maxwidth -- borrowed from thumbnails, max dimension of generated image
 *
 * @author Larry Stone
 * @see org.dspace.app.mediafilter.MediaFilter
 */
public class XPDF2Thumbnail extends MediaFilter
{
    private static Logger log = Logger.getLogger(XPDF2Thumbnail.class);

    // maximum size of either preview image dimension
    private static final int MAX_PX = 800;

    // maximum DPI - use a common screen resolution, 100 dpi
    private static final int MAX_DPI = 100;

    // command to get image from PDF; @COMMAND@, @DPI@, @FILE@, @OUTPUTFILE@ are placeholders
    private static final String XPDF_PDFTOPPM_COMMAND[] =
    {
        "@COMMAND@", "-q", "-f", "1", "-l", "1",
        "-r", "@DPI@", "@FILE@", "@OUTPUTFILE@"
    };

    // command to get page info from PDF; @COMMAND@, @FILE@ are placeholders
    private static final String XPDF_PDFINFO_COMMAND[] =
    {
        "@COMMAND@", "-f", "1", "-l", "1", "-box", "@FILE@"
    };

    // executable path for "pdftoppm", comes from DSpace config at runtime
    private String pdftoppmPath = null;

    // executable path for "pdfinfo", comes from DSpace config at runtime
    private String pdfinfoPath = null;

    // match the line in pdfinfo output that describes the file's MediaBox
    private static final Pattern MEDIABOX_PATT = Pattern.compile(
        "^Page\\s+\\d+\\s+MediaBox:\\s+([\\.\\d-]+)\\s+([\\.\\d-]+)\\s+([\\.\\d-]+)\\s+([\\.\\d-]+)");

    // also from thumbnail.maxwidth in config
    private int maxwidth = 0;

    // backup default for size, on the large side
    private static final int DEFAULT_MAXWIDTH = 500;

    public String getFilteredName(String oldFilename)
    {
        return oldFilename + ".jpg";
    }

    public String getBundleName()
    {
        return "THUMBNAIL";
    }

    public String getFormatString()
    {
        return "JPEG";
    }

    public String getDescription()
    {
        return "Generated Thumbnail";
    }

    // canonical MediaFilter method to generate the thumbnail as a stream
    public InputStream getDestinationStream(InputStream sourceStream)
        throws Exception
    {
        // sanity check: xpdf paths are required; can cache since they won't change
        if (pdftoppmPath == null || pdfinfoPath == null)
        {
            pdftoppmPath = ConfigurationManager.getProperty("xpdf.path.pdftoppm");
            pdfinfoPath = ConfigurationManager.getProperty("xpdf.path.pdfinfo");
            if (pdftoppmPath == null)
            {
                throw new IllegalStateException("No value for key \"xpdf.path.pdftoppm\" in DSpace configuration! Should be path to XPDF pdftoppm executable.");
            }
            if (pdfinfoPath == null)
            {
                throw new IllegalStateException("No value for key \"xpdf.path.pdfinfo\" in DSpace configuration! Should be path to XPDF pdfinfo executable.");
            }
            maxwidth = ConfigurationManager.getIntProperty("thumbnail.maxwidth");
            if (maxwidth == 0)
            {
                maxwidth = DEFAULT_MAXWIDTH;
            }
        }

        // make local file copy of source PDF since the PDF tools
        // require a file for random access.
        // XXX FIXME: would be nice to optimize this if we ever get
        // a DSpace method to access (optionally!) the _file_ of
        // a Bitstream in the asset store, only when there is one of course.
        File sourceTmp = File.createTempFile("DSfilt", ".pdf");
        sourceTmp.deleteOnExit();
        int status = 0;
        BufferedImage source = null;
        try
        {
            OutputStream sto = new FileOutputStream(sourceTmp);
            Utils.copy(sourceStream, sto);
            sto.close();
            sourceStream.close();

            // First get the max physical dimension of the first page's
            // bounding box to compute the DPI to ask for; otherwise some
            // AutoCAD drawings can produce enormous files even at 75 dpi,
            // for 48" drawings.

            // run pdfinfo, look for the MediaBox description in the output, e.g.
            //   "Page    1 MediaBox: 0.00 0.00 612.00 792.00"
            int dpi = 0;
            String pdfinfoCmd[] = XPDF_PDFINFO_COMMAND.clone();
            pdfinfoCmd[0] = pdfinfoPath;
            pdfinfoCmd[pdfinfoCmd.length - 1] = sourceTmp.toString();
            BufferedReader lr = null;
            try
            {
                MatchResult mediaBox = null;
                Process pdfProc = Runtime.getRuntime().exec(pdfinfoCmd);
                lr = new BufferedReader(new InputStreamReader(pdfProc.getInputStream()));
                String line;
                for (line = lr.readLine(); line != null; line = lr.readLine())
                {
                    Matcher mm = MEDIABOX_PATT.matcher(line);
                    if (mm.matches())
                    {
                        mediaBox = mm.toMatchResult();
                    }
                }
                int istatus = pdfProc.waitFor();
                if (istatus != 0)
                {
                    log.error("XPDF pdfinfo proc failed, exit status=" + istatus + ", file=" + sourceTmp);
                }
                if (mediaBox == null)
                {
                    log.error("Sanity check: Did not find \"MediaBox\" line in output of XPDF pdfinfo, file=" + sourceTmp);
                    throw new IllegalArgumentException("Failed to get MediaBox of PDF with pdfinfo, cannot compute thumbnail.");
                }
                else
                {
                    double x0 = Double.parseDouble(mediaBox.group(1));
                    double y0 = Double.parseDouble(mediaBox.group(2));
                    double x1 = Double.parseDouble(mediaBox.group(3));
                    double y1 = Double.parseDouble(mediaBox.group(4));
                    int maxdim = (int) Math.max(Math.abs(x1 - x0), Math.abs(y1 - y0));
                    dpi = Math.min(MAX_DPI, (MAX_PX * 72 / maxdim));
                    log.debug("DPI: pdfinfo method got dpi=" + dpi + " for max dim=" + maxdim + " (points, 1/72\")");
                }
            }
            catch (InterruptedException e)
            {
                log.error("Failed transforming file for preview: ", e);
                throw new IllegalArgumentException("Failed transforming file for thumbnail: ", e);
            }
            catch (NumberFormatException e)
            {
                log.error("Failed interpreting pdfinfo results, check regexp: ", e);
                throw new IllegalArgumentException("Failed transforming file for thumbnail: ", e);
            }
            finally
            {
                if (lr != null)
                {
                    lr.close();
                }
            }

            // Render page 1 using xpdf's pdftoppm.
            // Requires Sun JAI imageio additions to read PPM directly.
            // The output prefix will get "-000001.ppm" appended to it by pdftoppm.
            File outPrefixF = File.createTempFile("prevu", "out");
            String outPrefix = outPrefixF.toString();
            if (!outPrefixF.delete())
            {
                log.error("Unable to delete output file");
            }
            String pdfCmd[] = XPDF_PDFTOPPM_COMMAND.clone();
            pdfCmd[0] = pdftoppmPath;
            pdfCmd[pdfCmd.length - 3] = String.valueOf(dpi);
            pdfCmd[pdfCmd.length - 2] = sourceTmp.toString();
            pdfCmd[pdfCmd.length - 1] = outPrefix;
            File outf = new File(outPrefix + "-000001.ppm");
            log.debug("Running xpdf command: " + Arrays.deepToString(pdfCmd));
            try
            {
                Process pdfProc = Runtime.getRuntime().exec(pdfCmd);
                status = pdfProc.waitFor();
                log.debug("PDFTOPPM output is: " + outf + ", exists=" + outf.exists());
                source = ImageIO.read(outf);
            }
            catch (InterruptedException e)
            {
                log.error("Failed transforming file for preview: ", e);
                throw new IllegalArgumentException("Failed transforming file for preview: ", e);
            }
            finally
            {
                if (!outf.delete())
                {
                    log.error("Unable to delete file");
                }
            }
        }
        finally
        {
            if (!sourceTmp.delete())
            {
                log.error("Unable to delete temporary source");
            }

            if (status != 0)
            {
                log.error("PDF conversion proc failed, exit status=" + status + ", file=" + sourceTmp);
            }
        }

        if (source == null)
        {
            throw new IOException("Unknown failure while transforming file to preview: no image produced.");
        }

        // Scale image and return an in-memory stream
        BufferedImage toenail = scaleImage(source, maxwidth * 3 / 4, maxwidth);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(toenail, "jpeg", baos);
        return new ByteArrayInputStream(baos.toByteArray());
    }

    // scale the image, preserving aspect ratio, if at least one
    // dimension is not between min and max
    private static BufferedImage scaleImage(BufferedImage source,
            int min, int max)
|
||||
{
|
||||
int xsize = source.getWidth(null);
|
||||
int ysize = source.getHeight(null);
|
||||
int msize = Math.max(xsize, ysize);
|
||||
BufferedImage result = null;
|
||||
|
||||
// scale the image if it's outside of requested range.
|
||||
// ALSO pass through if min and max are both 0
|
||||
if ((min == 0 && max == 0) ||
|
||||
(msize >= min && Math.min(xsize, ysize) <= max))
|
||||
{
|
||||
return source;
|
||||
}
|
||||
else
|
||||
{
|
||||
int xnew = xsize * max / msize;
|
||||
int ynew = ysize * max / msize;
|
||||
result = new BufferedImage(xnew, ynew, BufferedImage.TYPE_INT_RGB);
|
||||
Graphics2D g2d = result.createGraphics();
|
||||
g2d.drawImage(source, 0, 0, xnew, ynew, null);
|
||||
return result;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -1,775 +0,0 @@
|
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.app.packager;
|
||||
|
||||
import java.io.BufferedReader;
|
||||
import java.io.File;
|
||||
import java.io.FileNotFoundException;
|
||||
import java.io.IOException;
|
||||
import java.io.InputStreamReader;
|
||||
import java.sql.SQLException;
|
||||
import java.util.List;
|
||||
|
||||
import org.apache.commons.cli.CommandLine;
|
||||
import org.apache.commons.cli.CommandLineParser;
|
||||
import org.apache.commons.cli.HelpFormatter;
|
||||
import org.apache.commons.cli.Options;
|
||||
import org.apache.commons.cli.PosixParser;
|
||||
import org.dspace.authorize.AuthorizeException;
|
||||
import org.dspace.content.DSpaceObject;
|
||||
import org.dspace.content.crosswalk.CrosswalkException;
|
||||
import org.dspace.content.packager.PackageDisseminator;
|
||||
import org.dspace.content.packager.PackageException;
|
||||
import org.dspace.content.packager.PackageParameters;
|
||||
import org.dspace.content.packager.PackageIngester;
|
||||
import org.dspace.core.Constants;
|
||||
import org.dspace.core.Context;
|
||||
import org.dspace.core.PluginManager;
|
||||
import org.dspace.eperson.EPerson;
|
||||
import org.dspace.handle.HandleManager;
|
||||
|
||||
/**
|
||||
* Command-line interface to the Packager plugin.
|
||||
* <p>
|
||||
* This class ONLY exists to provide a CLI for the packager plugins. It does not
|
||||
* "manage" the plugins and it is not called from within DSpace, but the name
|
||||
* follows a DSpace convention.
|
||||
* <p>
|
||||
* It can invoke one of the Submission (SIP) packagers to create a new DSpace
|
||||
* Item out of a package, or a Dissemination (DIP) packager to write an Item out
|
||||
* as a package.
|
||||
* <p>
|
||||
* Usage is as follows:<br>
|
||||
* (Add the -h option to get the command to show its own help)
|
||||
*
|
||||
* <pre>
|
||||
* 1. To submit a SIP (submissions tend to create a *new* object, with a new handle. If you want to restore an object, see -r option below)
|
||||
* dspace packager
|
||||
* -e {ePerson}
|
||||
* -t {PackagerType}
|
||||
* -p {parent-handle} [ -p {parent2} ...]
|
||||
* [-o {name}={value} [ -o {name}={value} ..]]
|
||||
* [-a] --- also recursively ingest all child packages of the initial package
|
||||
* (child pkgs must be referenced from parent pkg)
|
||||
* [-w] --- skip Workflow
|
||||
* {package-filename}
|
||||
*
|
||||
* {PackagerType} must match one of the aliases of the chosen Packager
|
||||
* plugin.
|
||||
*
|
||||
* The "-w" option circumvents Workflow, and is optional. The "-o"
|
||||
* option, which may be repeated, passes options to the packager
|
||||
* (e.g. "metadataOnly" to a DIP packager).
|
||||
*
|
||||
* 2. To restore an AIP (similar to submit mode, but attempts to restore with the handles/parents specified in AIP):
|
||||
* dspace packager
|
||||
* -r --- restores a object from a package info, including the specified handle (will throw an error if handle is already in use)
|
||||
* -e {ePerson}
|
||||
* -t {PackagerType}
|
||||
* [-o {name}={value} [ -o {name}={value} ..]]
|
||||
* [-a] --- also recursively restore all child packages of the initial package
|
||||
* (child pkgs must be referenced from parent pkg)
|
||||
* [-k] --- Skip over errors where objects already exist and Keep Existing objects by default.
|
||||
* Use with -r to only restore objects which do not already exist. By default, -r will throw an error
|
||||
* and rollback all changes when an object is found that already exists.
|
||||
* [-f] --- Force a restore (even if object already exists).
|
||||
* Use with -r to replace an existing object with one from a package (essentially a delete and restore).
|
||||
* By default, -r will throw an error and rollback all changes when an object is found that already exists.
|
||||
* [-i {identifier-handle-of-object}] -- Optional when -f is specified. When replacing an object, you can specify the
|
||||
* object to replace if it cannot be easily determined from the package itself.
|
||||
* {package-filename}
|
||||
*
|
||||
* Restoring is very similar to submitting, except that you are recreating pre-existing objects. So, in a restore, the object(s) are
|
||||
* being recreated based on the details in the AIP. This means that the object is recreated with the same handle and same parent/children
|
||||
* objects. Not all {PackagerTypes} may support a "restore".
|
||||
*
|
||||
* 3. To write out a DIP:
|
||||
* dspace packager
|
||||
* -d
|
||||
* -e {ePerson}
|
||||
* -t {PackagerType}
|
||||
* -i {identifier-handle-of-object}
|
||||
* [-a] --- also recursively disseminate all child objects of this object
|
||||
* [-o {name}={value} [ -o {name}={value} ..]]
|
||||
* {package-filename}
|
||||
*
|
||||
* The "-d" switch chooses a Dissemination packager, and is required.
|
||||
* The "-o" option, which may be repeated, passes options to the packager
|
||||
* (e.g. "metadataOnly" to a DIP packager).
|
||||
* </pre>
|
||||
*
|
||||
* Note that {package-filename} may be "-" for standard input or standard
|
||||
* output, respectively.
|
||||
*
|
||||
* @author Larry Stone
|
||||
* @author Tim Donohue
|
||||
* @version $Revision$
|
||||
*/
|
||||
public class Packager
|
||||
{
|
||||
/* Various private global settings/options */
|
||||
private String packageType = null;
|
||||
private boolean submit = true;
|
||||
private boolean userInteractionEnabled = true;
|
||||
|
||||
// die from illegal command line
|
||||
private static void usageError(String msg)
|
||||
{
|
||||
System.out.println(msg);
|
||||
System.out.println(" (run with -h flag for details)");
|
||||
System.exit(1);
|
||||
}
|
||||
|
||||
public static void main(String[] argv) throws Exception
|
||||
{
|
||||
Options options = new Options();
|
||||
options.addOption("p", "parent", true,
|
||||
"Handle(s) of parent Community or Collection into which to ingest object (repeatable)");
|
||||
options.addOption("e", "eperson", true,
|
||||
"email address of eperson doing importing");
|
||||
options
|
||||
.addOption(
|
||||
"w",
|
||||
"install",
|
||||
false,
|
||||
"disable workflow; install immediately without going through collection's workflow");
|
||||
options.addOption("r", "restore", false, "ingest in \"restore\" mode. Restores a missing object based on the contents in a package.");
|
||||
options.addOption("k", "keep-existing", false, "if an object is found to already exist during a restore (-r), then keep the existing object and continue processing. Can only be used with '-r'. This avoids object-exists errors which are thrown by -r by default.");
|
||||
options.addOption("f", "force-replace", false, "if an object is found to already exist during a restore (-r), then remove it and replace it with the contents of the package. Can only be used with '-r'. This REPLACES the object(s) in the repository with the contents from the package(s).");
|
||||
options.addOption("t", "type", true, "package type or MIMEtype");
|
||||
options
|
||||
.addOption("o", "option", true,
|
||||
"Packager option to pass to plugin, \"name=value\" (repeatable)");
|
||||
options.addOption("d", "disseminate", false,
|
||||
"Disseminate package (output); default is to submit.");
|
||||
options.addOption("s", "submit", false,
|
||||
"Submission package (Input); this is the default. ");
|
||||
options.addOption("i", "identifier", true, "Handle of object to disseminate.");
|
||||
options.addOption("a", "all", false, "also recursively ingest/disseminate any child packages, e.g. all Items within a Collection (not all packagers may support this option!)");
|
||||
options.addOption("h", "help", false, "help (you may also specify '-h -t [type]' for additional help with a specific type of packager)");
|
||||
options.addOption("u", "no-user-interaction", false, "Skips over all user interaction (i.e. [y/n] question prompts) within this script. This flag can be used if you want to save (pipe) a report of all changes to a file, and therefore need to bypass all user interaction.");
|
||||
|
||||
CommandLineParser parser = new PosixParser();
|
||||
CommandLine line = parser.parse(options, argv);
|
||||
|
||||
String sourceFile = null;
|
||||
String eperson = null;
|
||||
String[] parents = null;
|
||||
String identifier = null;
|
||||
PackageParameters pkgParams = new PackageParameters();
|
||||
|
||||
//initialize a new packager -- we'll add all our current params as settings
|
||||
Packager myPackager = new Packager();
|
||||
|
||||
if (line.hasOption('h'))
|
||||
{
|
||||
HelpFormatter myhelp = new HelpFormatter();
|
||||
myhelp.printHelp("Packager [options] package-file|-\n",
|
||||
options);
|
||||
//If user specified a type, also print out the SIP and DIP options
|
||||
// that are specific to that type of packager
|
||||
if (line.hasOption('t'))
|
||||
{
|
||||
System.out.println("\n--------------------------------------------------------------");
|
||||
System.out.println("Additional options for the " + line.getOptionValue('t') + " packager:");
|
||||
System.out.println("--------------------------------------------------------------");
|
||||
System.out.println("(These options may be specified using --option as described above)");
|
||||
|
||||
PackageIngester sip = (PackageIngester) PluginManager
|
||||
.getNamedPlugin(PackageIngester.class, line.getOptionValue('t'));
|
||||
|
||||
if (sip != null)
|
||||
{
|
||||
System.out.println("\n\n" + line.getOptionValue('t') + " Submission (SIP) plugin options:\n");
|
||||
System.out.println(sip.getParameterHelp());
|
||||
}
|
||||
else
|
||||
{
|
||||
System.out.println("\nNo valid Submission plugin found for " + line.getOptionValue('t') + " type.");
|
||||
}
|
||||
|
||||
PackageDisseminator dip = (PackageDisseminator) PluginManager
|
||||
.getNamedPlugin(PackageDisseminator.class, line.getOptionValue('t'));
|
||||
|
||||
if (dip != null)
|
||||
{
|
||||
System.out.println("\n\n" + line.getOptionValue('t') + " Dissemination (DIP) plugin options:\n");
|
||||
System.out.println(dip.getParameterHelp());
|
||||
}
|
||||
else
|
||||
{
|
||||
System.out.println("\nNo valid Dissemination plugin found for " + line.getOptionValue('t') + " type.");
|
||||
}
|
||||
|
||||
}
|
||||
else //otherwise, display list of valid packager types
|
||||
{
|
||||
System.out.println("\nAvailable Submission Package (SIP) types:");
|
||||
String pn[] = PluginManager
|
||||
.getAllPluginNames(PackageIngester.class);
|
||||
for (int i = 0; i < pn.length; ++i)
|
||||
{
|
||||
System.out.println(" " + pn[i]);
|
||||
}
|
||||
System.out
|
||||
.println("\nAvailable Dissemination Package (DIP) types:");
|
||||
pn = PluginManager.getAllPluginNames(PackageDisseminator.class);
|
||||
for (int i = 0; i < pn.length; ++i)
|
||||
{
|
||||
System.out.println(" " + pn[i]);
|
||||
}
|
||||
}
|
||||
System.exit(0);
|
||||
}
|
||||
|
||||
//look for flag to disable all user interaction
|
||||
if(line.hasOption('u'))
|
||||
{
|
||||
myPackager.userInteractionEnabled = false;
|
||||
}
|
||||
if (line.hasOption('w'))
|
||||
{
|
||||
pkgParams.setWorkflowEnabled(false);
|
||||
}
|
||||
if (line.hasOption('r'))
|
||||
{
|
||||
pkgParams.setRestoreModeEnabled(true);
|
||||
}
|
||||
//keep-existing is only valid in restoreMode (-r) -- otherwise ignore -k option.
|
||||
if (line.hasOption('k') && pkgParams.restoreModeEnabled())
|
||||
{
|
||||
pkgParams.setKeepExistingModeEnabled(true);
|
||||
}
|
||||
//force-replace is only valid in restoreMode (-r) -- otherwise ignore -f option.
|
||||
if (line.hasOption('f') && pkgParams.restoreModeEnabled())
|
||||
{
|
||||
pkgParams.setReplaceModeEnabled(true);
|
||||
}
|
||||
if (line.hasOption('e'))
|
||||
{
|
||||
eperson = line.getOptionValue('e');
|
||||
}
|
||||
if (line.hasOption('p'))
|
||||
{
|
||||
parents = line.getOptionValues('p');
|
||||
}
|
||||
if (line.hasOption('t'))
|
||||
{
|
||||
myPackager.packageType = line.getOptionValue('t');
|
||||
}
|
||||
if (line.hasOption('i'))
|
||||
{
|
||||
identifier = line.getOptionValue('i');
|
||||
}
|
||||
if (line.hasOption('a'))
|
||||
{
|
||||
//enable 'recursiveMode' param to packager implementations, in case it helps with packaging or ingestion process
|
||||
pkgParams.setRecursiveModeEnabled(true);
|
||||
}
|
||||
String files[] = line.getArgs();
|
||||
if (files.length > 0)
|
||||
{
|
||||
sourceFile = files[0];
|
||||
}
|
||||
if (line.hasOption('d'))
|
||||
{
|
||||
myPackager.submit = false;
|
||||
}
|
||||
if (line.hasOption('o'))
|
||||
{
|
||||
String popt[] = line.getOptionValues('o');
|
||||
for (int i = 0; i < popt.length; ++i)
|
||||
{
|
||||
String pair[] = popt[i].split("\\=", 2);
|
||||
if (pair.length == 2)
|
||||
{
|
||||
pkgParams.addProperty(pair[0].trim(), pair[1].trim());
|
||||
}
|
||||
else if (pair.length == 1)
|
||||
{
|
||||
pkgParams.addProperty(pair[0].trim(), "");
|
||||
}
|
||||
else
|
||||
{
|
||||
System.err
|
||||
.println("Warning: Illegal package option format: \""
|
||||
+ popt[i] + "\"");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Sanity checks on arg list: required args
|
||||
// REQUIRED: sourceFile, ePerson (-e), packageType (-t)
|
||||
if (sourceFile == null || eperson == null || myPackager.packageType == null)
|
||||
{
|
||||
System.err.println("Error - missing a REQUIRED argument or option.\n");
|
||||
HelpFormatter myhelp = new HelpFormatter();
|
||||
myhelp.printHelp("PackageManager [options] package-file|-\n", options);
|
||||
System.exit(0);
|
||||
}
|
||||
|
||||
// find the EPerson, assign to context
|
||||
Context context = new Context();
|
||||
EPerson myEPerson = null;
|
||||
myEPerson = EPerson.findByEmail(context, eperson);
|
||||
if (myEPerson == null)
|
||||
{
|
||||
usageError("Error, eperson cannot be found: " + eperson);
|
||||
}
|
||||
context.setCurrentUser(myEPerson);
|
||||
|
||||
|
||||
//If we are in REPLACE mode
|
||||
if(pkgParams.replaceModeEnabled())
|
||||
{
|
||||
PackageIngester sip = (PackageIngester) PluginManager
|
||||
.getNamedPlugin(PackageIngester.class, myPackager.packageType);
|
||||
if (sip == null)
|
||||
{
|
||||
usageError("Error, Unknown package type: " + myPackager.packageType);
|
||||
}
|
||||
|
||||
DSpaceObject objToReplace = null;
|
||||
|
||||
//if a specific identifier was specified, make sure it is valid
|
||||
if(identifier!=null && identifier.length()>0)
|
||||
{
|
||||
objToReplace = HandleManager.resolveToObject(context, identifier);
|
||||
if (objToReplace == null)
|
||||
{
|
||||
throw new IllegalArgumentException("Bad identifier/handle -- "
|
||||
+ "Cannot resolve handle \"" + identifier + "\"");
|
||||
}
|
||||
}
|
||||
|
||||
String choiceString = null;
|
||||
if(myPackager.userInteractionEnabled)
|
||||
{
|
||||
BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
|
||||
System.out.println("\n\nWARNING -- You are running the packager in REPLACE mode.");
|
||||
System.out.println("\nREPLACE mode may be potentially dangerous as it will automatically remove and replace contents within DSpace.");
|
||||
System.out.println("We highly recommend backing up all your DSpace contents (files & database) before continuing.");
|
||||
System.out.print("\nWould you like to continue? [y/n]: ");
|
||||
choiceString = input.readLine();
|
||||
}
|
||||
else
|
||||
{
|
||||
//user interaction disabled -- default answer to 'yes', otherwise script won't continue
|
||||
choiceString = "y";
|
||||
}
|
||||
|
||||
if (choiceString.equalsIgnoreCase("y"))
|
||||
{
|
||||
System.out.println("Beginning replacement process...");
|
||||
|
||||
try
|
||||
{
|
||||
//replace the object from the source file
|
||||
myPackager.replace(context, sip, pkgParams, sourceFile, objToReplace);
|
||||
|
||||
//commit all changes & exit successfully
|
||||
context.complete();
|
||||
System.exit(0);
|
||||
}
|
||||
catch (Exception e)
|
||||
{
|
||||
// abort all operations
|
||||
e.printStackTrace();
|
||||
context.abort();
|
||||
System.out.println(e);
|
||||
System.exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
//else if normal SUBMIT mode (or basic RESTORE mode -- which is a special type of submission)
|
||||
else if (myPackager.submit || pkgParams.restoreModeEnabled())
|
||||
{
|
||||
PackageIngester sip = (PackageIngester) PluginManager
|
||||
.getNamedPlugin(PackageIngester.class, myPackager.packageType);
|
||||
if (sip == null)
|
||||
{
|
||||
usageError("Error, Unknown package type: " + myPackager.packageType);
|
||||
}
|
||||
|
||||
// validate each parent arg (if any)
|
||||
DSpaceObject parentObjs[] = null;
|
||||
if(parents!=null)
|
||||
{
|
||||
System.out.println("Destination parents:");
|
||||
|
||||
parentObjs = new DSpaceObject[parents.length];
|
||||
for (int i = 0; i < parents.length; i++)
|
||||
{
|
||||
// sanity check: did handle resolve?
|
||||
parentObjs[i] = HandleManager.resolveToObject(context,
|
||||
parents[i]);
|
||||
if (parentObjs[i] == null)
|
||||
{
|
||||
throw new IllegalArgumentException(
|
||||
"Bad parent list -- "
|
||||
+ "Cannot resolve parent handle \""
|
||||
+ parents[i] + "\"");
|
||||
}
|
||||
System.out.println((i == 0 ? "Owner: " : "Parent: ")
|
||||
+ parentObjs[i].getHandle());
|
||||
}
|
||||
}
|
||||
|
||||
try
|
||||
{
|
||||
//ingest the object from the source file
|
||||
myPackager.ingest(context, sip, pkgParams, sourceFile, parentObjs);
|
||||
|
||||
//commit all changes & exit successfully
|
||||
context.complete();
|
||||
System.exit(0);
|
||||
}
|
||||
catch (Exception e)
|
||||
{
|
||||
// abort all operations
|
||||
e.printStackTrace();
|
||||
context.abort();
|
||||
System.out.println(e);
|
||||
System.exit(1);
|
||||
}
|
||||
}// else, if DISSEMINATE mode
|
||||
else
|
||||
{
|
||||
//retrieve specified package disseminator
|
||||
PackageDisseminator dip = (PackageDisseminator) PluginManager
|
||||
.getNamedPlugin(PackageDisseminator.class, myPackager.packageType);
|
||||
if (dip == null)
|
||||
{
|
||||
usageError("Error, Unknown package type: " + myPackager.packageType);
|
||||
}
|
||||
|
||||
DSpaceObject dso = HandleManager.resolveToObject(context, identifier);
|
||||
if (dso == null)
|
||||
{
|
||||
throw new IllegalArgumentException("Bad identifier/handle -- "
|
||||
+ "Cannot resolve handle \"" + identifier + "\"");
|
||||
}
|
||||
|
||||
//disseminate the requested object
|
||||
myPackager.disseminate(context, dip, dso, pkgParams, sourceFile);
|
||||
}
|
||||
System.exit(0);
|
||||
}
|
||||
|
||||
/**
|
||||
* Ingest one or more DSpace objects from package(s) based on the
|
||||
* options passed to the 'packager' script. This method is called
|
||||
* for both 'submit' (-s) and 'restore' (-r) modes.
|
||||
* <p>
|
||||
* Please note that replace (-r -f) mode calls the replace() method instead.
|
||||
*
|
||||
* @param context DSpace Context
|
||||
* @param sip PackageIngester which will actually ingest the package
|
||||
* @param pkgParams Parameters to pass to individual packager instances
|
||||
* @param sourceFile location of the source package to ingest
|
||||
* @param parentObjs Parent DSpace object(s) to attach new object to
|
||||
* @throws IOException
|
||||
* @throws SQLException
|
||||
* @throws FileNotFoundException
|
||||
* @throws AuthorizeException
|
||||
* @throws CrosswalkException
|
||||
* @throws PackageException
|
||||
*/
|
||||
protected void ingest(Context context, PackageIngester sip, PackageParameters pkgParams, String sourceFile, DSpaceObject parentObjs[])
|
||||
throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException, PackageException
|
||||
{
|
||||
// make sure we have an input file
|
||||
File pkgFile = new File(sourceFile);
|
||||
|
||||
if(!pkgFile.exists())
|
||||
{
|
||||
System.out.println("\nERROR: Package located at " + sourceFile + " does not exist!");
|
||||
System.exit(1);
|
||||
}
|
||||
|
||||
System.out.println("\nIngesting package located at " + sourceFile);
|
||||
|
||||
//find first parent (if specified) -- this will be the "owner" of the object
|
||||
DSpaceObject parent = null;
|
||||
if(parentObjs!=null && parentObjs.length>0)
|
||||
{
|
||||
parent = parentObjs[0];
|
||||
}
|
||||
//NOTE: at this point, Parent may be null -- in which case it is up to the PackageIngester
|
||||
// to either determine the Parent (from package contents) or throw an error.
|
||||
|
||||
|
||||
//If we are doing a recursive ingest, call ingestAll()
|
||||
if(pkgParams.recursiveModeEnabled())
|
||||
{
|
||||
System.out.println("\nAlso ingesting all referenced packages (recursive mode)..");
|
||||
System.out.println("This may take a while, please check your logs for ongoing status while we process each package.");
|
||||
|
||||
//ingest first package & recursively ingest anything else that package references (child packages, etc)
|
||||
List<DSpaceObject> dsoResults = sip.ingestAll(context, parent, pkgFile, pkgParams, null);
|
||||
|
||||
if(dsoResults!=null)
|
||||
{
|
||||
//Report total objects created
|
||||
System.out.println("\nCREATED a total of " + dsoResults.size() + " DSpace Objects.");
|
||||
|
||||
String choiceString = null;
|
||||
//Ask if user wants full list printed to command line, as this may be rather long.
|
||||
if(this.userInteractionEnabled)
|
||||
{
|
||||
BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
|
||||
System.out.print("\nWould you like to view a list of all objects that were created? [y/n]: ");
|
||||
choiceString = input.readLine();
|
||||
}
|
||||
else
|
||||
{
|
||||
// user interaction disabled -- default answer to 'yes', as
|
||||
// we want to provide user with as detailed a report as possible.
|
||||
choiceString = "y";
|
||||
}
|
||||
|
||||
// Provide detailed report if user answered 'yes'
|
||||
if (choiceString.equalsIgnoreCase("y"))
|
||||
{
|
||||
System.out.println("\n\n");
|
||||
for(DSpaceObject result : dsoResults)
|
||||
{
|
||||
if(pkgParams.restoreModeEnabled())
|
||||
{
|
||||
System.out.println("RESTORED DSpace " + Constants.typeText[result.getType()] +
|
||||
" [ hdl=" + result.getHandle() + ", dbID=" + result.getID() + " ] ");
|
||||
}
|
||||
else
|
||||
{
|
||||
System.out.println("CREATED new DSpace " + Constants.typeText[result.getType()] +
|
||||
" [ hdl=" + result.getHandle() + ", dbID=" + result.getID() + " ] ");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
}
|
||||
else
|
||||
{
|
||||
|
||||
//otherwise, just one package to ingest
|
||||
try
|
||||
{
|
||||
|
||||
DSpaceObject dso = sip.ingest(context, parent, pkgFile, pkgParams, null);
|
||||
|
||||
if(dso!=null)
|
||||
{
|
||||
if(pkgParams.restoreModeEnabled())
|
||||
{
|
||||
System.out.println("RESTORED DSpace " + Constants.typeText[dso.getType()] +
|
||||
" [ hdl=" + dso.getHandle() + ", dbID=" + dso.getID() + " ] ");
|
||||
}
|
||||
else
|
||||
{
|
||||
System.out.println("CREATED new DSpace " + Constants.typeText[dso.getType()] +
|
||||
" [ hdl=" + dso.getHandle() + ", dbID=" + dso.getID() + " ] ");
|
||||
}
|
||||
}
|
||||
}
|
||||
catch(IllegalStateException ie)
|
||||
{
|
||||
// NOTE: if we encounter an IllegalStateException, this means the
|
||||
// handle is already in use and this object already exists.
|
||||
|
||||
//if we are skipping over (i.e. keeping) existing objects
|
||||
if(pkgParams.keepExistingModeEnabled())
|
||||
{
|
||||
System.out.println("\nSKIPPED processing package '" + pkgFile + "', as an Object already exists with this handle.");
|
||||
}
|
||||
else // Pass this exception on -- which essentially causes a full rollback of all changes (this is the default)
|
||||
{
|
||||
throw ie;
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* Disseminate one or more DSpace objects into package(s) based on the
|
||||
* options passed to the 'packager' script
|
||||
*
|
||||
* @param context DSpace context
|
||||
* @param dip PackageDisseminator which will actually create the package
|
||||
* @param dso DSpace Object to disseminate as a package
|
||||
* @param pkgParams Parameters to pass to individual packager instances
|
||||
* @param outputFile File where final package should be saved
|
||||
* @param identifier identifier of main DSpace object to disseminate
|
||||
* @throws IOException
|
||||
* @throws SQLException
|
||||
* @throws FileNotFoundException
|
||||
* @throws AuthorizeException
|
||||
* @throws CrosswalkException
|
||||
* @throws PackageException
|
||||
*/
|
||||
protected void disseminate(Context context, PackageDisseminator dip, DSpaceObject dso, PackageParameters pkgParams, String outputFile)
|
||||
throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException, PackageException
|
||||
{
|
||||
// initialize output file
|
||||
File pkgFile = new File(outputFile);
|
||||
|
||||
System.out.println("\nDisseminating DSpace " + Constants.typeText[dso.getType()] +
|
||||
" [ hdl=" + dso.getHandle() + " ] to " + outputFile);
|
||||
|
||||
//If we are doing a recursive dissemination of this object & all its child objects, call disseminateAll()
|
||||
if(pkgParams.recursiveModeEnabled())
|
||||
{
|
||||
System.out.println("\nAlso disseminating all child objects (recursive mode)..");
|
||||
System.out.println("This may take a while, please check your logs for ongoing status while we process each package.");
|
||||
|
||||
//disseminate initial object & recursively disseminate all child objects as well
|
||||
List<File> fileResults = dip.disseminateAll(context, dso, pkgParams, pkgFile);
|
||||
|
||||
if(fileResults!=null)
|
||||
{
|
||||
//Report total files created
|
||||
System.out.println("\nCREATED a total of " + fileResults.size() + " dissemination package files.");
|
||||
|
||||
String choiceString = null;
|
||||
//Ask if user wants full list printed to command line, as this may be rather long.
|
||||
if(this.userInteractionEnabled)
|
||||
{
|
||||
BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
|
||||
System.out.print("\nWould you like to view a list of all files that were created? [y/n]: ");
|
||||
choiceString = input.readLine();
|
||||
}
|
||||
else
|
||||
{
|
||||
// user interaction disabled -- default answer to 'yes', as
|
||||
// we want to provide user with as detailed a report as possible.
|
||||
choiceString = "y";
|
||||
}
|
||||
|
||||
// Provide detailed report if user answered 'yes'
|
||||
if (choiceString.equalsIgnoreCase("y"))
|
||||
{
|
||||
System.out.println("\n\n");
|
||||
for(File result : fileResults)
|
||||
{
|
||||
System.out.println("CREATED package file: " + result.getCanonicalPath());
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
else
|
||||
{
|
||||
//otherwise, just disseminate a single object to a single package file
|
||||
dip.disseminate(context, dso, pkgParams, pkgFile);
|
||||
|
||||
if(pkgFile!=null && pkgFile.exists())
|
||||
{
|
||||
System.out.println("\nCREATED package file: " + pkgFile.getCanonicalPath());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
|
||||
/**
|
||||
* Replace an one or more existing DSpace objects with the contents of
|
||||
* specified package(s) based on the options passed to the 'packager' script.
|
||||
     * This method is only called for full replaces ('-r -f' options specified)
     *
     * @param context DSpace Context
     * @param sip PackageIngester which will actually replace the object with the package
     * @param pkgParams Parameters to pass to individual packager instances
     * @param sourceFile location of the source package to ingest as the replacement
     * @param objToReplace DSpace object to replace (may be null if it will be specified in the package itself)
     * @throws IOException
     * @throws SQLException
     * @throws FileNotFoundException
     * @throws AuthorizeException
     * @throws CrosswalkException
     * @throws PackageException
     */
    protected void replace(Context context, PackageIngester sip, PackageParameters pkgParams, String sourceFile, DSpaceObject objToReplace)
            throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException, PackageException
    {
        // make sure we have an input file
        File pkgFile = new File(sourceFile);

        if (!pkgFile.exists())
        {
            System.out.println("\nPackage located at " + sourceFile + " does not exist!");
            System.exit(1);
        }

        System.out.println("\nReplacing DSpace object(s) with package located at " + sourceFile);
        if (objToReplace != null)
        {
            System.out.println("Will replace existing DSpace " + Constants.typeText[objToReplace.getType()] +
                    " [ hdl=" + objToReplace.getHandle() + " ]");
        }
        // NOTE: At this point, objToReplace may be null. If it is null, it is up to the PackageIngester
        // to determine which Object needs to be replaced (based on the handle specified in the pkg, etc.)

        // If we are doing a recursive replace, call replaceAll()
        if (pkgParams.recursiveModeEnabled())
        {
            // ingest first object using package & recursively replace anything else that package references (child objects, etc.)
            List<DSpaceObject> dsoResults = sip.replaceAll(context, objToReplace, pkgFile, pkgParams);

            if (dsoResults != null)
            {
                // Report total objects replaced
                System.out.println("\nREPLACED a total of " + dsoResults.size() + " DSpace Objects.");

                String choiceString = null;
                // Ask if the user wants the full list printed to the command line, as this may be rather long.
                if (this.userInteractionEnabled)
                {
                    BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
                    System.out.print("\nWould you like to view a list of all objects that were replaced? [y/n]: ");
                    choiceString = input.readLine();
                }
                else
                {
                    // User interaction disabled -- default the answer to 'yes', as
                    // we want to provide the user with as detailed a report as possible.
                    choiceString = "y";
                }

                // Provide a detailed report if the user answered 'yes'
                if (choiceString.equalsIgnoreCase("y"))
                {
                    System.out.println("\n\n");
                    for (DSpaceObject result : dsoResults)
                    {
                        System.out.println("REPLACED DSpace " + Constants.typeText[result.getType()] +
                                " [ hdl=" + result.getHandle() + " ] ");
                    }
                }
            }
        }
        else
        {
            // Otherwise, just one object to replace
            DSpaceObject dso = sip.replace(context, objToReplace, pkgFile, pkgParams);

            if (dso != null)
            {
                System.out.println("REPLACED DSpace " + Constants.typeText[dso.getType()] +
                        " [ hdl=" + dso.getHandle() + " ] ");
            }
        }
    }

}
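The report-printing branch above defaults the answer to "y" whenever user interaction is disabled. A minimal standalone sketch of that prompt-or-default pattern (the class and method names here are hypothetical, not part of DSpace):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class PromptSketch {
    // Mirrors the prompt logic in replace(): when interaction is disabled,
    // the answer defaults to "y" so the fullest possible report is printed.
    static String askOrDefault(BufferedReader input, boolean interactive) {
        if (!interactive) {
            return "y";
        }
        try {
            return input.readLine();
        } catch (IOException e) {
            return "y";
        }
    }

    public static void main(String[] args) {
        BufferedReader in = new BufferedReader(new StringReader("n"));
        System.out.println(askOrDefault(in, true));   // the user's answer: n
        System.out.println(askOrDefault(in, false));  // forced default: y
    }
}
```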
@@ -1,16 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */

/**
 * <p>Tools for exporting and importing DSpace objects (Community, Collection,
 * Item, etc.) wrapped in various kinds of packaging.</p>
 *
 * @see org.dspace.content.packager
 */

package org.dspace.app.packager;
@@ -1,324 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.sfx;

import java.io.File;
import java.io.IOException;
import java.net.URLEncoder;

import org.apache.commons.lang.StringUtils;
import org.apache.log4j.Logger;
import org.w3c.dom.Document;

import org.dspace.content.DCPersonName;
import org.dspace.content.DCValue;
import org.dspace.content.Item;
import org.dspace.core.Constants;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.ParserConfigurationException;

import org.w3c.dom.NamedNodeMap;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;


public class SFXFileReader {

    /** The SFX configuration file */
    private static Document doc;

    /** log4j logger */
    private static final Logger log = Logger.getLogger(SFXFileReader.class);


    /**
     * Loads the SFX configuration file.
     *
     * @param fileName The name of the SFX configuration file
     * @param item The item to process
     *
     * @return the SFX string
     * @throws IOException
     */
    public static String loadSFXFile(String fileName, Item item) throws IOException
    {
        // Parse the XML file -> an XML document will be built
        if (doc == null)
        {
            doc = parseFile(fileName);
        }

        // Return the final SFX query string
        return doNodes(doc, item);
    }

    /**
     * Parses the XML file and returns the XML document.
     *
     * @param fileName XML file to parse
     * @return XML document, or <B>null</B> if an error occurred
     */
    public static Document parseFile(String fileName) {
        log.info("Parsing XML file... " + fileName);
        DocumentBuilder docBuilder;
        Document doc = null;
        DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
        docBuilderFactory.setIgnoringElementContentWhitespace(true);
        try {
            docBuilder = docBuilderFactory.newDocumentBuilder();
        }
        catch (ParserConfigurationException e) {
            log.error("Wrong parser configuration: " + e.getMessage());
            return null;
        }
        File sourceFile = new File(fileName);
        try {
            doc = docBuilder.parse(sourceFile);
        }
        catch (SAXException e) {
            log.error("Wrong XML file structure: " + e.getMessage());
            return null;
        }
        catch (IOException e) {
            log.error("Could not read source file: " + e.getMessage());
        }
        log.info("XML file parsed");
        return doc;
    }

    /**
     * Processes the item against the SFX configuration.
     *
     * @param node the root node of the SFX configuration
     * @param item the item to process
     * @return the SFX query string built from the item's metadata
     * @throws IOException
     */
    public static String doNodes(Node node, Item item) throws IOException
    {
        if (node == null)
        {
            log.error("Empty node");
            return null;
        }
        Node e = getElement(node);
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        String sfxfield = "";
        int i = 0;

        while ((i < len) && StringUtils.isEmpty(sfxfield))
        {
            Node nd = nl.item(i);
            if ((nd == null) || isEmptyTextNode(nd))
            {
                i++;
                continue;
            }
            String tagName = nd.getNodeName();
            if (tagName.equals("query-pairs"))
            {
                sfxfield = processFields(nd, item);
            }
            i++;
        }
        log.info("Processed fields: " + sfxfield);
        return sfxfield;
    }

    /**
     * Processes the configured fields, building up the SFX query string.
     *
     * @param e the query-pairs node
     * @param item the item whose metadata is queried
     * @return the accumulated SFX query string
     * @throws IOException
     */
    private static String processFields(Node e, Item item) throws IOException
    {
        NodeList cl = e.getChildNodes();
        int lench = cl.getLength();
        String myquery = "";

        for (int j = 0; j < lench; j++)
        {
            Node nch = cl.item(j);
            String querystring = "";
            String schema = "";
            String qualifier = "";
            String element = "";

            if (nch.getNodeName().equals("field"))
            {
                NodeList pl = nch.getChildNodes();
                int plen = pl.getLength();
                int finish = 0;
                for (int k = 0; k < plen; k++)
                {
                    Node vn = pl.item(k);
                    String vName = vn.getNodeName();
                    if (vName.equals("querystring"))
                    {
                        querystring = getValue(vn);
                        finish++;
                    }
                    else if (vName.equals("dc-schema"))
                    {
                        schema = getValue(vn);
                        finish++;
                    }
                    else if (vName.equals("dc-element"))
                    {
                        element = getValue(vn);
                        finish++;
                    }
                    else if (vName.equals("dc-qualifier"))
                    {
                        qualifier = getValue(vn);
                        finish++;
                        if (StringUtils.isEmpty(qualifier))
                        {
                            qualifier = null;
                        }
                    }
                    if (finish == 4)
                    {
                        DCValue[] dcvalue = item.getMetadata(schema, element, qualifier, Item.ANY);
                        if (dcvalue.length > 0)
                        {
                            // Issued date
                            if (element.equals("date") && qualifier.equals("issued"))
                            {
                                String fullDate = dcvalue[0].value;
                                // Remove the time if there is one - day is the greatest granularity for SFX
                                if (fullDate.length() > 10)
                                {
                                    fullDate = fullDate.substring(0, 10);
                                }
                                if (myquery.equals(""))
                                {
                                    myquery = querystring + URLEncoder.encode(fullDate, Constants.DEFAULT_ENCODING);
                                }
                                else
                                {
                                    myquery = myquery + "&" + querystring + URLEncoder.encode(fullDate, Constants.DEFAULT_ENCODING);
                                }
                            }
                            else
                            {
                                // Contributor author
                                if (element.equals("contributor") && qualifier.equals("author"))
                                {
                                    DCPersonName dpn = new DCPersonName(dcvalue[0].value);
                                    String dpnName = dcvalue[0].value;

                                    if (querystring.endsWith("aulast=")) { dpnName = dpn.getLastName(); }
                                    else if (querystring.endsWith("aufirst=")) { dpnName = dpn.getFirstNames(); }

                                    if (myquery.equals(""))
                                    {
                                        myquery = querystring + URLEncoder.encode(dpnName, Constants.DEFAULT_ENCODING);
                                    }
                                    else
                                    {
                                        myquery = myquery + "&" + querystring + URLEncoder.encode(dpnName, Constants.DEFAULT_ENCODING);
                                    }
                                }
                                else
                                {
                                    if (myquery.equals(""))
                                    {
                                        myquery = querystring + URLEncoder.encode(dcvalue[0].value, Constants.DEFAULT_ENCODING);
                                    }
                                    else
                                    {
                                        myquery = myquery + "&" + querystring + URLEncoder.encode(dcvalue[0].value, Constants.DEFAULT_ENCODING);
                                    }
                                }
                            }
                        } // if dcvalue.length > 0

                        finish = 0;
                        querystring = "";
                        schema = "";
                        element = "";
                        qualifier = "";
                    } // if finish == 4
                } // for k
            } // if field
        } // for j
        return myquery;
    }

    /**
     * Returns the first child element node.
     *
     * @param node a node (an XML tag)
     * @return the first child element node, otherwise null
     */
    public static Node getElement(Node node)
    {
        NodeList child = node.getChildNodes();
        int length = child.getLength();
        for (int i = 0; i < length; i++)
        {
            Node kid = child.item(i);
            if (kid.getNodeType() == Node.ELEMENT_NODE)
            {
                return kid;
            }
        }
        return null;
    }

    /** Is this an empty text node? */
    public static boolean isEmptyTextNode(Node nd)
    {
        boolean isEmpty = false;
        if (nd.getNodeType() == Node.TEXT_NODE)
        {
            String text = nd.getNodeValue().trim();
            if (text.length() == 0)
            {
                isEmpty = true;
            }
        }
        return isEmpty;
    }

    /**
     * Returns the value of the node's attribute named {@code name}.
     */
    public static String getAttribute(Node e, String name)
    {
        NamedNodeMap attrs = e.getAttributes();
        int len = attrs.getLength();
        if (len > 0)
        {
            for (int i = 0; i < len; i++)
            {
                Node attr = attrs.item(i);
                if (name.equals(attr.getNodeName()))
                {
                    return attr.getNodeValue().trim();
                }
            }
        }
        // No such attribute
        return null;
    }

    /**
     * Returns the value found in the text node (if any) in the
     * node list that's passed in.
     */
    public static String getValue(Node node)
    {
        NodeList child = node.getChildNodes();
        for (int i = 0; i < child.getLength(); i++)
        {
            Node kid = child.item(i);
            short type = kid.getNodeType();
            if (type == Node.TEXT_NODE)
            {
                return kid.getNodeValue().trim();
            }
        }
        // Didn't find a text node
        return null;
    }
}
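The accumulation pattern in processFields() above -- concatenate each configured querystring prefix with the URL-encoded metadata value, joining pairs with '&' -- can be sketched standalone. The class name, prefixes, and values below are illustrative, not from DSpace; this sketch uses the Charset overload of URLEncoder (Java 10+) where the original used Constants.DEFAULT_ENCODING:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class SfxQuerySketch {
    // Appends one prefix+encoded-value pair to the query, mirroring
    // the myquery accumulation in SFXFileReader.processFields().
    static String append(String query, String prefix, String value) {
        String pair = prefix + URLEncoder.encode(value, StandardCharsets.UTF_8);
        return query.isEmpty() ? pair : query + "&" + pair;
    }

    public static void main(String[] args) {
        String q = "";
        q = append(q, "rft.date=", "2011-03-01");
        q = append(q, "rft.aulast=", "O'Brien");
        System.out.println(q);
        // rft.date=2011-03-01&rft.aulast=O%27Brien
    }
}
```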
@@ -1,251 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.sitemap;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintStream;
import java.util.Date;
import java.util.zip.GZIPOutputStream;

/**
 * Base class for creating sitemaps of various kinds. A sitemap consists of one
 * or more files which list significant URLs on a site for search engines to
 * crawl efficiently. Dates of modification may also be included. A sitemap
 * index file that links to each of the sitemap files is also generated. It is
 * this index file that search engines should be directed towards.
 * <P>
 * Provides most of the required functionality; subclasses need only implement
 * a few methods that specify the "boilerplate" and the text for including URLs.
 * <P>
 * Typical usage:
 * <pre>
 * AbstractGenerator g = new FooGenerator(...);
 * while (...) {
 *     g.addURL(url, date);
 * }
 * g.finish();
 * </pre>
 *
 * @author Robert Tansley
 */
public abstract class AbstractGenerator
{
    /** Number of files written so far */
    protected int fileCount;

    /** Number of bytes written to current file */
    protected int bytesWritten;

    /** Number of URLs written to current file */
    protected int urlsWritten;

    /** Directory files are written to */
    protected File outputDir;

    /** Current output */
    protected PrintStream currentOutput;

    /** Size in bytes of trailing boilerplate */
    private int trailingByteCount;

    /**
     * Initialize this generator to write to the given directory. This must be
     * called by any subclass constructor.
     *
     * @param outputDirIn
     *            directory to write sitemap files to
     */
    public AbstractGenerator(File outputDirIn)
    {
        fileCount = 0;
        outputDir = outputDirIn;
        trailingByteCount = getTrailingBoilerPlate().length();
        currentOutput = null;
    }

    /**
     * Start writing a new sitemap file.
     *
     * @throws IOException
     *             if an error occurs creating the file
     */
    protected void startNewFile() throws IOException
    {
        String lbp = getLeadingBoilerPlate();

        OutputStream fo = new FileOutputStream(new File(outputDir,
                getFilename(fileCount)));

        if (useCompression())
        {
            fo = new GZIPOutputStream(fo);
        }

        currentOutput = new PrintStream(fo);
        currentOutput.print(lbp);
        bytesWritten = lbp.length();
        urlsWritten = 0;
    }

    /**
     * Add the given URL to the sitemap.
     *
     * @param url
     *            Full URL to add
     * @param lastMod
     *            Date URL was last modified, or {@code null}
     * @throws IOException
     *             if an error occurs writing
     */
    public void addURL(String url, Date lastMod) throws IOException
    {
        // Kick things off if this is the first call
        if (currentOutput == null)
        {
            startNewFile();
        }

        String newURLText = getURLText(url, lastMod);

        if (bytesWritten + newURLText.length() + trailingByteCount > getMaxSize()
                || urlsWritten + 1 > getMaxURLs())
        {
            closeCurrentFile();
            startNewFile();
        }

        currentOutput.print(newURLText);
        bytesWritten += newURLText.length();
        urlsWritten++;
    }

    /**
     * Finish with the current sitemap file.
     *
     * @throws IOException
     *             if an error occurs writing
     */
    protected void closeCurrentFile() throws IOException
    {
        currentOutput.print(getTrailingBoilerPlate());
        currentOutput.close();
        fileCount++;
    }

    /**
     * Complete writing sitemap files and write the index files. This is invoked
     * when all calls to {@link AbstractGenerator#addURL(String, Date)} have
     * been completed, and invalidates the generator.
     *
     * @return number of sitemap files written.
     *
     * @throws IOException
     *             if an error occurs writing
     */
    public int finish() throws IOException
    {
        closeCurrentFile();

        OutputStream fo = new FileOutputStream(new File(outputDir,
                getIndexFilename()));

        if (useCompression())
        {
            fo = new GZIPOutputStream(fo);
        }

        PrintStream out = new PrintStream(fo);
        writeIndex(out, fileCount);
        out.close();

        return fileCount;
    }

    /**
     * Return marked-up text to be included in a sitemap about a given URL.
     *
     * @param url
     *            URL to add information about
     * @param lastMod
     *            date URL was last modified, or {@code null} if unknown or not
     *            applicable
     * @return the mark-up to include
     */
    public abstract String getURLText(String url, Date lastMod);

    /**
     * Return the boilerplate at the top of a sitemap file.
     *
     * @return The boilerplate markup.
     */
    public abstract String getLeadingBoilerPlate();

    /**
     * Return the boilerplate at the end of a sitemap file.
     *
     * @return The boilerplate markup.
     */
    public abstract String getTrailingBoilerPlate();

    /**
     * Return the maximum size in bytes that an individual sitemap file should
     * be.
     *
     * @return the size in bytes.
     */
    public abstract int getMaxSize();

    /**
     * Return the maximum number of URLs that an individual sitemap file should
     * contain.
     *
     * @return the maximum number of URLs.
     */
    public abstract int getMaxURLs();

    /**
     * Return whether the written sitemap files and index should be
     * GZIP-compressed.
     *
     * @return {@code true} if GZIP compression should be used, {@code false}
     *         otherwise.
     */
    public abstract boolean useCompression();

    /**
     * Return the filename a sitemap at the given index should be stored at.
     *
     * @param number
     *            index of the sitemap file (zero is first).
     * @return the filename to write the sitemap to.
     */
    public abstract String getFilename(int number);

    /**
     * Get the filename the index should be written to.
     *
     * @return the filename of the index.
     */
    public abstract String getIndexFilename();

    /**
     * Write the index file.
     *
     * @param output
     *            stream to write the index to
     * @param sitemapCount
     *            number of sitemaps that were generated
     * @throws IOException
     *             if an IO error occurs
     */
    public abstract void writeIndex(PrintStream output, int sitemapCount)
            throws IOException;
}
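The core of addURL() is the rotation rule: close the current file and start a new one when either the byte budget (entry plus trailing boilerplate) or the URL count would be exceeded. A minimal standalone sketch of just that decision, with hypothetical thresholds standing in for getMaxSize()/getMaxURLs():

```java
public class RotationSketch {
    // Hypothetical thresholds; real generators return these from
    // getMaxSize() and getMaxURLs().
    static final int MAX_BYTES = 50;
    static final int MAX_URLS = 2;

    // True when addURL() would rotate to a new file before writing
    // entryBytes more bytes (trailerBytes is reserved for the closing
    // boilerplate, exactly as trailingByteCount is in AbstractGenerator).
    static boolean needsNewFile(int bytesWritten, int urlsWritten,
                                int entryBytes, int trailerBytes) {
        return bytesWritten + entryBytes + trailerBytes > MAX_BYTES
                || urlsWritten + 1 > MAX_URLS;
    }

    public static void main(String[] args) {
        System.out.println(needsNewFile(10, 1, 20, 5)); // false: entry fits
        System.out.println(needsNewFile(10, 2, 20, 5)); // true: URL cap hit
        System.out.println(needsNewFile(40, 1, 20, 5)); // true: size cap hit
    }
}
```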
@@ -1,365 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.sitemap;

import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.UnsupportedEncodingException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLEncoder;
import java.sql.SQLException;
import java.util.Date;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.log4j.Logger;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.Item;
import org.dspace.content.ItemIterator;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.dspace.core.LogManager;

/**
 * Command-line utility for generating HTML and Sitemaps.org protocol Sitemaps.
 *
 * @author Robert Tansley
 * @author Stuart Lewis
 */
public class GenerateSitemaps
{
    /** Logger */
    private static Logger log = Logger.getLogger(GenerateSitemaps.class);

    public static void main(String[] args) throws Exception
    {
        final String usage = GenerateSitemaps.class.getCanonicalName();

        CommandLineParser parser = new PosixParser();
        HelpFormatter hf = new HelpFormatter();

        Options options = new Options();

        options.addOption("h", "help", false, "help");
        options.addOption("s", "no_sitemaps", false,
                "do not generate sitemaps.org protocol sitemap");
        options.addOption("b", "no_htmlmap", false,
                "do not generate a basic HTML sitemap");
        options.addOption("a", "ping_all", false,
                "ping configured search engines");
        options.addOption("p", "ping", true,
                "ping specified search engine URL");

        CommandLine line = null;

        try
        {
            line = parser.parse(options, args);
        }
        catch (ParseException pe)
        {
            hf.printHelp(usage, options);
            System.exit(1);
        }

        if (line.hasOption('h'))
        {
            hf.printHelp(usage, options);
            System.exit(0);
        }

        if (line.getArgs().length != 0)
        {
            hf.printHelp(usage, options);
            System.exit(1);
        }

        /*
         * Sanity check -- if there is no sitemap generation or pinging to do,
         * print usage. (Both map types suppressed and no ping requested.)
         */
        if (line.hasOption('b') && line.hasOption('s')
                && !line.hasOption('a') && !line.hasOption('p'))
        {
            System.err
                    .println("Nothing to do (no sitemap to generate, no search engines to ping)");
            hf.printHelp(usage, options);
            System.exit(1);
        }

        // Note the negation (CLI options indicate NOT to generate a sitemap)
        if (!line.hasOption('b') || !line.hasOption('s'))
        {
            generateSitemaps(!line.hasOption('b'), !line.hasOption('s'));
        }

        if (line.hasOption('a'))
        {
            pingConfiguredSearchEngines();
        }

        if (line.hasOption('p'))
        {
            try
            {
                pingSearchEngine(line.getOptionValue('p'));
            }
            catch (MalformedURLException me)
            {
                System.err
                        .println("Bad search engine URL (include all except sitemap URL)");
                System.exit(1);
            }
        }

        System.exit(0);
    }

    /**
     * Generate sitemaps.org protocol and/or basic HTML sitemaps.
     *
     * @param makeHTMLMap
     *            if {@code true}, generate an HTML sitemap.
     * @param makeSitemapOrg
     *            if {@code true}, generate a sitemaps.org sitemap.
     * @throws SQLException
     *             if a database error occurs.
     * @throws IOException
     *             if an IO error occurs.
     */
    public static void generateSitemaps(boolean makeHTMLMap,
            boolean makeSitemapOrg) throws SQLException, IOException
    {
        String sitemapStem = ConfigurationManager.getProperty("dspace.url")
                + "/sitemap";
        String htmlMapStem = ConfigurationManager.getProperty("dspace.url")
                + "/htmlmap";
        String handleURLStem = ConfigurationManager.getProperty("dspace.url")
                + "/handle/";

        File outputDir = new File(ConfigurationManager.getProperty("sitemap.dir"));
        if (!outputDir.exists() && !outputDir.mkdir())
        {
            log.error("Unable to create output directory");
        }

        AbstractGenerator html = null;
        AbstractGenerator sitemapsOrg = null;

        if (makeHTMLMap)
        {
            html = new HTMLSitemapGenerator(outputDir, htmlMapStem + "?map=",
                    null);
        }

        if (makeSitemapOrg)
        {
            sitemapsOrg = new SitemapsOrgGenerator(outputDir, sitemapStem
                    + "?map=", null);
        }

        Context c = new Context();

        Community[] comms = Community.findAll(c);

        for (int i = 0; i < comms.length; i++)
        {
            String url = handleURLStem + comms[i].getHandle();

            if (makeHTMLMap)
            {
                html.addURL(url, null);
            }
            if (makeSitemapOrg)
            {
                sitemapsOrg.addURL(url, null);
            }
        }

        Collection[] colls = Collection.findAll(c);

        for (int i = 0; i < colls.length; i++)
        {
            String url = handleURLStem + colls[i].getHandle();

            if (makeHTMLMap)
            {
                html.addURL(url, null);
            }
            if (makeSitemapOrg)
            {
                sitemapsOrg.addURL(url, null);
            }
        }

        ItemIterator allItems = Item.findAll(c);
        try
        {
            int itemCount = 0;

            while (allItems.hasNext())
            {
                Item i = allItems.next();
                String url = handleURLStem + i.getHandle();
                Date lastMod = i.getLastModified();

                if (makeHTMLMap)
                {
                    html.addURL(url, lastMod);
                }
                if (makeSitemapOrg)
                {
                    sitemapsOrg.addURL(url, lastMod);
                }
                i.decache();

                itemCount++;
            }

            if (makeHTMLMap)
            {
                int files = html.finish();
                log.info(LogManager.getHeader(c, "write_sitemap",
                        "type=html,num_files=" + files + ",communities="
                                + comms.length + ",collections=" + colls.length
                                + ",items=" + itemCount));
            }

            if (makeSitemapOrg)
            {
                int files = sitemapsOrg.finish();
                log.info(LogManager.getHeader(c, "write_sitemap",
                        "type=sitemaps.org,num_files=" + files + ",communities="
                                + comms.length + ",collections=" + colls.length
                                + ",items=" + itemCount));
            }
        }
        finally
        {
            if (allItems != null)
            {
                allItems.close();
            }
        }

        c.abort();
    }

    /**
     * Ping all search engines configured in {@code dspace.cfg}.
     *
     * @throws UnsupportedEncodingException
     *             theoretically should never happen
     */
    public static void pingConfiguredSearchEngines()
            throws UnsupportedEncodingException
    {
        String engineURLProp = ConfigurationManager
                .getProperty("sitemap.engineurls");
        String[] engineURLs = null;

        if (engineURLProp != null)
        {
            engineURLs = engineURLProp.trim().split("\\s*,\\s*");
        }

        if (engineURLProp == null || engineURLs == null
                || engineURLs.length == 0 || engineURLs[0].trim().equals(""))
        {
            log.warn("No search engine URLs configured to ping");
            return;
        }

        for (int i = 0; i < engineURLs.length; i++)
        {
            try
            {
                pingSearchEngine(engineURLs[i]);
            }
            catch (MalformedURLException me)
            {
                log.warn("Bad search engine URL in configuration: "
                        + engineURLs[i]);
            }
        }
    }

    /**
     * Ping the given search engine.
     *
     * @param engineURL
     *            Search engine URL minus protocol etc., e.g.
     *            {@code www.google.com}
     * @throws MalformedURLException
     *             if the passed-in URL is malformed
     * @throws UnsupportedEncodingException
     *             theoretically should never happen
     */
    public static void pingSearchEngine(String engineURL)
            throws MalformedURLException, UnsupportedEncodingException
    {
        // Set up HTTP proxy
        if ((ConfigurationManager.getProperty("http.proxy.host") != null)
                && (ConfigurationManager.getProperty("http.proxy.port") != null))
        {
            System.setProperty("proxySet", "true");
            System.setProperty("proxyHost", ConfigurationManager
                    .getProperty("http.proxy.host"));
            // Note: the original called System.getProperty here, which is a
            // no-op; setProperty is what actually configures the proxy port.
            System.setProperty("proxyPort", ConfigurationManager
                    .getProperty("http.proxy.port"));
        }

        String sitemapURL = ConfigurationManager.getProperty("dspace.url")
                + "/sitemap";

        URL url = new URL(engineURL + URLEncoder.encode(sitemapURL, "UTF-8"));

        try
        {
            HttpURLConnection connection = (HttpURLConnection) url
                    .openConnection();

            BufferedReader in = new BufferedReader(new InputStreamReader(
                    connection.getInputStream()));

            String inputLine;
            StringBuffer resp = new StringBuffer();
            while ((inputLine = in.readLine()) != null)
            {
                resp.append(inputLine).append("\n");
            }
            in.close();

            if (connection.getResponseCode() == 200)
            {
                log.info("Pinged " + url.toString() + " successfully");
            }
            else
            {
                log.warn("Error response pinging " + url.toString() + ":\n"
                        + resp);
            }
        }
        catch (IOException e)
        {
            log.warn("Error pinging " + url.toString(), e);
        }
    }
}
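pingSearchEngine() builds its request by appending the URL-encoded sitemap location to the engine's ping endpoint, so the endpoint must already end with its query parameter. A standalone sketch of that construction (the class name and example endpoint are illustrative; this uses the Charset overload of URLEncoder, Java 10+):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class PingUrlSketch {
    // Mirrors the URL construction in pingSearchEngine(): the sitemap
    // URL is percent-encoded and appended verbatim to the engine URL.
    static String pingUrl(String engineURL, String sitemapURL) {
        return engineURL + URLEncoder.encode(sitemapURL, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(pingUrl(
                "http://www.google.com/ping?sitemap=",
                "http://dspace.myu.edu/sitemap"));
        // http://www.google.com/ping?sitemap=http%3A%2F%2Fdspace.myu.edu%2Fsitemap
    }
}
```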
@@ -1,114 +0,0 @@
|
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.app.sitemap;
|
||||
|
||||
import java.io.File;
|
||||
import java.io.IOException;
|
||||
import java.io.PrintStream;
|
||||
import java.util.Date;
|
||||
|
||||
/**
|
||||
* Class for generating HTML "sitemaps" which contain links to various pages in
|
||||
 * a DSpace site. This should improve search engine coverage of the DSpace site
 * and limit the server load caused by crawlers.
 *
 * @author Robert Tansley
 * @author Stuart Lewis
 */
public class HTMLSitemapGenerator extends AbstractGenerator
{
    /** Stem of URLs sitemaps will eventually appear at */
    private String indexURLStem;

    /** Tail of URLs sitemaps will eventually appear at */
    private String indexURLTail;

    /**
     * Construct an HTML sitemap generator, writing files to the given
     * directory, with the sitemaps eventually exposed at URLs starting with
     * the given stem and ending with the given tail.
     *
     * @param outputDirIn
     *            Directory to write sitemap files to
     * @param urlStem
     *            start of URL that sitemap files will appear at, e.g.
     *            {@code http://dspace.myu.edu/sitemap?sitemap=}
     * @param urlTail
     *            end of URL that sitemap files will appear at, e.g.
     *            {@code .html} or {@code null}
     */
    public HTMLSitemapGenerator(File outputDirIn, String urlStem, String urlTail)
    {
        super(outputDirIn);

        indexURLStem = urlStem;
        indexURLTail = (urlTail == null ? "" : urlTail);
    }

    public String getFilename(int number)
    {
        return "sitemap" + number + ".html";
    }

    public String getLeadingBoilerPlate()
    {
        return "<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01//EN\" \"http://www.w3.org/TR/html4/strict.dtd\">\n"
                + "<html><head><title>URL List</title></head><body><ul>";
    }

    public int getMaxSize()
    {
        // 50k
        return 51200;
    }

    public int getMaxURLs()
    {
        return 1000;
    }

    public String getTrailingBoilerPlate()
    {
        return "</ul></body></html>\n";
    }

    public String getURLText(String url, Date lastMod)
    {
        StringBuffer urlText = new StringBuffer();

        urlText.append("<li><a href=\"").append(url).append("\">").append(url)
                .append("</a></li>\n");

        return urlText.toString();
    }

    public boolean useCompression()
    {
        return false;
    }

    public String getIndexFilename()
    {
        return "sitemap_index.html";
    }

    public void writeIndex(PrintStream output, int sitemapCount)
            throws IOException
    {
        output.println(getLeadingBoilerPlate());

        for (int i = 0; i < sitemapCount; i++)
        {
            output.print("<li><a href=\"" + indexURLStem + i + indexURLTail
                    + "\">sitemap " + i);
            output.print("</a></li>\n");
        }

        output.println(getTrailingBoilerPlate());
    }
}
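For context on the deleted class above, a minimal sketch of the per-URL output that `getURLText` builds (the URL below is hypothetical, and the snippet replicates the string assembly rather than calling the class, which depends on `AbstractGenerator`):

```java
public class HtmlSitemapEntryDemo {
    public static void main(String[] args) {
        // Mirrors HTMLSitemapGenerator.getURLText(): each URL becomes one <li> entry.
        String url = "http://dspace.myu.edu/handle/123456789/1"; // hypothetical URL
        StringBuilder urlText = new StringBuilder();
        urlText.append("<li><a href=\"").append(url).append("\">").append(url)
                .append("</a></li>");
        System.out.println(urlText);
        // → <li><a href="http://dspace.myu.edu/handle/123456789/1">http://dspace.myu.edu/handle/123456789/1</a></li>
    }
}
```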
@@ -1,129 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.sitemap;

import java.io.File;
import java.io.IOException;
import java.io.PrintStream;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;

/**
 * Class for generating <a href="http://sitemaps.org/">Sitemaps</a> to improve
 * search engine coverage of the DSpace site and limit the server load caused by
 * crawlers.
 *
 * @author Robert Tansley
 * @author Stuart Lewis
 */
public class SitemapsOrgGenerator extends AbstractGenerator
{
    /** Stem of URLs sitemaps will eventually appear at */
    private String indexURLStem;

    /** Tail of URLs sitemaps will eventually appear at */
    private String indexURLTail;

    /** The correct date format */
    private DateFormat w3dtfFormat = new SimpleDateFormat(
            "yyyy-MM-dd'T'HH:mm:ss'Z'");

    /**
     * Construct a sitemaps.org protocol sitemap generator, writing files to
     * the given directory, with the sitemaps eventually exposed at URLs
     * starting with the given stem and ending with the given tail.
     *
     * @param outputDirIn
     *            Directory to write sitemap files to
     * @param urlStem
     *            start of URL that sitemap files will appear at, e.g.
     *            {@code http://dspace.myu.edu/sitemap?sitemap=}
     * @param urlTail
     *            end of URL that sitemap files will appear at, e.g.
     *            {@code .html} or {@code null}
     */
    public SitemapsOrgGenerator(File outputDirIn, String urlStem, String urlTail)
    {
        super(outputDirIn);

        indexURLStem = urlStem;
        indexURLTail = (urlTail == null ? "" : urlTail);
    }

    public String getFilename(int number)
    {
        return "sitemap" + number + ".xml.gz";
    }

    public String getLeadingBoilerPlate()
    {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
                + "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">";
    }

    public int getMaxSize()
    {
        // 10 MB
        return 10485760;
    }

    public int getMaxURLs()
    {
        return 50000;
    }

    public String getTrailingBoilerPlate()
    {
        return "</urlset>";
    }

    public String getURLText(String url, Date lastMod)
    {
        StringBuffer urlText = new StringBuffer();

        urlText.append("<url><loc>").append(url).append("</loc>");
        if (lastMod != null)
        {
            urlText.append("<lastmod>").append(w3dtfFormat.format(lastMod))
                    .append("</lastmod>");
        }
        urlText.append("</url>\n");

        return urlText.toString();
    }

    public boolean useCompression()
    {
        return true;
    }

    public String getIndexFilename()
    {
        return "sitemap_index.xml.gz";
    }

    public void writeIndex(PrintStream output, int sitemapCount)
            throws IOException
    {
        String now = w3dtfFormat.format(new Date());

        output.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
        output.println("<sitemapindex xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">");

        for (int i = 0; i < sitemapCount; i++)
        {
            output.print("<sitemap><loc>" + indexURLStem + i + indexURLTail
                    + "</loc>");
            output.print("<lastmod>" + now + "</lastmod></sitemap>\n");
        }

        output.println("</sitemapindex>");
    }
}
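For reference, a minimal sketch of the sitemaps.org `<url>` element that `getURLText` in the deleted class above emits. The URL is hypothetical, and the formatter here is pinned to UTC so the output is predictable; note that the original class uses the JVM default time zone with a literal `'Z'` suffix:

```java
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class SitemapsOrgEntryDemo {
    public static void main(String[] args) {
        // Mirrors SitemapsOrgGenerator.getURLText(): one <url> element per page,
        // with <lastmod> in the W3C datetime format used above.
        DateFormat w3dtf = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");
        // Pinned to UTC here (an assumption for the demo); the original
        // formatter uses the JVM default time zone.
        w3dtf.setTimeZone(TimeZone.getTimeZone("UTC"));

        String url = "http://dspace.myu.edu/handle/123456789/1"; // hypothetical URL
        Date lastMod = new Date(0L); // the Unix epoch: 1970-01-01 00:00:00 UTC
        StringBuilder urlText = new StringBuilder();
        urlText.append("<url><loc>").append(url).append("</loc>");
        urlText.append("<lastmod>").append(w3dtf.format(lastMod)).append("</lastmod>");
        urlText.append("</url>");
        System.out.println(urlText);
        // the <lastmod> value printed is 1970-01-01T00:00:00Z
    }
}
```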
@@ -1,392 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.statistics;

import java.io.File;
import java.io.FileInputStream;
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.Properties;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;

import org.dspace.core.Context;
import org.dspace.core.ConfigurationManager;

/**
 * This class allows the running of the DSpace statistic tools
 *
 * Usage: java CreateStatReport -r <statistic to run>
 * Available: <stat-initial> <stat-general> <stat-monthly> <stat-report-initial>
 * <stat-report-general> <stat-report-monthly>
 *
 * @author Chris Yates
 */
public class CreateStatReport {

    /** Current date and time */
    private static Calendar calendar = null;

    /** Reporting start date and time */
    private static Calendar reportStartDate = null;

    /** Path of log directory */
    private static String outputLogDirectory = null;

    /** Path of reporting directory */
    private static String outputReportDirectory = null;

    /** File suffix for log files */
    private static String outputSuffix = ".dat";

    /** User context */
    private static Context context;

    /** The config file from which to configure the analyser */
    private static String configFile = ConfigurationManager.getProperty("dspace.dir") +
                            File.separator + "config" + File.separator +
                            "dstat.cfg";

    /*
     * Main method to be run from the command line; executes the individual
     * statistic methods.
     *
     * Usage: java CreateStatReport -r <statistic to run>
     */
    public static void main(String[] argv) throws Exception {

        // Open the statistics config file
        FileInputStream fis = new java.io.FileInputStream(new File(configFile));
        Properties config = new Properties();
        config.load(fis);
        int startMonth = 0;
        int startYear = 2005;
        try
        {
            startYear = Integer.parseInt(config.getProperty("start.year", "2005").trim());
        } catch (NumberFormatException nfe)
        {
            System.err.println("start.year is incorrectly set in dstat.cfg. Must be a number (e.g. 2005).");
            System.exit(0);
        }
        try
        {
            startMonth = Integer.parseInt(config.getProperty("start.month", "1").trim());
        } catch (NumberFormatException nfe)
        {
            System.err.println("start.month is incorrectly set in dstat.cfg. Must be a number between 1 and 12.");
            System.exit(0);
        }
        reportStartDate = new GregorianCalendar(startYear, startMonth - 1, 1);
        calendar = new GregorianCalendar();

        // create context as super user
        context = new Context();
        context.setIgnoreAuthorization(true);

        // get paths to directories
        outputLogDirectory = ConfigurationManager.getProperty("log.dir") + File.separator;
        outputReportDirectory = ConfigurationManager.getProperty("report.dir") + File.separator;

        // read in command line variable to determine which statistic to run
        CommandLineParser parser = new PosixParser();
        Options options = new Options();
        options.addOption("r", "report", true, "report");
        CommandLine line = parser.parse(options, argv);

        String statAction = null;

        if (line.hasOption('r'))
        {
            statAction = line.getOptionValue('r');
        }

        if (statAction == null) {
            usage();
            System.exit(0);
        }

        // call the appropriate statistics method
        if (statAction.equals("stat-monthly")) {
            statMonthly();
        }

        if (statAction.equals("stat-general")) {
            statGeneral();
        }

        if (statAction.equals("stat-initial")) {
            statInitial();
        }

        if (statAction.equals("stat-report-general")) {
            statReportGeneral();
        }

        if (statAction.equals("stat-report-initial")) {
            statReportInitial();
        }

        if (statAction.equals("stat-report-monthly")) {
            statReportMonthly();
        }
    }

    /**
     * This method generates a report from the first of the current month to
     * the end of the current month.
     *
     * @throws Exception
     */
    private static void statMonthly() throws Exception {

        // Output prefix
        String outputPrefix = "dspace-log-monthly-";

        // set up our command line variables
        String myLogDir = null;
        String myFileTemplate = null;
        String myConfigFile = null;
        StringBuffer myOutFile = null;
        Date myStartDate = null;
        Date myEndDate = null;
        boolean myLookUp = false;

        Calendar start = new GregorianCalendar(calendar.get(Calendar.YEAR),
                calendar.get(Calendar.MONTH),
                calendar.getActualMinimum(Calendar.DAY_OF_MONTH));
        myStartDate = start.getTime();

        Calendar end = new GregorianCalendar(calendar.get(Calendar.YEAR),
                calendar.get(Calendar.MONTH),
                calendar.getActualMaximum(Calendar.DAY_OF_MONTH));
        myEndDate = end.getTime();

        myOutFile = new StringBuffer(outputLogDirectory);
        myOutFile.append(outputPrefix);
        myOutFile.append(calendar.get(Calendar.YEAR));
        myOutFile.append("-");
        myOutFile.append(calendar.get(Calendar.MONTH) + 1);
        myOutFile.append(outputSuffix);

        LogAnalyser.processLogs(context, myLogDir, myFileTemplate, myConfigFile, myOutFile.toString(), myStartDate, myEndDate, myLookUp);
    }

    /**
     * This method generates a full report based on the full log period
     *
     * @throws Exception
     */
    private static void statGeneral() throws Exception {

        // Output prefix
        String outputPrefix = "dspace-log-general-";

        // set up our command line variables
        String myLogDir = null;
        String myFileTemplate = null;
        String myConfigFile = null;
        StringBuffer myOutFile = null;
        Date myStartDate = null;
        Date myEndDate = null;
        boolean myLookUp = false;

        myOutFile = new StringBuffer(outputLogDirectory);
        myOutFile.append(outputPrefix);
        myOutFile.append(calendar.get(Calendar.YEAR));
        myOutFile.append("-");
        myOutFile.append(calendar.get(Calendar.MONTH) + 1);
        myOutFile.append("-");
        myOutFile.append(calendar.get(Calendar.DAY_OF_MONTH));
        myOutFile.append(outputSuffix);

        LogAnalyser.processLogs(context, myLogDir, myFileTemplate, myConfigFile, myOutFile.toString(), myStartDate, myEndDate, myLookUp);
    }

    /**
     * This method starts from the year and month configured in dstat.cfg and
     * loops over each month up to the current month, generating monthly
     * aggregation files for the DStat system.
     *
     * @throws Exception
     */
    private static void statInitial() throws Exception {

        // Output prefix
        String outputPrefix = "dspace-log-monthly-";

        // set up our command line variables
        String myLogDir = null;
        String myFileTemplate = null;
        String myConfigFile = null;
        StringBuffer myOutFile = null;
        Date myStartDate = null;
        Date myEndDate = null;
        boolean myLookUp = false;

        Calendar reportEndDate = new GregorianCalendar(calendar.get(Calendar.YEAR),
                calendar.get(Calendar.MONTH),
                calendar.getActualMaximum(Calendar.DAY_OF_MONTH));

        Calendar currentMonth = (Calendar) reportStartDate.clone();
        while (currentMonth.before(reportEndDate)) {

            Calendar start = new GregorianCalendar(currentMonth.get(Calendar.YEAR),
                    currentMonth.get(Calendar.MONTH),
                    currentMonth.getActualMinimum(Calendar.DAY_OF_MONTH));
            myStartDate = start.getTime();

            Calendar end = new GregorianCalendar(currentMonth.get(Calendar.YEAR),
                    currentMonth.get(Calendar.MONTH),
                    currentMonth.getActualMaximum(Calendar.DAY_OF_MONTH));
            myEndDate = end.getTime();

            myOutFile = new StringBuffer(outputLogDirectory);
            myOutFile.append(outputPrefix);
            myOutFile.append(currentMonth.get(Calendar.YEAR));
            myOutFile.append("-");
            myOutFile.append(currentMonth.get(Calendar.MONTH) + 1);
            myOutFile.append(outputSuffix);

            LogAnalyser.processLogs(context, myLogDir, myFileTemplate, myConfigFile, myOutFile.toString(), myStartDate, myEndDate, myLookUp);

            currentMonth.add(Calendar.MONTH, 1);
        }
    }

    /**
     * This method generates a full report based on the full log period
     *
     * @throws Exception
     */
    private static void statReportGeneral() throws Exception {

        // Prefixes
        String inputPrefix = "dspace-log-general-";
        String outputPrefix = "report-general-";

        String myFormat = "html";
        StringBuffer myInput = null;
        StringBuffer myOutput = null;
        String myMap = null;

        myInput = new StringBuffer(outputLogDirectory);
        myInput.append(inputPrefix);
        myInput.append(calendar.get(Calendar.YEAR));
        myInput.append("-");
        myInput.append(calendar.get(Calendar.MONTH) + 1);
        myInput.append("-");
        myInput.append(calendar.get(Calendar.DAY_OF_MONTH));
        myInput.append(outputSuffix);

        myOutput = new StringBuffer(outputReportDirectory);
        myOutput.append(outputPrefix);
        myOutput.append(calendar.get(Calendar.YEAR));
        myOutput.append("-");
        myOutput.append(calendar.get(Calendar.MONTH) + 1);
        myOutput.append("-");
        myOutput.append(calendar.get(Calendar.DAY_OF_MONTH));
        myOutput.append(".");
        myOutput.append(myFormat);

        ReportGenerator.processReport(context, myFormat, myInput.toString(), myOutput.toString(), myMap);
    }

    /**
     * This method starts from the year and month configured in dstat.cfg and
     * loops over each month up to the current month, generating monthly
     * reports from the DStat aggregation files.
     *
     * @throws Exception
     */
    private static void statReportInitial() throws Exception {

        // Prefixes
        String inputPrefix = "dspace-log-monthly-";
        String outputPrefix = "report-";

        String myFormat = "html";
        StringBuffer myInput = null;
        StringBuffer myOutput = null;
        String myMap = null;

        Calendar reportEndDate = new GregorianCalendar(calendar.get(Calendar.YEAR),
                calendar.get(Calendar.MONTH),
                calendar.getActualMaximum(Calendar.DAY_OF_MONTH));

        Calendar currentMonth = (Calendar) reportStartDate.clone();

        while (currentMonth.before(reportEndDate)) {

            myInput = new StringBuffer(outputLogDirectory);
            myInput.append(inputPrefix);
            myInput.append(currentMonth.get(Calendar.YEAR));
            myInput.append("-");
            myInput.append(currentMonth.get(Calendar.MONTH) + 1);
            myInput.append(outputSuffix);

            myOutput = new StringBuffer(outputReportDirectory);
            myOutput.append(outputPrefix);
            myOutput.append(currentMonth.get(Calendar.YEAR));
            myOutput.append("-");
            myOutput.append(currentMonth.get(Calendar.MONTH) + 1);
            myOutput.append(".");
            myOutput.append(myFormat);

            ReportGenerator.processReport(context, myFormat, myInput.toString(), myOutput.toString(), myMap);

            currentMonth.add(Calendar.MONTH, 1);
        }
    }

    /**
     * This method generates a report from the aggregation files which have
     * been run for the most recent month.
     *
     * @throws Exception
     */
    private static void statReportMonthly() throws Exception
    {
        // Prefixes
        String inputPrefix = "dspace-log-monthly-";
        String outputPrefix = "report-";

        String myFormat = "html";
        StringBuffer myInput = null;
        StringBuffer myOutput = null;
        String myMap = null;

        myInput = new StringBuffer(outputLogDirectory);
        myInput.append(inputPrefix);
        myInput.append(calendar.get(Calendar.YEAR));
        myInput.append("-");
        myInput.append(calendar.get(Calendar.MONTH) + 1);
        myInput.append(outputSuffix);

        myOutput = new StringBuffer(outputReportDirectory);
        myOutput.append(outputPrefix);
        myOutput.append(calendar.get(Calendar.YEAR));
        myOutput.append("-");
        myOutput.append(calendar.get(Calendar.MONTH) + 1);
        myOutput.append(".");
        myOutput.append(myFormat);

        ReportGenerator.processReport(context, myFormat, myInput.toString(), myOutput.toString(), myMap);
    }

    /*
     * Output the usage information
     */
    private static void usage() throws Exception {

        System.out.println("Usage: java CreateStatReport -r <statistic to run>");
        System.out.println("Available: <stat-initial> <stat-general> <stat-monthly> <stat-report-initial> <stat-report-general> <stat-report-monthly>");
        return;
    }
}
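The month-window computation used by `statMonthly()` and `statInitial()` above (first and last day of a month via `getActualMinimum`/`getActualMaximum`) can be sketched in isolation; February 2005 is used here as a fixed example, not a value from the original code:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class MonthWindowDemo {
    public static void main(String[] args) {
        // Mirrors how statMonthly() derives the first and last day of the
        // reporting month; February 2005 is an arbitrary example month.
        Calendar month = new GregorianCalendar(2005, Calendar.FEBRUARY, 15);

        Calendar start = new GregorianCalendar(month.get(Calendar.YEAR),
                month.get(Calendar.MONTH),
                month.getActualMinimum(Calendar.DAY_OF_MONTH));
        Calendar end = new GregorianCalendar(month.get(Calendar.YEAR),
                month.get(Calendar.MONTH),
                month.getActualMaximum(Calendar.DAY_OF_MONTH));

        System.out.println(start.get(Calendar.DAY_OF_MONTH) + ".."
                + end.get(Calendar.DAY_OF_MONTH));
        // → 1..28 (2005 is not a leap year)
    }
}
```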
@@ -1,139 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.statistics;

import org.dspace.app.statistics.Statistics;

import java.util.Date;

/**
 * An interface to a generic report generating class, providing the
 * polymorphism necessary to allow the report generator to generate any number
 * of different formats of report.
 *
 * Note: This used to be an abstract class, but has been made an interface as
 * there wasn't any logic contained within it. It has also been made public, so
 * that you can create a Report type without monkeying about in the statistics
 * package.
 *
 * @author Richard Jones
 */
public interface Report
{
    /**
     * output any top headers that this page needs
     *
     * @return a string containing the header for the report
     */
    public abstract String header();

    /**
     * output any top headers that this page needs
     *
     * @param title the title of the report, useful for email subjects or
     *              HTML headers
     *
     * @return a string containing the header for the report
     */
    public abstract String header(String title);

    /**
     * output the title in the relevant format. This requires that the title
     * has been set with setMainTitle()
     *
     * @return a string containing the title of the report
     */
    public abstract String mainTitle();

    /**
     * output the date range in the relevant format. This requires that the
     * date ranges have been set using setStartDate() and setEndDate()
     *
     * @return a string containing date range information
     */
    public abstract String dateRange();

    /**
     * output the section header in the relevant format
     *
     * @param title the title of the current section header
     *
     * @return a string containing the formatted section header
     */
    public abstract String sectionHeader(String title);

    /**
     * output the report block based on the passed statistics object
     *
     * @param content a statistics object to form the basis of the displayed
     *                stat block
     *
     * @return a string containing the formatted statistics block
     */
    public abstract String statBlock(Statistics content);

    /**
     * output the floor information in the relevant format
     *
     * @param floor the floor value for the statistics block
     *
     * @return a string containing the formatted floor information
     */
    public abstract String floorInfo(int floor);

    /**
     * output the explanation of the stat block in the relevant format
     *
     * @param explanation the explanatory or clarification text for the stats
     *
     * @return a string containing the formatted explanation
     */
    public abstract String blockExplanation(String explanation);

    /**
     * output the final footers for this file
     *
     * @return a string containing the report footer
     */
    public abstract String footer();

    /**
     * set the main title for the report
     *
     * @param name the name of the service
     * @param serverName the name of the server
     */
    public abstract void setMainTitle(String name, String serverName);

    /**
     * add a statistics block to the report's class register
     *
     * @param stat the statistics object to be added to the report
     */
    public abstract void addBlock(Statistics stat);

    /**
     * render the report
     *
     * @return a string containing the full content of the report
     */
    public abstract String render();

    /**
     * set the starting date for the report
     *
     * @param start the start date for the report
     */
    public abstract void setStartDate(Date start);

    /**
     * set the end date for the report
     *
     * @param end the end date for the report
     */
    public abstract void setEndDate(Date end);
}
@@ -1,36 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.statistics;

import java.text.NumberFormat;

/**
 * This class provides a number of tools that may be useful to the methods
 * which generate the different types of report
 *
 * @author Richard Jones
 */
public class ReportTools
{
    /**
     * Take the given integer and produce a string to be used in the display
     * of the report. Basically provides an interface to a standard
     * NumberFormat class, but without the hassle of instantiating and
     * localising it.
     *
     * @param number the integer to be formatted
     *
     * @return a string containing the formatted number
     */
    public static String numberFormat(int number)
    {
        NumberFormat nf = NumberFormat.getIntegerInstance();
        return nf.format((double) number);
    }
}
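A minimal sketch of what `ReportTools.numberFormat` above delegates to. The locale is pinned to `Locale.US` here purely so the grouping separator is predictable; the original uses the default locale:

```java
import java.text.NumberFormat;
import java.util.Locale;

public class NumberFormatDemo {
    public static void main(String[] args) {
        // Mirrors ReportTools.numberFormat(): an integer NumberFormat adds
        // locale-appropriate grouping separators to the number.
        NumberFormat nf = NumberFormat.getIntegerInstance(Locale.US);
        System.out.println(nf.format((double) 1234567));
        // → 1,234,567
    }
}
```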
@@ -1,387 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.statistics;

import org.apache.commons.lang.time.DateUtils;
import org.dspace.core.ConfigurationManager;

import java.util.*;
import java.util.regex.Pattern;
import java.util.regex.Matcher;
import java.io.File;
import java.io.FilenameFilter;
import java.text.SimpleDateFormat;
import java.text.ParseException;

/**
 * Helper class for loading the analysis / report files from the reports directory
 */
public class StatisticsLoader
{
    private static Map<String, StatsFile> monthlyAnalysis = null;
    private static Map<String, StatsFile> monthlyReports = null;

    private static StatsFile generalAnalysis = null;
    private static StatsFile generalReport = null;

    private static Date lastLoaded = null;
    private static int fileCount = 0;

    private static Pattern analysisMonthlyPattern;
    private static Pattern analysisGeneralPattern;
    private static Pattern reportMonthlyPattern;
    private static Pattern reportGeneralPattern;

    private static SimpleDateFormat monthlySDF;
    private static SimpleDateFormat generalSDF;

    // one-time initialisation of the regex patterns and formatters we will use
    static
    {
        analysisMonthlyPattern = Pattern.compile("dspace-log-monthly-([0-9][0-9][0-9][0-9]-[0-9]+)\\.dat");
        analysisGeneralPattern = Pattern.compile("dspace-log-general-([0-9]+-[0-9]+-[0-9]+)\\.dat");
        reportMonthlyPattern = Pattern.compile("report-([0-9][0-9][0-9][0-9]-[0-9]+)\\.html");
        reportGeneralPattern = Pattern.compile("report-general-([0-9]+-[0-9]+-[0-9]+)\\.html");

        monthlySDF = new SimpleDateFormat("yyyy'-'M");
        generalSDF = new SimpleDateFormat("yyyy'-'M'-'dd");
    }

    /**
     * Get an array of the dates of the monthly report files
     *
     * @return the report dates, most recent first
     */
    public static Date[] getMonthlyReportDates()
    {
        return sortDatesDescending(getDatesFromMap(monthlyReports));
    }

    /**
     * Get an array of the dates of the monthly analysis files
     *
     * @return the analysis dates, most recent first
     */
    public static Date[] getMonthlyAnalysisDates()
    {
        return sortDatesDescending(getDatesFromMap(monthlyAnalysis));
    }

    /**
     * Convert the formatted dates that are the keys of the map into a date array
     *
     * @param monthlyMap map from formatted date strings to stats files
     * @return the parsed dates
     */
    protected static Date[] getDatesFromMap(Map<String, StatsFile> monthlyMap)
    {
        Set<String> keys = monthlyMap.keySet();
        Date[] dates = new Date[keys.size()];
        int i = 0;
        for (String date : keys)
        {
            try
            {
                dates[i] = monthlySDF.parse(date);
            }
            catch (ParseException pe)
            {
                // unparseable key; leave this entry null
            }

            i++;
        }

        return dates;
    }

    /**
     * Sort the date array in descending (reverse chronological) order
     *
     * @param dates the dates to sort
     * @return the sorted array (null entries sort first)
     */
    protected static Date[] sortDatesDescending(Date[] dates)
    {
        Arrays.sort(dates, new Comparator<Date>() {
            public int compare(Date d1, Date d2)
            {
                if (d1 == null && d2 == null)
                {
                    return 0;
                }
                else if (d1 == null)
                {
                    return -1;
                }
                else if (d2 == null)
                {
                    return 1;
                }
                else if (d1.before(d2))
                {
                    return 1;
                }
                else if (d2.before(d1))
                {
                    return -1;
                }

                return 0;
            }
        });
        return dates;
    }

    /**
     * Get the analysis file for a given date
     *
     * @param date the formatted date (yyyy-M)
     * @return the analysis file, or null if there is none for that date
     */
    public static File getAnalysisFor(String date)
    {
        StatisticsLoader.syncFileList();
        StatsFile sf = (monthlyAnalysis == null ? null : monthlyAnalysis.get(date));
        return sf == null ? null : sf.file;
    }

    /**
     * Get the report file for a given date
     *
     * @param date the formatted date (yyyy-M)
     * @return the report file, or null if there is none for that date
     */
    public static File getReportFor(String date)
    {
        StatisticsLoader.syncFileList();
        StatsFile sf = (monthlyReports == null ? null : monthlyReports.get(date));
        return sf == null ? null : sf.file;
    }

    /**
     * Get the current general analysis file
     *
     * @return the general analysis file, or null if there is none
     */
    public static File getGeneralAnalysis()
    {
        StatisticsLoader.syncFileList();
        return generalAnalysis == null ? null : generalAnalysis.file;
    }

    /**
     * Get the current general report file
     *
     * @return the general report file, or null if there is none
     */
    public static File getGeneralReport()
    {
        StatisticsLoader.syncFileList();
        return generalReport == null ? null : generalReport.file;
    }

    /**
     * Synchronize the cached list of analysis / report files with the reports
     * directory.
     *
     * We synchronize if:
     *
     * 1) The number of files is different (i.e. files have been added or removed)
     * 2) We haven't cached anything yet
     * 3) The cache was last generated over an hour ago
     */
    private static void syncFileList()
    {
        // Get an array of all the analysis and report files present
        File[] fileList = StatisticsLoader.getAnalysisAndReportFileList();

        if (fileList != null && fileList.length != fileCount)
        {
            StatisticsLoader.loadFileList(fileList);
        }
        else if (lastLoaded == null)
        {
            StatisticsLoader.loadFileList(fileList);
        }
        else if (DateUtils.addHours(lastLoaded, 1).before(new Date()))
        {
            StatisticsLoader.loadFileList(fileList);
        }
    }

    /**
     * Generate the cached file list from the array of files
     *
     * @param fileList the analysis and report files to cache
     */
    private static synchronized void loadFileList(File[] fileList)
    {
        // If we haven't been passed an array of files, get one now
        if (fileList == null || fileList.length == 0)
        {
            fileList = StatisticsLoader.getAnalysisAndReportFileList();
        }

        // Create new maps for the monthly analysis / reports
        Map<String, StatsFile> newMonthlyAnalysis = new HashMap<String, StatsFile>();
        Map<String, StatsFile> newMonthlyReports = new HashMap<String, StatsFile>();

        StatsFile newGeneralAnalysis = null;
        StatsFile newGeneralReport = null;

        if (fileList != null)
        {
            for (File thisFile : fileList)
            {
                StatsFile statsFile = null;

                // See if it is a monthly analysis file
                statsFile = makeStatsFile(thisFile, analysisMonthlyPattern, monthlySDF);
                if (statsFile != null)
                {
                    // If it is, add it to the map
                    newMonthlyAnalysis.put(statsFile.dateStr, statsFile);
                }

                // If we haven't identified this file yet
                if (statsFile == null)
                {
                    // See if it is a monthly report file
                    statsFile = makeStatsFile(thisFile, reportMonthlyPattern, monthlySDF);
                    if (statsFile != null)
                    {
                        // If it is, add it to the map
                        newMonthlyReports.put(statsFile.dateStr, statsFile);
                    }
                }

                // If we haven't identified this file yet
                if (statsFile == null)
                {
                    // See if it is a general analysis file
                    statsFile = makeStatsFile(thisFile, analysisGeneralPattern, generalSDF);
                    if (statsFile != null)
                    {
                        // If it is, ensure that we are pointing to the most recent file
                        if (newGeneralAnalysis == null || statsFile.date.after(newGeneralAnalysis.date))
                        {
                            newGeneralAnalysis = statsFile;
                        }
                    }
                }

                // If we haven't identified this file yet
                if (statsFile == null)
                {
                    // See if it is a general report file
                    statsFile = makeStatsFile(thisFile, reportGeneralPattern, generalSDF);
                    if (statsFile != null)
                    {
                        // If it is, ensure that we are pointing to the most recent file
                        if (newGeneralReport == null || statsFile.date.after(newGeneralReport.date))
                        {
                            newGeneralReport = statsFile;
                        }
                    }
                }
            }
        }

        // Store the newly discovered values in the member cache
        monthlyAnalysis = newMonthlyAnalysis;
        monthlyReports = newMonthlyReports;
        generalAnalysis = newGeneralAnalysis;
        generalReport = newGeneralReport;
        lastLoaded = new Date();
    }

    /**
     * Generate a StatsFile entry for this file. The pattern and date formatter
     * are used to identify the file as a particular type and to extract the
     * relevant information. If the file is not matched by the pattern
     * provided, return null.
     *
     * @param thisFile the candidate file
     * @param thisPattern the filename pattern identifying this type of file
     * @param sdf the date format used to parse the date from the filename
     * @return a StatsFile, or null if the file does not match
     */
    private static StatsFile makeStatsFile(File thisFile, Pattern thisPattern, SimpleDateFormat sdf)
    {
        Matcher matcher = thisPattern.matcher(thisFile.getName());
        if (matcher.matches())
        {
            StatsFile sf = new StatsFile();
            sf.file = thisFile;
            sf.path = thisFile.getPath();
            sf.dateStr = matcher.group(1).trim();

            try
|
||||
{
|
||||
sf.date = sdf.parse(sf.dateStr);
|
||||
}
|
||||
catch (ParseException e)
|
||||
{
|
||||
|
||||
}
|
||||
|
||||
return sf;
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get an array of all the analysis and report files
|
||||
* @return
|
||||
*/
|
||||
private static File[] getAnalysisAndReportFileList()
|
||||
{
|
||||
File reportDir = new File(ConfigurationManager.getProperty("log.dir"));
|
||||
if (reportDir != null)
|
||||
{
|
||||
return reportDir.listFiles(new AnalysisAndReportFilter());
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Simple class for holding information about an analysis/report file
|
||||
*/
|
||||
private static class StatsFile
|
||||
{
|
||||
File file;
|
||||
String path;
|
||||
Date date;
|
||||
String dateStr;
|
||||
}
|
||||
|
||||
/**
|
||||
* Filter used to restrict files in the reports directory to just analysis or report types
|
||||
*/
|
||||
private static class AnalysisAndReportFilter implements FilenameFilter
|
||||
{
|
||||
public boolean accept(File dir, String name)
|
||||
{
|
||||
if (analysisMonthlyPattern.matcher(name).matches())
|
||||
{
|
||||
return true;
|
||||
}
|
||||
|
||||
if (analysisGeneralPattern.matcher(name).matches())
|
||||
{
|
||||
return true;
|
||||
}
|
||||
|
||||
if (reportMonthlyPattern.matcher(name).matches())
|
||||
{
|
||||
return true;
|
||||
}
|
||||
|
||||
if (reportGeneralPattern.matcher(name).matches())
|
||||
{
|
||||
return true;
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
}
|
||||
}
|
||||
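The makeStatsFile logic above pairs a filename regex with a SimpleDateFormat: the regex both recognizes the file type and captures the date portion of the name, which the formatter then parses. A minimal standalone sketch of that technique follows; the file-name pattern and date format here are illustrative assumptions, not DSpace's actual configuration.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StatsFileNameDemo {
    // Hypothetical naming scheme: "stats-monthly-2010-11.dat" etc.
    static final Pattern MONTHLY =
            Pattern.compile("^stats-monthly-([0-9]{4}-[0-9]{1,2})\\.dat$");
    static final SimpleDateFormat MONTHLY_SDF = new SimpleDateFormat("yyyy-M");

    /** Return the date encoded in the name, or null if it is not a monthly file. */
    static Date parseMonthly(String name) {
        Matcher m = MONTHLY.matcher(name);
        if (!m.matches()) {
            return null; // not this type of file
        }
        try {
            // group(1) holds the captured date portion of the filename
            return MONTHLY_SDF.parse(m.group(1).trim());
        } catch (ParseException e) {
            return null; // matched the pattern but the date is unparsable
        }
    }

    public static void main(String[] args) {
        System.out.println(parseMonthly("stats-monthly-2010-11.dat") != null); // true
        System.out.println(parseMonthly("readme.txt") != null);                // false
    }
}
```

Because `matches()` anchors against the whole name, the same filter logic can double as a `FilenameFilter`, which is exactly how AnalysisAndReportFilter reuses the patterns.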
@@ -1,81 +0,0 @@
<!--

    The contents of this file are subject to the license and copyright
    detailed in the LICENSE and NOTICE files at the root of the source
    tree and available online at

    http://www.dspace.org/license/

-->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<html>
<head>
<title>org.dspace.app.statistics package</title>
<!--
 * Copyright (c) 2002-2009, The DSpace Foundation. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions are
 * met:
 *
 * - Redistributions of source code must retain the above copyright
 * notice, this list of conditions and the following disclaimer.
 *
 * - Redistributions in binary form must reproduce the above copyright
 * notice, this list of conditions and the following disclaimer in the
 * documentation and/or other materials provided with the distribution.
 *
 * - Neither the name of the DSpace Foundation nor the names of its
 * contributors may be used to endorse or promote products derived from
 * this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
 * ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
 * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
 * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
 * HOLDERS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
 * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
 * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
 * OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
 * ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
 * TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
 * USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
 * DAMAGE.
-->
</head>

<body>
<p>
Defines usage event instrumentation points and provides implementations for
testing.
</p>

<p>
This package makes usage instrumentation (for statistics, or whatever else
you may fancy) pluggable, while avoiding any unnecessary assumptions about how
usage events may be transmitted, persisted, or processed.
</p>

<p>
At appropriate points in the processing of user actions, events may be
assembled and "fired". What happens when an event is fired is configurable
via the PluginManager. One must configure a plugin for the AbstractUsageEvent
class, defined in this package, to select an event processing implementation.
</p>

<p>
Three "stock" implementations are provided.
</p>
<dl>
<dt>{@link org.dspace.app.statistics.PassiveUsageEvent PassiveUsageEvent}</dt>
<dd>absorbs events without taking action, resulting in behavior identical
to that of DSpace before this package was added. This is the default
if no plugin is configured.</dd>
<dt>{@link org.dspace.app.statistics.UsageEventTabFileLogger UsageEventTabFileLogger}</dt>
<dd>writes event records to a file in Tab Separated Values format.</dd>
<dt>{@link org.dspace.app.statistics.UsageEventXMLLogger UsageEventXMLLogger}</dt>
<dd>writes event records to a file in an XML format. Suitable mainly for
testing.</dd>
</dl>
</body>
</html>
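The pluggable event model the package documentation describes (fire an event; a configured implementation decides what happens) can be sketched as a small interface with interchangeable sinks. All names below are illustrative stand-ins, not the real DSpace PluginManager/AbstractUsageEvent API.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical sink interface: one entry point for firing usage events. */
interface UsageEventSink {
    void fire(String objectId, String action);
}

/** Analogue of PassiveUsageEvent: absorbs events without taking action. */
class PassiveSink implements UsageEventSink {
    public void fire(String objectId, String action) { /* intentionally a no-op */ }
}

/** Analogue of a tab-file logger: records one tab-separated line per event. */
class ListSink implements UsageEventSink {
    final List<String> records = new ArrayList<String>();
    public void fire(String objectId, String action) {
        records.add(objectId + "\t" + action);
    }
}

public class UsageEventDemo {
    public static void main(String[] args) {
        // In DSpace the concrete implementation would be chosen via
        // PluginManager configuration; here we just pick one directly.
        ListSink sink = new ListSink();
        sink.fire("item:123", "VIEW");
        System.out.println(sink.records.size()); // 1
    }
}
```

The point of the indirection is the one made in the text: callers assemble and fire events without assuming how they are transmitted, persisted, or processed.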
@@ -1,622 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.sql.SQLException;

import org.dspace.authorize.AuthorizeConfiguration;
import org.dspace.authorize.AuthorizeException;
import org.dspace.authorize.AuthorizeManager;
import org.dspace.authorize.ResourcePolicy;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.Item;
import org.dspace.core.Constants;
import org.dspace.core.Context;

/**
 * This class supplements the AuthorizeManager, performing authorization
 * checks for actions other than the basic CRUD (ADD, WRITE, etc.) set.
 *
 * @author bollini
 *
 */
public class AuthorizeUtil
{

    /**
     * Is the current user allowed to manage (create, remove, edit) the
     * bitstream's policies in the current context?
     *
     * @param context
     *            the DSpace Context Object
     * @param bitstream
     *            the bitstream that the policies refer to
     * @throws AuthorizeException
     *             if the current context (current user) is not allowed to
     *             manage the bitstream's policies
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageBitstreamPolicy(Context context,
            Bitstream bitstream) throws AuthorizeException, SQLException
    {
        Bundle bundle = bitstream.getBundles()[0];
        authorizeManageBundlePolicy(context, bundle);
    }

    /**
     * Is the current user allowed to manage (create, remove, edit) the
     * bundle's policies in the current context?
     *
     * @param context
     *            the DSpace Context Object
     * @param bundle
     *            the bundle that the policies refer to
     * @throws AuthorizeException
     *             if the current context (current user) is not allowed to
     *             manage the bundle's policies
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageBundlePolicy(Context context,
            Bundle bundle) throws AuthorizeException, SQLException
    {
        Item item = bundle.getItems()[0];
        authorizeManageItemPolicy(context, item);
    }

    /**
     * Is the current user allowed to manage (create, remove, edit) the
     * item's policies in the current context?
     *
     * @param context
     *            the DSpace Context Object
     * @param item
     *            the item that the policies refer to
     * @throws AuthorizeException
     *             if the current context (current user) is not allowed to
     *             manage the item's policies
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageItemPolicy(Context context, Item item)
            throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canItemAdminManagePolicies())
        {
            AuthorizeManager.authorizeAction(context, item, Constants.ADMIN);
        }
        else if (AuthorizeConfiguration.canCollectionAdminManageItemPolicies())
        {
            AuthorizeManager.authorizeAction(context, item
                    .getOwningCollection(), Constants.ADMIN);
        }
        else if (AuthorizeConfiguration.canCommunityAdminManageItemPolicies())
        {
            AuthorizeManager
                    .authorizeAction(context, item.getOwningCollection()
                            .getCommunities()[0], Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage item policies");
        }
    }

    /**
     * Is the current user allowed to manage (create, remove, edit) the
     * collection's policies in the current context?
     *
     * @param context
     *            the DSpace Context Object
     * @param collection
     *            the collection that the policies refer to
     * @throws AuthorizeException
     *             if the current context (current user) is not allowed to
     *             manage the collection's policies
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageCollectionPolicy(Context context,
            Collection collection) throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canCollectionAdminManagePolicies())
        {
            AuthorizeManager.authorizeAction(context, collection,
                    Constants.ADMIN);
        }
        else if (AuthorizeConfiguration
                .canCommunityAdminManageCollectionPolicies())
        {
            AuthorizeManager.authorizeAction(context, collection
                    .getCommunities()[0], Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage collection policies");
        }
    }

    /**
     * Is the current user allowed to manage (create, remove, edit) the
     * community's policies in the current context?
     *
     * @param context
     *            the DSpace Context Object
     * @param community
     *            the community that the policies refer to
     * @throws AuthorizeException
     *             if the current context (current user) is not allowed to
     *             manage the community's policies
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageCommunityPolicy(Context context,
            Community community) throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canCommunityAdminManagePolicies())
        {
            AuthorizeManager.authorizeAction(context, community,
                    Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage community policies");
        }
    }

    /**
     * Throw an AuthorizeException if the current user is not a System Admin
     *
     * @param context
     *            the DSpace Context Object
     * @throws AuthorizeException
     *             if the current user is not a System Admin
     * @throws SQLException
     *             if a database error occurs
     */
    public static void requireAdminRole(Context context)
            throws AuthorizeException, SQLException
    {
        if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to perform this action");
        }
    }

    /**
     * Is the current user allowed to manage (add, remove, replace) the item's
     * CC License?
     *
     * @param context
     *            the DSpace Context Object
     * @param item
     *            the item that the CC License refers to
     * @throws AuthorizeException
     *             if the current user is not allowed to
     *             manage the item's CC License
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageCCLicense(Context context, Item item)
            throws AuthorizeException, SQLException
    {
        try
        {
            AuthorizeManager.authorizeAction(context, item, Constants.ADD);
            AuthorizeManager.authorizeAction(context, item, Constants.REMOVE);
        }
        catch (AuthorizeException authex)
        {
            if (AuthorizeConfiguration.canItemAdminManageCCLicense())
            {
                AuthorizeManager
                        .authorizeAction(context, item, Constants.ADMIN);
            }
            else if (AuthorizeConfiguration.canCollectionAdminManageCCLicense())
            {
                AuthorizeManager.authorizeAction(context, item
                        .getParentObject(), Constants.ADMIN);
            }
            else if (AuthorizeConfiguration.canCommunityAdminManageCCLicense())
            {
                AuthorizeManager.authorizeAction(context, item
                        .getParentObject().getParentObject(), Constants.ADMIN);
            }
            else
            {
                requireAdminRole(context);
            }
        }
    }

    /**
     * Is the current user allowed to manage (create, remove, edit) the
     * collection's template item?
     *
     * @param context
     *            the DSpace Context Object
     * @param collection
     *            the collection
     * @throws AuthorizeException
     *             if the current user is not allowed to manage the collection's
     *             template item
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageTemplateItem(Context context,
            Collection collection) throws AuthorizeException, SQLException
    {
        boolean isAuthorized = collection.canEditBoolean(false);

        if (!isAuthorized
                && AuthorizeConfiguration
                        .canCollectionAdminManageTemplateItem())
        {
            AuthorizeManager.authorizeAction(context, collection,
                    Constants.ADMIN);
        }
        else if (!isAuthorized
                && AuthorizeConfiguration
                        .canCommunityAdminManageCollectionTemplateItem())
        {
            Community[] communities = collection.getCommunities();
            Community parent = communities != null && communities.length > 0 ? communities[0]
                    : null;
            AuthorizeManager.authorizeAction(context, parent, Constants.ADMIN);
        }
        else if (!isAuthorized && !AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "You are not authorized to create a template item for the collection");
        }
    }

    /**
     * Can the current user manage (create, remove, edit) the submitters group of
     * the collection?
     *
     * @param context
     *            the DSpace Context Object
     * @param collection
     *            the collection
     * @throws AuthorizeException
     *             if the current user is not allowed to manage the collection's
     *             submitters group
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageSubmittersGroup(Context context,
            Collection collection) throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canCollectionAdminManageSubmitters())
        {
            AuthorizeManager.authorizeAction(context, collection,
                    Constants.ADMIN);
        }
        else if (AuthorizeConfiguration
                .canCommunityAdminManageCollectionSubmitters())
        {
            AuthorizeManager.authorizeAction(context, collection
                    .getCommunities()[0], Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage collection submitters");
        }
    }

    /**
     * Can the current user manage (create, remove, edit) the workflow groups of
     * the collection?
     *
     * @param context
     *            the DSpace Context Object
     * @param collection
     *            the collection
     * @throws AuthorizeException
     *             if the current user is not allowed to manage the collection's
     *             workflow groups
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageWorkflowsGroup(Context context,
            Collection collection) throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canCollectionAdminManageWorkflows())
        {
            AuthorizeManager.authorizeAction(context, collection,
                    Constants.ADMIN);
        }
        else if (AuthorizeConfiguration
                .canCommunityAdminManageCollectionWorkflows())
        {
            AuthorizeManager.authorizeAction(context, collection
                    .getCommunities()[0], Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage collection workflow");
        }
    }

    /**
     * Can the current user create/edit the admins group of the collection?
     * Please note that the remove action needs a separate check.
     *
     * @see #authorizeRemoveAdminGroup(Context, Collection)
     *
     * @param context
     *            the DSpace Context Object
     * @param collection
     *            the collection
     * @throws AuthorizeException
     *             if the current user is not allowed to create/edit the
     *             collection's admins group
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageAdminGroup(Context context,
            Collection collection) throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canCollectionAdminManageAdminGroup())
        {
            AuthorizeManager.authorizeAction(context, collection,
                    Constants.ADMIN);
        }
        else if (AuthorizeConfiguration
                .canCommunityAdminManageCollectionAdminGroup())
        {
            AuthorizeManager.authorizeAction(context, collection
                    .getCommunities()[0], Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage collection admin");
        }
    }

    /**
     * Can the current user remove the admins group of the collection?
     * Please note that the create/edit actions need a separate check.
     *
     * @see #authorizeManageAdminGroup(Context, Collection)
     *
     * @param context
     *            the DSpace Context Object
     * @param collection
     *            the collection
     * @throws AuthorizeException
     *             if the current user is not allowed to remove the
     *             collection's admins group
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeRemoveAdminGroup(Context context,
            Collection collection) throws AuthorizeException, SQLException
    {
        Community[] parentCommunities = collection.getCommunities();
        if (AuthorizeConfiguration
                .canCommunityAdminManageCollectionAdminGroup()
                && parentCommunities != null && parentCommunities.length > 0)
        {
            AuthorizeManager.authorizeAction(context, collection
                    .getCommunities()[0], Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin can remove the admin group of a collection");
        }
    }

    /**
     * Can the current user create/edit the admins group of the community?
     * Please note that the remove action needs a separate check.
     *
     * @see #authorizeRemoveAdminGroup(Context, Community)
     *
     * @param context
     *            the DSpace Context Object
     * @param community
     *            the community
     * @throws AuthorizeException
     *             if the current user is not allowed to create/edit the
     *             community's admins group
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManageAdminGroup(Context context,
            Community community) throws AuthorizeException, SQLException
    {
        if (AuthorizeConfiguration.canCommunityAdminManageAdminGroup())
        {
            AuthorizeManager.authorizeAction(context, community,
                    Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin are allowed to manage community admin");
        }
    }

    /**
     * Can the current user remove the admins group of the community?
     * Please note that the create/edit actions need a separate check.
     *
     * @see #authorizeManageAdminGroup(Context, Community)
     *
     * @param context
     *            the DSpace Context Object
     * @param community
     *            the community
     * @throws AuthorizeException
     *             if the current user is not allowed to remove the
     *             community's admins group
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeRemoveAdminGroup(Context context,
            Community community) throws SQLException, AuthorizeException
    {
        Community parentCommunity = community.getParentCommunity();
        if (AuthorizeConfiguration.canCommunityAdminManageAdminGroup()
                && parentCommunity != null)
        {
            AuthorizeManager.authorizeAction(context, parentCommunity,
                    Constants.ADMIN);
        }
        else if (!AuthorizeManager.isAdmin(context))
        {
            throw new AuthorizeException(
                    "Only system admin can remove the admin group of the community");
        }
    }

    /**
     * Can the current user remove or edit the supplied policy?
     *
     * @param c
     *            the DSpace Context Object
     * @param rp
     *            a resource policy
     * @throws AuthorizeException
     *             if the current context (current user) is not allowed to
     *             remove/edit the policy
     * @throws SQLException
     *             if a database error occurs
     */
    public static void authorizeManagePolicy(Context c, ResourcePolicy rp)
            throws SQLException, AuthorizeException
    {
        switch (rp.getResourceType())
        {
        case Constants.BITSTREAM:
            authorizeManageBitstreamPolicy(c, Bitstream.find(c, rp
                    .getResourceID()));
            break;
        case Constants.BUNDLE:
            authorizeManageBundlePolicy(c, Bundle.find(c, rp.getResourceID()));
            break;

        case Constants.ITEM:
            authorizeManageItemPolicy(c, Item.find(c, rp.getResourceID()));
            break;
        case Constants.COLLECTION:
            authorizeManageCollectionPolicy(c, Collection.find(c, rp
                    .getResourceID()));
            break;
        case Constants.COMMUNITY:
            authorizeManageCommunityPolicy(c, Community.find(c, rp
                    .getResourceID()));
            break;

        default:
            requireAdminRole(c);
            break;
        }
    }

    /**
     * Can the current user withdraw the item?
     *
     * @param context
     *            the DSpace Context Object
     * @param item
     *            the item
     * @throws SQLException
     *             if a database error occurs
     * @throws AuthorizeException
     *             if the current user is not allowed to perform the item
     *             withdrawal
     */
    public static void authorizeWithdrawItem(Context context, Item item)
            throws SQLException, AuthorizeException
    {
        boolean authorized = false;
        if (AuthorizeConfiguration.canCollectionAdminPerformItemWithdrawn())
        {
            authorized = AuthorizeManager.authorizeActionBoolean(context, item
                    .getOwningCollection(), Constants.ADMIN);
        }
        else if (AuthorizeConfiguration.canCommunityAdminPerformItemWithdrawn())
        {
            authorized = AuthorizeManager
                    .authorizeActionBoolean(context, item.getOwningCollection()
                            .getCommunities()[0], Constants.ADMIN);
        }

        if (!authorized)
        {
            authorized = AuthorizeManager.authorizeActionBoolean(context, item
                    .getOwningCollection(), Constants.REMOVE, false);
        }

        if (!authorized)
        {
            throw new AuthorizeException(
                    "To withdraw item must be COLLECTION_ADMIN or have REMOVE authorization on owning Collection");
        }
    }

    /**
     * Can the current user reinstate the item?
     *
     * @param context
     *            the DSpace Context Object
     * @param item
     *            the item
     * @throws SQLException
     *             if a database error occurs
     * @throws AuthorizeException
     *             if the current user is not allowed to perform the item
     *             reinstatement
     */
    public static void authorizeReinstateItem(Context context, Item item)
            throws SQLException, AuthorizeException
    {
        Collection[] colls = item.getCollections();

        for (int i = 0; i < colls.length; i++)
        {
            if (!AuthorizeConfiguration
                    .canCollectionAdminPerformItemReinstatiate())
            {
                if (AuthorizeConfiguration
                        .canCommunityAdminPerformItemReinstatiate()
                        && AuthorizeManager.authorizeActionBoolean(context,
                                colls[i].getCommunities()[0], Constants.ADMIN))
                {
                    // authorized
                }
                else
                {
                    AuthorizeManager.authorizeAction(context, colls[i],
                            Constants.ADD, false);
                }
            }
            else
            {
                AuthorizeManager.authorizeAction(context, colls[i],
                        Constants.ADD);
            }
        }
    }
}
@@ -1,697 +0,0 @@
|
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.app.util;
|
||||
|
||||
import java.io.File;
|
||||
import java.util.*;
|
||||
|
||||
import org.xml.sax.SAXException;
|
||||
import org.w3c.dom.*;
|
||||
import javax.xml.parsers.*;
|
||||
|
||||
import org.dspace.content.MetadataSchema;
|
||||
import org.dspace.core.ConfigurationManager;
|
||||
|
||||
/**
|
||||
* Submission form generator for DSpace. Reads and parses the installation
|
||||
* form definitions file, input-forms.xml, from the configuration directory.
|
||||
* A forms definition details the page and field layout of the metadata
|
||||
* collection pages used by the submission process. Each forms definition
|
||||
* starts with a unique name that gets associated with that form set.
|
||||
*
|
||||
* The file also specifies which collections use which form sets. At a
|
||||
* minimum, the definitions file must define a default mapping from the
|
||||
* placeholder collection #0 to the distinguished form 'default'. Any
|
||||
* collections that use a custom form set are listed paired with the name
|
||||
* of the form set they use.
|
||||
*
|
||||
* The definitions file also may contain sets of value pairs. Each value pair
|
||||
* will contain one string that the user reads, and a paired string that will
|
||||
* supply the value stored in the database if its sibling display value gets
|
||||
* selected from a choice list.
|
||||
*
|
||||
* @author Brian S. Hughes
|
||||
* @version $Revision$
|
||||
*/
|
||||
|
||||
public class DCInputsReader
|
||||
{
|
||||
/**
|
||||
* The ID of the default collection. Will never be the ID of a named
|
||||
* collection
|
||||
*/
|
||||
public static final String DEFAULT_COLLECTION = "default";
|
||||
|
||||
/** Name of the form definition XML file */
|
||||
static final String FORM_DEF_FILE = "input-forms.xml";
|
||||
|
||||
/** Keyname for storing dropdown value-pair set name */
|
||||
static final String PAIR_TYPE_NAME = "value-pairs-name";
|
||||
|
||||
/** The fully qualified pathname of the form definition XML file */
|
||||
private String defsFile = ConfigurationManager.getProperty("dspace.dir")
|
||||
+ File.separator + "config" + File.separator + FORM_DEF_FILE;
|
||||
|
||||
/**
|
||||
* Reference to the collections to forms map, computed from the forms
|
||||
* definition file
|
||||
*/
|
||||
private Map<String, String> whichForms = null;
|
||||
|
||||
/**
|
||||
* Reference to the forms definitions map, computed from the forms
|
||||
* definition file
|
||||
*/
|
||||
private Map<String, List<List<Map<String, String>>>> formDefns = null;
|
||||
|
||||
/**
|
||||
* Reference to the value-pairs map, computed from the forms definition file
|
||||
*/
|
||||
private Map<String, List<String>> valuePairs = null; // Holds display/storage pairs
|
||||
|
||||
/**
|
||||
* Mini-cache of last DCInputSet requested. If submissions are not typically
|
||||
* form-interleaved, there will be a modest win.
|
||||
*/
|
||||
private DCInputSet lastInputSet = null;
|
||||
|
||||
/**
|
||||
* Parse an XML encoded submission forms template file, and create a hashmap
|
||||
* containing all the form information. This hashmap will contain three top
|
||||
* level structures: a map between collections and forms, the definition for
|
||||
* each page of each form, and lists of pairs of values that populate
|
||||
* selection boxes.
|
||||
*/
|
||||
|
||||
    public DCInputsReader()
        throws DCInputsReaderException
    {
        buildInputs(defsFile);
    }

    public DCInputsReader(String fileName)
        throws DCInputsReaderException
    {
        buildInputs(fileName);
    }

    private void buildInputs(String fileName)
        throws DCInputsReaderException
    {
        whichForms = new HashMap<String, String>();
        formDefns = new HashMap<String, List<List<Map<String, String>>>>();
        valuePairs = new HashMap<String, List<String>>();

        String uri = "file:" + new File(fileName).getAbsolutePath();

        try
        {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setValidating(false);
            factory.setIgnoringComments(true);
            factory.setIgnoringElementContentWhitespace(true);

            DocumentBuilder db = factory.newDocumentBuilder();
            Document doc = db.parse(uri);
            doNodes(doc);
            checkValues();
        }
        catch (FactoryConfigurationError fe)
        {
            throw new DCInputsReaderException("Cannot create Submission form parser", fe);
        }
        catch (Exception e)
        {
            throw new DCInputsReaderException("Error creating submission forms: " + e);
        }
    }
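The parsing approach in buildInputs can be exercised standalone with only the JDK's DOM APIs. The following is a sketch, not DSpace code: it parses an in-memory fragment using the same factory settings, then reads the `name-map` attributes the way processMap does; the handle and form name are invented values.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class FormMapDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical minimal form-map fragment; values are placeholders
        String xml =
            "<input-forms>" +
            "  <form-map>" +
            "    <name-map collection-handle=\"default\" form-name=\"traditional\"/>" +
            "  </form-map>" +
            "</input-forms>";

        // Same factory settings buildInputs uses
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setValidating(false);
        factory.setIgnoringComments(true);
        factory.setIgnoringElementContentWhitespace(true);

        DocumentBuilder db = factory.newDocumentBuilder();
        Document doc = db.parse(
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));

        // Read the attributes processMap would store in the whichForms map
        NodeList maps = doc.getElementsByTagName("name-map");
        Element map = (Element) maps.item(0);
        System.out.println(map.getAttribute("collection-handle") + " -> "
                + map.getAttribute("form-name"));
        // prints: default -> traditional
    }
}
```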
    public Iterator<String> getPairsNameIterator()
    {
        return valuePairs.keySet().iterator();
    }

    public List<String> getPairs(String name)
    {
        return valuePairs.get(name);
    }

    /**
     * Returns the set of DC inputs used for a particular collection, or the
     * default set if no inputs defined for the collection
     *
     * @param collectionHandle
     *            collection's unique Handle
     * @return DC input set
     * @throws DCInputsReaderException
     *             if no default set defined
     */
    public DCInputSet getInputs(String collectionHandle)
        throws DCInputsReaderException
    {
        String formName = whichForms.get(collectionHandle);
        if (formName == null)
        {
            formName = whichForms.get(DEFAULT_COLLECTION);
        }
        if (formName == null)
        {
            throw new DCInputsReaderException("No form designated as default");
        }
        // check mini-cache, and return if match
        if (lastInputSet != null && lastInputSet.getFormName().equals(formName))
        {
            return lastInputSet;
        }
        // cache miss - construct new DCInputSet
        List<List<Map<String, String>>> pages = formDefns.get(formName);
        if (pages == null)
        {
            throw new DCInputsReaderException("Missing the " + formName + " form");
        }
        lastInputSet = new DCInputSet(formName, pages, valuePairs);
        return lastInputSet;
    }

    /**
     * Return the number of pages the inputs span for a designated collection
     * @param collectionHandle collection's unique Handle
     * @return number of pages of input
     * @throws DCInputsReaderException if no default set defined
     */
    public int getNumberInputPages(String collectionHandle)
        throws DCInputsReaderException
    {
        return getInputs(collectionHandle).getNumberPages();
    }

    /**
     * Process the top level child nodes in the passed top-level node. These
     * should correspond to the collection-form maps, the form definitions, and
     * the display/storage word pairs.
     */
    private void doNodes(Node n)
        throws SAXException, DCInputsReaderException
    {
        if (n == null)
        {
            return;
        }
        Node e = getElement(n);
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        boolean foundMap = false;
        boolean foundDefs = false;
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            if ((nd == null) || isEmptyTextNode(nd))
            {
                continue;
            }
            String tagName = nd.getNodeName();
            if (tagName.equals("form-map"))
            {
                processMap(nd);
                foundMap = true;
            }
            else if (tagName.equals("form-definitions"))
            {
                processDefinition(nd);
                foundDefs = true;
            }
            else if (tagName.equals("form-value-pairs"))
            {
                processValuePairs(nd);
            }
            // Ignore unknown nodes
        }
        if (!foundMap)
        {
            throw new DCInputsReaderException("No collection to form map found");
        }
        if (!foundDefs)
        {
            throw new DCInputsReaderException("No form definition found");
        }
    }

    /**
     * Process the form-map section of the XML file.
     * Each element looks like:
     * <name-map collection-handle="hdl" form-name="name" />
     * Extract the collection handle and form name, put name in hashmap keyed
     * by the collection handle.
     */
    private void processMap(Node e)
        throws SAXException
    {
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            if (nd.getNodeName().equals("name-map"))
            {
                String id = getAttribute(nd, "collection-handle");
                String value = getAttribute(nd, "form-name");
                String content = getValue(nd);
                if (id == null)
                {
                    throw new SAXException("name-map element is missing collection-handle attribute");
                }
                if (value == null)
                {
                    throw new SAXException("name-map element is missing form-name attribute");
                }
                if (content != null && content.length() > 0)
                {
                    throw new SAXException("name-map element has content, it should be empty.");
                }
                whichForms.put(id, value);
            } // ignore any child node that isn't a "name-map"
        }
    }

    /**
     * Process the form-definitions section of the XML file. Each element is
     * formed thusly: <form name="formname">...pages...</form> Each pages
     * subsection is formed: <page number="#"> ...fields... </page> Each field
     * is formed from: dc-element, dc-qualifier, label, hint, input-type name,
     * required text, and repeatable flag.
     */
    private void processDefinition(Node e)
        throws SAXException, DCInputsReaderException
    {
        int numForms = 0;
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            // process each form definition
            if (nd.getNodeName().equals("form"))
            {
                numForms++;
                String formName = getAttribute(nd, "name");
                if (formName == null)
                {
                    throw new SAXException("form element has no name attribute");
                }
                List<List<Map<String, String>>> pages = new ArrayList<List<Map<String, String>>>(); // the form contains pages
                formDefns.put(formName, pages);
                NodeList pl = nd.getChildNodes();
                int lenpg = pl.getLength();
                for (int j = 0; j < lenpg; j++)
                {
                    Node npg = pl.item(j);
                    // process each page definition
                    if (npg.getNodeName().equals("page"))
                    {
                        String pgNum = getAttribute(npg, "number");
                        if (pgNum == null)
                        {
                            throw new SAXException("Form " + formName + " has no identified pages");
                        }
                        List<Map<String, String>> page = new ArrayList<Map<String, String>>();
                        pages.add(page);
                        NodeList flds = npg.getChildNodes();
                        int lenflds = flds.getLength();
                        for (int k = 0; k < lenflds; k++)
                        {
                            Node nfld = flds.item(k);
                            if (nfld.getNodeName().equals("field"))
                            {
                                // process each field definition
                                Map<String, String> field = new HashMap<String, String>();
                                page.add(field);
                                processPageParts(formName, pgNum, nfld, field);
                                String error = checkForDups(formName, field, pages);
                                if (error != null)
                                {
                                    throw new SAXException(error);
                                }
                            }
                        }
                    } // ignore any child that is not a 'page'
                }
                // sanity check number of pages
                if (pages.size() < 1)
                {
                    throw new DCInputsReaderException("Form " + formName + " has no pages");
                }
            }
        }
        if (numForms == 0)
        {
            throw new DCInputsReaderException("No form definition found");
        }
    }

    /**
     * Process parts of a field
     * At the end, make sure that input-types 'qualdrop_value' and
     * 'twobox' are marked repeatable. Complain if dc-element, label,
     * or input-type are missing.
     */
    private void processPageParts(String formName, String page, Node n, Map<String, String> field)
        throws SAXException
    {
        NodeList nl = n.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            if (!isEmptyTextNode(nd))
            {
                String tagName = nd.getNodeName();
                String value = getValue(nd);
                field.put(tagName, value);
                if (tagName.equals("input-type"))
                {
                    if (value.equals("dropdown")
                            || value.equals("qualdrop_value")
                            || value.equals("list"))
                    {
                        String pairTypeName = getAttribute(nd, PAIR_TYPE_NAME);
                        if (pairTypeName == null)
                        {
                            throw new SAXException("Form " + formName + ", field " +
                                    field.get("dc-element") +
                                    "." + field.get("dc-qualifier") +
                                    " has no name attribute");
                        }
                        else
                        {
                            field.put(PAIR_TYPE_NAME, pairTypeName);
                        }
                    }
                }
                else if (tagName.equals("vocabulary"))
                {
                    String closedVocabularyString = getAttribute(nd, "closed");
                    field.put("closedVocabulary", closedVocabularyString);
                }
            }
        }
        String missing = null;
        if (field.get("dc-element") == null)
        {
            missing = "dc-element";
        }
        if (field.get("label") == null)
        {
            missing = "label";
        }
        if (field.get("input-type") == null)
        {
            missing = "input-type";
        }
        if (missing != null)
        {
            String msg = "Required field " + missing + " missing on page " + page + " of form " + formName;
            throw new SAXException(msg);
        }
        String type = field.get("input-type");
        if (type.equals("twobox") || type.equals("qualdrop_value"))
        {
            String rpt = field.get("repeatable");
            if ((rpt == null) ||
                ((!rpt.equalsIgnoreCase("yes")) &&
                 (!rpt.equalsIgnoreCase("true"))))
            {
                String msg = "The field '" + field.get("label") + "' must be repeatable";
                throw new SAXException(msg);
            }
        }
    }

    /**
     * Check that this is the only field with the name dc-element.dc-qualifier
     * If there is a duplicate, return an error message, else return null;
     */
    private String checkForDups(String formName, Map<String, String> field, List<List<Map<String, String>>> pages)
    {
        int matches = 0;
        String err = null;
        String schema = field.get("dc-schema");
        String elem = field.get("dc-element");
        String qual = field.get("dc-qualifier");
        if ((schema == null) || (schema.equals("")))
        {
            schema = MetadataSchema.DC_SCHEMA;
        }
        String schemaTest;

        for (int i = 0; i < pages.size(); i++)
        {
            List<Map<String, String>> pg = pages.get(i);
            for (int j = 0; j < pg.size(); j++)
            {
                Map<String, String> fld = pg.get(j);
                if ((fld.get("dc-schema") == null) ||
                    ((fld.get("dc-schema")).equals("")))
                {
                    schemaTest = MetadataSchema.DC_SCHEMA;
                }
                else
                {
                    schemaTest = fld.get("dc-schema");
                }

                // Are the schema and element the same? If so, check the qualifier
                if (((fld.get("dc-element")).equals(elem)) &&
                    (schemaTest.equals(schema)))
                {
                    String ql = fld.get("dc-qualifier");
                    if (qual != null)
                    {
                        if ((ql != null) && ql.equals(qual))
                        {
                            matches++;
                        }
                    }
                    else if (ql == null)
                    {
                        matches++;
                    }
                }
            }
        }
        if (matches > 1)
        {
            err = "Duplicate field " + schema + "." + elem + "." + qual + " detected in form " + formName;
        }

        return err;
    }

    /**
     * Process the form-value-pairs section of the XML file.
     * Each element is formed thusly:
     * <value-pairs name="..." dc-term="...">
     *   <pair>
     *     <displayed-value>displayed name</displayed-value>
     *     <stored-value>stored name</stored-value>
     *   </pair>
     * For each value-pairs element, create a new vector, and extract all
     * the pairs contained within it. Put the display and storage values,
     * respectively, in the next slots in the vector. Store the vector
     * in the passed in hashmap.
     */
    private void processValuePairs(Node e)
        throws SAXException
    {
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            String tagName = nd.getNodeName();

            // process each value-pairs set
            if (tagName.equals("value-pairs"))
            {
                String pairsName = getAttribute(nd, PAIR_TYPE_NAME);
                String dcTerm = getAttribute(nd, "dc-term");
                if (pairsName == null)
                {
                    String errString =
                        "Missing name attribute for value-pairs for DC term " + dcTerm;
                    throw new SAXException(errString);
                }
                List<String> pairs = new ArrayList<String>();
                valuePairs.put(pairsName, pairs);
                NodeList cl = nd.getChildNodes();
                int lench = cl.getLength();
                for (int j = 0; j < lench; j++)
                {
                    Node nch = cl.item(j);
                    String display = null;
                    String storage = null;

                    if (nch.getNodeName().equals("pair"))
                    {
                        NodeList pl = nch.getChildNodes();
                        int plen = pl.getLength();
                        for (int k = 0; k < plen; k++)
                        {
                            Node vn = pl.item(k);
                            String vName = vn.getNodeName();
                            if (vName.equals("displayed-value"))
                            {
                                display = getValue(vn);
                            }
                            else if (vName.equals("stored-value"))
                            {
                                storage = getValue(vn);
                                if (storage == null)
                                {
                                    storage = "";
                                }
                            } // ignore any children that aren't 'displayed-value' or 'stored-value'
                        }
                        pairs.add(display);
                        pairs.add(storage);
                    } // ignore any children that aren't a 'pair'
                }
            } // ignore any children that aren't a 'value-pairs'
        }
    }

    /**
     * Check that all referenced value-pairs are present
     * and field is consistent
     *
     * Throws DCInputsReaderException if detects a missing value-pair.
     */
    private void checkValues()
        throws DCInputsReaderException
    {
        // Step through every field of every page of every form
        Iterator<String> ki = formDefns.keySet().iterator();
        while (ki.hasNext())
        {
            String idName = ki.next();
            List<List<Map<String, String>>> pages = formDefns.get(idName);
            for (int i = 0; i < pages.size(); i++)
            {
                List<Map<String, String>> page = pages.get(i);
                for (int j = 0; j < page.size(); j++)
                {
                    Map<String, String> fld = page.get(j);
                    // verify reference in certain input types
                    String type = fld.get("input-type");
                    if (type.equals("dropdown")
                            || type.equals("qualdrop_value")
                            || type.equals("list"))
                    {
                        String pairsName = fld.get(PAIR_TYPE_NAME);
                        List<String> v = valuePairs.get(pairsName);
                        if (v == null)
                        {
                            String errString = "Cannot find value pairs for " + pairsName;
                            throw new DCInputsReaderException(errString);
                        }
                    }
                    // if visibility restricted, make sure field is not required
                    String visibility = fld.get("visibility");
                    if (visibility != null && visibility.length() > 0)
                    {
                        String required = fld.get("required");
                        if (required != null && required.length() > 0)
                        {
                            String errString = "Field '" + fld.get("label") +
                                "' is required but invisible";
                            throw new DCInputsReaderException(errString);
                        }
                    }
                }
            }
        }
    }

    private Node getElement(Node nd)
    {
        NodeList nl = nd.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node n = nl.item(i);
            if (n.getNodeType() == Node.ELEMENT_NODE)
            {
                return n;
            }
        }
        return null;
    }

    private boolean isEmptyTextNode(Node nd)
    {
        boolean isEmpty = false;
        if (nd.getNodeType() == Node.TEXT_NODE)
        {
            String text = nd.getNodeValue().trim();
            if (text.length() == 0)
            {
                isEmpty = true;
            }
        }
        return isEmpty;
    }

    /**
     * Returns the value of the node's attribute named <name>
     */
    private String getAttribute(Node e, String name)
    {
        NamedNodeMap attrs = e.getAttributes();
        int len = attrs.getLength();
        if (len > 0)
        {
            int i;
            for (i = 0; i < len; i++)
            {
                Node attr = attrs.item(i);
                if (name.equals(attr.getNodeName()))
                {
                    return attr.getNodeValue().trim();
                }
            }
        }
        // no such attribute
        return null;
    }

    /**
     * Returns the value found in the Text node (if any) in the
     * node list that's passed in.
     */
    private String getValue(Node nd)
    {
        NodeList nl = nd.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node n = nl.item(i);
            short type = n.getNodeType();
            if (type == Node.TEXT_NODE)
            {
                return n.getNodeValue().trim();
            }
        }
        // Didn't find a text node
        return null;
    }
}
@@ -1,56 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

/**
 * This is a superclass for exceptions representing a failure when
 * importing or exporting a package. E.g., unacceptable package format
 * or contents. Implementations should throw one of the more specific
 * exceptions. This class is intended for declarations and catch clauses.
 *
 * @author Larry Stone
 * @version $Revision: 3761 $
 */
public class DCInputsReaderException extends Exception
{
    /**
     * No-args constructor.
     */
    public DCInputsReaderException()
    {
        super();
    }

    /**
     * Constructor for a given message.
     * @param message diagnostic message.
     */
    public DCInputsReaderException(String message)
    {
        super(message);
    }

    /**
     * Constructor for a given cause.
     * @param cause throwable that caused this exception
     */
    public DCInputsReaderException(Throwable cause)
    {
        super(cause);
    }

    /**
     * Constructor to create a new exception wrapping it around another exception.
     * @param message diagnostic message.
     * @param cause throwable that caused this exception
     */
    public DCInputsReaderException(String message, Throwable cause)
    {
        super(message, cause);
    }
}
@@ -1,153 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import org.dspace.core.ConfigurationManager;
import org.dspace.storage.rdbms.DatabaseManager;
import org.apache.log4j.Logger;

import javax.servlet.ServletContextListener;
import javax.servlet.ServletContextEvent;

import java.beans.Introspector;
import java.net.URL;
import java.net.URLConnection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.util.Enumeration;

/**
 * Class to initialize / cleanup resources used by DSpace when the web application
 * is started or stopped
 */
public class DSpaceContextListener implements ServletContextListener
{
    private static Logger log = Logger.getLogger(DSpaceContextListener.class);

    /**
     * The DSpace config parameter, this is where the path to the DSpace
     * configuration file can be obtained
     */
    public static final String DSPACE_CONFIG_PARAMETER = "dspace-config";

    /**
     * Initialize any resources required by the application
     * @param event
     */
    public void contextInitialized(ServletContextEvent event)
    {
        // On Windows, URL caches can cause problems, particularly with undeployment
        // So, here we attempt to disable them if we detect that we are running on Windows
        try
        {
            String osName = System.getProperty("os.name");

            if (osName != null && osName.toLowerCase().contains("windows"))
            {
                URL url = new URL("http://localhost/");
                URLConnection urlConn = url.openConnection();
                urlConn.setDefaultUseCaches(false);
            }
        }
        // Any errors thrown in disabling the caches aren't significant to
        // the normal execution of the application, so we ignore them
        catch (RuntimeException e)
        {
            log.error(e.getMessage(), e);
        }
        catch (Exception e)
        {
            log.error(e.getMessage(), e);
        }

        // Paths to the various config files
        String dspaceConfig = null;

        /**
         * Stage 1
         *
         * Locate the dspace config
         */

        // first check the local per webapp parameter, then check the global parameter.
        dspaceConfig = event.getServletContext().getInitParameter(DSPACE_CONFIG_PARAMETER);

        // Finally, if no config parameter found throw an error
        if (dspaceConfig == null || "".equals(dspaceConfig))
        {
            throw new IllegalStateException(
                "\n\nDSpace has failed to initialize. This has occurred because it was unable to determine \n" +
                "where the dspace.cfg file is located. The path to the configuration file should be stored \n" +
                "in a context variable, '" + DSPACE_CONFIG_PARAMETER + "', in the global context. \n" +
                "No context variable was found in either location.\n\n");
        }

        /**
         * Stage 2
         *
         * Load the dspace config. Also may load log4j configuration.
         * (Please rely on ConfigurationManager or Log4j to configure logging)
         */
        try
        {
            ConfigurationManager.loadConfig(dspaceConfig);
        }
        catch (RuntimeException e)
        {
            throw e;
        }
        catch (Exception e)
        {
            throw new IllegalStateException(
                "\n\nDSpace has failed to initialize, during stage 2. Error while attempting to read the \n" +
                "DSpace configuration file (Path: '" + dspaceConfig + "'). \n" +
                "This has likely occurred because either the file does not exist, or its permissions \n" +
                "are set incorrectly, or the path to the configuration file is incorrect. The path to \n" +
                "the DSpace configuration file is stored in a context variable, 'dspace-config', in \n" +
                "either the local servlet or global context.\n\n", e);
        }
    }

    /**
     * Clean up resources used by the application when stopped
     *
     * @param event
     */
    public void contextDestroyed(ServletContextEvent event)
    {
        try
        {
            // Remove the database pool
            DatabaseManager.shutdown();

            // Clean out the introspector
            Introspector.flushCaches();

            // Remove any drivers registered by this classloader
            for (Enumeration e = DriverManager.getDrivers(); e.hasMoreElements();)
            {
                Driver driver = (Driver) e.nextElement();
                if (driver.getClass().getClassLoader() == getClass().getClassLoader())
                {
                    DriverManager.deregisterDriver(driver);
                }
            }
        }
        catch (RuntimeException e)
        {
            log.error("Failed to cleanup ClassLoader for webapp", e);
        }
        catch (Exception e)
        {
            log.error("Failed to cleanup ClassLoader for webapp", e);
        }
    }
}
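Since the listener reads the configuration path from the `dspace-config` init parameter, a deployment typically declares it in the webapp descriptor. A sketch of the relevant `web.xml` entries, where the path is a placeholder for the local install location:

```xml
<!-- Hypothetical web.xml fragment; the param-value path is a placeholder -->
<context-param>
  <param-name>dspace-config</param-name>
  <param-value>/dspace/config/dspace.cfg</param-value>
</context-param>

<listener>
  <listener-class>org.dspace.app.util.DSpaceContextListener</listener-class>
</listener>
```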
@@ -1,348 +0,0 @@
|
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.app.util;
|
||||
|
||||
|
||||
import java.io.File;
|
||||
import java.io.IOException;
|
||||
import java.net.UnknownHostException;
|
||||
import java.text.SimpleDateFormat;
|
||||
import java.util.Calendar;
|
||||
import java.util.Date;
|
||||
|
||||
import org.apache.commons.lang.time.DateUtils;
|
||||
import org.apache.log4j.FileAppender;
|
||||
import org.apache.log4j.helpers.LogLog;
|
||||
import org.apache.log4j.spi.LoggingEvent;
|
||||
|
||||
/**
|
||||
* Special log appender for log4j. Adds the current date (ie. year-mon) to
|
||||
* the end of the file name, so that rolling on to the next log is simply
|
||||
* a case of starting a new one - no renaming of old logs.
|
||||
*
|
||||
* This is advisable if you are using Windows, and have multiple applications
|
||||
* (ie. dspace, dspace-oai, dspace-sword) that all want to write to the same log file,
|
||||
* as each would otherwise try to rename the old files during rollover.
|
||||
*
|
||||
* An example log4j.properties (one log per month, retains three months of logs)
|
||||
*
|
||||
* log4j.rootCategory=INFO, A1
|
||||
* log4j.appender.A1=org.dspace.app.util.DailyFileAppender
|
||||
* log4j.appender.A1.File=@@log.dir@@/dspace.log
|
||||
* log4j.appender.A1.DatePattern=yyyy-MM
|
||||
* log4j.appender.A1.MaxLogs=3
|
||||
* log4j.appender.A1.layout=org.apache.log4j.PatternLayout
|
||||
* log4j.appender.A1.layout.ConversionPattern=%d %-5p %c @ %m%n
|
||||
*
|
||||
*/
|
||||
public class DailyFileAppender extends FileAppender
|
||||
{
|
||||
/**
|
||||
* The fixed date pattern to be used if one is not specified.
|
||||
*/
|
||||
private static final String DATE_PATTERN = "yyyy-MM-dd";
|
||||
|
||||
/**
|
||||
* The folder under which daily folders are created. This can be a absolute path
|
||||
* or relative path also.
|
||||
* e.g. JavaLogs/CPRILog or F:/LogFiles/CPRILog
|
||||
*/
|
||||
private String mstrFileName;
|
||||
|
||||
/**
|
||||
* Used internally and contains the name of the date derived from current system date.
|
||||
*/
|
||||
private Date mstrDate = new Date(System.currentTimeMillis());
|
||||
|
||||
/**
|
||||
* Holds the user specified DatePattern,
|
||||
*/
|
||||
private String mstrDatePattern = DATE_PATTERN;
|
||||
|
||||
private boolean mMonthOnly = false;
|
||||
|
||||
/**
|
||||
* The date formatter object used for parsing the user specified DatePattern.
|
||||
*/
|
||||
private SimpleDateFormat mobjSDF;
|
||||
|
||||
private boolean mWithHostName = false;
|
||||
|
||||
private int mMaxLogs = 0;
|
||||
|
||||
/**
|
||||
* Default constructor. This is required as the appender class is dynamically
|
||||
* loaded.
|
||||
*/
|
||||
public DailyFileAppender()
|
||||
{
|
||||
super();
|
||||
}
|
||||
|
||||
/* (non-Javadoc)
|
||||
* @see org.apache.log4j.FileAppender#activateOptions()
|
||||
*/
|
||||
public void activateOptions()
|
||||
{
|
||||
setFileName();
|
||||
cleanupOldFiles();
|
||||
super.activateOptions();
|
||||
}
|
||||
|
||||
/*------------------------------------------------------------------------------
|
||||
* Getters
|
||||
*----------------------------------------------------------------------------*/
|
||||
public String getDatePattern()
|
||||
{
|
||||
return this.mstrDatePattern;
|
||||
}
|
||||
|
||||
public String getFile()
|
||||
{
|
||||
return this.mstrFileName;
|
||||
}
|
||||
|
||||
public boolean getWithHost()
|
||||
{
|
||||
return mWithHostName;
|
||||
}
|
||||
|
||||
public int getMaxLogs()
|
||||
{
|
||||
return mMaxLogs;
|
||||
}
|
||||
|
||||
/*------------------------------------------------------------------------------
|
||||
* Setters
|
||||
*----------------------------------------------------------------------------*/
|
||||
public void setDatePattern(String pstrPattern)
|
||||
{
|
||||
this.mstrDatePattern = checkPattern(pstrPattern);
|
||||
if (mstrDatePattern.contains("dd") || mstrDatePattern.contains("DD"))
|
||||
{
|
||||
mMonthOnly = false;
|
||||
}
|
||||
else
|
||||
{
|
||||
mMonthOnly = true;
|
||||
}
|
||||
}
|
||||
|
||||
public void setFile(String file)
|
||||
{
|
||||
// Trim spaces from both ends. The users probably does not want
|
||||
// trailing spaces in file names.
|
||||
String val = file.trim();
|
||||
mstrFileName = val;
|
||||
}
|
||||
|
||||
public void setWithHost(boolean wh)
|
||||
{
|
||||
mWithHostName = wh;
|
||||
}
|
||||
|
||||
public void setMaxLogs(int ml)
|
||||
{
|
||||
mMaxLogs = ml;
|
||||
}
|
||||
|
||||
/*------------------------------------------------------------------------------
|
||||
* Methods
|
||||
*----------------------------------------------------------------------------*/
|
||||
/* (non-Javadoc)
|
||||
* @see org.apache.log4j.WriterAppender#subAppend(org.apache.log4j.spi.LoggingEvent)
|
||||
*/
|
||||
protected void subAppend(LoggingEvent pobjEvent)
|
||||
{
|
||||
Date dtNow = new Date(System.currentTimeMillis());
|
||||
|
||||
boolean rollover = false;
|
||||
|
||||
if (mMonthOnly)
|
||||
{
|
||||
Calendar now = Calendar.getInstance();
|
||||
Calendar cur = Calendar.getInstance();
|
||||
now.setTime(dtNow);
|
||||
cur.setTime(mstrDate);
|
||||
rollover = !(now.get(Calendar.YEAR) == cur.get(Calendar.YEAR) && now.get(Calendar.MONTH) == cur.get(Calendar.MONTH));
|
||||
}
|
||||
else
|
||||
{
|
||||
rollover = !(DateUtils.isSameDay(dtNow, mstrDate));
|
||||
}
|
||||
|
||||
if (rollover)
|
||||
{
|
||||
try
|
||||
{
|
||||
rollOver(dtNow);
|
||||
}
|
||||
catch (IOException IOEx)
|
||||
{
|
||||
LogLog.error("rollOver() failed!", IOEx);
|
||||
}
|
||||
}
|
||||
|
||||
super.subAppend(pobjEvent);
|
||||
}
|
||||
|
||||
/*------------------------------------------------------------------------------
|
||||
* Helpers
|
||||
*----------------------------------------------------------------------------*/
|
||||
/**
|
||||
* The helper function to validate the DatePattern.
|
||||
* @param pstrPattern The DatePattern to be validated.
|
||||
* @return The validated date pattern or defautlt DATE_PATTERN
|
||||
*/
|
||||
private String checkPattern(String pstrPattern)
|
||||
{
|
||||
String strRet = null;
|
||||
SimpleDateFormat objFmt = new SimpleDateFormat(DATE_PATTERN);
|
||||
|
||||
try
|
||||
{
|
||||
this.mobjSDF = new SimpleDateFormat(pstrPattern);
|
||||
strRet = pstrPattern;
|
||||
}
|
||||
catch (NullPointerException NPExIgnore)
|
||||
{
|
||||
LogLog.error("Invalid DatePattern " + pstrPattern, NPExIgnore);
|
||||
this.mobjSDF = objFmt;
|
||||
strRet = DATE_PATTERN;
|
||||
}
|
||||
catch (IllegalArgumentException IlArgExIgnore)
|
||||
{
|
||||
LogLog.error("Invalid DatePattern " + pstrPattern, IlArgExIgnore);
|
||||
this.mobjSDF = objFmt;
|
||||
strRet = DATE_PATTERN;
|
||||
}
|
||||
finally
|
||||
{
|
||||
objFmt = null;
|
||||
}
|
||||
return strRet;
|
||||
}

    private static boolean deletingFiles = false;

    private void cleanupOldFiles()
    {
        // If we need to delete log files
        if (mMaxLogs > 0 && !deletingFiles)
        {
            deletingFiles = true;

            // Determine the final file extension with the hostname
            String hostFileExt = null;
            try
            {
                hostFileExt = "." + java.net.InetAddress.getLocalHost().getHostName();
            }
            catch (UnknownHostException e)
            {
                LogLog.error("Unable to retrieve host name");
            }

            try
            {
                // Array to hold the logs we are going to keep
                File[] logsToKeep = new File[mMaxLogs];

                // Get a 'master' file handle, and the parent directory from it
                File logMaster = new File(mstrFileName);
                File logDir = logMaster.getParentFile();
                if (logDir.isDirectory())
                {
                    // Iterate all the files in that directory
                    File[] logArr = logDir.listFiles();
                    for (File curLog : logArr)
                    {
                        LogLog.debug("Comparing '" + curLog.getAbsolutePath() + "' to '" + mstrFileName + "'");
                        String name = curLog.getAbsolutePath();

                        // First, see if we are not using hostname, or the log file ends with this host
                        if (!mWithHostName || (hostFileExt != null && name.endsWith(hostFileExt)))
                        {
                            // Check that the file is indeed one we want (contains the master file name)
                            if (name.contains(mstrFileName))
                            {
                                // Iterate through the array of logs we are keeping
                                for (int i = 0; curLog != null && i < logsToKeep.length; i++)
                                {
                                    // Is this slot in the 'to keep' array still empty?
                                    if (logsToKeep[i] == null)
                                    {
                                        // Empty space, retain this log file
                                        logsToKeep[i] = curLog;
                                        curLog = null;
                                    }
                                    // If the 'kept' file is older than the current one
                                    else if (logsToKeep[i].getName().compareTo(curLog.getName()) < 0)
                                    {
                                        // Replace tested entry with current file
                                        File temp = logsToKeep[i];
                                        logsToKeep[i] = curLog;
                                        curLog = temp;
                                    }
                                }

                                // If we have a 'current' entry at this point, it's a log we don't want
                                if (curLog != null)
                                {
                                    LogLog.debug("Deleting log " + curLog.getName());
                                    if (!curLog.delete())
                                    {
                                        LogLog.error("Unable to delete log file");
                                    }
                                }
                            }
                        }
                    }
                }
            }
            catch (Exception e)
            {
                // Don't worry about exceptions
            }
            finally
            {
                deletingFiles = false;
            }
        }
    }

    /**
     * This function is responsible for performing the actual file rollover.
     * @param dtNow the current system date, used to build the new log file name.
     * @throws IOException
     */
    private void rollOver(Date dtNow) throws IOException
    {
        mstrDate = dtNow;
        setFileName();
        this.setFile(fileName, true, bufferedIO, bufferSize);

        cleanupOldFiles();
    }

    private void setFileName()
    {
        fileName = mstrFileName + "." + mobjSDF.format(mstrDate);

        if (mWithHostName)
        {
            try
            {
                fileName += "." + java.net.InetAddress.getLocalHost().getHostName();
            }
            catch (UnknownHostException e)
            {
                LogLog.error("Unable to retrieve host name");
            }
        }
    }
}
File diff suppressed because it is too large
@@ -1,158 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.util.Map;
import java.util.Set;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Enumeration;
import java.sql.SQLException;

import org.apache.log4j.Logger;
import org.dspace.authorize.AuthorizeManager;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;

/**
 * Static utility class to manage configuration for exposure (hiding) of
 * certain Item metadata fields.
 *
 * This class answers the question, "is the user allowed to see this
 * metadata field?" Any external interface (UI, OAI-PMH, etc.) that
 * disseminates metadata should consult it before disseminating the value
 * of a metadata field.
 *
 * Since the MetadataExposure.isHidden() method gets called in a lot of inner
 * loops, it is important to implement it efficiently, in both time and
 * memory utilization. It computes an answer without consuming ANY memory
 * (e.g. it does not build any temporary Strings) and in close to constant
 * time by use of hash tables. Although most sites will only hide a few
 * fields, we can't predict what the usage will be, so it's better to make it
 * scalable.
 *
 * The algorithm is as follows:
 * 1. If a Context is provided and it has a user who is an Administrator,
 *    always grant access (return false).
 * 2. Return true if the field is on the hidden list, false otherwise.
 *
 * The internal maps are populated from DSpace Configuration at the first
 * call, in case the properties are not available in the static context.
 *
 * Configuration Properties:
 * ## hide a single metadata field
 * #metadata.hide.SCHEMA.ELEMENT[.QUALIFIER] = true
 * # example: dc.type
 * metadata.hide.dc.type = true
 * # example: dc.description.provenance
 * metadata.hide.dc.description.provenance = true
 *
 * @author Larry Stone
 * @version $Revision: 3734 $
 */
public class MetadataExposure
{
    private static Logger log = Logger.getLogger(MetadataExposure.class);

    private static Map<String,Set<String>> hiddenElementSets = null;
    private static Map<String,Map<String,Set<String>>> hiddenElementMaps = null;

    private static final String CONFIG_PREFIX = "metadata.hide.";

    public static boolean isHidden(Context context, String schema, String element, String qualifier)
        throws SQLException
    {
        // the administrator's override
        if (context != null && AuthorizeManager.isAdmin(context))
        {
            return false;
        }

        if (!isInitialized())
        {
            init();
        }

        // for schema.element, just check schema->elementSet
        if (qualifier == null)
        {
            Set<String> elts = hiddenElementSets.get(schema);
            return elts == null ? false : elts.contains(element);
        }
        // for schema.element.qualifier, check schema->eltMap->qualSet
        else
        {
            Map<String,Set<String>> elts = hiddenElementMaps.get(schema);
            if (elts == null)
            {
                return false;
            }
            Set<String> quals = elts.get(element);
            return quals == null ? false : quals.contains(qualifier);
        }
    }

    private static boolean isInitialized()
    {
        return hiddenElementSets != null;
    }

    // load maps from configuration unless it's already done.
    private static synchronized void init()
    {
        if (!isInitialized())
        {
            hiddenElementSets = new HashMap<String,Set<String>>();
            hiddenElementMaps = new HashMap<String,Map<String,Set<String>>>();

            Enumeration pne = ConfigurationManager.propertyNames();
            while (pne.hasMoreElements())
            {
                String key = (String)pne.nextElement();
                if (key.startsWith(CONFIG_PREFIX))
                {
                    String mdField = key.substring(CONFIG_PREFIX.length());
                    String segment[] = mdField.split("\\.", 3);

                    // got schema.element.qualifier
                    if (segment.length == 3)
                    {
                        Map<String,Set<String>> eltMap = hiddenElementMaps.get(segment[0]);
                        if (eltMap == null)
                        {
                            eltMap = new HashMap<String,Set<String>>();
                            hiddenElementMaps.put(segment[0], eltMap);
                        }
                        if (!eltMap.containsKey(segment[1]))
                        {
                            eltMap.put(segment[1], new HashSet<String>());
                        }
                        eltMap.get(segment[1]).add(segment[2]);
                    }
                    // got schema.element
                    else if (segment.length == 2)
                    {
                        if (!hiddenElementSets.containsKey(segment[0]))
                        {
                            hiddenElementSets.put(segment[0], new HashSet<String>());
                        }
                        hiddenElementSets.get(segment[0]).add(segment[1]);
                    }
                    // oops..
                    else
                    {
                        log.warn("Bad format in hidden metadata directive, field=\"" + mdField + "\", config property=" + key);
                    }
                }
            }
        }
    }
}
@@ -1,330 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.net.URLEncoder;
import java.io.UnsupportedEncodingException;

import org.w3c.dom.Document;

import org.jdom.Element;
import org.jdom.JDOMException;
import org.jdom.Namespace;
import org.jdom.output.DOMOutputter;
import org.jdom.output.XMLOutputter;

import org.apache.log4j.Logger;

import org.dspace.content.DSpaceObject;
import org.dspace.core.ConfigurationManager;
import org.dspace.search.QueryResults;

import com.sun.syndication.feed.module.opensearch.OpenSearchModule;
import com.sun.syndication.feed.module.opensearch.entity.OSQuery;
import com.sun.syndication.feed.module.opensearch.impl.OpenSearchModuleImpl;
import com.sun.syndication.io.FeedException;

/**
 * Utility class with static methods for producing OpenSearch-compliant search results
 * and the OpenSearch description document.
 * <p>
 * OpenSearch is a specification for describing and advertising search-engines
 * and their result formats. Commonly, RSS and Atom formats are used, which
 * the current implementation supports, as is HTML (used directly in browsers).
 * NB: this is baseline OpenSearch; no extensions are currently supported.
 * </p>
 * <p>
 * The value of the "scope" parameter should either be absent (which means no
 * scope restriction) or the handle of a community or collection.
 * </p>
 *
 * @author Richard Rodgers
 */
public class OpenSearch
{
    private static final Logger log = Logger.getLogger(OpenSearch.class);

    // are open search queries enabled?
    private static boolean enabled = false;
    // supported results formats
    private static List<String> formats = null;
    // Namespaces used
    private static final String osNs = "http://a9.com/-/spec/opensearch/1.1/";

    // base search UI URL
    private static String uiUrl = null;
    // base search service URL
    private static String svcUrl = null;

    static
    {
        enabled = ConfigurationManager.getBooleanProperty("websvc.opensearch.enable");
        svcUrl = ConfigurationManager.getProperty("dspace.url") + "/" +
                 ConfigurationManager.getProperty("websvc.opensearch.svccontext");
        uiUrl = ConfigurationManager.getProperty("dspace.url") + "/" +
                ConfigurationManager.getProperty("websvc.opensearch.uicontext");

        // read rest of config info if enabled
        formats = new ArrayList<String>();
        if (enabled)
        {
            String fmtsStr = ConfigurationManager.getProperty("websvc.opensearch.formats");
            if (fmtsStr != null)
            {
                for (String fmt : fmtsStr.split(","))
                {
                    formats.add(fmt);
                }
            }
        }
    }

    /**
     * Returns the list of supported formats.
     *
     * @return list of format names - 'rss', 'atom' or 'html'
     */
    public static List<String> getFormats()
    {
        return formats;
    }

    /**
     * Returns the MIME type associated with the passed format.
     *
     * @param format the results document format (rss, atom, html)
     * @return content-type MIME type
     */
    public static String getContentType(String format)
    {
        return "html".equals(format) ? "text/html" :
            "application/" + format + "+xml; charset=UTF-8";
    }

    /**
     * Returns the OpenSearch service document appropriate for the given scope.
     *
     * @param scope - null for the entire repository, or the handle of a community or collection
     * @return the service document
     * @throws IOException
     */
    public static Document getDescriptionDoc(String scope) throws IOException
    {
        return jDomToW3(getServiceDocument(scope));
    }

    /**
     * Returns the OpenSearch service document as a string.
     *
     * @param scope - null for the entire repository, or the handle of a community or collection
     * @return service document as a string
     */
    public static String getDescription(String scope)
    {
        return new XMLOutputter().outputString(getServiceDocument(scope));
    }

    /**
     * Returns a formatted set of search results as a string.
     *
     * @param format results format - html, rss or atom
     * @param query - the search query
     * @param qResults - the query results to be formatted
     * @param scope - search scope, null or community/collection handle
     * @param results the retrieved DSpace objects satisfying the search
     * @param labels labels to apply - format specific
     * @return formatted search results
     * @throws IOException
     */
    public static String getResultsString(String format, String query, QueryResults qResults,
                                          DSpaceObject scope, DSpaceObject[] results,
                                          Map<String, String> labels) throws IOException
    {
        try
        {
            return getResults(format, query, qResults, scope, results, labels).outputString();
        }
        catch (FeedException e)
        {
            log.error(e.toString(), e);
            throw new IOException("Unable to generate feed", e);
        }
    }

    /**
     * Returns a formatted set of search results as a document.
     *
     * @param format results format - html, rss or atom
     * @param query - the search query
     * @param qResults - the query results to be formatted
     * @param scope - search scope, null or community/collection handle
     * @param results the retrieved DSpace objects satisfying the search
     * @param labels labels to apply - format specific
     * @return formatted search results
     * @throws IOException
     */
    public static Document getResultsDoc(String format, String query, QueryResults qResults,
                                         DSpaceObject scope, DSpaceObject[] results, Map<String, String> labels)
        throws IOException
    {
        try
        {
            return getResults(format, query, qResults, scope, results, labels).outputW3CDom();
        }
        catch (FeedException e)
        {
            log.error(e.toString(), e);
            throw new IOException("Unable to generate feed", e);
        }
    }

    private static SyndicationFeed getResults(String format, String query, QueryResults qResults,
                                              DSpaceObject scope, DSpaceObject[] results, Map<String, String> labels)
    {
        // Encode results in requested format
        if ("rss".equals(format))
        {
            format = "rss_2.0";
        }
        else if ("atom".equals(format))
        {
            format = "atom_1.0";
        }

        SyndicationFeed feed = new SyndicationFeed(labels.get(SyndicationFeed.MSG_UITYPE));
        feed.populate(null, scope, results, labels);
        feed.setType(format);
        feed.addModule(openSearchMarkup(query, qResults));
        return feed;
    }

    /*
     * Generates the OpenSearch elements which are added to the RSS or Atom feeds as foreign markup
     * wrapped in a module.
     *
     * @param query the search query
     * @param qRes the search results
     * @return module
     */
    private static OpenSearchModule openSearchMarkup(String query, QueryResults qRes)
    {
        OpenSearchModule osMod = new OpenSearchModuleImpl();
        osMod.setTotalResults(qRes.getHitCount());
        osMod.setStartIndex(qRes.getStart());
        osMod.setItemsPerPage(qRes.getPageSize());
        OSQuery osq = new OSQuery();
        osq.setRole("request");
        try
        {
            osq.setSearchTerms(URLEncoder.encode(query, "UTF-8"));
        }
        catch (UnsupportedEncodingException e)
        {
            log.error(e);
        }
        osq.setStartPage(1 + (qRes.getStart() / qRes.getPageSize()));
        osMod.addQuery(osq);
        return osMod;
    }

    /**
     * Returns the OpenSearch service document as a document.
     *
     * @param scope - null for the entire repository, or a collection/community handle
     * @return Service Document
     */
    private static org.jdom.Document getServiceDocument(String scope)
    {
        Namespace ns = Namespace.getNamespace(osNs);
        Element root = new Element("OpenSearchDescription", ns);
        root.addContent(new Element("ShortName", ns).setText(ConfigurationManager.getProperty("websvc.opensearch.shortname")));
        root.addContent(new Element("LongName", ns).setText(ConfigurationManager.getProperty("websvc.opensearch.longname")));
        root.addContent(new Element("Description", ns).setText(ConfigurationManager.getProperty("websvc.opensearch.description")));
        root.addContent(new Element("InputEncoding", ns).setText("UTF-8"));
        root.addContent(new Element("OutputEncoding", ns).setText("UTF-8"));
        // optional elements
        String sample = ConfigurationManager.getProperty("websvc.opensearch.samplequery");
        if (sample != null && sample.length() > 0)
        {
            Element sq = new Element("Query", ns).setAttribute("role", "example");
            root.addContent(sq.setAttribute("searchTerms", sample));
        }
        String tags = ConfigurationManager.getProperty("websvc.opensearch.tags");
        if (tags != null && tags.length() > 0)
        {
            root.addContent(new Element("Tags", ns).setText(tags));
        }
        String contact = ConfigurationManager.getProperty("mail.admin");
        if (contact != null && contact.length() > 0)
        {
            root.addContent(new Element("Contact", ns).setText(contact));
        }
        String faviconUrl = ConfigurationManager.getProperty("websvc.opensearch.faviconurl");
        if (faviconUrl != null && faviconUrl.length() > 0)
        {
            String dim = String.valueOf(16);
            String type = faviconUrl.endsWith("ico") ? "image/vnd.microsoft.icon" : "image/png";
            Element fav = new Element("Image", ns).setAttribute("height", dim).setAttribute("width", dim).
                          setAttribute("type", type).setText(faviconUrl);
            root.addContent(fav);
        }
        // service URLs
        for (String format : formats)
        {
            Element url = new Element("Url", ns).setAttribute("type", getContentType(format));
            StringBuffer template = new StringBuffer();
            if ("html".equals(format))
            {
                template.append(uiUrl);
            }
            else
            {
                template.append(svcUrl);
            }
            template.append("?query={searchTerms}");
            if (!"html".equals(format))
            {
                template.append("&start={startIndex?}&rpp={count?}&format=");
                template.append(format);
            }
            if (scope != null)
            {
                template.append("&scope=");
                template.append(scope);
            }
            url.setAttribute("template", template.toString());
            root.addContent(url);
        }
        return new org.jdom.Document(root);
    }

    /**
     * Converts a JDOM document to a W3C one.
     * @param jdomDoc the JDOM document to convert
     * @return W3C Document object
     * @throws IOException
     */
    private static Document jDomToW3(org.jdom.Document jdomDoc) throws IOException
    {
        DOMOutputter domOut = new DOMOutputter();
        try
        {
            return domOut.output(jdomDoc);
        }
        catch (JDOMException jde)
        {
            throw new IOException("JDOM output exception", jde);
        }
    }
}
@@ -1,172 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.io.Serializable;

import org.apache.log4j.Logger;

/**
 * Class representing a single Item Submission config definition, organized into
 * steps. This class represents the structure of a single 'submission-process'
 * node in the item-submission.xml configuration file.
 *
 * Note: Implements Serializable as it will be saved to the current session during submission.
 * Please ensure that nothing is added to this class that isn't also serializable.
 *
 * @see org.dspace.app.util.SubmissionConfigReader
 * @see org.dspace.app.util.SubmissionStepConfig
 *
 * @author Tim Donohue, based on DCInputSet by Brian S. Hughes
 * @version $Revision$
 */
public class SubmissionConfig implements Serializable
{
    /** name of the item submission process */
    private String submissionName = null;

    /** the configuration classes for the steps in this submission process */
    private SubmissionStepConfig[] submissionSteps = null;

    /** whether or not this submission process is being used in a workflow */
    private boolean isWorkflow = false;

    /** log4j logger */
    private static Logger log = Logger.getLogger(SubmissionConfig.class);

    /**
     * Constructs a new Submission Configuration object, based on the XML
     * configuration file (item-submission.xml).
     *
     * @param submissionName
     *            the submission process name
     * @param steps
     *            the list of step information used to build
     *            SubmissionStepConfig objects for this submission process
     * @param isWorkflowProcess
     *            whether this submission process is being used in a workflow or
     *            not. If it is a workflow process, this may limit the steps that
     *            are available for editing.
     */
    public SubmissionConfig(String submissionName, List<Map<String, String>> steps,
                            boolean isWorkflowProcess)
    {
        this.submissionName = submissionName;
        this.isWorkflow = isWorkflowProcess;

        // initialize a list of SubmissionStepConfig objects
        List<SubmissionStepConfig> stepConfigs = new ArrayList<SubmissionStepConfig>();

        // loop through our steps, and create SubmissionStepConfig objects
        for (int stepNum = 0; stepNum < steps.size(); stepNum++)
        {
            Map<String, String> stepInfo = steps.get(stepNum);
            SubmissionStepConfig step = new SubmissionStepConfig(stepInfo);

            // Only add this step to the process if either:
            // (a) this is not a workflow process, OR
            // (b) this is a workflow process, and this step is editable in a
            //     workflow
            if ((!this.isWorkflow)
                    || ((this.isWorkflow) && step.isWorkflowEditable()))
            {
                // set the number of the step (starts at 0) and add it
                step.setStepNumber(stepConfigs.size());
                stepConfigs.add(step);

                log.debug("Added step '" + step.getProcessingClassName()
                        + "' as step #" + step.getStepNumber()
                        + " of submission process " + submissionName);
            }
        }

        // convert the list of steps to an array
        submissionSteps = stepConfigs
                .toArray(new SubmissionStepConfig[stepConfigs.size()]);
    }

    /**
     * Return the name of the item submission process definition.
     *
     * @return the name of the submission process
     */
    public String getSubmissionName()
    {
        return submissionName;
    }

    /**
     * Return the number of steps in this submission process.
     *
     * @return number of steps
     */
    public int getNumberOfSteps()
    {
        return submissionSteps.length;
    }

    /**
     * Return whether or not this submission process is being used in a
     * workflow.
     *
     * @return true if it's a workflow process, false otherwise.
     */
    public boolean isWorkflow()
    {
        return isWorkflow;
    }

    /**
     * Retrieve a particular Step configuration in this Item Submission Process
     * configuration. The first step is numbered "0" (although step #0 is the
     * implied "select collection" step).
     * <p>
     * If you want to retrieve the step after the "select collection" step, you
     * should retrieve step #1.
     *
     * If the specified step isn't found, null is returned.
     *
     * @param stepNum
     *            desired step to retrieve
     *
     * @return the SubmissionStepConfig object for the step
     */
    public SubmissionStepConfig getStep(int stepNum)
    {
        if ((stepNum > submissionSteps.length - 1) || (stepNum < 0))
        {
            return null;
        }
        else
        {
            return submissionSteps[stepNum];
        }
    }

    /**
     * Returns whether or not there are more steps which follow the specified
     * "stepNum". For example, if you specify stepNum=4, then this method checks
     * to see if there is a Step #5. The first step is numbered "0".
     *
     * @param stepNum
     *            the current step.
     *
     * @return true if a step at "stepNum+1" exists, false otherwise.
     */
    public boolean hasMoreSteps(int stepNum)
    {
        return (getStep(stepNum + 1) != null);
    }
}
@@ -1,653 +0,0 @@
|
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.app.util;
|
||||
|
||||
import java.io.File;
|
||||
import java.util.*;
|
||||
import javax.servlet.ServletException;
|
||||
import org.xml.sax.SAXException;
|
||||
import org.w3c.dom.*;
|
||||
import javax.xml.parsers.*;
|
||||
|
||||
import org.apache.log4j.Logger;
|
||||
|
||||
import org.dspace.core.ConfigurationManager;
|
||||
|
||||
/**
|
||||
* Item Submission configuration generator for DSpace. Reads and parses the
|
||||
* installed submission process configuration file, item-submission.xml, from
|
||||
* the configuration directory. This submission process definition details the
|
||||
* ordering of the steps (and number of steps) that occur during the Item
|
||||
* Submission Process. There may be multiple Item Submission processes defined,
|
||||
* where each definition is assigned a unique name.
|
||||
*
|
||||
* The file also specifies which collections use which Item Submission process.
|
||||
* At a minimum, the definitions file must define a default mapping from the
|
||||
* placeholder collection # to the distinguished submission process 'default'.
|
||||
* Any collections that use a custom submission process are listed paired with
|
||||
* the name of the item submission process they use.
|
||||
*
|
||||
* @see org.dspace.app.util.SubmissionConfig
|
||||
* @see org.dspace.app.util.SubmissionStepConfig
|
||||
*
|
||||
* @author Tim Donohue based on DCInputsReader by Brian S. Hughes
|
||||
* @version $Revision$
|
||||
*/
|
||||
|
||||
public class SubmissionConfigReader
|
||||
{
|
||||
/**
|
||||
* The ID of the default collection. Will never be the ID of a named
|
||||
* collection
|
||||
*/
|
||||
public static final String DEFAULT_COLLECTION = "default";
|
||||
|
||||
/** Prefix of the item submission definition XML file */
|
||||
static final String SUBMIT_DEF_FILE_PREFIX = "item-submission";
|
||||
|
||||
/** Suffix of the item submission definition XML file */
|
||||
static final String SUBMIT_DEF_FILE_SUFFIX = ".xml";
|
||||
|
||||
/** log4j logger */
|
||||
private static Logger log = Logger.getLogger(SubmissionConfigReader.class);
|
||||
|
||||
/** The fully qualified pathname of the directory containing the Item Submission Configuration file */
|
||||
private String configDir = ConfigurationManager.getProperty("dspace.dir")
|
||||
+ File.separator + "config" + File.separator;
|
||||
|
||||
/**
|
||||
* Hashmap which stores which submission process configuration is used by
|
||||
* which collection, computed from the item submission config file
|
||||
* (specifically, the 'submission-map' tag)
|
||||
*/
|
||||
private Map<String, String> collectionToSubmissionConfig = null;
|
||||
|
||||
/**
|
||||
* Reference to the global submission step definitions defined in the
|
||||
* "step-definitions" section
|
||||
*/
|
||||
private Map<String, Map<String, String>> stepDefns = null;
|
||||
|
||||
/**
|
||||
* Reference to the item submission definitions defined in the
|
||||
* "submission-definitions" section
|
||||
*/
|
||||
private Map<String, List<Map<String, String>>> submitDefns = null;
|
||||
|
||||
/**
|
||||
* Mini-cache of last SubmissionConfig object requested (so that we don't
|
||||
* always reload from scratch)
|
||||
*/
|
||||
private SubmissionConfig lastSubmissionConfig = null;
|
||||
|
||||
/**
|
||||
* Load Submission Configuration from the
|
||||
* item-submission.xml configuration file
|
||||
*/
|
||||
public SubmissionConfigReader() throws ServletException
|
||||
{
|
||||
buildInputs(configDir + SUBMIT_DEF_FILE_PREFIX + SUBMIT_DEF_FILE_SUFFIX);
|
||||
}
|
||||
|
||||
    /**
     * Parse an XML encoded item submission configuration file.
     * <P>
     * Creates two main hashmaps:
     * <ul>
     * <li>Hashmap of Collection to Submission definition mappings -
     * defines which Submission process a particular collection uses
     * <li>Hashmap of all Submission definitions. List of all valid
     * Submission Processes by name.
     * </ul>
     */
    private void buildInputs(String fileName) throws ServletException
    {
        collectionToSubmissionConfig = new HashMap<String, String>();
        submitDefns = new HashMap<String, List<Map<String, String>>>();

        String uri = "file:" + new File(fileName).getAbsolutePath();

        try
        {
            DocumentBuilderFactory factory = DocumentBuilderFactory
                    .newInstance();
            factory.setValidating(false);
            factory.setIgnoringComments(true);
            factory.setIgnoringElementContentWhitespace(true);

            DocumentBuilder db = factory.newDocumentBuilder();
            Document doc = db.parse(uri);
            doNodes(doc);
        }
        catch (FactoryConfigurationError fe)
        {
            throw new ServletException(
                    "Cannot create Item Submission Configuration parser", fe);
        }
        catch (Exception e)
        {
            throw new ServletException(
                    "Error creating Item Submission Configuration: " + e);
        }
    }

    /**
     * Returns the Item Submission process config used for a particular
     * collection, or the default if none is defined for the collection
     *
     * @param collectionHandle
     *            collection's unique Handle
     * @param isWorkflow
     *            whether or not we are loading the submission process for a
     *            workflow
     * @return the SubmissionConfig representing the item submission config
     *
     * @throws ServletException
     *             if no default submission process configuration is defined
     */
    public SubmissionConfig getSubmissionConfig(String collectionHandle,
            boolean isWorkflow) throws ServletException
    {
        // get the name of the submission process config for this collection
        String submitName = collectionToSubmissionConfig
                .get(collectionHandle);
        if (submitName == null)
        {
            submitName = collectionToSubmissionConfig
                    .get(DEFAULT_COLLECTION);
        }
        if (submitName == null)
        {
            throw new ServletException(
                    "No item submission process configuration designated as 'default' in 'submission-map' section of 'item-submission.xml'.");
        }

        log.debug("Loading submission process config named '" + submitName
                + "'");

        // check mini-cache, and return if match
        if (lastSubmissionConfig != null
                && lastSubmissionConfig.getSubmissionName().equals(submitName)
                && lastSubmissionConfig.isWorkflow() == isWorkflow)
        {
            log.debug("Found submission process config '" + submitName
                    + "' in cache.");

            return lastSubmissionConfig;
        }

        // cache miss - construct new SubmissionConfig
        List<Map<String, String>> steps = submitDefns.get(submitName);

        if (steps == null)
        {
            throw new ServletException(
                    "Missing the Item Submission process config '" + submitName
                            + "' (or unable to load) from 'item-submission.xml'.");
        }

        log.debug("Submission process config '" + submitName
                + "' not in cache. Reloading from scratch.");

        lastSubmissionConfig = new SubmissionConfig(submitName, steps,
                isWorkflow);

        log.debug("Submission process config has "
                + lastSubmissionConfig.getNumberOfSteps() + " steps listed.");

        return lastSubmissionConfig;
    }

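    // Illustrative usage sketch (hedged: the handle value and variable names
    // below are hypothetical examples, not taken from this source tree):
    //
    //   SubmissionConfigReader reader = new SubmissionConfigReader();
    //   SubmissionConfig config = reader.getSubmissionConfig("123456789/42", false);
    //   // falls back to the 'default' name-map entry when the handle is unmapped
    //   int numSteps = config.getNumberOfSteps();
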
    /**
     * Returns a particular global step definition based on its ID.
     * <P>
     * Global step definitions are those defined in the <step-definitions>
     * section of the configuration file.
     *
     * @param stepID
     *            the step's identifier
     *
     * @return the SubmissionStepConfig representing the step
     *
     * @throws ServletException
     *             if no default submission process configuration is defined
     */
    public SubmissionStepConfig getStepConfig(String stepID)
            throws ServletException
    {
        // We should already have the step definitions loaded
        if (stepDefns != null)
        {
            // retrieve step info
            Map<String, String> stepInfo = stepDefns.get(stepID);

            if (stepInfo != null)
            {
                return new SubmissionStepConfig(stepInfo);
            }
        }

        return null;
    }

    /**
     * Process the top level child nodes in the passed top-level node. These
     * should correspond to the 'submission-map', 'step-definitions', and
     * 'submission-definitions' sections of the configuration file.
     */
    private void doNodes(Node n) throws SAXException, ServletException
    {
        if (n == null)
        {
            return;
        }
        Node e = getElement(n);
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        boolean foundMap = false;
        boolean foundStepDefs = false;
        boolean foundSubmitDefs = false;
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            if ((nd == null) || isEmptyTextNode(nd))
            {
                continue;
            }
            String tagName = nd.getNodeName();
            if (tagName.equals("submission-map"))
            {
                processMap(nd);
                foundMap = true;
            }
            else if (tagName.equals("step-definitions"))
            {
                processStepDefinition(nd);
                foundStepDefs = true;
            }
            else if (tagName.equals("submission-definitions"))
            {
                processSubmissionDefinition(nd);
                foundSubmitDefs = true;
            }
            // Ignore unknown nodes
        }
        if (!foundMap)
        {
            throw new ServletException(
                    "No collection to item submission map ('submission-map') found in 'item-submission.xml'");
        }
        if (!foundStepDefs)
        {
            throw new ServletException("No 'step-definitions' section found in 'item-submission.xml'");
        }
        if (!foundSubmitDefs)
        {
            throw new ServletException(
                    "No 'submission-definitions' section found in 'item-submission.xml'");
        }
    }

    /**
     * Process the submission-map section of the XML file. Each element looks
     * like: <name-map collection-handle="hdl" submission-name="name" /> Extract
     * the collection handle and item submission name, and put the name in a
     * hashmap keyed by the collection handle.
     */
    private void processMap(Node e) throws SAXException
    {
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            if (nd.getNodeName().equals("name-map"))
            {
                String id = getAttribute(nd, "collection-handle");
                String value = getAttribute(nd, "submission-name");
                String content = getValue(nd);
                if (id == null)
                {
                    throw new SAXException(
                            "name-map element is missing collection-handle attribute in 'item-submission.xml'");
                }
                if (value == null)
                {
                    throw new SAXException(
                            "name-map element is missing submission-name attribute in 'item-submission.xml'");
                }
                if (content != null && content.length() > 0)
                {
                    throw new SAXException(
                            "name-map element has content in 'item-submission.xml'; it should be empty.");
                }
                collectionToSubmissionConfig.put(id, value);
            } // ignore any child node that isn't a "name-map"
        }
    }

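    // For reference, a 'submission-map' section that this method parses
    // typically looks like the following (the handle and submission names are
    // hypothetical examples, not taken from this source tree):
    //
    //   <submission-map>
    //     <name-map collection-handle="default" submission-name="traditional" />
    //     <name-map collection-handle="123456789/42" submission-name="theses" />
    //   </submission-map>
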
    /**
     * Process the "step-definitions" section of the XML file. Each element is
     * formed thusly: <step id="unique-id"> ...step_fields... </step> The valid
     * step_fields are: heading, processing-class.
     * <P>
     * Extract the step information (from the step_fields) and place in a
     * HashMap whose key is the step's unique id.
     */
    private void processStepDefinition(Node e) throws SAXException,
            ServletException
    {
        stepDefns = new HashMap<String, Map<String, String>>();

        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);
            // process each step definition
            if (nd.getNodeName().equals("step"))
            {
                String stepID = getAttribute(nd, "id");
                if (stepID == null)
                {
                    throw new SAXException(
                            "step element has no 'id' attribute in 'item-submission.xml', which is required in the 'step-definitions' section");
                }
                else if (stepDefns.containsKey(stepID))
                {
                    throw new SAXException(
                            "There are two step elements with the id '" + stepID + "' in 'item-submission.xml'");
                }

                Map<String, String> stepInfo = processStepChildNodes("step-definition", nd);

                stepDefns.put(stepID, stepInfo);
            } // ignore any child that is not a 'step'
        }

        // Sanity check number of step definitions
        if (stepDefns.size() < 1)
        {
            throw new ServletException(
                    "step-definition section has no steps! A step with id='collection' is required in 'item-submission.xml'!");
        }

        // Sanity check to see that the required "collection" step is defined
        if (!stepDefns.containsKey(SubmissionStepConfig.SELECT_COLLECTION_STEP))
        {
            throw new ServletException(
                    "The step-definition section is REQUIRED to have a step with id='"
                            + SubmissionStepConfig.SELECT_COLLECTION_STEP
                            + "' in 'item-submission.xml'! This step is used to ensure that a new item submission is assigned to a collection.");
        }

        // Sanity check to see that the required "complete" step is defined
        if (!stepDefns.containsKey(SubmissionStepConfig.COMPLETE_STEP))
        {
            throw new ServletException(
                    "The step-definition section is REQUIRED to have a step with id='"
                            + SubmissionStepConfig.COMPLETE_STEP
                            + "' in 'item-submission.xml'! This step is used to perform all processing necessary at the completion of the submission (e.g. starting workflow).");
        }
    }

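    // For reference, a 'step-definitions' section that this method parses
    // typically looks like the following (a sketch; the processing class shown
    // is illustrative and may differ in a given DSpace release):
    //
    //   <step-definitions>
    //     <step id="collection">
    //       <heading></heading>
    //       <processing-class>org.dspace.submit.step.SelectCollectionStep</processing-class>
    //     </step>
    //     ...
    //   </step-definitions>
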
    /**
     * Process the "submission-definition" section of the XML file. Each element
     * is formed thusly: <submission-process name="submitName">...steps...</submission-process>
     * Each step subsection is formed: <step> ...step_fields... </step> (with
     * an optional "id" attribute, to reference a step from the <step-definitions>
     * section). The valid step_fields are: heading, processing-class.
     * <P>
     * Extract the submission-process name and steps and place in a HashMap
     * whose key is the submission-process's unique name.
     */
    private void processSubmissionDefinition(Node e) throws SAXException,
            ServletException
    {
        int numSubmitProcesses = 0;
        List<String> submitNames = new ArrayList<String>();

        // find all child nodes of the 'submission-definition' node and loop
        // through
        NodeList nl = e.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node nd = nl.item(i);

            // process each 'submission-process' node
            if (nd.getNodeName().equals("submission-process"))
            {
                numSubmitProcesses++;
                String submitName = getAttribute(nd, "name");
                if (submitName == null)
                {
                    throw new SAXException(
                            "'submission-process' element has no 'name' attribute in 'item-submission.xml'");
                }
                else if (submitNames.contains(submitName))
                {
                    throw new SAXException(
                            "There are two 'submission-process' elements with the name '"
                                    + submitName + "' in 'item-submission.xml'.");
                }
                submitNames.add(submitName);

                // the 'submission-process' definition contains steps
                List<Map<String, String>> steps = new ArrayList<Map<String, String>>();
                submitDefns.put(submitName, steps);

                // loop through all the 'step' nodes of the 'submission-process'
                NodeList pl = nd.getChildNodes();
                int lenStep = pl.getLength();
                for (int j = 0; j < lenStep; j++)
                {
                    Node nStep = pl.item(j);

                    // process each 'step' definition
                    if (nStep.getNodeName().equals("step"))
                    {
                        // check for an 'id' attribute
                        String stepID = getAttribute(nStep, "id");

                        Map<String, String> stepInfo;

                        // if this step has an id, load its information from the
                        // step-definition section
                        if ((stepID != null) && (stepID.length() > 0))
                        {
                            if (stepDefns.containsKey(stepID))
                            {
                                // load the step information from the
                                // step-definition
                                stepInfo = stepDefns.get(stepID);
                            }
                            else
                            {
                                throw new SAXException(
                                        "The Submission process config named "
                                                + submitName
                                                + " contains a step with id="
                                                + stepID
                                                + ". There is no step with this 'id' defined in the 'step-definition' section of 'item-submission.xml'.");
                            }

                            // Ignore all children of a step element with an
                            // "id"
                        }
                        else
                        {
                            // get information about step from its children
                            // nodes
                            stepInfo = processStepChildNodes(
                                    "submission-process", nStep);
                        }

                        steps.add(stepInfo);

                    } // ignore any child that is not a 'step'
                }

                // sanity check number of steps
                if (steps.size() < 1)
                {
                    throw new ServletException(
                            "Item Submission process config named "
                                    + submitName + " has no steps defined in 'item-submission.xml'");
                }

                // ALL Item Submission processes MUST BEGIN with selecting a
                // Collection. So, automatically insert the "collection" step
                // (from the 'step-definition' section)
                // Note: we already did a sanity check that this "collection"
                // step exists.
                steps.add(0, stepDefns
                        .get(SubmissionStepConfig.SELECT_COLLECTION_STEP));

                // ALL Item Submission processes MUST END with the
                // "Complete" processing step.
                // So, automatically append the "complete" step
                // (from the 'step-definition' section)
                // Note: we already did a sanity check that this "complete"
                // step exists.
                steps.add(stepDefns
                        .get(SubmissionStepConfig.COMPLETE_STEP));

            }
        }
        if (numSubmitProcesses == 0)
        {
            throw new ServletException(
                    "No 'submission-process' elements/definitions found in 'item-submission.xml'");
        }
    }

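    // For reference, a 'submission-definitions' section that this method
    // parses typically looks like the following (a sketch; the process name,
    // step id, and processing class are illustrative):
    //
    //   <submission-definitions>
    //     <submission-process name="traditional">
    //       <step id="describe" />  <!-- reference into step-definitions -->
    //       <step>                  <!-- inline step definition -->
    //         <heading>submit.progressbar.license</heading>
    //         <processing-class>org.dspace.submit.step.LicenseStep</processing-class>
    //       </step>
    //     </submission-process>
    //   </submission-definitions>
    //
    // The "collection" and "complete" steps need not be listed; they are
    // inserted and appended automatically by the code above.
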
    /**
     * Process the children of the "step" tag of the XML file. Returns a HashMap
     * of all the fields under that "step" tag, where the key is the field name,
     * and the value is the field value.
     */
    private Map<String, String> processStepChildNodes(String configSection, Node nStep)
            throws SAXException, ServletException
    {
        // initialize the HashMap of step Info
        Map<String, String> stepInfo = new HashMap<String, String>();

        NodeList flds = nStep.getChildNodes();
        int lenflds = flds.getLength();
        for (int k = 0; k < lenflds; k++)
        {
            // process each child node of a <step> tag
            Node nfld = flds.item(k);

            if (!isEmptyTextNode(nfld))
            {
                String tagName = nfld.getNodeName();
                String value = getValue(nfld);
                stepInfo.put(tagName, value);
            }
        } // end for each field

        // check for ID attribute & save to step info
        String stepID = getAttribute(nStep, "id");
        if (stepID != null && stepID.length() > 0)
        {
            stepInfo.put("id", stepID);
        }

        // look for REQUIRED 'step' information
        String missing = null;
        if (stepInfo.get("processing-class") == null)
        {
            missing = "'processing-class'";
        }
        if (missing != null)
        {
            String msg = "Required field " + missing
                    + " missing in a 'step' in the " + configSection
                    + " of the item submission configuration file ('item-submission.xml')";
            throw new SAXException(msg);
        }

        return stepInfo;
    }

    private Node getElement(Node nd)
    {
        NodeList nl = nd.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node n = nl.item(i);
            if (n.getNodeType() == Node.ELEMENT_NODE)
            {
                return n;
            }
        }
        return null;
    }

    private boolean isEmptyTextNode(Node nd)
    {
        boolean isEmpty = false;
        if (nd.getNodeType() == Node.TEXT_NODE)
        {
            String text = nd.getNodeValue().trim();
            if (text.length() == 0)
            {
                isEmpty = true;
            }
        }
        return isEmpty;
    }

    /**
     * Returns the value of the node's attribute named <name>
     */
    private String getAttribute(Node e, String name)
    {
        NamedNodeMap attrs = e.getAttributes();
        int len = attrs.getLength();
        if (len > 0)
        {
            int i;
            for (i = 0; i < len; i++)
            {
                Node attr = attrs.item(i);
                if (name.equals(attr.getNodeName()))
                {
                    return attr.getNodeValue().trim();
                }
            }
        }
        // no such attribute
        return null;
    }

    /**
     * Returns the value found in the Text node (if any) in the node list that's
     * passed in.
     */
    private String getValue(Node nd)
    {
        NodeList nl = nd.getChildNodes();
        int len = nl.getLength();
        for (int i = 0; i < len; i++)
        {
            Node n = nl.item(i);
            short type = n.getNodeType();
            if (type == Node.TEXT_NODE)
            {
                return n.getNodeValue().trim();
            }
        }
        // Didn't find a text node
        return null;
    }
}
@@ -1,701 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpSession;

import org.apache.log4j.Logger;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.InProgressSubmission;

import org.dspace.submit.AbstractProcessingStep;
import org.dspace.workflow.WorkflowItem;

/**
 * Information about an item being edited with the submission UI
 *
 * @author Robert Tansley
 * @version $Revision$
 */
public class SubmissionInfo
{
    /** log4j logger */
    private static Logger log = Logger.getLogger(SubmissionInfo.class);

    /** The item which is being submitted */
    private InProgressSubmission submissionItem = null;

    /**
     * The Submission process config, which holds all info about the submission
     * process that this item is going through (including all steps, etc.)
     */
    private SubmissionConfig submissionConfig = null;

    /**
     * Handle of the collection where this item is being submitted
     */
    private String collectionHandle = null;

    /***************************************************************************
     * Holds all information used to build the Progress Bar in a key,value set.
     * Keys are the number of the step, followed by the number of the page
     * within the step (e.g. "2.1" = the first page of Step 2; "5.2" = the
     * second page of Step 5). Values are the Headings to display for each step
     * (e.g. "Describe")
     **************************************************************************/
    private Map<String, String> progressBar = null;

    /** The element or element_qualifier to show more input boxes for */
    private String moreBoxesFor;

    /** The element or element_qualifier to scroll to initially using anchor */
    private String jumpToField;

    /** If non-empty, form-relative indices of missing fields */
    private List<String> missingFields;

    /** Specific bundle we're dealing with */
    private Bundle bundle;

    /** Specific bitstream we're dealing with */
    private Bitstream bitstream;

    /** Reader for the submission process configuration file */
    private static SubmissionConfigReader submissionConfigReader;

    /**
     * Default Constructor - PRIVATE
     * <p>
     * Create a SubmissionInfo object
     * using the load() method!
     */
    private SubmissionInfo()
    {
    }

    /**
     * Loads all known submission information based on the given in-progress
     * submission and request object.
     * <P>
     * If subItem is null, then just loads the default submission information
     * for a new submission.
     *
     * @param request
     *            The HTTP Servlet Request object
     * @param subItem
     *            The in-progress submission we are loading information for
     *
     * @return a SubmissionInfo object
     *
     * @throws ServletException
     *             if an error occurs
     */
    public static SubmissionInfo load(HttpServletRequest request, InProgressSubmission subItem) throws ServletException
    {
        boolean forceReload = false;
        SubmissionInfo subInfo = new SubmissionInfo();

        // load SubmissionConfigReader only the first time
        // or if we're using a different UI now.
        if (submissionConfigReader == null)
        {
            submissionConfigReader = new SubmissionConfigReader();
            forceReload = true;
        }

        // save the item which is going through the submission process
        subInfo.setSubmissionItem(subItem);

        // Only if the submission item is created can we set its collection
        String collectionHandle = SubmissionConfigReader.DEFAULT_COLLECTION;
        if (subItem != null)
        {
            collectionHandle = subItem.getCollection().getHandle();
        }

        // save this collection handle to this submission info object
        subInfo.setCollectionHandle(collectionHandle);

        // load Submission Process config for this item's collection
        // (Note: this also loads the Progress Bar info, since it is
        // dependent on the Submission config)
        loadSubmissionConfig(request, subInfo, forceReload);

        return subInfo;
    }

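    // Illustrative usage sketch from a servlet context (hedged: the variable
    // names are hypothetical; workspaceItem is any InProgressSubmission, or
    // null for a brand-new submission):
    //
    //   SubmissionInfo subInfo = SubmissionInfo.load(request, workspaceItem);
    //   if (subInfo.isInWorkflow()) { /* item is under workflow review */ }
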
    /**
     * Is this submission in the workflow process?
     *
     * @return true if the current submission is in the workflow process
     */
    public boolean isInWorkflow()
    {
        return ((this.submissionItem != null) && this.submissionItem instanceof WorkflowItem);
    }

    /**
     * Return the current in-progress submission
     *
     * @return the InProgressSubmission object representing the current
     *         submission
     */
    public InProgressSubmission getSubmissionItem()
    {
        return this.submissionItem;
    }

    /**
     * Updates the current in-progress submission item
     *
     * @param subItem
     *            the new InProgressSubmission object
     */
    public void setSubmissionItem(InProgressSubmission subItem)
    {
        this.submissionItem = subItem;
    }

    /**
     * Return the current submission process config (which includes all steps
     * which need to be completed for the submission to be successful)
     *
     * @return the SubmissionConfig object, which contains info on all the steps
     *         in the current submission process
     */
    public SubmissionConfig getSubmissionConfig()
    {
        return this.submissionConfig;
    }

    /**
     * Causes the SubmissionConfig to be completely reloaded from the XML
     * configuration file (item-submission.xml).
     * <P>
     * Note: This also reloads the progress bar info, since the progress bar
     * depends entirely on the submission process (and its steps).
     *
     * @param request
     *            The HTTP Servlet Request object
     *
     * @throws ServletException
     *             if an error occurs
     */
    public void reloadSubmissionConfig(HttpServletRequest request)
            throws ServletException
    {
        // Only if the submission item is created can we set its collection
        String collectionHandle = SubmissionConfigReader.DEFAULT_COLLECTION;
        if (this.submissionItem != null)
        {
            collectionHandle = submissionItem.getCollection().getHandle();
        }
        this.setCollectionHandle(collectionHandle);

        // force a reload of the submission process configuration
        loadSubmissionConfig(request, this, true);
    }

    /**
     * Returns a particular global step definition based on its ID.
     * <P>
     * Global step definitions are those defined in the <step-definitions>
     * section of the configuration file.
     *
     * @param stepID
     *            the step's identifier
     *
     * @return the SubmissionStepConfig representing the step
     *
     * @throws ServletException
     *             if no default submission process configuration is defined
     */
    public SubmissionStepConfig getStepConfig(String stepID)
            throws ServletException
    {
        return submissionConfigReader.getStepConfig(stepID);
    }

    /**
     * Return text information suitable for logging.
     * <p>
     * This method is used by several of the Step classes
     * to log major events during the submission process (e.g. when the
     * license agreement was accepted, when the item was submitted,
     * when it was available in DSpace, etc.)
     *
     * @return the type and ID of the submission, bundle and/or bitstream for
     *         logging
     */
    public String getSubmissionLogInfo()
    {
        String info = "";

        if (isInWorkflow())
        {
            info = info + "workflow_id=" + getSubmissionItem().getID();
        }
        else
        {
            info = info + "workspace_item_id=" + getSubmissionItem().getID();
        }

        if (getBundle() != null)
        {
            info = info + ",bundle_id=" + getBundle().getID();
        }

        if (getBitstream() != null)
        {
            info = info + ",bitstream_id=" + getBitstream().getID();
        }

        return info;
    }

    /**
     * Gets the handle of the collection to which this item is being submitted
     *
     * @return the collection handle
     */
    public String getCollectionHandle()
    {
        return this.collectionHandle;
    }

    /**
     * Sets the handle of the collection to which this item is being submitted
     *
     * @param handle
     *            the new collection handle
     */
    public void setCollectionHandle(String handle)
    {
        this.collectionHandle = handle;
    }

    /**
     * Return the information used to build the progress bar (this includes all
     * the steps in this submission, as well as the ordering and names of the
     * steps).
     * <p>
     * Returns a Hashmap, with the following specifics:
     * <p>
     * Keys are the number of the step, followed by the number of the page
     * within the step
     * <p>
     * (e.g. "2.1" = The first page of Step 2)
     * <p>
     * (e.g. "5.2" = The second page of Step 5)
     * <P>
     * Values are the Headings to display for each step (e.g. "Describe")
     *
     * @return a Hashmap of Progress Bar information.
     */
    public Map<String, String> getProgressBarInfo()
    {
        return this.progressBar;
    }

    /**
     * Return the current bitstream we're working with (This is used during
     * upload processes, or user interfaces that are dealing with bitstreams)
     *
     * @return the Bitstream object for the bitstream
     */
    public Bitstream getBitstream()
    {
        return this.bitstream;
    }

    /**
     * Sets the current bitstream we're working with (This is used during upload
     * processes, or user interfaces that are dealing with bitstreams)
     *
     * @param bits
     *            the bitstream
     */
    public void setBitstream(Bitstream bits)
    {
        this.bitstream = bits;
    }

    /**
     * Return the current bundle we're working with (This is used during upload
     * processes, or user interfaces that are dealing with bundles/bitstreams)
     *
     * @return the Bundle object for the bundle
     */
    public Bundle getBundle()
    {
        return this.bundle;
    }

    /**
     * Sets the current bundle we're working with (This is used during upload
     * processes, or user interfaces that are dealing with bundles/bitstreams)
     *
     * @param bund
     *            the bundle
     */
    public void setBundle(Bundle bund)
    {
        this.bundle = bund;
    }

    /**
     * Return form-related indices of the required fields which were not filled
     * out by the user.
     *
     * @return a List of empty fields which are required
     */
    public List<String> getMissingFields()
    {
        return this.missingFields;
    }

    /**
     * Sets the form-related indices of the required fields which were not
     * filled out by the user.
     *
     * @param missing
     *            the List of empty fields which are required
     */
    public void setMissingFields(List<String> missing)
    {
        this.missingFields = missing;
    }

    /**
     * Return the metadata field for which the user has requested more input
     * boxes be displayed (by pressing "Add More" on one of the "Describe" pages)
     *
     * @return the String name of the field element
     */
    public String getMoreBoxesFor()
    {
        return this.moreBoxesFor;
    }

    /**
     * Sets the metadata field for which the user has requested more input
     * boxes be displayed (by pressing "Add More" on one of the "Describe" pages)
     *
     * @param fieldname
     *            the name of the field element on the page
     */
    public void setMoreBoxesFor(String fieldname)
    {
        this.moreBoxesFor = fieldname;
    }

    /**
     * Return the metadata field which the JSP should "jump to" (i.e. set focus
     * on) when the JSP next loads. This is used during the Describe step.
     *
     * @return the String name of the field element
     */
    public String getJumpToField()
    {
        return this.jumpToField;
    }

    /**
     * Sets the metadata field which the JSP should "jump to" (i.e. set focus
     * on) when the JSP next loads. This is used during the Describe step.
     *
     * @param fieldname
     *            the name of the field on the page
     */
    public void setJumpToField(String fieldname)
    {
        this.jumpToField = fieldname;
    }

    /**
     * Load necessary information to build the Progress Bar for the Item
     * Submission Progress.
     *
     * This information is returned in the form of a HashMap (which is then
     * stored as a part of the SubmissionInfo). The HashMap takes the following
     * form:
     *
     * Keys - the number of the step, followed by the number of the page within
     * the step (e.g. "2.1" = The first page of Step 2) (e.g. "5.2" = The second
     * page of Step 5)
     *
     * Values - the headings to display for each step (e.g. "Describe",
     * "Verify")
     *
     * @param request
     *            The HTTP Servlet Request object
     * @param subInfo
     *            the SubmissionInfo object we are loading into
     * @param forceReload
     *            If true, this method reloads from scratch (and overwrites
     *            cached progress bar info)
     *
     */
    private static void loadProgressBar(HttpServletRequest request,
            SubmissionInfo subInfo, boolean forceReload)
    {
        Map<String, String> progressBarInfo = null;

        log.debug("Loading Progress Bar Info");

        if (!forceReload)
        {
            // first, attempt to load from cache
            progressBarInfo = loadProgressBarFromCache(request
                    .getSession());
        }

        if (progressBarInfo != null && log.isDebugEnabled())
        {
            log.debug("Found Progress Bar Info in cache: "
                    + progressBarInfo.size()
                    + " pages to display in progress bar");
        }
        // if unable to load from cache, must load from scratch
        else
        {
            progressBarInfo = new LinkedHashMap<String, String>();

            // loop through all steps
            for (int i = 0; i < subInfo.submissionConfig.getNumberOfSteps(); i++)
            {
                // get the current step info
                SubmissionStepConfig currentStep = subInfo.submissionConfig
                        .getStep(i);
                String stepNumber = Integer.toString(currentStep
                        .getStepNumber());
                String stepHeading = currentStep.getHeading();

                // as long as this step is visible, include it in
                // the Progress Bar
                if (currentStep.isVisible())
                {
                    // default to just one page in this step
                    int numPages = 1;

                    try
                    {
                        // load the processing class for this step
                        ClassLoader loader = subInfo.getClass()
                                .getClassLoader();
                        Class<AbstractProcessingStep> stepClass = (Class<AbstractProcessingStep>) loader.loadClass(currentStep.getProcessingClassName());

                        // call the "getNumberOfPages()" method of the class
                        // to get its number of pages
                        AbstractProcessingStep step = stepClass.newInstance();

                        // get the number of pages from the processing step
                        numPages = step.getNumberOfPages(request, subInfo);
                    }
                    catch (Exception e)
                    {
                        log.error(
                                "Error loading progress bar information from Step Class '"
                                        + currentStep.getProcessingClassName()
                                        + "' Error:", e);
                    }

                    // save each of the step's pages to the progress bar
                    for (int j = 1; j <= numPages; j++)
                    {
                        String pageNumber = Integer.toString(j);

                        // store ("stepNumber.pageNumber", Heading) for each
                        // page in the step
||||
progressBarInfo.put(stepNumber + "." + pageNumber,
|
||||
stepHeading);
|
||||
}// end for each page
|
||||
}
|
||||
}// end for each step
|
||||
|
||||
log.debug("Loaded Progress Bar Info from scratch: "
|
||||
+ progressBarInfo.size()
|
||||
+ " pages to display in progress bar");
|
||||
|
||||
// cache this new progress bar
|
||||
saveProgressBarToCache(request.getSession(), progressBarInfo);
|
||||
}// end if null
|
||||
|
||||
// save progressBarInfo to submission Info
|
||||
subInfo.progressBar = progressBarInfo;
|
||||
}
|
||||
|
||||
/**
|
||||
* Saves all progress bar information into session cache. This saves us from
|
||||
* having to reload this same progress bar over and over again.
|
||||
*
|
||||
* @param session
|
||||
* The HTTP Session object
|
||||
* @param progressBarInfo
|
||||
* The progress bar info to cache
|
||||
*
|
||||
*/
|
||||
private static void saveProgressBarToCache(HttpSession session,
|
||||
Map<String, String> progressBarInfo)
|
||||
{
|
||||
// cache progress bar info to Session
|
||||
session.setAttribute("submission.progressbar", progressBarInfo);
|
||||
}
|
||||
|
||||
/**
|
||||
* Attempts to retrieve progress bar information (for a particular
|
||||
* collection) from session cache.
|
||||
*
|
||||
* If the progress bar info cannot be found, returns null
|
||||
*
|
||||
* @param session
|
||||
* The HTTP Session object
|
||||
*
|
||||
* @return progressBarInfo HashMap (if found), or null (if not)
|
||||
*
|
||||
*/
|
||||
private static Map<String, String> loadProgressBarFromCache(HttpSession session)
|
||||
{
|
||||
return (Map<String, String>) session.getAttribute("submission.progressbar");
|
||||
}
|
||||
|
||||
/**
|
||||
* Loads SubmissionConfig object for the given submission info object. If a
|
||||
* SubmissionConfig object cannot be loaded, a Servlet Error is thrown.
|
||||
* <p>
|
||||
* This method just loads this SubmissionConfig object internally, so that
|
||||
* it is available via a call to "getSubmissionConfig()"
|
||||
*
|
||||
* @param request
|
||||
* The HTTP Servlet Request object
|
||||
* @param subInfo
|
||||
* the SubmissionInfo object we are loading into
|
||||
* @param forceReload
|
||||
* If true, this method reloads from scratch (and overwrites
|
||||
* cached SubmissionConfig)
|
||||
*
|
||||
*/
|
||||
private static void loadSubmissionConfig(HttpServletRequest request,
|
||||
SubmissionInfo subInfo, boolean forceReload)
|
||||
throws ServletException
|
||||
{
|
||||
|
||||
log.debug("Loading Submission Config information");
|
||||
|
||||
if (!forceReload)
|
||||
{
|
||||
// first, try to load from cache
|
||||
subInfo.submissionConfig = loadSubmissionConfigFromCache(request
|
||||
.getSession(), subInfo.getCollectionHandle(), subInfo
|
||||
.isInWorkflow());
|
||||
}
|
||||
|
||||
if (subInfo.submissionConfig == null || forceReload)
|
||||
{
|
||||
// reload the proper Submission process config
|
||||
// (by reading the XML config file)
|
||||
subInfo.submissionConfig = submissionConfigReader
|
||||
.getSubmissionConfig(subInfo.getCollectionHandle(), subInfo
|
||||
.isInWorkflow());
|
||||
|
||||
// cache this new submission process configuration
|
||||
saveSubmissionConfigToCache(request.getSession(),
|
||||
subInfo.submissionConfig, subInfo.getCollectionHandle(),
|
||||
subInfo.isInWorkflow());
|
||||
|
||||
// also must force reload Progress Bar info,
|
||||
// since it's based on the Submission config
|
||||
loadProgressBar(request, subInfo, true);
|
||||
}
|
||||
else
|
||||
{
|
||||
log.debug("Found Submission Config in session cache!");
|
||||
|
||||
// try and reload progress bar from cache
|
||||
loadProgressBar(request, subInfo, false);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Saves SubmissionConfig object into session cache. This saves us from
|
||||
* having to reload this object during every "Step".
|
||||
*
|
||||
* @param session
|
||||
* The HTTP Session object
|
||||
* @param subConfig
|
||||
* The SubmissionConfig to cache
|
||||
* @param collectionHandle
|
||||
* The Collection handle this SubmissionConfig corresponds to
|
||||
* @param isWorkflow
|
||||
* Whether this SubmissionConfig corresponds to a workflow
|
||||
*
|
||||
*
|
||||
*/
|
||||
private static void saveSubmissionConfigToCache(HttpSession session,
|
||||
SubmissionConfig subConfig, String collectionHandle,
|
||||
boolean isWorkflow)
|
||||
{
|
||||
// cache the submission process config
|
||||
// and the collection it corresponds to
|
||||
session.setAttribute("submission.config", subConfig);
|
||||
session.setAttribute("submission.config.collection", collectionHandle);
|
||||
session.setAttribute("submission.config.isWorkflow", Boolean.valueOf(
|
||||
isWorkflow));
|
||||
}
|
||||
|
||||
/**
|
||||
* Loads SubmissionConfig object from session cache for the given
|
||||
* Collection. If a SubmissionConfig object cannot be found, null is
|
||||
* returned.
|
||||
*
|
||||
* @param session
|
||||
* The HTTP Session object
|
||||
* @param collectionHandle
|
||||
* The Collection handle of the SubmissionConfig to load
|
||||
* @param isWorkflow
|
||||
* whether or not we loading the Submission process for a
|
||||
* workflow item
|
||||
*
|
||||
* @return The cached SubmissionConfig for this collection
|
||||
*/
|
||||
private static SubmissionConfig loadSubmissionConfigFromCache(
|
||||
HttpSession session, String collectionHandle, boolean isWorkflow)
|
||||
{
|
||||
// attempt to load submission process config
|
||||
// from cache for the current collection
|
||||
String cachedHandle = (String) session
|
||||
.getAttribute("submission.config.collection");
|
||||
|
||||
Boolean cachedIsWorkflow = (Boolean) session
|
||||
.getAttribute("submission.config.isWorkflow");
|
||||
|
||||
// only load from cache if the collection handle and
|
||||
// workflow item status both match!
|
||||
if (collectionHandle.equals(cachedHandle)
|
||||
&& isWorkflow == cachedIsWorkflow.booleanValue())
|
||||
|
||||
{
|
||||
return (SubmissionConfig) session.getAttribute("submission.config");
|
||||
}
|
||||
else
|
||||
{
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
}
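The "stepNumber.pageNumber" keying that loadProgressBar() builds can be sketched in isolation. A minimal stand-alone illustration (ProgressBarSketch is hypothetical, not a DSpace class; here the step number is simply the array index, whereas the real code asks each SubmissionStepConfig for its step number and page count):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the progress-bar map: one ("step.page" -> heading)
// entry per visible page, in submission order (hence LinkedHashMap).
public class ProgressBarSketch {
    static Map<String, String> build(String[] headings, int[] pages) {
        Map<String, String> bar = new LinkedHashMap<>();
        for (int i = 0; i < headings.length; i++) {
            // steps without a heading are invisible, as in isVisible()
            if (headings[i] == null || headings[i].length() == 0) {
                continue;
            }
            for (int j = 1; j <= pages[i]; j++) {
                bar.put(i + "." + j, headings[i]);
            }
        }
        return bar;
    }

    public static void main(String[] args) {
        Map<String, String> bar = build(
                new String[] { "Describe", "", "Verify" },
                new int[] { 2, 1, 1 });
        System.out.println(bar); // {0.1=Describe, 0.2=Describe, 2.1=Verify}
    }
}
```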
@@ -1,220 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.io.Serializable;
import java.util.Map;

/**
 * Class representing the configuration of a single step within an Item
 * Submission Process. In other words, this is a single step in the
 * SubmissionConfig class. This class represents the structure of a single
 * 'step' node in the item-submission.xml configuration file.
 *
 * Note: Implements Serializable as it will be saved to the current session
 * during submission. Please ensure that nothing is added to this class that
 * isn't also serializable.
 *
 * @see org.dspace.app.util.SubmissionConfigReader
 * @see org.dspace.app.util.SubmissionConfig
 *
 * @author Tim Donohue
 * @version $Revision$
 */
public class SubmissionStepConfig implements Serializable
{
    /**
     * The identifier for the Select Collection step
     */
    public static final String SELECT_COLLECTION_STEP = "collection";

    /**
     * The identifier for the Completion step
     */
    public static final String COMPLETE_STEP = "complete";

    /**
     * the id for this step ('id' only exists if this step is defined in the
     * <step-definitions> section)
     */
    private String id = null;

    /** the heading for this step */
    private String heading = null;

    /** the name of the java processing class for this step */
    private String processingClassName = null;

    /** whether or not this step is editable during workflow (default=true) */
    private boolean workflowEditable = true;

    /**
     * The full name of the JSP-UI binding class for this step. This field is
     * ONLY used by the JSP-UI.
     */
    private String jspBindingClassName = null;

    /**
     * The full name of the Manakin XML-UI Transformer class which will generate
     * the necessary DRI for displaying this class in Manakin. This field is
     * ONLY used by the Manakin XML-UI.
     */
    private String xmlBindingClassName = null;

    /** The number of this step in the current SubmissionConfig */
    private int number = -1;

    /**
     * Class constructor for creating an empty SubmissionStepConfig object
     */
    public SubmissionStepConfig()
    {
    }

    /**
     * Class constructor for creating a SubmissionStepConfig object based on the
     * contents of a Map initialized by the SubmissionConfig object.
     *
     * @param stepMap
     *            the Map containing all required information about this step
     */
    public SubmissionStepConfig(Map<String, String> stepMap)
    {
        id = stepMap.get("id");
        heading = stepMap.get("heading");
        processingClassName = stepMap.get("processing-class");
        jspBindingClassName = stepMap.get("jspui-binding");
        xmlBindingClassName = stepMap.get("xmlui-binding");

        String wfEditString = stepMap.get("workflow-editable");
        if (wfEditString != null && wfEditString.length() > 0)
        {
            workflowEditable = Boolean.parseBoolean(wfEditString);
        }
    }

    /**
     * Get the ID for this step. An ID is only defined if the step exists in the
     * <step-definitions> section. This ID field is used to reference special
     * steps (like the required step with id="collection").
     *
     * @return the step ID
     */
    public String getId()
    {
        return id;
    }

    /**
     * Get the heading for this step. This can either be a property from
     * Messages.properties, or the actual heading text. If this "heading"
     * contains a period (.), it is assumed to reference Messages.properties.
     *
     * @return the heading
     */
    public String getHeading()
    {
        return heading;
    }

    /**
     * Get the class which handles all processing for this step.
     * <p>
     * This class must extend the org.dspace.submit.AbstractProcessingStep
     * class, and provide processing for BOTH the JSP-UI and XML-UI.
     *
     * @return the class's full class path (e.g.
     *         "org.dspace.submit.step.MySampleStep")
     */
    public String getProcessingClassName()
    {
        return processingClassName;
    }

    /**
     * Retrieve the full class name of the Manakin Transformer which will
     * generate this step's DRI, for display in the Manakin XML-UI.
     * <P>
     * This class must extend the
     * org.dspace.app.xmlui.aspect.submission.StepTransformer class.
     * <P>
     * This property is only used by the Manakin XML-UI, and therefore is not
     * relevant if you are using the JSP-UI.
     *
     * @return the full java class name of the Transformer to use for this step
     */
    public String getXMLUIClassName()
    {
        return xmlBindingClassName;
    }

    /**
     * Retrieve the full class name of the JSP-UI "binding" class which will
     * initialize and call the necessary JSPs for display in the JSP-UI.
     * <P>
     * This class must extend the org.dspace.app.webui.submit.JSPStep class.
     * <P>
     * This property is only used by the JSP-UI, and therefore is not relevant
     * if you are using the XML-UI (aka. Manakin).
     *
     * @return the full java class name of the JSPStep to use for this step
     */
    public String getJSPUIClassName()
    {
        return jspBindingClassName;
    }

    /**
     * Get the number of this step in the current Submission process config.
     * Step numbers start with #0 (although step #0 is ALWAYS the special
     * "select collection" step).
     *
     * @return the number of this step in the current SubmissionConfig
     */
    public int getStepNumber()
    {
        return number;
    }

    /**
     * Sets the number of this step in the current Submission process config.
     * Step numbers start with #0 (although step #0 is ALWAYS the special
     * "select collection" step).
     *
     * @param stepNum
     *            the step number.
     */
    protected void setStepNumber(int stepNum)
    {
        this.number = stepNum;
    }

    /**
     * Whether or not this step is editable during workflow processing. If
     * "true", then this step will appear in the "Edit Metadata" stage of the
     * workflow process.
     *
     * @return if step is editable in a workflow process
     */
    public boolean isWorkflowEditable()
    {
        return workflowEditable;
    }

    /**
     * Whether or not this step is visible within the Progress Bar. A step is
     * only visible if it has been assigned a Heading; otherwise it is
     * invisible.
     *
     * @return if step is visible within the progress bar
     */
    public boolean isVisible()
    {
        return ((heading != null) && (heading.length() > 0));
    }
}
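The workflow-editable parsing above (absent or empty means the default of true) can be checked with a small stand-alone sketch (StepMapDemo is hypothetical, not a DSpace class):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch mirroring how SubmissionStepConfig reads the
// "workflow-editable" entry from its step Map, defaulting to true.
public class StepMapDemo {
    static boolean workflowEditable(Map<String, String> stepMap) {
        boolean editable = true; // default, as in SubmissionStepConfig
        String wfEditString = stepMap.get("workflow-editable");
        if (wfEditString != null && wfEditString.length() > 0) {
            editable = Boolean.parseBoolean(wfEditString);
        }
        return editable;
    }

    public static void main(String[] args) {
        Map<String, String> step = new HashMap<>();
        step.put("heading", "submit.progressbar.describe");
        System.out.println(workflowEditable(step)); // true (key absent)
        step.put("workflow-editable", "false");
        System.out.println(workflowEditable(step)); // false
    }
}
```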
@@ -1,473 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Map;

import javax.servlet.http.HttpServletRequest;

import org.apache.commons.lang.ArrayUtils;
import org.w3c.dom.Document;

import org.dspace.content.Bitstream;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.DCDate;
import org.dspace.content.DCValue;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Constants;
import org.dspace.handle.HandleManager;

import com.sun.syndication.feed.module.DCModule;
import com.sun.syndication.feed.module.DCModuleImpl;
import com.sun.syndication.feed.module.Module;
import com.sun.syndication.feed.synd.SyndContent;
import com.sun.syndication.feed.synd.SyndContentImpl;
import com.sun.syndication.feed.synd.SyndEntry;
import com.sun.syndication.feed.synd.SyndEntryImpl;
import com.sun.syndication.feed.synd.SyndFeed;
import com.sun.syndication.feed.synd.SyndFeedImpl;
import com.sun.syndication.feed.synd.SyndImage;
import com.sun.syndication.feed.synd.SyndImageImpl;
import com.sun.syndication.feed.synd.SyndPerson;
import com.sun.syndication.feed.synd.SyndPersonImpl;
import com.sun.syndication.io.FeedException;
import com.sun.syndication.io.SyndFeedOutput;

import org.apache.log4j.Logger;

/**
 * Invoke the ROME library to assemble a generic model of a syndication feed
 * for the given list of Items and scope. Consults configuration for the
 * metadata bindings to feed elements. Uses ROME's output drivers to
 * return any of the implemented formats, e.g. RSS 1.0, RSS 2.0, Atom 1.0.
 *
 * The feed generator and OpenSearch call on this class so feed contents are
 * uniform for both.
 *
 * @author Larry Stone
 */
public class SyndicationFeed
{
    private static final Logger log = Logger.getLogger(SyndicationFeed.class);

    /** i18n key values */
    public static final String MSG_UNTITLED = "notitle";
    public static final String MSG_LOGO_TITLE = "logo.title";
    public static final String MSG_FEED_TITLE = "feed.title";
    public static final String MSG_FEED_DESCRIPTION = "general-feed.description";
    public static final String MSG_METADATA = "metadata.";
    public static final String MSG_UITYPE = "ui.type";

    // UI keywords
    public static final String UITYPE_XMLUI = "xmlui";
    public static final String UITYPE_JSPUI = "jspui";

    // default DC fields for entry
    private static String defaultTitleField = "dc.title";
    private static String defaultAuthorField = "dc.contributor.author";
    private static String defaultDateField = "dc.date.issued";
    private static String defaultDescriptionFields = "dc.description.abstract, dc.description, dc.title.alternative, dc.title";

    // metadata field for Item title in entry:
    private static String titleField =
        getDefaultedConfiguration("webui.feed.item.title", defaultTitleField);

    // metadata field for Item publication date in entry:
    private static String dateField =
        getDefaultedConfiguration("webui.feed.item.date", defaultDateField);

    // metadata fields for Item description in entry:
    private static String descriptionFields[] =
        getDefaultedConfiguration("webui.feed.item.description", defaultDescriptionFields).split("\\s*,\\s*");

    // metadata field for Item author in entry:
    private static String authorField =
        getDefaultedConfiguration("webui.feed.item.author", defaultAuthorField);

    // metadata field for Item dc:creator field in entry's DCModule (no default)
    private static String dcCreatorField = ConfigurationManager.getProperty("webui.feed.item.dc.creator");

    // metadata field for Item dc:date field in entry's DCModule (no default)
    private static String dcDateField = ConfigurationManager.getProperty("webui.feed.item.dc.date");

    // metadata field for Item dc:description field in entry's DCModule (no default)
    private static String dcDescriptionField = ConfigurationManager.getProperty("webui.feed.item.dc.description");

    // -------- Instance variables:

    // the feed object we are building
    private SyndFeed feed = null;

    // memory of the UI that called us, "xmlui" or "jspui";
    // affects Bitstream retrieval URL and i18n keys
    private String uiType = null;

    /**
     * Constructor.
     * @param ui either "xmlui" or "jspui"
     */
    public SyndicationFeed(String ui)
    {
        feed = new SyndFeedImpl();
        uiType = ui;
    }

    /**
     * Returns the list of metadata selectors used to compose the description
     * element.
     *
     * @return selector list - format 'schema.element[.qualifier]'
     */
    public static String[] getDescriptionSelectors()
    {
        return (String[]) ArrayUtils.clone(descriptionFields);
    }

    /**
     * Fills in the feed and entry-level metadata from DSpace objects.
     */
    public void populate(HttpServletRequest request, DSpaceObject dso,
            DSpaceObject items[], Map<String, String> labels)
    {
        String logoURL = null;
        String objectURL = null;
        String defaultTitle = null;

        // dso is null for the whole site, or a search without scope
        if (dso == null)
        {
            defaultTitle = ConfigurationManager.getProperty("dspace.name");
            feed.setDescription(localize(labels, MSG_FEED_DESCRIPTION));
            objectURL = resolveURL(request, null);
            logoURL = ConfigurationManager.getProperty("webui.feed.logo.url");
        }
        else
        {
            Bitstream logo = null;
            if (dso.getType() == Constants.COLLECTION)
            {
                Collection col = (Collection) dso;
                defaultTitle = col.getMetadata("name");
                feed.setDescription(col.getMetadata("short_description"));
                logo = col.getLogo();
            }
            else if (dso.getType() == Constants.COMMUNITY)
            {
                Community comm = (Community) dso;
                defaultTitle = comm.getMetadata("name");
                feed.setDescription(comm.getMetadata("short_description"));
                logo = comm.getLogo();
            }
            objectURL = resolveURL(request, dso);
            if (logo != null)
            {
                logoURL = urlOfBitstream(request, logo);
            }
        }
        feed.setTitle(labels.containsKey(MSG_FEED_TITLE)
                ? localize(labels, MSG_FEED_TITLE) : defaultTitle);
        feed.setLink(objectURL);
        feed.setPublishedDate(new Date());
        feed.setUri(objectURL);

        // add logo if we found one:
        if (logoURL != null)
        {
            // we use the path to the logo for this; the logo itself cannot
            // be contained in the rdf. Not all RSS viewers show this logo.
            SyndImage image = new SyndImageImpl();
            image.setLink(objectURL);
            image.setTitle(localize(labels, MSG_LOGO_TITLE));
            image.setUrl(logoURL);
            feed.setImage(image);
        }

        // add entries for items
        if (items != null)
        {
            List<SyndEntry> entries = new ArrayList<SyndEntry>();
            for (DSpaceObject itemDSO : items)
            {
                if (itemDSO.getType() != Constants.ITEM)
                {
                    continue;
                }
                Item item = (Item) itemDSO;
                boolean hasDate = false;
                SyndEntry entry = new SyndEntryImpl();
                entries.add(entry);

                String entryURL = resolveURL(request, item);
                entry.setLink(entryURL);
                entry.setUri(entryURL);

                String title = getOneDC(item, titleField);
                entry.setTitle(title == null ? localize(labels, MSG_UNTITLED) : title);

                // "published" date -- should be dc.date.issued
                String pubDate = getOneDC(item, dateField);
                if (pubDate != null)
                {
                    entry.setPublishedDate((new DCDate(pubDate)).toDate());
                    hasDate = true;
                }
                // date of last change to Item
                entry.setUpdatedDate(item.getLastModified());

                StringBuffer db = new StringBuffer();
                for (String df : descriptionFields)
                {
                    // Special Case: "(date)" in field name means render as date
                    boolean isDate = df.indexOf("(date)") > 0;
                    if (isDate)
                    {
                        df = df.replaceAll("\\(date\\)", "");
                    }

                    DCValue dcv[] = item.getMetadata(df);
                    if (dcv.length > 0)
                    {
                        String fieldLabel = labels.get(MSG_METADATA + df);
                        if (fieldLabel != null && fieldLabel.length() > 0)
                        {
                            db.append(fieldLabel).append(": ");
                        }
                        boolean first = true;
                        for (DCValue v : dcv)
                        {
                            if (first)
                            {
                                first = false;
                            }
                            else
                            {
                                db.append("; ");
                            }
                            db.append(isDate ? new DCDate(v.value).toString() : v.value);
                        }
                        db.append("\n");
                    }
                }
                if (db.length() > 0)
                {
                    SyndContent desc = new SyndContentImpl();
                    desc.setType("text/plain");
                    desc.setValue(db.toString());
                    entry.setDescription(desc);
                }

                // This gets the authors into an Atom feed
                DCValue authors[] = item.getMetadata(authorField);
                if (authors.length > 0)
                {
                    List<SyndPerson> creators = new ArrayList<SyndPerson>();
                    for (DCValue author : authors)
                    {
                        SyndPerson sp = new SyndPersonImpl();
                        sp.setName(author.value);
                        creators.add(sp);
                    }
                    entry.setAuthors(creators);
                }

                // only add a DC module if any DC fields are configured
                if (dcCreatorField != null || dcDateField != null
                        || dcDescriptionField != null)
                {
                    DCModule dc = new DCModuleImpl();
                    if (dcCreatorField != null)
                    {
                        DCValue dcAuthors[] = item.getMetadata(dcCreatorField);
                        if (dcAuthors.length > 0)
                        {
                            List<String> creators = new ArrayList<String>();
                            for (DCValue author : dcAuthors)
                            {
                                creators.add(author.value);
                            }
                            dc.setCreators(creators);
                        }
                    }
                    if (dcDateField != null && !hasDate)
                    {
                        DCValue v[] = item.getMetadata(dcDateField);
                        if (v.length > 0)
                        {
                            dc.setDate((new DCDate(v[0].value)).toDate());
                        }
                    }
                    if (dcDescriptionField != null)
                    {
                        DCValue v[] = item.getMetadata(dcDescriptionField);
                        if (v.length > 0)
                        {
                            StringBuffer descs = new StringBuffer();
                            for (DCValue d : v)
                            {
                                if (descs.length() > 0)
                                {
                                    descs.append("\n\n");
                                }
                                descs.append(d.value);
                            }
                            dc.setDescription(descs.toString());
                        }
                    }
                    entry.getModules().add(dc);
                }
            }
            feed.setEntries(entries);
        }
    }

    /**
     * Sets the feed type for XML delivery, e.g. "rss_1.0", "atom_1.0".
     * Must match one of ROME's configured generators; see rome.properties
     * (currently rss_1.0, rss_2.0, atom_1.0, atom_0.3).
     */
    public void setType(String feedType)
    {
        feed.setFeedType(feedType);
        // XXX FIXME: workaround for ROME 1.0 bug, it puts an invalid image
        // element in rss_1.0
        if ("rss_1.0".equals(feedType))
        {
            feed.setImage(null);
        }
    }

    /**
     * @return the feed we built as a W3C DOM Document
     */
    public Document outputW3CDom()
        throws FeedException
    {
        try
        {
            SyndFeedOutput feedWriter = new SyndFeedOutput();
            return feedWriter.outputW3CDom(feed);
        }
        catch (FeedException e)
        {
            log.error(e);
            throw e;
        }
    }

    /**
     * @return the feed we built as a serialized XML string
     */
    public String outputString()
        throws FeedException
    {
        SyndFeedOutput feedWriter = new SyndFeedOutput();
        return feedWriter.outputString(feed);
    }

    /**
     * Send the output to the designated Writer.
     */
    public void output(java.io.Writer writer)
        throws FeedException, IOException
    {
        SyndFeedOutput feedWriter = new SyndFeedOutput();
        feedWriter.output(feed, writer);
    }

    /**
     * Add a ROME plugin module (e.g. for OpenSearch) at the feed level.
     */
    public void addModule(Module m)
    {
        feed.getModules().add(m);
    }

    // utility to get a config property with a default value when not set
    private static String getDefaultedConfiguration(String key, String dfl)
    {
        String result = ConfigurationManager.getProperty(key);
        return (result == null) ? dfl : result;
    }

    // returns absolute URL to download content of bitstream (which might not belong to any Item)
    private String urlOfBitstream(HttpServletRequest request, Bitstream logo)
    {
        String name = logo.getName();
        return resolveURL(request, null)
                + (uiType.equalsIgnoreCase(UITYPE_XMLUI) ? "/bitstream/id/" : "/retrieve/")
                + logo.getID() + "/" + (name == null ? "" : name);
    }

    // cached base URL of the repository (used when dso is null)
    private String baseURL = null;

    /**
     * Return a URL to the DSpace object: either use the official Handle for
     * the item or build a URL based upon the current server.
     *
     * If the DSpaceObject is null then a local URL to the repository is
     * generated.
     *
     * @param dso The object to reference, or null for the repository itself.
     * @return the resolved URL
     */
    private String resolveURL(HttpServletRequest request, DSpaceObject dso)
    {
        // If no object is given then just link to the whole repository;
        // since no official handle exists, we have to use local resolution.
        if (dso == null)
        {
            if (baseURL == null)
            {
                if (request == null)
                {
                    baseURL = ConfigurationManager.getProperty("dspace.url");
                }
                else
                {
                    baseURL = (request.isSecure()) ? "https://" : "http://";
                    baseURL += ConfigurationManager.getProperty("dspace.hostname");
                    baseURL += ":" + request.getServerPort();
                    baseURL += request.getContextPath();
                }
            }
            return baseURL;
        }

        // return a link to the handle in this repository
        else if (ConfigurationManager.getBooleanProperty("webui.feed.localresolve"))
        {
            return resolveURL(request, null) + "/handle/" + dso.getHandle();
        }

        // link to the Handle server or other persistent URL source
        else
        {
            return HandleManager.getCanonicalForm(dso.getHandle());
        }
    }

    // retrieve text for a localization key, or mark it untranslated
    private String localize(Map<String, String> labels, String s)
    {
        return labels.containsKey(s) ? labels.get(s) : ("Untranslated:" + s);
    }

    // spoonful of syntactic sugar for when we only need the first value
    private String getOneDC(Item item, String field)
    {
        DCValue dcv[] = item.getMetadata(field);
        return (dcv.length > 0) ? dcv[0].value : null;
    }
}
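The "(date)" suffix convention that populate() applies to the description selectors can be shown in isolation (DateSuffixDemo is a hypothetical stand-alone snippet, not DSpace API):

```java
// Hypothetical sketch: a "(date)" suffix on a description selector marks the
// value for date rendering; the marker is stripped before the metadata lookup.
public class DateSuffixDemo {
    public static void main(String[] args) {
        String df = "dc.date.issued(date)";
        boolean isDate = df.indexOf("(date)") > 0; // suffix present?
        if (isDate) {
            df = df.replaceAll("\\(date\\)", "");  // strip marker before lookup
        }
        System.out.println(isDate + " " + df);     // true dc.date.issued
    }
}
```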
@@ -1,353 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.app.util;

import java.io.IOException;
import java.io.InputStream;
import java.text.DecimalFormat;
import java.text.NumberFormat;
import java.util.Enumeration;
import java.util.Locale;
import java.util.Properties;

import javax.servlet.http.HttpServletRequest;

import org.apache.log4j.Logger;
import org.dspace.core.Constants;

/**
 * Miscellaneous utility methods
 *
 * @author Robert Tansley
 * @author Mark Diggory
 * @version $Revision$
 */
public class Util {
    // cache for source version result
    private static String sourceVersion = null;

    private static Logger log = Logger.getLogger(Util.class);

    /**
     * Utility method to convert spaces in a string to HTML non-break space
     * elements.
     *
     * @param s
     *            string to change spaces in
     * @return the string passed in with spaces converted to HTML non-break
     *         spaces
     */
    public static String nonBreakSpace(String s) {
        StringBuffer newString = new StringBuffer();

        for (int i = 0; i < s.length(); i++)
        {
            char ch = s.charAt(i);

            if (ch == ' ')
            {
                newString.append("&nbsp;");
            }
            else
            {
                newString.append(ch);
            }
        }

        return newString.toString();
    }

    /**
     * Encode a bitstream name for inclusion in a URL in an HTML document. This
     * differs from the usual URL-encoding, since we want pathname separators to
     * be passed through verbatim; this is required so that relative paths in
     * bitstream names and HTML references work correctly.
     * <P>
     * If the link to a bitstream is generated with the pathname separators
     * escaped (e.g. "%2F" instead of "/") then the Web user agent perceives it
     * to be one pathname element, and relative URI paths within that document
     * containing ".." elements will be handled incorrectly.
     *
     * @param stringIn
     *            input string to encode
     * @param encoding
     *            character encoding, e.g. UTF-8
     * @return the encoded string
     */
    public static String encodeBitstreamName(String stringIn, String encoding) throws java.io.UnsupportedEncodingException {
        // FIXME: This should be moved elsewhere, as it is used outside the UI
        StringBuffer out = new StringBuffer();

        final String[] pctEncoding = { "%00", "%01", "%02", "%03", "%04",
                "%05", "%06", "%07", "%08", "%09", "%0a", "%0b", "%0c", "%0d",
                "%0e", "%0f", "%10", "%11", "%12", "%13", "%14", "%15", "%16",
                "%17", "%18", "%19", "%1a", "%1b", "%1c", "%1d", "%1e", "%1f",
                "%20", "%21", "%22", "%23", "%24", "%25", "%26", "%27", "%28",
                "%29", "%2a", "%2b", "%2c", "%2d", "%2e", "%2f", "%30", "%31",
                "%32", "%33", "%34", "%35", "%36", "%37", "%38", "%39", "%3a",
                "%3b", "%3c", "%3d", "%3e", "%3f", "%40", "%41", "%42", "%43",
                "%44", "%45", "%46", "%47", "%48", "%49", "%4a", "%4b", "%4c",
                "%4d", "%4e", "%4f", "%50", "%51", "%52", "%53", "%54", "%55",
                "%56", "%57", "%58", "%59", "%5a", "%5b", "%5c", "%5d", "%5e",
                "%5f", "%60", "%61", "%62", "%63", "%64", "%65", "%66", "%67",
                "%68", "%69", "%6a", "%6b", "%6c", "%6d", "%6e", "%6f", "%70",
                "%71", "%72", "%73", "%74", "%75", "%76", "%77", "%78", "%79",
                "%7a", "%7b", "%7c", "%7d", "%7e", "%7f", "%80", "%81", "%82",
                "%83", "%84", "%85", "%86", "%87", "%88", "%89", "%8a", "%8b",
                "%8c", "%8d", "%8e", "%8f", "%90", "%91", "%92", "%93", "%94",
                "%95", "%96", "%97", "%98", "%99", "%9a", "%9b", "%9c", "%9d",
                "%9e", "%9f", "%a0", "%a1", "%a2", "%a3", "%a4", "%a5", "%a6",
                "%a7", "%a8", "%a9", "%aa", "%ab", "%ac", "%ad", "%ae", "%af",
                "%b0", "%b1", "%b2", "%b3", "%b4", "%b5", "%b6", "%b7", "%b8",
                "%b9", "%ba", "%bb", "%bc", "%bd", "%be", "%bf", "%c0", "%c1",
                "%c2", "%c3", "%c4", "%c5", "%c6", "%c7", "%c8", "%c9", "%ca",
                "%cb", "%cc", "%cd", "%ce", "%cf", "%d0", "%d1", "%d2", "%d3",
                "%d4", "%d5", "%d6", "%d7", "%d8", "%d9", "%da", "%db", "%dc",
                "%dd", "%de", "%df", "%e0", "%e1", "%e2", "%e3", "%e4", "%e5",
                "%e6", "%e7", "%e8", "%e9", "%ea", "%eb", "%ec", "%ed", "%ee",
                "%ef", "%f0", "%f1", "%f2", "%f3", "%f4", "%f5", "%f6", "%f7",
                "%f8", "%f9", "%fa", "%fb", "%fc", "%fd", "%fe", "%ff" };

        byte[] bytes = stringIn.getBytes(encoding);

        for (int i = 0; i < bytes.length; i++)
        {
            // Any unreserved char or "/" goes through unencoded
            if ((bytes[i] >= 'A' && bytes[i] <= 'Z')
                    || (bytes[i] >= 'a' && bytes[i] <= 'z')
                    || (bytes[i] >= '0' && bytes[i] <= '9') || bytes[i] == '-'
                    || bytes[i] == '.' || bytes[i] == '_' || bytes[i] == '~'
                    || bytes[i] == '/')
            {
                out.append((char) bytes[i]);
            }
            else if (bytes[i] >= 0)
            {
                // encode other chars (byte code < 128)
                out.append(pctEncoding[bytes[i]]);
            }
            else
            {
                // encode other chars (byte code > 127, so it appears as
                // negative in Java's signed byte data type)
                out.append(pctEncoding[256 + bytes[i]]);
            }
        }
        log.debug("encoded \"" + stringIn + "\" to \"" + out.toString() + "\"");

        return out.toString();
    }

    /**
     * Version of encodeBitstreamName with one parameter; uses the default
     * encoding.
     *
     * @param stringIn
     *            input string to encode
     * @return the encoded string
     */
    public static String encodeBitstreamName(String stringIn) throws java.io.UnsupportedEncodingException {
        return encodeBitstreamName(stringIn, Constants.DEFAULT_ENCODING);
    }

    /**
     * Formats the file size. Examples:
     *
     * - 50 = 50B
     * - 1024 = 1KB
     * - 1024000 = 1MB, etc.
     *
     * The numbers are formatted using Java Locales.
     *
     * @param in The number to convert
     * @return the file size as a String
     */
    public static String formatFileSize(double in) {
        // Work out the size of the file, and format appropriately
        // FIXME: When full i18n support is available, use the user's Locale
        // rather than the default Locale.
        NumberFormat nf = NumberFormat.getNumberInstance(Locale.getDefault());
        DecimalFormat df = (DecimalFormat) nf;
        df.applyPattern("###,###.##");
        if (in < 1024)
        {
            df.applyPattern("0");
            return df.format(in) + " " + "B";
        }
        else if (in < 1024000)
        {
            in = in / 1024;
            return df.format(in) + " " + "kB";
        }
        else if (in < 1024000000)
        {
            in = in / 1024000;
            return df.format(in) + " " + "MB";
        }
        else
        {
            in = in / 1024000000;
            return df.format(in) + " " + "GB";
        }
    }

    /**
     * Obtain a parameter from the given request as an int. <code>-1</code> is
     * returned if the parameter is garbled or does not exist.
     *
     * @param request
     *            the HTTP request
     * @param param
     *            the name of the parameter
     *
     * @return the integer value of the parameter, or -1
     */
    public static int getIntParameter(HttpServletRequest request, String param)
    {
        String val = request.getParameter(param);

        try
        {
            return Integer.parseInt(val.trim());
        }
        catch (Exception e)
        {
            // Problem with parameter
            return -1;
        }
    }

    /**
     * Obtain an array of int parameters from the given request.
     * <code>null</code> is returned if the parameter doesn't exist.
     * <code>-1</code> is returned in array locations if that particular value
     * is garbled.
     *
     * @param request
     *            the HTTP request
     * @param param
     *            the name of the parameter
     *
     * @return array of integers, or null
     */
    public static int[] getIntParameters(HttpServletRequest request,
            String param)
    {
        String[] request_values = request.getParameterValues(param);

        if (request_values == null)
        {
            return null;
        }

        int[] return_values = new int[request_values.length];

        for (int x = 0; x < return_values.length; x++)
        {
            try
            {
                return_values[x] = Integer.parseInt(request_values[x]);
            }
            catch (Exception e)
            {
                // Problem with parameter, stuff -1 in this slot
                return_values[x] = -1;
            }
        }

        return return_values;
    }

    /**
     * Obtain a parameter from the given request as a boolean.
     * <code>false</code> is returned if the parameter is garbled or does not
     * exist.
     *
     * @param request
     *            the HTTP request
     * @param param
     *            the name of the parameter
     *
     * @return the boolean value of the parameter, or false
     */
    public static boolean getBoolParameter(HttpServletRequest request,
            String param)
    {
        return ((request.getParameter(param) != null) && request.getParameter(
                param).equals("true"));
    }

    /**
     * Get the button the user pressed on a submitted form. All buttons should
     * start with the text <code>submit</code> for this to work. A default
     * should be supplied, since often the browser will submit a form with no
     * submit button pressed if the user presses enter.
     *
     * @param request
     *            the HTTP request
     * @param def
     *            the default button
     *
     * @return the button pressed
     */
    public static String getSubmitButton(HttpServletRequest request, String def)
    {
        Enumeration e = request.getParameterNames();

        while (e.hasMoreElements())
        {
            String parameterName = (String) e.nextElement();

            if (parameterName.startsWith("submit"))
            {
                return parameterName;
            }
        }

        return def;
    }

    /**
     * Gets the Maven version string of the source that built this instance.
     * @return string containing version, e.g. "1.5.2"; ends in "-SNAPSHOT" for development versions.
     */
    public static String getSourceVersion()
    {
        if (sourceVersion == null)
        {
            Properties constants = new Properties();

            InputStream cis = null;
            try
            {
                cis = Util.class.getResourceAsStream("/META-INF/maven/org.dspace/dspace-api/pom.properties");
                constants.load(cis);
            }
            catch (Exception e)
            {
                log.error(e.getMessage(), e);
            }
            finally
            {
                if (cis != null)
                {
                    try
                    {
                        cis.close();
                    }
                    catch (IOException e)
                    {
                        log.error("Unable to close input stream", e);
                    }
                }
            }

            sourceVersion = constants.getProperty("version", "none");
        }
        return sourceVersion;
    }
}
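The `formatFileSize` method above mixes binary and decimal multipliers (1024, then 1024000, then 1024000000), reproduced faithfully in this standalone sketch of its threshold logic (the class name `FileSizeSketch` is illustrative, not DSpace API):

```java
import java.text.DecimalFormat;

// Standalone sketch of formatFileSize's thresholds; mirrors the original's
// mixed 1024 / 1024000 / 1024000000 cut-offs as-is.
public class FileSizeSketch {
    static String format(double in) {
        DecimalFormat df = new DecimalFormat("###,###.##");
        if (in < 1024) {
            df.applyPattern("0");                 // whole bytes, no fraction
            return df.format(in) + " B";
        } else if (in < 1024000) {
            return df.format(in / 1024) + " kB";
        } else if (in < 1024000000) {
            return df.format(in / 1024000) + " MB";
        } else {
            return df.format(in / 1024000000) + " GB";
        }
    }

    public static void main(String[] args) {
        System.out.println(format(2048)); // "2 kB"
    }
}
```

Like the original, the output for larger values depends on the default Locale's grouping and decimal separators.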
@@ -1,329 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authenticate;

import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.log4j.Logger;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.dspace.core.LogManager;
import org.dspace.eperson.EPerson;
import org.dspace.eperson.Group;

/**
 * Adds users to special groups based on IP address. Configuration parameter
 * form is:
 * <P>
 * {@code authentication.ip.<GROUPNAME> = <IPRANGE>[, <IPRANGE> ...]}
 * <P>
 * e.g. {@code authentication.ip.MIT = 18., 192.25.0.0/255.255.0.0}
 * <P>
 * Negative matches can be included by prepending the range with a '-'. For
 * example, if you want to include all of a class B network except for users of
 * a contained class C network, you could use:
 * <P>
 * 111.222,-111.222.333.
 * <p>
 * For supported IP ranges see {@link org.dspace.authenticate.IPMatcher}.
 *
 * @version $Revision$
 * @author Robert Tansley
 */
public class IPAuthentication implements AuthenticationMethod
{
    /** Our logger */
    private static Logger log = Logger.getLogger(IPAuthentication.class);

    /** Whether to look for x-forwarded headers for logging IP addresses */
    private static Boolean useProxies;

    /** All the IP matchers */
    private List<IPMatcher> ipMatchers;

    /** All the negative IP matchers */
    private List<IPMatcher> ipNegativeMatchers;

    /**
     * Maps IPMatchers to group names when we don't know the group DB ID yet.
     * When the DB ID is known, the IPMatcher is moved to ipMatcherGroupIDs and
     * then points to the DB ID.
     */
    private Map<IPMatcher, String> ipMatcherGroupNames;

    /** Maps IPMatchers to group IDs (Integers) where we know the group DB ID */
    private Map<IPMatcher, Integer> ipMatcherGroupIDs;

    /**
     * Initialize an IP authenticator, reading in the configuration. Note this
     * will never fail if the configuration is bad -- a warning will be logged.
     */
    public IPAuthentication()
    {
        ipMatchers = new ArrayList<IPMatcher>();
        ipNegativeMatchers = new ArrayList<IPMatcher>();
        ipMatcherGroupIDs = new HashMap<IPMatcher, Integer>();
        ipMatcherGroupNames = new HashMap<IPMatcher, String>();

        Enumeration e = ConfigurationManager.propertyNames();

        while (e.hasMoreElements())
        {
            String propName = (String) e.nextElement();
            if (propName.startsWith("authentication.ip."))
            {
                String[] nameParts = propName.split("\\.");

                if (nameParts.length == 3)
                {
                    addMatchers(nameParts[2], ConfigurationManager
                            .getProperty(propName));
                }
                else
                {
                    log.warn("Malformed configuration property name: "
                            + propName);
                }
            }
        }
    }

    /**
     * Add matchers for the given comma-delimited IP ranges and group.
     *
     * @param groupName
     *            name of group
     * @param ipRanges
     *            IP ranges
     */
    private void addMatchers(String groupName, String ipRanges)
    {
        String[] ranges = ipRanges.split("\\s*,\\s*");

        for (String entry : ranges)
        {
            try
            {
                IPMatcher ipm;
                if (entry.startsWith("-"))
                {
                    ipm = new IPMatcher(entry.substring(1));
                    ipNegativeMatchers.add(ipm);
                }
                else
                {
                    ipm = new IPMatcher(entry);
                    ipMatchers.add(ipm);
                }
                ipMatcherGroupNames.put(ipm, groupName);

                if (log.isDebugEnabled())
                {
                    log.debug("Configured " + entry + " for special group "
                            + groupName);
                }
            }
            catch (IPMatcherException ipme)
            {
                log.warn("Malformed IP range specified for group " + groupName,
                        ipme);
            }
        }
    }

    public boolean canSelfRegister(Context context, HttpServletRequest request,
            String username) throws SQLException
    {
        return false;
    }

    public void initEPerson(Context context, HttpServletRequest request,
            EPerson eperson) throws SQLException
    {
    }

    public boolean allowSetPassword(Context context,
            HttpServletRequest request, String username) throws SQLException
    {
        return false;
    }

    public boolean isImplicit()
    {
        return true;
    }

    public int[] getSpecialGroups(Context context, HttpServletRequest request)
            throws SQLException
    {
        if (request == null)
        {
            return new int[0];
        }
        List<Integer> groupIDs = new ArrayList<Integer>();

        // Get the user's IP address
        String addr = request.getRemoteAddr();
        if (useProxies == null) {
            useProxies = ConfigurationManager.getBooleanProperty("useProxies", false);
        }
        if (useProxies && request.getHeader("X-Forwarded-For") != null)
        {
            /* This header is a comma-delimited list */
            for (String xfip : request.getHeader("X-Forwarded-For").split(","))
            {
                if (!request.getHeader("X-Forwarded-For").contains(addr))
                {
                    addr = xfip.trim();
                }
            }
        }

        for (IPMatcher ipm : ipMatchers)
        {
            try
            {
                if (ipm.match(addr))
                {
                    // Do we know the group ID?
                    Integer g = ipMatcherGroupIDs.get(ipm);
                    if (g != null)
                    {
                        groupIDs.add(g);
                    }
                    else
                    {
                        // See if we have a group name
                        String groupName = ipMatcherGroupNames.get(ipm);

                        if (groupName != null)
                        {
                            Group group = Group.findByName(context, groupName);
                            if (group != null)
                            {
                                // Add ID so we won't have to do the lookup again
                                ipMatcherGroupIDs.put(ipm, Integer.valueOf(group.getID()));
                                ipMatcherGroupNames.remove(ipm);

                                groupIDs.add(Integer.valueOf(group.getID()));
                            }
                            else
                            {
                                log.warn(LogManager.getHeader(context,
                                        "configuration_error", "unknown_group="
                                                + groupName));
                            }
                        }
                    }
                }
            }
            catch (IPMatcherException ipme)
            {
                log.warn(LogManager.getHeader(context, "configuration_error",
                        "bad_ip=" + addr), ipme);
            }
        }

        // Now remove any negative matches
        for (IPMatcher ipm : ipNegativeMatchers)
        {
            try
            {
                if (ipm.match(addr))
                {
                    // Do we know the group ID?
                    Integer g = ipMatcherGroupIDs.get(ipm);
                    if (g != null)
                    {
                        groupIDs.remove(g);
                    }
                    else
                    {
                        // See if we have a group name
                        String groupName = ipMatcherGroupNames.get(ipm);

                        if (groupName != null)
                        {
                            Group group = Group.findByName(context, groupName);
                            if (group != null)
                            {
                                // Add ID so we won't have to do the lookup again
                                ipMatcherGroupIDs.put(ipm, Integer.valueOf(group.getID()));
                                ipMatcherGroupNames.remove(ipm);

                                groupIDs.remove(Integer.valueOf(group.getID()));
                            }
                            else
                            {
                                log.warn(LogManager.getHeader(context,
                                        "configuration_error", "unknown_group="
                                                + groupName));
                            }
                        }
                    }
                }
            }
            catch (IPMatcherException ipme)
            {
                log.warn(LogManager.getHeader(context, "configuration_error",
                        "bad_ip=" + addr), ipme);
            }
        }

        int[] results = new int[groupIDs.size()];
        for (int i = 0; i < groupIDs.size(); i++)
        {
            results[i] = (groupIDs.get(i)).intValue();
        }

        if (log.isDebugEnabled())
        {
            StringBuffer gsb = new StringBuffer();

            for (int i = 0; i < results.length; i++)
            {
                if (i > 0)
                {
                    gsb.append(",");
                }
                gsb.append(results[i]);
            }

            log.debug(LogManager.getHeader(context, "authenticated",
                    "special_groups=" + gsb.toString()));
        }

        return results;
    }

    public int authenticate(Context context, String username, String password,
            String realm, HttpServletRequest request) throws SQLException
    {
        return BAD_ARGS;
    }

    public String loginPageURL(Context context, HttpServletRequest request,
            HttpServletResponse response)
    {
        return null;
    }

    public String loginPageTitle(Context context)
    {
        return null;
    }
}
@@ -1,198 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authenticate;

/**
 * Quickly tests whether a given IPv4 4-byte address matches an IP range. An
 * {@code IPMatcher} is initialized with a particular IP range specification.
 * Calls to the {@link IPMatcher#match(String) match} method will then quickly
 * determine whether a given IP falls within that range.
 * <p>
 * Supported range specifications are:
 * <p>
 * <ul>
 * <li>Full IP address, e.g. {@code 12.34.56.78}</li>
 * <li>Partial IP address, e.g. {@code 12.34} (which matches any IP starting
 * {@code 12.34})</li>
 * <li>Network/netmask, e.g. {@code 18.25.0.0/255.255.0.0}</li>
 * <li>CIDR slash notation, e.g. {@code 18.25.0.0/16}</li>
 * </ul>
 *
 * @version $Revision$
 * @author Robert Tansley
 */
public class IPMatcher
{
    /** Network to match */
    private byte[] network;

    /** Network mask */
    private byte[] netmask;

    /**
     * Construct an IPMatcher that will test for the given IP specification
     *
     * @param ipSpec
     *            IP specification (full or partial IP address,
     *            network/netmask, or network/CIDR)
     * @throws IPMatcherException
     *             if there is an error parsing the specification (i.e. it is
     *             somehow malformed)
     */
    public IPMatcher(String ipSpec) throws IPMatcherException
    {
        // Boil all specs down to network + mask
        network = new byte[4];
        netmask = new byte[] { -1, -1, -1, -1 };

        // Allow partial IP
        boolean mustHave4 = false;

        String ipPart = ipSpec;
        String[] parts = ipSpec.split("/");

        switch (parts.length)
        {
        case 2:
            // Some kind of slash notation -- we'll need a full network IP
            ipPart = parts[0];
            mustHave4 = true;

            String[] maskParts = parts[1].split("\\.");
            if (maskParts.length == 1)
            {
                // CIDR slash notation
                int x;

                try
                {
                    x = Integer.parseInt(maskParts[0]);
                }
                catch (NumberFormatException nfe)
                {
                    throw new IPMatcherException(
                            "Malformed IP range specification " + ipSpec, nfe);
                }

                if (x < 0 || x > 32)
                {
                    throw new IPMatcherException();
                }

                int fullMask = -1 << (32 - x);
                netmask[0] = (byte) ((fullMask & 0xFF000000) >>> 24);
                netmask[1] = (byte) ((fullMask & 0x00FF0000) >>> 16);
                netmask[2] = (byte) ((fullMask & 0x0000FF00) >>> 8);
                netmask[3] = (byte) (fullMask & 0x000000FF);
            }
            else
            {
                // full subnet specified
                ipToBytes(parts[1], netmask, true);
            }
            // fall through: still need to parse the network IP part

        case 1:
            // Get IP
            int partCount = ipToBytes(ipPart, network, mustHave4);

            // If partial IP, set mask for remaining bytes
            for (int i = 3; i >= partCount; i--)
            {
                netmask[i] = 0;
            }

            break;

        default:
            throw new IPMatcherException("Malformed IP range specification "
                    + ipSpec);
        }
    }

    /**
     * Fill out a given four-byte array with the IP address specified in the
     * given String
     *
     * @param ip
     *            IP address as a dot-delimited String
     * @param bytes
     *            4-byte array to fill out
     * @param mustHave4
     *            if true, will require that the given IP string specify all
     *            four bytes
     * @return the number of actual IP bytes found in the given IP address
     *         String
     * @throws IPMatcherException
     *             if there is a problem parsing the IP string -- e.g. a number
     *             outside of range 0-255, too many numbers, or fewer than 4
     *             numbers if {@code mustHave4} is true
     */
    private int ipToBytes(String ip, byte[] bytes, boolean mustHave4)
            throws IPMatcherException
    {
        String[] parts = ip.split("\\.");

        if (parts.length > 4 || (mustHave4 && parts.length != 4))
        {
            throw new IPMatcherException("Malformed IP specification " + ip);
        }

        try
        {
            for (int i = 0; i < parts.length; i++)
            {
                int p = Integer.parseInt(parts[i]);
                if (p < 0 || p > 255)
                {
                    throw new IPMatcherException("Malformed IP specification "
                            + ip);
                }

                bytes[i] = (byte) (p < 128 ? p : p - 256);
            }
        }
        catch (NumberFormatException nfe)
        {
            throw new IPMatcherException("Malformed IP specification " + ip,
                    nfe);
        }

        return parts.length;
    }

    /**
     * Determine whether the given full IP falls within the range this
     * {@code IPMatcher} was initialized with.
     *
     * @param ipIn
     *            IP address as dot-delimited String
     * @return {@code true} if the IP matches the range of this
     *         {@code IPMatcher}; {@code false} otherwise
     * @throws IPMatcherException
     *             if the IP passed in cannot be parsed correctly (i.e. is
     *             malformed)
     */
    public boolean match(String ipIn) throws IPMatcherException
    {
        byte[] bytes = new byte[4];

        ipToBytes(ipIn, bytes, true);

        for (int i = 0; i < 4; i++)
        {
            if ((bytes[i] & netmask[i]) != (network[i] & netmask[i]))
            {
                return false;
            }
        }

        return true;
    }
}
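The core of `IPMatcher.match` is the masked comparison: an address matches when `(addr & mask) == (network & mask)` byte by byte. A tiny standalone illustration of the same idea using 32-bit integers instead of byte arrays (class and method names are hypothetical, not DSpace API; it handles only the CIDR form, not partial addresses or netmask notation):

```java
// Minimal CIDR-match sketch: same masked comparison as IPMatcher.match,
// done on a single 32-bit int rather than four bytes.
public class CidrSketch {
    static boolean matches(String ip, String cidr) {
        String[] parts = cidr.split("/");
        int net = toInt(parts[0]);
        int bits = Integer.parseInt(parts[1]);
        // -1 << (32 - bits) puts 1s in the top `bits` positions, as in IPMatcher
        int mask = (bits == 0) ? 0 : -1 << (32 - bits);
        return (toInt(ip) & mask) == (net & mask);
    }

    // Pack a dotted-quad address into one int, most significant octet first.
    static int toInt(String ip) {
        int v = 0;
        for (String p : ip.split("\\.")) {
            v = (v << 8) | Integer.parseInt(p);
        }
        return v;
    }

    public static void main(String[] args) {
        System.out.println(matches("18.25.3.4", "18.25.0.0/16")); // true
    }
}
```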
@@ -1,37 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authenticate;

/**
 * Thrown when there is a problem parsing an IP matcher specification.
 *
 * @version $Revision$
 * @author Robert Tansley
 */
public class IPMatcherException extends Exception
{
    public IPMatcherException()
    {
        super();
    }

    public IPMatcherException(String message)
    {
        super(message);
    }

    public IPMatcherException(Throwable cause)
    {
        super(cause);
    }

    public IPMatcherException(String message, Throwable cause)
    {
        super(message, cause);
    }
}
@@ -1,579 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authenticate;

import java.sql.SQLException;
import java.util.Hashtable;

import javax.naming.NamingEnumeration;
import javax.naming.NamingException;
import javax.naming.directory.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.lang.StringUtils;
import org.apache.log4j.Logger;
import org.dspace.authorize.AuthorizeException;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.dspace.core.LogManager;
import org.dspace.eperson.EPerson;
import org.dspace.eperson.Group;

/**
 * This LDAP authentication method is more complex than the simple 'LDAPAuthentication'
 * in that it allows authentication against structured hierarchical LDAP trees of
 * users. An initial bind is required using a user name and password in order to
 * search the tree and find the DN of the user. A second bind is then required to
 * check the credentials of the user by binding directly to their DN.
 *
 * @author Stuart Lewis, Chris Yates, Alex Barbieri, Flavio Botelho, Reuben Pasquini
 * @version $Revision$
 */
public class LDAPHierarchicalAuthentication
    implements AuthenticationMethod {

    /** log4j category */
    private static Logger log = Logger.getLogger(LDAPHierarchicalAuthentication.class);

    /**
     * Let a real auth method return true if it wants.
     */
    public boolean canSelfRegister(Context context,
                                   HttpServletRequest request,
                                   String username)
        throws SQLException
    {
        // Looks to see if webui.ldap.autoregister is set or not
        return ConfigurationManager.getBooleanProperty("webui.ldap.autoregister");
    }

    /**
     * Nothing here, initialization is done when auto-registering.
     */
    public void initEPerson(Context context, HttpServletRequest request,
            EPerson eperson)
        throws SQLException
    {
        // XXX should we try to initialize netid based on email addr,
        // XXX for eperson created by some other method??
    }

    /**
     * Cannot change an LDAP password through DSpace, right?
     */
    public boolean allowSetPassword(Context context,
                                    HttpServletRequest request,
                                    String username)
        throws SQLException
    {
        // XXX is this right?
        return false;
    }

    /*
     * This is an explicit method.
     */
    public boolean isImplicit()
    {
        return false;
    }

    /*
     * Add authenticated users to the group defined in dspace.cfg by
     * the ldap.login.specialgroup key.
     */
    public int[] getSpecialGroups(Context context, HttpServletRequest request)
    {
        // Prevents anonymous users from being added to this group, and the second check
        // ensures they are LDAP users
        try
        {
            if (!context.getCurrentUser().getNetid().equals(""))
            {
                String groupName = ConfigurationManager.getProperty("ldap.login.specialgroup");
                if ((groupName != null) && (!groupName.trim().equals("")))
                {
                    Group ldapGroup = Group.findByName(context, groupName);
                    if (ldapGroup == null)
                    {
                        // Oops - the group isn't there.
                        log.warn(LogManager.getHeader(context,
                                "ldap_specialgroup",
                                "Group defined in ldap.login.specialgroup does not exist"));
                        return new int[0];
                    }
                    else
                    {
                        return new int[] { ldapGroup.getID() };
                    }
                }
            }
        }
        catch (Exception npe)
        {
            // The user is not an LDAP user, so we don't need to worry about them
        }
        return new int[0];
    }

    /*
     * Authenticate the given credentials.
     * This is the heart of the authentication method: test the
     * credentials for authenticity, and if accepted, attempt to match
     * (or optionally, create) an <code>EPerson</code>. If an <code>EPerson</code> is found, it is
     * set in the <code>Context</code> that was passed.
     *
     * @param context
     *            DSpace context, will be modified (ePerson set) upon success.
     *
     * @param username
     *            Username (or email address) when method is explicit. Use null for
     *            implicit method.
     *
     * @param password
     *            Password for explicit auth, or null for implicit method.
     *
     * @param realm
     *            Realm is an extra parameter used by some authentication methods; leave null if
     *            not applicable.
     *
     * @param request
     *            The HTTP request that started this operation, or null if not applicable.
     *
     * @return One of:
     *         SUCCESS, BAD_CREDENTIALS, CERT_REQUIRED, NO_SUCH_USER, BAD_ARGS
     *         <p>Meaning:
     *         <br>SUCCESS - authenticated OK.
     *         <br>BAD_CREDENTIALS - user exists, but credentials (e.g. passwd) don't match
     *         <br>CERT_REQUIRED - not allowed to login this way without X.509 cert.
     *         <br>NO_SUCH_USER - user not found using this method.
     *         <br>BAD_ARGS - user/pw not appropriate for this method
     */
    public int authenticate(Context context,
                            String netid,
                            String password,
                            String realm,
                            HttpServletRequest request)
        throws SQLException
    {
        log.info(LogManager.getHeader(context, "auth", "attempting trivial auth of user=" + netid));

        // Skip out when no netid or password is given.
        if (netid == null || password == null)
        {
            return BAD_ARGS;
        }

        // Locate the eperson
        EPerson eperson = null;
        try
        {
            eperson = EPerson.findByNetid(context, netid.toLowerCase());
        }
        catch (SQLException e)
        {
            // ignore -- treated the same as "not found"
        }
        SpeakerToLDAP ldap = new SpeakerToLDAP(log);

        // Get the DN of the user
        String adminUser = ConfigurationManager.getProperty("ldap.search.user");
        String adminPassword = ConfigurationManager.getProperty("ldap.search.password");
        String dn = ldap.getDNOfUser(adminUser, adminPassword, context, netid);

        // Check a DN was found
        if ((dn == null) || (dn.trim().equals("")))
        {
            log.info(LogManager
                .getHeader(context, "failed_login", "no DN found for user " + netid));
|
||||
return BAD_CREDENTIALS;
|
||||
}
|
||||
|
||||
// if they entered a netid that matches an eperson
|
||||
if (eperson != null)
|
||||
{
|
||||
// e-mail address corresponds to active account
|
||||
if (eperson.getRequireCertificate())
|
||||
{
|
||||
return CERT_REQUIRED;
|
||||
}
|
||||
else if (!eperson.canLogIn())
|
||||
{
|
||||
return BAD_ARGS;
|
||||
}
|
||||
|
||||
if (ldap.ldapAuthenticate(dn, password, context))
|
||||
{
|
||||
context.setCurrentUser(eperson);
|
||||
log.info(LogManager
|
||||
.getHeader(context, "authenticate", "type=ldap"));
|
||||
return SUCCESS;
|
||||
}
|
||||
else
|
||||
{
|
||||
return BAD_CREDENTIALS;
|
||||
}
|
||||
}
|
||||
|
||||
// the user does not already exist so try and authenticate them
|
||||
// with ldap and create an eperson for them
|
||||
else
|
||||
{
|
||||
if (ldap.ldapAuthenticate(dn, password, context))
|
||||
{
|
||||
// Register the new user automatically
|
||||
log.info(LogManager.getHeader(context,
|
||||
"autoregister", "netid=" + netid));
|
||||
|
||||
if ((ldap.ldapEmail!=null)&&(!ldap.ldapEmail.equals("")))
|
||||
{
|
||||
try
|
||||
{
|
||||
eperson = EPerson.findByEmail(context, ldap.ldapEmail);
|
||||
if (eperson!=null)
|
||||
{
|
||||
log.info(LogManager.getHeader(context,
|
||||
"type=ldap-login", "type=ldap_but_already_email"));
|
||||
context.setIgnoreAuthorization(true);
|
||||
eperson.setNetid(netid.toLowerCase());
|
||||
eperson.update();
|
||||
context.commit();
|
||||
context.setIgnoreAuthorization(false);
|
||||
context.setCurrentUser(eperson);
|
||||
return SUCCESS;
|
||||
}
|
||||
else
|
||||
{
|
||||
if (canSelfRegister(context, request, netid))
|
||||
{
|
||||
// TEMPORARILY turn off authorisation
|
||||
try
|
||||
{
|
||||
context.setIgnoreAuthorization(true);
|
||||
eperson = EPerson.create(context);
|
||||
if ((ldap.ldapEmail != null) && (!ldap.ldapEmail.equals("")))
|
||||
{
|
||||
eperson.setEmail(ldap.ldapEmail);
|
||||
}
|
||||
else
|
||||
{
|
||||
eperson.setEmail(netid + ConfigurationManager.getProperty("ldap.netid_email_domain"));
|
||||
}
|
||||
if ((ldap.ldapGivenName!=null) && (!ldap.ldapGivenName.equals("")))
|
||||
{
|
||||
eperson.setFirstName(ldap.ldapGivenName);
|
||||
}
|
||||
if ((ldap.ldapSurname!=null) && (!ldap.ldapSurname.equals("")))
|
||||
{
|
||||
eperson.setLastName(ldap.ldapSurname);
|
||||
}
|
||||
if ((ldap.ldapPhone!=null)&&(!ldap.ldapPhone.equals("")))
|
||||
{
|
||||
eperson.setMetadata("phone", ldap.ldapPhone);
|
||||
}
|
||||
eperson.setNetid(netid.toLowerCase());
|
||||
eperson.setCanLogIn(true);
|
||||
AuthenticationManager.initEPerson(context, request, eperson);
|
||||
eperson.update();
|
||||
context.commit();
|
||||
context.setCurrentUser(eperson);
|
||||
}
|
||||
catch (AuthorizeException e)
|
||||
{
|
||||
return NO_SUCH_USER;
|
||||
}
|
||||
finally
|
||||
{
|
||||
context.setIgnoreAuthorization(false);
|
||||
}
|
||||
|
||||
log.info(LogManager.getHeader(context, "authenticate",
|
||||
"type=ldap-login, created ePerson"));
|
||||
return SUCCESS;
|
||||
}
|
||||
else
|
||||
{
|
||||
// No auto-registration for valid certs
|
||||
log.info(LogManager.getHeader(context,
|
||||
"failed_login", "type=ldap_but_no_record"));
|
||||
return NO_SUCH_USER;
|
||||
}
|
||||
}
|
||||
}
|
||||
catch (AuthorizeException e)
|
||||
{
|
||||
eperson = null;
|
||||
}
|
||||
finally
|
||||
{
|
||||
context.setIgnoreAuthorization(false);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return BAD_ARGS;
|
||||
}
|

    /**
     * Internal class to manage the LDAP query and results, mainly
     * because there are multiple values to return.
     */
    private static class SpeakerToLDAP
    {
        private Logger log = null;

        protected String ldapEmail = null;
        protected String ldapGivenName = null;
        protected String ldapSurname = null;
        protected String ldapPhone = null;

        /** LDAP settings */
        String ldap_provider_url = ConfigurationManager.getProperty("ldap.provider_url");
        String ldap_id_field = ConfigurationManager.getProperty("ldap.id_field");
        String ldap_search_context = ConfigurationManager.getProperty("ldap.search_context");
        String ldap_object_context = ConfigurationManager.getProperty("ldap.object_context");
        String ldap_search_scope = ConfigurationManager.getProperty("ldap.search_scope");

        String ldap_email_field = ConfigurationManager.getProperty("ldap.email_field");
        String ldap_givenname_field = ConfigurationManager.getProperty("ldap.givenname_field");
        String ldap_surname_field = ConfigurationManager.getProperty("ldap.surname_field");
        String ldap_phone_field = ConfigurationManager.getProperty("ldap.phone_field");

        SpeakerToLDAP(Logger thelog)
        {
            log = thelog;
        }

        protected String getDNOfUser(String adminUser, String adminPassword, Context context, String netid)
        {
            // The resultant DN
            String resultDN;

            // The search scope to use (default to 0)
            int ldap_search_scope_value = 0;
            if (ldap_search_scope != null)
            {
                try
                {
                    ldap_search_scope_value = Integer.parseInt(ldap_search_scope.trim());
                }
                catch (NumberFormatException e)
                {
                    // Log the error if the scope has been set but is invalid
                    log.warn(LogManager.getHeader(context,
                            "ldap_authentication", "invalid search scope: " + ldap_search_scope));
                }
            }

            // Set up environment for creating the initial context
            Hashtable env = new Hashtable(11);
            env.put(javax.naming.Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(javax.naming.Context.PROVIDER_URL, ldap_provider_url);

            if ((adminUser != null) && (!adminUser.trim().equals("")) &&
                (adminPassword != null) && (!adminPassword.trim().equals("")))
            {
                // Use the admin credentials for the search
                env.put(javax.naming.Context.SECURITY_AUTHENTICATION, "simple");
                env.put(javax.naming.Context.SECURITY_PRINCIPAL, adminUser);
                env.put(javax.naming.Context.SECURITY_CREDENTIALS, adminPassword);
            }
            else
            {
                // Use anonymous authentication
                env.put(javax.naming.Context.SECURITY_AUTHENTICATION, "none");
            }

            DirContext ctx = null;
            try
            {
                // Create initial context
                ctx = new InitialDirContext(env);

                Attributes matchAttrs = new BasicAttributes(true);
                matchAttrs.put(new BasicAttribute(ldap_id_field, netid));

                // look up attributes
                try
                {
                    SearchControls ctrls = new SearchControls();
                    ctrls.setSearchScope(ldap_search_scope_value);

                    NamingEnumeration<SearchResult> answer = ctx.search(
                            ldap_provider_url + ldap_search_context,
                            "(&({0}={1}))", new Object[] { ldap_id_field,
                                    netid }, ctrls);

                    while (answer.hasMoreElements())
                    {
                        SearchResult sr = answer.next();
                        if (StringUtils.isEmpty(ldap_search_context))
                        {
                            resultDN = sr.getName();
                        }
                        else
                        {
                            resultDN = (sr.getName() + "," + ldap_search_context);
                        }

                        String attlist[] = { ldap_email_field, ldap_givenname_field,
                                ldap_surname_field, ldap_phone_field };
                        Attributes atts = sr.getAttributes();
                        Attribute att;

                        if (attlist[0] != null)
                        {
                            att = atts.get(attlist[0]);
                            if (att != null)
                            {
                                ldapEmail = (String) att.get();
                            }
                        }

                        if (attlist[1] != null)
                        {
                            att = atts.get(attlist[1]);
                            if (att != null)
                            {
                                ldapGivenName = (String) att.get();
                            }
                        }

                        if (attlist[2] != null)
                        {
                            att = atts.get(attlist[2]);
                            if (att != null)
                            {
                                ldapSurname = (String) att.get();
                            }
                        }

                        if (attlist[3] != null)
                        {
                            att = atts.get(attlist[3]);
                            if (att != null)
                            {
                                ldapPhone = (String) att.get();
                            }
                        }

                        if (answer.hasMoreElements())
                        {
                            // Oh dear - more than one match.
                            // Ambiguous user; can't continue.
                        }
                        else
                        {
                            log.debug(LogManager.getHeader(context, "got DN", resultDN));
                            return resultDN;
                        }
                    }
                }
                catch (NamingException e)
                {
                    // if the lookup fails, go ahead and create a new record for
                    // them, because the authentication succeeded
                    log.warn(LogManager.getHeader(context,
                            "ldap_attribute_lookup", "type=failed_search "
                                    + e));
                }
            }
            catch (NamingException e)
            {
                log.warn(LogManager.getHeader(context,
                        "ldap_authentication", "type=failed_auth " + e));
            }
            finally
            {
                // Close the context when we're done
                try
                {
                    if (ctx != null)
                    {
                        ctx.close();
                    }
                }
                catch (NamingException e)
                {
                    // Ignore
                }
            }

            // No DN match found
            return null;
        }

        /**
         * Contact the LDAP server and attempt to authenticate.
         */
        protected boolean ldapAuthenticate(String netid, String password,
                Context context)
        {
            if (!password.equals(""))
            {
                // Set up environment for creating the initial context
                Hashtable<String, String> env = new Hashtable<String, String>();
                env.put(javax.naming.Context.INITIAL_CONTEXT_FACTORY,
                        "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(javax.naming.Context.PROVIDER_URL, ldap_provider_url);

                // Authenticate
                env.put(javax.naming.Context.SECURITY_AUTHENTICATION, "simple");
                env.put(javax.naming.Context.SECURITY_PRINCIPAL, netid);
                env.put(javax.naming.Context.SECURITY_CREDENTIALS, password);
                env.put(javax.naming.Context.AUTHORITATIVE, "true");
                env.put(javax.naming.Context.REFERRAL, "follow");

                DirContext ctx = null;
                try
                {
                    // Try to bind
                    ctx = new InitialDirContext(env);
                }
                catch (NamingException e)
                {
                    log.warn(LogManager.getHeader(context,
                            "ldap_authentication", "type=failed_auth " + e));
                    return false;
                }
                finally
                {
                    // Close the context when we're done
                    try
                    {
                        if (ctx != null)
                        {
                            ctx.close();
                        }
                    }
                    catch (NamingException e)
                    {
                        // Ignore
                    }
                }
            }
            else
            {
                return false;
            }

            return true;
        }
    }

    /*
     * Returns the URL to which to redirect to obtain credentials (either a
     * password prompt or e.g. an HTTPS port for a client cert.); null means
     * no redirect.
     *
     * @param context
     *            DSpace context, will be modified (ePerson set) upon success.
     *
     * @param request
     *            The HTTP request that started this operation, or null if not applicable.
     *
     * @param response
     *            The HTTP response from the servlet method.
     *
     * @return fully-qualified URL
     */
    public String loginPageURL(Context context,
                               HttpServletRequest request,
                               HttpServletResponse response)
    {
        return response.encodeRedirectURL(request.getContextPath() +
                "/ldap-login");
    }

    /**
     * Returns the message key for the title of the "login" page, to use
     * in a menu showing the choice of multiple login methods.
     *
     * @param context
     *            DSpace context, will be modified (ePerson set) upon success.
     *
     * @return Message key to look up in the i18n message catalog.
     */
    public String loginPageTitle(Context context)
    {
        return "org.dspace.eperson.LDAPAuthentication.title";
    }
}
@@ -1,439 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authenticate;

import java.sql.SQLException;
import java.util.Collection;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.log4j.Logger;
import org.dspace.authenticate.AuthenticationManager;
import org.dspace.authenticate.AuthenticationMethod;
import org.dspace.authorize.AuthorizeException;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.dspace.core.LogManager;
import org.dspace.eperson.EPerson;
import org.dspace.eperson.Group;

/**
 * Shibboleth authentication for DSpace, tested on Shibboleth 1.3.x and
 * Shibboleth 2.x. Read <a href=
 * "https://mams.melcoe.mq.edu.au/zope/mams/pubs/Installation/dspace15/view"
 * >Shib DSpace 1.5</a> for the installation procedure. Read dspace.cfg for
 * details on the options available.
 *
 * @author <a href="mailto:bliong@melcoe.mq.edu.au">Bruc Liong, MELCOE</a>
 * @author <a href="mailto:kli@melcoe.mq.edu.au">Xiang Kevin Li, MELCOE</a>
 * @version $Revision$
 */
public class ShibAuthentication implements AuthenticationMethod
{
    /** log4j category */
    private static Logger log = Logger.getLogger(ShibAuthentication.class);

    public int authenticate(Context context, String username, String password,
            String realm, HttpServletRequest request) throws SQLException
    {
        if (request == null)
        {
            return BAD_ARGS;
        }
        log.info("Shibboleth login started...");

        java.util.Enumeration names = request.getHeaderNames();
        String name;
        while (names.hasMoreElements())
        {
            name = names.nextElement().toString();
            log.debug("header:" + name + "=" + request.getHeader(name));
        }

        boolean isUsingTomcatUser = ConfigurationManager.getBooleanProperty("authentication.shib.email-use-tomcat-remote-user");
        String emailHeader = ConfigurationManager.getProperty("authentication.shib.email-header");
        String fnameHeader = ConfigurationManager.getProperty("authentication.shib.firstname-header");
        String lnameHeader = ConfigurationManager.getProperty("authentication.shib.lastname-header");

        String email = null;
        String fname = null;
        String lname = null;

        if (emailHeader != null)
        {
            // try to grab the email from the header
            email = request.getHeader(emailHeader);

            // on failure, try lower case
            if (email == null)
            {
                email = request.getHeader(emailHeader.toLowerCase());
            }
        }

        // try to pull the "REMOTE_USER" info instead of the header
        if (email == null && isUsingTomcatUser)
        {
            email = request.getRemoteUser();
            log.info("RemoteUser identified as: " + email);
        }

        // No email address; perhaps the eperson has already been set up, so check
        if (email == null)
        {
            EPerson p = context.getCurrentUser();
            if (p != null)
            {
                email = p.getEmail();
            }
        }

        if (email == null)
        {
            log.error("No email is given; access denied by Shib. Please release an email address.");
            return AuthenticationMethod.BAD_ARGS;
        }

        email = email.toLowerCase();

        if (fnameHeader != null)
        {
            // try to grab the first name from the header
            fname = request.getHeader(fnameHeader);

            // on failure, try lower case
            if (fname == null)
            {
                fname = request.getHeader(fnameHeader.toLowerCase());
            }
        }
        if (lnameHeader != null)
        {
            // try to grab the last name from the header
            lname = request.getHeader(lnameHeader);

            // on failure, try lower case
            if (lname == null)
            {
                lname = request.getHeader(lnameHeader.toLowerCase());
            }
        }

        // a future version could offer an auto-update feature; this needs
        // testing before inclusion in the core code

        EPerson eperson = null;
        try
        {
            eperson = EPerson.findByEmail(context, email);
            context.setCurrentUser(eperson);
        }
        catch (AuthorizeException e)
        {
            log.warn("Failed to locate user with email:" + email, e);
            eperson = null;
        }

        // auto-create the user if needed
        if (eperson == null
                && ConfigurationManager
                        .getBooleanProperty("authentication.shib.autoregister"))
        {
            log.info(LogManager.getHeader(context, "autoregister", "email="
                    + email));

            // TEMPORARILY turn off authorisation
            context.setIgnoreAuthorization(true);
            try
            {
                eperson = EPerson.create(context);
                eperson.setEmail(email);
                if (fname != null)
                {
                    eperson.setFirstName(fname);
                }
                if (lname != null)
                {
                    eperson.setLastName(lname);
                }
                eperson.setCanLogIn(true);
                AuthenticationManager.initEPerson(context, request, eperson);
                eperson.update();
                context.commit();
                context.setCurrentUser(eperson);
            }
            catch (AuthorizeException e)
            {
                log.warn("Failed to authorize user with email:" + email, e);
                eperson = null;
            }
            finally
            {
                context.setIgnoreAuthorization(false);
            }
        }

        if (eperson == null)
        {
            return AuthenticationMethod.NO_SUCH_USER;
        }
        else
        {
            // the person exists, just return OK
            context.setCurrentUser(eperson);
            request.getSession().setAttribute("shib.authenticated",
                    Boolean.TRUE);
        }

        return AuthenticationMethod.SUCCESS;
    }

    /**
     * Grab the special groups to be automatically provisioned for the current
     * user. Currently the mapping of groups is done one-to-one; a future
     * version could consider the use of regexes for such mapping.
     */
    public int[] getSpecialGroups(Context context, HttpServletRequest request)
    {
        // no user logged in, or user not logged in via Shibboleth
        if (request == null || context.getCurrentUser() == null
                || request.getSession().getAttribute("shib.authenticated") == null)
        {
            return new int[0];
        }

        if (request.getSession().getAttribute("shib.specialgroup") != null)
        {
            return (int[]) request.getSession().getAttribute(
                    "shib.specialgroup");
        }

        java.util.Set groups = new java.util.HashSet();
        String roleHeader = ConfigurationManager
                .getProperty("authentication.shib.role-header");
        boolean roleHeader_ignoreScope = ConfigurationManager
                .getBooleanProperty("authentication.shib.role-header.ignore-scope");
        if (roleHeader == null || roleHeader.trim().length() == 0)
        {
            // fall back to the default
            roleHeader = "Shib-EP-UnscopedAffiliation";
        }
        String affiliations = request.getHeader(roleHeader);

        // try again with all lower case... maybe we'll have better luck
        if (affiliations == null)
        {
            affiliations = request.getHeader(roleHeader.toLowerCase());
        }

        // default roles when fully authenticated but not releasing any roles?
        String defaultRoles = ConfigurationManager
                .getProperty("authentication.shib.default-roles");
        if (affiliations == null && defaultRoles != null)
        {
            affiliations = defaultRoles;
        }

        if (affiliations != null)
        {
            java.util.StringTokenizer st = new java.util.StringTokenizer(
                    affiliations, ";,");
            // do the mapping here
            while (st.hasMoreTokens())
            {
                String affiliation = st.nextToken().trim();

                // strip the scope if present and roleHeader_ignoreScope is set
                if (roleHeader_ignoreScope)
                {
                    int index = affiliation.indexOf('@');
                    if (index != -1)
                    {
                        affiliation = affiliation.substring(0, index);
                    }
                }

                // perform the mapping here if necessary
                String groupLabels = ConfigurationManager
                        .getProperty("authentication.shib.role." + affiliation);
                if (groupLabels == null || groupLabels.trim().length() == 0)
                {
                    groupLabels = ConfigurationManager
                            .getProperty("authentication.shib.role."
                                    + affiliation.toLowerCase());
                }

                // revert to the original entry when no mapping is provided
                if (groupLabels == null)
                {
                    groupLabels = affiliation;
                }

                String[] labels = groupLabels.split(",");
                for (int i = 0; i < labels.length; i++)
                {
                    addGroup(groups, context, labels[i].trim());
                }
            }
        }

        int ids[] = new int[groups.size()];
        java.util.Iterator it = groups.iterator();
        for (int i = 0; it.hasNext(); i++)
        {
            ids[i] = ((Integer) it.next()).intValue();
        }

        // store the special groups once transformed from the headers, since a
        // subsequent request may not carry the values any more
        if (ids.length != 0)
        {
            request.getSession().setAttribute("shib.specialgroup", ids);
        }

        return ids;
    }

    /** Find dspaceGroup in the DSpace database; if found, add it to groups. */
    private void addGroup(Collection groups, Context context, String dspaceGroup)
    {
        try
        {
            Group g = Group.findByName(context, dspaceGroup);
            if (g == null)
            {
                // oops - no group defined
                log.warn(LogManager.getHeader(context, dspaceGroup
                        + " group is not found! Admin needs to create one!",
                        "requiredGroup=" + dspaceGroup));
                groups.add(Integer.valueOf(0));
            }
            else
            {
                groups.add(Integer.valueOf(g.getID()));
            }
            log.info("Mapping group: " + dspaceGroup + " to groupID: "
                    + (g == null ? 0 : g.getID()));
        }
        catch (SQLException e)
        {
            log.error("Mapping group:" + dspaceGroup + " failed with error", e);
        }
    }

    /**
     * Indicate whether or not a particular self-registering user can set
     * themselves a password in the profile info form.
     *
     * @param context
     *            DSpace context
     * @param request
     *            HTTP request, in case anything in it is used to decide
     * @param email
     *            e-mail address of the user attempting to register
     */
    public boolean allowSetPassword(Context context,
            HttpServletRequest request, String email) throws SQLException
    {
        // don't use a password at all
        return false;
    }

    /**
     * Predicate: is this an implicit authentication method? An implicit method
     * gets credentials from the environment (such as an HTTP request or even
     * Java system properties) rather than from an explicit username and
     * password. For example, a method that reads the X.509 certificates in an
     * HTTPS request is implicit.
     *
     * @return true if this method uses implicit authentication.
     */
    public boolean isImplicit()
    {
        return true;
    }

    /**
     * Indicate whether or not a particular user can self-register, based on
     * e-mail address.
     *
     * @param context
     *            DSpace context
     * @param request
     *            HTTP request, in case anything in it is used to decide
     * @param username
     *            e-mail address of the user attempting to register
     */
    public boolean canSelfRegister(Context context, HttpServletRequest request,
            String username) throws SQLException
    {
        return true;
    }

    /**
     * Initialise a new e-person record for a self-registered new user.
     *
     * @param context
     *            DSpace context
     * @param request
     *            HTTP request, in case it's needed
     * @param eperson
     *            newly created EPerson record - email + information from the
     *            registration form will have been filled out.
     */
    public void initEPerson(Context context, HttpServletRequest request,
            EPerson eperson) throws SQLException
    {
    }

    /**
     * Get the login page to which to redirect. Returns the URL (as a string)
     * to which to redirect to obtain credentials (either a password prompt or
     * e.g. an HTTPS port for a client cert.); null means no redirect.
     *
     * @param context
     *            DSpace context, will be modified (ePerson set) upon success.
     *
     * @param request
     *            The HTTP request that started this operation, or null if not
     *            applicable.
     *
     * @param response
     *            The HTTP response from the servlet method.
     *
     * @return fully-qualified URL or null
     */
    public String loginPageURL(Context context, HttpServletRequest request,
            HttpServletResponse response)
    {
        return response.encodeRedirectURL(request.getContextPath()
                + "/shibboleth-login");
    }

    /**
     * Get the title of the login page to which to redirect. Returns a
     * <i>message key</i> that gets translated into the title or label for the
     * "login page" (or null, if not implemented). This title may be used to
     * identify the link to the login page in a selection menu, when there are
     * multiple ways to login.
     *
     * @param context
     *            DSpace context, will be modified (ePerson set) upon success.
     *
     * @return title text.
     */
    public String loginPageTitle(Context context)
    {
        return "org.dspace.authenticate.ShibAuthentication.title";
    }
}
@@ -1,707 +0,0 @@
||||
/**
|
||||
* The contents of this file are subject to the license and copyright
|
||||
* detailed in the LICENSE and NOTICE files at the root of the source
|
||||
* tree and available online at
|
||||
*
|
||||
* http://www.dspace.org/license/
|
||||
*/
|
||||
package org.dspace.authenticate;
|
||||
|
||||
import java.io.BufferedInputStream;
|
||||
import java.io.FileInputStream;
|
||||
import java.io.IOException;
|
||||
import java.io.InputStream;
|
||||
import java.security.GeneralSecurityException;
|
||||
import java.security.KeyStore;
|
||||
import java.security.Principal;
|
||||
import java.security.PublicKey;
|
||||
import java.security.cert.Certificate;
|
||||
import java.security.cert.CertificateException;
|
||||
import java.security.cert.CertificateFactory;
|
||||
import java.security.cert.X509Certificate;
|
||||
import java.sql.SQLException;
|
||||
import java.util.ArrayList;
|
||||
import java.util.Enumeration;
|
||||
import java.util.List;
|
||||
import java.util.StringTokenizer;
|
||||
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
import javax.servlet.http.HttpServletResponse;
|
||||
import javax.servlet.http.HttpSession;
|
||||
|
||||
import org.apache.log4j.Logger;
|
||||
import org.dspace.authenticate.AuthenticationMethod;
|
||||
import org.dspace.authenticate.AuthenticationManager;
|
||||
import org.dspace.authorize.AuthorizeException;
|
||||
import org.dspace.core.ConfigurationManager;
|
||||
import org.dspace.core.Context;
|
||||
import org.dspace.core.LogManager;
|
||||
import org.dspace.eperson.EPerson;
|
||||
import org.dspace.eperson.Group;
|
||||
|
||||
/**
|
||||
* Implicit authentication method that gets credentials from the X.509 client
|
||||
* certificate supplied by the HTTPS client when connecting to this server. The
|
||||
* email address in that certificate is taken as the authenticated user name
|
||||
* with no further checking, so be sure your HTTP server (e.g. Tomcat) is
 * configured correctly to accept only client certificates it can validate.
 * <p>
 * See the <code>AuthenticationMethod</code> interface for more details.
 * <p>
 * <b>Configuration:</b>
 *
 * <pre>
 * authentication.x509.keystore.path = <em>path to Java keystore file</em>
 * authentication.x509.keystore.password = <em>password to access the keystore</em>
 * authentication.x509.ca.cert = <em>path to certificate file for CA whose client certs to accept</em>
 * authentication.x509.autoregister = <em>"true" if E-Person is created automatically for unknown new users</em>
 * authentication.x509.groups = <em>comma-delimited list of special groups to add user to if authenticated</em>
 * authentication.x509.emaildomain = <em>email address domain (after the 'at' symbol) to match before allowing membership in special groups</em>
 * </pre>
 *
 * Only one of the "<code>keystore.path</code>" or "<code>ca.cert</code>"
 * options is required. If you supply a keystore, then all of the "trusted"
 * certificates in the keystore represent CAs whose client certificates will be
 * accepted. The <code>ca.cert</code> option only allows a single CA to be
 * named.
 * <p>
 * You can configure <em>both</em> a keystore and a CA cert, and both will be
 * used.
 * <p>
 * The <code>autoregister</code> configuration parameter determines what the
 * <code>canSelfRegister()</code> method returns. It also allows an EPerson
 * record to be created automatically when the presented certificate is
 * acceptable but there is no corresponding EPerson.
 *
 * @author Larry Stone
 * @version $Revision$
 */
public class X509Authentication implements AuthenticationMethod
{
    /** log4j category */
    private static Logger log = Logger.getLogger(X509Authentication.class);

    /** public key of CA to check client certs against. */
    private static PublicKey caPublicKey = null;

    /** key store for CA certs if we use that */
    private static KeyStore caCertKeyStore = null;

    private static String loginPageTitle = null;

    private static String loginPageURL = null;

    /**
     * Initialization: Set caPublicKey and/or keystore. This loads the
     * information needed to check if a client cert presented is valid and
     * acceptable.
     */
    static
    {
        /*
         * allow identification of alternative entry points for certificate
         * authentication when selected by the user rather than implicitly.
         */
        loginPageTitle = ConfigurationManager
                .getProperty("authentication.x509.chooser.title.key");
        loginPageURL = ConfigurationManager
                .getProperty("authentication.x509.chooser.uri");

        String keystorePath = ConfigurationManager
                .getProperty("authentication.x509.keystore.path");
        String keystorePassword = ConfigurationManager
                .getProperty("authentication.x509.keystore.password");
        String caCertPath = ConfigurationManager
                .getProperty("authentication.x509.ca.cert");

        // backward-compatible kludge
        if (caCertPath == null)
        {
            caCertPath = ConfigurationManager.getProperty("webui.cert.ca");
        }

        // First look for a keystore full of trusted certs.
        if (keystorePath != null)
        {
            FileInputStream fis = null;
            if (keystorePassword == null)
            {
                keystorePassword = "";
            }
            try
            {
                KeyStore ks = KeyStore.getInstance("JKS");
                fis = new FileInputStream(keystorePath);
                ks.load(fis, keystorePassword.toCharArray());
                caCertKeyStore = ks;
            }
            catch (IOException e)
            {
                log.error("X509Authentication: Failed to load CA keystore, file="
                        + keystorePath + ", error=" + e.toString());
            }
            catch (GeneralSecurityException e)
            {
                log.error("X509Authentication: Failed to extract CA keystore, file="
                        + keystorePath + ", error=" + e.toString());
            }
            finally
            {
                if (fis != null)
                {
                    try
                    {
                        fis.close();
                    }
                    catch (IOException ioe)
                    {
                        // ignore failure to close the keystore stream
                    }
                }
            }
        }

        // Second, try getting the public key out of a CA cert, if that's configured.
        if (caCertPath != null)
        {
            InputStream is = null;
            FileInputStream fis = null;
            try
            {
                fis = new FileInputStream(caCertPath);
                is = new BufferedInputStream(fis);
                X509Certificate cert = (X509Certificate) CertificateFactory
                        .getInstance("X.509").generateCertificate(is);
                if (cert != null)
                {
                    caPublicKey = cert.getPublicKey();
                }
            }
            catch (IOException e)
            {
                log.error("X509Authentication: Failed to load CA cert, file="
                        + caCertPath + ", error=" + e.toString());
            }
            catch (CertificateException e)
            {
                log.error("X509Authentication: Failed to extract CA cert, file="
                        + caCertPath + ", error=" + e.toString());
            }
            finally
            {
                if (is != null)
                {
                    try
                    {
                        is.close();
                    }
                    catch (IOException ioe)
                    {
                        // ignore failure to close the cert stream
                    }
                }

                if (fis != null)
                {
                    try
                    {
                        fis.close();
                    }
                    catch (IOException ioe)
                    {
                        // ignore failure to close the cert stream
                    }
                }
            }
        }
    }

    /**
     * Return the email address from <code>certificate</code>, or null if an
     * email address cannot be found in the certificate.
     * <p>
     * Note that the certificate parsing has only been tested with certificates
     * granted by the MIT Certification Authority, and may not work elsewhere.
     *
     * @param certificate -
     *            An X509 certificate object
     * @return - The email address found in certificate, or null if an email
     *         address cannot be found in the certificate.
     */
    private static String getEmail(X509Certificate certificate)
            throws SQLException
    {
        Principal principal = certificate.getSubjectDN();

        if (principal == null)
        {
            return null;
        }

        String dn = principal.getName();
        if (dn == null)
        {
            return null;
        }

        StringTokenizer tokenizer = new StringTokenizer(dn, ",");
        String token = null;
        while (tokenizer.hasMoreTokens())
        {
            int len = "emailaddress=".length();

            token = tokenizer.nextToken();

            if (token.toLowerCase().startsWith("emailaddress="))
            {
                // Make sure the token actually contains something
                if (token.length() <= len)
                {
                    return null;
                }

                return token.substring(len).toLowerCase();
            }
        }

        return null;
    }

    /**
     * Verify CERTIFICATE. Return true if and only if CERTIFICATE is valid and
     * can be verified against the configured CA public key or keystore.
     *
     * @param context -
     *            The current DSpace context
     * @param certificate -
     *            An X509 certificate object
     * @return - True if CERTIFICATE is valid and can be verified against a
     *         configured CA, false otherwise.
     */
    private static boolean isValid(Context context, X509Certificate certificate)
    {
        if (certificate == null)
        {
            return false;
        }

        // This checks that the current time is within the cert's validity window:
        try
        {
            certificate.checkValidity();
        }
        catch (CertificateException e)
        {
            log.info(LogManager.getHeader(context, "authentication",
                    "X.509 Certificate is EXPIRED or PREMATURE: "
                            + e.toString()));
            return false;
        }

        // Try the CA public key, if available.
        if (caPublicKey != null)
        {
            try
            {
                certificate.verify(caPublicKey);
                return true;
            }
            catch (GeneralSecurityException e)
            {
                log.info(LogManager.getHeader(context, "authentication",
                        "X.509 Certificate FAILED SIGNATURE check: "
                                + e.toString()));
            }
        }

        // Try it with the keystore, if available.
        if (caCertKeyStore != null)
        {
            try
            {
                Enumeration ke = caCertKeyStore.aliases();

                while (ke.hasMoreElements())
                {
                    String alias = (String) ke.nextElement();
                    if (caCertKeyStore.isCertificateEntry(alias))
                    {
                        Certificate ca = caCertKeyStore.getCertificate(alias);
                        try
                        {
                            certificate.verify(ca.getPublicKey());
                            return true;
                        }
                        catch (CertificateException ce)
                        {
                            // try the next trusted certificate
                        }
                    }
                }
                log.info(LogManager.getHeader(context, "authentication",
                        "Keystore method FAILED SIGNATURE check on client cert."));
            }
            catch (GeneralSecurityException e)
            {
                log.info(LogManager.getHeader(context, "authentication",
                        "X.509 Certificate FAILED SIGNATURE check: "
                                + e.toString()));
            }
        }
        return false;
    }

    /**
     * Predicate: can a new user have an EPerson record created automatically?
     * Checks the configuration value. You'll probably want this to be true to
     * take advantage of a Web certificate infrastructure with many more users
     * than are already known by DSpace.
     */
    public boolean canSelfRegister(Context context, HttpServletRequest request,
            String username) throws SQLException
    {
        return ConfigurationManager
                .getBooleanProperty("authentication.x509.autoregister");
    }

    /**
     * Nothing extra to initialize.
     */
    public void initEPerson(Context context, HttpServletRequest request,
            EPerson eperson) throws SQLException
    {
    }

    /**
     * We don't use the EPerson password, so there is no reason to change it.
     */
    public boolean allowSetPassword(Context context,
            HttpServletRequest request, String username) throws SQLException
    {
        return false;
    }

    /**
     * Returns true, this is an implicit method.
     */
    public boolean isImplicit()
    {
        return true;
    }

    /**
     * Returns a list of group names that the user should be added to upon
     * successful authentication, configured in dspace.cfg.
     *
     * @return List of special group names configured for this authenticator
     */
    private List<String> getX509Groups()
    {
        List<String> groupNames = new ArrayList<String>();

        String x509GroupConfig = ConfigurationManager
                .getProperty("authentication.x509.groups");

        if (null != x509GroupConfig && !x509GroupConfig.equals(""))
        {
            String[] groups = x509GroupConfig.split("\\s*,\\s*");

            for (int i = 0; i < groups.length; i++)
            {
                groupNames.add(groups[i].trim());
            }
        }

        return groupNames;
    }

    /**
     * Checks for the configured email domain required to grant special groups
     * membership. If no email domain is configured to verify, special group
     * membership is simply granted.
     *
     * @param request -
     *            The current request object
     * @param email -
     *            The email address from the x509 certificate
     */
    private void setSpecialGroupsFlag(HttpServletRequest request, String email)
    {
        String emailDomain = (String) request
                .getAttribute("authentication.x509.emaildomain");

        HttpSession session = request.getSession(true);

        if (null != emailDomain && !emailDomain.equals(""))
        {
            if (email.substring(email.length() - emailDomain.length()).equals(
                    emailDomain))
            {
                session.setAttribute("x509Auth", Boolean.TRUE);
            }
        }
        else
        {
            // No configured email domain to verify. Just flag
            // as authenticated so special groups are granted.
            session.setAttribute("x509Auth", Boolean.TRUE);
        }
    }

    /**
     * Return special groups configured in dspace.cfg for X509 certificate
     * authentication.
     *
     * @param context
     * @param request
     *            object potentially containing the cert
     *
     * @return An int array of group IDs
     */
    public int[] getSpecialGroups(Context context, HttpServletRequest request)
            throws SQLException
    {
        if (request == null)
        {
            return new int[0];
        }

        // getSession(false) returns null if no session exists yet.
        HttpSession session = request.getSession(false);
        if (session == null)
        {
            return new int[0];
        }

        Boolean authenticated = (Boolean) session.getAttribute("x509Auth");
        authenticated = (null == authenticated) ? Boolean.FALSE : authenticated;

        if (authenticated)
        {
            List<String> groupNames = getX509Groups();
            List<Integer> groupIDs = new ArrayList<Integer>();

            if (groupNames != null)
            {
                for (String groupName : groupNames)
                {
                    if (groupName != null)
                    {
                        Group group = Group.findByName(context, groupName);
                        if (group != null)
                        {
                            groupIDs.add(Integer.valueOf(group.getID()));
                        }
                        else
                        {
                            log.warn(LogManager.getHeader(context,
                                    "configuration_error", "unknown_group="
                                            + groupName));
                        }
                    }
                }
            }

            int[] results = new int[groupIDs.size()];
            for (int i = 0; i < groupIDs.size(); i++)
            {
                results[i] = (groupIDs.get(i)).intValue();
            }

            if (log.isDebugEnabled())
            {
                StringBuffer gsb = new StringBuffer();

                for (int i = 0; i < results.length; i++)
                {
                    if (i > 0)
                    {
                        gsb.append(",");
                    }
                    gsb.append(results[i]);
                }

                log.debug(LogManager.getHeader(context, "authenticated",
                        "special_groups=" + gsb.toString()));
            }

            return results;
        }

        return new int[0];
    }

    /**
     * X509 certificate authentication. The client certificate is obtained from
     * the <code>ServletRequest</code> object.
     * <ul>
     * <li>If the certificate is valid, and corresponds to an existing EPerson,
     * and the user is allowed to log in, return success.</li>
     * <li>If the user is matched but is not allowed to log in, it fails.</li>
     * <li>If the certificate is valid, but there is no corresponding EPerson,
     * the <code>"authentication.x509.autoregister"</code> configuration
     * parameter is checked (via <code>canSelfRegister()</code>):
     * <ul>
     * <li>If it's true, a new EPerson record is created for the certificate,
     * and the result is success.</li>
     * <li>If it's false, return that the user was unknown.</li>
     * </ul>
     * </li>
     * </ul>
     *
     * @return One of: SUCCESS, BAD_CREDENTIALS, NO_SUCH_USER, BAD_ARGS
     */
    public int authenticate(Context context, String username, String password,
            String realm, HttpServletRequest request) throws SQLException
    {
        // Obtain the certificate from the request, if any
        X509Certificate[] certs = null;
        if (request != null)
        {
            certs = (X509Certificate[]) request
                    .getAttribute("javax.servlet.request.X509Certificate");
        }

        if ((certs == null) || (certs.length == 0))
        {
            return BAD_ARGS;
        }
        else
        {
            // We have a cert -- check it and get the username from it.
            try
            {
                if (!isValid(context, certs[0]))
                {
                    log.warn(LogManager.getHeader(context, "authenticate",
                            "type=x509certificate, status=BAD_CREDENTIALS (not valid)"));
                    return BAD_CREDENTIALS;
                }

                // And it's valid - try and get an e-person
                String email = getEmail(certs[0]);
                EPerson eperson = null;
                if (email != null)
                {
                    eperson = EPerson.findByEmail(context, email);
                }
                if (eperson == null)
                {
                    // Cert is valid, but no record.
                    if (email != null
                            && canSelfRegister(context, request, null))
                    {
                        // Register the new user automatically
                        log.info(LogManager.getHeader(context, "autoregister",
                                "from=x.509, email=" + email));

                        // TEMPORARILY turn off authorisation
                        context.setIgnoreAuthorization(true);
                        eperson = EPerson.create(context);
                        eperson.setEmail(email);
                        eperson.setCanLogIn(true);
                        AuthenticationManager.initEPerson(context, request,
                                eperson);
                        eperson.update();
                        context.commit();
                        context.setIgnoreAuthorization(false);
                        context.setCurrentUser(eperson);
                        setSpecialGroupsFlag(request, email);
                        return SUCCESS;
                    }
                    else
                    {
                        // No auto-registration for valid certs
                        log.warn(LogManager.getHeader(context, "authenticate",
                                "type=cert_but_no_record, cannot auto-register"));
                        return NO_SUCH_USER;
                    }
                }

                // make sure this is a login account
                else if (!eperson.canLogIn())
                {
                    log.warn(LogManager.getHeader(context, "authenticate",
                            "type=x509certificate, email=" + email
                                    + ", canLogIn=false, rejecting."));
                    return BAD_ARGS;
                }

                else
                {
                    log.info(LogManager.getHeader(context, "login",
                            "type=x509certificate"));
                    context.setCurrentUser(eperson);
                    setSpecialGroupsFlag(request, email);
                    return SUCCESS;
                }
            }
            catch (AuthorizeException ce)
            {
                log.warn(LogManager.getHeader(context, "authorize_exception",
                        ""), ce);
            }

            return BAD_ARGS;
        }
    }

    /**
     * Returns the URL of the login page for this method: the alternative
     * certificate entry point configured in
     * <code>authentication.x509.chooser.uri</code>, or null if none is
     * configured.
     *
     * @param context
     *            DSpace context, will be modified (EPerson set) upon success.
     *
     * @param request
     *            The HTTP request that started this operation, or null if not
     *            applicable.
     *
     * @param response
     *            The HTTP response from the servlet method.
     *
     * @return fully-qualified URL
     */
    public String loginPageURL(Context context, HttpServletRequest request,
            HttpServletResponse response)
    {
        return loginPageURL;
    }

    /**
     * Returns the message key for the title of the "login" page, to use in a
     * menu showing the choice of multiple login methods.
     *
     * @param context
     *            DSpace context, will be modified (EPerson set) upon success.
     *
     * @return Message key to look up in the i18n message catalog.
     */
    public String loginPageTitle(Context context)
    {
        return loginPageTitle;
    }
}
@@ -1,22 +0,0 @@
<!--

    The contents of this file are subject to the license and copyright
    detailed in the LICENSE and NOTICE files at the root of the source
    tree and available online at

    http://www.dspace.org/license/

-->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<html>
<head>
<!--
  Author: Robert Tansley
  Version: $Revision$
  Date: $Date$
-->
</head>
<body bgcolor="white">
End-user authentication manager, interface and implementations.
</body>
</html>
@@ -1,526 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authorize;

import org.dspace.core.ConfigurationManager;

/**
 * This class provides access to the configuration of the Authorization
 * System.
 *
 * @author bollini
 */
public class AuthorizeConfiguration
{
    private static boolean can_communityAdmin_group = ConfigurationManager
            .getBooleanProperty("core.authorization.community-admin.group",
                    true);

    // subcommunities and collections
    private static boolean can_communityAdmin_createSubelement = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.create-subelement",
                    true);

    private static boolean can_communityAdmin_deleteSubelement = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.delete-subelement",
                    true);

    private static boolean can_communityAdmin_policies = ConfigurationManager
            .getBooleanProperty("core.authorization.community-admin.policies",
                    true);

    private static boolean can_communityAdmin_adminGroup = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.admin-group", true);

    private static boolean can_communityAdmin_collectionPolicies = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.collection.policies",
                    true);

    private static boolean can_communityAdmin_collectionTemplateItem = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.collection.template-item",
                    true);

    private static boolean can_communityAdmin_collectionSubmitters = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.collection.submitters",
                    true);

    private static boolean can_communityAdmin_collectionWorkflows = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.collection.workflows",
                    true);

    private static boolean can_communityAdmin_collectionAdminGroup = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.collection.admin-group",
                    true);

    private static boolean can_communityAdmin_itemDelete = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item.delete", true);

    private static boolean can_communityAdmin_itemWithdraw = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item.withdraw", true);

    private static boolean can_communityAdmin_itemReinstatiate = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item.reinstatiate",
                    true);

    private static boolean can_communityAdmin_itemPolicies = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item.policies", true);

    // also bundle
    private static boolean can_communityAdmin_itemCreateBitstream = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item.create-bitstream",
                    true);

    private static boolean can_communityAdmin_itemDeleteBitstream = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item.delete-bitstream",
                    true);

    private static boolean can_communityAdmin_itemAdminccLicense = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.community-admin.item-admin.cc-license",
                    true);

    // COLLECTION ADMIN
    private static boolean can_collectionAdmin_policies = ConfigurationManager
            .getBooleanProperty("core.authorization.collection-admin.policies",
                    true);

    private static boolean can_collectionAdmin_templateItem = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.template-item", true);

    private static boolean can_collectionAdmin_submitters = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.submitters", true);

    private static boolean can_collectionAdmin_workflows = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.workflows", true);

    private static boolean can_collectionAdmin_adminGroup = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.admin-group", true);

    private static boolean can_collectionAdmin_itemDelete = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item.delete", true);

    private static boolean can_collectionAdmin_itemWithdraw = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item.withdraw", true);

    private static boolean can_collectionAdmin_itemReinstatiate = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item.reinstatiate",
                    true);

    private static boolean can_collectionAdmin_itemPolicies = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item.policies", true);

    // also bundle
    private static boolean can_collectionAdmin_itemCreateBitstream = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item.create-bitstream",
                    true);

    private static boolean can_collectionAdmin_itemDeleteBitstream = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item.delete-bitstream",
                    true);

    private static boolean can_collectionAdmin_itemAdminccLicense = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.collection-admin.item-admin.cc-license",
                    true);

    // ITEM ADMIN
    private static boolean can_itemAdmin_policies = ConfigurationManager
            .getBooleanProperty("core.authorization.item-admin.policies", true);

    // also bundle
    private static boolean can_itemAdmin_createBitstream = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.item-admin.create-bitstream", true);

    private static boolean can_itemAdmin_deleteBitstream = ConfigurationManager
            .getBooleanProperty(
                    "core.authorization.item-admin.delete-bitstream", true);

    private static boolean can_itemAdmin_ccLicense = ConfigurationManager
            .getBooleanProperty("core.authorization.item-admin.cc-license",
                    true);

    /**
     * Are community admins allowed to create new groups not strictly related
     * to a community?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformGroupCreation()
    {
        return can_communityAdmin_group;
    }

    /**
     * Are community admins allowed to create collections or subcommunities?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformSubelementCreation()
    {
        return can_communityAdmin_createSubelement;
    }

    /**
     * Are community admins allowed to remove collections or subcommunities?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformSubelementDeletion()
    {
        return can_communityAdmin_deleteSubelement;
    }

    /**
     * Are community admins allowed to manage the community's and
     * subcommunities' policies?
     *
     * @return
     */
    public static boolean canCommunityAdminManagePolicies()
    {
        return can_communityAdmin_policies;
    }

    /**
     * Are community admins allowed to create/edit the community's and
     * subcommunities' admin groups?
     *
     * @return
     */
    public static boolean canCommunityAdminManageAdminGroup()
    {
        return can_communityAdmin_adminGroup;
    }

    /**
     * Are community admins allowed to manage the policies of their
     * collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageCollectionPolicies()
    {
        return can_communityAdmin_collectionPolicies;
    }

    /**
     * Are community admins allowed to manage the item template of their
     * collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageCollectionTemplateItem()
    {
        return can_communityAdmin_collectionTemplateItem;
    }

    /**
     * Are community admins allowed to manage (create/edit/remove) the
     * submitters group of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageCollectionSubmitters()
    {
        return can_communityAdmin_collectionSubmitters;
    }

    /**
     * Are community admins allowed to manage (create/edit/remove) the
     * workflows group of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageCollectionWorkflows()
    {
        return can_communityAdmin_collectionWorkflows;
    }

    /**
     * Are community admins allowed to manage (create/edit/remove) the admin
     * group of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageCollectionAdminGroup()
    {
        return can_communityAdmin_collectionAdminGroup;
    }

    /**
     * Are community admins allowed to remove an item from their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformItemDeletion()
    {
        return can_communityAdmin_itemDelete;
    }

    /**
     * Are community admins allowed to withdraw an item from their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformItemWithdrawn()
    {
        return can_communityAdmin_itemWithdraw;
    }

    /**
     * Are community admins allowed to reinstate an item in their
     * collections?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformItemReinstatiate()
    {
        return can_communityAdmin_itemReinstatiate;
    }

    /**
     * Are community admins allowed to manage the policies of an item owned by
     * one of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageItemPolicies()
    {
        return can_communityAdmin_itemPolicies;
    }

    /**
     * Are community admins allowed to add a bitstream to an item owned by one
     * of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformBitstreamCreation()
    {
        return can_communityAdmin_itemCreateBitstream;
    }

    /**
     * Are community admins allowed to remove a bitstream from an item owned by
     * one of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminPerformBitstreamDeletion()
    {
        return can_communityAdmin_itemDeleteBitstream;
    }

    /**
     * Are community admins allowed to replace or add a CC License on an item
     * owned by one of their collections?
     *
     * @return
     */
    public static boolean canCommunityAdminManageCCLicense()
    {
        return can_communityAdmin_itemAdminccLicense;
    }

    /**
     * Are collection admins allowed to manage the collection's policies?
     *
     * @return
     */
    public static boolean canCollectionAdminManagePolicies()
    {
        return can_collectionAdmin_policies;
    }

    /**
     * Are collection admins allowed to manage (create/edit/delete) the
     * collection's item template?
     *
     * @return
     */
    public static boolean canCollectionAdminManageTemplateItem()
    {
        return can_collectionAdmin_templateItem;
    }

    /**
     * Are collection admins allowed to manage (create/edit/delete) the
     * collection's submitters group?
     *
     * @return
     */
    public static boolean canCollectionAdminManageSubmitters()
    {
        return can_collectionAdmin_submitters;
    }

    /**
     * Are collection admins allowed to manage (create/edit/delete) the
     * collection's workflows group?
     *
     * @return
     */
    public static boolean canCollectionAdminManageWorkflows()
    {
        return can_collectionAdmin_workflows;
    }

    /**
     * Are collection admins allowed to manage (create/edit) the collection's
     * admins group?
     *
     * @return
     */
    public static boolean canCollectionAdminManageAdminGroup()
    {
        return can_collectionAdmin_adminGroup;
    }

    /**
     * Are collection admins allowed to remove an item from the collection?
     *
     * @return
     */
    public static boolean canCollectionAdminPerformItemDeletion()
    {
        return can_collectionAdmin_itemDelete;
    }

    /**
     * Are collection admins allowed to withdraw an item from the collection?
     *
     * @return
     */
    public static boolean canCollectionAdminPerformItemWithdrawn()
    {
        return can_collectionAdmin_itemWithdraw;
    }

    /**
     * Are collection admins allowed to reinstate an item in the collection?
     *
     * @return
     */
    public static boolean canCollectionAdminPerformItemReinstatiate()
    {
        return can_collectionAdmin_itemReinstatiate;
    }

    /**
     * Are collection admins allowed to manage the policies of an item owned by
     * the collection?
     *
     * @return
     */
    public static boolean canCollectionAdminManageItemPolicies()
    {
        return can_collectionAdmin_itemPolicies;
    }

    /**
     * Are collection admins allowed to add a bitstream to an item owned by the
     * collection?
     *
     * @return
     */
    public static boolean canCollectionAdminPerformBitstreamCreation()
    {
        return can_collectionAdmin_itemCreateBitstream;
    }

    /**
     * Are collection admins allowed to remove a bitstream from an item owned
     * by the collection?
     *
     * @return
     */
    public static boolean canCollectionAdminPerformBitstreamDeletion()
    {
        return can_collectionAdmin_itemDeleteBitstream;
    }

    /**
     * Are collection admins allowed to replace or add a CC License on an item
     * owned by the collection?
     *
     * @return
     */
    public static boolean canCollectionAdminManageCCLicense()
    {
        return can_collectionAdmin_itemAdminccLicense;
    }

    /**
     * Are item admins allowed to manage the item's policies?
     *
     * @return
     */
    public static boolean canItemAdminManagePolicies()
    {
        return can_itemAdmin_policies;
|
||||
}
|
||||
|
||||
/**
|
||||
* Are item admins allowed to add bitstreams to the item?
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public static boolean canItemAdminPerformBitstreamCreation()
|
||||
{
|
||||
return can_itemAdmin_createBitstream;
|
||||
}
|
||||
|
||||
/**
|
||||
* Are item admins allowed to remove bitstreams from the item?
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public static boolean canItemAdminPerformBitstreamDeletion()
|
||||
{
|
||||
return can_itemAdmin_deleteBitstream;
|
||||
}
|
||||
|
||||
/**
|
||||
* Are item admins allowed to replace or adding CC License to the item?
|
||||
*
|
||||
* @return
|
||||
*/
|
||||
public static boolean canItemAdminManageCCLicense()
|
||||
{
|
||||
return can_itemAdmin_ccLicense;
|
||||
}
|
||||
|
||||
}
|
||||
@@ -1,72 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authorize;

import org.dspace.content.DSpaceObject;

/**
 * Exception indicating the current user of the context does not have permission
 * to perform a particular action.
 *
 * @author David Stuve
 * @version $Revision$
 */
public class AuthorizeException extends Exception
{
    private int myaction; // action attempted, or -1

    private DSpaceObject myobject; // object the action was attempted on, or null

    /**
     * Create an empty authorize exception
     */
    public AuthorizeException()
    {
        super();

        myaction = -1;
        myobject = null;
    }

    /**
     * Create an exception with only a message
     *
     * @param message the message
     */
    public AuthorizeException(String message)
    {
        super(message);

        myaction = -1;
        myobject = null;
    }

    /**
     * Create an authorize exception with a message, the object involved and
     * the action attempted
     *
     * @param message the message
     * @param o the object the action was attempted on
     * @param a the action attempted
     */
    public AuthorizeException(String message, DSpaceObject o, int a)
    {
        super(message);

        myobject = o;
        myaction = a;
    }

    public int getAction()
    {
        return myaction;
    }

    public DSpaceObject getObject()
    {
        return myobject;
    }
}
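AuthorizeException above carries the attempted action (or -1) and the target object (or null) so callers can report exactly what was denied. A minimal, self-contained sketch of that pattern, using hypothetical stand-in names rather than the DSpace API:

```java
// Hypothetical stand-in for the DSpace pattern: an exception that records
// which action was attempted on which object, so the catch site can log both.
public class AuthorizeDemo {
    static final int WRITE = 1; // hypothetical action id, not a DSpace constant

    static class DemoAuthorizeException extends Exception {
        private final int action;    // action attempted, or -1
        private final String object; // object acted on, or null

        DemoAuthorizeException(String message, String object, int action) {
            super(message);
            this.object = object;
            this.action = action;
        }

        int getAction() { return action; }
        String getObject() { return object; }
    }

    static void authorize(boolean allowed, String obj, int action)
            throws DemoAuthorizeException {
        if (!allowed) {
            throw new DemoAuthorizeException("denied", obj, action);
        }
    }

    public static void main(String[] args) {
        try {
            authorize(false, "item-42", WRITE);
        } catch (DemoAuthorizeException e) {
            // prints: denied action=1 object=item-42
            System.out.println(e.getMessage() + " action=" + e.getAction()
                    + " object=" + e.getObject());
        }
    }
}
```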
@@ -1,275 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.authorize;

import java.sql.SQLException;

import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.Collection;
import org.dspace.content.Item;
import org.dspace.content.ItemIterator;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.eperson.Group;

/**
 * Was a hack/tool to set policies for items, bundles, and bitstreams. Now also
 * provides the helpful method setPolicies().
 *
 * @author dstuve
 * @version $Revision$
 */
public class PolicySet
{
    /**
     * Command line interface to setPolicies - run to see arguments
     */
    public static void main(String[] argv) throws Exception
    {
        if (argv.length < 6)
        {
            System.out
                    .println("Args: containerType containerID contentType actionID groupID command [filter]");
            System.out.println("container=COLLECTION command = ADD|REPLACE");

            return;
        }

        int containertype = Integer.parseInt(argv[0]);
        int containerID = Integer.parseInt(argv[1]);
        int contenttype = Integer.parseInt(argv[2]);
        int actionID = Integer.parseInt(argv[3]);
        int groupID = Integer.parseInt(argv[4]);

        boolean isReplace = false;
        String command = argv[5];
        String filter = null;
        if (argv.length == 7)
        {
            filter = argv[6];
        }

        if (command.equals("REPLACE"))
        {
            isReplace = true;
        }

        Context c = new Context();

        // turn off authorization
        c.setIgnoreAuthorization(true);

        //////////////////////
        // carnage begins here
        //////////////////////
        setPoliciesFilter(c, containertype, containerID, contenttype, actionID,
                groupID, isReplace, false, filter);

        c.complete();
        System.exit(0);
    }

    /**
     * Useful policy wildcard tool. Can set an entire collection's contents'
     * policies.
     *
     * @param c
     *            current context
     * @param containerType
     *            type, Constants.ITEM or Constants.COLLECTION
     * @param containerID
     *            ID of container (DB primary key)
     * @param contentType
     *            type (BUNDLE, ITEM, or BITSTREAM)
     * @param actionID
     *            action ID
     * @param groupID
     *            group ID (database key)
     * @param isReplace
     *            if <code>true</code>, existing policies are removed first,
     *            otherwise add to existing policies
     * @param clearOnly
     *            if <code>true</code>, just delete policies for matching
     *            objects
     * @throws SQLException
     *             if database problem
     * @throws AuthorizeException
     *             if current user is not authorized to change these policies
     */
    public static void setPolicies(Context c, int containerType,
            int containerID, int contentType, int actionID, int groupID,
            boolean isReplace, boolean clearOnly) throws SQLException,
            AuthorizeException
    {
        setPoliciesFilter(c, containerType, containerID, contentType,
                actionID, groupID, isReplace, clearOnly, null);
    }

    /**
     * Useful policy wildcard tool. Can set an entire collection's contents'
     * policies.
     *
     * @param c
     *            current context
     * @param containerType
     *            type, Constants.ITEM or Constants.COLLECTION
     * @param containerID
     *            ID of container (DB primary key)
     * @param contentType
     *            type (BUNDLE, ITEM, or BITSTREAM)
     * @param actionID
     *            action ID
     * @param groupID
     *            group ID (database key)
     * @param isReplace
     *            if <code>true</code>, existing policies are removed first,
     *            otherwise add to existing policies
     * @param clearOnly
     *            if <code>true</code>, just delete policies for matching
     *            objects
     * @param filter
     *            if non-null, only process bitstreams whose names contain filter
     * @throws SQLException
     *             if database problem
     * @throws AuthorizeException
     *             if current user is not authorized to change these policies
     */
    public static void setPoliciesFilter(Context c, int containerType,
            int containerID, int contentType, int actionID, int groupID,
            boolean isReplace, boolean clearOnly, String filter) throws SQLException,
            AuthorizeException
    {
        if (containerType == Constants.COLLECTION)
        {
            Collection collection = Collection.find(c, containerID);
            Group group = Group.find(c, groupID);

            ItemIterator i = collection.getItems();
            try
            {
                if (contentType == Constants.ITEM)
                {
                    // process all items in the collection
                    while (i.hasNext())
                    {
                        Item myitem = i.next();

                        // is this a replace? delete policies first
                        if (isReplace || clearOnly)
                        {
                            AuthorizeManager.removeAllPolicies(c, myitem);
                        }

                        if (!clearOnly)
                        {
                            // now add the policy
                            ResourcePolicy rp = ResourcePolicy.create(c);

                            rp.setResource(myitem);
                            rp.setAction(actionID);
                            rp.setGroup(group);

                            rp.update();
                        }
                    }
                }
                else if (contentType == Constants.BUNDLE)
                {
                    // process all bundles in the collection's items
                    while (i.hasNext())
                    {
                        Item myitem = i.next();

                        Bundle[] bundles = myitem.getBundles();

                        for (int j = 0; j < bundles.length; j++)
                        {
                            Bundle t = bundles[j]; // t for target

                            // is this a replace? delete policies first
                            if (isReplace || clearOnly)
                            {
                                AuthorizeManager.removeAllPolicies(c, t);
                            }

                            if (!clearOnly)
                            {
                                // now add the policy
                                ResourcePolicy rp = ResourcePolicy.create(c);

                                rp.setResource(t);
                                rp.setAction(actionID);
                                rp.setGroup(group);

                                rp.update();
                            }
                        }
                    }
                }
                else if (contentType == Constants.BITSTREAM)
                {
                    // iterate over items and bundles to reach all bitstreams
                    while (i.hasNext())
                    {
                        Item myitem = i.next();
                        System.out.println("Item " + myitem.getID());

                        Bundle[] bundles = myitem.getBundles();

                        for (int j = 0; j < bundles.length; j++)
                        {
                            System.out.println("Bundle " + bundles[j].getID());

                            Bitstream[] bitstreams = bundles[j].getBitstreams();

                            for (int k = 0; k < bitstreams.length; k++)
                            {
                                Bitstream t = bitstreams[k]; // t for target

                                if (filter == null
                                        || t.getName().indexOf(filter) != -1)
                                {
                                    // is this a replace? delete policies first
                                    if (isReplace || clearOnly)
                                    {
                                        AuthorizeManager.removeAllPolicies(c, t);
                                    }

                                    if (!clearOnly)
                                    {
                                        // now add the policy
                                        ResourcePolicy rp = ResourcePolicy.create(c);

                                        rp.setResource(t);
                                        rp.setAction(actionID);
                                        rp.setGroup(group);

                                        rp.update();
                                    }
                                }
                            }
                        }
                    }
                }
            }
            finally
            {
                if (i != null)
                {
                    i.close();
                }
            }
        }
    }
}
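The command line handling in PolicySet.main() above (six positional arguments plus an optional bitstream-name filter, with anything other than REPLACE meaning ADD) can be sketched stand-alone. This is a hypothetical helper for illustration, not part of DSpace:

```java
// Hypothetical stand-alone sketch of PolicySet.main()'s argument handling.
// Usage: containerType containerID contentType actionID groupID command [filter]
public class PolicyArgs {
    final int containerType, containerID, contentType, actionID, groupID;
    final boolean isReplace;
    final String filter; // null means "no filter"

    PolicyArgs(String[] argv) {
        if (argv.length < 6) {
            throw new IllegalArgumentException(
                "Args: containerType containerID contentType actionID groupID command [filter]");
        }
        containerType = Integer.parseInt(argv[0]);
        containerID = Integer.parseInt(argv[1]);
        contentType = Integer.parseInt(argv[2]);
        actionID = Integer.parseInt(argv[3]);
        groupID = Integer.parseInt(argv[4]);
        isReplace = "REPLACE".equals(argv[5]); // any other command means ADD
        filter = (argv.length == 7) ? argv[6] : null;
    }

    public static void main(String[] args) {
        PolicyArgs p = new PolicyArgs(
                new String[] { "3", "2", "2", "0", "1", "REPLACE", "thesis" });
        // prints: replace=true filter=thesis
        System.out.println("replace=" + p.isReplace + " filter=" + p.filter);
    }
}
```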
@@ -1,168 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.browse;

import java.util.HashMap;
import java.util.Map;

import org.apache.log4j.Logger;
import org.dspace.content.Item;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.event.Consumer;
import org.dspace.event.Event;

/**
 * Class for updating the browse system from content events.
 * Prototype: only Item events are recognized.
 *
 * XXX FIXME NOTE: The Browse Consumer is INCOMPLETE because the
 * deletion of an Item CANNOT be implemented as an event consumer:
 * when an Item is deleted, the browse tables must be updated
 * immediately, within the same transaction, to maintain referential
 * consistency. It cannot be handled in an Event consumer since by
 * definition that runs after the transaction is committed.
 * Perhaps this can be addressed if the Browse system is replaced.
 *
 * To handle create/modify events: accumulate Sets of Items to be added
 * and updated out of the event stream, then process them in end(),
 * filtering out update requests for Items that were just created.
 *
 * Recommended filter: Item+Create|Modify|Modify_Metadata:Collection+Add|Remove
 *
 * @version $Revision$
 */
public class BrowseConsumer implements Consumer
{
    /** log4j logger */
    private static Logger log = Logger.getLogger(BrowseConsumer.class);

    // items to be updated in the browse index
    private Map<Integer, ItemHolder> toUpdate = null;

    public void initialize()
        throws Exception
    {
    }

    public void consume(Context ctx, Event event)
        throws Exception
    {
        if (toUpdate == null)
        {
            toUpdate = new HashMap<Integer, ItemHolder>();
        }

        log.debug("consume() evaluating event: " + event.toString());

        int st = event.getSubjectType();
        int et = event.getEventType();

        switch (st)
        {
        // If an Item is created or its metadata is modified...
        case Constants.ITEM:
            if (et == Event.MODIFY_METADATA || et == Event.CREATE)
            {
                Item subj = (Item) event.getSubject(ctx);
                if (subj != null)
                {
                    log.debug("consume() adding event to update queue: " + event.toString());
                    if (et == Event.CREATE || !toUpdate.containsKey(subj.getID()))
                    {
                        toUpdate.put(subj.getID(), new ItemHolder(subj, et == Event.CREATE));
                    }
                }
            }
            break;
        // Track ADD and REMOVE from collections; that changes the browse index.
        case Constants.COLLECTION:
            if (event.getObjectType() == Constants.ITEM && (et == Event.ADD || et == Event.REMOVE))
            {
                Item obj = (Item) event.getObject(ctx);
                if (obj != null)
                {
                    log.debug("consume() adding event to update queue: " + event.toString());
                    if (!toUpdate.containsKey(obj.getID()))
                    {
                        toUpdate.put(obj.getID(), new ItemHolder(obj, false));
                    }
                }
            }
            break;
        default:
            log.debug("consume() ignoring event: " + event.toString());
        }
    }

    public void end(Context ctx)
        throws Exception
    {
        if (toUpdate != null)
        {
            // Update/add items
            for (ItemHolder i : toUpdate.values())
            {
                // FIXME: there is an exception handling problem here
                try
                {
                    // Update browse indices
                    ctx.turnOffAuthorisationSystem();
                    IndexBrowse ib = new IndexBrowse(ctx);
                    ib.indexItem(i.item, i.createEvent);
                    ctx.restoreAuthSystemState();
                }
                catch (BrowseException e)
                {
                    log.error("caught exception: ", e);
                    //throw new SQLException(e.getMessage());
                }

                if (log.isDebugEnabled())
                {
                    log.debug("Updated browse indices for Item id="
                            + String.valueOf(i.item.getID()) + ", hdl="
                            + i.item.getHandle());
                }
            }

            // NOTE: Removed items are necessarily handled inline (ugh).

            // Browse updates wrote to the DB, so we have to commit.
            ctx.getDBConnection().commit();
        }

        // clean out toUpdate
        toUpdate = null;
    }

    public void finish(Context ctx)
    {
    }

    private final class ItemHolder
    {
        private Item item;

        private boolean createEvent;

        ItemHolder(Item pItem, boolean pCreateEvent)
        {
            item = pItem;
            createEvent = pCreateEvent;
        }
    }
}
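The accumulation rule in BrowseConsumer.consume() above deserves a closer look: a CREATE always (re)records the item as newly created, while a MODIFY for an already-queued item is ignored, so each item is indexed at most once in end(). A self-contained sketch of just that rule, with hypothetical event codes standing in for the DSpace Event constants:

```java
import java.util.HashMap;
import java.util.Map;

// Self-contained sketch of the BrowseConsumer queueing rule: map item id ->
// "was this a create event?", deduplicating repeated events per item.
public class UpdateQueueDemo {
    static final int CREATE = 1, MODIFY = 2; // hypothetical event codes

    final Map<Integer, Boolean> toUpdate = new HashMap<Integer, Boolean>();

    void consume(int itemID, int eventType) {
        // Same condition as BrowseConsumer: CREATE always wins; a MODIFY is
        // only queued if the item is not queued already.
        if (eventType == CREATE || !toUpdate.containsKey(itemID)) {
            toUpdate.put(itemID, eventType == CREATE);
        }
    }

    public static void main(String[] args) {
        UpdateQueueDemo q = new UpdateQueueDemo();
        q.consume(8, CREATE);
        q.consume(8, MODIFY); // ignored: 8 is already queued
        q.consume(7, MODIFY);
        q.consume(7, CREATE); // CREATE overrides the queued MODIFY entry
        System.out.println(q.toUpdate); // {7=true, 8=true}
    }
}
```

This is why end() can safely pass the stored flag to the indexer: an item created and then modified in the same transaction is still indexed exactly once, as a creation.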
@@ -1,372 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.browse;

import java.util.List;
import java.util.Map;
import java.util.Set;

/**
 * Interface for any class wishing to provide a browse storage layer. This particular
 * Data Access Object deals with building and destroying the database, and inserting and
 * removing content from it. There is an alternative interface, BrowseDAO, which deals with
 * read-only operations.
 *
 * If you implement this interface, and you wish it to be loaded via the BrowseDAOFactory,
 * you must supply a constructor of the form:
 *
 * public BrowseCreateDAOImpl(Context context) {}
 *
 * where Context is the DSpace Context object.
 *
 * Where tables are referred to in this interface, their names can be obtained from the
 * BrowseIndex class, which will answer queries, given the context of the request, on
 * which table is the relevant target.
 *
 * @author Richard Jones
 *
 */
public interface BrowseCreateDAO
{
    // this must have a constructor which takes a DSpace Context as
    // an argument, thus:
    //
    // public BrowseCreateDAO(Context context)

    /**
     * Delete the record for the given item id from the specified table.
     *
     * Table names can be obtained from the BrowseIndex class.
     *
     * @param table the browse table to remove the index from
     * @param itemID the database id of the item to remove the index for
     * @throws BrowseException
     */
    public void deleteByItemID(String table, int itemID) throws BrowseException;

    /** Delete the community mappings for the given item id. */
    public void deleteCommunityMappings(int itemID) throws BrowseException;

    /** Rebuild the community mappings for the given item id. */
    public void updateCommunityMappings(int itemID) throws BrowseException;

    /**
     * Insert an index record into the given table for the given item id. The Map should contain
     * key value pairs representing the sort column integer representation and the normalised
     * value for that field.
     *
     * For example, the caller might do as follows:
     *
     * <code>
     * Map&lt;Integer, String&gt; map = new HashMap&lt;Integer, String&gt;();
     * map.put(Integer.valueOf(1), "the title");
     * map.put(Integer.valueOf(2), "the subject");
     *
     * BrowseCreateDAO dao = BrowseDAOFactory.getCreateInstance();
     * dao.insertIndex("index_1", 21, map);
     * </code>
     *
     * @param table the browse table to insert the index in
     * @param itemID the database id of the item being indexed
     * @param sortCols an Integer-String map of sort column numbers and values
     * @throws BrowseException
     */
    public void insertIndex(String table, int itemID, Map<Integer, String> sortCols) throws BrowseException;

    /**
     * Update an index record in the given table for the given item id. The Map should contain
     * key value pairs representing the sort column integer representation and the normalised
     * value for that field.
     *
     * For example, the caller might do as follows:
     *
     * <code>
     * Map&lt;Integer, String&gt; map = new HashMap&lt;Integer, String&gt;();
     * map.put(Integer.valueOf(1), "the title");
     * map.put(Integer.valueOf(2), "the subject");
     *
     * BrowseCreateDAO dao = BrowseDAOFactory.getCreateInstance();
     * dao.updateIndex("index_1", 21, map);
     * </code>
     *
     * @param table the browse table to update the index in
     * @param itemID the database id of the item being indexed
     * @param sortCols an Integer-String map of sort column numbers and values
     * @return true if the record is updated, false if not found
     * @throws BrowseException
     */
    public boolean updateIndex(String table, int itemID, Map<Integer, String> sortCols) throws BrowseException;

    /**
     * Get the browse index's internal id for the location of the given string
     * and sort value in the given table. This method should always return a
     * positive integer: if no existing ID is available for the given value,
     * one should be inserted using the data supplied, and that ID returned.
     *
     * Generally this method is used in conjunction with createDistinctMapping thus:
     *
     * <code>
     * BrowseCreateDAO dao = BrowseDAOFactory.getCreateInstance();
     * dao.createDistinctMapping("index_1_distinct_map", 21,
     *          dao.getDistinctID("index_1_distinct", "Human Readable", null, "human readable"));
     * </code>
     *
     * When it creates a distinct record, it would usually do so through insertDistinctRecord
     * defined below.
     *
     * @param table the table in which to look for/create the id
     * @param value the value on which to search
     * @param authority the authority key for the value, or null
     * @param sortValue the sort value to use in case of the need to create
     * @return the database id of the distinct record
     * @throws BrowseException
     */
    public int getDistinctID(String table, String value, String authority, String sortValue) throws BrowseException;

    /**
     * Insert the given value and sort value into the distinct index table. This
     * returns an integer which represents the database id of the created record, so
     * that it can be used, for example, in createDistinctMapping thus:
     *
     * <code>
     * BrowseCreateDAO dao = BrowseDAOFactory.getCreateInstance();
     * dao.createDistinctMapping("index_1_distinct_map", 21,
     *          dao.insertDistinctRecord("index_1_distinct", "Human Readable", null, "human readable"));
     * </code>
     *
     * This is less safe than using getDistinctID defined above: if the distinct
     * value is already present in the table, this may throw an exception.
     *
     * @param table the table into which to insert the record
     * @param value the value to insert
     * @param authority the authority key for the value, or null
     * @param sortValue the sort value to insert
     * @return the database id of the created record
     * @throws BrowseException
     */
    public int insertDistinctRecord(String table, String value, String authority, String sortValue) throws BrowseException;

    /**
     * Update the mappings between an item id and distinct metadata fields such as an author,
     * who can appear in multiple items. To get the id of a distinct record you should
     * use either getDistinctID or insertDistinctRecord as defined above.
     *
     * @param table the mapping table
     * @param itemID the item id
     * @param distinctIDs the ids of the distinct records
     * @return the ids of any distinct records that have been unmapped
     * @throws BrowseException
     */
    public MappingResults updateDistinctMappings(String table, int itemID, Set<Integer> distinctIDs) throws BrowseException;

    /**
     * Find out if a given table exists.
     *
     * @param table the table to test
     * @return true if it exists, false if not
     * @throws BrowseException
     */
    public boolean testTableExistence(String table) throws BrowseException;

    /**
     * Drop the given table name, and all other resources that are attached to it. In normal
     * relational database land this will include constraints and views. If the boolean execute
     * is true this operation should be carried out, and if it is false it should not. The returned
     * string should contain the SQL (if relevant) that the caller can do with what they like
     * (for example, output to the screen).
     *
     * @param table The table to drop
     * @param execute Whether to action the removal or not
     * @return The instructions (SQL) that effect the removal
     * @throws BrowseException
     */
    public String dropIndexAndRelated(String table, boolean execute) throws BrowseException;

    /**
     * Drop the given sequence name. This is relevant to most forms of database, but not all.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * @param sequence the sequence to drop
     * @param execute whether to action the removal or not
     * @return The instructions (SQL) that effect the removal
     * @throws BrowseException
     */
    public String dropSequence(String sequence, boolean execute) throws BrowseException;

    /**
     * Drop the given view name. This is relevant to most forms of database, but not all.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * @param view the view to drop
     * @param execute whether to action the removal or not
     * @return The instructions (SQL) that effect the removal
     * @throws BrowseException
     */
    public String dropView(String view, boolean execute) throws BrowseException;

    /**
     * Create the sequence with the given name. This is relevant to most forms of database, but not all.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * @param sequence the sequence to create
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the creation
     * @throws BrowseException
     */
    public String createSequence(String sequence, boolean execute) throws BrowseException;

    /**
     * Create the main index table. This is the one which will contain a single row per
     * item. If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * This form is used for the primary item browse tables.
     *
     * This should be used, for example, like this:
     *
     * <code>
     * List&lt;Integer&gt; list = new ArrayList&lt;Integer&gt;();
     * list.add(Integer.valueOf(1));
     * list.add(Integer.valueOf(2));
     *
     * BrowseCreateDAO dao = BrowseDAOFactory.getCreateInstance();
     * dao.createPrimaryTable("index_1", list, true);
     * </code>
     *
     * @param table the raw table to create
     * @param sortCols a List of Integers numbering the sort columns required
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the creation
     * @throws BrowseException
     */
    public String createPrimaryTable(String table, List<Integer> sortCols, boolean execute) throws BrowseException;

    /**
     * Create any indices that the implementing DAO sees fit to maximise performance.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string array should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen). It's an array so that
     * you can return each bit of SQL as an element if you want.
     *
     * @param table the table upon which to create indices
     * @param sortCols TODO
     * @param value TODO
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the indices
     * @throws BrowseException
     */
    public String[] createDatabaseIndices(String table, List<Integer> sortCols, boolean value, boolean execute) throws BrowseException;

    /**
     * Create any indices that the implementing DAO sees fit to maximise performance.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string array should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen). It's an array so that
     * you can return each bit of SQL as an element if you want.
     *
     * @param disTable the distinct table upon which to create indices
     * @param mapTable the mapping table upon which to create indices
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the indices
     * @throws BrowseException
     */
    public String[] createMapIndices(String disTable, String mapTable, boolean execute) throws BrowseException;

    /**
     * Create the view of the full item index as seen from a collection.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * @param table the table to create the view on
     * @param view the name of the view to create
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the create
     * @throws BrowseException
     */
    public String createCollectionView(String table, String view, boolean execute) throws BrowseException;

    /**
     * Create the view of the full item index as seen from a community.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * @param table the table to create the view on
     * @param view the name of the view to create
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the create
     * @throws BrowseException
     */
    public String createCommunityView(String table, String view, boolean execute) throws BrowseException;

    /**
     * Delete the mappings for the given item id from the given mapping table.
     *
     * @param mapTable the mapping table
     * @param itemID the item id
     * @return the ids of the distinct records that were unmapped
     * @throws BrowseException
     */
    public List<Integer> deleteMappingsByItemID(String mapTable, int itemID) throws BrowseException;

    /**
     * Create the table which will hold the distinct metadata values that appear in multiple
     * items. For example, this table may hold a list of unique authors, each name in the
     * metadata for the entire system appearing only once. Or for subject classifications.
     * If the boolean execute is true this operation should be carried out, and if it is false
     * it should not. The returned string should contain the SQL (if relevant) that the caller
     * can do with what they like (for example, output to the screen).
     *
     * @param table the table to create
     * @param execute whether to action the create or not
     * @return the instructions (SQL) that effect the create
     * @throws BrowseException
     */
    public String createDistinctTable(String table, boolean execute) throws BrowseException;

    /**
     * Create a table to hold a mapping between an item and a distinct metadata value that can appear
     * across multiple items (for example, author names). If the boolean execute is true this
     * operation should be carried out, and if it is false it should not.
     *
     * @param table the name of the distinct table which holds the target of the mapping
     * @param map the name of the mapping table itself
     * @param execute whether to execute the query or not
     * @return the instructions (SQL) that effect the creation
     * @throws BrowseException
     */
    public String createDistinctMap(String table, String map, boolean execute) throws BrowseException;
|
||||
|
||||
/**
|
||||
* So that any left over indices for items which have been deleted can be assured to have
|
||||
* been removed, this method checks for indices for items which are not in the item table.
|
||||
* If it finds an index which does not have an associated item it removes it.
|
||||
*
|
||||
* @param table the index table to check
|
||||
* @param withdrawn TODO
|
||||
* @throws BrowseException
|
||||
*/
|
||||
public void pruneExcess(String table, boolean withdrawn) throws BrowseException;
|
||||
|
||||
/**
|
||||
* So that any left over indices for items which have been deleted can be assured to have
|
||||
* been removed, this method checks for indices for items which are not in the item table.
|
||||
* If it finds an index which does not have an associated item it removes it.
|
||||
*
|
||||
* @param map the name of the associated distinct mapping table
|
||||
* @param withdrawn TODO
|
||||
* @throws BrowseException
|
||||
*/
|
||||
public void pruneMapExcess(String map, boolean withdrawn, List<Integer> distinctIds) throws BrowseException;
|
||||
|
||||
/**
|
||||
* So that there are no distinct values indexed which are no longer referenced from the
|
||||
* map table, this method checks for values which are not referenced from the map,
|
||||
* and removes them.
|
||||
*
|
||||
* @param table the name of the distinct index table
|
||||
* @param map the name of the associated distinct mapping table.
|
||||
* @throws BrowseException
|
||||
*/
|
||||
public void pruneDistinct(String table, String map, List<Integer> distinctIds) throws BrowseException;
|
||||
}
|
||||
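Every create method in this interface follows the same contract: build the DDL, optionally execute it, and hand the SQL back to the caller. A minimal sketch of that pattern, assuming an illustrative table layout (the column names and types here are placeholders, not DSpace's actual schema):

```java
// Hypothetical sketch of the "build SQL, optionally execute" contract used by
// the BrowseCreateDAO create methods. Columns here are illustrative only.
public class DistinctTableSqlSketch {
    /**
     * Build the CREATE TABLE statement for a distinct-value table. When execute
     * is false the SQL is only returned, so the caller can e.g. print it.
     */
    public static String createDistinctTable(String table, boolean execute) {
        String sql = "CREATE TABLE " + table + " ("
                   + "id INTEGER PRIMARY KEY, "
                   + "value TEXT, "
                   + "sort_value TEXT)";
        if (execute) {
            // A real DAO would run this statement against the database here.
            System.out.println("executing: " + sql);
        }
        return sql;
    }
}
```

Returning the SQL regardless of the execute flag is what lets a caller dump the full schema to screen without touching the database.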
@@ -1,389 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.browse;

import java.util.List;

/**
 * Interface for any class wishing to interact with the Browse storage layer for
 * read-only operations. If you wish to modify the contents of the browse indices,
 * or to create and destroy index tables, you should look at implementations of
 * BrowseCreateDAO instead.
 *
 * If you implement this class, and you wish it to be loaded via the BrowseDAOFactory,
 * you must supply a constructor of the form:
 *
 * public BrowseDAOImpl(Context context) {}
 *
 * where Context is the DSpace Context object.
 *
 * Where tables are referred to in this class, they can be obtained from the BrowseIndex
 * class, which, given the context of the request, will answer queries as to which table
 * is the relevant target.
 *
 * @author Richard Jones
 */
public interface BrowseDAO
{
    // Objects implementing this interface should also include
    // a constructor which takes the DSpace Context as an argument:
    //
    // public BrowseDAOImpl(Context context) ...

    /**
     * Execute a query which counts the number of results for the parameters
     * you have set.
     *
     * @return the number of results found
     * @throws BrowseException
     */
    public int doCountQuery() throws BrowseException;

    /**
     * Execute a query which returns a List of String arrays representing the
     * results of a single value browse (for example, the list of all subject
     * headings). This is most commonly used with a Distinct browse type.
     *
     * @return List of String arrays representing the single value query results
     * @throws BrowseException
     */
    public List<String[]> doValueQuery() throws BrowseException;

    /**
     * Execute a query which returns a List of BrowseItem objects representing
     * the results of a full item browse.
     *
     * @return List of BrowseItem objects
     * @throws BrowseException
     */
    public List<BrowseItem> doQuery() throws BrowseException;

    /**
     * Execute a query which returns the "highest" (max) value in the given
     * table's column for the given item id.
     *
     * @param column the column to interrogate
     * @param table the table to query
     * @param itemID the item id
     * @return String representing the max value in the given column
     * @throws BrowseException
     */
    public String doMaxQuery(String column, String table, int itemID) throws BrowseException;

    /**
     * Execute a query which returns the offset at which the given value (or its
     * nearest greater equivalent) can be found in the specified table, ordered
     * by the column.
     *
     * @param column the column to interrogate
     * @param value the value to locate
     * @param isAscending whether we are browsing in ascending or descending order
     * @return the offset into the table
     * @throws BrowseException
     */
    public int doOffsetQuery(String column, String value, boolean isAscending) throws BrowseException;

    /**
     * Execute a query which returns the offset at which the given value (or its
     * nearest greater equivalent) can be found in the specified table, ordered
     * by the column.
     *
     * @param column the column to interrogate
     * @param value the value to locate
     * @param isAscending whether we are browsing in ascending or descending order
     * @return the offset into the table
     * @throws BrowseException
     */
    public int doDistinctOffsetQuery(String column, String value, boolean isAscending) throws BrowseException;

    /**
     * Does the query use the equals comparator when doing less than or greater
     * than comparisons? @see setEqualsComparator
     *
     * Default value is true.
     *
     * @return true if using it, false if not
     */
    public boolean useEqualsComparator();

    /**
     * Set whether the query should use an equals comparator when doing less than or
     * greater than comparisons. That is, if true then comparisons will be made
     * using the equivalent of "<=" and ">=", while if false it will use the
     * equivalent of "<" and ">".
     *
     * @param equalsComparator true to use, false to not
     */
    public void setEqualsComparator(boolean equalsComparator);

    /**
     * Is the sort order ascending or descending?
     *
     * Default value is true.
     *
     * @return true for ascending, false for descending
     */
    public boolean isAscending();

    /**
     * Set whether the results should be sorted in ascending order (on the given sort column)
     * or descending order.
     *
     * @param ascending true to ascend, false to descend
     */
    public void setAscending(boolean ascending);

    /**
     * Get the database ID of the container object. The container object will be a
     * Community or a Collection.
     *
     * @return the database id of the container, or -1 if none is set
     */
    public int getContainerID();

    /**
     * Set the database id of the container object. This should be the id of a
     * Community or Collection. This will constrain the results of the browse
     * to only items, or values within items, that appear in the given container.
     *
     * @param containerID the database id of the container
     */
    public void setContainerID(int containerID);

    /**
     * Get the name of the field in which to look for the container id. This is
     * principally for use internal to the DAO.
     *
     * @return the name of the container id field, for example "collection_id" or
     *         "community_id"
     */
    public String getContainerIDField();

    /**
     * Set the name of the field in which to look for the container id.
     *
     * @param containerIDField the name of the container id field,
     *        for example "collection_id" or "community_id"
     */
    public void setContainerIDField(String containerIDField);

    /**
     * Get the field in which we will match a focus value from which to start
     * the browse. This will either be the "sort_value" field or one of the
     * additional sort fields defined by configuration.
     *
     * @return the name of the focus field
     */
    public String getJumpToField();

    /**
     * Set the focus field upon which we will match a value from which to start
     * the browse. This will either be the "sort_value" field or one of the
     * additional sort fields defined by configuration.
     *
     * @param focusField the name of the focus field
     */
    public void setJumpToField(String focusField);

    /**
     * Get the value at which the browse will start. The value supplied here will
     * be the top result on the page of results.
     *
     * @return the value to start browsing on
     */
    public String getJumpToValue();

    /**
     * Set the value from which to start the browse. The value supplied here
     * will be the top result on the page of results.
     *
     * @param focusValue the value in the focus field on which to start browsing
     */
    public void setJumpToValue(String focusValue);

    /**
     * Get the limit on the number of results that will be returned by any query.
     * The default is -1, which means unlimited results.
     *
     * @return the maximum possible number of results allowed to be returned
     */
    public int getLimit();

    /**
     * Set the limit for how many results should be returned. This is generally
     * for use in paging, or in limiting the number of items to be displayed. The
     * default is -1, meaning unlimited results. Note that if the number of results
     * of the query is less than this number, the size of the result set will be
     * smaller than this limit.
     *
     * @param limit the maximum number of results to return
     */
    public void setLimit(int limit);

    /**
     * Get the offset from the first result from which to return results. This
     * functionality is present for backwards compatibility, but is ill advised. All
     * normal browse operations can be completed without it. The default is -1, which
     * means do not offset.
     *
     * @return the offset
     */
    public int getOffset();

    /**
     * Set the offset from the first result from which to return results. This
     * functionality is present for backwards compatibility, but is ill advised. All
     * normal browse operations can be completed without it. The default is -1, which
     * means do not offset.
     *
     * @param offset the offset from the first result
     */
    public void setOffset(int offset);

    /**
     * Get the database field on which result sets will be sorted.
     *
     * @return the field by which results will be sorted
     */
    public String getOrderField();

    /**
     * Set the database field on which result sets will be sorted.
     *
     * @param orderField the field by which results will be sorted
     */
    public void setOrderField(String orderField);

    /**
     * Get the array of values that we will be selecting on. The default is
     * to select all of the values from a given table.
     *
     * @return an array of values to select on
     */
    public String[] getSelectValues();

    /**
     * Set the array of values to select on. This should be a list of the columns
     * available in the target table, or the SQL wildcards. The default is a
     * single-element array with the standard wildcard (*).
     *
     * @param selectValues the values to select on
     */
    public void setSelectValues(String[] selectValues);

    /**
     * Get the array of fields that we will be counting on.
     *
     * @return an array of fields to be counted over
     */
    public String[] getCountValues();

    /**
     * Set the array of columns that we will be counting over. In general, the
     * wildcard (*) will suffice.
     *
     * @param fields an array of fields to be counted over
     */
    public void setCountValues(String[] fields);

    /**
     * Get the name of the table that we are querying.
     *
     * @return the name of the table
     */
    public String getTable();

    /**
     * Set the name of the table to query.
     *
     * @param table the name of the table
     */
    public void setTable(String table);

    /**
     * Set the names of the mapping tables to use for filtering.
     *
     * @param tableDis the name of the table holding the distinct values
     * @param tableMap the name of the table holding the mappings
     */
    public void setFilterMappingTables(String tableDis, String tableMap);

    /**
     * Get the value which we are constraining all our browse results to contain.
     *
     * @return the value to which to constrain results
     */
    public String getFilterValue();

    /**
     * Set the value to which all our browse results should be constrained. For
     * example, if you are listing all of the publications by a single author,
     * your value would be the author's name.
     *
     * @param value the value to which to constrain results
     */
    public void setFilterValue(String value);

    /**
     * Set whether we will treat the filter value as partial (a LIKE match), or exact.
     *
     * @param part true if partial, false if exact
     */
    public void setFilterValuePartial(boolean part);

    /**
     * Get the name of the field in which the value to constrain results is
     * contained.
     *
     * @return the name of the field
     */
    public String getFilterValueField();

    /**
     * Set the name of the field in which the value to constrain results is
     * contained.
     *
     * @param valueField the name of the field
     */
    public void setFilterValueField(String valueField);

    /**
     * Set whether this is a distinct value browse or not.
     *
     * @param bool true if distinct value, false if not
     */
    public void setDistinct(boolean bool);

    /**
     * Is this a distinct value browse?
     *
     * @return true if distinct, false if not
     */
    public boolean isDistinct();

    /**
     * If we have specified a container id and container field, we must also specify
     * a container table. This is the name of the table that maps the item onto
     * the distinct value. Since we are in a container, this value will actually be
     * the view which allows us to select only items which are within a given container.
     *
     * @param containerTable the name of the container table mapping
     */
    public void setContainerTable(String containerTable);

    /**
     * Get the name of the container table that is being used to map items to distinct
     * values in a container-constrained browse.
     *
     * @return the name of the table
     */
    public String getContainerTable();

    public void setAuthorityValue(String value);

    public String getAuthorityValue();
}
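The getter/setter pairs above describe a stateful query builder: an implementation accumulates table, ordering, and limit state, then assembles SQL when `doQuery()` runs. A hypothetical sketch of that assembly step (the real implementations, `BrowseDAOPostgres` and `BrowseDAOOracle`, are considerably more involved; the SQL shape here is an assumption):

```java
// Illustrative sketch (not the DSpace implementation) of how a BrowseDAO
// implementation might assemble its SELECT statement from configured state.
public class BrowseQuerySketch {
    private String table;
    private String orderField;
    private boolean ascending = true;  // default per the interface javadoc
    private int limit = -1;            // -1 means unlimited, per the interface javadoc

    public void setTable(String table) { this.table = table; }
    public void setOrderField(String orderField) { this.orderField = orderField; }
    public void setAscending(boolean ascending) { this.ascending = ascending; }
    public void setLimit(int limit) { this.limit = limit; }

    /** Build the SQL that a doQuery() implementation would execute. */
    public String buildQuery() {
        StringBuilder sb = new StringBuilder("SELECT * FROM ").append(table);
        if (orderField != null) {
            sb.append(" ORDER BY ").append(orderField)
              .append(ascending ? " ASC" : " DESC");
        }
        if (limit > -1) {
            sb.append(" LIMIT ").append(limit);
        }
        return sb.toString();
    }
}
```

This is also why `setLimit(-1)` and a null order field simply drop their clauses rather than producing invalid SQL.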
@@ -1,124 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.browse;

import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;

/**
 * Factory class to generate DAOs based on the configuration.
 *
 * @author Richard Jones
 */
public class BrowseDAOFactory
{
    /**
     * Get an instance of the relevant Read Only DAO class, which will
     * conform to the BrowseDAO interface.
     *
     * @param context the DSpace context
     * @return the relevant DAO
     * @throws BrowseException
     */
    public static BrowseDAO getInstance(Context context)
        throws BrowseException
    {
        String db = ConfigurationManager.getProperty("db.name");
        if ("postgres".equals(db))
        {
            return new BrowseDAOPostgres(context);
        }
        else if ("oracle".equals(db))
        {
            return new BrowseDAOOracle(context);
        }
        else
        {
            throw new BrowseException("The configuration for db.name is either invalid, or contains an unrecognised database");
        }
    }

    /**
     * Get an instance of the relevant Write Only DAO class, which will
     * conform to the BrowseCreateDAO interface.
     *
     * @param context the DSpace context
     * @return the relevant DAO
     * @throws BrowseException
     */
    public static BrowseCreateDAO getCreateInstance(Context context)
        throws BrowseException
    {
        String db = ConfigurationManager.getProperty("db.name");
        if ("postgres".equals(db))
        {
            return new BrowseCreateDAOPostgres(context);
        }
        else if ("oracle".equals(db))
        {
            return new BrowseCreateDAOOracle(context);
        }
        else
        {
            throw new BrowseException("The configuration for db.name is either invalid, or contains an unrecognised database");
        }
    }

    /**
     * Get an instance of the relevant Read Only DAO class, which will
     * conform to the BrowseItemDAO interface.
     *
     * @param context the DSpace context
     * @return the relevant DAO
     * @throws BrowseException
     */
    public static BrowseItemDAO getItemInstance(Context context)
        throws BrowseException
    {
        String db = ConfigurationManager.getProperty("db.name");
        if ("postgres".equals(db))
        {
            return new BrowseItemDAOPostgres(context);
        }
        else if ("oracle".equals(db))
        {
            return new BrowseItemDAOOracle(context);
        }
        else
        {
            throw new BrowseException("The configuration for db.name is either invalid, or contains an unrecognised database");
        }
    }

    /**
     * Get an instance of the relevant DAO Utilities class, which will
     * conform to the BrowseDAOUtils interface.
     *
     * @param context the DSpace context
     * @return the relevant DAO
     * @throws BrowseException
     */
    public static BrowseDAOUtils getUtils(Context context)
        throws BrowseException
    {
        String db = ConfigurationManager.getProperty("db.name");
        if ("postgres".equals(db))
        {
            return new BrowseDAOUtilsPostgres();
        }
        else if ("oracle".equals(db))
        {
            return new BrowseDAOUtilsOracle();
        }
        else
        {
            throw new BrowseException("The configuration for db.name is either invalid, or contains an unrecognised database");
        }
    }
}
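All four factory methods repeat the same `db.name` dispatch. The same idea can be sketched with the dispatch table factored out into a map of suppliers; the DAO types below are placeholders standing in for the DSpace classes, not the real ones:

```java
import java.util.Map;
import java.util.function.Supplier;

// Sketch of the db.name dispatch performed by BrowseDAOFactory, using a
// registry map instead of an if/else chain. DAO types are placeholders.
public class DaoFactorySketch {
    interface Dao {}
    static class PostgresDao implements Dao {}
    static class OracleDao implements Dao {}

    private static final Map<String, Supplier<Dao>> REGISTRY =
        Map.<String, Supplier<Dao>>of(
            "postgres", PostgresDao::new,
            "oracle", OracleDao::new
        );

    /** Mirror of getInstance(): look up db.name, fail loudly when unrecognised. */
    public static Dao getInstance(String dbName) {
        Supplier<Dao> supplier = REGISTRY.get(dbName);
        if (supplier == null) {
            throw new IllegalStateException(
                "The configuration for db.name is either invalid, or contains an unrecognised database");
        }
        return supplier.get();
    }
}
```

The registry form keeps the four factory methods from drifting apart when a new database backend is added: one new map entry instead of four new `else if` branches.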
@@ -1,92 +0,0 @@
/**
 * The contents of this file are subject to the license and copyright
 * detailed in the LICENSE and NOTICE files at the root of the source
 * tree and available online at
 *
 * http://www.dspace.org/license/
 */
package org.dspace.browse;

/**
 * Utility class for retrieving the size of the columns to be used in the browse tables,
 * and for applying truncation to the strings that will be inserted into the tables.
 *
 * Can be configured in dspace.cfg with the following entries:
 *
 * webui.browse.value_columns.max
 *   - the maximum number of characters in 'value' columns
 *     (0 is unlimited)
 *
 * webui.browse.sort_columns.max
 *   - the maximum number of characters in 'sort' columns
 *     (0 is unlimited)
 *
 * webui.browse.value_columns.omission_mark
 *   - a string to append to truncated values that will be entered into
 *     the value columns (i.e. '...')
 *
 * By default, the column sizes are '0' (unlimited) and no truncation is applied,
 * EXCEPT for Oracle, where we have to truncate the columns for it to work! (In that
 * case, both value and sort columns are limited to 2000 characters by default.)
 *
 * @author Graham Triggs
 * @author Richard Jones
 */
public interface BrowseDAOUtils
{
    /**
     * Get the size to use for the 'value' columns, in characters.
     *
     * @return the maximum number of characters for 'value' columns
     */
    public int getValueColumnMaxChars();

    /**
     * Get the size to use for the sort columns, in characters.
     *
     * @return the maximum number of characters for sort columns
     */
    public int getSortColumnMaxChars();

    /**
     * Truncate strings that are to be used for the 'value' columns.
     *
     * @param value the string to truncate
     * @return the truncated string
     */
    public String truncateValue(String value);

    /**
     * Truncate strings that are to be used for sorting.
     *
     * @param value the string to truncate
     * @return the truncated string
     */
    public String truncateSortValue(String value);

    /**
     * Truncate strings that are to be used for the 'value' columns.
     * chars is the maximum number of characters to allow; the actual
     * truncation applied will be the SMALLER of the passed value and
     * that read from the configuration.
     *
     * @param value the string to truncate
     * @param chars the maximum number of characters to allow
     * @return the truncated string
     * @deprecated
     */
    public String truncateValue(String value, int chars);

    /**
     * Truncate strings that are to be used for sorting.
     * chars is the maximum number of characters to allow; the actual
     * truncation applied will be the SMALLER of the passed value and
     * that read from the configuration.
     *
     * @param value the string to truncate
     * @param chars the maximum number of characters to allow
     * @return the truncated string
     * @deprecated
     */
    public String truncateSortValue(String value, int chars);
}
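A minimal sketch of the truncation behaviour described above, assuming the omission mark is appended after cutting the value to the maximum length (the exact DSpace behaviour, e.g. whether the mark counts toward the limit, may differ; the method name here mirrors the interface but the implementation is illustrative):

```java
// Sketch of browse-column truncation: 0 means unlimited, and values at or
// under the limit pass through unchanged. Not the DSpace implementation.
public class TruncationSketch {
    public static String truncateValue(String value, int maxChars, String omissionMark) {
        if (value == null || maxChars <= 0 || value.length() <= maxChars) {
            return value; // 0 (or less) means unlimited, so nothing to do
        }
        return value.substring(0, maxChars) + omissionMark;
    }
}
```

With `webui.browse.value_columns.max = 0` this is a no-op, which matches the documented default for non-Oracle databases.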