diff --git a/.codecov.yml b/.codecov.yml new file mode 100644 index 0000000000..3dba42ef37 --- /dev/null +++ b/.codecov.yml @@ -0,0 +1,29 @@ +# DSpace configuration for Codecov.io coverage reports +# These override the default YAML settings at +# https://docs.codecov.io/docs/codecov-yaml#section-default-yaml +# Can be validated via instructions at: +# https://docs.codecov.io/docs/codecov-yaml#validate-your-repository-yaml + +# Settings related to code coverage analysis +coverage: + status: + # Configuration for project-level checks. This checks how the PR changes overall coverage. + project: + default: + # For each PR, auto compare coverage to previous commit. + # Require that overall (project) coverage does NOT drop more than 0.5% + target: auto + threshold: 0.5% + # Configuration for patch-level checks. This checks the relative coverage of the new PR code ONLY. + patch: + default: + # For each PR, make sure the coverage of the new code is within 1% of current overall coverage. + # We let 'patch' be more lenient as we only require *project* coverage to not drop significantly. + target: auto + threshold: 1% + +# Turn PR comments "off". This feature adds the code coverage summary as a +# comment on each PR. See https://docs.codecov.io/docs/pull-request-comments +# However, this same info is available from the Codecov checks in the PR's +# "Checks" tab in GitHub. So, the comment is unnecessary. +comment: false diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md new file mode 100644 index 0000000000..9893d233e1 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -0,0 +1,22 @@ +--- +name: Bug report +about: Create a report to help us improve +title: '' +labels: bug, needs triage +assignees: '' + +--- + +**Describe the bug** +A clear and concise description of what the bug is. Include the version(s) of DSpace where you've seen this problem. Link to examples if they are public. + +**To Reproduce** +Steps to reproduce the behavior: +1. Do this +2. Then this... + +**Expected behavior** +A clear and concise description of what you expected to happen. + +**Related work** +Link to any related tickets or PRs here. diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md new file mode 100644 index 0000000000..34cc2c9e4f --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -0,0 +1,20 @@ +--- +name: Feature request +about: Suggest a new feature for this project +title: '' +labels: new feature, needs triage +assignees: '' + +--- + +**Is your feature request related to a problem? Please describe.** +A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] + +**Describe the solution you'd like** +A clear and concise description of what you want to happen. + +**Describe alternatives or workarounds you've considered** +A clear and concise description of any alternative solutions or features you've considered. + +**Additional context** +Add any other context or screenshots about the feature request here. diff --git a/.github/disabled-workflows/pull_request_opened.yml b/.github/disabled-workflows/pull_request_opened.yml new file mode 100644 index 0000000000..0dc718c0b9 --- /dev/null +++ b/.github/disabled-workflows/pull_request_opened.yml @@ -0,0 +1,26 @@ +# This workflow runs whenever a new pull request is created +# TEMPORARILY DISABLED. Unfortunately this doesn't work for PRs created from forked repositories (which is how we tend to create PRs). 
+# There is no known workaround yet. See https://github.community/t/how-to-use-github-token-for-prs-from-forks/16818 +name: Pull Request opened + +# Only run for newly opened PRs against the "main" branch +on: + pull_request: + types: [opened] + branches: + - main + +jobs: + automation: + runs-on: ubuntu-latest + steps: + # Assign the PR to whomever created it. This is useful for visualizing assignments on project boards + # See https://github.com/marketplace/actions/pull-request-assigner + - name: Assign PR to creator + uses: thomaseizinger/assign-pr-creator-action@v1.0.0 + # Note, this authentication token is created automatically + # See: https://docs.github.com/en/actions/configuring-and-managing-workflows/authenticating-with-the-github_token + with: + repo-token: ${{ secrets.GITHUB_TOKEN }} + # Ignore errors. It is possible the PR was created by someone who cannot be assigned + continue-on-error: true diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index 3605531adb..6799d875f4 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -1,8 +1,7 @@ ## References -_Add references/links to any related tickets or PRs. These may include:_ -* Link to [JIRA](https://jira.lyrasis.org/projects/DS/summary) ticket(s), if any -* Link to [REST Contract](https://github.com/DSpace/Rest7Contract) or an open REST Contract PR, if any -* Link to [Angular issue or PR](https://github.com/DSpace/dspace-angular/issues) related to this PR, if any +_Add references/links to any related issues or PRs. These may include:_ +* Related to [REST Contract](https://github.com/DSpace/Rest7Contract) or an open REST Contract PR, if any +* Fixes [GitHub issue](https://github.com/DSpace/DSpace/issues), if any ## Description Short summary of changes (1-2 sentences). @@ -23,5 +22,5 @@ _This checklist provides a reminder of what we are going to look for when review - [ ] My PR passes Checkstyle validation based on the [Code Style Guide](https://wiki.lyrasis.org/display/DSPACE/Code+Style+Guide). - [ ] My PR includes Javadoc for _all new (or modified) public methods and classes_. It also includes Javadoc for large or complex private methods. - [ ] My PR passes all tests and includes new/updated Unit or Integration Tests based on the [Code Testing Guide](https://wiki.lyrasis.org/display/DSPACE/Code+Testing+Guide). -- [ ] If my PR includes new, third-party dependencies (in any `pom.xml`), I've made sure their licenses align with the [DSpace BSD License](https://github.com/DSpace/DSpace/blob/master/LICENSE) based on the [Licensing of Contributions](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines#CodeContributionGuidelines-LicensingofContributions) documentation. +- [ ] If my PR includes new, third-party dependencies (in any `pom.xml`), I've made sure their licenses align with the [DSpace BSD License](https://github.com/DSpace/DSpace/blob/main/LICENSE) based on the [Licensing of Contributions](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines#CodeContributionGuidelines-LicensingofContributions) documentation. - [ ] If my PR modifies the REST API, I've linked to the REST Contract page (or open PR) related to this change. 
diff --git a/.github/workflows/issue_opened.yml b/.github/workflows/issue_opened.yml new file mode 100644 index 0000000000..3ccdd22a0d --- /dev/null +++ b/.github/workflows/issue_opened.yml @@ -0,0 +1,29 @@ +# This workflow runs whenever a new issue is created +name: Issue opened + +on: + issues: + types: [opened] + +jobs: + automation: + runs-on: ubuntu-latest + steps: + # Add the new issue to a project board, if it needs triage + # See https://github.com/marketplace/actions/create-project-card-action + - name: Add issue to project board + # Only add to project board if issue is flagged as "needs triage" or has no labels + # NOTE: By default we flag new issues as "needs triage" in our issue template + if: (contains(github.event.issue.labels.*.name, 'needs triage') || join(github.event.issue.labels.*.name) == '') + uses: technote-space/create-project-card-action@v1 + # Note, the authentication token below is an ORG level Secret. + # It must be created/recreated manually via a personal access token with "public_repo" and "admin:org" permissions + # See: https://docs.github.com/en/actions/configuring-and-managing-workflows/authenticating-with-the-github_token#permissions-for-the-github_token + # This is necessary because the "DSpace Backlog" project is an org level project (i.e. not repo specific) + with: + GITHUB_TOKEN: ${{ secrets.ORG_PROJECT_TOKEN }} + PROJECT: DSpace Backlog + COLUMN: Triage + CHECK_ORG_PROJECT: true + # Ignore errors. + continue-on-error: true diff --git a/.github/workflows/label_merge_conflicts.yml b/.github/workflows/label_merge_conflicts.yml new file mode 100644 index 0000000000..dcbab18f1b --- /dev/null +++ b/.github/workflows/label_merge_conflicts.yml @@ -0,0 +1,25 @@ +# This workflow checks open PRs for merge conflicts and labels them when conflicts are found +name: Check for merge conflicts + +# Run whenever the "main" branch is updated +# NOTE: This means merge conflicts are only checked for when a PR is merged to main. +on: + push: + branches: + - main + +jobs: + triage: + runs-on: ubuntu-latest + steps: + # See: https://github.com/mschilde/auto-label-merge-conflicts/ + - name: Auto-label PRs with merge conflicts + uses: mschilde/auto-label-merge-conflicts@v2.0 + # Add "merge conflict" label if a merge conflict is detected. Remove it when resolved. 
+ # Note, the authentication token is created automatically + # See: https://docs.github.com/en/actions/configuring-and-managing-workflows/authenticating-with-the-github_token + with: + CONFLICT_LABEL_NAME: 'merge conflict' + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + # Ignore errors + continue-on-error: true diff --git a/.lgtm.yml b/.lgtm.yml new file mode 100644 index 0000000000..132de8a6de --- /dev/null +++ b/.lgtm.yml @@ -0,0 +1,9 @@ +# LGTM Settings (https://lgtm.com/) +# For reference, see https://lgtm.com/help/lgtm/lgtm.yml-configuration-file +# or template at https://lgtm.com/static/downloads/lgtm.template.yml + +extraction: + java: + index: + # Specify the Java version required to build the project + java_version: 11 diff --git a/.travis.yml b/.travis.yml index dfc4c31799..89cb443597 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,46 +1,55 @@ +# DSpace's Travis CI Configuration +# Builds: https://travis-ci.com/github/DSpace/DSpace +# Travis configuration guide/validation: https://config.travis-ci.com/explore language: java -sudo: false +# TODO: Upgrade to Bionic dist: trusty - -env: - # Give Maven 1GB of memory to work with - - MAVEN_OPTS=-Xmx1024M +os: linux jdk: # DS-3384 Oracle JDK has DocLint enabled by default. # Let's use this to catch any newly introduced DocLint issues. - oraclejdk11 -## Should we run into any problems with oraclejdk8 on Travis, we may try the following workaround. -## https://docs.travis-ci.com/user/languages/java#Testing-Against-Multiple-JDKs -## https://github.com/travis-ci/travis-ci/issues/3259#issuecomment-130860338 -#addons: -# apt: -# packages: -# - oracle-java8-installer +# Define global environment variables (shared across all jobs) +env: + global: + # Suppress all Maven "downloading" messages in Travis logs (see https://stackoverflow.com/a/35653426) + # This also slightly speeds builds in Travis, as there is less logging + - HIDE_MAVEN_DOWNLOADS="-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn" + # Give Maven 1GB of memory to work with + - MAVEN_OPTS="-Xmx1024M $HIDE_MAVEN_DOWNLOADS" + # Maven options which will skip ALL code validation checks. Includes skipping: + # - enforcer.skip => Skip maven-enforcer-plugin rules + # - checkstyle.skip => Skip all checkstyle checks by maven-checkstyle-plugin + # - license.skip => Skip all license header checks by license-maven-plugin + # - xml.skip => Skip all XML/XSLT validation by xml-maven-plugin + # (Useful for builds which don't need to repeat code checks) + - SKIP_CODE_CHECKS="-Denforcer.skip=true -Dcheckstyle.skip=true -Dlicense.skip=true -Dxml.skip=true" -before_install: - # Remove outdated settings.xml from Travis builds. Workaround for https://github.com/travis-ci/travis-ci/issues/4629 - - rm ~/.m2/settings.xml +# Create two jobs to run Unit & Integration tests in parallel. +# These jobs only differ in the TEST_FLAGS defined below, +# and otherwise share all the other configs in this file +jobs: + include: + - name: "Run Unit Tests & Check Code" + # NOTE: unit tests include deprecated REST API v6 (as it has unit tests) + env: TEST_FLAGS="-DskipUnitTests=false -Pdspace-rest" + - name: "Run Integration Tests" + # NOTE: skips code checks, as they are already done by Unit Test job + env: TEST_FLAGS="-DskipIntegrationTests=false $SKIP_CODE_CHECKS" -# Skip install stage, as we'll do it below -install: "echo 'Skipping install stage, dependencies will be downloaded during build and test stages.'" +# Skip 'install' process to save time. 
We build/install/test all at once in "script" below. +install: skip -# Build DSpace and run both Unit and Integration Tests -script: - # Summary of flags used (below): - # license:check => Validate all source code license headers - # -Dmaven.test.skip=false => Enable DSpace Unit Tests - # -DskipITs=false => Enable DSpace Integration Tests - # -Pdspace-rest => Enable optional dspace-rest module as part of build - # -P !assembly => Skip assembly of "dspace-installer" directory (as it can be memory intensive) - # -B => Maven batch/non-interactive mode (recommended for CI) - # -V => Display Maven version info before build - # -Dsurefire.rerunFailingTestsCount=2 => try again for flakey tests, and keep track of/report on number of retries - - "mvn clean install license:check -Dmaven.test.skip=false -DskipITs=false -Pdspace-rest -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2" +# Build DSpace and run configured tests (see 'jobs' above) +# Notes on flags used: +# -B => Maven batch/non-interactive mode (recommended for CI) +# -V => Display Maven version info before build +# -P-assembly => Disable build of dspace-installer in [src]/dspace/, as it can be memory intensive +# -Pcoverage-report => Enable aggregate code coverage report (across all modules) via JaCoCo +script: mvn install -B -V -P-assembly -Pcoverage-report $TEST_FLAGS -# After a successful build and test (see 'script'), send code coverage reports to coveralls.io -# These code coverage reports are generated by jacoco-maven-plugin (during test process above). -after_success: - # Run "verify", enabling the "coveralls" profile. This sends our reports to coveralls.io (see coveralls-maven-plugin) - - "cd dspace && mvn verify -P coveralls" +# After a successful build and test (see 'script'), send aggregate code coverage reports +# (generated by -Pcoverage-report above) to CodeCov.io +after_success: bash <(curl -s https://codecov.io/bash) diff --git a/Dockerfile b/Dockerfile index 006f32f28e..2dc3ee9bda 100644 --- a/Dockerfile +++ b/Dockerfile @@ -1,5 +1,5 @@ # This image will be published as dspace/dspace -# See https://github.com/DSpace/DSpace/tree/master/dspace/src/main/docker for usage details +# See https://github.com/DSpace/DSpace/tree/main/dspace/src/main/docker for usage details # # This version is JDK11 compatible # - tomcat:8-jdk11 diff --git a/Dockerfile.cli b/Dockerfile.cli index 116b251f2d..d4204ebdd0 100644 --- a/Dockerfile.cli +++ b/Dockerfile.cli @@ -1,5 +1,5 @@ # This image will be published as dspace/dspace-cli -# See https://github.com/DSpace/DSpace/tree/master/dspace/src/main/docker for usage details +# See https://github.com/DSpace/DSpace/tree/main/dspace/src/main/docker for usage details # # This version is JDK11 compatible # - openjdk:11 diff --git a/Dockerfile.test b/Dockerfile.test index 090f714e28..82ffdef177 100644 --- a/Dockerfile.test +++ b/Dockerfile.test @@ -1,5 +1,5 @@ # This image will be published as dspace/dspace -# See https://github.com/DSpace/DSpace/tree/master/dspace/src/main/docker for usage details +# See https://github.com/DSpace/DSpace/tree/main/dspace/src/main/docker for usage details # # This version is JDK11 compatible # - tomcat:8-jdk11 diff --git a/README.md b/README.md index 49f3814b49..2e6c0ad54e 100644 --- a/README.md +++ b/README.md @@ -1,24 +1,24 @@ # DSpace -[![Build Status](https://travis-ci.org/DSpace/DSpace.png?branch=master)](https://travis-ci.org/DSpace/DSpace) +[![Build Status](https://travis-ci.com/DSpace/DSpace.png?branch=main)](https://travis-ci.com/DSpace/DSpace) -[DSpace 
Documentation](https://wiki.duraspace.org/display/DSDOC/) | +[DSpace Documentation](https://wiki.lyrasis.org/display/DSDOC/) | [DSpace Releases](https://github.com/DSpace/DSpace/releases) | -[DSpace Wiki](https://wiki.duraspace.org/display/DSPACE/Home) | -[Support](https://wiki.duraspace.org/display/DSPACE/Support) +[DSpace Wiki](https://wiki.lyrasis.org/display/DSPACE/Home) | +[Support](https://wiki.lyrasis.org/display/DSPACE/Support) DSpace open source software is a turnkey repository application used by more than 2,000 organizations and institutions worldwide to provide durable access to digital resources. For more information, visit http://www.dspace.org/ *** -:warning: **Work on DSpace 7 has begun on our `master` branch.** This means that there is temporarily NO user interface on this `master` branch. DSpace 7 will feature a new, unified [Angular](https://angular.io/) user interface, along with an enhanced, rebuilt REST API. The latest status of this work can be found on the [DSpace 7 UI Working Group](https://wiki.duraspace.org/display/DSPACE/DSpace+7+UI+Working+Group) page. Additionally, the codebases can be found in the following places: - * DSpace 7 REST API work is occurring on the [`master` branch](https://github.com/DSpace/DSpace/tree/master/dspace-server-webapp) of this repository. - * The REST Contract is being documented at https://github.com/DSpace/Rest7Contract +:warning: **Work on DSpace 7 has begun on our `main` branch.** This means that there is NO user interface on this `main` branch. DSpace 7 will feature a new, unified [Angular](https://angular.io/) user interface, along with an enhanced, rebuilt REST API. The latest status of this work can be found on the [DSpace 7 Working Group](https://wiki.lyrasis.org/display/DSPACE/DSpace+7+Working+Group) page. Additionally, the codebases can be found in the following places: + * DSpace 7 REST API work is occurring on the [`main` branch](https://github.com/DSpace/DSpace/tree/main/dspace-server-webapp) of this repository. + * The REST Contract is at https://github.com/DSpace/Rest7Contract * DSpace 7 Angular UI work is occurring at https://github.com/DSpace/dspace-angular -**If you would like to get involved in our DSpace 7 development effort, we welcome new contributors.** Just join one of our meetings or get in touch via Slack. See the [DSpace 7 UI Working Group](https://wiki.duraspace.org/display/DSPACE/DSpace+7+UI+Working+Group) wiki page for more info. +**If you would like to get involved in our DSpace 7 development effort, we welcome new contributors.** Just join one of our meetings or get in touch via Slack. See the [DSpace 7 Working Group](https://wiki.lyrasis.org/display/DSPACE/DSpace+7+Working+Group) wiki page for more info. **If you are looking for the ongoing maintenance work for DSpace 6 (or prior releases)**, you can find that work on the corresponding maintenance branch (e.g. [`dspace-6_x`](https://github.com/DSpace/DSpace/tree/dspace-6_x)) in this repository. *** @@ -31,10 +31,10 @@ Past releases are all available via GitHub at https://github.com/DSpace/DSpace/r ## Documentation / Installation -Documentation for each release may be viewed online or downloaded via our [Documentation Wiki](https://wiki.duraspace.org/display/DSDOC/). +Documentation for each release may be viewed online or downloaded via our [Documentation Wiki](https://wiki.lyrasis.org/display/DSDOC/). 
The latest DSpace Installation instructions are available at: -https://wiki.duraspace.org/display/DSDOC6x/Installing+DSpace +https://wiki.lyrasis.org/display/DSDOC6x/Installing+DSpace Please be aware that, as a Java web application, DSpace requires a database (PostgreSQL or Oracle) and a servlet container (usually Tomcat) in order to function. @@ -49,14 +49,14 @@ DSpace is a community built and supported project. We do not have a centralized but have a dedicated group of volunteers who help us improve the software, documentation, resources, etc. We welcome contributions of any type. Here's a few basic guides that provide suggestions for contributing to DSpace: -* [How to Contribute to DSpace](https://wiki.duraspace.org/display/DSPACE/How+to+Contribute+to+DSpace): How to contribute in general (via code, documentation, bug reports, expertise, etc) -* [Code Contribution Guidelines](https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines): How to give back code or contribute features, bug fixes, etc. -* [DSpace Community Advisory Team (DCAT)](https://wiki.duraspace.org/display/cmtygp/DSpace+Community+Advisory+Team): If you are not a developer, we also have an interest group specifically for repository managers. The DCAT group meets virtually, once a month, and sends open invitations to join their meetings via the [DCAT mailing list](https://groups.google.com/d/forum/DSpaceCommunityAdvisoryTeam). +* [How to Contribute to DSpace](https://wiki.lyrasis.org/display/DSPACE/How+to+Contribute+to+DSpace): How to contribute in general (via code, documentation, bug reports, expertise, etc) +* [Code Contribution Guidelines](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines): How to give back code or contribute features, bug fixes, etc. +* [DSpace Community Advisory Team (DCAT)](https://wiki.lyrasis.org/display/cmtygp/DSpace+Community+Advisory+Team): If you are not a developer, we also have an interest group specifically for repository managers. The DCAT group meets virtually, once a month, and sends open invitations to join their meetings via the [DCAT mailing list](https://groups.google.com/d/forum/DSpaceCommunityAdvisoryTeam). -We also encourage GitHub Pull Requests (PRs) at any time. Please see our [Development with Git](https://wiki.duraspace.org/display/DSPACE/Development+with+Git) guide for more info. +We also encourage GitHub Pull Requests (PRs) at any time. Please see our [Development with Git](https://wiki.lyrasis.org/display/DSPACE/Development+with+Git) guide for more info. In addition, a listing of all known contributors to DSpace software can be -found online at: https://wiki.duraspace.org/display/DSPACE/DSpaceContributors +found online at: https://wiki.lyrasis.org/display/DSPACE/DSpaceContributors ## Getting Help @@ -64,12 +64,12 @@ DSpace provides public mailing lists where you can post questions or raise topic We welcome everyone to participate in these lists: * [dspace-community@googlegroups.com](https://groups.google.com/d/forum/dspace-community) : General discussion about DSpace platform, announcements, sharing of best practices -* [dspace-tech@googlegroups.com](https://groups.google.com/d/forum/dspace-tech) : Technical support mailing list. See also our guide for [How to troubleshoot an error](https://wiki.duraspace.org/display/DSPACE/Troubleshoot+an+error). +* [dspace-tech@googlegroups.com](https://groups.google.com/d/forum/dspace-tech) : Technical support mailing list. 
See also our guide for [How to troubleshoot an error](https://wiki.lyrasis.org/display/DSPACE/Troubleshoot+an+error). * [dspace-devel@googlegroups.com](https://groups.google.com/d/forum/dspace-devel) : Developers / Development mailing list Great Q&A is also available under the [DSpace tag on Stackoverflow](http://stackoverflow.com/questions/tagged/dspace) -Additional support options are listed at https://wiki.duraspace.org/display/DSPACE/Support +Additional support options are at https://wiki.lyrasis.org/display/DSPACE/Support DSpace also has an active service provider network. If you'd rather hire a service provider to install, upgrade, customize or host DSpace, then we recommend getting in touch with one of our @@ -77,44 +77,46 @@ install, upgrade, customize or host DSpace, then we recommend getting in touch w ## Issue Tracker -The DSpace Issue Tracker can be found at: https://jira.duraspace.org/projects/DS/summary +DSpace uses GitHub to track issues: +* Backend (REST API) issues: https://github.com/DSpace/DSpace/issues +* Frontend (User Interface) issues: https://github.com/DSpace/dspace-angular/issues ## Testing ### Running Tests By default, in DSpace, Unit Tests and Integration Tests are disabled. However, they are -run automatically by [Travis CI](https://travis-ci.org/DSpace/DSpace/) for all Pull Requests and code commits. +run automatically by [Travis CI](https://travis-ci.com/DSpace/DSpace/) for all Pull Requests and code commits. * How to run both Unit Tests (via `maven-surefire-plugin`) and Integration Tests (via `maven-failsafe-plugin`): ``` - mvn clean test -Dmaven.test.skip=false -DskipITs=false + mvn install -DskipUnitTests=false -DskipIntegrationTests=false ``` -* How to run just Unit Tests: +* How to run _only_ Unit Tests: ``` - mvn test -Dmaven.test.skip=false + mvn test -DskipUnitTests=false ``` * How to run a *single* Unit Test ``` # Run all tests in a specific test class # NOTE: failIfNoTests=false is required to skip tests in other modules - mvn test -Dmaven.test.skip=false -Dtest=[full.package.testClassName] -DfailIfNoTests=false + mvn test -DskipUnitTests=false -Dtest=[full.package.testClassName] -DfailIfNoTests=false # Run one test method in a specific test class - mvn test -Dmaven.test.skip=false -Dtest=[full.package.testClassName]#[testMethodName] -DfailIfNoTests=false + mvn test -DskipUnitTests=false -Dtest=[full.package.testClassName]#[testMethodName] -DfailIfNoTests=false ``` -* How to run Integration Tests (requires enabling Unit tests too) +* How to run _only_ Integration Tests ``` - mvn verify -Dmaven.test.skip=false -DskipITs=false + mvn install -DskipIntegrationTests=false ``` -* How to run a *single* Integration Test (requires enabling Unit tests too) +* How to run a *single* Integration Test ``` # Run all integration tests in a specific test class # NOTE: failIfNoTests=false is required to skip tests in other modules - mvn test -Dmaven.test.skip=false -DskipITs=false -Dtest=[full.package.testClassName] -DfailIfNoTests=false + mvn install -DskipIntegrationTests=false -Dit.test=[full.package.testClassName] -DfailIfNoTests=false # Run one test method in a specific test class - mvn test -Dmaven.test.skip=false -DskipITs=false -Dtest=[full.package.testClassName]#[testMethodName] -DfailIfNoTests=false + mvn install -DskipIntegrationTests=false -Dit.test=[full.package.testClassName]#[testMethodName] -DfailIfNoTests=false ``` * How to run only tests of a specific DSpace module ``` @@ -130,4 +132,4 @@ run automatically by [Travis 
CI](https://travis-ci.org/DSpace/DSpace/) for all P ## License DSpace source code is freely available under a standard [BSD 3-Clause license](https://opensource.org/licenses/BSD-3-Clause). -The full license is available at http://www.dspace.org/license/ +The full license is available in the [LICENSE](LICENSE) file or online at http://www.dspace.org/license/ diff --git a/dspace-api/pom.xml b/dspace-api/pom.xml index a0714c04ee..ced0f562bf 100644 --- a/dspace-api/pom.xml +++ b/dspace-api/pom.xml @@ -12,7 +12,7 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. @@ -127,44 +127,82 @@ + + + org.codehaus.gmaven + groovy-maven-plugin + + + setproperty + initialize + + execute + + + + project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); + log.info("Initializing Maven property 'agnostic.build.dir' to: {}", project.properties['agnostic.build.dir']); + + + + + + + + com.mycila + license-maven-plugin + + + src/test/resources/** + src/test/data/** + + src/main/resources/org/dspace/storage/rdbms/flywayupgrade/** + + + + - findbugs + spotbugs false - - org.codehaus.mojo - findbugs-maven-plugin + com.github.spotbugs + spotbugs-maven-plugin - + - test-environment + unit-test-environment false - maven.test.skip + skipUnitTests false - @@ -184,53 +222,16 @@ - setupTestEnvironment + setupUnitTestEnvironment generate-test-resources unpack - - setupIntegrationTestEnvironment - pre-integration-test - - unpack - - - - - org.codehaus.gmaven - groovy-maven-plugin - - - setproperty - initialize - - execute - - - - project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); - log.info("Initializing Maven property 'agnostic.build.dir' to: {}", project.properties['agnostic.build.dir']); - - - - - - - + maven-surefire-plugin @@ -241,11 +242,56 @@ ${agnostic.build.dir}/testing/dspace/ true + ${agnostic.build.dir}/testing/dspace/solr/ + + + - + + + integration-test-environment + + false + + skipIntegrationTests + false + + + + + + + maven-dependency-plugin + + ${project.build.directory}/testing + + + org.dspace + dspace-parent + ${project.version} + zip + testEnvironment + + + + + + setupIntegrationTestEnvironment + pre-integration-test + + unpack + + + + + + maven-failsafe-plugin @@ -255,12 +301,12 @@ ${agnostic.build.dir}/testing/dspace/ true + ${agnostic.build.dir}/testing/dspace/solr/ - @@ -291,9 +337,20 @@ - org.dspace + net.handle handle + + net.cnri + cnri-servlet-container + + + + org.ow2.asm + asm-commons + + + org.eclipse.jetty @@ -312,6 +369,18 @@ apache-jena-libs pom + + + + org.glassfish.jersey.inject + jersey-hk2 + ${jersey.version} + + + + commons-cli + commons-cli + commons-codec commons-codec @@ -468,16 +537,164 @@ org.apache.solr - solr-cell + solr-solrj + ${solr.client.version} + + + + + org.apache.solr + solr-core + test ${solr.client.version} - + + commons-cli + commons-cli + + + org.eclipse.jetty + jetty-continuation + + + org.eclipse.jetty + jetty-deploy + + + org.eclipse.jetty + jetty-http + + + org.eclipse.jetty + jetty-io + + + org.eclipse.jetty + jetty-jmx + + + org.eclipse.jetty + jetty-rewrite + + + org.eclipse.jetty + jetty-security + + + org.eclipse.jetty + jetty-server + + + org.eclipse.jetty + jetty-servlet + + + org.eclipse.jetty + jetty-servlets + + + org.eclipse.jetty + jetty-util + + + org.eclipse.jetty + jetty-webapp + + + org.eclipse.jetty + jetty-xml + + + + + org.apache.solr + solr-cell + + + + commons-cli + commons-cli + org.ow2.asm asm-commons + + org.bouncycastle + 
bcpkix-jdk15on + + + org.bouncycastle + bcprov-jdk15on + + + org.eclipse.jetty + jetty-xml + + + org.eclipse.jetty + jetty-http + + + org.eclipse.jetty + jetty-servlet + + + org.eclipse.jetty + jetty-webapp + + + org.eclipse.jetty + jetty-util + + + org.eclipse.jetty + jetty-deploy + + + org.eclipse.jetty + jetty-continuation + + + org.eclipse.jetty + jetty-servlets + + + org.eclipse.jetty + jetty-io + + + org.eclipse.jetty + jetty-security + + + org.apache.lucene + lucene-core + + + + org.apache.lucene + lucene-analyzers-icu + test + + + org.apache.lucene + lucene-analyzers-smartcn + test + + + org.apache.lucene + lucene-analyzers-stempel + test + + + org.apache.xmlbeans + xmlbeans + 2.6.0 + com.maxmind.geoip2 @@ -535,7 +752,7 @@ org.flywaydb flyway-core - 4.0.3 + 6.5.5 @@ -559,6 +776,7 @@ com.google.oauth-client google-oauth-client + com.google.code.findbugs @@ -568,6 +786,7 @@ com.google.code.findbugs annotations + joda-time joda-time @@ -658,7 +877,7 @@ org.xmlunit - xmlunit-matchers + xmlunit-core 2.6.3 test diff --git a/dspace-api/src/main/java/org/dspace/administer/CreateAdministrator.java b/dspace-api/src/main/java/org/dspace/administer/CreateAdministrator.java index a58691e251..983038c812 100644 --- a/dspace-api/src/main/java/org/dspace/administer/CreateAdministrator.java +++ b/dspace-api/src/main/java/org/dspace/administer/CreateAdministrator.java @@ -115,7 +115,7 @@ public final class CreateAdministrator { String lastName = null; char[] password1 = null; char[] password2 = null; - String language = I18nUtil.DEFAULTLOCALE.getLanguage(); + String language = I18nUtil.getDefaultLocale().getLanguage(); while (!dataOK) { System.out.print("E-mail address: "); diff --git a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExport.java b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExport.java index 2e4f333820..3332440f06 100644 --- a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExport.java +++ b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExport.java @@ -10,10 +10,14 @@ package org.dspace.app.bulkedit; import java.sql.SQLException; import org.apache.commons.cli.ParseException; +import org.apache.commons.lang3.StringUtils; +import org.dspace.content.DSpaceObject; +import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.MetadataDSpaceCsvExportService; import org.dspace.core.Context; import org.dspace.eperson.factory.EPersonServiceFactory; import org.dspace.eperson.service.EPersonService; +import org.dspace.handle.factory.HandleServiceFactory; import org.dspace.scripts.DSpaceRunnable; import org.dspace.utils.DSpace; @@ -41,8 +45,7 @@ public class MetadataExport extends DSpaceRunnable { + + + @Override + public Options getOptions() { + Options options = super.getOptions(); + options.addOption("f", "file", true, "destination where you want file written"); + options.getOption("f").setType(OutputStream .class); + options.getOption("f").setRequired(true); + super.options = options; + return options; + } +} diff --git a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExportScriptConfiguration.java b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExportScriptConfiguration.java index 65c0ddd8cf..0c513c4667 100644 --- a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExportScriptConfiguration.java +++ b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataExportScriptConfiguration.java @@ -7,7 +7,6 @@ */ package org.dspace.app.bulkedit; -import java.io.OutputStream; import 
java.sql.SQLException; import org.apache.commons.cli.Options; @@ -56,9 +55,6 @@ public class MetadataExportScriptConfiguration extends options.addOption("i", "id", true, "ID or handle of thing to export (item, collection, or community)"); options.getOption("i").setType(String.class); - options.addOption("f", "file", true, "destination where you want file written"); - options.getOption("f").setType(OutputStream.class); - options.getOption("f").setRequired(true); options.addOption("a", "all", false, "include all metadata fields that are not normally changed (e.g. provenance)"); options.getOption("a").setType(boolean.class); diff --git a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImport.java b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImport.java index eb0a4e2935..67086c1536 100644 --- a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImport.java +++ b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImport.java @@ -182,24 +182,7 @@ public class MetadataImport extends DSpaceRunnable { + + @Override + public Options getOptions() { + Options options = super.getOptions(); + options.addOption("e", "email", true, "email address or user id of user (required if adding new items)"); + options.getOption("e").setType(String.class); + options.getOption("e").setRequired(true); + super.options = options; + return options; + } } diff --git a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImportScriptConfiguration.java b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImportScriptConfiguration.java index 9ea50b7de5..07e6a9aec9 100644 --- a/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImportScriptConfiguration.java +++ b/dspace-api/src/main/java/org/dspace/app/bulkedit/MetadataImportScriptConfiguration.java @@ -57,9 +57,6 @@ public class MetadataImportScriptConfiguration extends options.addOption("f", "file", true, "source file"); options.getOption("f").setType(InputStream.class); options.getOption("f").setRequired(true); - options.addOption("e", "email", true, "email address or user id of user (required if adding new items)"); - options.getOption("e").setType(String.class); - options.getOption("e").setRequired(true); options.addOption("s", "silent", false, "silent operation - doesn't request confirmation of changes USE WITH CAUTION"); options.getOption("s").setType(boolean.class); diff --git a/dspace-api/src/main/java/org/dspace/app/itemimport/ItemImportServiceImpl.java b/dspace-api/src/main/java/org/dspace/app/itemimport/ItemImportServiceImpl.java index 12fcd84d04..13aa236f54 100644 --- a/dspace-api/src/main/java/org/dspace/app/itemimport/ItemImportServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/app/itemimport/ItemImportServiceImpl.java @@ -1519,6 +1519,12 @@ public class ItemImportServiceImpl implements ItemImportService, InitializingBea if (!dir.exists() && !dir.mkdirs()) { log.error("Unable to create directory: " + dir.getAbsolutePath()); } + // Verify that the directory the entry is using is a subpath of zipDir (and not somewhere else!) + if (!dir.toPath().normalize().startsWith(zipDir)) { + throw new IOException("Bad zip entry: '" + entry.getName() + + "' in file '" + zipfile.getAbsolutePath() + "'!" + + " Cannot process this file."); + } //Entries could have too many directories, and we need to adjust the sourcedir // file1.zip (SimpleArchiveFormat / item1 / contents|dublin_core|... 
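Note on the "Bad zip entry" checks added to ItemImportServiceImpl (in the hunk above and the one that follows): they defend against "zip slip" path traversal by normalizing each zip entry's target path and requiring it to stay inside the extraction directory. A minimal, self-contained sketch of that guard is shown below; the names and paths (destDir, entryName, /tmp/unzip) are illustrative only, not DSpace's actual fields.

```java
// Standalone sketch (not DSpace code) of the zip-slip guard used in the patch above.
// Names such as destDir and entryName are hypothetical.
import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ZipSlipGuardSketch {

    /**
     * Resolve a zip entry name against the extraction directory and reject any
     * entry whose normalized path would escape that directory.
     */
    static Path resolveSafely(Path destDir, String entryName) throws IOException {
        Path target = destDir.resolve(entryName).normalize();
        if (!target.startsWith(destDir.normalize())) {
            throw new IOException("Bad zip entry: '" + entryName + "' escapes " + destDir);
        }
        return target;
    }

    public static void main(String[] args) throws IOException {
        Path dest = Paths.get("/tmp/unzip");
        // A well-behaved entry stays inside the extraction directory.
        System.out.println(resolveSafely(dest, "item1/contents"));
        // A malicious entry with ".." segments is rejected.
        try {
            resolveSafely(dest, "../../etc/passwd");
        } catch (IOException expected) {
            System.out.println("Rejected: " + expected.getMessage());
        }
    }
}
```

Normalizing before the startsWith comparison is what defeats entries such as ../../etc/passwd, since the relative segments are collapsed before the containment check.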
@@ -1539,9 +1545,16 @@ public class ItemImportServiceImpl implements ItemImportService, InitializingBea } byte[] buffer = new byte[1024]; int len; + File outFile = new File(zipDir + entry.getName()); + // Verify that this file will be created in our zipDir (and not somewhere else!) + if (!outFile.toPath().normalize().startsWith(zipDir)) { + throw new IOException("Bad zip entry: '" + entry.getName() + + "' in file '" + zipfile.getAbsolutePath() + "'!" + + " Cannot process this file."); + } InputStream in = zf.getInputStream(entry); BufferedOutputStream out = new BufferedOutputStream( - new FileOutputStream(zipDir + entry.getName())); + new FileOutputStream(outFile)); while ((len = in.read(buffer)) >= 0) { out.write(buffer, 0, len); } diff --git a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemAuthorExtractor.java b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemAuthorExtractor.java index bba0913193..9b66030e90 100644 --- a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemAuthorExtractor.java +++ b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemAuthorExtractor.java @@ -19,6 +19,15 @@ import org.dspace.core.Context; * @author Andrea Bollini */ public interface RequestItemAuthorExtractor { - public RequestItemAuthor getRequestItemAuthor(Context context, Item item) - throws SQLException; + + /** + * Retrieve the author to contact for a requested copy of the given item. + * + * @param context DSpace context object + * @param item item to request + * @return An object containing the name and email address to send the request to, + * or null if no valid email address was found. + * @throws SQLException if database error + */ + public RequestItemAuthor getRequestItemAuthor(Context context, Item item) throws SQLException; } diff --git a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemHelpdeskStrategy.java b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemHelpdeskStrategy.java index a5f7341039..5d22efaa7a 100644 --- a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemHelpdeskStrategy.java +++ b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemHelpdeskStrategy.java @@ -74,8 +74,8 @@ public class RequestItemHelpdeskStrategy extends RequestItemSubmitterStrategy { return new RequestItemAuthor(helpdeskEPerson); } else { String helpdeskName = I18nUtil.getMessage( - "org.dspace.app.requestitem.RequestItemHelpdeskStrategy.helpdeskname", - context); + "org.dspace.app.requestitem.RequestItemHelpdeskStrategy.helpdeskname", + context); return new RequestItemAuthor(helpdeskName, helpDeskEmail); } } diff --git a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemMetadataStrategy.java b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemMetadataStrategy.java index 4d2f78408a..9838e58697 100644 --- a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemMetadataStrategy.java +++ b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemMetadataStrategy.java @@ -16,6 +16,7 @@ import org.dspace.content.MetadataValue; import org.dspace.content.service.ItemService; import org.dspace.core.Context; import org.dspace.core.I18nUtil; +import org.dspace.services.factory.DSpaceServicesFactory; import org.springframework.beans.factory.annotation.Autowired; /** @@ -38,6 +39,7 @@ public class RequestItemMetadataStrategy extends RequestItemSubmitterStrategy { @Override public RequestItemAuthor getRequestItemAuthor(Context context, Item item) throws SQLException { + RequestItemAuthor 
author = null; if (emailMetadata != null) { List vals = itemService.getMetadataByMetadataString(item, emailMetadata); if (vals.size() > 0) { @@ -49,19 +51,38 @@ public class RequestItemMetadataStrategy extends RequestItemSubmitterStrategy { fullname = nameVals.iterator().next().getValue(); } } - if (StringUtils.isBlank(fullname)) { fullname = I18nUtil - .getMessage( - "org.dspace.app.requestitem.RequestItemMetadataStrategy.unnamed", - context); + .getMessage( + "org.dspace.app.requestitem.RequestItemMetadataStrategy.unnamed", + context); } - RequestItemAuthor author = new RequestItemAuthor( - fullname, email); + author = new RequestItemAuthor(fullname, email); return author; } + } else { + // Uses the basic strategy to look for the original submitter + author = super.getRequestItemAuthor(context, item); + // Is the author or his email null, so get the help desk or admin name and email + if (null == author || null == author.getEmail()) { + String email = null; + String name = null; + //First get help desk name and email + email = DSpaceServicesFactory.getInstance() + .getConfigurationService().getProperty("mail.helpdesk"); + name = DSpaceServicesFactory.getInstance() + .getConfigurationService().getProperty("mail.helpdesk.name"); + // If help desk mail is null get the mail and name of admin + if (email == null) { + email = DSpaceServicesFactory.getInstance() + .getConfigurationService().getProperty("mail.admin"); + name = DSpaceServicesFactory.getInstance() + .getConfigurationService().getProperty("mail.admin.name"); + } + author = new RequestItemAuthor(name, email); + } } - return super.getRequestItemAuthor(context, item); + return author; } public void setEmailMetadata(String emailMetadata) { diff --git a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemSubmitterStrategy.java b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemSubmitterStrategy.java index 8ed6238a8c..2708c24ba9 100644 --- a/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemSubmitterStrategy.java +++ b/dspace-api/src/main/java/org/dspace/app/requestitem/RequestItemSubmitterStrategy.java @@ -23,13 +23,22 @@ public class RequestItemSubmitterStrategy implements RequestItemAuthorExtractor public RequestItemSubmitterStrategy() { } + /** + * Returns the submitter of an Item as RequestItemAuthor or null if the + * Submitter is deleted. + * + * @return The submitter of the item or null if the submitter is deleted + * @throws SQLException if database error + */ @Override public RequestItemAuthor getRequestItemAuthor(Context context, Item item) throws SQLException { EPerson submitter = item.getSubmitter(); - RequestItemAuthor author = new RequestItemAuthor( - submitter.getFullName(), submitter.getEmail()); + RequestItemAuthor author = null; + if (null != submitter) { + author = new RequestItemAuthor( + submitter.getFullName(), submitter.getEmail()); + } return author; } - } diff --git a/dspace-api/src/main/java/org/dspace/app/sherpa/SHERPAResponse.java b/dspace-api/src/main/java/org/dspace/app/sherpa/SHERPAResponse.java index c5b8bbebf3..bd2909c0c1 100644 --- a/dspace-api/src/main/java/org/dspace/app/sherpa/SHERPAResponse.java +++ b/dspace-api/src/main/java/org/dspace/app/sherpa/SHERPAResponse.java @@ -48,6 +48,9 @@ public class SHERPAResponse { factory.setValidating(false); factory.setIgnoringComments(true); factory.setIgnoringElementContentWhitespace(true); + // disallow DTD parsing to ensure no XXE attacks can occur. 
+ // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); DocumentBuilder db = factory.newDocumentBuilder(); Document inDoc = db.parse(xmlData); diff --git a/dspace-api/src/main/java/org/dspace/app/sitemap/GenerateSitemaps.java b/dspace-api/src/main/java/org/dspace/app/sitemap/GenerateSitemaps.java index bb35cd3ff9..e2743951e7 100644 --- a/dspace-api/src/main/java/org/dspace/app/sitemap/GenerateSitemaps.java +++ b/dspace-api/src/main/java/org/dspace/app/sitemap/GenerateSitemaps.java @@ -27,6 +27,7 @@ import org.apache.commons.cli.HelpFormatter; import org.apache.commons.cli.Options; import org.apache.commons.cli.ParseException; import org.apache.commons.cli.PosixParser; +import org.apache.commons.io.FileUtils; import org.apache.commons.lang3.ArrayUtils; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; @@ -84,6 +85,9 @@ public class GenerateSitemaps { options .addOption("p", "ping", true, "ping specified search engine URL"); + options + .addOption("d", "delete", false, + "delete sitemaps dir and its contents"); CommandLine line = null; @@ -105,10 +109,9 @@ public class GenerateSitemaps { } /* - * Sanity check -- if no sitemap generation or pinging to do, print - * usage + * Sanity check -- if there is no sitemap generation, deletion, or pinging to do, print usage */ - if (line.getArgs().length != 0 || line.hasOption('b') + if (line.getArgs().length != 0 || line.hasOption('d') || line.hasOption('b') && line.hasOption('s') && !line.hasOption('g') && !line.hasOption('m') && !line.hasOption('y') && !line.hasOption('p')) { @@ -123,6 +126,10 @@ public class GenerateSitemaps { generateSitemaps(!line.hasOption('b'), !line.hasOption('s')); } + if (line.hasOption('d')) { + deleteSitemaps(); + } + if (line.hasOption('a')) { pingConfiguredSearchEngines(); } @@ -140,6 +147,29 @@ public class GenerateSitemaps { System.exit(0); } + /** + * Runs generate-sitemaps without any params for the scheduler (task-scheduler.xml). + * + * @throws SQLException if a database error occurs. + * @throws IOException if IO error occurs. + */ + public static void generateSitemapsScheduled() throws IOException, SQLException { + generateSitemaps(true, true); + } + + /** + * Delete the sitemaps directory and its contents if it exists + * @throws IOException if IO error occurs + */ + public static void deleteSitemaps() throws IOException { + File outputDir = new File(configurationService.getProperty("sitemap.dir")); + if (!outputDir.exists() || !outputDir.isDirectory()) { + log.error("Unable to delete sitemaps directory, doesn't exist or isn't a directory"); + } else { + FileUtils.deleteDirectory(outputDir); + } + } + /** * Generate sitemap.org protocol and/or basic HTML sitemaps. * @@ -150,14 +180,9 @@ public class GenerateSitemaps { * @throws IOException if IO error * if IO error occurs. 
*/ - public static void generateSitemaps(boolean makeHTMLMap, - boolean makeSitemapOrg) throws SQLException, IOException { - String sitemapStem = configurationService.getProperty("dspace.ui.url") - + "/sitemap"; - String htmlMapStem = configurationService.getProperty("dspace.ui.url") - + "/htmlmap"; - String handleURLStem = configurationService.getProperty("dspace.ui.url") - + "/handle/"; + public static void generateSitemaps(boolean makeHTMLMap, boolean makeSitemapOrg) throws SQLException, IOException { + String uiURLStem = configurationService.getProperty("dspace.ui.url"); + String sitemapStem = uiURLStem + "/sitemap"; File outputDir = new File(configurationService.getProperty("sitemap.dir")); if (!outputDir.exists() && !outputDir.mkdir()) { @@ -168,13 +193,11 @@ public class GenerateSitemaps { AbstractGenerator sitemapsOrg = null; if (makeHTMLMap) { - html = new HTMLSitemapGenerator(outputDir, htmlMapStem + "?map=", - null); + html = new HTMLSitemapGenerator(outputDir, sitemapStem, ".html"); } if (makeSitemapOrg) { - sitemapsOrg = new SitemapsOrgGenerator(outputDir, sitemapStem - + "?map=", null); + sitemapsOrg = new SitemapsOrgGenerator(outputDir, sitemapStem, ".xml"); } Context c = new Context(Context.Mode.READ_ONLY); @@ -182,7 +205,7 @@ public class GenerateSitemaps { List comms = communityService.findAll(c); for (Community comm : comms) { - String url = handleURLStem + comm.getHandle(); + String url = uiURLStem + "/communities/" + comm.getID(); if (makeHTMLMap) { html.addURL(url, null); @@ -197,7 +220,7 @@ public class GenerateSitemaps { List colls = collectionService.findAll(c); for (Collection coll : colls) { - String url = handleURLStem + coll.getHandle(); + String url = uiURLStem + "/collections/" + coll.getID(); if (makeHTMLMap) { html.addURL(url, null); @@ -214,7 +237,7 @@ public class GenerateSitemaps { while (allItems.hasNext()) { Item i = allItems.next(); - String url = handleURLStem + i.getHandle(); + String url = uiURLStem + "/items/" + i.getID(); Date lastMod = i.getLastModified(); if (makeHTMLMap) { diff --git a/dspace-api/src/main/java/org/dspace/app/sitemap/SitemapsOrgGenerator.java b/dspace-api/src/main/java/org/dspace/app/sitemap/SitemapsOrgGenerator.java index 9a0d5a6ba4..3ec4ca8239 100644 --- a/dspace-api/src/main/java/org/dspace/app/sitemap/SitemapsOrgGenerator.java +++ b/dspace-api/src/main/java/org/dspace/app/sitemap/SitemapsOrgGenerator.java @@ -59,7 +59,7 @@ public class SitemapsOrgGenerator extends AbstractGenerator { @Override public String getFilename(int number) { - return "sitemap" + number + ".xml.gz"; + return "sitemap" + number + ".xml"; } @Override @@ -100,12 +100,12 @@ public class SitemapsOrgGenerator extends AbstractGenerator { @Override public boolean useCompression() { - return true; + return false; } @Override public String getIndexFilename() { - return "sitemap_index.xml.gz"; + return "sitemap_index.xml"; } @Override diff --git a/dspace-api/src/main/java/org/dspace/app/util/AuthorizeUtil.java b/dspace-api/src/main/java/org/dspace/app/util/AuthorizeUtil.java index ea1fb87ff4..efd813d29b 100644 --- a/dspace-api/src/main/java/org/dspace/app/util/AuthorizeUtil.java +++ b/dspace-api/src/main/java/org/dspace/app/util/AuthorizeUtil.java @@ -9,7 +9,10 @@ package org.dspace.app.util; import java.sql.SQLException; import java.util.List; +import javax.servlet.http.HttpServletRequest; +import org.apache.logging.log4j.Logger; +import org.dspace.authenticate.factory.AuthenticateServiceFactory; import org.dspace.authorize.AuthorizeConfiguration; import 
org.dspace.authorize.AuthorizeException; import org.dspace.authorize.ResourcePolicy; @@ -26,9 +29,12 @@ import org.dspace.content.service.CollectionService; import org.dspace.content.service.ItemService; import org.dspace.core.Constants; import org.dspace.core.Context; +import org.dspace.eperson.EPerson; import org.dspace.eperson.Group; import org.dspace.eperson.factory.EPersonServiceFactory; import org.dspace.eperson.service.GroupService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.utils.DSpace; import org.dspace.xmlworkflow.factory.XmlWorkflowServiceFactory; import org.dspace.xmlworkflow.storedcomponents.CollectionRole; import org.dspace.xmlworkflow.storedcomponents.service.CollectionRoleService; @@ -41,6 +47,7 @@ import org.dspace.xmlworkflow.storedcomponents.service.CollectionRoleService; */ public class AuthorizeUtil { + private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(AuthorizeUtil.class); /** * Default constructor */ @@ -605,9 +612,53 @@ public class AuthorizeUtil { throw new AuthorizeException("not authorized to manage this group"); } + /** + * This method will return a boolean indicating whether the current user is allowed to register a new + * account or not + * @param context The relevant DSpace context + * @param request The current request + * @return A boolean indicating whether the current user can register a new account or not + * @throws SQLException If something goes wrong + */ + public static boolean authorizeNewAccountRegistration(Context context, HttpServletRequest request) + throws SQLException { + if (DSpaceServicesFactory.getInstance().getConfigurationService() + .getBooleanProperty("user.registration", true)) { + // This allowSetPassword is currently the only method that would return true only when it's + // actually expected to be returning true. 
+ // For example the LDAP canSelfRegister will return true due to auto-register, while that + // does not imply a new user can register explicitly + return AuthenticateServiceFactory.getInstance().getAuthenticationService() + .allowSetPassword(context, request, null); + } + return false; + } + + /** + * This method will return a boolean indicating whether it's allowed to update the password for the EPerson + * with the given email and canLogin property + * @param context The relevant DSpace context + * @param email The email to be checked + * @return A boolean indicating if the password can be updated or not + */ + public static boolean authorizeUpdatePassword(Context context, String email) { + try { + EPerson eperson = EPersonServiceFactory.getInstance().getEPersonService().findByEmail(context, email); + if (eperson != null && eperson.canLogIn()) { + HttpServletRequest request = new DSpace().getRequestService().getCurrentRequest() + .getHttpServletRequest(); + return AuthenticateServiceFactory.getInstance().getAuthenticationService() + .allowSetPassword(context, request, null); + } + } catch (SQLException e) { + log.error("Something went wrong trying to retrieve EPerson for email: " + email, e); + } + return false; + } + /** * This method checks if the community Admin can manage accounts - * + * * @return true if is able */ public static boolean canCommunityAdminManageAccounts() { @@ -625,7 +676,7 @@ public class AuthorizeUtil { /** * This method checks if the Collection Admin can manage accounts - * + * * @return true if is able */ public static boolean canCollectionAdminManageAccounts() { diff --git a/dspace-api/src/main/java/org/dspace/app/util/DCInput.java b/dspace-api/src/main/java/org/dspace/app/util/DCInput.java index a6444a3890..c3cbac115a 100644 --- a/dspace-api/src/main/java/org/dspace/app/util/DCInput.java +++ b/dspace-api/src/main/java/org/dspace/app/util/DCInput.java @@ -12,6 +12,7 @@ import java.util.List; import java.util.Map; import java.util.regex.Pattern; import java.util.regex.PatternSyntaxException; +import javax.annotation.Nullable; import org.apache.commons.lang3.StringUtils; import org.dspace.content.MetadataSchemaEnum; @@ -291,7 +292,7 @@ public class DCInput { * * @return the input type */ - public String getInputType() { + public @Nullable String getInputType() { return inputType; } diff --git a/dspace-api/src/main/java/org/dspace/app/util/DCInputSet.java b/dspace-api/src/main/java/org/dspace/app/util/DCInputSet.java index 0f2333004c..bfd4270cf2 100644 --- a/dspace-api/src/main/java/org/dspace/app/util/DCInputSet.java +++ b/dspace-api/src/main/java/org/dspace/app/util/DCInputSet.java @@ -10,6 +10,7 @@ package org.dspace.app.util; import java.util.List; import java.util.Map; +import org.apache.commons.lang3.StringUtils; import org.dspace.core.Utils; /** * Class representing all DC inputs required for a submission, organized into pages @@ -109,7 +110,7 @@ public class DCInputSet { for (int j = 0; j < inputs[i].length; j++) { DCInput field = inputs[i][j]; // If this is a "qualdrop_value" field, then the full field name is the field + dropdown qualifier - if (field.getInputType().equals("qualdrop_value")) { + if (StringUtils.equals(field.getInputType(), "qualdrop_value")) { List pairs = field.getPairs(); for (int k = 0; k < pairs.size(); k += 2) { String qualifier = pairs.get(k + 1); diff --git a/dspace-api/src/main/java/org/dspace/app/util/IndexVersion.java b/dspace-api/src/main/java/org/dspace/app/util/IndexVersion.java index d8b2d6868a..7bdaa95b5c 100644 --- 
a/dspace-api/src/main/java/org/dspace/app/util/IndexVersion.java +++ b/dspace-api/src/main/java/org/dspace/app/util/IndexVersion.java @@ -250,12 +250,8 @@ public class IndexVersion { } else if (firstMinor > secondMinor) { // If we get here, major versions must be EQUAL. Now, time to check our minor versions return GREATER_THAN; - } else if (firstMinor < secondMinor) { - return LESS_THAN; } else { - // This is an impossible scenario. - // This 'else' should never be triggered since we've checked for equality above already - return EQUAL; + return LESS_THAN; } } diff --git a/dspace-api/src/main/java/org/dspace/authenticate/IPMatcher.java b/dspace-api/src/main/java/org/dspace/authenticate/IPMatcher.java index 955b6c86d3..439e53af1d 100644 --- a/dspace-api/src/main/java/org/dspace/authenticate/IPMatcher.java +++ b/dspace-api/src/main/java/org/dspace/authenticate/IPMatcher.java @@ -87,13 +87,16 @@ public class IPMatcher { + ipSpec); } - int maskBytes = maskBits / 8; - for (int i = 0; i < maskBytes; i++) { - netmask[i] = (byte) 0Xff; - } - netmask[maskBytes] = (byte) ((byte) 0Xff << 8 - (maskBits % 8)); // FIXME test! - for (int i = maskBytes + 1; i < (128 / 8); i++) { - netmask[i] = 0; + for (int i = 0; i < netmask.length; i++) { + if (maskBits <= 0) { + netmask[i] = 0; + } else if (maskBits > 8) { + netmask[i] = (byte) 0Xff; + } else { + netmask[i] = (byte) ((byte) 0Xff << 8 - maskBits); + } + + maskBits = maskBits - 8; } break; case 1: // No explicit mask: fill the mask with 1s diff --git a/dspace-api/src/main/java/org/dspace/authorize/AuthorizeServiceImpl.java b/dspace-api/src/main/java/org/dspace/authorize/AuthorizeServiceImpl.java index 07a7ac70d3..eb7d60d84c 100644 --- a/dspace-api/src/main/java/org/dspace/authorize/AuthorizeServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/authorize/AuthorizeServiceImpl.java @@ -614,6 +614,12 @@ public class AuthorizeServiceImpl implements AuthorizeService { resourcePolicyService.removeDsoEPersonPolicies(c, o, e); } + @Override + public void removeAllEPersonPolicies(Context c, EPerson e) + throws SQLException, AuthorizeException { + resourcePolicyService.removeAllEPersonPolicies(c, e); + } + @Override public List getAuthorizedGroups(Context c, DSpaceObject o, int actionID) throws java.sql.SQLException { diff --git a/dspace-api/src/main/java/org/dspace/authorize/ResourcePolicyServiceImpl.java b/dspace-api/src/main/java/org/dspace/authorize/ResourcePolicyServiceImpl.java index 74b3c0633f..4a2addf781 100644 --- a/dspace-api/src/main/java/org/dspace/authorize/ResourcePolicyServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/authorize/ResourcePolicyServiceImpl.java @@ -114,6 +114,11 @@ public class ResourcePolicyServiceImpl implements ResourcePolicyService { return resourcePolicyDAO.findByEPersonGroupTypeIdAction(c, e, groups, action, type_id); } + @Override + public List find(Context context, EPerson ePerson) throws SQLException { + return resourcePolicyDAO.findByEPerson(context, ePerson); + } + @Override public List findByTypeGroupActionExceptId(Context context, DSpaceObject dso, Group group, int action, int notPolicyID) @@ -246,6 +251,11 @@ public class ResourcePolicyServiceImpl implements ResourcePolicyService { } + @Override + public void removeAllEPersonPolicies(Context context, EPerson ePerson) throws SQLException, AuthorizeException { + resourcePolicyDAO.deleteByEPerson(context, ePerson); + } + @Override public void removeGroupPolicies(Context c, Group group) throws SQLException { resourcePolicyDAO.deleteByGroup(c, group); diff --git 
a/dspace-api/src/main/java/org/dspace/authorize/dao/ResourcePolicyDAO.java b/dspace-api/src/main/java/org/dspace/authorize/dao/ResourcePolicyDAO.java index fa3b38efc8..5c898a5bca 100644 --- a/dspace-api/src/main/java/org/dspace/authorize/dao/ResourcePolicyDAO.java +++ b/dspace-api/src/main/java/org/dspace/authorize/dao/ResourcePolicyDAO.java @@ -33,6 +33,8 @@ public interface ResourcePolicyDAO extends GenericDAO { public List findByDsoAndType(Context context, DSpaceObject dSpaceObject, String type) throws SQLException; + public List findByEPerson(Context context, EPerson ePerson) throws SQLException; + public List findByGroup(Context context, Group group) throws SQLException; public List findByDSoAndAction(Context context, DSpaceObject dso, int actionId) throws SQLException; @@ -66,6 +68,15 @@ public interface ResourcePolicyDAO extends GenericDAO { public void deleteByDsoEPersonPolicies(Context context, DSpaceObject dso, EPerson ePerson) throws SQLException; + /** + * Deletes all policies that belong to an EPerson + * + * @param context DSpace context object + * @param ePerson ePerson whose policies to delete + * @throws SQLException if database error + */ + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException; + public void deleteByDsoAndTypeNotEqualsTo(Context c, DSpaceObject o, String type) throws SQLException; /** @@ -101,7 +112,7 @@ public interface ResourcePolicyDAO extends GenericDAO { * @return total resource policies of the ePerson * @throws SQLException if database error */ - public int countByEPerson(Context context, EPerson eperson) throws SQLException; + public int countByEPerson(Context context, EPerson ePerson) throws SQLException; /** * Return a paginated list of policies related to a resourceUuid belong to an ePerson diff --git a/dspace-api/src/main/java/org/dspace/authorize/dao/impl/ResourcePolicyDAOImpl.java b/dspace-api/src/main/java/org/dspace/authorize/dao/impl/ResourcePolicyDAOImpl.java index 6aa5d2bb2e..9dd368d667 100644 --- a/dspace-api/src/main/java/org/dspace/authorize/dao/impl/ResourcePolicyDAOImpl.java +++ b/dspace-api/src/main/java/org/dspace/authorize/dao/impl/ResourcePolicyDAOImpl.java @@ -63,6 +63,16 @@ public class ResourcePolicyDAOImpl extends AbstractHibernateDAO return list(context, criteriaQuery, false, ResourcePolicy.class, -1, -1); } + @Override + public List findByEPerson(Context context, EPerson ePerson) throws SQLException { + CriteriaBuilder criteriaBuilder = getCriteriaBuilder(context); + CriteriaQuery criteriaQuery = getCriteriaQuery(criteriaBuilder, ResourcePolicy.class); + Root resourcePolicyRoot = criteriaQuery.from(ResourcePolicy.class); + criteriaQuery.select(resourcePolicyRoot); + criteriaQuery.where(criteriaBuilder.equal(resourcePolicyRoot.get(ResourcePolicy_.eperson), ePerson)); + return list(context, criteriaQuery, false, ResourcePolicy.class, -1, -1); + } + @Override public List findByGroup(Context context, Group group) throws SQLException { CriteriaBuilder criteriaBuilder = getCriteriaBuilder(context); @@ -194,6 +204,15 @@ public class ResourcePolicyDAOImpl extends AbstractHibernateDAO } + @Override + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException { + String queryString = "delete from ResourcePolicy where eperson= :eperson"; + Query query = createQuery(context, queryString); + query.setParameter("eperson", ePerson); + query.executeUpdate(); + + } + @Override public void deleteByDsoAndTypeNotEqualsTo(Context context, DSpaceObject dso, String type) throws SQLException { @@ 
-247,10 +266,10 @@ public class ResourcePolicyDAOImpl extends AbstractHibernateDAO } @Override - public int countByEPerson(Context context, EPerson eperson) throws SQLException { + public int countByEPerson(Context context, EPerson ePerson) throws SQLException { Query query = createQuery(context, "SELECT count(*) FROM " + ResourcePolicy.class.getSimpleName() + " WHERE eperson_id = (:epersonUuid) "); - query.setParameter("epersonUuid", eperson.getID()); + query.setParameter("epersonUuid", ePerson.getID()); return count(query); } diff --git a/dspace-api/src/main/java/org/dspace/authorize/service/AuthorizeService.java b/dspace-api/src/main/java/org/dspace/authorize/service/AuthorizeService.java index f3ede72ac1..94a1c0297e 100644 --- a/dspace-api/src/main/java/org/dspace/authorize/service/AuthorizeService.java +++ b/dspace-api/src/main/java/org/dspace/authorize/service/AuthorizeService.java @@ -449,6 +449,16 @@ public interface AuthorizeService { */ public void removeEPersonPolicies(Context c, DSpaceObject o, EPerson e) throws SQLException, AuthorizeException; + /** + * Removes all policies from an eperson that belong to an EPerson. + * + * @param c current context + * @param e the eperson + * @throws SQLException if there's a database problem + * @throws AuthorizeException if authorization error + */ + public void removeAllEPersonPolicies(Context c, EPerson e) throws SQLException, AuthorizeException; + /** * Returns all groups authorized to perform an action on an object. Returns * empty array if no matches. diff --git a/dspace-api/src/main/java/org/dspace/authorize/service/ResourcePolicyService.java b/dspace-api/src/main/java/org/dspace/authorize/service/ResourcePolicyService.java index 48ec510c86..f1d8b30242 100644 --- a/dspace-api/src/main/java/org/dspace/authorize/service/ResourcePolicyService.java +++ b/dspace-api/src/main/java/org/dspace/authorize/service/ResourcePolicyService.java @@ -39,6 +39,16 @@ public interface ResourcePolicyService extends DSpaceCRUDService public List find(Context context, Group group) throws SQLException; + /** + * Retrieve a list of ResourcePolicies by EPerson + * + * @param c context + * @param ePerson the EPerson for which to look up the resource policies + * @return a list of ResourcePolicies for the provided EPerson + * @throws SQLException if there's a database problem + */ + public List find(Context c, EPerson ePerson) throws SQLException; + public List find(Context c, EPerson e, List groups, int action, int type_id) throws SQLException; @@ -72,6 +82,16 @@ public interface ResourcePolicyService extends DSpaceCRUDService public void removeDsoEPersonPolicies(Context context, DSpaceObject dso, EPerson ePerson) throws SQLException, AuthorizeException; + /** + * Removes all ResourcePolicies related to an EPerson + * + * @param context context + * @param ePerson the EPerson for which the ResourcePolicies will be deleted + * @throws SQLException if there's a database problem + * @throws AuthorizeException when the current user is not authorized + */ + public void removeAllEPersonPolicies(Context context, EPerson ePerson) throws SQLException, AuthorizeException; + public void removeGroupPolicies(Context c, Group group) throws SQLException; public void removeDsoAndTypeNotEqualsToPolicies(Context c, DSpaceObject o, String type) diff --git a/dspace-api/src/main/java/org/dspace/content/BitstreamFormatServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/BitstreamFormatServiceImpl.java index 21d1fa4ba4..89bf74ece6 100644 --- 
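A hedged sketch of how a caller might combine the new find(Context, EPerson) and removeAllEPersonPolicies() service methods added above, for instance when cleaning up an account before it is deleted. The wrapper class and the way the services are passed in are assumptions; only the two service calls come from this PR:

import java.sql.SQLException;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.authorize.ResourcePolicy;
import org.dspace.authorize.service.AuthorizeService;
import org.dspace.authorize.service.ResourcePolicyService;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;

public class EPersonPolicyCleanupSketch {
    public void cleanUp(Context context, EPerson ePerson, AuthorizeService authorizeService,
                        ResourcePolicyService resourcePolicyService) throws SQLException, AuthorizeException {
        // Inspect whatever policies are still attached to the account...
        List<ResourcePolicy> policies = resourcePolicyService.find(context, ePerson);
        // ...and drop them all in a single call instead of deleting them one DSO at a time
        if (!policies.isEmpty()) {
            authorizeService.removeAllEPersonPolicies(context, ePerson);
        }
    }
}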
a/dspace-api/src/main/java/org/dspace/content/BitstreamFormatServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/BitstreamFormatServiceImpl.java @@ -153,7 +153,7 @@ public class BitstreamFormatServiceImpl implements BitstreamFormatService { // If the exception was thrown, unknown will == null so goahead and // load s. If not, check that the unknown's registry's name is not // being reset. - if (unknown == null || unknown.getID() != bitstreamFormat.getID()) { + if (unknown == null || !unknown.getID().equals(bitstreamFormat.getID())) { bitstreamFormat.setShortDescriptionInternal(shortDescription); } } @@ -208,7 +208,7 @@ public class BitstreamFormatServiceImpl implements BitstreamFormatService { // Find "unknown" type BitstreamFormat unknown = findUnknown(context); - if (unknown.getID() == bitstreamFormat.getID()) { + if (unknown.getID().equals(bitstreamFormat.getID())) { throw new IllegalArgumentException("The Unknown bitstream format may not be deleted."); } @@ -270,4 +270,4 @@ public class BitstreamFormatServiceImpl implements BitstreamFormatService { } return null; } -} \ No newline at end of file +} diff --git a/dspace-api/src/main/java/org/dspace/content/CollectionServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/CollectionServiceImpl.java index 34bf4f5fc1..559b95edb8 100644 --- a/dspace-api/src/main/java/org/dspace/content/CollectionServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/CollectionServiceImpl.java @@ -57,7 +57,6 @@ import org.dspace.harvest.HarvestedCollection; import org.dspace.harvest.service.HarvestedCollectionService; import org.dspace.workflow.factory.WorkflowServiceFactory; import org.dspace.xmlworkflow.WorkflowConfigurationException; -import org.dspace.xmlworkflow.XmlWorkflowFactoryImpl; import org.dspace.xmlworkflow.factory.XmlWorkflowFactory; import org.dspace.xmlworkflow.state.Workflow; import org.dspace.xmlworkflow.storedcomponents.CollectionRole; @@ -387,7 +386,7 @@ public class CollectionServiceImpl extends DSpaceObjectServiceImpl i log.error(LogManager.getHeader(context, "setWorkflowGroup", "collection_id=" + collection.getID() + " " + e.getMessage()), e); } - if (!StringUtils.equals(XmlWorkflowFactoryImpl.LEGACY_WORKFLOW_NAME, workflow.getID())) { + if (!StringUtils.equals(workflowFactory.getDefaultWorkflow().getID(), workflow.getID())) { throw new IllegalArgumentException( "setWorkflowGroup can be used only on collection with the default basic dspace workflow. 
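The BitstreamFormatServiceImpl changes above (and the similar MetadataField/MetadataSchema/MetadataValue/WorkspaceItem changes later in this diff) replace ==/!= on object IDs with equals(), because the IDs are boxed Integers. A standalone illustration of the pitfall being fixed (example values only):

public class BoxedIdCompareSketch {
    public static void main(String[] args) {
        Integer smallA = 42, smallB = 42;       // inside the default Integer cache (-128..127)
        Integer bigA = 4242, bigB = 4242;       // outside the cache: usually distinct boxed objects
        System.out.println(smallA == smallB);   // true, which makes == look safe in small tests
        System.out.println(bigA == bigB);       // typically false: reference comparison
        System.out.println(bigA.equals(bigB));  // true: value comparison, always correct
    }
}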
" + "Instead, the collection: " diff --git a/dspace-api/src/main/java/org/dspace/content/CommunityServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/CommunityServiceImpl.java index c49442267a..2ad0c8c2bc 100644 --- a/dspace-api/src/main/java/org/dspace/content/CommunityServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/CommunityServiceImpl.java @@ -629,6 +629,10 @@ public class CommunityServiceImpl extends DSpaceObjectServiceImpl imp case Constants.DELETE: if (AuthorizeConfiguration.canCommunityAdminPerformSubelementDeletion()) { adminObject = getParentObject(context, community); + if (adminObject == null) { + //top-level community, has to be admin of the current community + adminObject = community; + } } break; case Constants.ADD: diff --git a/dspace-api/src/main/java/org/dspace/content/DSpaceObjectServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/DSpaceObjectServiceImpl.java index 6886d41e1b..d33ad7e416 100644 --- a/dspace-api/src/main/java/org/dspace/content/DSpaceObjectServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/DSpaceObjectServiceImpl.java @@ -207,8 +207,8 @@ public abstract class DSpaceObjectServiceImpl implements } @Override - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - List values) throws SQLException { + public List addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, List values) throws SQLException { MetadataField metadataField = metadataFieldService.findByElement(context, schema, element, qualifier); if (metadataField == null) { throw new SQLException( @@ -216,12 +216,12 @@ public abstract class DSpaceObjectServiceImpl implements "exist!"); } - addMetadata(context, dso, metadataField, lang, values); + return addMetadata(context, dso, metadataField, lang, values); } @Override - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - List values, List authorities, List confidences) + public List addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, List values, List authorities, List confidences) throws SQLException { // We will not verify that they are valid entries in the registry // until update() is called. @@ -231,15 +231,16 @@ public abstract class DSpaceObjectServiceImpl implements "bad_dublin_core schema=" + schema + "." + element + "." + qualifier + ". Metadata field does not " + "exist!"); } - addMetadata(context, dso, metadataField, lang, values, authorities, confidences); + return addMetadata(context, dso, metadataField, lang, values, authorities, confidences); } @Override - public void addMetadata(Context context, T dso, MetadataField metadataField, String lang, List values, - List authorities, List confidences) + public List addMetadata(Context context, T dso, MetadataField metadataField, String lang, + List values, List authorities, List confidences) throws SQLException { boolean authorityControlled = metadataAuthorityService.isAuthorityControlled(metadataField); boolean authorityRequired = metadataAuthorityService.isAuthorityRequired(metadataField); + List newMetadata = new ArrayList<>(values.size()); // We will not verify that they are valid entries in the registry // until update() is called. 
for (int i = 0; i < values.size(); i++) { @@ -250,6 +251,7 @@ public abstract class DSpaceObjectServiceImpl implements } } MetadataValue metadataValue = metadataValueService.create(context, dso, metadataField); + newMetadata.add(metadataValue); //Set place to list length of all metadatavalues for the given schema.element.qualifier combination. // Subtract one to adhere to the 0 as first element rule metadataValue.setPlace( @@ -304,29 +306,31 @@ public abstract class DSpaceObjectServiceImpl implements // metadataValueService.update(context, metadataValue); dso.addDetails(metadataField.toString()); } + return newMetadata; } @Override - public void addMetadata(Context context, T dso, MetadataField metadataField, String language, String value, - String authority, int confidence) throws SQLException { - addMetadata(context, dso, metadataField, language, Arrays.asList(value), Arrays.asList(authority), - Arrays.asList(confidence)); + public MetadataValue addMetadata(Context context, T dso, MetadataField metadataField, String language, + String value, String authority, int confidence) throws SQLException { + return addMetadata(context, dso, metadataField, language, Arrays.asList(value), Arrays.asList(authority), + Arrays.asList(confidence)).get(0); } @Override - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - String value) throws SQLException { - addMetadata(context, dso, schema, element, qualifier, lang, Arrays.asList(value)); + public MetadataValue addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, String value) throws SQLException { + return addMetadata(context, dso, schema, element, qualifier, lang, Arrays.asList(value)).get(0); } @Override - public void addMetadata(Context context, T dso, MetadataField metadataField, String language, String value) + public MetadataValue addMetadata(Context context, T dso, MetadataField metadataField, String language, String value) throws SQLException { - addMetadata(context, dso, metadataField, language, Arrays.asList(value)); + return addMetadata(context, dso, metadataField, language, Arrays.asList(value)).get(0); } @Override - public void addMetadata(Context context, T dso, MetadataField metadataField, String language, List values) + public List addMetadata(Context context, T dso, MetadataField metadataField, String language, + List values) throws SQLException { if (metadataField != null) { String fieldKey = metadataAuthorityService @@ -343,18 +347,19 @@ public abstract class DSpaceObjectServiceImpl implements getAuthoritiesAndConfidences(fieldKey, null, values, authorities, confidences, i); } } - addMetadata(context, dso, metadataField, language, values, authorities, confidences); + return addMetadata(context, dso, metadataField, language, values, authorities, confidences); } else { - addMetadata(context, dso, metadataField, language, values, null, null); + return addMetadata(context, dso, metadataField, language, values, null, null); } } + return new ArrayList<>(0); } @Override - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - String value, String authority, int confidence) throws SQLException { - addMetadata(context, dso, schema, element, qualifier, lang, Arrays.asList(value), Arrays.asList(authority), - Arrays.asList(confidence)); + public MetadataValue addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, String value, String 
authority, int confidence) throws SQLException { + return addMetadata(context, dso, schema, element, qualifier, lang, Arrays.asList(value), + Arrays.asList(authority), Arrays.asList(confidence)).get(0); } @Override @@ -660,33 +665,35 @@ public abstract class DSpaceObjectServiceImpl implements @Override public void addAndShiftRightMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, String value, String authority, int confidence, int index) - throws SQLException { + throws SQLException { List list = getMetadata(dso, schema, element, qualifier); - clearMetadata(context, dso, schema, element, qualifier, Item.ANY); - int idx = 0; + int place = 0; boolean last = true; for (MetadataValue rr : list) { if (idx == index) { - addMetadata(context, dso, schema, element, qualifier, - lang, value, authority, confidence); + MetadataValue newMetadata = addMetadata(context, dso, schema, element, qualifier, + lang, value, authority, confidence); + + moveSingleMetadataValue(context, dso, place, newMetadata); + place++; last = false; } - addMetadata(context, dso, schema, element, qualifier, - rr.getLanguage(), rr.getValue(), rr.getAuthority(), rr.getConfidence()); + moveSingleMetadataValue(context, dso, place, rr); + place++; idx++; } if (last) { addMetadata(context, dso, schema, element, qualifier, - lang, value, authority, confidence); + lang, value, authority, confidence); } } @Override public void moveMetadata(Context context, T dso, String schema, String element, String qualifier, int from, int to) - throws SQLException, IllegalArgumentException { + throws SQLException, IllegalArgumentException { if (from == to) { throw new IllegalArgumentException("The \"from\" location MUST be different from \"to\" location"); @@ -701,8 +708,6 @@ public abstract class DSpaceObjectServiceImpl implements "\n Idx from:" + from + " Idx to: " + to); } - clearMetadata(context, dso, schema, element, qualifier, Item.ANY); - int idx = 0; MetadataValue moved = null; for (MetadataValue md : list) { @@ -714,49 +719,46 @@ public abstract class DSpaceObjectServiceImpl implements } idx = 0; + int place = 0; boolean last = true; for (MetadataValue rr : list) { if (idx == to && to < from) { - addMetadata(context, dso, schema, element, qualifier, moved.getLanguage(), moved.getValue(), - moved.getAuthority(), moved.getConfidence()); + moveSingleMetadataValue(context, dso, place, moved); + place++; last = false; } if (idx != from) { - addMetadata(context, dso, schema, element, qualifier, rr.getLanguage(), rr.getValue(), - rr.getAuthority(), rr.getConfidence()); + moveSingleMetadataValue(context, dso, place, rr); + place++; } if (idx == to && to > from) { - addMetadata(context, dso, schema, element, qualifier, moved.getLanguage(), moved.getValue(), - moved.getAuthority(), moved.getConfidence()); + moveSingleMetadataValue(context, dso, place, moved); + place++; last = false; } idx++; } if (last) { - addMetadata(context, dso, schema, element, qualifier, moved.getLanguage(), moved.getValue(), - moved.getAuthority(), moved.getConfidence()); + moveSingleMetadataValue(context, dso, place, moved); } } + /** + * Supports moving metadata by updating the place of the metadata value + */ + protected void moveSingleMetadataValue(Context context, T dso, int place, MetadataValue rr) { + //just move the metadata + rr.setPlace(place); + } + @Override public void replaceMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, String value, String authority, int confidence, int 
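The addMetadata() overloads above now return the MetadataValue object(s) they create instead of void, so callers no longer need to re-query the metadata list to touch the value they just added. A small hedged sketch of a caller using the new return value (the field and value are example data, not from this PR):

import java.sql.SQLException;

import org.dspace.content.Item;
import org.dspace.content.MetadataValue;
import org.dspace.content.service.ItemService;
import org.dspace.core.Context;

public class AddMetadataReturnValueSketch {
    public void addTitle(Context context, ItemService itemService, Item item) throws SQLException {
        // The freshly created value is handed straight back to the caller...
        MetadataValue title = itemService.addMetadata(context, item, "dc", "title", null, null, "Example title");
        // ...so it can be adjusted directly, e.g. forced to the first place, before update()
        title.setPlace(0);
    }
}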
index) throws SQLException { List list = getMetadata(dso, schema, element, qualifier); - clearMetadata(context, dso, schema, element, qualifier, Item.ANY); - - int idx = 0; - for (MetadataValue rr : list) { - if (idx == index) { - addMetadata(context, dso, schema, element, qualifier, - lang, value, authority, confidence); - } else { - addMetadata(context, dso, schema, element, qualifier, - rr.getLanguage(), rr.getValue(), rr.getAuthority(), rr.getConfidence()); - } - idx++; - } + removeMetadataValues(context, dso, Arrays.asList(list.get(index))); + addAndShiftRightMetadata(context, dso, schema, element, qualifier, lang, value, authority, confidence, index); } @Override diff --git a/dspace-api/src/main/java/org/dspace/content/EntityType.java b/dspace-api/src/main/java/org/dspace/content/EntityType.java index 15fe1739e5..d44ec5a35d 100644 --- a/dspace-api/src/main/java/org/dspace/content/EntityType.java +++ b/dspace-api/src/main/java/org/dspace/content/EntityType.java @@ -7,6 +7,7 @@ */ package org.dspace.content; +import java.util.Objects; import javax.persistence.Column; import javax.persistence.Entity; import javax.persistence.GeneratedValue; @@ -15,6 +16,8 @@ import javax.persistence.Id; import javax.persistence.SequenceGenerator; import javax.persistence.Table; +import org.apache.commons.lang3.StringUtils; +import org.apache.commons.lang3.builder.HashCodeBuilder; import org.dspace.core.ReloadableEntity; /** @@ -45,7 +48,8 @@ public class EntityType implements ReloadableEntity { /** * The standard setter for the ID of this EntityType - * @param id The ID that this EntityType's ID will be set to + * + * @param id The ID that this EntityType's ID will be set to */ public void setId(Integer id) { this.id = id; @@ -53,7 +57,8 @@ public class EntityType implements ReloadableEntity { /** * The standard getter for the label of this EntityType - * @return The label for this EntityType + * + * @return The label for this EntityType */ public String getLabel() { return label; @@ -61,6 +66,7 @@ public class EntityType implements ReloadableEntity { /** * The standard setter for the label of this EntityType + * * @param label The label that this EntityType's label will be set to */ public void setLabel(String label) { @@ -69,9 +75,40 @@ public class EntityType implements ReloadableEntity { /** * The standard getter for the ID of this EntityType - * @return The ID for this EntityType + * + * @return The ID for this EntityType */ public Integer getID() { return id; } + + /** + * Determines whether two entity types are equal based on the id and the label + * @param obj object to be compared + * @return + */ + public boolean equals(Object obj) { + if (!(obj instanceof EntityType)) { + return false; + } + EntityType entityType = (EntityType) obj; + + if (!Objects.equals(this.getID(), entityType.getID())) { + return false; + } + + if (!StringUtils.equals(this.getLabel(), entityType.getLabel())) { + return false; + } + return true; + } + + /** + * Returns a hash code value for the object. 
+ * @return hash code value + */ + @Override + public int hashCode() { + return new HashCodeBuilder().append(getID()).toHashCode(); + } } diff --git a/dspace-api/src/main/java/org/dspace/content/ItemServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/ItemServiceImpl.java index c765f663f8..f7fae2f444 100644 --- a/dspace-api/src/main/java/org/dspace/content/ItemServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/ItemServiceImpl.java @@ -230,6 +230,12 @@ public class ItemServiceImpl extends DSpaceObjectServiceImpl implements It return itemDAO.findBySubmitter(context, eperson); } + @Override + public Iterator findBySubmitter(Context context, EPerson eperson, boolean retrieveAllItems) + throws SQLException { + return itemDAO.findBySubmitter(context, eperson, retrieveAllItems); + } + @Override public Iterator findBySubmitterDateSorted(Context context, EPerson eperson, Integer limit) throws SQLException { @@ -1100,19 +1106,7 @@ prevent the generation of resource policy entry values with null dspace_object a } break; case Constants.DELETE: - if (item.getOwningCollection() != null) { - if (AuthorizeConfiguration.canCollectionAdminPerformItemDeletion()) { - adminObject = collection; - } else if (AuthorizeConfiguration.canCommunityAdminPerformItemDeletion()) { - adminObject = community; - } - } else { - if (AuthorizeConfiguration.canCollectionAdminManageTemplateItem()) { - adminObject = collection; - } else if (AuthorizeConfiguration.canCommunityAdminManageCollectionTemplateItem()) { - adminObject = community; - } - } + adminObject = item; break; case Constants.WRITE: // if it is a template item we need to check the @@ -1372,6 +1366,32 @@ prevent the generation of resource policy entry values with null dspace_object a } + /** + * Supports moving metadata by adding the metadata value or updating the place of the relationship + */ + @Override + protected void moveSingleMetadataValue(Context context, Item dso, int place, MetadataValue rr) { + if (rr instanceof RelationshipMetadataValue) { + try { + //Retrieve the applicable relationship + Relationship rs = relationshipService.find(context, + ((RelationshipMetadataValue) rr).getRelationshipId()); + if (rs.getLeftItem() == dso) { + rs.setLeftPlace(place); + } else { + rs.setRightPlace(place); + } + relationshipService.update(context, rs); + } catch (Exception e) { + //should not occur, otherwise metadata can't be updated either + log.error("An error occurred while moving " + rr.getAuthority() + " for item " + dso.getID(), e); + } + } else { + //just move the metadata + rr.setPlace(place); + } + } + /** * This method will sort the List of MetadataValue objects based on the MetadataSchema, MetadataField Element, * MetadataField Qualifier and MetadataField Place in that order. 
diff --git a/dspace-api/src/main/java/org/dspace/content/MetadataDSpaceCsvExportServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/MetadataDSpaceCsvExportServiceImpl.java index 85cfcba2e0..1750938937 100644 --- a/dspace-api/src/main/java/org/dspace/content/MetadataDSpaceCsvExportServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/MetadataDSpaceCsvExportServiceImpl.java @@ -9,6 +9,7 @@ package org.dspace.content; import java.sql.SQLException; import java.util.ArrayList; +import java.util.Collections; import java.util.Iterator; import java.util.List; @@ -101,7 +102,7 @@ public class MetadataDSpaceCsvExportServiceImpl implements MetadataDSpaceCsvExpo throws SQLException { // Add all the collections List collections = community.getCollections(); - Iterator result = null; + Iterator result = Collections.emptyIterator(); for (Collection collection : collections) { Iterator items = itemService.findByCollection(context, collection); result = addItemsToResult(result, items); diff --git a/dspace-api/src/main/java/org/dspace/content/MetadataField.java b/dspace-api/src/main/java/org/dspace/content/MetadataField.java index 3f574dab0e..0ea176c751 100644 --- a/dspace-api/src/main/java/org/dspace/content/MetadataField.java +++ b/dspace-api/src/main/java/org/dspace/content/MetadataField.java @@ -168,11 +168,11 @@ public class MetadataField implements ReloadableEntity { return false; } Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(obj); - if (getClass() != objClass) { + if (!getClass().equals(objClass)) { return false; } final MetadataField other = (MetadataField) obj; - if (this.getID() != other.getID()) { + if (!this.getID().equals(other.getID())) { return false; } if (!getMetadataSchema().equals(other.getMetadataSchema())) { diff --git a/dspace-api/src/main/java/org/dspace/content/MetadataFieldServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/MetadataFieldServiceImpl.java index c71db2d131..569b5840c6 100644 --- a/dspace-api/src/main/java/org/dspace/content/MetadataFieldServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/MetadataFieldServiceImpl.java @@ -9,6 +9,8 @@ package org.dspace.content; import java.io.IOException; import java.sql.SQLException; +import java.util.ArrayList; +import java.util.Arrays; import java.util.List; import org.apache.commons.collections4.CollectionUtils; @@ -20,8 +22,12 @@ import org.dspace.content.dao.MetadataFieldDAO; import org.dspace.content.service.MetadataFieldService; import org.dspace.content.service.MetadataSchemaService; import org.dspace.content.service.MetadataValueService; +import org.dspace.content.service.SiteService; +import org.dspace.core.Constants; import org.dspace.core.Context; import org.dspace.core.LogManager; +import org.dspace.discovery.indexobject.IndexableMetadataField; +import org.dspace.event.Event; import org.springframework.beans.factory.annotation.Autowired; /** @@ -46,6 +52,8 @@ public class MetadataFieldServiceImpl implements MetadataFieldService { protected MetadataValueService metadataValueService; @Autowired(required = true) protected MetadataSchemaService metadataSchemaService; + @Autowired + protected SiteService siteService; protected MetadataFieldServiceImpl() { @@ -77,6 +85,8 @@ public class MetadataFieldServiceImpl implements MetadataFieldService { log.info(LogManager.getHeader(context, "create_metadata_field", "metadata_field_id=" + metadataField.getID())); + // Update the index of type metadatafield + this.triggerEventToUpdateIndex(context, 
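The CSV export change above initialises the result with Collections.emptyIterator() instead of null, so a community that has no collections produces an empty iteration rather than a NullPointerException further down. A standalone illustration:

import java.util.Collections;
import java.util.Iterator;

public class EmptyIteratorSketch {
    public static void main(String[] args) {
        Iterator<String> result = Collections.emptyIterator(); // previously: Iterator result = null;
        System.out.println(result.hasNext());                  // prints: false, safe to iterate over
    }
}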
metadataField.getID()); return metadataField; } @@ -149,6 +159,8 @@ public class MetadataFieldServiceImpl implements MetadataFieldService { "metadata_field_id=" + metadataField.getID() + "element=" + metadataField .getElement() + "qualifier=" + metadataField.getQualifier())); + // Update the index of type metadatafield + this.triggerEventToUpdateIndex(context, metadataField.getID()); } @Override @@ -177,6 +189,21 @@ public class MetadataFieldServiceImpl implements MetadataFieldService { log.info(LogManager.getHeader(context, "delete_metadata_field", "metadata_field_id=" + metadataField.getID())); + // Update the index of type metadatafield + this.triggerEventToUpdateIndex(context, metadataField.getID()); + } + + /** + * Calls a MODIFY SITE event with the identifier of the changed mdField, so it can be indexed in + * {@link org.dspace.discovery.IndexEventConsumer}, with type of {@link org.dspace.discovery.IndexableObject} in + * {@link Event}.detail and the identifiers of the changed mdFields in {@link Event}.identifiers + * + * @param context DSpace context + * @param mdFieldId ID of the metadata field that needs to be (re)indexed + */ + private void triggerEventToUpdateIndex(Context context, int mdFieldId) { + context.addEvent(new Event(Event.MODIFY, Constants.SITE, null, IndexableMetadataField.TYPE, new ArrayList<>( + Arrays.asList(Integer.toString(mdFieldId))))); } /** diff --git a/dspace-api/src/main/java/org/dspace/content/MetadataSchema.java b/dspace-api/src/main/java/org/dspace/content/MetadataSchema.java index 96bef0fa2c..727181ee9d 100644 --- a/dspace-api/src/main/java/org/dspace/content/MetadataSchema.java +++ b/dspace-api/src/main/java/org/dspace/content/MetadataSchema.java @@ -67,11 +67,11 @@ public class MetadataSchema implements ReloadableEntity { return false; } Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(obj); - if (getClass() != objClass) { + if (!getClass().equals(objClass)) { return false; } final MetadataSchema other = (MetadataSchema) obj; - if (this.id != other.id) { + if (!this.id.equals(other.id)) { return false; } if ((this.namespace == null) ? 
(other.namespace != null) : !this.namespace.equals(other.namespace)) { diff --git a/dspace-api/src/main/java/org/dspace/content/MetadataValue.java b/dspace-api/src/main/java/org/dspace/content/MetadataValue.java index 4ce0c291f7..2d9808ae45 100644 --- a/dspace-api/src/main/java/org/dspace/content/MetadataValue.java +++ b/dspace-api/src/main/java/org/dspace/content/MetadataValue.java @@ -239,17 +239,17 @@ public class MetadataValue implements ReloadableEntity { return false; } Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(obj); - if (getClass() != objClass) { + if (!getClass().equals(objClass)) { return false; } final MetadataValue other = (MetadataValue) obj; - if (this.id != other.id) { + if (!this.id.equals(other.id)) { return false; } - if (this.getID() != other.getID()) { + if (!this.getID().equals(other.getID())) { return false; } - if (this.getDSpaceObject().getID() != other.getDSpaceObject().getID()) { + if (!this.getDSpaceObject().getID().equals(other.getDSpaceObject().getID())) { return false; } return true; diff --git a/dspace-api/src/main/java/org/dspace/content/RelationshipMetadataValue.java b/dspace-api/src/main/java/org/dspace/content/RelationshipMetadataValue.java index 88d2e38beb..637d1c094b 100644 --- a/dspace-api/src/main/java/org/dspace/content/RelationshipMetadataValue.java +++ b/dspace-api/src/main/java/org/dspace/content/RelationshipMetadataValue.java @@ -7,6 +7,8 @@ */ package org.dspace.content; +import org.dspace.core.Constants; + /** * This class is used as a representation of MetadataValues for the MetadataValues that are derived from the * Relationships that the item has. This includes the useForPlace property which we'll have to use to determine @@ -57,4 +59,13 @@ public class RelationshipMetadataValue extends MetadataValue { } return super.equals(obj); } + + /** + * Retrieves the Relationship ID from which the current RelationshipMetadataValue is derived + * + * @return the relationship ID + */ + public int getRelationshipId() { + return Integer.parseInt(getAuthority().substring(Constants.VIRTUAL_AUTHORITY_PREFIX.length())); + } } diff --git a/dspace-api/src/main/java/org/dspace/content/WorkspaceItem.java b/dspace-api/src/main/java/org/dspace/content/WorkspaceItem.java index f55dfaf2da..8049aa976c 100644 --- a/dspace-api/src/main/java/org/dspace/content/WorkspaceItem.java +++ b/dspace-api/src/main/java/org/dspace/content/WorkspaceItem.java @@ -156,11 +156,11 @@ public class WorkspaceItem return true; } Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(o); - if (getClass() != objClass) { + if (!getClass().equals(objClass)) { return false; } final WorkspaceItem that = (WorkspaceItem) o; - if (this.getID() != that.getID()) { + if (!this.getID().equals(that.getID())) { return false; } diff --git a/dspace-api/src/main/java/org/dspace/content/WorkspaceItemServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/WorkspaceItemServiceImpl.java index c45f6c737c..8fc302f8bf 100644 --- a/dspace-api/src/main/java/org/dspace/content/WorkspaceItemServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/WorkspaceItemServiceImpl.java @@ -212,9 +212,8 @@ public class WorkspaceItemServiceImpl implements WorkspaceItemService { */ Item item = workspaceItem.getItem(); if (!authorizeService.isAdmin(context) - && ((context.getCurrentUser() == null) || (context - .getCurrentUser().getID() != item.getSubmitter() - .getID()))) { + && (item.getSubmitter() == null || (context.getCurrentUser() == null) + || 
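getRelationshipId() above recovers the relationship ID by stripping the virtual-authority prefix from the stored authority string. A standalone sketch of that parsing, assuming the conventional "virtual::<id>" form (the real prefix is the Constants.VIRTUAL_AUTHORITY_PREFIX constant imported above):

public class VirtualAuthorityParseSketch {
    public static void main(String[] args) {
        String virtualAuthorityPrefix = "virtual::";      // assumed value of Constants.VIRTUAL_AUTHORITY_PREFIX
        String authority = virtualAuthorityPrefix + "42"; // authority of a relationship-derived metadata value
        int relationshipId = Integer.parseInt(authority.substring(virtualAuthorityPrefix.length()));
        System.out.println(relationshipId);               // prints: 42
    }
}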
(context.getCurrentUser().getID() != item.getSubmitter().getID()))) { // Not an admit, not the submitter throw new AuthorizeException("Must be an administrator or the " + "original submitter to delete a workspace item"); @@ -265,7 +264,12 @@ public class WorkspaceItemServiceImpl implements WorkspaceItemService { // Need to delete the workspaceitem row first since it refers // to item ID - workspaceItem.getSupervisorGroups().clear(); + try { + workspaceItem.getSupervisorGroups().clear(); + } catch (Exception e) { + log.error("failed to clear supervisor group", e); + } + workspaceItemDAO.delete(context, workspaceItem); } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/Choice.java b/dspace-api/src/main/java/org/dspace/content/authority/Choice.java index 9b68c75d28..6d73bdb5ea 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/Choice.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/Choice.java @@ -33,21 +33,62 @@ public class Choice { */ public String value = null; + /** + * A boolean representing if choice entry value can selected (usually true). + * Hierarchical authority can flag some choice as not selectable to force the + * use to choice a more detailed terms in the tree, such a leaf or a deeper + * branch + */ + public boolean selectable = true; + public Map extras = new HashMap(); public Choice() { } + /** + * Minimal constructor for this data object. It assumes an empty map of extras + * information and a selected choice + * + * @param authority the authority key + * @param value the text value to store in the metadata + * @param label the value to display to the user + */ public Choice(String authority, String value, String label) { this.authority = authority; this.value = value; this.label = label; } + /** + * Constructor to quickly setup the data object for basic authorities. The choice is assumed to be selectable. 
+ * + * @param authority the authority key + * @param value the text value to store in the metadata + * @param label the value to display to the user + * @param extras a key value map of extra information related to this choice + */ public Choice(String authority, String label, String value, Map extras) { this.authority = authority; this.label = label; this.value = value; this.extras = extras; } + + /** + * Constructor for common need of Hierarchical authorities that want to + * explicitely set the selectable flag + * + * @param authority the authority key + * @param value the text value to store in the metadata + * @param label the value to display to the user + * @param selectable true if the choice can be selected, false if the a more + * accurate choice should be preferred + */ + public Choice(String authority, String label, String value, boolean selectable) { + this.authority = authority; + this.label = label; + this.value = value; + this.selectable = selectable; + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthority.java index d2d06fe983..750e761f3d 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthority.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthority.java @@ -7,7 +7,10 @@ */ package org.dspace.content.authority; -import org.dspace.content.Collection; +import java.util.HashMap; +import java.util.Map; + +import org.dspace.core.NameAwarePlugin; /** * Plugin interface that supplies an authority control mechanism for @@ -17,7 +20,7 @@ import org.dspace.content.Collection; * @see ChoiceAuthorityServiceImpl * @see MetadataAuthorityServiceImpl */ -public interface ChoiceAuthority { +public interface ChoiceAuthority extends NameAwarePlugin { /** * Get all values from the authority that match the preferred value. * Note that the offering was entered by the user and may contain @@ -32,15 +35,13 @@ public interface ChoiceAuthority { * defaultSelected index in the Choices instance to the choice, if any, * that matches the value. * - * @param field being matched for * @param text user's value to match - * @param collection database ID of Collection for context (owner of Item) * @param start choice at which to start, 0 is first. * @param limit maximum number of choices to return, 0 for no limit. * @param locale explicit localization key if available, or null * @return a Choices object (never null). */ - public Choices getMatches(String field, String text, Collection collection, int start, int limit, String locale); + public Choices getMatches(String text, int start, int limit, String locale); /** * Get the single "best" match (if any) of a value in the authority @@ -51,13 +52,11 @@ public interface ChoiceAuthority { * This call is typically used in non-interactive metadata ingest * where there is no interactive agent to choose from among options. * - * @param field being matched for * @param text user's value to match - * @param collection database ID of Collection for context (owner of Item) * @param locale explicit localization key if available, or null * @return a Choices object (never null) with 1 or 0 values. */ - public Choices getBestMatch(String field, String text, Collection collection, String locale); + public Choices getBestMatch(String text, String locale); /** * Get the canonical user-visible "label" (i.e. 
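The new boolean constructor above lets a hierarchical authority mark intermediate tree nodes as not selectable, forcing the user to pick a more specific term. A hedged sketch of how such an authority might build its choices (the keys and labels are invented example values):

import org.dspace.content.authority.Choice;

public class NonSelectableChoiceSketch {
    public Choice[] buildChoices() {
        // A branch node the user must expand further: selectable = false
        Choice branch = new Choice("vocab:science", "Science", "Science", false);
        // A leaf term that can actually be stored; the existing constructors keep selectable = true
        Choice leaf = new Choice("vocab:physics", "Physics", "Physics");
        return new Choice[] {branch, leaf};
    }
}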
short descriptive text) @@ -67,31 +66,97 @@ public interface ChoiceAuthority { * This may get called many times while populating a Web page so it should * be implemented as efficiently as possible. * - * @param field being matched for * @param key authority key known to this authority. * @param locale explicit localization key if available, or null * @return descriptive label - should always return something, never null. */ - public String getLabel(String field, String key, String locale); + public String getLabel(String key, String locale); + /** + * Get the canonical value to store for a key in the authority. Can be localized + * given the implicit or explicit locale specification. + * + * @param key authority key known to this authority. + * @param locale explicit localization key if available, or null + * @return value to store - should always return something, never null. + */ + default String getValue(String key, String locale) { + return getLabel(key, locale); + } + + /** + * Get a map of additional information related to the specified key in the + * authority. + * + * @param key the key of the entry + * @param locale explicit localization key if available, or null + * @return a map of additional information related to the key + */ + default Map getExtra(String key, String locale) { + return new HashMap(); + } + + /** + * Return true for hierarchical authorities + * + * @return true if hierarchical, default false + */ default boolean isHierarchical() { return false; } + /** + * Scrollable authorities allows the scroll of the entries without applying + * filter/query to the + * {@link #getMatches(String, String, Collection, int, int, String)} + * + * @return true if scrollable, default false + */ default boolean isScrollable() { return false; } - default boolean hasIdentifier() { - return true; + /** + * Hierarchical authority can provide an hint for the UI about how many levels + * preload to improve the UX. It provides a valid default for hierarchical + * authorities + * + * @return 0 if hierarchical, null otherwise + */ + default Integer getPreloadLevel() { + return isHierarchical() ? 0 : null; } - default public Choice getChoice(String fieldKey, String authKey, String locale) { + /** + * Build the preferred choice associated with the authKey. The default + * implementation delegate the creato to the {@link #getLabel(String, String)} + * {@link #getValue(String, String)} and {@link #getExtra(String, String)} + * methods but can be directly overriden for better efficiency or special + * scenario + * + * @param authKey authority key known to this authority. + * @param locale explicit localization key if available, or null + * @return the preferred choice for this authKey and locale + */ + default public Choice getChoice(String authKey, String locale) { Choice result = new Choice(); result.authority = authKey; - result.label = getLabel(fieldKey, authKey, locale); - result.value = getLabel(fieldKey, authKey, locale); + result.label = getLabel(authKey, locale); + result.value = getValue(authKey, locale); + result.extras.putAll(getExtra(authKey, locale)); return result; } + /** + * Provide a recommendation to store the authority in the metadata value if + * available in the in the provided choice(s). 
Usually ChoiceAuthority should + * recommend that so the default is true and it only need to be implemented in + * the unusual scenario + * + * @return true if the authority provided in any choice of this + * authority should be stored in the metadata value + */ + default public boolean storeAuthorityInMetadata() { + return true; + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthorityServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthorityServiceImpl.java index 4cc3f9d6db..0e05852af0 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthorityServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/ChoiceAuthorityServiceImpl.java @@ -7,10 +7,13 @@ */ package org.dspace.content.authority; +import java.util.ArrayList; import java.util.HashMap; +import java.util.HashSet; import java.util.Iterator; import java.util.List; import java.util.Map; +import java.util.Map.Entry; import java.util.Set; import org.apache.commons.lang3.StringUtils; @@ -19,6 +22,9 @@ import org.dspace.app.util.DCInput; import org.dspace.app.util.DCInputSet; import org.dspace.app.util.DCInputsReader; import org.dspace.app.util.DCInputsReaderException; +import org.dspace.app.util.SubmissionConfig; +import org.dspace.app.util.SubmissionConfigReader; +import org.dspace.app.util.SubmissionConfigReaderException; import org.dspace.content.Collection; import org.dspace.content.MetadataValue; import org.dspace.content.authority.service.ChoiceAuthorityService; @@ -54,23 +60,37 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService // map of field key to authority plugin protected Map controller = new HashMap(); + // map of field key, form definition to authority plugin + protected Map> controllerFormDefinitions = + new HashMap>(); + // map of field key to presentation type protected Map presentation = new HashMap(); // map of field key to closed value protected Map closed = new HashMap(); - // map of authority name to field key - protected Map authorities = new HashMap(); + // flag to track the initialization status of the service + private boolean initialized = false; + + // map of authority name to field keys (the same authority can be configured over multiple metadata) + protected Map> authorities = new HashMap>(); + + // map of authority name to form definition and field keys + protected Map>> authoritiesFormDefinitions = + new HashMap>>(); + + // the item submission reader + private SubmissionConfigReader itemSubmissionConfigReader; @Autowired(required = true) protected ConfigurationService configurationService; @Autowired(required = true) protected PluginService pluginService; - private final String CHOICES_PLUGIN_PREFIX = "choices.plugin."; - private final String CHOICES_PRESENTATION_PREFIX = "choices.presentation."; - private final String CHOICES_CLOSED_PREFIX = "choices.closed."; + final static String CHOICES_PLUGIN_PREFIX = "choices.plugin."; + final static String CHOICES_PRESENTATION_PREFIX = "choices.presentation."; + final static String CHOICES_CLOSED_PREFIX = "choices.closed."; protected ChoiceAuthorityServiceImpl() { } @@ -96,10 +116,25 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService @Override public Set getChoiceAuthoritiesNames() { - if (authorities.keySet().isEmpty()) { + init(); + Set authoritiesNames = new HashSet(); + authoritiesNames.addAll(authorities.keySet()); + authoritiesNames.addAll(authoritiesFormDefinitions.keySet()); + return 
authoritiesNames; + } + + private synchronized void init() { + if (!initialized) { + try { + itemSubmissionConfigReader = new SubmissionConfigReader(); + } catch (SubmissionConfigReaderException e) { + // the system is in an illegal state as the submission definition is not valid + throw new IllegalStateException("Error reading the item submission configuration: " + e.getMessage(), + e); + } loadChoiceAuthorityConfigurations(); + initialized = true; } - return authorities.keySet(); } @Override @@ -112,59 +147,62 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService @Override public Choices getMatches(String fieldKey, String query, Collection collection, int start, int limit, String locale) { - ChoiceAuthority ma = getChoiceAuthorityMap().get(fieldKey); + ChoiceAuthority ma = getAuthorityByFieldKeyCollection(fieldKey, collection); if (ma == null) { throw new IllegalArgumentException( "No choices plugin was configured for field \"" + fieldKey - + "\"."); + + "\", collection=" + collection.getID().toString() + "."); } - return ma.getMatches(fieldKey, query, collection, start, limit, locale); + return ma.getMatches(query, start, limit, locale); } + @Override public Choices getMatches(String fieldKey, String query, Collection collection, int start, int limit, String locale, boolean externalInput) { - ChoiceAuthority ma = getChoiceAuthorityMap().get(fieldKey); + ChoiceAuthority ma = getAuthorityByFieldKeyCollection(fieldKey, collection); if (ma == null) { throw new IllegalArgumentException( "No choices plugin was configured for field \"" + fieldKey - + "\"."); + + "\", collection=" + collection.getID().toString() + "."); } if (externalInput && ma instanceof SolrAuthority) { ((SolrAuthority) ma).addExternalResultsInNextMatches(); } - return ma.getMatches(fieldKey, query, collection, start, limit, locale); + return ma.getMatches(query, start, limit, locale); } @Override public Choices getBestMatch(String fieldKey, String query, Collection collection, String locale) { - ChoiceAuthority ma = getChoiceAuthorityMap().get(fieldKey); + ChoiceAuthority ma = getAuthorityByFieldKeyCollection(fieldKey, collection); if (ma == null) { throw new IllegalArgumentException( "No choices plugin was configured for field \"" + fieldKey - + "\"."); + + "\", collection=" + collection.getID().toString() + "."); } - return ma.getBestMatch(fieldKey, query, collection, locale); + return ma.getBestMatch(query, locale); } @Override - public String getLabel(MetadataValue metadataValue, String locale) { - return getLabel(metadataValue.getMetadataField().toString(), metadataValue.getAuthority(), locale); + public String getLabel(MetadataValue metadataValue, Collection collection, String locale) { + return getLabel(metadataValue.getMetadataField().toString(), collection, metadataValue.getAuthority(), locale); } @Override - public String getLabel(String fieldKey, String authKey, String locale) { - ChoiceAuthority ma = getChoiceAuthorityMap().get(fieldKey); + public String getLabel(String fieldKey, Collection collection, String authKey, String locale) { + ChoiceAuthority ma = getAuthorityByFieldKeyCollection(fieldKey, collection); if (ma == null) { - throw new IllegalArgumentException("No choices plugin was configured for field \"" + fieldKey + "\"."); + throw new IllegalArgumentException( + "No choices plugin was configured for field \"" + fieldKey + + "\", collection=" + collection.getID().toString() + "."); } - return ma.getLabel(fieldKey, authKey, locale); + return ma.getLabel(authKey, locale); } 
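With the refactoring above, the collection (via its submission configuration) now decides which plugin answers for a field, so callers pass the collection to the service rather than to the plugin. A hedged sketch of a lookup through the collection-aware service methods shown here (the field key and paging values are examples):

import org.dspace.content.Collection;
import org.dspace.content.authority.Choices;
import org.dspace.content.authority.service.ChoiceAuthorityService;

public class CollectionAwareLookupSketch {
    public Choices lookupAuthors(ChoiceAuthorityService choiceAuthorityService, Collection collection,
                                 String query) {
        // Field keys use '_' separators, as produced by makeFieldKey()
        String fieldKey = "dc_contributor_author";
        // The service resolves the right plugin for this collection's submission definition
        return choiceAuthorityService.getMatches(fieldKey, query, collection, 0, 10, null);
    }
}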
@Override - public boolean isChoicesConfigured(String fieldKey) { - return getChoiceAuthorityMap().containsKey(fieldKey); + public boolean isChoicesConfigured(String fieldKey, Collection collection) { + return getAuthorityByFieldKeyCollection(fieldKey, collection) != null; } @Override @@ -178,8 +216,14 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService } @Override - public List getVariants(MetadataValue metadataValue) { - ChoiceAuthority ma = getChoiceAuthorityMap().get(metadataValue.getMetadataField().toString()); + public List getVariants(MetadataValue metadataValue, Collection collection) { + String fieldKey = metadataValue.getMetadataField().toString(); + ChoiceAuthority ma = getAuthorityByFieldKeyCollection(fieldKey, collection); + if (ma == null) { + throw new IllegalArgumentException( + "No choices plugin was configured for field \"" + fieldKey + + "\", collection=" + collection.getID().toString() + "."); + } if (ma instanceof AuthorityVariantsSupport) { AuthorityVariantsSupport avs = (AuthorityVariantsSupport) ma; return avs.getVariants(metadataValue.getAuthority(), metadataValue.getLanguage()); @@ -189,42 +233,53 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService @Override - public String getChoiceAuthorityName(String schema, String element, String qualifier) { - String makeFieldKey = makeFieldKey(schema, element, qualifier); - if (getChoiceAuthorityMap().containsKey(makeFieldKey)) { - for (String key : this.authorities.keySet()) { - if (this.authorities.get(key).equals(makeFieldKey)) { - return key; + public String getChoiceAuthorityName(String schema, String element, String qualifier, Collection collection) { + init(); + String fieldKey = makeFieldKey(schema, element, qualifier); + // check if there is an authority configured for the metadata valid for all the collections + if (controller.containsKey(fieldKey)) { + for (Entry> authority2md : authorities.entrySet()) { + if (authority2md.getValue().contains(fieldKey)) { + return authority2md.getKey(); + } + } + } else if (collection != null && controllerFormDefinitions.containsKey(fieldKey)) { + // there is an authority configured for the metadata valid for some collections, + // check if it is the requested collection + Map controllerFormDef = controllerFormDefinitions.get(fieldKey); + SubmissionConfig submissionConfig = itemSubmissionConfigReader + .getSubmissionConfigByCollection(collection.getHandle()); + String submissionName = submissionConfig.getSubmissionName(); + // check if the requested collection has a submission definition that use an authority for the metadata + if (controllerFormDef.containsKey(submissionName)) { + for (Entry>> authority2defs2md : + authoritiesFormDefinitions.entrySet()) { + List mdByDefinition = authority2defs2md.getValue().get(submissionName); + if (mdByDefinition != null && mdByDefinition.contains(fieldKey)) { + return authority2defs2md.getKey(); + } } } } - return configurationService.getProperty( - CHOICES_PLUGIN_PREFIX + schema + "." + element + (qualifier != null ? "." 
+ qualifier : "")); + return null; } protected String makeFieldKey(String schema, String element, String qualifier) { return Utils.standardize(schema, element, qualifier, "_"); } - /** - * Return map of key to ChoiceAuthority plugin - * - * @return - */ - private Map getChoiceAuthorityMap() { - // If empty, load from configuration - if (controller.isEmpty()) { - loadChoiceAuthorityConfigurations(); - } - - return controller; - } - @Override public void clearCache() { controller.clear(); authorities.clear(); + presentation.clear(); + closed.clear(); + controllerFormDefinitions.clear(); + authoritiesFormDefinitions.clear(); + itemSubmissionConfigReader = null; + initialized = false; } + private void loadChoiceAuthorityConfigurations() { // Get all configuration keys starting with a given prefix List propKeys = configurationService.getPropertyKeys(CHOICES_PLUGIN_PREFIX); @@ -249,71 +304,127 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService "Skipping invalid configuration for " + key + " because named plugin not found: " + authorityName); continue; } - if (!authorities.containsKey(authorityName)) { - controller.put(fkey, ma); - authorities.put(authorityName, fkey); - } else { - log.warn( - "Skipping invalid configuration for " + key + " because plugin is alredy in use: " + - authorityName + " used by " + authorities - .get(authorityName)); - continue; - } + controller.put(fkey, ma); + List fkeys; + if (authorities.containsKey(authorityName)) { + fkeys = authorities.get(authorityName); + } else { + fkeys = new ArrayList(); + } + fkeys.add(fkey); + authorities.put(authorityName, fkeys); log.debug("Choice Control: For field=" + fkey + ", Plugin=" + ma); } autoRegisterChoiceAuthorityFromInputReader(); } + /** + * This method will register all the authorities that are required due to the + * submission forms configuration. 
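getChoiceAuthorityName() above now resolves the authority name per collection: it first checks the site-wide configuration, then falls back to the submission definition bound to the collection, and returns null when nothing is configured. A hedged caller-side sketch combining it with getChoiceAuthorityByAuthorityName(), assuming the service interface mirrors the implementation signatures shown here (the metadata field is an example value):

import org.dspace.content.Collection;
import org.dspace.content.authority.ChoiceAuthority;
import org.dspace.content.authority.service.ChoiceAuthorityService;

public class AuthorityNameResolutionSketch {
    public ChoiceAuthority resolve(ChoiceAuthorityService choiceAuthorityService, Collection collection) {
        String authorityName = choiceAuthorityService.getChoiceAuthorityName("dc", "subject", null, collection);
        if (authorityName == null) {
            return null; // no authority configured for this field, neither globally nor for this collection
        }
        return choiceAuthorityService.getChoiceAuthorityByAuthorityName(authorityName);
    }
}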
This includes authorities for value pairs and + * xml vocabularies + */ private void autoRegisterChoiceAuthorityFromInputReader() { try { + List submissionConfigs = itemSubmissionConfigReader + .getAllSubmissionConfigs(Integer.MAX_VALUE, 0); DCInputsReader dcInputsReader = new DCInputsReader(); - for (DCInputSet dcinputSet : dcInputsReader.getAllInputs(Integer.MAX_VALUE, 0)) { - DCInput[][] dcinputs = dcinputSet.getFields(); - for (DCInput[] dcrows : dcinputs) { - for (DCInput dcinput : dcrows) { - if (StringUtils.isNotBlank(dcinput.getPairsType()) - || StringUtils.isNotBlank(dcinput.getVocabulary())) { - String authorityName = dcinput.getPairsType(); - if (StringUtils.isBlank(authorityName)) { + + // loop over all the defined item submission configuration + for (SubmissionConfig subCfg : submissionConfigs) { + String submissionName = subCfg.getSubmissionName(); + List inputsBySubmissionName = dcInputsReader.getInputsBySubmissionName(submissionName); + // loop over the submission forms configuration eventually associated with the submission panel + for (DCInputSet dcinputSet : inputsBySubmissionName) { + DCInput[][] dcinputs = dcinputSet.getFields(); + for (DCInput[] dcrows : dcinputs) { + for (DCInput dcinput : dcrows) { + // for each input in the form check if it is associated with a real value pairs + // or an xml vocabulary + String authorityName = null; + if (StringUtils.isNotBlank(dcinput.getPairsType()) + && !StringUtils.equals(dcinput.getInputType(), "qualdrop_value")) { + authorityName = dcinput.getPairsType(); + } else if (StringUtils.isNotBlank(dcinput.getVocabulary())) { authorityName = dcinput.getVocabulary(); } - if (!StringUtils.equals(dcinput.getInputType(), "qualdrop_value")) { + + // do we have an authority? + if (StringUtils.isNotBlank(authorityName)) { String fieldKey = makeFieldKey(dcinput.getSchema(), dcinput.getElement(), dcinput.getQualifier()); ChoiceAuthority ca = controller.get(authorityName); if (ca == null) { - InputFormSelfRegisterWrapperAuthority ifa = new - InputFormSelfRegisterWrapperAuthority(); - if (controller.containsKey(fieldKey)) { - ifa = (InputFormSelfRegisterWrapperAuthority) controller.get(fieldKey); - } - - ChoiceAuthority ma = (ChoiceAuthority) pluginService + ca = (ChoiceAuthority) pluginService .getNamedPlugin(ChoiceAuthority.class, authorityName); - if (ma == null) { - log.warn("Skipping invalid configuration for " + fieldKey - + " because named plugin not found: " + authorityName); - continue; + if (ca == null) { + throw new IllegalStateException("Invalid configuration for " + fieldKey + + " in submission definition " + submissionName + + ", form definition " + dcinputSet.getFormName() + + " no named plugin found: " + authorityName); } - ifa.getDelegates().put(dcinputSet.getFormName(), ma); - controller.put(fieldKey, ifa); - } - - if (!authorities.containsKey(authorityName)) { - authorities.put(authorityName, fieldKey); } + addAuthorityToFormCacheMap(submissionName, fieldKey, ca); + addFormDetailsToAuthorityCacheMap(submissionName, authorityName, fieldKey); } } } } } } catch (DCInputsReaderException e) { - throw new IllegalStateException(e.getMessage(), e); + // the system is in an illegal state as the submission definition is not valid + throw new IllegalStateException("Error reading the item submission configuration: " + e.getMessage(), + e); } } + /** + * Add the form/field to the cache map keeping track of which form/field are + * associated with the specific authority name + * + * @param submissionName the form definition name + * 
@param authorityName the name of the authority plugin + * @param fieldKey the field key that use the authority + */ + private void addFormDetailsToAuthorityCacheMap(String submissionName, String authorityName, String fieldKey) { + Map> submissionDefinitionNames2fieldKeys; + if (authoritiesFormDefinitions.containsKey(authorityName)) { + submissionDefinitionNames2fieldKeys = authoritiesFormDefinitions.get(authorityName); + } else { + submissionDefinitionNames2fieldKeys = new HashMap>(); + } + + List fields; + if (submissionDefinitionNames2fieldKeys.containsKey(submissionName)) { + fields = submissionDefinitionNames2fieldKeys.get(submissionName); + } else { + fields = new ArrayList(); + } + fields.add(fieldKey); + submissionDefinitionNames2fieldKeys.put(submissionName, fields); + authoritiesFormDefinitions.put(authorityName, submissionDefinitionNames2fieldKeys); + } + + /** + * Add the authority plugin to the cache map keeping track of which authority is + * used by a specific form/field + * + * @param submissionName the submission definition name + * @param fieldKey the field key that require the authority + * @param ca the authority plugin + */ + private void addAuthorityToFormCacheMap(String submissionName, String fieldKey, ChoiceAuthority ca) { + Map definition2authority; + if (controllerFormDefinitions.containsKey(fieldKey)) { + definition2authority = controllerFormDefinitions.get(fieldKey); + } else { + definition2authority = new HashMap(); + } + definition2authority.put(submissionName, ca); + controllerFormDefinitions.put(fieldKey, definition2authority); + } + /** * Return map of key to presentation * @@ -370,26 +481,6 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService return closed; } - @Override - public String getChoiceMetadatabyAuthorityName(String name) { - if (authorities.isEmpty()) { - loadChoiceAuthorityConfigurations(); - } - if (authorities.containsKey(name)) { - return authorities.get(name); - } - return null; - } - - @Override - public Choice getChoice(String fieldKey, String authKey, String locale) { - ChoiceAuthority ma = getChoiceAuthorityMap().get(fieldKey); - if (ma == null) { - throw new IllegalArgumentException("No choices plugin was configured for field \"" + fieldKey + "\"."); - } - return ma.getChoice(fieldKey, authKey, locale); - } - @Override public ChoiceAuthority getChoiceAuthorityByAuthorityName(String authorityName) { ChoiceAuthority ma = (ChoiceAuthority) @@ -401,4 +492,68 @@ public final class ChoiceAuthorityServiceImpl implements ChoiceAuthorityService } return ma; } + + private ChoiceAuthority getAuthorityByFieldKeyCollection(String fieldKey, Collection collection) { + init(); + ChoiceAuthority ma = controller.get(fieldKey); + if (ma == null && collection != null) { + SubmissionConfigReader configReader; + try { + configReader = new SubmissionConfigReader(); + SubmissionConfig submissionName = configReader.getSubmissionConfigByCollection(collection.getHandle()); + ma = controllerFormDefinitions.get(fieldKey).get(submissionName.getSubmissionName()); + } catch (SubmissionConfigReaderException e) { + // the system is in an illegal state as the submission definition is not valid + throw new IllegalStateException("Error reading the item submission configuration: " + e.getMessage(), + e); + } + } + return ma; + } + + @Override + public boolean storeAuthority(String fieldKey, Collection collection) { + // currently only named authority can eventually provide real authority + return controller.containsKey(fieldKey); + } + + /** + 
* Wrapper that calls getChoicesByParent method of the plugin. + * + * @param authorityName authority name + * @param parentId parent Id + * @param start choice at which to start, 0 is first. + * @param limit maximum number of choices to return, 0 for no limit. + * @param locale explicit localization key if available, or null + * @return a Choices object (never null). + * @see org.dspace.content.authority.ChoiceAuthority#getChoicesByParent(java.lang.String, java.lang.String, + * int, int, java.lang.String) + */ + @Override + public Choices getChoicesByParent(String authorityName, String parentId, int start, int limit, String locale) { + HierarchicalAuthority ma = (HierarchicalAuthority) getChoiceAuthorityByAuthorityName(authorityName); + return ma.getChoicesByParent(authorityName, parentId, start, limit, locale); + } + + /** + * Wrapper that calls getTopChoices method of the plugin. + * + * @param authorityName authority name + * @param start choice at which to start, 0 is first. + * @param limit maximum number of choices to return, 0 for no limit. + * @param locale explicit localization key if available, or null + * @return a Choices object (never null). + * @see org.dspace.content.authority.ChoiceAuthority#getTopChoices(java.lang.String, int, int, java.lang.String) + */ + @Override + public Choices getTopChoices(String authorityName, int start, int limit, String locale) { + HierarchicalAuthority ma = (HierarchicalAuthority) getChoiceAuthorityByAuthorityName(authorityName); + return ma.getTopChoices(authorityName, start, limit, locale); + } + + @Override + public Choice getParentChoice(String authorityName, String vocabularyId, String locale) { + HierarchicalAuthority ma = (HierarchicalAuthority) getChoiceAuthorityByAuthorityName(authorityName); + return ma.getParentChoice(authorityName, vocabularyId, locale); + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/DCInputAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/DCInputAuthority.java index a64ebdd971..b1d8cf36a5 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/DCInputAuthority.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/DCInputAuthority.java @@ -9,14 +9,20 @@ package org.dspace.content.authority; import java.util.ArrayList; import java.util.Arrays; +import java.util.HashMap; +import java.util.HashSet; import java.util.Iterator; import java.util.List; +import java.util.Locale; +import java.util.Map; +import java.util.Set; import org.apache.commons.lang3.ArrayUtils; +import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.app.util.DCInputsReader; import org.dspace.app.util.DCInputsReaderException; -import org.dspace.content.Collection; +import org.dspace.core.I18nUtil; import org.dspace.core.SelfNamedPlugin; /** @@ -44,16 +50,38 @@ import org.dspace.core.SelfNamedPlugin; public class DCInputAuthority extends SelfNamedPlugin implements ChoiceAuthority { private static Logger log = org.apache.logging.log4j.LogManager.getLogger(DCInputAuthority.class); - private String values[] = null; - private String labels[] = null; + /** + * The map of the values available for a specific language. Examples of keys are + * "en", "it", "uk" + */ + private Map values = null; - private static DCInputsReader dci = null; + /** + * The map of the labels available for a specific language. 
Examples of keys are + * "en", "it", "uk" + */ + private Map labels = null; + + /** + * The map of the input form reader associated to use for a specific java locale + */ + private static Map dcis = null; private static String pluginNames[] = null; public DCInputAuthority() { super(); } + @Override + public boolean storeAuthorityInMetadata() { + // For backward compatibility value pairs don't store authority in + // the metadatavalue + return false; + } + public static void reset() { + pluginNames = null; + } + public static String[] getPluginNames() { if (pluginNames == null) { initPluginNames(); @@ -63,20 +91,28 @@ public class DCInputAuthority extends SelfNamedPlugin implements ChoiceAuthority } private static synchronized void initPluginNames() { + Locale[] locales = I18nUtil.getSupportedLocales(); + Set names = new HashSet(); if (pluginNames == null) { try { - if (dci == null) { - dci = new DCInputsReader(); + dcis = new HashMap(); + for (Locale locale : locales) { + dcis.put(locale, new DCInputsReader(I18nUtil.getInputFormsFileName(locale))); + } + for (Locale l : locales) { + Iterator pi = dcis.get(l).getPairsNameIterator(); + while (pi.hasNext()) { + names.add((String) pi.next()); + } + } + DCInputsReader dcirDefault = new DCInputsReader(); + Iterator pi = dcirDefault.getPairsNameIterator(); + while (pi.hasNext()) { + names.add((String) pi.next()); } } catch (DCInputsReaderException e) { log.error("Failed reading DCInputs initialization: ", e); } - List names = new ArrayList(); - Iterator pi = dci.getPairsNameIterator(); - while (pi.hasNext()) { - names.add((String) pi.next()); - } - pluginNames = names.toArray(new String[names.size()]); log.debug("Got plugin names = " + Arrays.deepToString(pluginNames)); } @@ -85,45 +121,65 @@ public class DCInputAuthority extends SelfNamedPlugin implements ChoiceAuthority // once-only load of values and labels private void init() { if (values == null) { + values = new HashMap(); + labels = new HashMap(); String pname = this.getPluginInstanceName(); - List pairs = dci.getPairs(pname); - if (pairs != null) { - values = new String[pairs.size() / 2]; - labels = new String[pairs.size() / 2]; - for (int i = 0; i < pairs.size(); i += 2) { - labels[i / 2] = pairs.get(i); - values[i / 2] = pairs.get(i + 1); + for (Locale l : dcis.keySet()) { + DCInputsReader dci = dcis.get(l); + List pairs = dci.getPairs(pname); + if (pairs != null) { + String[] valuesLocale = new String[pairs.size() / 2]; + String[]labelsLocale = new String[pairs.size() / 2]; + for (int i = 0; i < pairs.size(); i += 2) { + labelsLocale[i / 2] = pairs.get(i); + valuesLocale[i / 2] = pairs.get(i + 1); + } + values.put(l.getLanguage(), valuesLocale); + labels.put(l.getLanguage(), labelsLocale); + log.debug("Found pairs for name=" + pname + ",locale=" + l); + } else { + log.error("Failed to find any pairs for name=" + pname, new IllegalStateException()); } - log.debug("Found pairs for name=" + pname); - } else { - log.error("Failed to find any pairs for name=" + pname, new IllegalStateException()); } + } } @Override - public Choices getMatches(String field, String query, Collection collection, int start, int limit, String locale) { + public Choices getMatches(String query, int start, int limit, String locale) { init(); - + Locale currentLocale = I18nUtil.getSupportedLocale(locale); + String[] valuesLocale = values.get(currentLocale.getLanguage()); + String[] labelsLocale = labels.get(currentLocale.getLanguage()); int dflt = -1; - Choice v[] = new Choice[values.length]; - for (int i = 0; i < 
values.length; ++i) { - v[i] = new Choice(values[i], values[i], labels[i]); - if (values[i].equalsIgnoreCase(query)) { - dflt = i; + int found = 0; + List v = new ArrayList(); + for (int i = 0; i < valuesLocale.length; ++i) { + if (query == null || StringUtils.containsIgnoreCase(valuesLocale[i], query)) { + if (found >= start && v.size() < limit) { + v.add(new Choice(null, valuesLocale[i], labelsLocale[i])); + if (valuesLocale[i].equalsIgnoreCase(query)) { + dflt = i; + } + } + found++; } } - return new Choices(v, 0, v.length, Choices.CF_AMBIGUOUS, false, dflt); + Choice[] vArray = new Choice[v.size()]; + return new Choices(v.toArray(vArray), start, found, Choices.CF_AMBIGUOUS, false, dflt); } @Override - public Choices getBestMatch(String field, String text, Collection collection, String locale) { + public Choices getBestMatch(String text, String locale) { init(); - for (int i = 0; i < values.length; ++i) { - if (text.equalsIgnoreCase(values[i])) { + Locale currentLocale = I18nUtil.getSupportedLocale(locale); + String[] valuesLocale = values.get(currentLocale.getLanguage()); + String[] labelsLocale = labels.get(currentLocale.getLanguage()); + for (int i = 0; i < valuesLocale.length; ++i) { + if (text.equalsIgnoreCase(valuesLocale[i])) { Choice v[] = new Choice[1]; - v[0] = new Choice(String.valueOf(i), values[i], labels[i]); + v[0] = new Choice(String.valueOf(i), valuesLocale[i], labelsLocale[i]); return new Choices(v, 0, v.length, Choices.CF_UNCERTAIN, false, 0); } } @@ -131,19 +187,31 @@ public class DCInputAuthority extends SelfNamedPlugin implements ChoiceAuthority } @Override - public String getLabel(String field, String key, String locale) { + public String getLabel(String key, String locale) { init(); + + // Get default if locale is empty + if (StringUtils.isBlank(locale)) { + locale = I18nUtil.getDefaultLocale().getLanguage(); + } + + String[] labelsLocale = labels.get(locale); int pos = -1; - for (int i = 0; i < values.length; i++) { - if (values[i].equals(key)) { + for (int i = 0; i < labelsLocale.length; i++) { + if (labelsLocale[i].equals(key)) { pos = i; break; } } if (pos != -1) { - return labels[pos]; + return labelsLocale[pos]; } else { return "UNKNOWN KEY " + key; } } + + @Override + public boolean isScrollable() { + return true; + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/DSpaceControlledVocabulary.java b/dspace-api/src/main/java/org/dspace/content/authority/DSpaceControlledVocabulary.java index 097a19eb13..00c74bea9d 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/DSpaceControlledVocabulary.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/DSpaceControlledVocabulary.java @@ -10,7 +10,9 @@ package org.dspace.content.authority; import java.io.File; import java.util.ArrayList; import java.util.Arrays; +import java.util.HashMap; import java.util.List; +import java.util.Map; import javax.xml.xpath.XPath; import javax.xml.xpath.XPathConstants; import javax.xml.xpath.XPathExpressionException; @@ -19,7 +21,6 @@ import javax.xml.xpath.XPathFactory; import org.apache.commons.lang3.ArrayUtils; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; -import org.dspace.content.Collection; import org.dspace.core.SelfNamedPlugin; import org.dspace.services.ConfigurationService; import org.dspace.services.factory.DSpaceServicesFactory; @@ -54,25 +55,35 @@ import org.xml.sax.InputSource; * @author Michael B. 
Klein */ -public class DSpaceControlledVocabulary extends SelfNamedPlugin implements ChoiceAuthority { +public class DSpaceControlledVocabulary extends SelfNamedPlugin implements HierarchicalAuthority { private static Logger log = org.apache.logging.log4j.LogManager.getLogger(DSpaceControlledVocabulary.class); protected static String xpathTemplate = "//node[contains(translate(@label,'ABCDEFGHIJKLMNOPQRSTUVWXYZ'," + "'abcdefghijklmnopqrstuvwxyz'),'%s')]"; protected static String idTemplate = "//node[@id = '%s']"; - protected static String idParentTemplate = "//node[@id = '%s']/parent::isComposedBy"; + protected static String labelTemplate = "//node[@label = '%s']"; + protected static String idParentTemplate = "//node[@id = '%s']/parent::isComposedBy/parent::node"; + protected static String rootTemplate = "/node"; protected static String pluginNames[] = null; protected String vocabularyName = null; protected InputSource vocabulary = null; - protected Boolean suggestHierarchy = true; + protected Boolean suggestHierarchy = false; protected Boolean storeHierarchy = true; protected String hierarchyDelimiter = "::"; + protected Integer preloadLevel = 1; public DSpaceControlledVocabulary() { super(); } + @Override + public boolean storeAuthorityInMetadata() { + // For backward compatibility controlled vocabularies don't store the node id in + // the metadatavalue + return false; + } + public static String[] getPluginNames() { if (pluginNames == null) { initPluginNames(); @@ -112,6 +123,7 @@ public class DSpaceControlledVocabulary extends SelfNamedPlugin implements Choic String configurationPrefix = "vocabulary.plugin." + vocabularyName; storeHierarchy = config.getBooleanProperty(configurationPrefix + ".hierarchy.store", storeHierarchy); suggestHierarchy = config.getBooleanProperty(configurationPrefix + ".hierarchy.suggest", suggestHierarchy); + preloadLevel = config.getIntProperty(configurationPrefix + ".hierarchy.preloadLevel", preloadLevel); String configuredDelimiter = config.getProperty(configurationPrefix + ".delimiter"); if (configuredDelimiter != null) { hierarchyDelimiter = configuredDelimiter.replaceAll("(^\"|\"$)", ""); @@ -142,7 +154,7 @@ public class DSpaceControlledVocabulary extends SelfNamedPlugin implements Choic } @Override - public Choices getMatches(String field, String text, Collection collection, int start, int limit, String locale) { + public Choices getMatches(String text, int start, int limit, String locale) { init(); log.debug("Getting matches for '" + text + "'"); String xpathExpression = ""; @@ -151,59 +163,60 @@ public class DSpaceControlledVocabulary extends SelfNamedPlugin implements Choic xpathExpression += String.format(xpathTemplate, textHierarchy[i].replaceAll("'", "'").toLowerCase()); } XPath xpath = XPathFactory.newInstance().newXPath(); - Choice[] choices; + int total = 0; + List choices = new ArrayList(); try { NodeList results = (NodeList) xpath.evaluate(xpathExpression, vocabulary, XPathConstants.NODESET); - String[] authorities = new String[results.getLength()]; - String[] values = new String[results.getLength()]; - String[] labels = new String[results.getLength()]; - String[] parent = new String[results.getLength()]; - String[] notes = new String[results.getLength()]; - for (int i = 0; i < results.getLength(); i++) { - Node node = results.item(i); - readNode(authorities, values, labels, parent, notes, i, node); - } - int resultCount = labels.length - start; - // limit = 0 means no limit - if ((limit > 0) && (resultCount > limit)) { - resultCount = limit; - 
} - choices = new Choice[resultCount]; - if (resultCount > 0) { - for (int i = 0; i < resultCount; i++) { - choices[i] = new Choice(authorities[start + i], values[start + i], labels[start + i]); - if (StringUtils.isNotBlank(parent[i])) { - choices[i].extras.put("parent", parent[i]); - } - if (StringUtils.isNotBlank(notes[i])) { - choices[i].extras.put("note", notes[i]); - } - } - } + total = results.getLength(); + choices = getChoicesFromNodeList(results, start, limit); } catch (XPathExpressionException e) { - choices = new Choice[0]; + log.warn(e.getMessage(), e); + return new Choices(true); } - return new Choices(choices, 0, choices.length, Choices.CF_AMBIGUOUS, false); + return new Choices(choices.toArray(new Choice[choices.size()]), start, total, Choices.CF_AMBIGUOUS, + total > start + limit); } @Override - public Choices getBestMatch(String field, String text, Collection collection, String locale) { + public Choices getBestMatch(String text, String locale) { init(); - log.debug("Getting best match for '" + text + "'"); - return getMatches(field, text, collection, 0, 2, locale); - } - - @Override - public String getLabel(String field, String key, String locale) { - init(); - String xpathExpression = String.format(idTemplate, key); + log.debug("Getting best matches for '" + text + "'"); + String xpathExpression = ""; + String[] textHierarchy = text.split(hierarchyDelimiter, -1); + for (int i = 0; i < textHierarchy.length; i++) { + xpathExpression += String.format(labelTemplate, textHierarchy[i].replaceAll("'", "'")); + } XPath xpath = XPathFactory.newInstance().newXPath(); + List choices = new ArrayList(); try { - Node node = (Node) xpath.evaluate(xpathExpression, vocabulary, XPathConstants.NODE); - return node.getAttributes().getNamedItem("label").getNodeValue(); + NodeList results = (NodeList) xpath.evaluate(xpathExpression, vocabulary, XPathConstants.NODESET); + choices = getChoicesFromNodeList(results, 0, 1); } catch (XPathExpressionException e) { - return (""); + log.warn(e.getMessage(), e); + return new Choices(true); } + return new Choices(choices.toArray(new Choice[choices.size()]), 0, choices.size(), Choices.CF_AMBIGUOUS, false); + } + + @Override + public String getLabel(String key, String locale) { + return getNodeLabel(key, this.suggestHierarchy); + } + + @Override + public String getValue(String key, String locale) { + return getNodeLabel(key, this.storeHierarchy); + } + + @Override + public Choice getChoice(String authKey, String locale) { + Node node; + try { + node = getNode(authKey); + } catch (XPathExpressionException e) { + return null; + } + return createChoiceFromNode(node); } @Override @@ -212,81 +225,227 @@ public class DSpaceControlledVocabulary extends SelfNamedPlugin implements Choic } @Override - public Choice getChoice(String fieldKey, String authKey, String locale) { + public Choices getTopChoices(String authorityName, int start, int limit, String locale) { init(); - log.debug("Getting matches for '" + authKey + "'"); - String xpathExpression = String.format(idTemplate, authKey); - XPath xpath = XPathFactory.newInstance().newXPath(); - try { - Node node = (Node) xpath.evaluate(xpathExpression, vocabulary, XPathConstants.NODE); - if (node != null) { - String[] authorities = new String[1]; - String[] values = new String[1]; - String[] labels = new String[1]; - String[] parent = new String[1]; - String[] note = new String[1]; - readNode(authorities, values, labels, parent, note, 0, node); - - if (values.length > 0) { - Choice choice = new Choice(authorities[0], 
values[0], labels[0]); - if (StringUtils.isNotBlank(parent[0])) { - choice.extras.put("parent", parent[0]); - } - if (StringUtils.isNotBlank(note[0])) { - choice.extras.put("note", note[0]); - } - return choice; - } - } - } catch (XPathExpressionException e) { - log.warn(e.getMessage(), e); - } - return null; + String xpathExpression = rootTemplate; + return getChoicesByXpath(xpathExpression, start, limit); } - private void readNode(String[] authorities, String[] values, String[] labels, String[] parent, String[] notes, - int i, Node node) { + @Override + public Choices getChoicesByParent(String authorityName, String parentId, int start, int limit, String locale) { + init(); + String xpathExpression = String.format(idTemplate, parentId); + return getChoicesByXpath(xpathExpression, start, limit); + } + + @Override + public Choice getParentChoice(String authorityName, String childId, String locale) { + init(); + try { + String xpathExpression = String.format(idParentTemplate, childId); + Choice choice = createChoiceFromNode(getNodeFromXPath(xpathExpression)); + return choice; + } catch (XPathExpressionException e) { + log.error(e.getMessage(), e); + return null; + } + } + + @Override + public Integer getPreloadLevel() { + return preloadLevel; + } + + private boolean isRootElement(Node node) { + if (node != null && node.getOwnerDocument().getDocumentElement().equals(node)) { + return true; + } + return false; + } + + private Node getNode(String key) throws XPathExpressionException { + init(); + String xpathExpression = String.format(idTemplate, key); + Node node = getNodeFromXPath(xpathExpression); + return node; + } + + private Node getNodeFromXPath(String xpathExpression) throws XPathExpressionException { + XPath xpath = XPathFactory.newInstance().newXPath(); + Node node = (Node) xpath.evaluate(xpathExpression, vocabulary, XPathConstants.NODE); + return node; + } + + private List getChoicesFromNodeList(NodeList results, int start, int limit) { + List choices = new ArrayList(); + for (int i = 0; i < results.getLength(); i++) { + if (i < start) { + continue; + } + if (choices.size() == limit) { + break; + } + Node node = results.item(i); + Choice choice = new Choice(getAuthority(node), getLabel(node), getValue(node), + isSelectable(node)); + choice.extras = addOtherInformation(getParent(node), getNote(node), getChildren(node), getAuthority(node)); + choices.add(choice); + } + return choices; + } + + private Map addOtherInformation(String parentCurr, String noteCurr, + List childrenCurr, String authorityCurr) { + Map extras = new HashMap(); + if (StringUtils.isNotBlank(parentCurr)) { + extras.put("parent", parentCurr); + } + if (StringUtils.isNotBlank(noteCurr)) { + extras.put("note", noteCurr); + } + if (childrenCurr.size() > 0) { + extras.put("hasChildren", "true"); + } else { + extras.put("hasChildren", "false"); + } + extras.put("id", authorityCurr); + return extras; + } + + private String getNodeLabel(String key, boolean useHierarchy) { + try { + Node node = getNode(key); + if (useHierarchy) { + return this.buildString(node); + } else { + return node.getAttributes().getNamedItem("label").getNodeValue(); + } + } catch (XPathExpressionException e) { + return (""); + } + } + + private String getLabel(Node node) { String hierarchy = this.buildString(node); if (this.suggestHierarchy) { - labels[i] = hierarchy; + return hierarchy; } else { - labels[i] = node.getAttributes().getNamedItem("label").getNodeValue(); - } - if (this.storeHierarchy) { - values[i] = hierarchy; - } else { - values[i] = 
node.getAttributes().getNamedItem("label").getNodeValue(); + return node.getAttributes().getNamedItem("label").getNodeValue(); } + } + private String getValue(Node node) { + String hierarchy = this.buildString(node); + if (this.storeHierarchy) { + return hierarchy; + } else { + return node.getAttributes().getNamedItem("label").getNodeValue(); + } + } + + private String getNote(Node node) { NodeList childNodes = node.getChildNodes(); for (int ci = 0; ci < childNodes.getLength(); ci++) { Node firstChild = childNodes.item(ci); if (firstChild != null && "hasNote".equals(firstChild.getNodeName())) { String nodeValue = firstChild.getTextContent(); if (StringUtils.isNotBlank(nodeValue)) { - notes[i] = nodeValue; + return nodeValue; } } } - Node idAttr = node.getAttributes().getNamedItem("id"); - if (null != idAttr) { // 'id' is optional - authorities[i] = idAttr.getNodeValue(); - if (isHierarchical()) { - Node parentN = node.getParentNode(); - if (parentN != null) { - parentN = parentN.getParentNode(); - if (parentN != null) { - Node parentIdAttr = parentN.getAttributes().getNamedItem("id"); - if (null != parentIdAttr) { - parent[i] = parentIdAttr.getNodeValue(); + return null; + } + + private List getChildren(Node node) { + List children = new ArrayList(); + NodeList childNodes = node.getChildNodes(); + for (int ci = 0; ci < childNodes.getLength(); ci++) { + Node firstChild = childNodes.item(ci); + if (firstChild != null && "isComposedBy".equals(firstChild.getNodeName())) { + for (int cii = 0; cii < firstChild.getChildNodes().getLength(); cii++) { + Node childN = firstChild.getChildNodes().item(cii); + if (childN != null && "node".equals(childN.getNodeName())) { + Node childIdAttr = childN.getAttributes().getNamedItem("id"); + if (null != childIdAttr) { + children.add(childIdAttr.getNodeValue()); } } } + break; } - } else { - authorities[i] = null; - parent[i] = null; + } + return children; + } + + private boolean isSelectable(Node node) { + Node selectableAttr = node.getAttributes().getNamedItem("selectable"); + if (null != selectableAttr) { + return Boolean.valueOf(selectableAttr.getNodeValue()); + } else { // Default is true + return true; } } + private String getParent(Node node) { + Node parentN = node.getParentNode(); + if (parentN != null) { + parentN = parentN.getParentNode(); + if (parentN != null && !isRootElement(parentN)) { + return buildString(parentN); + } + } + return null; + } + + private String getAuthority(Node node) { + Node idAttr = node.getAttributes().getNamedItem("id"); + if (null != idAttr) { // 'id' is optional + return idAttr.getNodeValue(); + } else { + return null; + } + } + + private Choices getChoicesByXpath(String xpathExpression, int start, int limit) { + List choices = new ArrayList(); + XPath xpath = XPathFactory.newInstance().newXPath(); + try { + Node parentNode = (Node) xpath.evaluate(xpathExpression, vocabulary, XPathConstants.NODE); + int count = 0; + if (parentNode != null) { + NodeList childNodes = (NodeList) xpath.evaluate(".//isComposedBy", parentNode, XPathConstants.NODE); + if (null != childNodes) { + for (int i = 0; i < childNodes.getLength(); i++) { + Node childNode = childNodes.item(i); + if (childNode != null && "node".equals(childNode.getNodeName())) { + if (count < start || choices.size() >= limit) { + count++; + continue; + } + count++; + choices.add(createChoiceFromNode(childNode)); + } + } + } + return new Choices(choices.toArray(new Choice[choices.size()]), start, count, + Choices.CF_AMBIGUOUS, false); + } + } catch (XPathExpressionException 
e) { + log.warn(e.getMessage(), e); + return new Choices(true); + } + return new Choices(false); + } + + private Choice createChoiceFromNode(Node node) { + if (node != null && !isRootElement(node)) { + Choice choice = new Choice(getAuthority(node), getLabel(node), getValue(node), + isSelectable(node)); + choice.extras = addOtherInformation(getParent(node), getNote(node),getChildren(node), getAuthority(node)); + return choice; + } + return null; + } + } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/HierarchicalAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/HierarchicalAuthority.java new file mode 100644 index 0000000000..c25b74d354 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/content/authority/HierarchicalAuthority.java @@ -0,0 +1,85 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.content.authority; + +/** + * Plugin interface that supplies an authority control mechanism for + * one metadata field. + * + * @author Larry Stone + * @see ChoiceAuthority + */ +public interface HierarchicalAuthority extends ChoiceAuthority { + + /** + * Get all values from the authority that match the preferred value. + * Note that the offering was entered by the user and may contain + * mixed/incorrect case, whitespace, etc so the plugin should be careful + * to clean up user data before making comparisons. + * + * Value of a "Name" field will be in canonical DSpace person name format, + * which is "Lastname, Firstname(s)", e.g. "Smith, John Q.". + * + * Some authorities with a small set of values may simply return the whole + * set for any sample value, although it's a good idea to set the + * defaultSelected index in the Choices instance to the choice, if any, + * that matches the value. + * + * @param authorityName authority name + * @param start choice at which to start, 0 is first. + * @param limit maximum number of choices to return, 0 for no limit. + * @param locale explicit localization key if available, or null + * @return a Choices object (never null). + */ + public Choices getTopChoices(String authorityName, int start, int limit, String locale); + + /** + * Get all values from the authority that match the preferred value. + * Note that the offering was entered by the user and may contain + * mixed/incorrect case, whitespace, etc so the plugin should be careful + * to clean up user data before making comparisons. + * + * Value of a "Name" field will be in canonical DSpace person name format, + * which is "Lastname, Firstname(s)", e.g. "Smith, John Q.". + * + * Some authorities with a small set of values may simply return the whole + * set for any sample value, although it's a good idea to set the + * defaultSelected index in the Choices instance to the choice, if any, + * that matches the value. + * + * @param authorityName authority name + * @param parentId user's value to match + * @param start choice at which to start, 0 is first. + * @param limit maximum number of choices to return, 0 for no limit. + * @param locale explicit localization key if available, or null + * @return a Choices object (never null). 
+ */ + public Choices getChoicesByParent(String authorityName, String parentId, int start, int limit, String locale); + + /** + * It returns the parent choice in the hierarchy if any + * + * @param authorityName authority name + * @param vocabularyId child id + * @param locale explicit localization key if available, or null + * @return a Choice object + */ + public Choice getParentChoice(String authorityName, String vocabularyId, String locale); + + /** + * Provides a hint for the UI to preload some levels to improve the UX. It + * usually means that these preloaded levels will be shown expanded by default + */ + public Integer getPreloadLevel(); + + @Override + default boolean isHierarchical() { + return true; + } + +} \ No newline at end of file diff --git a/dspace-api/src/main/java/org/dspace/content/authority/InputFormSelfRegisterWrapperAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/InputFormSelfRegisterWrapperAuthority.java deleted file mode 100644 index 8716ef38b9..0000000000 --- a/dspace-api/src/main/java/org/dspace/content/authority/InputFormSelfRegisterWrapperAuthority.java +++ /dev/null @@ -1,166 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.content.authority; - -import java.util.Arrays; -import java.util.HashMap; -import java.util.HashSet; -import java.util.Map; -import java.util.Set; - -import org.apache.commons.lang3.StringUtils; -import org.apache.logging.log4j.Logger; -import org.dspace.app.util.DCInputsReader; -import org.dspace.app.util.DCInputsReaderException; -import org.dspace.content.Collection; - -/** - * This authority is registered automatically by the ChoiceAuthorityService for - * all the metadata that use a value-pair or a vocabulary in the submission-form.xml - * - * It keeps a map of form-name vs ChoiceAuthority to delegate the execution of - * the method to the specific ChoiceAuthority configured for the collection when - * the same metadata have different vocabulary or value-pair on a collection - * basis - * - * @author Andrea Bollini (andrea.bollini at 4science.it) - */ -public class InputFormSelfRegisterWrapperAuthority implements ChoiceAuthority { - - private static Logger log = - org.apache.logging.log4j.LogManager.getLogger(InputFormSelfRegisterWrapperAuthority.class); - - private Map delegates = new HashMap(); - - private static DCInputsReader dci = null; - - private void init() { - try { - if (dci == null) { - dci = new DCInputsReader(); - } - } catch (DCInputsReaderException e) { - log.error("Failed reading DCInputs initialization: ", e); - } - } - - @Override - public Choices getMatches(String field, String query, Collection collection, int start, int limit, String locale) { - String formName; - try { - init(); - if (collection == null) { - Set choices = new HashSet(); - //workaround search in all authority configured - for (ChoiceAuthority ca : delegates.values()) { - Choices tmp = ca.getMatches(field, query, null, start, limit, locale); - if (tmp.total > 0) { - Set mySet = new HashSet(Arrays.asList(tmp.values)); - choices.addAll(mySet); - } - } - if (!choices.isEmpty()) { - Choice[] results = new Choice[choices.size()]; - choices.toArray(results); - return new Choices(results, 0, choices.size(), Choices.CF_AMBIGUOUS, false); - } - } else { - formName = dci.getInputFormNameByCollectionAndField(collection,
field); - return delegates.get(formName).getMatches(field, query, collection, start, limit, locale); - } - } catch (DCInputsReaderException e) { - log.error(e.getMessage(), e); - } - return new Choices(Choices.CF_NOTFOUND); - } - - @Override - public Choices getBestMatch(String field, String text, Collection collection, String locale) { - String formName; - try { - init(); - if (collection == null) { - Set choices = new HashSet(); - //workaround search in all authority configured - for (ChoiceAuthority ca : delegates.values()) { - Choices tmp = ca.getBestMatch(field, text, null, locale); - if (tmp.total > 0) { - Set mySet = new HashSet(Arrays.asList(tmp.values)); - choices.addAll(mySet); - } - } - if (!choices.isEmpty()) { - Choice[] results = new Choice[choices.size() - 1]; - choices.toArray(results); - return new Choices(results, 0, choices.size(), Choices.CF_UNCERTAIN, false); - } - } else { - formName = dci.getInputFormNameByCollectionAndField(collection, field); - return delegates.get(formName).getBestMatch(field, text, collection, locale); - } - } catch (DCInputsReaderException e) { - log.error(e.getMessage(), e); - } - return new Choices(Choices.CF_NOTFOUND); - } - - @Override - public String getLabel(String field, String key, String locale) { - // TODO we need to manage REALLY the authority - // WRONG BEHAVIOUR: now in each delegates can exists the same key with - // different value - for (ChoiceAuthority delegate : delegates.values()) { - String label = delegate.getLabel(field, key, locale); - if (StringUtils.isNotBlank(label)) { - return label; - } - } - return "UNKNOWN KEY " + key; - } - - @Override - public boolean isHierarchical() { - // TODO we need to manage REALLY the authority - // WRONG BEHAVIOUR: now in each delegates can exists the same key with - // different value - for (ChoiceAuthority delegate : delegates.values()) { - return delegate.isHierarchical(); - } - return false; - } - - @Override - public boolean isScrollable() { - // TODO we need to manage REALLY the authority - // WRONG BEHAVIOUR: now in each delegates can exists the same key with - // different value - for (ChoiceAuthority delegate : delegates.values()) { - return delegate.isScrollable(); - } - return false; - } - - @Override - public boolean hasIdentifier() { - // TODO we need to manage REALLY the authority - // WRONG BEHAVIOUR: now in each delegates can exists the same key with - // different value - for (ChoiceAuthority delegate : delegates.values()) { - return delegate.hasIdentifier(); - } - return false; - } - - public Map getDelegates() { - return delegates; - } - - public void setDelegates(Map delegates) { - this.delegates = delegates; - } -} diff --git a/dspace-api/src/main/java/org/dspace/content/authority/MetadataAuthorityServiceImpl.java b/dspace-api/src/main/java/org/dspace/content/authority/MetadataAuthorityServiceImpl.java index 6a5b17a029..c542c6a89e 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/MetadataAuthorityServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/MetadataAuthorityServiceImpl.java @@ -14,12 +14,7 @@ import java.util.HashMap; import java.util.List; import java.util.Map; -import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; -import org.dspace.app.util.DCInput; -import org.dspace.app.util.DCInputSet; -import org.dspace.app.util.DCInputsReader; -import org.dspace.app.util.DCInputsReaderException; import org.dspace.content.MetadataField; import 
org.dspace.content.authority.service.MetadataAuthorityService; import org.dspace.content.service.MetadataFieldService; @@ -144,8 +139,6 @@ public class MetadataAuthorityServiceImpl implements MetadataAuthorityService { if (dmc >= Choices.CF_UNSET) { defaultMinConfidence = dmc; } - - autoRegisterAuthorityFromInputReader(); } } @@ -205,7 +198,6 @@ public class MetadataAuthorityServiceImpl implements MetadataAuthorityService { } } - /** * Give the minimal level of confidence required to consider valid an authority value * for the given metadata. @@ -229,35 +221,4 @@ public class MetadataAuthorityServiceImpl implements MetadataAuthorityService { } return copy; } - - - private void autoRegisterAuthorityFromInputReader() { - try { - DCInputsReader dcInputsReader = new DCInputsReader(); - for (DCInputSet dcinputSet : dcInputsReader.getAllInputs(Integer.MAX_VALUE, 0)) { - DCInput[][] dcinputs = dcinputSet.getFields(); - for (DCInput[] dcrows : dcinputs) { - for (DCInput dcinput : dcrows) { - if (StringUtils.isNotBlank(dcinput.getPairsType()) - || StringUtils.isNotBlank(dcinput.getVocabulary())) { - String authorityName = dcinput.getPairsType(); - if (StringUtils.isBlank(authorityName)) { - authorityName = dcinput.getVocabulary(); - } - if (!StringUtils.equals(dcinput.getInputType(), "qualdrop_value")) { - String fieldKey = makeFieldKey(dcinput.getSchema(), dcinput.getElement(), - dcinput.getQualifier()); - boolean req = ConfigurationManager - .getBooleanProperty("authority.required." + fieldKey, false); - controlled.put(fieldKey, true); - isAuthorityRequired.put(fieldKey, req); - } - } - } - } - } - } catch (DCInputsReaderException e) { - throw new IllegalStateException(e.getMessage(), e); - } - } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/SampleAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/SampleAuthority.java index 8197f180af..e6cc9b9d44 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/SampleAuthority.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/SampleAuthority.java @@ -7,13 +7,13 @@ */ package org.dspace.content.authority; -import org.dspace.content.Collection; - /** * This is a *very* stupid test fixture for authority control, and also * serves as a trivial example of an authority plugin implementation. 
*/ public class SampleAuthority implements ChoiceAuthority { + private String pluginInstanceName; + protected static String values[] = { "sun", "mon", @@ -35,7 +35,7 @@ public class SampleAuthority implements ChoiceAuthority { }; @Override - public Choices getMatches(String field, String query, Collection collection, int start, int limit, String locale) { + public Choices getMatches(String query, int start, int limit, String locale) { int dflt = -1; Choice v[] = new Choice[values.length]; for (int i = 0; i < values.length; ++i) { @@ -48,7 +48,7 @@ public class SampleAuthority implements ChoiceAuthority { } @Override - public Choices getBestMatch(String field, String text, Collection collection, String locale) { + public Choices getBestMatch(String text, String locale) { for (int i = 0; i < values.length; ++i) { if (text.equalsIgnoreCase(values[i])) { Choice v[] = new Choice[1]; @@ -60,7 +60,17 @@ public class SampleAuthority implements ChoiceAuthority { } @Override - public String getLabel(String field, String key, String locale) { + public String getLabel(String key, String locale) { return labels[Integer.parseInt(key)]; } + + @Override + public String getPluginInstanceName() { + return pluginInstanceName; + } + + @Override + public void setPluginInstanceName(String name) { + this.pluginInstanceName = name; + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/SolrAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/SolrAuthority.java index 5e913430b7..c93e6db786 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/SolrAuthority.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/SolrAuthority.java @@ -11,6 +11,7 @@ import java.util.ArrayList; import java.util.Iterator; import java.util.List; import java.util.Map; +import java.util.Map.Entry; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; @@ -24,8 +25,9 @@ import org.dspace.authority.AuthorityValue; import org.dspace.authority.SolrAuthorityInterface; import org.dspace.authority.factory.AuthorityServiceFactory; import org.dspace.authority.service.AuthorityValueService; -import org.dspace.content.Collection; import org.dspace.core.ConfigurationManager; +import org.dspace.core.NameAwarePlugin; +import org.dspace.services.ConfigurationService; import org.dspace.services.factory.DSpaceServicesFactory; /** @@ -35,7 +37,14 @@ import org.dspace.services.factory.DSpaceServicesFactory; * @author Mark Diggory (markd at atmire dot com) */ public class SolrAuthority implements ChoiceAuthority { + /** the name assigned to the specific instance by the PluginService, @see {@link NameAwarePlugin} **/ + private String authorityName; + /** + * the metadata managed by the plugin instance, derived from its authority name + * in the form schema_element_qualifier + */ + private String field; protected SolrAuthorityInterface source = DSpaceServicesFactory.getInstance().getServiceManager() .getServiceByName("AuthoritySource", SolrAuthorityInterface.class); @@ -45,8 +54,9 @@ public class SolrAuthority implements ChoiceAuthority { protected boolean externalResults = false; protected final AuthorityValueService authorityValueService = AuthorityServiceFactory.getInstance() .getAuthorityValueService(); - - public Choices getMatches(String field, String text, Collection collection, int start, int limit, String locale, + protected final ConfigurationService configurationService = DSpaceServicesFactory.getInstance() + .getConfigurationService(); + public Choices 
getMatches(String text, int start, int limit, String locale, boolean bestMatch) { if (limit == 0) { limit = 10; @@ -193,13 +203,13 @@ } } @Override - public Choices getMatches(String field, String text, Collection collection, int start, int limit, String locale) { - return getMatches(field, text, collection, start, limit, locale, true); + public Choices getMatches(String text, int start, int limit, String locale) { + return getMatches(text, start, limit, locale, true); } @Override - public Choices getBestMatch(String field, String text, Collection collection, String locale) { - Choices matches = getMatches(field, text, collection, 0, 1, locale, false); + public Choices getBestMatch(String text, String locale) { + Choices matches = getMatches(text, 0, 1, locale, false); if (matches.values.length != 0 && !matches.values[0].value.equalsIgnoreCase(text)) { matches = new Choices(false); } @@ -207,7 +217,7 @@ } @Override - public String getLabel(String field, String key, String locale) { + public String getLabel(String key, String locale) { try { if (log.isDebugEnabled()) { log.debug("requesting label for key " + key + " using locale " + locale); @@ -276,4 +286,23 @@ public void addExternalResultsInNextMatches() { this.externalResults = true; } + + @Override + public void setPluginInstanceName(String name) { + authorityName = name; + for (Entry conf : configurationService.getProperties().entrySet()) { + if (StringUtils.startsWith((String) conf.getKey(), ChoiceAuthorityServiceImpl.CHOICES_PLUGIN_PREFIX) + && StringUtils.equals((String) conf.getValue(), authorityName)) { + field = ((String) conf.getKey()).substring(ChoiceAuthorityServiceImpl.CHOICES_PLUGIN_PREFIX.length()) + .replace(".", "_"); + // exit the loop immediately as we have found it + break; + } + } + } + + @Override + public String getPluginInstanceName() { + return authorityName; + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/TestAuthority.java b/dspace-api/src/main/java/org/dspace/content/authority/TestAuthority.java index a017e8fe28..15c000e978 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/TestAuthority.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/TestAuthority.java @@ -11,7 +11,6 @@ import java.util.ArrayList; import java.util.List; import org.apache.commons.lang3.StringUtils; -import org.dspace.content.Collection; /** * This is a *very* stupid test fixture for authority control with AuthorityVariantsSupport.
@@ -19,6 +18,7 @@ import org.dspace.content.Collection; * @author Andrea Bollini (CILEA) */ public class TestAuthority implements ChoiceAuthority, AuthorityVariantsSupport { + private String pluginInstanceName; @Override public List getVariants(String key, String locale) { @@ -33,8 +33,7 @@ public class TestAuthority implements ChoiceAuthority, AuthorityVariantsSupport } @Override - public Choices getMatches(String field, String text, Collection collection, - int start, int limit, String locale) { + public Choices getMatches(String text, int start, int limit, String locale) { Choices choices = new Choices(false); if (StringUtils.isNotBlank(text)) { @@ -52,8 +51,7 @@ public class TestAuthority implements ChoiceAuthority, AuthorityVariantsSupport } @Override - public Choices getBestMatch(String field, String text, Collection collection, - String locale) { + public Choices getBestMatch(String text, String locale) { Choices choices = new Choices(false); if (StringUtils.isNotBlank(text)) { @@ -70,10 +68,20 @@ public class TestAuthority implements ChoiceAuthority, AuthorityVariantsSupport } @Override - public String getLabel(String field, String key, String locale) { + public String getLabel(String key, String locale) { if (StringUtils.isNotBlank(key)) { return key.replaceAll("authority", "label"); } return "Unknown"; } + + @Override + public String getPluginInstanceName() { + return pluginInstanceName; + } + + @Override + public void setPluginInstanceName(String name) { + this.pluginInstanceName = name; + } } diff --git a/dspace-api/src/main/java/org/dspace/content/authority/service/ChoiceAuthorityService.java b/dspace-api/src/main/java/org/dspace/content/authority/service/ChoiceAuthorityService.java index 83db9a734e..1cc5075d02 100644 --- a/dspace-api/src/main/java/org/dspace/content/authority/service/ChoiceAuthorityService.java +++ b/dspace-api/src/main/java/org/dspace/content/authority/service/ChoiceAuthorityService.java @@ -48,10 +48,10 @@ public interface ChoiceAuthorityService { * @param element element of metadata field * @param qualifier qualifier of metadata field * @return the name of the choice authority associated with the specified - * metadata. Throw IllegalArgumentException if the supplied metadat + * metadata. Throw IllegalArgumentException if the supplied metadata * is not associated with an authority choice */ - public String getChoiceAuthorityName(String schema, String element, String qualifier); + public String getChoiceAuthorityName(String schema, String element, String qualifier, Collection collection); /** * Wrapper that calls getMatches method of the plugin corresponding to @@ -112,30 +112,33 @@ public interface ChoiceAuthorityService { * the metadata field defined by schema,element,qualifier. * * @param metadataValue metadata value + * @param collection Collection owner of Item * @param locale explicit localization key if available * @return label */ - public String getLabel(MetadataValue metadataValue, String locale); + public String getLabel(MetadataValue metadataValue, Collection collection, String locale); /** * Wrapper that calls getLabel method of the plugin corresponding to * the metadata field defined by single field key. 
* * @param fieldKey single string identifying metadata field + * @param collection Collection owner of Item * @param locale explicit localization key if available * @param authKey authority key * @return label */ - public String getLabel(String fieldKey, String authKey, String locale); + public String getLabel(String fieldKey, Collection collection, String authKey, String locale); /** * Predicate, is there a Choices configuration of any kind for the * given metadata field? * * @param fieldKey single string identifying metadata field + * @param collection Collection owner of Item * @return true if choices are configured for this field. */ - public boolean isChoicesConfigured(String fieldKey); + public boolean isChoicesConfigured(String fieldKey, Collection collection); /** * Get the presentation keyword (should be "lookup", "select" or "suggest", but this @@ -160,12 +163,14 @@ * @param metadataValue metadata value * @return List of variants */ - public List getVariants(MetadataValue metadataValue); - - public String getChoiceMetadatabyAuthorityName(String name); - - public Choice getChoice(String fieldKey, String authKey, String locale); + public List getVariants(MetadataValue metadataValue, Collection collection); + /** + * Return the ChoiceAuthority instance identified by the specified name + * + * @param authorityName the ChoiceAuthority instance name + * @return the ChoiceAuthority identified by the specified name + */ public ChoiceAuthority getChoiceAuthorityByAuthorityName(String authorityName); /** @@ -173,4 +178,49 @@ */ public void clearCache(); + /** + * Should we store the authority key (if any) for such field key and collection? + * + * @param fieldKey single string identifying metadata field + * @param collection Collection owner of Item or where the item is submitted to + * @return true if the configuration allows storing the authority value + */ + public boolean storeAuthority(String fieldKey, Collection collection); + + /** + * Wrapper that calls getChoicesByParent method of the plugin. + * + * @param authorityName authority name + * @param parentId parent Id + * @param start choice at which to start, 0 is first. + * @param limit maximum number of choices to return, 0 for no limit. + * @param locale explicit localization key if available, or null + * @return a Choices object (never null). + * @see org.dspace.content.authority.ChoiceAuthority#getChoicesByParent(java.lang.String, java.lang.String, + * int, int, java.lang.String) + */ + public Choices getChoicesByParent(String authorityName, String parentId, int start, int limit, String locale); + + /** + * Wrapper that calls getTopChoices method of the plugin. + * + * @param authorityName authority name + * @param start choice at which to start, 0 is first. + * @param limit maximum number of choices to return, 0 for no limit. + * @param locale explicit localization key if available, or null + * @return a Choices object (never null). + * @see org.dspace.content.authority.ChoiceAuthority#getTopChoices(java.lang.String, int, int, java.lang.String) + */ + public Choices getTopChoices(String authorityName, int start, int limit, String locale); + + /** + * Return the direct parent of an entry identified by its id in a hierarchical + * authority.
+ * + * @param authorityName authority name + * @param vocabularyId child id + * @param locale explicit localization key if available, or null + * @return the parent Choice object if any + */ + public Choice getParentChoice(String authorityName, String vocabularyId, String locale); } diff --git a/dspace-api/src/main/java/org/dspace/content/dao/ItemDAO.java b/dspace-api/src/main/java/org/dspace/content/dao/ItemDAO.java index 979f42836a..4c391d973b 100644 --- a/dspace-api/src/main/java/org/dspace/content/dao/ItemDAO.java +++ b/dspace-api/src/main/java/org/dspace/content/dao/ItemDAO.java @@ -47,6 +47,19 @@ public interface ItemDAO extends DSpaceObjectLegacySupportDAO { public Iterator findBySubmitter(Context context, EPerson eperson) throws SQLException; + /** + * Find all the items by a given submitter. The order is + * indeterminate. All items are included. + * + * @param context DSpace context object + * @param eperson the submitter + * @param retrieveAllItems flag to determine whether all items should be returned or only archived items + * @return an iterator over the items submitted by eperson + * @throws SQLException if database error + */ + public Iterator findBySubmitter(Context context, EPerson eperson, boolean retrieveAllItems) + throws SQLException; + public Iterator findBySubmitter(Context context, EPerson eperson, MetadataField metadataField, int limit) throws SQLException; diff --git a/dspace-api/src/main/java/org/dspace/content/dao/ProcessDAO.java b/dspace-api/src/main/java/org/dspace/content/dao/ProcessDAO.java index f20225a202..4ef26cffcb 100644 --- a/dspace-api/src/main/java/org/dspace/content/dao/ProcessDAO.java +++ b/dspace-api/src/main/java/org/dspace/content/dao/ProcessDAO.java @@ -13,6 +13,7 @@ import java.util.List; import org.dspace.core.Context; import org.dspace.core.GenericDAO; import org.dspace.scripts.Process; +import org.dspace.scripts.ProcessQueryParameterContainer; /** * This is the Data Access Object for the {@link Process} object @@ -54,4 +55,30 @@ public interface ProcessDAO extends GenericDAO { */ int countRows(Context context) throws SQLException; + /** + * Returns a list of all Processes in the database which match the given field requirements. If the + * requirements are not null, they will be combined with an AND operation. + * @param context The relevant DSpace context + * @param processQueryParameterContainer The {@link ProcessQueryParameterContainer} containing all the values + * that the returned {@link Process} objects must adhere to + * @param limit The limit for the amount of Processes returned + * @param offset The offset for the Processes to be returned + * @return The list of all Processes which match the metadata requirements + * @throws SQLException If something goes wrong + */ + List search(Context context, ProcessQueryParameterContainer processQueryParameterContainer, int limit, + int offset) throws SQLException; + + /** + * Count all the processes which match the requirements. The requirements are evaluated like the search + * method.
+ * @param context The relevant DSpace context + * @param processQueryParameterContainer The {@link ProcessQueryParameterContainer} containing all the values + * that the returned {@link Process} objects must adhere to + * @return The number of results matching the query + * @throws SQLException If something goes wrong + */ + + int countTotalWithParameters(Context context, ProcessQueryParameterContainer processQueryParameterContainer) + throws SQLException; } diff --git a/dspace-api/src/main/java/org/dspace/content/dao/impl/ItemDAOImpl.java b/dspace-api/src/main/java/org/dspace/content/dao/impl/ItemDAOImpl.java index b935812c8c..683a6502c5 100644 --- a/dspace-api/src/main/java/org/dspace/content/dao/impl/ItemDAOImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/dao/impl/ItemDAOImpl.java @@ -108,6 +108,17 @@ public class ItemDAOImpl extends AbstractHibernateDSODAO implements ItemDA return iterate(query); } + @Override + public Iterator findBySubmitter(Context context, EPerson eperson, boolean retrieveAllItems) + throws SQLException { + if (!retrieveAllItems) { + return findBySubmitter(context, eperson); + } + Query query = createQuery(context, "FROM Item WHERE submitter= :submitter"); + query.setParameter("submitter", eperson); + return iterate(query); + } + @Override public Iterator findBySubmitter(Context context, EPerson eperson, MetadataField metadataField, int limit) throws SQLException { diff --git a/dspace-api/src/main/java/org/dspace/content/dao/impl/ProcessDAOImpl.java b/dspace-api/src/main/java/org/dspace/content/dao/impl/ProcessDAOImpl.java index 4c10387d93..5c8083a86b 100644 --- a/dspace-api/src/main/java/org/dspace/content/dao/impl/ProcessDAOImpl.java +++ b/dspace-api/src/main/java/org/dspace/content/dao/impl/ProcessDAOImpl.java @@ -8,15 +8,20 @@ package org.dspace.content.dao.impl; import java.sql.SQLException; +import java.util.LinkedList; import java.util.List; +import java.util.Map; import javax.persistence.criteria.CriteriaBuilder; import javax.persistence.criteria.CriteriaQuery; +import javax.persistence.criteria.Predicate; import javax.persistence.criteria.Root; +import org.apache.commons.lang3.StringUtils; import org.dspace.content.dao.ProcessDAO; import org.dspace.core.AbstractHibernateDAO; import org.dspace.core.Context; import org.dspace.scripts.Process; +import org.dspace.scripts.ProcessQueryParameterContainer; import org.dspace.scripts.Process_; /** @@ -56,6 +61,7 @@ public class ProcessDAOImpl extends AbstractHibernateDAO implements Pro CriteriaQuery criteriaQuery = getCriteriaQuery(criteriaBuilder, Process.class); Root processRoot = criteriaQuery.from(Process.class); criteriaQuery.select(processRoot); + criteriaQuery.orderBy(criteriaBuilder.desc(processRoot.get(Process_.processId))); return list(context, criteriaQuery, false, Process.class, limit, offset); } @@ -71,6 +77,76 @@ public class ProcessDAOImpl extends AbstractHibernateDAO implements Pro return count(context, criteriaQuery, criteriaBuilder, processRoot); } + + @Override + public List search(Context context, ProcessQueryParameterContainer processQueryParameterContainer, + int limit, int offset) throws SQLException { + CriteriaBuilder criteriaBuilder = getCriteriaBuilder(context); + CriteriaQuery criteriaQuery = getCriteriaQuery(criteriaBuilder, Process.class); + Root processRoot = criteriaQuery.from(Process.class); + criteriaQuery.select(processRoot); + + handleProcessQueryParameters(processQueryParameterContainer, criteriaBuilder, criteriaQuery, processRoot); + return list(context, 
criteriaQuery, false, Process.class, limit, offset); + + } + + /** + * This method will ensure that the params contained in the {@link ProcessQueryParameterContainer} are transferred + * to the ProcessRoot and that the correct conditions apply to the query + * @param processQueryParameterContainer The object containing the conditions that need to be met + * @param criteriaBuilder The criteriaBuilder to be used + * @param criteriaQuery The criteriaQuery to be used + * @param processRoot The processRoot to be used + */ + private void handleProcessQueryParameters(ProcessQueryParameterContainer processQueryParameterContainer, + CriteriaBuilder criteriaBuilder, CriteriaQuery criteriaQuery, + Root processRoot) { + addProcessQueryParameters(processQueryParameterContainer, criteriaBuilder, criteriaQuery, processRoot); + if (StringUtils.equalsIgnoreCase(processQueryParameterContainer.getSortOrder(), "asc")) { + criteriaQuery + .orderBy(criteriaBuilder.asc(processRoot.get(processQueryParameterContainer.getSortProperty()))); + } else if (StringUtils.equalsIgnoreCase(processQueryParameterContainer.getSortOrder(), "desc")) { + criteriaQuery + .orderBy(criteriaBuilder.desc(processRoot.get(processQueryParameterContainer.getSortProperty()))); + } + } + + /** + * This method will apply the variables in the {@link ProcessQueryParameterContainer} as criteria for the + * {@link Process} objects to the given CriteriaQuery. + * They'll need to adhere to these variables in order to be eligible for return + * @param processQueryParameterContainer The object containing the variables for the {@link Process} + * to adhere to + * @param criteriaBuilder The current CriteriaBuilder + * @param criteriaQuery The current CriteriaQuery + * @param processRoot The processRoot + */ + private void addProcessQueryParameters(ProcessQueryParameterContainer processQueryParameterContainer, + CriteriaBuilder criteriaBuilder, CriteriaQuery criteriaQuery, + Root processRoot) { + List andPredicates = new LinkedList<>(); + + for (Map.Entry entry : processQueryParameterContainer.getQueryParameterMap().entrySet()) { + andPredicates.add(criteriaBuilder.equal(processRoot.get(entry.getKey()), entry.getValue())); + } + criteriaQuery.where(criteriaBuilder.and(andPredicates.toArray(new Predicate[]{}))); + } + + @Override + public int countTotalWithParameters(Context context, ProcessQueryParameterContainer processQueryParameterContainer) + throws SQLException { + + CriteriaBuilder criteriaBuilder = getCriteriaBuilder(context); + CriteriaQuery criteriaQuery = getCriteriaQuery(criteriaBuilder, Process.class); + Root processRoot = criteriaQuery.from(Process.class); + criteriaQuery.select(processRoot); + + addProcessQueryParameters(processQueryParameterContainer, criteriaBuilder, criteriaQuery, processRoot); + return count(context, criteriaQuery, criteriaBuilder, processRoot); + } + + } diff --git a/dspace-api/src/main/java/org/dspace/content/packager/METSManifest.java b/dspace-api/src/main/java/org/dspace/content/packager/METSManifest.java index 53a8678df2..ed15037c11 100644 --- a/dspace-api/src/main/java/org/dspace/content/packager/METSManifest.java +++ b/dspace-api/src/main/java/org/dspace/content/packager/METSManifest.java @@ -272,12 +272,16 @@ public class METSManifest { // Set validation feature if (validate) { builder.setFeature("http://apache.org/xml/features/validation/schema", true); - } - // Tell the parser where local copies of schemas are, to speed up - // validation. Local XSDs are identified in the configuration file. 
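// Rough usage sketch for the new search/count pair (not part of the patch). Only the
// container's getters are visible in this diff, so the mutators used below
// (addToQueryParameterMap, setSortProperty, setSortOrder) are assumptions, as are the
// "processStatus" filter key and the injected processDAO instance; filter keys must
// name a field of the Process entity.
ProcessQueryParameterContainer params = new ProcessQueryParameterContainer();
params.addToQueryParameterMap("processStatus", ProcessStatus.COMPLETED);
params.setSortProperty("processId");
params.setSortOrder("desc");
List<Process> firstPage = processDAO.search(context, params, 20, 0);
int total = processDAO.countTotalWithParameters(context, params);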
- if (localSchemas.length() > 0) { - builder.setProperty("http://apache.org/xml/properties/schema/external-schemaLocation", localSchemas); + // Tell the parser where local copies of schemas are, to speed up + // validation & avoid XXE attacks from remote schemas. Local XSDs are identified in the configuration file. + if (localSchemas.length() > 0) { + builder.setProperty("http://apache.org/xml/properties/schema/external-schemaLocation", localSchemas); + } + } else { + // disallow DTD parsing to ensure no XXE attacks can occur. + // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + builder.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); } // Parse the METS file diff --git a/dspace-api/src/main/java/org/dspace/content/service/DSpaceObjectService.java b/dspace-api/src/main/java/org/dspace/content/service/DSpaceObjectService.java index 203d2a1787..ff44713b38 100644 --- a/dspace-api/src/main/java/org/dspace/content/service/DSpaceObjectService.java +++ b/dspace-api/src/main/java/org/dspace/content/service/DSpaceObjectService.java @@ -200,10 +200,11 @@ public interface DSpaceObjectService { * and the ISO3166 country code. null means the * value has no language (for example, a date). * @param values the values to add. + * @return the list of MetadataValues added to the object * @throws SQLException if database error */ - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - List values) throws SQLException; + public List addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, List values) throws SQLException; /** * Add metadata fields. These are appended to existing values. @@ -223,10 +224,11 @@ public interface DSpaceObjectService { * @param values the values to add. * @param authorities the external authority key for this value (or null) * @param confidences the authority confidence (default 0) + * @return the list of MetadataValues added to the object * @throws SQLException if database error */ - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - List values, List authorities, List confidences) + public List addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, List values, List authorities, List confidences) throws SQLException; /** @@ -243,32 +245,64 @@ public interface DSpaceObjectService { * @param values the values to add. 
* @param authorities the external authority key for this value (or null) * @param confidences the authority confidence (default 0) + * @return the list of MetadataValues added to the object * @throws SQLException if database error */ - public void addMetadata(Context context, T dso, MetadataField metadataField, String lang, List values, - List authorities, List confidences) throws SQLException; + public List addMetadata(Context context, T dso, MetadataField metadataField, String lang, + List values, List authorities, List confidences) throws SQLException; /** * Shortcut for {@link #addMetadata(Context, DSpaceObject, MetadataField, String, List, List, List)} when a single * value need to be added - * - * @param context - * @param dso - * @param metadataField - * @param language - * @param value - * @param authority - * @param confidence + * + * @param context DSpace context + * @param dso DSpaceObject + * @param metadataField the metadata field to which the value is to be set + * @param language the ISO639 language code, optionally followed by an underscore + * and the ISO3166 country code. null means the + * value has no language (for example, a date). + * @param value the value to add. + * @param authority the external authority key for this value (or null) + * @param confidence the authority confidence (default 0) + * @return the MetadataValue added ot the object * @throws SQLException */ - public void addMetadata(Context context, T dso, MetadataField metadataField, String language, String value, - String authority, int confidence) throws SQLException; + public MetadataValue addMetadata(Context context, T dso, MetadataField metadataField, String language, + String value, String authority, int confidence) throws SQLException; - public void addMetadata(Context context, T dso, MetadataField metadataField, String language, String value) + /** + * Add a metadatafield. These are appended to existing values. + * Use clearMetadata to remove values. + * + * @param context DSpace context + * @param dso DSpaceObject + * @param metadataField the metadata field to which the value is to be set + * @param language the ISO639 language code, optionally followed by an underscore + * and the ISO3166 country code. null means the + * value has no language (for example, a date). + * @param value the value to add. + * @return the MetadataValue added ot the object + * @throws SQLException if database error + */ + public MetadataValue addMetadata(Context context, T dso, MetadataField metadataField, String language, String value) throws SQLException; - public void addMetadata(Context context, T dso, MetadataField metadataField, String language, List values) - throws SQLException; + /** + * Add a metadatafields. These are appended to existing values. + * Use clearMetadata to remove values. + * + * @param context DSpace context + * @param dso DSpaceObject + * @param metadataField the metadata field to which the value is to be set + * @param language the ISO639 language code, optionally followed by an underscore + * and the ISO3166 country code. null means the + * value has no language (for example, a date). + * @param values the values to add. + * @return the list of MetadataValues added to the object + * @throws SQLException if database error + */ + public List addMetadata(Context context, T dso, MetadataField metadataField, String language, + List values) throws SQLException; /** * Add a single metadata field. 
This is appended to existing @@ -285,10 +319,11 @@ public interface DSpaceObjectService { * and the ISO3166 country code. null means the * value has no language (for example, a date). * @param value the value to add. + * @return the MetadataValue added ot the object * @throws SQLException if database error */ - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - String value) throws SQLException; + public MetadataValue addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, String value) throws SQLException; /** * Add a single metadata field. This is appended to existing @@ -307,10 +342,11 @@ public interface DSpaceObjectService { * @param value the value to add. * @param authority the external authority key for this value (or null) * @param confidence the authority confidence (default 0) + * @return the MetadataValue added ot the object * @throws SQLException if database error */ - public void addMetadata(Context context, T dso, String schema, String element, String qualifier, String lang, - String value, String authority, int confidence) throws SQLException; + public MetadataValue addMetadata(Context context, T dso, String schema, String element, String qualifier, + String lang, String value, String authority, int confidence) throws SQLException; /** * Clear metadata values. As with getDC above, diff --git a/dspace-api/src/main/java/org/dspace/content/service/ItemService.java b/dspace-api/src/main/java/org/dspace/content/service/ItemService.java index 71b736f1bd..ff30ffe0e0 100644 --- a/dspace-api/src/main/java/org/dspace/content/service/ItemService.java +++ b/dspace-api/src/main/java/org/dspace/content/service/ItemService.java @@ -113,6 +113,21 @@ public interface ItemService public Iterator findBySubmitter(Context context, EPerson eperson) throws SQLException; + /** + * Find all the items by a given submitter. The order is + * indeterminate. All items are included. + * + * @param context DSpace context object + * @param eperson the submitter + * @param retrieveAllItems flag to determine if all items should be returned or only archived items. + * If true, all items (regardless of status) are returned. + * If false, only archived items will be returned. + * @return an iterator over the items submitted by eperson + * @throws SQLException if database error + */ + public Iterator findBySubmitter(Context context, EPerson eperson, boolean retrieveAllItems) + throws SQLException; + /** * Retrieve the list of items submitted by eperson, ordered by recently submitted, optionally limitable * diff --git a/dspace-api/src/main/java/org/dspace/core/Context.java b/dspace-api/src/main/java/org/dspace/core/Context.java index ecfc29d29d..e878367ec4 100644 --- a/dspace-api/src/main/java/org/dspace/core/Context.java +++ b/dspace-api/src/main/java/org/dspace/core/Context.java @@ -179,7 +179,7 @@ public class Context implements AutoCloseable { } currentUser = null; - currentLocale = I18nUtil.DEFAULTLOCALE; + currentLocale = I18nUtil.getDefaultLocale(); extraLogInfo = ""; ignoreAuth = false; @@ -190,7 +190,15 @@ public class Context implements AutoCloseable { setMode(this.mode); } - public static boolean updateDatabase() { + /** + * Update the DSpace database, ensuring that any necessary migrations are run prior to initializing + * Hibernate. + *
<p>
+ * This is synchronized as it only needs to be run successfully *once* (for the first Context initialized). + * + * @return true/false, based on whether database was successfully updated + */ + public static synchronized boolean updateDatabase() { //If the database has not been updated yet, update it and remember that. if (databaseUpdated.compareAndSet(false, true)) { @@ -200,7 +208,7 @@ public class Context implements AutoCloseable { try { DatabaseUtils.updateDatabase(); } catch (SQLException sqle) { - log.fatal("Cannot initialize database via Flyway!", sqle); + log.fatal("Cannot update or initialize database via Flyway!", sqle); databaseUpdated.set(false); } } @@ -641,9 +649,9 @@ public class Context implements AutoCloseable { /** * Temporary change the user bound to the context, empty the special groups that * are retained to allow subsequent restore - * + * * @param newUser the EPerson to bound to the context - * + * * @throws IllegalStateException if the switch was already performed without be * restored */ @@ -661,7 +669,7 @@ public class Context implements AutoCloseable { /** * Restore the user bound to the context and his special groups - * + * * @throws IllegalStateException if no switch was performed before */ public void restoreContextUser() { @@ -876,4 +884,5 @@ public class Context implements AutoCloseable { private void reloadContextBoundEntities() throws SQLException { currentUser = reloadEntity(currentUser); } + } diff --git a/dspace-api/src/main/java/org/dspace/core/I18nUtil.java b/dspace-api/src/main/java/org/dspace/core/I18nUtil.java index 37e48c4a4f..cd0609e29f 100644 --- a/dspace-api/src/main/java/org/dspace/core/I18nUtil.java +++ b/dspace-api/src/main/java/org/dspace/core/I18nUtil.java @@ -37,9 +37,6 @@ import org.dspace.services.factory.DSpaceServicesFactory; public class I18nUtil { private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(I18nUtil.class); - // the default Locale of this DSpace Instance - public static final Locale DEFAULTLOCALE = getDefaultLocale(); - // delimiters between elements of UNIX/POSIX locale spec, e.g. 
en_US.UTF-8 private static final String LOCALE_DELIMITERS = " _."; @@ -127,7 +124,7 @@ public class I18nUtil { return parseLocales(locales); } else { Locale[] availableLocales = new Locale[1]; - availableLocales[0] = DEFAULTLOCALE; + availableLocales[0] = getDefaultLocale(); return availableLocales; } } @@ -148,7 +145,7 @@ public class I18nUtil { Locale supportedLocale = null; String testLocale = ""; if (availableLocales == null) { - supportedLocale = DEFAULTLOCALE; + supportedLocale = getDefaultLocale(); } else { if (!locale.getVariant().equals("")) { testLocale = locale.toString(); @@ -188,12 +185,29 @@ public class I18nUtil { } } if (!isSupported) { - supportedLocale = DEFAULTLOCALE; + supportedLocale = getDefaultLocale(); } } return supportedLocale; } + /** + * Gets the appropriate supported Locale according for a given Locale If + * no appropriate supported locale is found, the DEFAULTLOCALE is used + * + * @param locale String to find the corresponding Locale + * @return supportedLocale + * Locale for session according to locales supported by this DSpace instance as set in dspace.cfg + */ + public static Locale getSupportedLocale(String locale) { + Locale currentLocale = null; + if (locale != null) { + currentLocale = I18nUtil.getSupportedLocale(new Locale(locale)); + } else { + currentLocale = I18nUtil.getDefaultLocale(); + } + return currentLocale; + } /** * Get the appropriate localized version of submission-forms.xml according to language settings @@ -220,7 +234,7 @@ public class I18nUtil { * String of the message */ public static String getMessage(String key) { - return getMessage(key.trim(), DEFAULTLOCALE); + return getMessage(key.trim(), getDefaultLocale()); } /** @@ -233,7 +247,7 @@ public class I18nUtil { */ public static String getMessage(String key, Locale locale) { if (locale == null) { - locale = DEFAULTLOCALE; + locale = getDefaultLocale(); } ResourceBundle.Control control = ResourceBundle.Control.getNoFallbackControl( @@ -384,4 +398,23 @@ public class I18nUtil { } return resultList.toArray(new Locale[resultList.size()]); } + + /** + * Check if the input locale is in the list of supported locales + * @param locale + * @return true if locale is supported, false otherwise + */ + public static boolean isSupportedLocale(Locale locale) { + boolean isSupported = false; + Locale[] supportedLocales = getSupportedLocales(); + if (supportedLocales != null) { + for (Locale sLocale: supportedLocales) { + if (locale.getLanguage().equals(sLocale.getLanguage()) ) { + isSupported = true; + break; + } + } + } + return isSupported; + } } diff --git a/dspace-api/src/main/java/org/dspace/core/LegacyPluginServiceImpl.java b/dspace-api/src/main/java/org/dspace/core/LegacyPluginServiceImpl.java index f8291dc977..ea8cdc1403 100644 --- a/dspace-api/src/main/java/org/dspace/core/LegacyPluginServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/core/LegacyPluginServiceImpl.java @@ -345,8 +345,8 @@ public class LegacyPluginServiceImpl implements PluginService { " for interface=" + iname + " pluginName=" + name); Object result = pluginClass.newInstance(); - if (result instanceof SelfNamedPlugin) { - ((SelfNamedPlugin) result).setPluginInstanceName(name); + if (result instanceof NameAwarePlugin) { + ((NameAwarePlugin) result).setPluginInstanceName(name); } return result; } diff --git a/dspace-api/src/main/java/org/dspace/core/NameAwarePlugin.java b/dspace-api/src/main/java/org/dspace/core/NameAwarePlugin.java new file mode 100644 index 0000000000..6c562ea04c --- /dev/null +++ 
b/dspace-api/src/main/java/org/dspace/core/NameAwarePlugin.java @@ -0,0 +1,42 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.core; + +/** + * This is the interface that should be implemented by all the named plugin that + * like to be aware of their name + * + * @author Andrea Bollini (andrea.bollini at 4science.it) + * @version $Revision$ + * @see org.dspace.core.service.PluginService + */ +public interface NameAwarePlugin { + + /** + * Get the instance's particular name. + * Returns the name by which the class was chosen when + * this instance was created. Only works for instances created + * by PluginService, or if someone remembers to call setPluginName. + *
<p>
+ * Useful when the implementation class wants to be configured differently + * when it is invoked under different names. + * + * @return name or null if not available. + */ + public String getPluginInstanceName(); + + /** + * Set the name under which this plugin was instantiated. + * Not to be invoked by application code, it is + * called automatically by PluginService.getNamedPlugin() + * when the plugin is instantiated. + * + * @param name -- name used to select this class. + */ + public void setPluginInstanceName(String name); +} diff --git a/dspace-api/src/main/java/org/dspace/core/SelfNamedPlugin.java b/dspace-api/src/main/java/org/dspace/core/SelfNamedPlugin.java index 2bdcf830e7..680fa15c80 100644 --- a/dspace-api/src/main/java/org/dspace/core/SelfNamedPlugin.java +++ b/dspace-api/src/main/java/org/dspace/core/SelfNamedPlugin.java @@ -28,7 +28,7 @@ package org.dspace.core; * @version $Revision$ * @see org.dspace.core.service.PluginService */ -public abstract class SelfNamedPlugin { +public abstract class SelfNamedPlugin implements NameAwarePlugin { // the specific alias used to find the class that created this instance. private String myName = null; @@ -52,30 +52,13 @@ public abstract class SelfNamedPlugin { return null; } - /** - * Get an instance's particular name. - * Returns the name by which the class was chosen when - * this instance was created. Only works for instances created - * by PluginService, or if someone remembers to call setPluginName. - *
<p>
- * Useful when the implementation class wants to be configured differently - * when it is invoked under different names. - * - * @return name or null if not available. - */ + @Override public String getPluginInstanceName() { return myName; } - /** - * Set the name under which this plugin was instantiated. - * Not to be invoked by application code, it is - * called automatically by PluginService.getNamedPlugin() - * when the plugin is instantiated. - * - * @param name -- name used to select this class. - */ - protected void setPluginInstanceName(String name) { + @Override + public void setPluginInstanceName(String name) { myName = name; } } diff --git a/dspace-api/src/main/java/org/dspace/ctask/general/MetadataWebService.java b/dspace-api/src/main/java/org/dspace/ctask/general/MetadataWebService.java index 2b6c52d0d6..754f3b4ab3 100644 --- a/dspace-api/src/main/java/org/dspace/ctask/general/MetadataWebService.java +++ b/dspace-api/src/main/java/org/dspace/ctask/general/MetadataWebService.java @@ -199,6 +199,9 @@ public class MetadataWebService extends AbstractCurationTask implements Namespac DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance(); factory.setNamespaceAware(true); try { + // disallow DTD parsing to ensure no XXE attacks can occur. + // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); docBuilder = factory.newDocumentBuilder(); } catch (ParserConfigurationException pcE) { log.error("caught exception: " + pcE); diff --git a/dspace-api/src/main/java/org/dspace/curate/Curation.java b/dspace-api/src/main/java/org/dspace/curate/Curation.java new file mode 100644 index 0000000000..44cbb24ed9 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/curate/Curation.java @@ -0,0 +1,371 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.curate; + +import java.io.BufferedReader; +import java.io.File; +import java.io.FileNotFoundException; +import java.io.FileReader; +import java.io.IOException; +import java.io.OutputStream; +import java.io.OutputStreamWriter; +import java.io.PrintStream; +import java.io.Writer; +import java.sql.SQLException; +import java.util.HashMap; +import java.util.Iterator; +import java.util.Map; +import java.util.UUID; + +import org.apache.commons.cli.ParseException; +import org.apache.commons.io.output.NullOutputStream; +import org.dspace.authorize.AuthorizeException; +import org.dspace.content.DSpaceObject; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.core.Context; +import org.dspace.core.factory.CoreServiceFactory; +import org.dspace.curate.factory.CurateServiceFactory; +import org.dspace.eperson.EPerson; +import org.dspace.eperson.factory.EPersonServiceFactory; +import org.dspace.eperson.service.EPersonService; +import org.dspace.handle.factory.HandleServiceFactory; +import org.dspace.handle.service.HandleService; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.utils.DSpace; + +/** + * CurationCli provides command-line access to Curation tools and processes. 
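// Side note (not part of the patch): the hardening pattern applied above in
// METSManifest and MetadataWebService, shown in isolation. Disallowing DOCTYPE
// declarations is the OWASP-recommended defence against XXE for parsers that do
// not need DTD support (javax.xml.parsers imports assumed).
try {
    DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
    factory.setNamespaceAware(true);
    // reject any document carrying a DOCTYPE so external entities can never be resolved
    factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
    DocumentBuilder builder = factory.newDocumentBuilder(); // now safe for untrusted XML
} catch (ParserConfigurationException e) {
    // a parser that cannot be secured should not be used at all
    throw new IllegalStateException("Unable to configure a secure XML parser", e);
}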
+ * + * @author richardrodgers + */ +public class Curation extends DSpaceRunnable { + + protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); + + protected Context context; + private CurationClientOptions curationClientOptions; + + private String task; + private String taskFile; + private String id; + private String queue; + private String scope; + private String reporter; + private Map parameters; + private boolean verbose; + + @Override + public void internalRun() throws Exception { + if (curationClientOptions == CurationClientOptions.HELP) { + printHelp(); + return; + } + + Curator curator = initCurator(); + + // load curation tasks + if (curationClientOptions == CurationClientOptions.TASK) { + long start = System.currentTimeMillis(); + handleCurationTask(curator); + this.endScript(start); + } + + // process task queue + if (curationClientOptions == CurationClientOptions.QUEUE) { + // process the task queue + TaskQueue taskQueue = (TaskQueue) CoreServiceFactory.getInstance().getPluginService() + .getSinglePlugin(TaskQueue.class); + if (taskQueue == null) { + super.handler.logError("No implementation configured for queue"); + throw new UnsupportedOperationException("No queue service available"); + } + long timeRun = this.runQueue(taskQueue, curator); + this.endScript(timeRun); + } + } + + /** + * Does the curation task (-t) or the task in the given file (-T). + * Checks: + * - if required option -i is missing. + * - if option -t has a valid task option + */ + private void handleCurationTask(Curator curator) throws IOException, SQLException { + String taskName; + if (commandLine.hasOption('t')) { + if (verbose) { + handler.logInfo("Adding task: " + this.task); + } + curator.addTask(this.task); + if (verbose && !curator.hasTask(this.task)) { + handler.logInfo("Task: " + this.task + " not resolved"); + } + } else if (commandLine.hasOption('T')) { + // load taskFile + BufferedReader reader = null; + try { + reader = new BufferedReader(new FileReader(this.taskFile)); + while ((taskName = reader.readLine()) != null) { + if (verbose) { + super.handler.logInfo("Adding task: " + taskName); + } + curator.addTask(taskName); + } + } finally { + if (reader != null) { + reader.close(); + } + } + } + // run tasks against object + if (verbose) { + super.handler.logInfo("Starting curation"); + super.handler.logInfo("Curating id: " + this.id); + } + if ("all".equals(this.id)) { + // run on whole Site + curator.curate(context, + ContentServiceFactory.getInstance().getSiteService().findSite(context).getHandle()); + } else { + curator.curate(context, this.id); + } + } + + /** + * Runs task queue (-q set) + * + * @param queue The task queue + * @param curator The curator + * @return Time when queue started + */ + private long runQueue(TaskQueue queue, Curator curator) throws SQLException, AuthorizeException, IOException { + // use current time as our reader 'ticket' + long ticket = System.currentTimeMillis(); + Iterator entryIter = queue.dequeue(this.queue, ticket).iterator(); + while (entryIter.hasNext()) { + TaskQueueEntry entry = entryIter.next(); + if (verbose) { + super.handler.logInfo("Curating id: " + entry.getObjectId()); + } + curator.clear(); + // does entry relate to a DSO or workflow object? 
+ if (entry.getObjectId().indexOf('/') > 0) { + for (String taskName : entry.getTaskNames()) { + curator.addTask(taskName); + } + curator.curate(context, entry.getObjectId()); + } else { + // make eperson who queued task the effective user + EPerson agent = ePersonService.findByEmail(context, entry.getEpersonId()); + if (agent != null) { + context.setCurrentUser(agent); + } + CurateServiceFactory.getInstance().getWorkflowCuratorService() + .curate(curator, context, entry.getObjectId()); + } + } + queue.release(this.queue, ticket, true); + return ticket; + } + + /** + * End of curation script; logs script time if -v verbose is set + * + * @param timeRun Time script was started + * @throws SQLException If DSpace contextx can't complete + */ + private void endScript(long timeRun) throws SQLException { + context.complete(); + if (verbose) { + long elapsed = System.currentTimeMillis() - timeRun; + this.handler.logInfo("Ending curation. Elapsed time: " + elapsed); + } + } + + /** + * Initialize the curator with command line variables + * + * @return Initialised curator + * @throws FileNotFoundException If file of command line variable -r reporter is not found + */ + private Curator initCurator() throws FileNotFoundException { + Curator curator = new Curator(); + OutputStream reporterStream; + if (null == this.reporter) { + reporterStream = new NullOutputStream(); + } else if ("-".equals(this.reporter)) { + reporterStream = System.out; + } else { + reporterStream = new PrintStream(this.reporter); + } + Writer reportWriter = new OutputStreamWriter(reporterStream); + curator.setReporter(reportWriter); + + if (this.scope != null) { + Curator.TxScope txScope = Curator.TxScope.valueOf(this.scope.toUpperCase()); + curator.setTransactionScope(txScope); + } + + curator.addParameters(parameters); + // we are operating in batch mode, if anyone cares. + curator.setInvoked(Curator.Invoked.BATCH); + return curator; + } + + @Override + public void printHelp() { + super.printHelp(); + super.handler.logInfo("\nwhole repo: CurationCli -t estimate -i all"); + super.handler.logInfo("single item: CurationCli -t generate -i itemId"); + super.handler.logInfo("task queue: CurationCli -q monthly"); + } + + @Override + public CurationScriptConfiguration getScriptConfiguration() { + return new DSpace().getServiceManager().getServiceByName("curate", CurationScriptConfiguration.class); + } + + @Override + public void setup() throws ParseException { + assignCurrentUserInContext(); + this.curationClientOptions = CurationClientOptions.getClientOption(commandLine); + + if (this.curationClientOptions != null) { + this.initGeneralLineOptionsAndCheckIfValid(); + if (curationClientOptions == CurationClientOptions.TASK) { + this.initTaskLineOptionsAndCheckIfValid(); + } else if (curationClientOptions == CurationClientOptions.QUEUE) { + this.queue = this.commandLine.getOptionValue('q'); + } + } else { + throw new IllegalArgumentException("[--help || --task|--taskfile <> -identifier <> || -queue <> ] must be" + + " specified"); + } + } + + /** + * This method will assign the currentUser to the {@link Context} variable which is also created in this method. 
+ * The instance of the method in this class will fetch the EPersonIdentifier from this class, this identifier + * was given to this class upon instantiation, it'll then be used to find the {@link EPerson} associated with it + * and this {@link EPerson} will be set as the currentUser of the created {@link Context} + * @throws ParseException If something went wrong with the retrieval of the EPerson Identifier + */ + protected void assignCurrentUserInContext() throws ParseException { + UUID currentUserUuid = this.getEpersonIdentifier(); + try { + this.context = new Context(Context.Mode.BATCH_EDIT); + EPerson eperson = ePersonService.find(context, currentUserUuid); + if (eperson == null) { + super.handler.logError("EPerson not found: " + currentUserUuid); + throw new IllegalArgumentException("Unable to find a user with uuid: " + currentUserUuid); + } + this.context.setCurrentUser(eperson); + } catch (SQLException e) { + handler.handleException("Something went wrong trying to fetch eperson for uuid: " + currentUserUuid, e); + } + } + + /** + * Fills in some optional command line options. + * Checks if there are missing required options or invalid values for options. + */ + private void initGeneralLineOptionsAndCheckIfValid() { + // report file + if (this.commandLine.hasOption('r')) { + this.reporter = this.commandLine.getOptionValue('r'); + } + + // parameters + this.parameters = new HashMap<>(); + if (this.commandLine.hasOption('p')) { + for (String parameter : this.commandLine.getOptionValues('p')) { + String[] parts = parameter.split("=", 2); + String name = parts[0].trim(); + String value; + if (parts.length > 1) { + value = parts[1].trim(); + } else { + value = "true"; + } + this.parameters.put(name, value); + } + } + + // verbose + verbose = false; + if (commandLine.hasOption('v')) { + verbose = true; + } + + // scope + if (this.commandLine.getOptionValue('s') != null) { + this.scope = this.commandLine.getOptionValue('s'); + if (this.scope != null && Curator.TxScope.valueOf(this.scope.toUpperCase()) == null) { + this.handler.logError("Bad transaction scope '" + this.scope + "': only 'object', 'curation' or " + + "'open' recognized"); + throw new IllegalArgumentException( + "Bad transaction scope '" + this.scope + "': only 'object', 'curation' or " + + "'open' recognized"); + } + } + } + + /** + * Fills in required command line options for the task or taskFile option. + * Checks if there are is a missing required -i option and if -i is either 'all' or a valid dso handle. + * Checks if -t task has a valid task option. + * Checks if -T taskfile is a valid file. 
+ */ + private void initTaskLineOptionsAndCheckIfValid() { + // task or taskFile + if (this.commandLine.hasOption('t')) { + this.task = this.commandLine.getOptionValue('t'); + if (!CurationClientOptions.getTaskOptions().contains(this.task)) { + super.handler + .logError("-t task must be one of: " + CurationClientOptions.getTaskOptions()); + throw new IllegalArgumentException( + "-t task must be one of: " + CurationClientOptions.getTaskOptions()); + } + } else if (this.commandLine.hasOption('T')) { + this.taskFile = this.commandLine.getOptionValue('T'); + if (!(new File(this.taskFile).isFile())) { + super.handler + .logError("-T taskFile must be valid file: " + this.taskFile); + throw new IllegalArgumentException("-T taskFile must be valid file: " + this.taskFile); + } + } + + if (this.commandLine.hasOption('i')) { + this.id = this.commandLine.getOptionValue('i').toLowerCase(); + if (!this.id.equalsIgnoreCase("all")) { + HandleService handleService = HandleServiceFactory.getInstance().getHandleService(); + DSpaceObject dso; + try { + dso = handleService.resolveToObject(this.context, id); + } catch (SQLException e) { + super.handler.logError("SQLException trying to resolve handle " + id + " to a valid dso"); + throw new IllegalArgumentException( + "SQLException trying to resolve handle " + id + " to a valid dso"); + } + if (dso == null) { + super.handler.logError("Id must be specified: a valid dso handle or 'all'; " + this.id + " could " + + "not be resolved to valid dso handle"); + throw new IllegalArgumentException( + "Id must be specified: a valid dso handle or 'all'; " + this.id + " could " + + "not be resolved to valid dso handle"); + } + } + } else { + super.handler.logError("Id must be specified: a handle, 'all', or no -i and a -q task queue (-h for " + + "help)"); + throw new IllegalArgumentException( + "Id must be specified: a handle, 'all', or no -i and a -q task queue (-h for " + + "help)"); + } + } +} diff --git a/dspace-api/src/main/java/org/dspace/curate/CurationCli.java b/dspace-api/src/main/java/org/dspace/curate/CurationCli.java index 3832ddf3ec..f70aea5b1d 100644 --- a/dspace-api/src/main/java/org/dspace/curate/CurationCli.java +++ b/dspace-api/src/main/java/org/dspace/curate/CurationCli.java @@ -7,269 +7,42 @@ */ package org.dspace.curate; -import java.io.BufferedReader; -import java.io.FileReader; -import java.io.OutputStream; -import java.io.OutputStreamWriter; -import java.io.PrintStream; -import java.io.Writer; -import java.util.HashMap; -import java.util.Iterator; -import java.util.Map; +import java.sql.SQLException; -import org.apache.commons.cli.CommandLine; -import org.apache.commons.cli.CommandLineParser; -import org.apache.commons.cli.HelpFormatter; -import org.apache.commons.cli.Options; -import org.apache.commons.cli.PosixParser; -import org.apache.commons.io.output.NullOutputStream; -import org.dspace.content.factory.ContentServiceFactory; +import org.apache.commons.cli.ParseException; import org.dspace.core.Context; -import org.dspace.core.factory.CoreServiceFactory; -import org.dspace.curate.factory.CurateServiceFactory; import org.dspace.eperson.EPerson; -import org.dspace.eperson.factory.EPersonServiceFactory; -import org.dspace.eperson.service.EPersonService; /** - * CurationCli provides command-line access to Curation tools and processes. - * - * @author richardrodgers + * This is the CLI version of the {@link Curation} script. + * This will only be called when the curate script is called from a commandline instance. 
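// Example invocations (not part of the patch), assuming the standard DSpace launcher
// exposes this runnable under the "curate" script name used by CurationScriptConfiguration:
//
//   [dspace]/bin/dspace curate -t estimate -i all -e admin@example.com -r -
//   [dspace]/bin/dspace curate -q monthly -e admin@example.com
//
// The first runs the "estimate" task over the whole repository and reports to the
// console ("-r -"); the second drains the "monthly" task queue. The -e flag is the
// CLI-only option added by CurationCliScriptConfiguration further down in this diff.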
*/ -public class CurationCli { +public class CurationCli extends Curation { /** - * Default constructor + * This is the overridden instance of the {@link Curation#assignCurrentUserInContext()} method in the parent class + * {@link Curation}. + * This is done so that the CLI version of the Script is able to retrieve its currentUser from the -e flag given + * with the parameters of the Script. + * @throws ParseException If the e flag was not given to the parameters when calling the script */ - private CurationCli() { } - - public static void main(String[] args) throws Exception { - // create an options object and populate it - CommandLineParser parser = new PosixParser(); - - Options options = new Options(); - - options.addOption("t", "task", true, - "curation task name"); - options.addOption("T", "taskfile", true, - "file containing curation task names"); - options.addOption("i", "id", true, - "Id (handle) of object to perform task on, or 'all' to perform on whole repository"); - options.addOption("p", "parameter", true, - "a task parameter 'NAME=VALUE'"); - options.addOption("q", "queue", true, - "name of task queue to process"); - options.addOption("e", "eperson", true, - "email address of curating eperson"); - options.addOption("r", "reporter", true, - "relative or absolute path to the desired report file. " - + "Use '-' to report to console. " - + "If absent, no reporting"); - options.addOption("s", "scope", true, - "transaction scope to impose: use 'object', 'curation', or 'open'. If absent, 'open' " + - "applies"); - options.addOption("v", "verbose", false, - "report activity to stdout"); - options.addOption("h", "help", false, "help"); - - CommandLine line = parser.parse(options, args); - - String taskName = null; - String taskFileName = null; - String idName = null; - String taskQueueName = null; - String ePersonName = null; - String reporterName = null; - String scope = null; - boolean verbose = false; - final Map parameters = new HashMap<>(); - - if (line.hasOption('h')) { - HelpFormatter help = new HelpFormatter(); - help.printHelp("CurationCli\n", options); - System.out - .println("\nwhole repo: CurationCli -t estimate -i all"); - System.out - .println("single item: CurationCli -t generate -i itemId"); - System.out - .println("task queue: CurationCli -q monthly"); - System.exit(0); - } - - if (line.hasOption('t')) { // task - taskName = line.getOptionValue('t'); - } - - if (line.hasOption('T')) { // task file - taskFileName = line.getOptionValue('T'); - } - - if (line.hasOption('i')) { // id - idName = line.getOptionValue('i'); - } - - if (line.hasOption('q')) { // task queue - taskQueueName = line.getOptionValue('q'); - } - - if (line.hasOption('e')) { // eperson - ePersonName = line.getOptionValue('e'); - } - - if (line.hasOption('p')) { // parameter - for (String parameter : line.getOptionValues('p')) { - String[] parts = parameter.split("=", 2); - String name = parts[0].trim(); - String value; - if (parts.length > 1) { - value = parts[1].trim(); - } else { - value = "true"; - } - parameters.put(name, value); - } - } - if (line.hasOption('r')) { // report file - reporterName = line.getOptionValue('r'); - } - - - if (line.hasOption('s')) { // transaction scope - scope = line.getOptionValue('s'); - } - - if (line.hasOption('v')) { // verbose - verbose = true; - } - - // now validate the args - if (idName == null && taskQueueName == null) { - System.out.println("Id must be specified: a handle, 'all', or a task queue (-h for help)"); - System.exit(1); - } - - if (taskName == null 
&& taskFileName == null && taskQueueName == null) { - System.out.println("A curation task or queue must be specified (-h for help)"); - System.exit(1); - } - - if (scope != null && Curator.TxScope.valueOf(scope.toUpperCase()) == null) { - System.out.println("Bad transaction scope '" + scope + "': only 'object', 'curation' or 'open' recognized"); - System.exit(1); - } - EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); - - Context c = new Context(Context.Mode.BATCH_EDIT); - if (ePersonName != null) { - EPerson ePerson = ePersonService.findByEmail(c, ePersonName); - if (ePerson == null) { - System.out.println("EPerson not found: " + ePersonName); - System.exit(1); - } - c.setCurrentUser(ePerson); - } else { - c.turnOffAuthorisationSystem(); - } - - Curator curator = new Curator(); - OutputStream reporter; - if (null == reporterName) { - reporter = new NullOutputStream(); - } else if ("-".equals(reporterName)) { - reporter = System.out; - } else { - reporter = new PrintStream(reporterName); - } - Writer reportWriter = new OutputStreamWriter(reporter); - curator.setReporter(reportWriter); - - if (scope != null) { - Curator.TxScope txScope = Curator.TxScope.valueOf(scope.toUpperCase()); - curator.setTransactionScope(txScope); - } - curator.addParameters(parameters); - // we are operating in batch mode, if anyone cares. - curator.setInvoked(Curator.Invoked.BATCH); - // load curation tasks - if (taskName != null) { - if (verbose) { - System.out.println("Adding task: " + taskName); - } - curator.addTask(taskName); - if (verbose && !curator.hasTask(taskName)) { - System.out.println("Task: " + taskName + " not resolved"); - } - } else if (taskQueueName == null) { - // load taskFile - BufferedReader reader = null; + @Override + protected void assignCurrentUserInContext() throws ParseException { + if (this.commandLine.hasOption('e')) { + String ePersonEmail = this.commandLine.getOptionValue('e'); + this.context = new Context(Context.Mode.BATCH_EDIT); try { - reader = new BufferedReader(new FileReader(taskFileName)); - while ((taskName = reader.readLine()) != null) { - if (verbose) { - System.out.println("Adding task: " + taskName); - } - curator.addTask(taskName); + EPerson ePerson = ePersonService.findByEmail(this.context, ePersonEmail); + if (ePerson == null) { + super.handler.logError("EPerson not found: " + ePersonEmail); + throw new IllegalArgumentException("Unable to find a user with email: " + ePersonEmail); } - } finally { - if (reader != null) { - reader.close(); - } - } - } - // run tasks against object - long start = System.currentTimeMillis(); - if (verbose) { - System.out.println("Starting curation"); - } - if (idName != null) { - if (verbose) { - System.out.println("Curating id: " + idName); - } - if ("all".equals(idName)) { - // run on whole Site - curator.curate(c, ContentServiceFactory.getInstance().getSiteService().findSite(c).getHandle()); - } else { - curator.curate(c, idName); + this.context.setCurrentUser(ePerson); + } catch (SQLException e) { + throw new IllegalArgumentException("SQLException trying to find user with email: " + ePersonEmail); } } else { - // process the task queue - TaskQueue queue = (TaskQueue) CoreServiceFactory.getInstance().getPluginService() - .getSinglePlugin(TaskQueue.class); - if (queue == null) { - System.out.println("No implementation configured for queue"); - throw new UnsupportedOperationException("No queue service available"); - } - // use current time as our reader 'ticket' - long ticket = 
System.currentTimeMillis(); - Iterator entryIter = queue.dequeue(taskQueueName, ticket).iterator(); - while (entryIter.hasNext()) { - TaskQueueEntry entry = entryIter.next(); - if (verbose) { - System.out.println("Curating id: " + entry.getObjectId()); - } - curator.clear(); - // does entry relate to a DSO or workflow object? - if (entry.getObjectId().indexOf("/") > 0) { - for (String task : entry.getTaskNames()) { - curator.addTask(task); - } - curator.curate(c, entry.getObjectId()); - } else { - // make eperson who queued task the effective user - EPerson agent = ePersonService.findByEmail(c, entry.getEpersonId()); - if (agent != null) { - c.setCurrentUser(agent); - } - CurateServiceFactory.getInstance().getWorkflowCuratorService() - .curate(curator, c, entry.getObjectId()); - } - } - queue.release(taskQueueName, ticket, true); - } - c.complete(); - if (verbose) { - long elapsed = System.currentTimeMillis() - start; - System.out.println("Ending curation. Elapsed time: " + elapsed); + throw new ParseException("Required parameter -e missing!"); } } } diff --git a/dspace-api/src/main/java/org/dspace/curate/CurationCliScriptConfiguration.java b/dspace-api/src/main/java/org/dspace/curate/CurationCliScriptConfiguration.java new file mode 100644 index 0000000000..5e1d014873 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/curate/CurationCliScriptConfiguration.java @@ -0,0 +1,26 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.curate; + +import org.apache.commons.cli.Options; + +/** + * This is the CLI version of the {@link CurationScriptConfiguration} class that handles the configuration for the + * {@link CurationCli} script + */ +public class CurationCliScriptConfiguration extends CurationScriptConfiguration { + + @Override + public Options getOptions() { + options = super.getOptions(); + options.addOption("e", "eperson", true, "email address of curating eperson"); + options.getOption("e").setType(String.class); + options.getOption("e").setRequired(true); + return options; + } +} diff --git a/dspace-api/src/main/java/org/dspace/curate/CurationClientOptions.java b/dspace-api/src/main/java/org/dspace/curate/CurationClientOptions.java new file mode 100644 index 0000000000..8ec0f14697 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/curate/CurationClientOptions.java @@ -0,0 +1,89 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.curate; + +import java.util.ArrayList; +import java.util.List; + +import org.apache.commons.cli.CommandLine; +import org.apache.commons.cli.Options; +import org.apache.commons.lang3.StringUtils; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; + +/** + * This Enum holds all the possible options and combinations for the Curation script + * + * @author Maria Verdonck (Atmire) on 23/06/2020 + */ +public enum CurationClientOptions { + TASK, + QUEUE, + HELP; + + private static List taskOptions; + + /** + * This method resolves the CommandLine parameters to figure out which action the curation script should perform + * + * @param commandLine The relevant CommandLine for the curation script + * @return The curation 
option to be ran, parsed from the CommandLine + */ + protected static CurationClientOptions getClientOption(CommandLine commandLine) { + if (commandLine.hasOption("h")) { + return CurationClientOptions.HELP; + } else if (commandLine.hasOption("t") || commandLine.hasOption("T")) { + return CurationClientOptions.TASK; + } else if (commandLine.hasOption("q")) { + return CurationClientOptions.QUEUE; + } + return null; + } + + /** + * This method will create all the possible Options for the {@link Curation} script. + * This will be used by {@link CurationScriptConfiguration} + * @return The options for the {@link Curation} script + */ + protected static Options constructOptions() { + Options options = new Options(); + + options.addOption("t", "task", true, "curation task name; options: " + getTaskOptions()); + options.addOption("T", "taskfile", true, "file containing curation task names"); + options.addOption("i", "id", true, + "Id (handle) of object to perform task on, or 'all' to perform on whole repository"); + options.addOption("p", "parameter", true, "a task parameter 'NAME=VALUE'"); + options.addOption("q", "queue", true, "name of task queue to process"); + options.addOption("r", "reporter", true, + "relative or absolute path to the desired report file. Use '-' to report to console. If absent, no " + + "reporting"); + options.addOption("s", "scope", true, + "transaction scope to impose: use 'object', 'curation', or 'open'. If absent, 'open' applies"); + options.addOption("v", "verbose", false, "report activity to stdout"); + options.addOption("h", "help", false, "help"); + + return options; + } + + /** + * Creates list of the taskOptions' keys from the configs of plugin.named.org.dspace.curate.CurationTask + * + * @return List of the taskOptions' keys from the configs of plugin.named.org.dspace.curate.CurationTask + */ + public static List getTaskOptions() { + if (taskOptions == null) { + ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + String[] taskConfigs = configurationService.getArrayProperty("plugin.named.org.dspace.curate.CurationTask"); + taskOptions = new ArrayList<>(); + for (String taskConfig : taskConfigs) { + taskOptions.add(StringUtils.substringAfterLast(taskConfig, "=").trim()); + } + } + return taskOptions; + } +} diff --git a/dspace-api/src/main/java/org/dspace/curate/CurationScriptConfiguration.java b/dspace-api/src/main/java/org/dspace/curate/CurationScriptConfiguration.java new file mode 100644 index 0000000000..fefb4eb768 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/curate/CurationScriptConfiguration.java @@ -0,0 +1,61 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.curate; + +import java.sql.SQLException; + +import org.apache.commons.cli.Options; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.core.Context; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * The {@link ScriptConfiguration} for the {@link Curation} script + * + * @author Maria Verdonck (Atmire) on 23/06/2020 + */ +public class CurationScriptConfiguration extends ScriptConfiguration { + + @Autowired + private AuthorizeService authorizeService; + + private Class dspaceRunnableClass; + + @Override + public Class 
getDspaceRunnableClass() { + return this.dspaceRunnableClass; + } + + @Override + public void setDspaceRunnableClass(Class dspaceRunnableClass) { + this.dspaceRunnableClass = dspaceRunnableClass; + } + + /** + * Only admin can run Curation script via the scripts and processes endpoints. + * @param context The relevant DSpace context + * @return True if currentUser is admin, otherwise false + */ + @Override + public boolean isAllowedToExecute(Context context) { + try { + return authorizeService.isAdmin(context); + } catch (SQLException e) { + throw new RuntimeException("SQLException occurred when checking if the current user is an admin", e); + } + } + + @Override + public Options getOptions() { + if (options == null) { + super.options = CurationClientOptions.constructOptions(); + } + return options; + } +} diff --git a/dspace-api/src/main/java/org/dspace/curate/Curator.java b/dspace-api/src/main/java/org/dspace/curate/Curator.java index 44733174df..8f12750bae 100644 --- a/dspace-api/src/main/java/org/dspace/curate/Curator.java +++ b/dspace-api/src/main/java/org/dspace/curate/Curator.java @@ -98,6 +98,7 @@ public class Curator { communityService = ContentServiceFactory.getInstance().getCommunityService(); itemService = ContentServiceFactory.getInstance().getItemService(); handleService = HandleServiceFactory.getInstance().getHandleService(); + resolver = new TaskResolver(); } /** @@ -142,10 +143,10 @@ public class Curator { // performance order currently FIFO - to be revisited perfList.add(taskName); } catch (IOException ioE) { - log.error("Task: '" + taskName + "' initialization failure: " + ioE.getMessage()); + System.out.println("Task: '" + taskName + "' initialization failure: " + ioE.getMessage()); } } else { - log.error("Task: '" + taskName + "' does not resolve"); + System.out.println("Task: '" + taskName + "' does not resolve"); } return this; } @@ -259,13 +260,6 @@ public class Curator { /** * Performs all configured tasks upon DSpace object * (Community, Collection or Item). - *
<p>
- * Note: Site-wide tasks will default to running as - * an Anonymous User unless you call the Site-wide task - * via the {@link curate(Context,String)} or - * {@link #curate(Context, DSpaceObject)} method with an - * authenticated Context object. - * * @param dso the DSpace object * @throws IOException if IO error */ @@ -325,7 +319,7 @@ public class Curator { taskQ.enqueue(queueId, new TaskQueueEntry(c.getCurrentUser().getName(), System.currentTimeMillis(), perfList, id)); } else { - log.error("curate - no TaskQueue implemented"); + System.out.println("curate - no TaskQueue implemented"); } } @@ -346,7 +340,7 @@ public class Curator { try { reporter.append(message); } catch (IOException ex) { - log.error("Task reporting failure", ex); + System.out.println("Task reporting failure: " + ex); } } @@ -552,7 +546,7 @@ public class Curator { return !suspend(statusCode); } catch (IOException ioe) { //log error & pass exception upwards - log.error("Error executing curation task '" + task.getName() + "'", ioe); + System.out.println("Error executing curation task '" + task.getName() + "'; " + ioe); throw ioe; } } @@ -568,7 +562,7 @@ public class Curator { return !suspend(statusCode); } catch (IOException ioe) { //log error & pass exception upwards - log.error("Error executing curation task '" + task.getName() + "'", ioe); + System.out.println("Error executing curation task '" + task.getName() + "'; " + ioe); throw ioe; } } diff --git a/dspace-api/src/main/java/org/dspace/discovery/DiscoverQuery.java b/dspace-api/src/main/java/org/dspace/discovery/DiscoverQuery.java index d3efb3c626..d82779015f 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/DiscoverQuery.java +++ b/dspace-api/src/main/java/org/dspace/discovery/DiscoverQuery.java @@ -7,6 +7,9 @@ */ package org.dspace.discovery; +import static java.util.Collections.singletonList; +import static org.apache.commons.lang3.StringUtils.isNotBlank; + import java.util.ArrayList; import java.util.Arrays; import java.util.Collections; @@ -31,7 +34,7 @@ public class DiscoverQuery { **/ private String query; private List filterQueries; - private String DSpaceObjectFilter = null; + private List dspaceObjectFilters = new ArrayList<>(); private List fieldPresentQueries; private boolean spellCheck; @@ -118,20 +121,33 @@ public class DiscoverQuery { * Sets the DSpace object filter, must be an DSpace Object type integer * can be used to only return objects from a certain DSpace Object type * - * @param DSpaceObjectFilter the DSpace object filer + * @param dspaceObjectFilter the DSpace object filter */ - public void setDSpaceObjectFilter(String DSpaceObjectFilter) { - this.DSpaceObjectFilter = DSpaceObjectFilter; + public void setDSpaceObjectFilter(String dspaceObjectFilter) { + this.dspaceObjectFilters = singletonList(dspaceObjectFilter); } /** - * Gets the DSpace object filter - * can be used to only return objects from a certain DSpace Object type + * Adds a DSpace object filter, must be an DSpace Object type integer. + * Can be used to also return objects from a certain DSpace Object type. 
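// Usage sketch for the multi-valued object filter (not part of the patch). The
// IndexableItem.TYPE / IndexableCollection.TYPE constants and the example query string
// are assumptions made for illustration only.
DiscoverQuery discoverQuery = new DiscoverQuery();
discoverQuery.setQuery("dc.title:test*");
discoverQuery.addDSpaceObjectFilter(IndexableItem.TYPE);        // blank values are ignored
discoverQuery.addDSpaceObjectFilter(IndexableCollection.TYPE);
// SolrServiceImpl (below) joins these into one filter query, e.g.
//   search.resourcetype:Item OR search.resourcetype:Collection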
* - * @return the DSpace object filer + * @param dspaceObjectFilter the DSpace object filer */ - public String getDSpaceObjectFilter() { - return DSpaceObjectFilter; + public void addDSpaceObjectFilter(String dspaceObjectFilter) { + + if (isNotBlank(dspaceObjectFilter)) { + this.dspaceObjectFilters.add(dspaceObjectFilter); + } + } + + /** + * Gets the DSpace object filters + * can be used to only return objects from certain DSpace Object types + * + * @return the DSpace object filters + */ + public List getDSpaceObjectFilters() { + return dspaceObjectFilters; } /** diff --git a/dspace-api/src/main/java/org/dspace/discovery/IndexEventConsumer.java b/dspace-api/src/main/java/org/dspace/discovery/IndexEventConsumer.java index 43ea9eefb2..195c9cd6fc 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/IndexEventConsumer.java +++ b/dspace-api/src/main/java/org/dspace/discovery/IndexEventConsumer.java @@ -8,6 +8,7 @@ package org.dspace.discovery; import java.util.HashSet; +import java.util.Optional; import java.util.Set; import org.apache.logging.log4j.Logger; @@ -15,6 +16,7 @@ import org.dspace.content.Bundle; import org.dspace.content.DSpaceObject; import org.dspace.core.Constants; import org.dspace.core.Context; +import org.dspace.discovery.indexobject.factory.IndexFactory; import org.dspace.discovery.indexobject.factory.IndexObjectFactoryFactory; import org.dspace.event.Consumer; import org.dspace.event.Event; @@ -67,7 +69,7 @@ public class IndexEventConsumer implements Consumer { int st = event.getSubjectType(); if (!(st == Constants.ITEM || st == Constants.BUNDLE - || st == Constants.COLLECTION || st == Constants.COMMUNITY)) { + || st == Constants.COLLECTION || st == Constants.COMMUNITY || st == Constants.SITE)) { log .warn("IndexConsumer should not have been given this kind of Subject in an event, skipping: " + event.toString()); @@ -104,10 +106,28 @@ public class IndexEventConsumer implements Consumer { case Event.MODIFY: case Event.MODIFY_METADATA: if (subject == null) { - log.warn(event.getEventTypeAsString() + " event, could not get object for " + if (st == Constants.SITE) { + // Update the indexable objects of type in event.detail of objects with ids in event.identifiers + for (String id : event.getIdentifiers()) { + IndexFactory indexableObjectService = IndexObjectFactoryFactory.getInstance(). 
+ getIndexFactoryByType(event.getDetail()); + Optional indexableObject = Optional.empty(); + indexableObject = indexableObjectService.findIndexableObject(ctx, id); + if (indexableObject.isPresent()) { + log.debug("consume() adding event to update queue: " + event.toString()); + objectsToUpdate + .addAll(indexObjectServiceFactory + .getIndexableObjects(ctx, indexableObject.get().getIndexedObject())); + } else { + log.warn("Cannot resolve " + id); + } + } + } else { + log.warn(event.getEventTypeAsString() + " event, could not get object for " + event.getSubjectTypeAsString() + " id=" + event.getSubjectID() + ", perhaps it has been deleted."); + } } else { log.debug("consume() adding event to update queue: " + event.toString()); objectsToUpdate.addAll(indexObjectServiceFactory.getIndexableObjects(ctx, subject)); diff --git a/dspace-api/src/main/java/org/dspace/discovery/SolrServiceImpl.java b/dspace-api/src/main/java/org/dspace/discovery/SolrServiceImpl.java index 1c47d46162..88e32d0aaf 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/SolrServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/discovery/SolrServiceImpl.java @@ -7,6 +7,8 @@ */ package org.dspace.discovery; +import static java.util.stream.Collectors.joining; + import java.io.IOException; import java.io.PrintWriter; import java.io.StringWriter; @@ -751,8 +753,13 @@ public class SolrServiceImpl implements SearchService, IndexingService { String filterQuery = discoveryQuery.getFilterQueries().get(i); solrQuery.addFilterQuery(filterQuery); } - if (discoveryQuery.getDSpaceObjectFilter() != null) { - solrQuery.addFilterQuery(SearchUtils.RESOURCE_TYPE_FIELD + ":" + discoveryQuery.getDSpaceObjectFilter()); + if (discoveryQuery.getDSpaceObjectFilters() != null) { + solrQuery.addFilterQuery( + discoveryQuery.getDSpaceObjectFilters() + .stream() + .map(filter -> SearchUtils.RESOURCE_TYPE_FIELD + ":" + filter) + .collect(joining(" OR ")) + ); } for (int i = 0; i < discoveryQuery.getFieldPresentQueries().size(); i++) { diff --git a/dspace-api/src/main/java/org/dspace/discovery/SolrServiceMetadataBrowseIndexingPlugin.java b/dspace-api/src/main/java/org/dspace/discovery/SolrServiceMetadataBrowseIndexingPlugin.java index 187c6b0600..2b2be66384 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/SolrServiceMetadataBrowseIndexingPlugin.java +++ b/dspace-api/src/main/java/org/dspace/discovery/SolrServiceMetadataBrowseIndexingPlugin.java @@ -17,6 +17,7 @@ import org.apache.logging.log4j.Logger; import org.apache.solr.common.SolrInputDocument; import org.dspace.browse.BrowseException; import org.dspace.browse.BrowseIndex; +import org.dspace.content.Collection; import org.dspace.content.Item; import org.dspace.content.MetadataValue; import org.dspace.content.authority.service.ChoiceAuthorityService; @@ -63,7 +64,7 @@ public class SolrServiceMetadataBrowseIndexingPlugin implements SolrServiceIndex return; } Item item = ((IndexableItem) indexableObject).getIndexedObject(); - + Collection collection = item.getOwningCollection(); // Get the currently configured browse indexes BrowseIndex[] bis; try { @@ -175,7 +176,7 @@ public class SolrServiceMetadataBrowseIndexingPlugin implements SolrServiceIndex true); if (!ignorePrefered) { preferedLabel = choiceAuthorityService - .getLabel(values.get(x), values.get(x).getLanguage()); + .getLabel(values.get(x), collection, values.get(x).getLanguage()); } List variants = null; @@ -195,7 +196,7 @@ public class SolrServiceMetadataBrowseIndexingPlugin implements SolrServiceIndex if (!ignoreVariants) 
{ variants = choiceAuthorityService .getVariants( - values.get(x)); + values.get(x), collection); } if (StringUtils diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/AbstractIndexableObject.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/AbstractIndexableObject.java new file mode 100644 index 0000000000..90aafcbd30 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/AbstractIndexableObject.java @@ -0,0 +1,43 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.discovery.indexobject; + +import java.io.Serializable; + +import org.dspace.core.ReloadableEntity; +import org.dspace.discovery.IndexableObject; + +/** + * This class exists in order to provide a default implementation for the equals and hashCode methods. + * Since IndexableObjects can be made multiple times for the same underlying object, we needed a more finetuned + * equals and hashcode methods. We're simply checking that the underlying objects are equal and generating the hashcode + * for the underlying object. This way, we'll always get a proper result when calling equals or hashcode on an + * IndexableObject because it'll depend on the underlying object + * @param Refers to the underlying entity that is linked to this object + * @param The type of ID that this entity uses + */ +public abstract class AbstractIndexableObject, PK extends Serializable> + implements IndexableObject { + + @Override + public boolean equals(Object obj) { + //Two IndexableObjects of the same DSpaceObject are considered equal + if (!(obj instanceof AbstractIndexableObject)) { + return false; + } + IndexableDSpaceObject other = (IndexableDSpaceObject) obj; + return other.getIndexedObject().equals(getIndexedObject()); + } + + @Override + public int hashCode() { + //Two IndexableObjects of the same DSpaceObject are considered equal + return getIndexedObject().hashCode(); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexFactoryImpl.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexFactoryImpl.java index ca1423e593..2e4eb67723 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexFactoryImpl.java +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexFactoryImpl.java @@ -12,6 +12,7 @@ import java.sql.SQLException; import java.util.Date; import java.util.List; +import org.apache.commons.collections4.ListUtils; import org.apache.commons.lang3.StringUtils; import org.apache.solr.client.solrj.SolrClient; import org.apache.solr.client.solrj.SolrServerException; @@ -56,7 +57,7 @@ public abstract class IndexFactoryImpl implements doc.addField(SearchUtils.RESOURCE_ID_FIELD, indexableObject.getID().toString()); //Do any additional indexing, depends on the plugins - for (SolrServiceIndexPlugin solrServiceIndexPlugin : solrServiceIndexPlugins) { + for (SolrServiceIndexPlugin solrServiceIndexPlugin : ListUtils.emptyIfNull(solrServiceIndexPlugins)) { solrServiceIndexPlugin.additionalIndex(context, indexableObject, doc); } @@ -190,4 +191,4 @@ public abstract class IndexFactoryImpl implements public void deleteAll() throws IOException, SolrServerException { solrSearchCore.getSolr().deleteByQuery(SearchUtils.RESOURCE_TYPE_FIELD + ":" + getType()); } -} \ No newline at end of file +} diff --git 
a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableClaimedTask.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableClaimedTask.java index 3810b6803f..b96899b618 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableClaimedTask.java +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableClaimedTask.java @@ -7,7 +7,6 @@ */ package org.dspace.discovery.indexobject; -import org.dspace.discovery.IndexableObject; import org.dspace.xmlworkflow.storedcomponents.ClaimedTask; /** @@ -15,7 +14,7 @@ import org.dspace.xmlworkflow.storedcomponents.ClaimedTask; * * @author Kevin Van de Velde (kevin at atmire dot com) */ -public class IndexableClaimedTask implements IndexableObject { +public class IndexableClaimedTask extends AbstractIndexableObject { private ClaimedTask claimedTask; public static final String TYPE = ClaimedTask.class.getSimpleName(); diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableDSpaceObject.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableDSpaceObject.java index 7ad82b1a95..7abc11eb7f 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableDSpaceObject.java +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableDSpaceObject.java @@ -10,7 +10,6 @@ package org.dspace.discovery.indexobject; import java.util.UUID; import org.dspace.content.DSpaceObject; -import org.dspace.discovery.IndexableObject; /** * DSpaceObject implementation for the IndexableObject, contains methods used by all DSpaceObject methods @@ -18,7 +17,7 @@ import org.dspace.discovery.IndexableObject; * * @author Kevin Van de Velde (kevin at atmire dot com) */ -public abstract class IndexableDSpaceObject implements IndexableObject { +public abstract class IndexableDSpaceObject extends AbstractIndexableObject { private T dso; @@ -40,4 +39,6 @@ public abstract class IndexableDSpaceObject implements I public UUID getID() { return dso.getID(); } -} \ No newline at end of file + + +} diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableInProgressSubmission.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableInProgressSubmission.java index cfa27ff814..d6dd785801 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableInProgressSubmission.java +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableInProgressSubmission.java @@ -8,14 +8,13 @@ package org.dspace.discovery.indexobject; import org.dspace.content.InProgressSubmission; -import org.dspace.discovery.IndexableObject; /** * InProgressSubmission implementation for the IndexableObject * @author Kevin Van de Velde (kevin at atmire dot com) */ public abstract class IndexableInProgressSubmission - implements IndexableObject { + extends AbstractIndexableObject { protected T inProgressSubmission; diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableMetadataField.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableMetadataField.java new file mode 100644 index 0000000000..70e63d19ba --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexableMetadataField.java @@ -0,0 +1,51 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.discovery.indexobject; 
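A small sketch of the equals/hashCode contract introduced by AbstractIndexableObject above and inherited by the wrappers converted in this patch: two wrappers around the same entity now compare equal, which is what lets the indexer de-duplicate them (illustration only, not part of the patch; assumes the existing IndexableItem wrapper and an already-loaded item):

    Set<IndexableObject> objectsToUpdate = new HashSet<>();
    objectsToUpdate.add(new IndexableItem(item));
    objectsToUpdate.add(new IndexableItem(item)); // a second wrapper around the same Item
    // equals()/hashCode() are delegated to the wrapped Item, so the set keeps a single
    // entry and the item is only (re)indexed once
    assert objectsToUpdate.size() == 1;
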
+ +import org.dspace.content.MetadataField; +import org.dspace.discovery.IndexableObject; + +/** + * {@link MetadataField} implementation for the {@link IndexableObject} + * + * @author Maria Verdonck (Atmire) on 14/07/2020 + */ +public class IndexableMetadataField extends AbstractIndexableObject { + + private MetadataField metadataField; + public static final String TYPE = MetadataField.class.getSimpleName(); + + public IndexableMetadataField(MetadataField metadataField) { + this.metadataField = metadataField; + } + + @Override + public String getType() { + return TYPE; + } + + @Override + public Integer getID() { + return this.metadataField.getID(); + } + + @Override + public MetadataField getIndexedObject() { + return this.metadataField; + } + + @Override + public void setIndexedObject(MetadataField metadataField) { + this.metadataField = metadataField; + } + + @Override + public String getTypeText() { + return TYPE.toUpperCase(); + } +} diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexablePoolTask.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexablePoolTask.java index 6eea1f0ebb..39fdb8b8b5 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexablePoolTask.java +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/IndexablePoolTask.java @@ -7,14 +7,13 @@ */ package org.dspace.discovery.indexobject; -import org.dspace.discovery.IndexableObject; import org.dspace.xmlworkflow.storedcomponents.PoolTask; /** * PoolTask implementation for the IndexableObject * @author Kevin Van de Velde (kevin at atmire dot com) */ -public class IndexablePoolTask implements IndexableObject { +public class IndexablePoolTask extends AbstractIndexableObject { public static final String TYPE = PoolTask.class.getSimpleName(); diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/ItemIndexFactoryImpl.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/ItemIndexFactoryImpl.java index 7f98131566..2a1008aaf9 100644 --- a/dspace-api/src/main/java/org/dspace/discovery/indexobject/ItemIndexFactoryImpl.java +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/ItemIndexFactoryImpl.java @@ -173,6 +173,8 @@ public class ItemIndexFactoryImpl extends DSpaceObjectIndexFactoryImpl discoveryConfigurations) throws SQLException, IOException { + // use the item service to retrieve the owning collection also for inprogress submission + Collection collection = (Collection) itemService.getParentObject(context, item); //Keep a list of our sort values which we added, sort values can only be added once List sortFieldsAdded = new ArrayList<>(); Map> searchFilters = null; @@ -359,7 +361,7 @@ public class ItemIndexFactoryImpl extends DSpaceObjectIndexFactoryImpl + implements MetadataFieldIndexFactory { + + public static final String SCHEMA_FIELD_NAME = "schema"; + public static final String ELEMENT_FIELD_NAME = "element"; + public static final String QUALIFIER_FIELD_NAME = "qualifier"; + public static final String FIELD_NAME_VARIATIONS = "fieldName"; + + protected GroupService groupService = EPersonServiceFactory.getInstance().getGroupService(); + + @Override + public SolrInputDocument buildDocument(Context context, IndexableMetadataField indexableObject) throws SQLException, + IOException { + // Add the ID's, types and call the SolrServiceIndexPlugins + final SolrInputDocument doc = super.buildDocument(context, indexableObject); + final MetadataField metadataField = indexableObject.getIndexedObject(); + // add schema, 
element, qualifier and full fieldName + addFacetIndex(doc, SCHEMA_FIELD_NAME, metadataField.getMetadataSchema().getName(), + metadataField.getMetadataSchema().getName()); + addFacetIndex(doc, ELEMENT_FIELD_NAME, metadataField.getElement(), metadataField.getElement()); + String fieldName = metadataField.toString().replace('_', '.'); + addFacetIndex(doc, FIELD_NAME_VARIATIONS, fieldName, fieldName); + if (StringUtils.isNotBlank(metadataField.getQualifier())) { + addFacetIndex(doc, QUALIFIER_FIELD_NAME, metadataField.getQualifier(), metadataField.getQualifier()); + addFacetIndex(doc, FIELD_NAME_VARIATIONS, fieldName, + metadataField.getElement() + "." + metadataField.getQualifier()); + addFacetIndex(doc, FIELD_NAME_VARIATIONS, metadataField.getQualifier(), metadataField.getQualifier()); + } else { + addFacetIndex(doc, FIELD_NAME_VARIATIONS, metadataField.getElement(), metadataField.getElement()); + } + addNamedResourceTypeIndex(doc, indexableObject.getTypeText()); + Group anonymousGroup = groupService.findByName(context, Group.ANONYMOUS); + // add read permission on doc for anonymous group + doc.addField("read", "g" + anonymousGroup.getID()); + return doc; + } + + @Autowired + private MetadataFieldService metadataFieldService; + + @Override + public Iterator findAll(Context context) throws SQLException { + final Iterator metadataFields = metadataFieldService.findAll(context).iterator(); + return new Iterator<>() { + @Override + public boolean hasNext() { + return metadataFields.hasNext(); + } + + @Override + public IndexableMetadataField next() { + return new IndexableMetadataField(metadataFields.next()); + } + }; + } + + @Override + public String getType() { + return IndexableMetadataField.TYPE; + } + + @Override + public Optional findIndexableObject(Context context, String id) throws SQLException { + final MetadataField metadataField = metadataFieldService.find(context, Integer.parseInt(id)); + return metadataField == null ? 
Optional.empty() : Optional.of(new IndexableMetadataField(metadataField)); + } + + @Override + public boolean supports(Object object) { + return object instanceof MetadataField; + } + + @Override + public List getIndexableObjects(Context context, MetadataField object) { + return Arrays.asList(new IndexableMetadataField(object)); + } +} diff --git a/dspace-api/src/main/java/org/dspace/discovery/indexobject/factory/MetadataFieldIndexFactory.java b/dspace-api/src/main/java/org/dspace/discovery/indexobject/factory/MetadataFieldIndexFactory.java new file mode 100644 index 0000000000..976cc4511c --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/discovery/indexobject/factory/MetadataFieldIndexFactory.java @@ -0,0 +1,19 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.discovery.indexobject.factory; + +import org.dspace.content.MetadataField; +import org.dspace.discovery.indexobject.IndexableMetadataField; + +/** + * Factory interface for indexing/retrieving {@link org.dspace.content.MetadataField} items in the search core + * + * @author Maria Verdonck (Atmire) on 14/07/2020 + */ +public interface MetadataFieldIndexFactory extends IndexFactory { +} diff --git a/dspace-api/src/main/java/org/dspace/disseminate/service/CitationDocumentService.java b/dspace-api/src/main/java/org/dspace/disseminate/service/CitationDocumentService.java index d6c7935a86..4a59de3f5f 100644 --- a/dspace-api/src/main/java/org/dspace/disseminate/service/CitationDocumentService.java +++ b/dspace-api/src/main/java/org/dspace/disseminate/service/CitationDocumentService.java @@ -38,7 +38,7 @@ public interface CitationDocumentService { * Citation enabled globally (all citable bitstreams will get "watermarked") modules/disseminate-citation: * enable_globally * OR - * The container is this object is whitelist enabled. + * The container is this object is "allow list" enabled. 
* - community: modules/disseminate-citation: enabled_communities * - collection: modules/disseminate-citation: enabled_collections * AND diff --git a/dspace-api/src/main/java/org/dspace/eperson/AccountServiceImpl.java b/dspace-api/src/main/java/org/dspace/eperson/AccountServiceImpl.java index e00a9568e3..40da31a0f9 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/AccountServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/eperson/AccountServiceImpl.java @@ -12,6 +12,7 @@ import java.sql.SQLException; import java.util.Locale; import javax.mail.MessagingException; +import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.authorize.AuthorizeException; import org.dspace.core.ConfigurationManager; @@ -22,6 +23,7 @@ import org.dspace.core.Utils; import org.dspace.eperson.service.AccountService; import org.dspace.eperson.service.EPersonService; import org.dspace.eperson.service.RegistrationDataService; +import org.dspace.services.ConfigurationService; import org.springframework.beans.factory.annotation.Autowired; /** @@ -47,6 +49,8 @@ public class AccountServiceImpl implements AccountService { protected EPersonService ePersonService; @Autowired(required = true) protected RegistrationDataService registrationDataService; + @Autowired + private ConfigurationService configurationService; protected AccountServiceImpl() { @@ -67,6 +71,9 @@ public class AccountServiceImpl implements AccountService { public void sendRegistrationInfo(Context context, String email) throws SQLException, IOException, MessagingException, AuthorizeException { + if (!configurationService.getBooleanProperty("user.registration", true)) { + throw new IllegalStateException("The user.registration parameter was set to false"); + } sendInfo(context, email, true, true); } @@ -155,6 +162,14 @@ public class AccountServiceImpl implements AccountService { registrationDataService.deleteByToken(context, token); } + @Override + public boolean verifyPasswordStructure(String password) { + if (StringUtils.length(password) < 6) { + return false; + } + return true; + } + /** * THIS IS AN INTERNAL METHOD. THE SEND PARAMETER ALLOWS IT TO BE USED FOR * TESTING PURPOSES. @@ -233,8 +248,8 @@ public class AccountServiceImpl implements AccountService { // Note change from "key=" to "token=" String specialLink = new StringBuffer().append(base).append( base.endsWith("/") ? "" : "/").append( - isRegister ? "register" : "forgot").append("?") - .append("token=").append(rd.getToken()) + isRegister ? "register" : "forgot").append("/") + .append(rd.getToken()) .toString(); Locale locale = context.getCurrentLocale(); Email bean = Email.getEmail(I18nUtil.getEmailFilename(locale, isRegister ? 
"register" diff --git a/dspace-api/src/main/java/org/dspace/eperson/EPerson.java b/dspace-api/src/main/java/org/dspace/eperson/EPerson.java index fc2950ee2b..3c48a5244a 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/EPerson.java +++ b/dspace-api/src/main/java/org/dspace/eperson/EPerson.java @@ -141,7 +141,7 @@ public class EPerson extends DSpaceObject implements DSpaceObjectLegacySupport { return false; } final EPerson other = (EPerson) obj; - if (this.getID() != other.getID()) { + if (!this.getID().equals(other.getID())) { return false; } if (!StringUtils.equals(this.getEmail(), other.getEmail())) { diff --git a/dspace-api/src/main/java/org/dspace/eperson/EPersonCLITool.java b/dspace-api/src/main/java/org/dspace/eperson/EPersonCLITool.java index 850cb992bc..547044d460 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/EPersonCLITool.java +++ b/dspace-api/src/main/java/org/dspace/eperson/EPersonCLITool.java @@ -7,8 +7,11 @@ */ package org.dspace.eperson; +import java.io.BufferedReader; import java.io.IOException; +import java.io.InputStreamReader; import java.sql.SQLException; +import java.util.List; import java.util.Locale; import org.apache.commons.cli.CommandLine; @@ -196,7 +199,6 @@ public class EPersonCLITool { try { ePersonService.update(context, eperson); - context.complete(); System.out.printf("Created EPerson %s\n", eperson.getID().toString()); } catch (SQLException ex) { context.abort(); @@ -259,16 +261,26 @@ public class EPersonCLITool { } try { - ePersonService.delete(context, eperson); - context.complete(); - System.out.printf("Deleted EPerson %s\n", eperson.getID().toString()); - } catch (SQLException ex) { - System.err.println(ex.getMessage()); - return 1; - } catch (AuthorizeException ex) { - System.err.println(ex.getMessage()); - return 1; - } catch (IOException ex) { + List tableList = ePersonService.getDeleteConstraints(context, eperson); + if (!tableList.isEmpty()) { + System.out.printf("The EPerson with ID: %s is referenced by the following database tables:%n", + eperson.getID().toString()); + tableList.forEach((s) -> { + System.out.println(s); + }); + } + System.out.printf("Are you sure you want to delete this EPerson with ID: %s? 
(y or n): ", + eperson.getID().toString()); + BufferedReader input = new BufferedReader(new InputStreamReader(System.in)); + System.out.flush(); + String s = input.readLine(); + if (s != null && s.trim().toLowerCase().startsWith("y")) { + ePersonService.delete(context, eperson); + System.out.printf("%nDeleted EPerson with ID: %s", eperson.getID().toString()); + } else { + System.out.printf("%nAbort Deletion of EPerson with ID: %s %n", eperson.getID().toString()); + } + } catch (SQLException | AuthorizeException | IOException ex) { System.err.println(ex.getMessage()); return 1; } @@ -373,7 +385,6 @@ public class EPersonCLITool { if (modified) { try { ePersonService.update(context, eperson); - context.complete(); System.out.printf("Modified EPerson %s\n", eperson.getID().toString()); } catch (SQLException ex) { context.abort(); diff --git a/dspace-api/src/main/java/org/dspace/eperson/EPersonDeletionException.java b/dspace-api/src/main/java/org/dspace/eperson/EPersonDeletionException.java index 5429f3d102..b86d5f5e8e 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/EPersonDeletionException.java +++ b/dspace-api/src/main/java/org/dspace/eperson/EPersonDeletionException.java @@ -9,6 +9,8 @@ package org.dspace.eperson; import java.util.List; +import org.apache.commons.lang3.ArrayUtils; + /** * Exception indicating that an EPerson may not be deleted due to the presence * of the EPerson's ID in certain tables @@ -33,7 +35,10 @@ public class EPersonDeletionException extends Exception { * deleted if it exists in these tables. */ public EPersonDeletionException(List tableList) { - super(); + // this may not be the most beautiful way to print the tablenames as part or the error message. + // but it has to be a one liner, as the super() call must be the first statement in the constructor. 
+ super("Cannot delete EPerson as it is referenced by the following database tables: " + + ArrayUtils.toString(tableList.toArray())); myTableList = tableList; } diff --git a/dspace-api/src/main/java/org/dspace/eperson/EPersonServiceImpl.java b/dspace-api/src/main/java/org/dspace/eperson/EPersonServiceImpl.java index f173250cf3..ab9f7831c7 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/EPersonServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/eperson/EPersonServiceImpl.java @@ -7,10 +7,13 @@ */ package org.dspace.eperson; +import java.io.IOException; import java.sql.SQLException; import java.util.ArrayList; import java.util.Arrays; +import java.util.Collections; import java.util.Date; +import java.util.HashSet; import java.util.Iterator; import java.util.List; import java.util.Set; @@ -21,26 +24,56 @@ import org.apache.commons.collections4.CollectionUtils; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.authorize.AuthorizeException; +import org.dspace.authorize.factory.AuthorizeServiceFactory; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.authorize.service.ResourcePolicyService; import org.dspace.content.DSpaceObjectServiceImpl; import org.dspace.content.Item; import org.dspace.content.MetadataField; +import org.dspace.content.WorkspaceItem; +import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.ItemService; +import org.dspace.content.service.WorkspaceItemService; import org.dspace.core.Constants; import org.dspace.core.Context; import org.dspace.core.LogManager; import org.dspace.core.Utils; import org.dspace.eperson.dao.EPersonDAO; import org.dspace.eperson.service.EPersonService; +import org.dspace.eperson.service.GroupService; import org.dspace.eperson.service.SubscribeService; import org.dspace.event.Event; +import org.dspace.versioning.Version; +import org.dspace.versioning.VersionHistory; +import org.dspace.versioning.dao.VersionDAO; +import org.dspace.versioning.factory.VersionServiceFactory; +import org.dspace.versioning.service.VersionHistoryService; +import org.dspace.versioning.service.VersioningService; import org.dspace.workflow.WorkflowService; import org.dspace.workflow.factory.WorkflowServiceFactory; +import org.dspace.workflowbasic.BasicWorkflowItem; +import org.dspace.workflowbasic.BasicWorkflowServiceImpl; +import org.dspace.workflowbasic.factory.BasicWorkflowServiceFactory; +import org.dspace.workflowbasic.service.BasicWorkflowItemService; +import org.dspace.workflowbasic.service.BasicWorkflowService; +import org.dspace.workflowbasic.service.TaskListItemService; +import org.dspace.xmlworkflow.WorkflowConfigurationException; +import org.dspace.xmlworkflow.factory.XmlWorkflowServiceFactory; +import org.dspace.xmlworkflow.service.WorkflowRequirementsService; +import org.dspace.xmlworkflow.service.XmlWorkflowService; +import org.dspace.xmlworkflow.storedcomponents.ClaimedTask; +import org.dspace.xmlworkflow.storedcomponents.CollectionRole; +import org.dspace.xmlworkflow.storedcomponents.XmlWorkflowItem; +import org.dspace.xmlworkflow.storedcomponents.service.ClaimedTaskService; +import org.dspace.xmlworkflow.storedcomponents.service.CollectionRoleService; +import org.dspace.xmlworkflow.storedcomponents.service.PoolTaskService; +import org.dspace.xmlworkflow.storedcomponents.service.WorkflowItemRoleService; +import org.dspace.xmlworkflow.storedcomponents.service.XmlWorkflowItemService; import 
org.springframework.beans.factory.annotation.Autowired; /** - * Service implementation for the EPerson object. - * This class is responsible for all business logic calls for the EPerson object and is autowired by spring. + * Service implementation for the EPerson object. This class is responsible for + * all business logic calls for the EPerson object and is autowired by spring. * This class should never be accessed directly. * * @author kevinvandevelde at atmire.com @@ -60,7 +93,17 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme @Autowired(required = true) protected ItemService itemService; @Autowired(required = true) + protected WorkflowItemRoleService workflowItemRoleService; + @Autowired(required = true) + CollectionRoleService collectionRoleService; + @Autowired(required = true) + protected GroupService groupService; + @Autowired(required = true) protected SubscribeService subscribeService; + @Autowired(required = true) + protected VersionDAO versionDAO; + @Autowired(required = true) + protected ClaimedTaskService claimedTaskService; protected EPersonServiceImpl() { super(); @@ -129,7 +172,7 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme query = null; } return ePersonDAO.search(context, query, Arrays.asList(firstNameField, lastNameField), - Arrays.asList(firstNameField, lastNameField), offset, limit); + Arrays.asList(firstNameField, lastNameField), offset, limit); } } @@ -179,45 +222,202 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme // authorized? if (!authorizeService.isAdmin(context)) { throw new AuthorizeException( - "You must be an admin to create an EPerson"); + "You must be an admin to create an EPerson"); } // Create a table row EPerson e = ePersonDAO.create(context, new EPerson()); log.info(LogManager.getHeader(context, "create_eperson", "eperson_id=" - + e.getID())); + + e.getID())); context.addEvent(new Event(Event.CREATE, Constants.EPERSON, e.getID(), - null, getIdentifiers(context, e))); + null, getIdentifiers(context, e))); return e; } @Override public void delete(Context context, EPerson ePerson) throws SQLException, AuthorizeException { + try { + delete(context, ePerson, true); + } catch (AuthorizeException ex) { + log.error("This AuthorizeException: " + ex + " occurred while deleting EPerson with the ID: " + + ePerson.getID()); + throw new AuthorizeException(ex); + } catch (IOException ex) { + log.error("This IOException: " + ex + " occurred while deleting EPerson with the ID: " + ePerson.getID()); + throw new AuthorizeException(ex); + } catch (EPersonDeletionException e) { + throw new IllegalStateException(e); + } + } + + /** + * Deletes an EPerson. The argument cascade defines whether all references + * on an EPerson should be deleted as well (by either deleting the + * referencing object - e.g. WorkspaceItem, ResourcePolicy - or by setting + * the foreign key null - e.g. archived Items). If cascade is set to false + * and the EPerson is referenced somewhere, this leads to an + * EPersonDeletionException. EPersons may be referenced by Items, ResourcePolicies + * and workflow tasks. + * + * @param context DSpace context + * @param ePerson The EPerson to delete. + * @param cascade Whether to delete references on the EPerson (cascade = + * true) or to abort the deletion (cascade = false) if the EPerson is + * referenced within DSpace.
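A usage sketch for the new three-argument delete described above (illustration only, not part of the patch; the email address is invented, checked SQL/authorization exceptions are assumed to propagate from the enclosing method, and the three-argument form is the one declared on EPersonServiceImpl in this patch). With cascade set to false the method refuses to clean up references and reports them instead:

    EPerson leaver = ePersonService.findByEmail(context, "leaver@example.org");
    try {
        ePersonService.delete(context, leaver, false);   // do not touch referencing rows
    } catch (EPersonDeletionException e) {
        // the new constructor above puts the blocking tables into the message,
        // e.g. item, workspaceitem, resourcepolicy
        System.err.println(e.getMessage());
    }
    // delete(context, leaver, true) would instead delete or null out those references first
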
+ * + * @throws SQLException + * @throws AuthorizeException + * @throws IOException + */ + public void delete(Context context, EPerson ePerson, boolean cascade) + throws SQLException, AuthorizeException, IOException, EPersonDeletionException { // authorized? if (!authorizeService.isAdmin(context)) { throw new AuthorizeException( - "You must be an admin to delete an EPerson"); + "You must be an admin to delete an EPerson"); + } + Set<Group> workFlowGroups = getAllWorkFlowGroups(context, ePerson); + for (Group group: workFlowGroups) { + List<EPerson> ePeople = groupService.allMembers(context, group); + if (ePeople.size() == 1 && ePeople.contains(ePerson)) { + throw new IllegalStateException( + "Refused to delete user " + ePerson.getID() + " because it is the only member of the workflow group " + + group.getID() + ". Delete the tasks and group first if you want to remove this user."); + } } - // check for presence of eperson in tables that // have constraints on eperson_id List constraintList = getDeleteConstraints(context, ePerson); - - // if eperson exists in tables that have constraints - // on eperson, throw an exception if (constraintList.size() > 0) { - throw new AuthorizeException(new EPersonDeletionException(constraintList)); - } + // Check if the constraints we found should be deleted + if (cascade) { + boolean isBasicFramework = WorkflowServiceFactory.getInstance().getWorkflowService() + instanceof BasicWorkflowService; + boolean isXmlFramework = WorkflowServiceFactory.getInstance().getWorkflowService() + instanceof XmlWorkflowService; + Iterator<String> constraintsIterator = constraintList.iterator(); + while (constraintsIterator.hasNext()) { + String tableName = constraintsIterator.next(); + if (StringUtils.equals(tableName, "item") || StringUtils.equals(tableName, "workspaceitem")) { + Iterator<Item> itemIterator = itemService.findBySubmitter(context, ePerson, true); + + VersionHistoryService versionHistoryService = VersionServiceFactory.getInstance() + .getVersionHistoryService(); + VersioningService versioningService = VersionServiceFactory.getInstance().getVersionService(); + + while (itemIterator.hasNext()) { + Item item = itemIterator.next(); + + VersionHistory versionHistory = versionHistoryService.findByItem(context, item); + if (null != versionHistory) { + for (Version version : versioningService.getVersionsByHistory(context, + versionHistory)) { + version.setePerson(null); + versionDAO.save(context, version); + } + } + WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance() + .getWorkspaceItemService(); + WorkspaceItem wsi = workspaceItemService.findByItem(context, item); + + if (null != wsi) { + workspaceItemService.deleteAll(context, wsi); + } else { + // we can do that as dc.provenance still contains + // information about who submitted and who + // archived an item.
+ item.setSubmitter(null); + itemService.update(context, item); + } + } + } else if (StringUtils.equals(tableName, "cwf_claimtask") && isXmlFramework) { + // Unclaim all XmlWorkflow tasks + XmlWorkflowItemService xmlWorkflowItemService = XmlWorkflowServiceFactory + .getInstance().getXmlWorkflowItemService(); + ClaimedTaskService claimedTaskService = XmlWorkflowServiceFactory + .getInstance().getClaimedTaskService(); + XmlWorkflowService xmlWorkflowService = XmlWorkflowServiceFactory + .getInstance().getXmlWorkflowService(); + WorkflowRequirementsService workflowRequirementsService = XmlWorkflowServiceFactory + .getInstance().getWorkflowRequirementsService(); + + List xmlWorkflowItems = xmlWorkflowItemService + .findBySubmitter(context, ePerson); + List claimedTasks = claimedTaskService.findByEperson(context, ePerson); + + for (ClaimedTask task : claimedTasks) { + xmlWorkflowService.deleteClaimedTask(context, task.getWorkflowItem(), task); + + try { + workflowRequirementsService.removeClaimedUser(context, task.getWorkflowItem(), + ePerson, task.getStepID()); + } catch (WorkflowConfigurationException ex) { + log.error("This WorkflowConfigurationException: " + ex + + " occured while deleting Eperson with the ID: " + ePerson.getID()); + throw new AuthorizeException(new EPersonDeletionException(Collections + .singletonList(tableName))); + } + } + } else if (StringUtils.equals(tableName, "workflowitem") && isBasicFramework) { + // Remove basicWorkflow workflowitem and unclaim them + BasicWorkflowItemService basicWorkflowItemService = BasicWorkflowServiceFactory.getInstance() + .getBasicWorkflowItemService(); + BasicWorkflowService basicWorkflowService = BasicWorkflowServiceFactory.getInstance() + .getBasicWorkflowService(); + TaskListItemService taskListItemService = BasicWorkflowServiceFactory.getInstance() + .getTaskListItemService(); + List workflowItems = basicWorkflowItemService.findByOwner(context, ePerson); + for (BasicWorkflowItem workflowItem : workflowItems) { + int state = workflowItem.getState(); + // unclaim tasks that are in the pool. + if (state == BasicWorkflowServiceImpl.WFSTATE_STEP1 + || state == BasicWorkflowServiceImpl.WFSTATE_STEP2 + || state == BasicWorkflowServiceImpl.WFSTATE_STEP3) { + log.info(LogManager.getHeader(context, "unclaim_workflow", + "workflow_id=" + workflowItem.getID() + ", claiming EPerson is deleted")); + basicWorkflowService.unclaim(context, workflowItem, context.getCurrentUser()); + // remove the EPerson from the list of persons that can (re-)claim the task + // while we are doing it below, we must do this here as well as the previously + // unclaimed tasks was put back into pool and we do not know the order the tables + // are checked. + taskListItemService.deleteByWorkflowItemAndEPerson(context, workflowItem, ePerson); + } + } + } else if (StringUtils.equals(tableName, "resourcepolicy")) { + // we delete the EPerson, it won't need any rights anymore. + authorizeService.removeAllEPersonPolicies(context, ePerson); + } else if (StringUtils.equals(tableName, "tasklistitem") && isBasicFramework) { + // remove EPerson from the list of EPersons that may claim some specific workflow tasks. 
+ TaskListItemService taskListItemService = BasicWorkflowServiceFactory.getInstance() + .getTaskListItemService(); + taskListItemService.deleteByEPerson(context, ePerson); + } else if (StringUtils.equals(tableName, "cwf_pooltask") && isXmlFramework) { + PoolTaskService poolTaskService = XmlWorkflowServiceFactory.getInstance().getPoolTaskService(); + poolTaskService.deleteByEperson(context, ePerson); + } else if (StringUtils.equals(tableName, "cwf_workflowitemrole") && isXmlFramework) { + WorkflowItemRoleService workflowItemRoleService = XmlWorkflowServiceFactory.getInstance() + .getWorkflowItemRoleService(); + workflowItemRoleService.deleteByEPerson(context, ePerson); + } else { + log.warn("EPerson is referenced in table '" + tableName + + "'. Deletion of EPerson " + ePerson.getID() + " may fail " + + "if the database does not handle this " + + "reference."); + } + } + } else { + throw new EPersonDeletionException(constraintList); + } + } context.addEvent(new Event(Event.DELETE, Constants.EPERSON, ePerson.getID(), ePerson.getEmail(), - getIdentifiers(context, ePerson))); + getIdentifiers(context, ePerson))); // XXX FIXME: This sidesteps the object model code so it won't // generate REMOVE events on the affected Groups. - // Remove any group memberships first // Remove any group memberships first Iterator groups = ePerson.getGroups().iterator(); @@ -234,7 +434,20 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme ePersonDAO.delete(context, ePerson); log.info(LogManager.getHeader(context, "delete_eperson", - "eperson_id=" + ePerson.getID())); + "eperson_id=" + ePerson.getID())); + } + + private Set getAllWorkFlowGroups(Context context, EPerson ePerson) throws SQLException { + Set workFlowGroups = new HashSet<>(); + + Set groups = groupService.allMemberGroupsSet(context, ePerson); + for (Group group: groups) { + List collectionRoles = collectionRoleService.findByGroup(context, group); + if (!collectionRoles.isEmpty()) { + workFlowGroups.add(group); + } + } + return workFlowGroups; } @Override @@ -268,8 +481,8 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme PasswordHash hash = null; try { hash = new PasswordHash(ePerson.getDigestAlgorithm(), - ePerson.getSalt(), - ePerson.getPassword()); + ePerson.getSalt(), + ePerson.getPassword()); } catch (DecoderException ex) { log.error("Problem decoding stored salt or hash: " + ex.getMessage()); } @@ -281,9 +494,9 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme PasswordHash myHash; try { myHash = new PasswordHash( - ePerson.getDigestAlgorithm(), - ePerson.getSalt(), - ePerson.getPassword()); + ePerson.getDigestAlgorithm(), + ePerson.getSalt(), + ePerson.getPassword()); } catch (DecoderException ex) { log.error(ex.getMessage()); return false; @@ -312,8 +525,8 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme // Check authorisation - if you're not the eperson // see if the authorization system says you can if (!context.ignoreAuthorization() - && ((context.getCurrentUser() == null) || (ePerson.getID() != context - .getCurrentUser().getID()))) { + && ((context.getCurrentUser() == null) || (ePerson.getID() != context + .getCurrentUser().getID()))) { authorizeService.authorizeAction(context, ePerson, Constants.WRITE); } @@ -322,11 +535,11 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme ePersonDAO.save(context, ePerson); log.info(LogManager.getHeader(context, "update_eperson", - "eperson_id=" + ePerson.getID())); + "eperson_id=" + 
ePerson.getID())); if (ePerson.isModified()) { context.addEvent(new Event(Event.MODIFY, Constants.EPERSON, - ePerson.getID(), null, getIdentifiers(context, ePerson))); + ePerson.getID(), null, getIdentifiers(context, ePerson))); ePerson.clearModified(); } if (ePerson.isMetadataModified()) { @@ -339,11 +552,22 @@ public class EPersonServiceImpl extends DSpaceObjectServiceImpl impleme List tableList = new ArrayList(); // check for eperson in item table - Iterator itemsBySubmitter = itemService.findBySubmitter(context, ePerson); + Iterator itemsBySubmitter = itemService.findBySubmitter(context, ePerson, true); if (itemsBySubmitter.hasNext()) { tableList.add("item"); } + WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance().getWorkspaceItemService(); + List workspaceBySubmitter = workspaceItemService.findByEPerson(context, ePerson); + if (workspaceBySubmitter.size() > 0) { + tableList.add("workspaceitem"); + } + + ResourcePolicyService resourcePolicyService = AuthorizeServiceFactory.getInstance().getResourcePolicyService(); + if (resourcePolicyService.find(context, ePerson).size() > 0) { + tableList.add("resourcepolicy"); + } + WorkflowService workflowService = WorkflowServiceFactory.getInstance().getWorkflowService(); List workflowConstraints = workflowService.getEPersonDeleteConstraints(context, ePerson); tableList.addAll(workflowConstraints); diff --git a/dspace-api/src/main/java/org/dspace/eperson/GroupServiceImpl.java b/dspace-api/src/main/java/org/dspace/eperson/GroupServiceImpl.java index 4437516315..71fbcce7d3 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/GroupServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/eperson/GroupServiceImpl.java @@ -42,8 +42,15 @@ import org.dspace.eperson.service.EPersonService; import org.dspace.eperson.service.GroupService; import org.dspace.event.Event; import org.dspace.util.UUIDUtils; +import org.dspace.xmlworkflow.Role; +import org.dspace.xmlworkflow.factory.XmlWorkflowFactory; +import org.dspace.xmlworkflow.state.Step; +import org.dspace.xmlworkflow.storedcomponents.ClaimedTask; import org.dspace.xmlworkflow.storedcomponents.CollectionRole; +import org.dspace.xmlworkflow.storedcomponents.PoolTask; +import org.dspace.xmlworkflow.storedcomponents.service.ClaimedTaskService; import org.dspace.xmlworkflow.storedcomponents.service.CollectionRoleService; +import org.dspace.xmlworkflow.storedcomponents.service.PoolTaskService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; @@ -81,6 +88,13 @@ public class GroupServiceImpl extends DSpaceObjectServiceImpl implements @Autowired(required = true) protected ResourcePolicyService resourcePolicyService; + @Autowired(required = true) + protected PoolTaskService poolTaskService; + @Autowired(required = true) + protected ClaimedTaskService claimedTaskService; + @Autowired(required = true) + protected XmlWorkflowFactory workflowFactory; + protected GroupServiceImpl() { super(); } @@ -143,8 +157,48 @@ public class GroupServiceImpl extends DSpaceObjectServiceImpl implements groupChild.getName(), getIdentifiers(context, groupParent))); } + /** + * Removes a member of a group. + * The removal will be refused if the group is linked to a workflow step which has claimed tasks or pool tasks + * and no other member is present in the group to handle these. 
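A sketch of what callers of removeMember now have to expect when a workflow group would be left without members, per the guard described just above (illustration only, not part of the patch; the context, group and EPerson variables are assumed, and the newly declared SQLException is left to propagate from the enclosing method):

    try {
        groupService.removeMember(context, reviewerGroup, lastReviewer);
        groupService.update(context, reviewerGroup);
    } catch (IllegalStateException e) {
        // the group backs a workflow step that still has claimed or pooled tasks and
        // no other member: finish or reassign those tasks, or add another member, then retry
        log.warn(e.getMessage());
    }
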
+ * @param context DSpace context object + * @param group DSpace group + * @param ePerson eperson + * @throws SQLException + */ @Override - public void removeMember(Context context, Group group, EPerson ePerson) { + public void removeMember(Context context, Group group, EPerson ePerson) throws SQLException { + List collectionRoles = collectionRoleService.findByGroup(context, group); + if (!collectionRoles.isEmpty()) { + List poolTasks = poolTaskService.findByGroup(context, group); + List claimedTasks = claimedTaskService.findByEperson(context, ePerson); + for (ClaimedTask claimedTask : claimedTasks) { + Step stepByName = workflowFactory.getStepByName(claimedTask.getStepID()); + Role role = stepByName.getRole(); + for (CollectionRole collectionRole : collectionRoles) { + if (StringUtils.equals(collectionRole.getRoleId(), role.getId()) + && claimedTask.getWorkflowItem().getCollection() == collectionRole.getCollection()) { + List ePeople = allMembers(context, group); + if (ePeople.size() == 1 && ePeople.contains(ePerson)) { + throw new IllegalStateException( + "Refused to remove user " + ePerson + .getID() + " from workflow group because the group " + group + .getID() + " has tasks assigned and no other members"); + } + + } + } + } + if (!poolTasks.isEmpty()) { + List ePeople = allMembers(context, group); + if (ePeople.size() == 1 && ePeople.contains(ePerson)) { + throw new IllegalStateException( + "Refused to remove user " + ePerson + .getID() + " from workflow group because the group " + group + .getID() + " has tasks assigned and no other members"); + } + } + } if (group.remove(ePerson)) { context.addEvent(new Event(Event.REMOVE, Constants.GROUP, group.getID(), Constants.EPERSON, ePerson.getID(), ePerson.getEmail(), getIdentifiers(context, group))); @@ -153,6 +207,20 @@ public class GroupServiceImpl extends DSpaceObjectServiceImpl implements @Override public void removeMember(Context context, Group groupParent, Group childGroup) throws SQLException { + List collectionRoles = collectionRoleService.findByGroup(context, groupParent); + if (!collectionRoles.isEmpty()) { + List poolTasks = poolTaskService.findByGroup(context, groupParent); + if (!poolTasks.isEmpty()) { + List parentPeople = allMembers(context, groupParent); + List childPeople = allMembers(context, childGroup); + if (childPeople.containsAll(parentPeople)) { + throw new IllegalStateException( + "Refused to remove sub group " + childGroup + .getID() + " from workflow group because the group " + groupParent + .getID() + " has tasks assigned and no other members"); + } + } + } if (groupParent.remove(childGroup)) { childGroup.removeParentGroup(groupParent); context.addEvent( @@ -189,7 +257,8 @@ public class GroupServiceImpl extends DSpaceObjectServiceImpl implements return false; // special, everyone is member of group 0 (anonymous) - } else if (StringUtils.equals(group.getName(), Group.ANONYMOUS)) { + } else if (StringUtils.equals(group.getName(), Group.ANONYMOUS) || + isParentOf(context, group, findByName(context, Group.ANONYMOUS))) { return true; } else { diff --git a/dspace-api/src/main/java/org/dspace/eperson/service/AccountService.java b/dspace-api/src/main/java/org/dspace/eperson/service/AccountService.java index c8ecb0cc67..45fa6d26b1 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/service/AccountService.java +++ b/dspace-api/src/main/java/org/dspace/eperson/service/AccountService.java @@ -46,4 +46,11 @@ public interface AccountService { public void deleteToken(Context context, String token) throws SQLException; + + 
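The AccountService interface gains the password-rule check declared just below; judging from the implementation earlier in this patch it currently only enforces a minimum length of six characters. A sketch of the intended call site (illustration only, not part of the patch; variables are assumed and checked exceptions from the update call are omitted):

    if (!accountService.verifyPasswordStructure(newPassword)) {
        throw new IllegalArgumentException("The supplied password does not satisfy the DSpace password rules");
    }
    ePersonService.setPassword(eperson, newPassword);
    ePersonService.update(context, eperson);
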
/** + * This method verifies that a certain String adheres to the password rules for DSpace + * @param password The String to be checked + * @return A boolean indicating whether or not the given String adheres to the password rules + */ + public boolean verifyPasswordStructure(String password); } diff --git a/dspace-api/src/main/java/org/dspace/eperson/service/GroupService.java b/dspace-api/src/main/java/org/dspace/eperson/service/GroupService.java index f750419af1..b49ee857fb 100644 --- a/dspace-api/src/main/java/org/dspace/eperson/service/GroupService.java +++ b/dspace-api/src/main/java/org/dspace/eperson/service/GroupService.java @@ -76,7 +76,7 @@ public interface GroupService extends DSpaceObjectService, DSpaceObjectLe * @param group DSpace group * @param ePerson eperson */ - public void removeMember(Context context, Group group, EPerson ePerson); + public void removeMember(Context context, Group group, EPerson ePerson) throws SQLException; /** diff --git a/dspace-api/src/main/java/org/dspace/external/provider/impl/LiveImportDataProvider.java b/dspace-api/src/main/java/org/dspace/external/provider/impl/LiveImportDataProvider.java new file mode 100644 index 0000000000..45855a74ad --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/external/provider/impl/LiveImportDataProvider.java @@ -0,0 +1,162 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.external.provider.impl; + +import java.util.Collection; +import java.util.List; +import java.util.Optional; +import java.util.stream.Collectors; + +import org.apache.commons.lang3.StringUtils; +import org.dspace.content.dto.MetadataValueDTO; +import org.dspace.external.model.ExternalDataObject; +import org.dspace.external.provider.ExternalDataProvider; +import org.dspace.importer.external.datamodel.ImportRecord; +import org.dspace.importer.external.exception.MetadataSourceException; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; +import org.dspace.importer.external.service.components.QuerySource; + +/** + * This class allows a Live Import Provider to be configured as an External Data Provider + * + * @author Andrea Bollini (andrea.bollini at 4science.it) + * + */ +public class LiveImportDataProvider implements ExternalDataProvider { + /** + * The {@link QuerySource} live import provider + */ + private QuerySource querySource; + + /** + * A unique human-readable identifier for this provider + */ + private String sourceIdentifier; + + private String recordIdMetadata; + + private String displayMetadata = "dc.title"; + + @Override + public String getSourceIdentifier() { + return sourceIdentifier; + } + + /** + * This method sets the SourceIdentifier for the ExternalDataProvider + * @param sourceIdentifier The UNIQUE sourceIdentifier to be set on any LiveImport data provider + */ + public void setSourceIdentifier(String sourceIdentifier) { + this.sourceIdentifier = sourceIdentifier; + } + + /** + * This method sets the MetadataSource for the ExternalDataProvider + * @param querySource {@link org.dspace.importer.external.service.components.MetadataSource} implementation used to process the input data + */ + public void setMetadataSource(QuerySource querySource) { + this.querySource = querySource; + } + + /** + * This method sets the Dublin Core identifier to use as
metadata id + */ + public void setRecordIdMetadata(String recordIdMetadata) { + this.recordIdMetadata = recordIdMetadata; + } + + /** + * This method set the dublin core identifier to display the title + * @param displayMetadata metadata to use as title + */ + public void setDisplayMetadata(String displayMetadata) { + this.displayMetadata = displayMetadata; + } + + @Override + public Optional getExternalDataObject(String id) { + try { + ExternalDataObject externalDataObject = getExternalDataObject(querySource.getRecord(id)); + return Optional.of(externalDataObject); + } catch (MetadataSourceException e) { + throw new RuntimeException( + "The live import provider " + querySource.getImportSource() + " throws an exception", e); + } + } + + @Override + public List searchExternalDataObjects(String query, int start, int limit) { + Collection records; + try { + records = querySource.getRecords(query, start, limit); + return records.stream().map(r -> getExternalDataObject(r)).collect(Collectors.toList()); + } catch (MetadataSourceException e) { + throw new RuntimeException( + "The live import provider " + querySource.getImportSource() + " throws an exception", e); + } + } + + @Override + public boolean supports(String source) { + return StringUtils.equalsIgnoreCase(sourceIdentifier, source); + } + + @Override + public int getNumberOfResults(String query) { + try { + return querySource.getRecordsCount(query); + } catch (MetadataSourceException e) { + throw new RuntimeException( + "The live import provider " + querySource.getImportSource() + " throws an exception", e); + } + } + + /** + * Internal method to convert an ImportRecord to an ExternalDataObject + * + * FIXME it would be useful to remove ImportRecord at all in favor of the + * ExternalDataObject + * + * @param record + * @return + */ + private ExternalDataObject getExternalDataObject(ImportRecord record) { + //return 400 if no record were found + if (record == null) { + throw new IllegalArgumentException("No record found for query or id"); + } + ExternalDataObject externalDataObject = new ExternalDataObject(sourceIdentifier); + String id = getFirstValue(record, recordIdMetadata); + String display = getFirstValue(record, displayMetadata); + externalDataObject.setId(id); + externalDataObject.setDisplayValue(display); + externalDataObject.setValue(display); + for (MetadatumDTO dto : record.getValueList()) { + // FIXME it would be useful to remove MetadatumDTO in favor of MetadataValueDTO + MetadataValueDTO mvDTO = new MetadataValueDTO(); + mvDTO.setSchema(dto.getSchema()); + mvDTO.setElement(dto.getElement()); + mvDTO.setQualifier(dto.getQualifier()); + mvDTO.setValue(dto.getValue()); + externalDataObject.addMetadata(mvDTO); + } + return externalDataObject; + } + + private String getFirstValue(ImportRecord record, String metadata) { + String id = null; + String[] split = StringUtils.split(metadata, ".", 3); + Collection values = record.getValue(split[0], split[1], split.length == 3 ? 
split[2] : null); + if (!values.isEmpty()) { + id = (values.iterator().next().getValue()); + } + return id; + } + +} diff --git a/dspace-api/src/main/java/org/dspace/harvest/HarvestScheduler.java b/dspace-api/src/main/java/org/dspace/harvest/HarvestScheduler.java index d668b09bc4..5d0545845c 100644 --- a/dspace-api/src/main/java/org/dspace/harvest/HarvestScheduler.java +++ b/dspace-api/src/main/java/org/dspace/harvest/HarvestScheduler.java @@ -134,11 +134,13 @@ public class HarvestScheduler implements Runnable { if (maxActiveThreads == 0) { maxActiveThreads = 3; } - minHeartbeat = ConfigurationManager.getIntProperty("oai", "harvester.minHeartbeat") * 1000; + minHeartbeat = ConfigurationManager.getIntProperty("oai", "harvester.minHeartbeat"); + minHeartbeat = minHeartbeat * 1000; // multiple by 1000 to turn seconds to ms if (minHeartbeat == 0) { minHeartbeat = 30000; } - maxHeartbeat = ConfigurationManager.getIntProperty("oai", "harvester.maxHeartbeat") * 1000; + maxHeartbeat = ConfigurationManager.getIntProperty("oai", "harvester.maxHeartbeat"); + maxHeartbeat = maxHeartbeat * 1000; // multiple by 1000 to turn seconds to ms if (maxHeartbeat == 0) { maxHeartbeat = 3600000; } diff --git a/dspace-api/src/main/java/org/dspace/identifier/DOIIdentifierProvider.java b/dspace-api/src/main/java/org/dspace/identifier/DOIIdentifierProvider.java index 46bc317d13..9db4402007 100644 --- a/dspace-api/src/main/java/org/dspace/identifier/DOIIdentifierProvider.java +++ b/dspace-api/src/main/java/org/dspace/identifier/DOIIdentifierProvider.java @@ -761,9 +761,9 @@ public class DOIIdentifierProvider Item item = (Item) dso; List metadata = itemService.getMetadata(item, MD_SCHEMA, DOI_ELEMENT, DOI_QUALIFIER, null); + String leftPart = DOI.RESOLVER + SLASH + getPrefix() + SLASH + getNamespaceSeparator(); for (MetadataValue id : metadata) { - if (id.getValue().startsWith( - DOI.RESOLVER + String.valueOf(SLASH) + PREFIX + String.valueOf(SLASH) + NAMESPACE_SEPARATOR)) { + if (id.getValue().startsWith(leftPart)) { return doiService.DOIFromExternalFormat(id.getValue()); } } diff --git a/dspace-api/src/main/java/org/dspace/importer/external/arxiv/metadatamapping/ArXivFieldMapping.java b/dspace-api/src/main/java/org/dspace/importer/external/arxiv/metadatamapping/ArXivFieldMapping.java new file mode 100644 index 0000000000..272b149015 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/arxiv/metadatamapping/ArXivFieldMapping.java @@ -0,0 +1,37 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.arxiv.metadatamapping; + +import java.util.Map; +import javax.annotation.Resource; + +import org.dspace.importer.external.metadatamapping.AbstractMetadataFieldMapping; + +/** + * An implementation of {@link AbstractMetadataFieldMapping} + * Responsible for defining the mapping of the ArXiv metadatum fields on the DSpace metadatum fields + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public class ArXivFieldMapping extends AbstractMetadataFieldMapping { + + /** + * Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + * only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + * what metadatafield is generated. 
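The ArXiv classes introduced below are meant to be plugged into the LiveImportDataProvider added above; a sketch of that wiring using only the setters from this patch (illustration only, normally done in Spring configuration; the identifier, the metadata field and the arXivImportService variable are assumptions):

    LiveImportDataProvider arxivProvider = new LiveImportDataProvider();
    arxivProvider.setSourceIdentifier("arxiv");                  // assumed unique identifier
    arxivProvider.setMetadataSource(arXivImportService);         // an ArXivImportMetadataSourceServiceImpl instance
    arxivProvider.setRecordIdMetadata("dc.identifier.other");    // assumed metadata field holding the record id
    arxivProvider.setDisplayMetadata("dc.title");                // matches the class default
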
+ * + * @param metadataFieldMap The map containing the link between retrieve metadata and metadata that will be set to + * the item. + */ + @Override + @Resource(name = "arxivMetadataFieldMap") + public void setMetadataFieldMap(Map metadataFieldMap) { + super.setMetadataFieldMap(metadataFieldMap); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/arxiv/metadatamapping/contributor/ArXivIdMetadataContributor.java b/dspace-api/src/main/java/org/dspace/importer/external/arxiv/metadatamapping/contributor/ArXivIdMetadataContributor.java new file mode 100644 index 0000000000..ed5ac5960b --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/arxiv/metadatamapping/contributor/ArXivIdMetadataContributor.java @@ -0,0 +1,60 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.arxiv.metadatamapping.contributor; + +import java.util.Collection; + +import org.apache.axiom.om.OMElement; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; +import org.dspace.importer.external.metadatamapping.contributor.MetadataContributor; +import org.dspace.importer.external.metadatamapping.contributor.SimpleXpathMetadatumContributor; + +/** + * Arxiv specific implementation of {@link MetadataContributor} + * Responsible for generating the ArXiv Id from the retrieved item. + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ +public class ArXivIdMetadataContributor extends SimpleXpathMetadatumContributor { + + /** + * Retrieve the metadata associated with the given object. + * Depending on the retrieved node (using the query), different types of values will be added to the MetadatumDTO + * list + * + * @param t A class to retrieve metadata from. + * @return a collection of import records. Only the identifier of the found records may be put in the record. + */ + @Override + public Collection contributeMetadata(OMElement t) { + Collection values = super.contributeMetadata(t); + parseValue(values); + return values; + } + + /** + * ArXiv returns a full URL as in the value, e.g. http://arxiv.org/abs/1911.11405v1. + * This method parses out the identifier from the end of the URL, e.g. 1911.11405v1. 
+ * + * @param dtos Metadata which contains the items uri + */ + private void parseValue(Collection dtos) { + if (dtos != null) { + for (MetadatumDTO dto : dtos) { + if (dto != null && dto.getValue() != null && dto.getValue().contains("/")) { + int startIndex = dto.getValue().lastIndexOf('/') + 1; + int endIndex = dto.getValue().length(); + String id = dto.getValue().substring(startIndex, endIndex); + dto.setValue(id); + } + } + } + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/arxiv/service/ArXivImportMetadataSourceServiceImpl.java b/dspace-api/src/main/java/org/dspace/importer/external/arxiv/service/ArXivImportMetadataSourceServiceImpl.java new file mode 100644 index 0000000000..6b418423fa --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/arxiv/service/ArXivImportMetadataSourceServiceImpl.java @@ -0,0 +1,421 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.arxiv.service; + +import java.io.StringReader; +import java.util.ArrayList; +import java.util.Collection; +import java.util.List; +import java.util.concurrent.Callable; +import javax.el.MethodNotFoundException; +import javax.ws.rs.client.Client; +import javax.ws.rs.client.ClientBuilder; +import javax.ws.rs.client.Invocation; +import javax.ws.rs.client.WebTarget; +import javax.ws.rs.core.MediaType; +import javax.ws.rs.core.Response; + +import org.apache.axiom.om.OMElement; +import org.apache.axiom.om.OMXMLBuilderFactory; +import org.apache.axiom.om.OMXMLParserWrapper; +import org.apache.axiom.om.xpath.AXIOMXPath; +import org.apache.commons.lang3.StringUtils; +import org.dspace.content.Item; +import org.dspace.importer.external.datamodel.ImportRecord; +import org.dspace.importer.external.datamodel.Query; +import org.dspace.importer.external.exception.MetadataSourceException; +import org.dspace.importer.external.service.AbstractImportMetadataSourceService; +import org.dspace.importer.external.service.components.QuerySource; +import org.jaxen.JaxenException; + +/** + * Implements a data source for querying ArXiv + * + * @author Pasquale Cavallo (pasquale.cavallo at 4Science dot it) + * + */ +public class ArXivImportMetadataSourceServiceImpl extends AbstractImportMetadataSourceService + implements QuerySource { + + private WebTarget webTarget; + private String baseAddress; + + /** + * Find the number of records matching the query string in ArXiv. Supports pagination. + * + * @param query a query string to base the search on. + * @param start offset to start at + * @param count number of records to retrieve. + * @return a set of records. Fully transformed. + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + @Override + public Collection getRecords(String query, int start, int count) throws MetadataSourceException { + return retry(new SearchByQueryCallable(query, count, start)); + } + + /** + * Find records based on a object query and convert them to a list metadata mapped in ImportRecord. + * The entry with the key "query" of the Query's map will be used as query string value. + * + * @see org.dspace.importer.external.datamodel.Query + * @see org.dspace.importer.external.datamodel.ImportRecord + * @param query a query object to base the search on. + * @return a set of records. Fully transformed. 
+ * @throws MetadataSourceException if the underlying methods throw any exception. + */ + @Override + public Collection getRecords(Query query) throws MetadataSourceException { + return retry(new SearchByQueryCallable(query)); + } + + /** + * Find the number of records matching the query string in ArXiv. + * + * @param query a query string to base the search on. + * @return the sum of the matching records over this import source + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + @Override + public int getRecordsCount(String query) throws MetadataSourceException { + return retry(new CountByQueryCallable(query)); + } + + + /** + * Find the number of records matching a query. + * The entry with the key "query" of the Query's map will be used to get the query string. + * + * @see org.dspace.importer.external.datamodel.Query + * @param query a query object to base the search on. + * @return the sum of the matching records over this import source + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + @Override + public int getRecordsCount(Query query) throws MetadataSourceException { + return retry(new CountByQueryCallable(query)); + } + + /** + * Get a single record of metadata from ArXiv by ArXiv ID. + * + * @param id id of the record in ArXiv + * @return the first matching record + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + + @Override + public ImportRecord getRecord(String id) throws MetadataSourceException { + List records = retry(new SearchByIdCallable(id)); + return records == null || records.isEmpty() ? null : records.get(0); + } + + /** + * Get a single record from ArXiv matching the query. + * The field "query" will be used to get the data from. + * + * @see org.dspace.importer.external.datamodel.Query + * @param query a query matching a single record + * @return the first matching record + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + @Override + public ImportRecord getRecord(Query query) throws MetadataSourceException { + List records = retry(new SearchByIdCallable(query)); + return records == null || records.isEmpty() ? null : records.get(0); + } + + /** + * Initialize the class + * + * @throws Exception on generic exception + */ + @Override + public void init() throws Exception { + Client client = ClientBuilder.newClient(); + webTarget = client.target(baseAddress); + } + + /** + * The string that identifies this import implementation. Preferably a URI. + * + * @return the identifying uri + */ + @Override + public String getImportSource() { + return "arxiv"; + } + + /** + * This method is expected to be unused and removed from the interface soon + */ + @Override + public Collection findMatchingRecords(Item item) throws MetadataSourceException { + // FIXME: do we need this method? + throw new MethodNotFoundException("This method is not implemented for ArXiv"); + } + + /** + * Finds records based on a query object. + * Supports search by title and/or author. + * + * @param query a query object to base the search on. + * @return a collection of import records. + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + @Override + public Collection findMatchingRecords(Query query) throws MetadataSourceException { + return retry(new FindMatchingRecordCallable(query)); + } + + /** + * This class is a Callable implementation to count the number of entries for an ArXiv + * query.
+ * This Callable use as query value to ArXiv the string queryString passed to constructor. + * If the object will be construct through Query.class instance, the value of the Query's + * map with the key "query" will be used. + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ + private class CountByQueryCallable implements Callable { + private Query query; + + + private CountByQueryCallable(String queryString) { + query = new Query(); + query.addParameter("query", queryString); + } + + private CountByQueryCallable(Query query) { + this.query = query; + } + + + @Override + public Integer call() throws Exception { + String queryString = query.getParameterAsClass("query", String.class); + Integer start = query.getParameterAsClass("start", Integer.class); + Integer maxResult = query.getParameterAsClass("count", Integer.class); + WebTarget local = webTarget.queryParam("search_query", queryString); + if (maxResult != null) { + local = local.queryParam("max_results", String.valueOf(maxResult)); + } + if (start != null) { + local = local.queryParam("start", String.valueOf(start)); + } + Invocation.Builder invocationBuilder = local.request(MediaType.TEXT_PLAIN_TYPE); + Response response = invocationBuilder.get(); + if (response.getStatus() == 200) { + String responseString = response.readEntity(String.class); + OMXMLParserWrapper records = OMXMLBuilderFactory.createOMBuilder(new StringReader(responseString)); + OMElement element = records.getDocumentElement(); + AXIOMXPath xpath = null; + try { + xpath = new AXIOMXPath("opensearch:totalResults"); + xpath.addNamespace("opensearch", "http://a9.com/-/spec/opensearch/1.1/"); + OMElement count = (OMElement) xpath.selectSingleNode(element); + return Integer.parseInt(count.getText()); + } catch (JaxenException e) { + return null; + } + } else { + return null; + } + } + } + + /** + * This class is a Callable implementation to get ArXiv entries based on + * query object. + * This Callable use as query value the string queryString passed to constructor. + * If the object will be construct through Query.class instance, a Query's map entry with key "query" will be used. + * Pagination is supported too, using the value of the Query's map with keys "start" and "count". 
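+ * A minimal usage sketch (illustrative only; the search expression and paging values are hypothetical,
+ * while {@code addParameter} and {@code retry} are the methods already used in this class):
+ * <pre>
+ * Query query = new Query();
+ * query.addParameter("query", "all:electron");  // the search expression sent to ArXiv
+ * query.addParameter("start", 0);               // offset of the first record to return
+ * query.addParameter("count", 20);              // maximum number of records to return
+ * List records = retry(new SearchByQueryCallable(query));
+ * </pre>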
+ * + * @see org.dspace.importer.external.datamodel.Query + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ + private class SearchByQueryCallable implements Callable> { + private Query query; + + + private SearchByQueryCallable(String queryString, Integer maxResult, Integer start) { + query = new Query(); + query.addParameter("query", queryString); + query.addParameter("start", start); + query.addParameter("count", maxResult); + } + + private SearchByQueryCallable(Query query) { + this.query = query; + } + + + @Override + public List call() throws Exception { + List results = new ArrayList(); + String queryString = query.getParameterAsClass("query", String.class); + Integer start = query.getParameterAsClass("start", Integer.class); + Integer maxResult = query.getParameterAsClass("count", Integer.class); + WebTarget local = webTarget.queryParam("search_query", queryString); + if (maxResult != null) { + local = local.queryParam("max_results", String.valueOf(maxResult)); + } + if (start != null) { + local = local.queryParam("start", String.valueOf(start)); + } + Invocation.Builder invocationBuilder = local.request(MediaType.TEXT_PLAIN_TYPE); + Response response = invocationBuilder.get(); + if (response.getStatus() == 200) { + String responseString = response.readEntity(String.class); + List omElements = splitToRecords(responseString); + for (OMElement record : omElements) { + results.add(transformSourceRecords(record)); + } + return results; + } else { + return null; + } + } + } + + /** + * This class is a Callable implementation to get an ArXiv entry using ArXiv ID + * The ID to use can be passed through the constructor as a String or as Query's map entry, with the key "id". + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ + private class SearchByIdCallable implements Callable> { + private Query query; + + private SearchByIdCallable(Query query) { + this.query = query; + } + + private SearchByIdCallable(String id) { + this.query = new Query(); + query.addParameter("id", id); + } + + @Override + public List call() throws Exception { + List results = new ArrayList(); + String arxivid = query.getParameterAsClass("id", String.class); + if (StringUtils.isNotBlank(arxivid)) { + arxivid = arxivid.trim(); + if (arxivid.startsWith("http://arxiv.org/abs/")) { + arxivid = arxivid.substring("http://arxiv.org/abs/".length()); + } else if (arxivid.toLowerCase().startsWith("arxiv:")) { + arxivid = arxivid.substring("arxiv:".length()); + } + } + WebTarget local = webTarget.queryParam("id_list", arxivid); + Invocation.Builder invocationBuilder = local.request(MediaType.TEXT_PLAIN_TYPE); + Response response = invocationBuilder.get(); + if (response.getStatus() == 200) { + String responseString = response.readEntity(String.class); + List omElements = splitToRecords(responseString); + for (OMElement record : omElements) { + results.add(transformSourceRecords(record)); + } + return results; + } else { + return null; + } + } + } + + /** + * This class is a Callable implementation to search ArXiv entries + * using author and title. + * There are two field in the Query map to pass, with keys "title" and "author" + * (at least one must be used). 
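+ * For example (an illustrative sketch; the title and author values are hypothetical), a query built as
+ * <pre>
+ * Query query = new Query();
+ * query.addParameter("title", "Quantum entanglement");
+ * query.addParameter("author", "Rossi");
+ * </pre>
+ * would be translated by getQuery() into the ArXiv expression
+ * {@code ti:"Quantum entanglement" AND au:"Rossi"}.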
+ * + * @see org.dspace.importer.external.datamodel.Query + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ + private class FindMatchingRecordCallable implements Callable> { + + private Query query; + + private FindMatchingRecordCallable(Query q) { + query = q; + } + + @Override + public List call() throws Exception { + String queryString = getQuery(this.query); + List results = new ArrayList(); + WebTarget local = webTarget.queryParam("search_query", queryString); + Invocation.Builder invocationBuilder = local.request(MediaType.TEXT_PLAIN_TYPE); + Response response = invocationBuilder.get(); + if (response.getStatus() == 200) { + String responseString = response.readEntity(String.class); + List omElements = splitToRecords(responseString); + for (OMElement record : omElements) { + results.add(transformSourceRecords(record)); + } + return results; + } else { + return null; + } + } + + private String getQuery(Query query) { + String title = query.getParameterAsClass("title", String.class); + String author = query.getParameterAsClass("author", String.class); + StringBuffer queryString = new StringBuffer(); + if (StringUtils.isNotBlank(title)) { + queryString.append("ti:\"").append(title).append("\""); + } + if (StringUtils.isNotBlank(author)) { + // [FAU] + if (queryString.length() > 0) { + queryString.append(" AND "); + } + queryString.append("au:\"").append(author).append("\""); + } + return queryString.toString(); + } + } + + private List splitToRecords(String recordsSrc) { + OMXMLParserWrapper records = OMXMLBuilderFactory.createOMBuilder(new StringReader(recordsSrc)); + OMElement element = records.getDocumentElement(); + AXIOMXPath xpath = null; + try { + xpath = new AXIOMXPath("ns:entry"); + xpath.addNamespace("ns", "http://www.w3.org/2005/Atom"); + List recordsList = xpath.selectNodes(element); + return recordsList; + } catch (JaxenException e) { + return null; + } + } + + /** + * Return the baseAddress set to this object + * + * @return The String object that represents the baseAddress of this object + */ + public String getBaseAddress() { + return baseAddress; + } + + /** + * Set the baseAddress to this object + * + * @param baseAddress The String object that represents the baseAddress of this object + */ + public void setBaseAddress(String baseAddress) { + this.baseAddress = baseAddress; + } +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/bibtex/service/BibtexImportMetadataSourceServiceImpl.java b/dspace-api/src/main/java/org/dspace/importer/external/bibtex/service/BibtexImportMetadataSourceServiceImpl.java new file mode 100644 index 0000000000..7468d601f5 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/bibtex/service/BibtexImportMetadataSourceServiceImpl.java @@ -0,0 +1,107 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.bibtex.service; + +import java.io.IOException; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.io.Reader; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; +import java.util.Map.Entry; +import javax.annotation.Resource; + +import org.dspace.importer.external.exception.FileSourceException; +import org.dspace.importer.external.service.components.AbstractPlainMetadataSource; +import 
org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; +import org.jbibtex.BibTeXDatabase; +import org.jbibtex.BibTeXEntry; +import org.jbibtex.BibTeXParser; +import org.jbibtex.Key; +import org.jbibtex.ParseException; +import org.jbibtex.Value; + +/** + * Implements a metadata importer for BibTeX files + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public class BibtexImportMetadataSourceServiceImpl extends AbstractPlainMetadataSource { + + + /** + * The string that identifies this import implementation as + * MetadataSource implementation + * + * @return the identifying uri + */ + @Override + public String getImportSource() { + return "BibTeXMetadataSource"; + } + + @Override + protected List readData (InputStream + inputStream) throws FileSourceException { + List list = new ArrayList<>(); + BibTeXDatabase database; + try { + database = parseBibTex(inputStream); + } catch (IOException | ParseException e) { + throw new FileSourceException("Unable to parse file with BibTeX parser"); + } + if (database == null || database.getEntries() == null) { + throw new FileSourceException("File results in an empty list of metadata"); + } + if (database.getEntries() != null) { + for (Entry entry : database.getEntries().entrySet()) { + PlainMetadataSourceDto item = new PlainMetadataSourceDto(); + List keyValues = new ArrayList<>(); + item.setMetadata(keyValues); + PlainMetadataKeyValueItem keyValueItem = new PlainMetadataKeyValueItem(); + keyValueItem.setKey(entry.getValue().getType().getValue()); + keyValueItem.setValue(entry.getKey().getValue()); + keyValues.add(keyValueItem); + if (entry.getValue().getFields() != null) { + for (Entry subentry : entry.getValue().getFields().entrySet()) { + PlainMetadataKeyValueItem innerItem = new PlainMetadataKeyValueItem(); + innerItem.setKey(subentry.getKey().getValue()); + innerItem.setValue(subentry.getValue().toUserString()); + keyValues.add(innerItem); + } + } + list.add(item); + } + } + return list; + } + + private BibTeXDatabase parseBibTex(InputStream inputStream) throws IOException, ParseException { + Reader reader = new InputStreamReader(inputStream); + BibTeXParser bibtexParser = new BibTeXParser(); + return bibtexParser.parse(reader); + } + + + /** + * Retrieve the MetadataFieldMapping containing the mapping between RecordType + * (in this case PlainMetadataSourceDto.class) and Metadata + * + * @return The configured MetadataFieldMapping + */ + @Override + @SuppressWarnings("unchecked") + @Resource(name = "bibtexMetadataFieldMap") + public void setMetadataFieldMap(@SuppressWarnings("rawtypes") Map metadataFieldMap) { + super.setMetadataFieldMap(metadataFieldMap); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/csv/service/CharacterSeparatedImportMetadataSourceServiceImpl.java b/dspace-api/src/main/java/org/dspace/importer/external/csv/service/CharacterSeparatedImportMetadataSourceServiceImpl.java new file mode 100644 index 0000000000..31ee1e5e5a --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/csv/service/CharacterSeparatedImportMetadataSourceServiceImpl.java @@ -0,0 +1,154 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.csv.service; + +import 
java.io.IOException; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.nio.charset.StandardCharsets; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; + +import au.com.bytecode.opencsv.CSVReader; +import org.dspace.importer.external.exception.FileSourceException; +import org.dspace.importer.external.metadatamapping.MetadataFieldConfig; +import org.dspace.importer.external.metadatamapping.contributor.MetadataContributor; +import org.dspace.importer.external.service.components.AbstractPlainMetadataSource; +import org.dspace.importer.external.service.components.MetadataSource; +import org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; + + +/** + * This class is an implementation of {@link MetadataSource} which extends {@link AbstractPlainMetadataSource} + * in order to parse "character separated" files such as CSV, TSV, etc. using the Live Import framework. + * + * @author Pasquale Cavallo + * + */ +public class CharacterSeparatedImportMetadataSourceServiceImpl extends AbstractPlainMetadataSource { + + private char separator = ','; + + private char escapeCharacter = '"'; + + private Integer skipLines = 1; + + private String importSource = "CsvMetadataSource"; + + /** + * Set the number of lines to skip at the start of the file. This method is suitable, + * for example, to skip file headers. + * + * @param skipLines number of lines at the start of the file to skip. + */ + public void setSkipLines(Integer skipLines) { + this.skipLines = skipLines; + } + + /** + * + * @return the number of lines to skip + */ + public Integer getSkipLines() { + return skipLines; + } + + /** + * Method to inject the separator. + * This must be the ASCII integer + * related to the char, + * for example 9 for tab, 44 for comma. + */ + public void setSeparator(char separator) { + this.separator = separator; + } + + @Override + public String getImportSource() { + return importSource; + } + + /** + * Method to set the name of the source + */ + public void setImportSource(String importSource) { + this.importSource = importSource; + } + + /** + * Method to inject the escape character. This must be the ASCII integer + * related to the char, + * for example 9 for tab, 44 for comma. + * + */ + public void setEscapeCharacter(char escapeCharacter) { + this.escapeCharacter = escapeCharacter; + } + + /** + * This method processes any kind of "character separated" file, such as CSV, TSV, and so on. + * It returns a List of PlainMetadataSourceDto. + * Through the superclass methods AbstractPlainMetadataSource.getRecord(s), each of these + * elements will then be converted into an {@link org.dspace.importer.external.datamodel.ImportRecord}. + + * Columns are identified by their position, using zero-based notation. + * The separator and escape characters MUST be defined at class level. The number of lines to skip (headers) + * can also be defined in the field skipLines.
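+ * For example (an illustrative sketch, assuming the default comma separator and double-quote escape
+ * character), a non-header row such as {@code Smith J.,2020,"A sample, quoted title"} is mapped to three
+ * key/value items with keys "0", "1" and "2" and values "Smith J.", "2020" and "A sample, quoted title".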
+ * + * @param inputStream the inputStream of the file + * @return A list of PlainMetadataSourceDto + * @throws FileSourceException if, for any reason, the file is not parsable + + */ + @Override + protected List readData(InputStream inputStream) throws FileSourceException { + List plainMetadataList = new ArrayList<>(); + try (CSVReader csvReader = new CSVReader(new InputStreamReader(inputStream, StandardCharsets.UTF_8), + separator, escapeCharacter);) { + // read all rows + List lines = csvReader.readAll(); + int listSize = lines == null ? 0 : lines.size(); + int count = skipLines; + // iterate over rows (skipping the first skipLines) + while (count < listSize) { + String [] items = lines.get(count); + List keyValueList = new ArrayList<>(); + if (items != null) { + int size = items.length; + int index = 0; + // iterate over columns in the selected row + while (index < size) { + // create a key/value item for the specific row/column + PlainMetadataKeyValueItem keyValueItem = new PlainMetadataKeyValueItem(); + keyValueItem.setKey(String.valueOf(index)); + keyValueItem.setValue(items[index]); + keyValueList.add(keyValueItem); + index++; + } + // save all column key/values for the given row + PlainMetadataSourceDto dto = new PlainMetadataSourceDto(); + dto.setMetadata(keyValueList); + plainMetadataList.add(dto); + } + count++; + } + } catch (IOException e) { + throw new FileSourceException("Error reading file", e); + } + return plainMetadataList; + } + + @Override + public void setMetadataFieldMap(Map> metadataFieldMap) { + super.setMetadataFieldMap(metadataFieldMap); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/datamodel/Query.java b/dspace-api/src/main/java/org/dspace/importer/external/datamodel/Query.java index 8c5e1b394a..8f392bdb52 100644 --- a/dspace-api/src/main/java/org/dspace/importer/external/datamodel/Query.java +++ b/dspace-api/src/main/java/org/dspace/importer/external/datamodel/Query.java @@ -71,7 +71,7 @@ public class Query { return null; } else { Object o = c.iterator().next(); - if (clazz.isAssignableFrom(o.getClass())) { + if (o != null && clazz.isAssignableFrom(o.getClass())) { return (T) o; } else { return null; diff --git a/dspace-api/src/main/java/org/dspace/importer/external/endnote/service/EndnoteImportMetadataSourceServiceImpl.java b/dspace-api/src/main/java/org/dspace/importer/external/endnote/service/EndnoteImportMetadataSourceServiceImpl.java new file mode 100644 index 0000000000..9881832369 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/endnote/service/EndnoteImportMetadataSourceServiceImpl.java @@ -0,0 +1,140 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.endnote.service; + +import java.io.BufferedReader; +import java.io.IOException; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; +import java.util.regex.Matcher; +import java.util.regex.Pattern; + +import org.dspace.importer.external.exception.FileSourceException; +import org.dspace.importer.external.metadatamapping.MetadataFieldConfig; +import org.dspace.importer.external.metadatamapping.contributor.MetadataContributor; +import org.dspace.importer.external.service.components.AbstractPlainMetadataSource; +import
org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; + +/** + * Implements a metadata importer for Endnote files + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public class EndnoteImportMetadataSourceServiceImpl extends AbstractPlainMetadataSource { + + @Override + public String getImportSource() { + return "EndnoteMetadataSource"; + } + + /** + * This method maps the data present in the inputStream, then returns a list of PlainMetadataSourceDto. + * Each PlainMetadataSourceDto will be used to create a single {@link org.dspace.importer.external.datamodel.ImportRecord} + * + * @param inputStream the inputStream of the Endnote file + * @return List of {@link org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto} + * @throws FileSourceException + * @see org.dspace.importer.external.service.components.AbstractPlainMetadataSource + */ + @Override + protected List readData(InputStream fileInpuStream) throws FileSourceException { + List list = new ArrayList<>(); + try { + // rows start from 3, because the first 2 (FN and VR) are removed by tokenize + int lineForDebug = 3; + List tokenized = tokenize(fileInpuStream); + List tmpList = new ArrayList<>(); + // iterate over key/value pairs, create a new PlainMetadataSourceDto on "ER" rows (which means "new record") + // and stop on EF (end of file). + for (PlainMetadataKeyValueItem item : tokenized) { + if (item.getKey() == null || item.getKey().isEmpty()) { + throw new FileSourceException("Null or empty key expected on line " + + lineForDebug + ". Keys cannot be null nor empty"); + } + if ("EF".equals(item.getKey())) { + // end of file + break; + } + if ("ER".equals(item.getKey())) { + // a new ImportRecord starts from here (ER is a content delimiter) + // save the previous one, then create a new one + PlainMetadataSourceDto dto = new PlainMetadataSourceDto(); + dto.setMetadata(new ArrayList<>(tmpList)); + list.add(dto); + tmpList = new ArrayList<>(); + } else { + if (item.getValue() == null || item.getValue().isEmpty()) { + throw new FileSourceException("Null or empty value expected on line " + + lineForDebug + ". Value expected"); + } + tmpList.add(item); + } + lineForDebug++; + } + } catch (Exception e) { + throw new FileSourceException("Error reading file", e); + } + return list; + } + + + /** + * This method iterates over the file rows, splits the content into a list of key/value items through a RegExp + * and saves the content sequentially. + * Keys "FN" and "VR", which form the preamble in Endnote, are checked but not saved.
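+ * A minimal sketch of the expected file layout (tag names other than FN, VR, ER and EF, and the
+ * sample values, are illustrative only):
+ * <pre>
+ * FN Some provider name
+ * VR 1.0
+ * TI A sample title
+ * AU A sample author
+ * ER
+ * EF
+ * </pre>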
+ * + * @param fileInpuStream the inputStream of the Endnote file + * @return A list of key/value items which map the file's row sequentially + * @throws IOException + * @throws FileSourceException + */ + private List tokenize(InputStream fileInpuStream) + throws IOException, FileSourceException { + BufferedReader reader = new BufferedReader(new InputStreamReader(fileInpuStream)); + String line; + line = reader.readLine(); + // FN and VR works as preamble, just check and skip them + if (line == null || !line.startsWith("FN")) { + throw new FileSourceException("Invalid endNote file"); + } + line = reader.readLine(); + if (line == null || !line.startsWith("VR")) { + throw new FileSourceException("Invalid endNote file"); + } + // split any row into first part ^[A-Z]{2} used as key (the meaning of the data) + // and second part ?(.*) used as value (the data) + Pattern pattern = Pattern.compile("(^[A-Z]{2}) ?(.*)$"); + List list = new ArrayList(); + while ((line = reader.readLine()) != null) { + line = line.trim(); + // skip empty lines + if (line.isEmpty() || line.equals("")) { + continue; + } + Matcher matcher = pattern.matcher(line); + if (matcher.matches()) { + PlainMetadataKeyValueItem item = new PlainMetadataKeyValueItem(); + item.setKey(matcher.group(1)); + item.setValue(matcher.group(2)); + list.add(item); + } + } + return list; + } + + @Override + public void setMetadataFieldMap(Map> metadataFieldMap) { + super.setMetadataFieldMap(metadataFieldMap); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/exception/FileMultipleOccurencesException.java b/dspace-api/src/main/java/org/dspace/importer/external/exception/FileMultipleOccurencesException.java new file mode 100644 index 0000000000..d09889a7ff --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/exception/FileMultipleOccurencesException.java @@ -0,0 +1,29 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.exception; + +/** + * This exception could be throws when more than one element is found + * in a method that works on one only. + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ + +public class FileMultipleOccurencesException extends Exception { + + private static final long serialVersionUID = 1222409723339501937L; + + public FileMultipleOccurencesException(String message, Throwable cause) { + super(message, cause); + } + + public FileMultipleOccurencesException(String message) { + super(message); + } +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/exception/FileSourceException.java b/dspace-api/src/main/java/org/dspace/importer/external/exception/FileSourceException.java new file mode 100644 index 0000000000..c41ce94151 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/exception/FileSourceException.java @@ -0,0 +1,28 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.exception; + +/** + * Represents a problem with the File content: e.g. null input stream, invalid content, ... 
+ * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ + +public class FileSourceException extends Exception { + + private static final long serialVersionUID = 6895579588455260182L; + + public FileSourceException(String message, Throwable cause) { + super(message, cause); + } + + public FileSourceException(String message) { + super(message); + } +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/AbstractMetadataFieldMapping.java b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/AbstractMetadataFieldMapping.java index 3ce45d6048..aed2f0e084 100644 --- a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/AbstractMetadataFieldMapping.java +++ b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/AbstractMetadataFieldMapping.java @@ -117,16 +117,13 @@ public abstract class AbstractMetadataFieldMapping public Collection resultToDCValueMapping(RecordType record) { List values = new LinkedList(); - for (MetadataContributor query : getMetadataFieldMap().values()) { try { values.addAll(query.contributeMetadata(record)); } catch (Exception e) { log.error("Error", e); } - } return values; - } } diff --git a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/EnhancedSimpleMetadataContributor.java b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/EnhancedSimpleMetadataContributor.java new file mode 100644 index 0000000000..b06322ac2c --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/EnhancedSimpleMetadataContributor.java @@ -0,0 +1,108 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.metadatamapping.contributor; + +import java.io.IOException; +import java.io.StringReader; +import java.util.Collection; +import java.util.LinkedList; +import java.util.List; + +import au.com.bytecode.opencsv.CSVReader; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; +import org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; + + +/** + * This class implements functionality to handle common situations regarding plain metadata. + * In some scenarios, such as CSV or TSV, the format doesn't allow lists. + * We can use this MetadataContributor to parse a given plain metadata value and split it into + * the related list, based on the delimiter and the escape character. + * Default values are comma (,) for the delimiter, and double quote (") for the escape character. + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ +public class EnhancedSimpleMetadataContributor extends SimpleMetadataContributor { + + private char delimiter = ','; + + private char escape = '"'; + + /** + * This method can be used to set the delimiter used during parsing. + * If no delimiter is set, the comma will be used. + */ + public void setDelimiter(char delimiter) { + this.delimiter = delimiter; + } + + /** + * This method can be used to get the delimiter used in this class. + */ + public char getDelimiter() { + return delimiter; + } + + /** + * Method to inject the escape character. + * This must be the ASCII integer + * related to the char, + * for example 9 for tab, 44 for comma. + * If no escape character is set, the double quote will be used. + */ + public void setEscape(char escape) { + this.escape = escape; + } + + /** + * Method to get the escape character. + * + */ + public char getEscape() { + return escape; + } + + @Override + public Collection contributeMetadata(PlainMetadataSourceDto t) { + Collection values = null; + values = new LinkedList<>(); + for (PlainMetadataKeyValueItem metadatum : t.getMetadata()) { + if (getKey().equals(metadatum.getKey())) { + String[] splitted = splitToRecord(metadatum.getValue()); + for (String value : splitted) { + MetadatumDTO dcValue = new MetadatumDTO(); + dcValue.setValue(value); + dcValue.setElement(getField().getElement()); + dcValue.setQualifier(getField().getQualifier()); + dcValue.setSchema(getField().getSchema()); + values.add(dcValue); + } + } + } + return values; + } + + private String[] splitToRecord(String value) { + List rows; + // For example, a list of authors must be: Author 1, Author 2, Author 3 + // if an author name contains a comma, it is important to escape it in + // this way: Author 1, \"Author 2, something\", Author 3 + try (CSVReader csvReader = new CSVReader(new StringReader(value), + delimiter, escape);) { + rows = csvReader.readAll(); + } catch (IOException e) { + // fallback, use the input as the value + return new String[] { value }; + } + // must be one row + return rows.get(0); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/MultipleMetadataContributor.java b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/MultipleMetadataContributor.java new file mode 100644 index 0000000000..2685948fd9 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/MultipleMetadataContributor.java @@ -0,0 +1,139 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.metadatamapping.contributor; + +import java.util.ArrayList; +import java.util.Collection; +import java.util.LinkedList; +import java.util.List; + +import org.dspace.importer.external.metadatamapping.MetadataFieldConfig; +import org.dspace.importer.external.metadatamapping.MetadataFieldMapping; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; + +/** + * This Contributor is helpful to work around a limit of the Live Import Framework. + * In Live Import, one dc schema/element/qualifier can be associated with one and + * only one MetadataContributor, because the map they're saved in uses the dc entity as key. + * + * In this implementation we use the MetadataFieldConfig present in this MultipleMetadataContributor, + * but the data (values of the dc metadatum) will be loaded using all of the contributors defined + * in the List metadatumContributors, by iterating over them.
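+ * A minimal construction sketch (the contributor instances and the field configuration are hypothetical;
+ * the constructor and LinkedList type are the ones used by this class):
+ * <pre>
+ * List contributors = new LinkedList();
+ * contributors.add(firstContributor);
+ * contributors.add(secondContributor);
+ * MetadataContributor authors = new MultipleMetadataContributor(dcContributorAuthorField, contributors);
+ * </pre>
+ * Every value returned by the wrapped contributors is then remapped to the single field configured here.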
+ * + * @see org.dspace.importer.external.metadatamapping.AbstractMetadataFieldMapping + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * + */ +public class MultipleMetadataContributor implements MetadataContributor { + + private MetadataFieldConfig field; + + private List metadatumContributors; + + /** + * Empty constructor + */ + public MultipleMetadataContributor() { + } + + /** + * @param field {@link org.dspace.importer.external.metadatamapping.MetadataFieldConfig} used in + * mapping + * @param metadatumContributors A list of MetadataContributor + */ + public MultipleMetadataContributor(MetadataFieldConfig field, List metadatumContributors) { + this.field = field; + this.metadatumContributors = (LinkedList) metadatumContributors; + } + + /** + * Set the metadatafieldMapping used in the transforming of a record to actual metadata + * + * @param metadataFieldMapping the new mapping. + */ + @Override + public void setMetadataFieldMapping(MetadataFieldMapping> metadataFieldMapping) { + for (MetadataContributor metadatumContributor : metadatumContributors) { + metadatumContributor.setMetadataFieldMapping(metadataFieldMapping); + } + } + + + /** + * a separate Metadatum object is created for each index of Metadatum returned from the calls to + * MetadatumContributor.contributeMetadata(t) for each MetadatumContributor in the metadatumContributors list. + * All of them have as dc schema/element/qualifier the values defined in MetadataFieldConfig. + * + * @param t the object we are trying to translate + * @return a collection of metadata got from each MetadataContributor + */ + @Override + public Collection contributeMetadata(T t) { + Collection values = new ArrayList<>(); + for (MetadataContributor metadatumContributor : metadatumContributors) { + Collection metadata = metadatumContributor.contributeMetadata(t); + values.addAll(metadata); + } + changeDC(values); + return values; + } + + /** + * This method does the trick of this implementation. + * It changes the DC schema/element/qualifier of the given Metadatum into + * the ones present in this contributor. + * In this way, the contributors in metadatumContributors could have any dc values, + * because this method remap them all. 
+ * + * @param values the list of metadata we want to remap + */ + private void changeDC(Collection values) { + for (MetadatumDTO dto : values) { + dto.setElement(field.getElement()); + dto.setQualifier(field.getQualifier()); + dto.setSchema(field.getSchema()); + } + } + + /** + * Return the MetadataFieldConfig used while retrieving MetadatumDTO + * + * @return MetadataFieldConfig + */ + public MetadataFieldConfig getField() { + return field; + } + + /** + * Set the MetadataFieldConfig + * + * @param field MetadataFieldConfig used while retrieving MetadatumDTO + */ + public void setField(MetadataFieldConfig field) { + this.field = field; + } + + /** + * Return the List of MetadataContributor objects set to this class + * + * @return metadatumContributors, list of MetadataContributor + */ + public List getMetadatumContributors() { + return metadatumContributors; + } + + /** + * Set the List of MetadataContributor objects for this class + * + * @param metadatumContributors A list of MetadatumContributor classes + */ + public void setMetadatumContributors(List metadatumContributors) { + this.metadatumContributors = metadatumContributors; + } +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleMetadataContributor.java b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleMetadataContributor.java new file mode 100644 index 0000000000..1b9007f23c --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleMetadataContributor.java @@ -0,0 +1,109 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.metadatamapping.contributor; + +import java.util.Collection; +import java.util.LinkedList; +import java.util.List; + +import org.dspace.importer.external.metadatamapping.MetadataFieldConfig; +import org.dspace.importer.external.metadatamapping.MetadataFieldMapping; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; +import org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; + +/** + * Metadata contributor that takes a PlainMetadataSourceDto instance and turns it into a + * collection of metadata + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public class SimpleMetadataContributor implements MetadataContributor { + + private MetadataFieldConfig field; + + private String key; + + private MetadataFieldMapping> metadataFieldMapping; + + public SimpleMetadataContributor(MetadataFieldConfig field, String key) { + this.field = field; + this.key = key; + } + + public SimpleMetadataContributor() { } + + /** + * Set the metadataFieldMapping of this SimpleMetadataContributor + * + * @param metadataFieldMapping the new mapping. + */ + @Override + public void setMetadataFieldMapping( + MetadataFieldMapping> metadataFieldMapping) { + this.metadataFieldMapping = metadataFieldMapping; + } + + /** + * Retrieve the metadata associated with the given object. + * It matches the key found in the PlainMetadataSourceDto instance with the key passed to the constructor. + * In case of a match, a new metadatum is constructed (using the field elements and the PlainMetadataSourceDto value) + * and added to the list.
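+ * A short usage sketch (the key, the field configuration and the source DTO are hypothetical; the
+ * setters and contributeMetadata are the methods defined in this class):
+ * <pre>
+ * SimpleMetadataContributor contributor = new SimpleMetadataContributor();
+ * contributor.setKey("TI");                     // key to look up in the plain metadata items
+ * contributor.setField(dcTitleFieldConfig);     // target DSpace metadata field
+ * Collection metadata = contributor.contributeMetadata(plainMetadataSourceDto);
+ * </pre>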
+ * + * @param t A class to retrieve metadata and key to match from. t and contained list "metadata" MUST be not null. + * @return a collection of import records. Only the identifier of the found records may be put in the record. + */ + @Override + public Collection contributeMetadata(PlainMetadataSourceDto t) { + List values = new LinkedList<>(); + for (PlainMetadataKeyValueItem metadatum : t.getMetadata()) { + if (key.equals(metadatum.getKey())) { + MetadatumDTO dcValue = new MetadatumDTO(); + dcValue.setValue(metadatum.getValue()); + dcValue.setElement(field.getElement()); + dcValue.setQualifier(field.getQualifier()); + dcValue.setSchema(field.getSchema()); + values.add(dcValue); + } + } + return values; + } + + /** + * Method to inject field item + * + * @param field the {@link MetadataFieldConfig} to use in this contributor + */ + public void setField(MetadataFieldConfig field) { + this.field = field; + } + + /** + * Method to inject key value + */ + public void setKey(String key) { + this.key = key; + } + + /** + * Method to retrieve field item + */ + public String getKey() { + return key; + } + + /** + * Method to retrieve the {@link MetadataFieldConfig} used in this contributor + */ + public MetadataFieldConfig getField() { + return field; + } +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleXpathMetadatumContributor.java b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleXpathMetadatumContributor.java index ba5afceb5f..c8d2467d5f 100644 --- a/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleXpathMetadatumContributor.java +++ b/dspace-api/src/main/java/org/dspace/importer/external/metadatamapping/contributor/SimpleXpathMetadatumContributor.java @@ -21,6 +21,8 @@ import org.dspace.importer.external.metadatamapping.MetadataFieldConfig; import org.dspace.importer.external.metadatamapping.MetadataFieldMapping; import org.dspace.importer.external.metadatamapping.MetadatumDTO; import org.jaxen.JaxenException; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Required; /** @@ -31,6 +33,8 @@ import org.springframework.beans.factory.annotation.Required; public class SimpleXpathMetadatumContributor implements MetadataContributor { private MetadataFieldConfig field; + private static final Logger log = LoggerFactory.getLogger(SimpleXpathMetadatumContributor.class); + /** * Return prefixToNamespaceMapping * @@ -79,7 +83,7 @@ public class SimpleXpathMetadatumContributor implements MetadataContributorMetadataFieldConfig + * MetadataFieldConfig */ public SimpleXpathMetadatumContributor(String query, Map prefixToNamespaceMapping, MetadataFieldConfig field) { @@ -157,12 +161,12 @@ public class SimpleXpathMetadatumContributor implements MetadataContributor { +public class PubmedImportMetadataSourceServiceImpl extends AbstractImportMetadataSourceService + implements QuerySource, FileSource { + private String baseAddress; private WebTarget pubmedWebTarget; + private List supportedExtensions; + + /** + * Set the file extensions supported by this metadata service + * + * @param supportedExtensionsthe file extensions (xml,txt,...) 
supported by this service + */ + public void setSupportedExtensions(List supportedExtensions) { + this.supportedExtensions = supportedExtensions; + } + + @Override + public List getSupportedExtensions() { + return supportedExtensions; + } + /** * Find the number of records matching a query; * @@ -49,7 +77,7 @@ public class PubmedImportMetadataSourceServiceImpl extends AbstractImportMetadat * @throws MetadataSourceException if the underlying methods throw any exception. */ @Override - public int getNbRecords(String query) throws MetadataSourceException { + public int getRecordsCount(String query) throws MetadataSourceException { return retry(new GetNbRecords(query)); } @@ -61,7 +89,7 @@ public class PubmedImportMetadataSourceServiceImpl extends AbstractImportMetadat * @throws MetadataSourceException if the underlying methods throw any exception. */ @Override - public int getNbRecords(Query query) throws MetadataSourceException { + public int getRecordsCount(Query query) throws MetadataSourceException { return retry(new GetNbRecords(query)); } @@ -357,7 +385,6 @@ public class PubmedImportMetadataSourceServiceImpl extends AbstractImportMetadat @Override public Collection call() throws Exception { - List records = new LinkedList(); WebTarget getRecordIdsTarget = pubmedWebTarget .queryParam("term", query.getParameterAsClass("term", String.class)); @@ -382,13 +409,41 @@ public class PubmedImportMetadataSourceServiceImpl extends AbstractImportMetadat invocationBuilder = getRecordsTarget.request(MediaType.TEXT_PLAIN_TYPE); response = invocationBuilder.get(); - List omElements = splitToRecords(response.readEntity(String.class)); - - for (OMElement record : omElements) { - records.add(transformSourceRecords(record)); - } - - return records; + String xml = response.readEntity(String.class); + return parseXMLString(xml); } } + + + @Override + public List getRecords(InputStream inputStream) throws FileSourceException { + String xml = null; + try (Reader reader = new InputStreamReader(inputStream, "UTF-8")) { + xml = CharStreams.toString(reader); + return parseXMLString(xml); + } catch (IOException e) { + throw new FileSourceException ("Cannot read XML from InputStream", e); + } + } + + @Override + public ImportRecord getRecord(InputStream inputStream) throws FileSourceException, FileMultipleOccurencesException { + List importRecord = getRecords(inputStream); + if (importRecord == null || importRecord.isEmpty()) { + throw new FileSourceException("Cannot find (valid) record in File"); + } else if (importRecord.size() > 1) { + throw new FileMultipleOccurencesException("File contains more than one entry"); + } else { + return importRecord.get(0); + } + } + + private List parseXMLString(String xml) { + List records = new LinkedList(); + List omElements = splitToRecords(xml); + for (OMElement record : omElements) { + records.add(transformSourceRecords(record)); + } + return records; + } } diff --git a/dspace-api/src/main/java/org/dspace/importer/external/ris/service/RisImportMetadataSourceServiceImpl.java b/dspace-api/src/main/java/org/dspace/importer/external/ris/service/RisImportMetadataSourceServiceImpl.java new file mode 100644 index 0000000000..2574e187df --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/ris/service/RisImportMetadataSourceServiceImpl.java @@ -0,0 +1,141 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * 
http://www.dspace.org/license/ + */ +package org.dspace.importer.external.ris.service; + +import java.io.BufferedReader; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.util.ArrayList; +import java.util.Iterator; +import java.util.LinkedList; +import java.util.List; +import java.util.Map; +import java.util.regex.Matcher; +import java.util.regex.Pattern; +import javax.annotation.Resource; + +import org.dspace.importer.external.exception.FileSourceException; +import org.dspace.importer.external.service.components.AbstractPlainMetadataSource; +import org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; + +/** + * Implements a metadata importer for RIS files. + * Implementation inspired by the BTE DataLoader {@link https://github.com/EKT/Biblio-Transformation-Engine/blob/master/bte-io/src/main/java/gr/ekt/bteio/loaders/RISDataLoader.java} + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public class RisImportMetadataSourceServiceImpl extends AbstractPlainMetadataSource { + + @Override + public String getImportSource() { + return "RISMetadataSource"; + } + + @Override + protected List readData(InputStream inputStream) throws FileSourceException { + return aggregateData(inputStream); + } + + /** + * This method maps the data present in the inputStream, then returns a list of PlainMetadataSourceDto. + * Each PlainMetadataSourceDto will be used to create a single {@link org.dspace.importer.external.datamodel.ImportRecord} + * + * @see org.dspace.importer.external.service.components.AbstractPlainMetadataSource + * + * @param inputStream the inputStream of the RIS file + * @return List of {@link org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto} + * @throws FileSourceException + */ + private List aggregateData(InputStream inputStream) throws FileSourceException { + List metadata = new ArrayList<>(); + // map any line of the file to a key/value pair + List notAggregatedItems = notAggregatedData(inputStream); + List aggregatedTmpList = null; + Iterator itr = notAggregatedItems.iterator(); + // iterate over the list of key/value items + // create a new PlainMetadataSourceDto (which maps an ImportRecord) + // each time the key is "TY" (content separator in RIS) + while (itr.hasNext()) { + PlainMetadataKeyValueItem item = itr.next(); + if ("TY".equals(item.getKey())) { + if (aggregatedTmpList != null) { + PlainMetadataSourceDto dto = new PlainMetadataSourceDto(); + dto.setMetadata(new ArrayList<>(aggregatedTmpList)); + metadata.add(dto); + } + aggregatedTmpList = new ArrayList<>(); + aggregatedTmpList.add(item); + } else { + if (aggregatedTmpList != null) { + aggregatedTmpList.add(item); + // save last iteration metadata + if (!itr.hasNext()) { + PlainMetadataSourceDto dto = new PlainMetadataSourceDto(); + dto.setMetadata(new ArrayList<>(aggregatedTmpList)); + metadata.add(dto); + } + } + } + } + return metadata; + } + + /** + * This method transforms each row of the RIS file into a PlainMetadataKeyValueItem, + * splitting the row sequentially through a RegExp without taking care of the meaning of the data. + * In this way, all entries present in the file are mapped in the resulting list.
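+ * For example (illustrative lines matching the pattern used below; tag names other than TY are hypothetical),
+ * the rows
+ * <pre>
+ * TY - JOUR
+ * TI - A sample title
+ * </pre>
+ * become two key/value items: ("TY", "JOUR") and ("TI", "A sample title"). Rows that do not match
+ * the pattern are appended to the value of the previous item.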
+ * + * @param inputStream the inputStrem of the file + * @return A list + * @throws FileSourceException + */ + private List notAggregatedData(InputStream inputStream) throws FileSourceException { + LinkedList items = new LinkedList<>(); + BufferedReader reader; + try { + reader = new BufferedReader(new InputStreamReader(inputStream, "UTF-8")); + String line; + while ((line = reader.readLine()) != null) { + if (line.isEmpty() || line.equals("") || line.matches("^\\s*$")) { + continue; + } + //match valid RIS entry + Pattern risPattern = Pattern.compile("^([A-Z][A-Z0-9]) - (.*)$"); + Matcher risMatcher = risPattern.matcher(line); + if (risMatcher.matches()) { + PlainMetadataKeyValueItem keyValueItem = new PlainMetadataKeyValueItem(); + keyValueItem.setValue(risMatcher.group(2)); + keyValueItem.setKey(risMatcher.group(1)); + items.add(keyValueItem); + } else { + if (!items.isEmpty()) { + items.getLast().setValue(items.getLast().getValue().concat(line)); + } + } + } + } catch (Exception e) { + throw new FileSourceException("Cannot parse RIS file", e); + } + return items; + } + + /** + * Retrieve the MetadataFieldMapping containing the mapping between RecordType + * (in this case PlainMetadataSourceDto.class) and Metadata + * + * @return The configured MetadataFieldMapping + */ + @Override + @SuppressWarnings("unchecked") + @Resource(name = "risMetadataFieldMap") + public void setMetadataFieldMap(@SuppressWarnings("rawtypes") Map metadataFieldMap) { + super.setMetadataFieldMap(metadataFieldMap); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/AbstractImportMetadataSourceService.java b/dspace-api/src/main/java/org/dspace/importer/external/service/AbstractImportMetadataSourceService.java index a803958a9d..3bf76438cd 100644 --- a/dspace-api/src/main/java/org/dspace/importer/external/service/AbstractImportMetadataSourceService.java +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/AbstractImportMetadataSourceService.java @@ -16,7 +16,6 @@ import org.dspace.importer.external.metadatamapping.contributor.MetadataContribu import org.dspace.importer.external.metadatamapping.transform.GenerateQueryService; import org.dspace.importer.external.service.components.AbstractRemoteMetadataSource; import org.dspace.importer.external.service.components.MetadataSource; -import org.springframework.beans.factory.annotation.Autowired; import org.springframework.beans.factory.annotation.Required; /** @@ -49,7 +48,6 @@ public abstract class AbstractImportMetadataSourceService extends Ab * * @param generateQueryForItem the query generator to be used. 
*/ - @Autowired public void setGenerateQueryForItem(GenerateQueryService generateQueryForItem) { this.generateQueryForItem = generateQueryForItem; } diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/ImportService.java b/dspace-api/src/main/java/org/dspace/importer/external/service/ImportService.java index 87c2bd0029..815a10b5a7 100644 --- a/dspace-api/src/main/java/org/dspace/importer/external/service/ImportService.java +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/ImportService.java @@ -8,6 +8,10 @@ package org.dspace.importer.external.service; +import java.io.File; +import java.io.FileInputStream; +import java.io.IOException; +import java.io.InputStream; import java.util.Collection; import java.util.Collections; import java.util.HashMap; @@ -19,11 +23,16 @@ import org.apache.logging.log4j.Logger; import org.dspace.content.Item; import org.dspace.importer.external.datamodel.ImportRecord; import org.dspace.importer.external.datamodel.Query; +import org.dspace.importer.external.exception.FileMultipleOccurencesException; +import org.dspace.importer.external.exception.FileSourceException; import org.dspace.importer.external.exception.MetadataSourceException; import org.dspace.importer.external.service.components.Destroyable; +import org.dspace.importer.external.service.components.FileSource; import org.dspace.importer.external.service.components.MetadataSource; +import org.dspace.importer.external.service.components.QuerySource; import org.springframework.beans.factory.annotation.Autowired; + /** * Main entry point for the import framework. * Instead of calling the different importer implementations, the ImportService should be called instead. @@ -32,8 +41,10 @@ import org.springframework.beans.factory.annotation.Autowired; * importer implementation you want to use. 
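Further down in this patch, ImportService gains a file-based entry point, getRecord(File, String). A hedged usage sketch follows; the injection style and error handling are illustrative assumptions, not code from the patch:

```java
import java.io.File;

import org.dspace.importer.external.datamodel.ImportRecord;
import org.dspace.importer.external.exception.FileMultipleOccurencesException;
import org.dspace.importer.external.exception.FileSourceException;
import org.dspace.importer.external.service.ImportService;

public class FileImportSketch {

    // In DSpace this would normally be injected by Spring; a plain field keeps the sketch short.
    private ImportService importService;

    public ImportRecord importSingleRecord(File uploadedFile) throws FileSourceException {
        try {
            // Delegates to the first registered FileSource whose supported extensions match the name.
            return importService.getRecord(uploadedFile, uploadedFile.getName());
        } catch (FileMultipleOccurencesException e) {
            throw new FileSourceException("Expected exactly one record in " + uploadedFile.getName(), e);
        }
    }
}
```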
* * @author Roeland Dillen (roeland at atmire dot com) + * @author Pasquale Cavallo (pasquale.cavallo@4science.it) */ public class ImportService implements Destroyable { + private HashMap importSources = new HashMap<>(); Logger log = org.apache.logging.log4j.LogManager.getLogger(ImportService.class); @@ -101,11 +112,11 @@ public class ImportService implements Destroyable { public Collection findMatchingRecords(String uri, Item item) throws MetadataSourceException { try { List recordList = new LinkedList(); - for (MetadataSource metadataSource : matchingImports(uri)) { - recordList.addAll(metadataSource.findMatchingRecords(item)); + if (metadataSource instanceof QuerySource) { + recordList.addAll(((QuerySource)metadataSource).findMatchingRecords(item)); + } } - return recordList; } catch (Exception e) { throw new MetadataSourceException(e); @@ -125,9 +136,10 @@ public class ImportService implements Destroyable { try { List recordList = new LinkedList(); for (MetadataSource metadataSource : matchingImports(uri)) { - recordList.addAll(metadataSource.findMatchingRecords(query)); + if (metadataSource instanceof QuerySource) { + recordList.addAll(((QuerySource)metadataSource).findMatchingRecords(query)); + } } - return recordList; } catch (Exception e) { throw new MetadataSourceException(e); @@ -145,8 +157,10 @@ public class ImportService implements Destroyable { public int getNbRecords(String uri, String query) throws MetadataSourceException { try { int total = 0; - for (MetadataSource MetadataSource : matchingImports(uri)) { - total += MetadataSource.getNbRecords(query); + for (MetadataSource metadataSource : matchingImports(uri)) { + if (metadataSource instanceof QuerySource) { + total += ((QuerySource)metadataSource).getRecordsCount(query); + } } return total; } catch (Exception e) { @@ -165,8 +179,10 @@ public class ImportService implements Destroyable { public int getNbRecords(String uri, Query query) throws MetadataSourceException { try { int total = 0; - for (MetadataSource MetadataSource : matchingImports(uri)) { - total += MetadataSource.getNbRecords(query); + for (MetadataSource metadataSource : matchingImports(uri)) { + if (metadataSource instanceof QuerySource) { + total += ((QuerySource)metadataSource).getRecordsCount(query); + } } return total; } catch (Exception e) { @@ -189,7 +205,9 @@ public class ImportService implements Destroyable { try { List recordList = new LinkedList<>(); for (MetadataSource metadataSource : matchingImports(uri)) { - recordList.addAll(metadataSource.getRecords(query, start, count)); + if (metadataSource instanceof QuerySource) { + recordList.addAll(((QuerySource)metadataSource).getRecords(query, start, count)); + } } return recordList; } catch (Exception e) { @@ -209,7 +227,9 @@ public class ImportService implements Destroyable { try { List recordList = new LinkedList<>(); for (MetadataSource metadataSource : matchingImports(uri)) { - recordList.addAll(metadataSource.getRecords(query)); + if (metadataSource instanceof QuerySource) { + recordList.addAll(((QuerySource)metadataSource).getRecords(query)); + } } return recordList; } catch (Exception e) { @@ -229,10 +249,12 @@ public class ImportService implements Destroyable { public ImportRecord getRecord(String uri, String id) throws MetadataSourceException { try { for (MetadataSource metadataSource : matchingImports(uri)) { - if (metadataSource.getRecord(id) != null) { - return metadataSource.getRecord(id); + if (metadataSource instanceof QuerySource) { + QuerySource querySource = 
(QuerySource)metadataSource; + if (querySource.getRecord(id) != null) { + return querySource.getRecord(id); + } } - } return null; } catch (Exception e) { @@ -252,10 +274,12 @@ public class ImportService implements Destroyable { public ImportRecord getRecord(String uri, Query query) throws MetadataSourceException { try { for (MetadataSource metadataSource : matchingImports(uri)) { - if (metadataSource.getRecord(query) != null) { - return metadataSource.getRecord(query); + if (metadataSource instanceof QuerySource) { + QuerySource querySource = (QuerySource)metadataSource; + if (querySource.getRecord(query) != null) { + return querySource.getRecord(query); + } } - } return null; } catch (Exception e) { @@ -272,6 +296,41 @@ public class ImportService implements Destroyable { return importSources.keySet(); } + /* + * Get a collection of record from File, + * The first match will be return. + * + * @param file The file from which will read records + * @param originalName The original file name or full path + * @return a single record contains the metadatum + * @throws FileMultipleOccurencesException if more than one entry is found + */ + public ImportRecord getRecord(File file, String originalName) + throws FileMultipleOccurencesException, FileSourceException { + ImportRecord importRecords = null; + for (MetadataSource metadataSource : importSources.values()) { + try (InputStream fileInputStream = new FileInputStream(file)) { + if (metadataSource instanceof FileSource) { + FileSource fileSource = (FileSource)metadataSource; + if (fileSource.isValidSourceForFile(originalName)) { + importRecords = fileSource.getRecord(fileInputStream); + break; + } + } + //catch statements is required because we could have supported format (i.e. XML) + //which fail on schema validation + } catch (FileSourceException e) { + log.debug(metadataSource.getImportSource() + " isn't a valid parser for file"); + } catch (FileMultipleOccurencesException e) { + log.debug("File contains multiple metadata, return with error"); + throw e; + } catch (IOException e1) { + throw new FileSourceException("File cannot be read, may be null"); + } + } + return importRecords; + } + /** * Call destroy on all {@link Destroyable} {@link MetadataSource} objects set in this ImportService */ diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/components/AbstractPlainMetadataSource.java b/dspace-api/src/main/java/org/dspace/importer/external/service/components/AbstractPlainMetadataSource.java new file mode 100644 index 0000000000..019cf33177 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/components/AbstractPlainMetadataSource.java @@ -0,0 +1,103 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.service.components; + +import java.io.InputStream; +import java.util.ArrayList; +import java.util.List; + +import org.dspace.importer.external.datamodel.ImportRecord; +import org.dspace.importer.external.exception.FileMultipleOccurencesException; +import org.dspace.importer.external.exception.FileSourceException; +import org.dspace.importer.external.metadatamapping.AbstractMetadataFieldMapping; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; +import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto; + + +/** + * This class is an abstract 
implementation of {@link MetadataSource} useful in cases + * of plain metadata sources. + * It provides the methot to mapping metadata to DSpace Format when source is a file + * whit a list of strings. + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ + +public abstract class AbstractPlainMetadataSource + extends AbstractMetadataFieldMapping + implements FileSource { + + protected abstract List + readData(InputStream fileInpuStream) throws FileSourceException; + + + private List supportedExtensions; + + /** + * Set the file extensions supported by this metadata service + * + * @param supportedExtensionsthe file extensions (xml,txt,...) supported by this service + */ + public void setSupportedExtensions(List supportedExtensions) { + this.supportedExtensions = supportedExtensions; + } + + @Override + public List getSupportedExtensions() { + return supportedExtensions; + } + + /** + * Return a list of ImportRecord constructed from input file. This list is based on + * the results retrieved from the file (InputStream) parsed through abstract method readData + * + * @param InputStream The inputStream of the file + * @return A list of {@link ImportRecord} + * @throws FileSourceException if, for any reason, the file is not parsable + */ + @Override + public List getRecords(InputStream is) throws FileSourceException { + List datas = readData(is); + List records = new ArrayList<>(); + for (PlainMetadataSourceDto item : datas) { + records.add(toRecord(item)); + } + return records; + } + + /** + * Return an ImportRecord constructed from input file. This list is based on + * the result retrieved from the file (InputStream) parsed through abstract method + * "readData" implementation + * + * @param InputStream The inputStream of the file + * @return An {@link ImportRecord} matching the file content + * @throws FileSourceException if, for any reason, the file is not parsable + * @throws FileMultipleOccurencesException if the file contains more than one entry + */ + @Override + public ImportRecord getRecord(InputStream is) throws FileSourceException, FileMultipleOccurencesException { + List datas = readData(is); + if (datas == null || datas.isEmpty()) { + throw new FileSourceException("File is empty"); + } + if (datas.size() > 1) { + throw new FileMultipleOccurencesException("File " + + "contains more than one entry (" + datas.size() + " entries"); + } + return toRecord(datas.get(0)); + } + + + private ImportRecord toRecord(PlainMetadataSourceDto entry) { + List metadata = new ArrayList<>(); + metadata.addAll(resultToDCValueMapping(entry)); + return new ImportRecord(metadata); + } +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/components/FileSource.java b/dspace-api/src/main/java/org/dspace/importer/external/service/components/FileSource.java new file mode 100644 index 0000000000..5bef0984df --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/components/FileSource.java @@ -0,0 +1,70 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.service.components; + +import java.io.InputStream; +import java.util.List; + +import org.dspace.importer.external.datamodel.ImportRecord; +import org.dspace.importer.external.exception.FileMultipleOccurencesException; +import 
org.dspace.importer.external.exception.FileSourceException; + +/** + * This interface declare the base methods to work with files containing metadata. + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public interface FileSource extends MetadataSource { + + /** + * Get the file extensions (xml, csv, txt, ...) supported by the FileSource + */ + public List getSupportedExtensions(); + + /** + * Return a list of ImportRecord constructed from input file. + * + * @param InputStream The inputStream of the file + * @return A list of {@link ImportRecord} + * @throws FileSourceException if, for any reason, the file is not parsable + */ + public List getRecords(InputStream inputStream) + throws FileSourceException; + + /** + * Return an ImportRecord constructed from input file. + * + * @param InputStream The inputStream of the file + * @return An {@link ImportRecord} matching the file content + * @throws FileSourceException if, for any reason, the file is not parsable + * @throws FileMultipleOccurencesException if the file contains more than one entry + */ + public ImportRecord getRecord(InputStream inputStream) + throws FileSourceException, FileMultipleOccurencesException; + + /** + * This method is used to decide if the FileSource manage the file format + * + * @param originalName the file file original name + * @return true if the FileSource can parse the file, false otherwise + */ + public default boolean isValidSourceForFile(String originalName) { + List extensions = getSupportedExtensions(); + if (extensions == null || extensions.isEmpty()) { + return false; + } + if (originalName != null && originalName.contains(".")) { + String extension = originalName.substring(originalName.lastIndexOf('.') + 1, + originalName.length()); + return getSupportedExtensions().contains(extension); + } + return false; + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/components/MetadataSource.java b/dspace-api/src/main/java/org/dspace/importer/external/service/components/MetadataSource.java index 79bdcfa903..353f77b798 100644 --- a/dspace-api/src/main/java/org/dspace/importer/external/service/components/MetadataSource.java +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/components/MetadataSource.java @@ -8,76 +8,14 @@ package org.dspace.importer.external.service.components; -import java.util.Collection; - -import org.dspace.content.Item; -import org.dspace.importer.external.datamodel.ImportRecord; -import org.dspace.importer.external.datamodel.Query; -import org.dspace.importer.external.exception.MetadataSourceException; - /** - * Common interface for all import implementations. + * Super interface for all import implementations. * * @author Roeland Dillen (roeland at atmire dot com) + * @author Pasquale Cavallo (pasquale.cavallo@4science.it) */ public interface MetadataSource { - /** - * Gets the number of records matching a query - * - * @param query the query in string format - * @return the number of records matching the query - * @throws MetadataSourceException if the underlying methods throw any exception. - */ - public int getNbRecords(String query) throws MetadataSourceException; - /** - * Gets the number of records matching a query - * - * @param query the query object - * @return the number of records matching the query - * @throws MetadataSourceException if the underlying methods throw any exception. 
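As an aside, a tiny sketch of the extension matching performed by the default isValidSourceForFile method in the new FileSource interface above; the extension list and file name are hypothetical:

```java
import java.util.Arrays;
import java.util.List;

public class ExtensionCheckSketch {
    public static void main(String[] args) {
        List<String> supportedExtensions = Arrays.asList("ris", "txt"); // hypothetical configuration
        String originalName = "records.ris";
        String extension = originalName.substring(originalName.lastIndexOf('.') + 1);
        // Mirrors the default check: accept the file only when its extension is configured
        System.out.println(supportedExtensions.contains(extension)); // prints "true"
    }
}
```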
- */ - public int getNbRecords(Query query) throws MetadataSourceException; - - /** - * Gets a set of records matching a query. Supports pagination - * - * @param query the query. The query will generally be posted 'as is' to the source - * @param start offset - * @param count page size - * @return a collection of fully transformed id's - * @throws MetadataSourceException if the underlying methods throw any exception. - */ - public Collection getRecords(String query, int start, int count) throws MetadataSourceException; - - /** - * Find records based on a object query. - * - * @param query a query object to base the search on. - * @return a set of records. Fully transformed. - * @throws MetadataSourceException if the underlying methods throw any exception. - */ - public Collection getRecords(Query query) throws MetadataSourceException; - - /** - * Get a single record from the source. - * The first match will be returned - * - * @param id identifier for the record - * @return a matching record - * @throws MetadataSourceException if the underlying methods throw any exception. - */ - public ImportRecord getRecord(String id) throws MetadataSourceException; - - /** - * Get a single record from the source. - * The first match will be returned - * - * @param query a query matching a single record - * @return a matching record - * @throws MetadataSourceException if the underlying methods throw any exception. - */ - public ImportRecord getRecord(Query query) throws MetadataSourceException; /** * The string that identifies this import implementation. Preferable a URI @@ -86,23 +24,4 @@ public interface MetadataSource { */ public String getImportSource(); - /** - * Finds records based on an item - * Delegates to one or more MetadataSource implementations based on the uri. Results will be aggregated. - * - * @param item an item to base the search on - * @return a collection of import records. Only the identifier of the found records may be put in the record. - * @throws MetadataSourceException if the underlying methods throw any exception. - */ - public Collection findMatchingRecords(Item item) throws MetadataSourceException; - - /** - * Finds records based on query object. - * Delegates to one or more MetadataSource implementations based on the uri. Results will be aggregated. - * - * @param query a query object to base the search on. - * @return a collection of import records. Only the identifier of the found records may be put in the record. - * @throws MetadataSourceException passed through. 
- */ - public Collection findMatchingRecords(Query query) throws MetadataSourceException; } diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/components/QuerySource.java b/dspace-api/src/main/java/org/dspace/importer/external/service/components/QuerySource.java new file mode 100644 index 0000000000..bcd10cc554 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/components/QuerySource.java @@ -0,0 +1,106 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ + +package org.dspace.importer.external.service.components; + +import java.util.Collection; + +import org.dspace.content.Item; +import org.dspace.importer.external.datamodel.ImportRecord; +import org.dspace.importer.external.datamodel.Query; +import org.dspace.importer.external.exception.MetadataSourceException; + + +/** + * Common interface for database-based imports. + * + * @author Roeland Dillen (roeland at atmire dot com) + * @author Pasquale Cavallo (pasquale.cavallo@4science.it) + */ + +public interface QuerySource extends MetadataSource { + + /** + * Get a single record from the source. + * The first match will be returned + * + * @param id identifier for the record + * @return a matching record + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public ImportRecord getRecord(String id) throws MetadataSourceException; + + /** + * Gets the number of records matching a query + * + * @param query the query in string format + * @return the number of records matching the query + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public int getRecordsCount(String query) throws MetadataSourceException; + + /** + * Gets the number of records matching a query + * + * @param query the query object + * @return the number of records matching the query + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public int getRecordsCount(Query query) throws MetadataSourceException; + + /** + * Gets a set of records matching a query. Supports pagination + * + * @param query the query. The query will generally be posted 'as is' to the source + * @param start offset + * @param count page size + * @return a collection of fully transformed id's + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public Collection getRecords(String query, int start, int count) throws MetadataSourceException; + + /** + * Find records based on a object query. + * + * @param query a query object to base the search on. + * @return a set of records. Fully transformed. + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public Collection getRecords(Query query) throws MetadataSourceException; + + /** + * Get a single record from the source. + * The first match will be returned + * + * @param query a query matching a single record + * @return a matching record + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public ImportRecord getRecord(Query query) throws MetadataSourceException; + + /** + * Finds records based on query object. + * Delegates to one or more MetadataSource implementations based on the uri. Results will be aggregated. + * + * @param query a query object to base the search on. 
+ * @return a collection of import records. Only the identifier of the found records may be put in the record. + * @throws MetadataSourceException passed through. + */ + public Collection findMatchingRecords(Query query) throws MetadataSourceException; + + /** + * Finds records based on an item + * Delegates to one or more MetadataSource implementations based on the uri. Results will be aggregated. + * + * @param item an item to base the search on + * @return a collection of import records. Only the identifier of the found records may be put in the record. + * @throws MetadataSourceException if the underlying methods throw any exception. + */ + public Collection findMatchingRecords(Item item) throws MetadataSourceException; + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/components/dto/PlainMetadataKeyValueItem.java b/dspace-api/src/main/java/org/dspace/importer/external/service/components/dto/PlainMetadataKeyValueItem.java new file mode 100644 index 0000000000..fa362760b9 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/components/dto/PlainMetadataKeyValueItem.java @@ -0,0 +1,50 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.service.components.dto; + +/** + * Simple object to construct items + * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ +public class PlainMetadataKeyValueItem { + + private String key; + private String value; + + /* + * In a key-value items, like PlainMetadata, this method get the item's key + */ + public String getKey() { + return key; + } + + /* + * In a key-value items, like PlainMetadata, this method set the item's key. + * Never set or leave this field to null + * + */ + public void setKey(String key) { + this.key = key; + } + + /* + * In key-value items, like PlainMetadata, this method get the item's value + */ + public String getValue() { + return value; + } + + /* + * In key-value items, like PlainMetadata, this method set the item's value + */ + public void setValue(String value) { + this.value = value; + } + +} diff --git a/dspace-api/src/main/java/org/dspace/importer/external/service/components/dto/PlainMetadataSourceDto.java b/dspace-api/src/main/java/org/dspace/importer/external/service/components/dto/PlainMetadataSourceDto.java new file mode 100644 index 0000000000..041823b027 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/importer/external/service/components/dto/PlainMetadataSourceDto.java @@ -0,0 +1,38 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.importer.external.service.components.dto; + +import java.util.List; + + +/** + * Simple object used to construct a list of items. + * This type is used in file plain metadata import as RecordType. 
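A hedged illustration of how the DTOs defined just below are meant to be assembled, one DTO per resulting ImportRecord; the tag and value are made up:

```java
import java.util.ArrayList;
import java.util.List;

import org.dspace.importer.external.service.components.dto.PlainMetadataKeyValueItem;
import org.dspace.importer.external.service.components.dto.PlainMetadataSourceDto;

public class PlainMetadataDtoSketch {
    public static PlainMetadataSourceDto sampleRecord() {
        PlainMetadataKeyValueItem title = new PlainMetadataKeyValueItem();
        title.setKey("TI");                 // hypothetical RIS tag
        title.setValue("A sample title");
        List<PlainMetadataKeyValueItem> items = new ArrayList<>();
        items.add(title);
        PlainMetadataSourceDto dto = new PlainMetadataSourceDto();
        dto.setMetadata(items);             // one DTO maps to one ImportRecord
        return dto;
    }
}
```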
+ * + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + */ + +public class PlainMetadataSourceDto { + + private List metadata; + + /* + * Method used to get the Metadata list + */ + public List getMetadata() { + return metadata; + } + + /* + * Method used to set the metadata list + */ + public void setMetadata(List metadata) { + this.metadata = metadata; + } + +} diff --git a/dspace-api/src/main/java/org/dspace/license/CCLicenseConnectorServiceImpl.java b/dspace-api/src/main/java/org/dspace/license/CCLicenseConnectorServiceImpl.java index a237a91984..792c25d629 100644 --- a/dspace-api/src/main/java/org/dspace/license/CCLicenseConnectorServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/license/CCLicenseConnectorServiceImpl.java @@ -75,6 +75,10 @@ public class CCLicenseConnectorServiceImpl implements CCLicenseConnectorService, .disableAutomaticRetries() .setMaxConnTotal(5) .build(); + + // disallow DTD parsing to ensure no XXE attacks can occur. + // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + parser.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); } /** diff --git a/dspace-api/src/main/java/org/dspace/rdf/negotiation/Negotiator.java b/dspace-api/src/main/java/org/dspace/rdf/negotiation/Negotiator.java index c28b9ec1e6..d011d305b1 100644 --- a/dspace-api/src/main/java/org/dspace/rdf/negotiation/Negotiator.java +++ b/dspace-api/src/main/java/org/dspace/rdf/negotiation/Negotiator.java @@ -15,6 +15,7 @@ import java.util.Iterator; import javax.servlet.http.HttpServletResponse; import org.apache.commons.lang3.StringUtils; +import org.apache.commons.validator.routines.UrlValidator; import org.apache.logging.log4j.Logger; import org.dspace.rdf.RDFUtil; import org.dspace.services.factory.DSpaceServicesFactory; @@ -197,6 +198,7 @@ public class Negotiator { if (extraPathInfo == null) { extraPathInfo = ""; } + UrlValidator urlValidator = new UrlValidator(UrlValidator.ALLOW_LOCAL_URLS); StringBuilder urlBuilder = new StringBuilder(); String lang = null; @@ -256,12 +258,15 @@ public class Negotiator { urlBuilder.append(handle).append("/").append(extraPathInfo); } String url = urlBuilder.toString(); - - log.debug("Will forward to '" + url + "'."); - response.setStatus(HttpServletResponse.SC_SEE_OTHER); - response.setHeader("Location", url); - response.flushBuffer(); - return true; + if (urlValidator.isValid(url)) { + log.debug("Will forward to '" + url + "'."); + response.setStatus(HttpServletResponse.SC_SEE_OTHER); + response.setHeader("Location", url); + response.flushBuffer(); + return true; + } else { + throw new IOException("Invalid URL '" + url + "', cannot redirect."); + } } // currently we cannot serve statistics as rdf @@ -287,10 +292,14 @@ public class Negotiator { urlBuilder.append("/handle/").append(handle); urlBuilder.append("/").append(lang); String url = urlBuilder.toString(); - log.debug("Will forward to '" + url + "'."); - response.setStatus(HttpServletResponse.SC_SEE_OTHER); - response.setHeader("Location", url); - response.flushBuffer(); - return true; + if (urlValidator.isValid(url)) { + log.debug("Will forward to '" + url + "'."); + response.setStatus(HttpServletResponse.SC_SEE_OTHER); + response.setHeader("Location", url); + response.flushBuffer(); + return true; + } else { + throw new IOException("Invalid URL '" + url + "', cannot redirect."); + } } } diff --git a/dspace-api/src/main/java/org/dspace/scripts/Process.java b/dspace-api/src/main/java/org/dspace/scripts/Process.java 
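The DOCTYPE hardening applied to the Creative Commons connector above can be sketched for a plain JAXP parser as well; this is a generic illustration of the OWASP recommendation, not code from the patch:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class XxeHardeningSketch {
    public static DocumentBuilderFactory newHardenedFactory() throws ParserConfigurationException {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        // Disallowing DOCTYPE declarations blocks XML External Entity (XXE) injection,
        // the same feature the patch enables on the CC license parser.
        factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        return factory;
    }
}
```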
index c58669e6d9..574ba59760 100644 --- a/dspace-api/src/main/java/org/dspace/scripts/Process.java +++ b/dspace-api/src/main/java/org/dspace/scripts/Process.java @@ -82,6 +82,7 @@ public class Process implements ReloadableEntity { private Date creationTime; public static final String BITSTREAM_TYPE_METADATAFIELD = "dspace.process.filetype"; + public static final String OUTPUT_TYPE = "script_output"; protected Process() { } diff --git a/dspace-api/src/main/java/org/dspace/scripts/ProcessLogLevel.java b/dspace-api/src/main/java/org/dspace/scripts/ProcessLogLevel.java new file mode 100644 index 0000000000..306ea3dde6 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/scripts/ProcessLogLevel.java @@ -0,0 +1,14 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.scripts; + +public enum ProcessLogLevel { + INFO, + WARNING, + ERROR +} \ No newline at end of file diff --git a/dspace-api/src/main/java/org/dspace/scripts/ProcessQueryParameterContainer.java b/dspace-api/src/main/java/org/dspace/scripts/ProcessQueryParameterContainer.java new file mode 100644 index 0000000000..d571834246 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/scripts/ProcessQueryParameterContainer.java @@ -0,0 +1,78 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.scripts; + +import java.util.HashMap; +import java.util.Map; + +/** + * This is a container class in which the variables can be stored that a {@link Process} must adhere to when being + * retrieved from the DB through the search methods + */ +public class ProcessQueryParameterContainer { + + + private Map queryParameterMap = new HashMap<>(); + + /** + * Generic getter for the queryParameterMap + * @return the queryParameterMap value of this ProcessQueryParameterContainer + */ + public Map getQueryParameterMap() { + return queryParameterMap; + } + + private String sortProperty = "startTime"; + private String sortOrder = "desc"; + /** + * Generic setter for the queryParameterMap + * @param queryParameterMap The queryParameterMap to be set on this ProcessQueryParameterContainer + */ + public void setQueryParameterMap(Map queryParameterMap) { + this.queryParameterMap = queryParameterMap; + } + + public void addToQueryParameterMap(String key, Object object) { + if (queryParameterMap == null) { + queryParameterMap = new HashMap<>(); + } + queryParameterMap.put(key, object); + } + + /** + * Generic getter for the sortProperty + * @return the sortProperty value of this ProcessQueryParameterContainer + */ + public String getSortProperty() { + return sortProperty; + } + + /** + * Generic setter for the sortProperty + * @param sortProperty The sortProperty to be set on this ProcessQueryParameterContainer + */ + public void setSortProperty(String sortProperty) { + this.sortProperty = sortProperty; + } + + /** + * Generic getter for the sortOrder + * @return the sortOrder value of this ProcessQueryParameterContainer + */ + public String getSortOrder() { + return sortOrder; + } + + /** + * Generic setter for the sortOrder + * @param sortOrder The sortOrder to be set on this ProcessQueryParameterContainer + */ + public void setSortOrder(String sortOrder) { + this.sortOrder = 
sortOrder; + } +} diff --git a/dspace-api/src/main/java/org/dspace/scripts/ProcessServiceImpl.java b/dspace-api/src/main/java/org/dspace/scripts/ProcessServiceImpl.java index 1cdf3505db..aa193f30bc 100644 --- a/dspace-api/src/main/java/org/dspace/scripts/ProcessServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/scripts/ProcessServiceImpl.java @@ -7,9 +7,14 @@ */ package org.dspace.scripts; +import java.io.BufferedWriter; +import java.io.File; +import java.io.FileInputStream; +import java.io.FileWriter; import java.io.IOException; import java.io.InputStream; import java.sql.SQLException; +import java.text.SimpleDateFormat; import java.util.ArrayList; import java.util.Collections; import java.util.Comparator; @@ -20,6 +25,7 @@ import java.util.Set; import java.util.regex.Pattern; import org.apache.commons.collections4.ListUtils; +import org.apache.commons.io.FileUtils; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.authorize.AuthorizeException; @@ -37,6 +43,7 @@ import org.dspace.core.Constants; import org.dspace.core.Context; import org.dspace.core.LogManager; import org.dspace.eperson.EPerson; +import org.dspace.eperson.service.EPersonService; import org.dspace.scripts.service.ProcessService; import org.springframework.beans.factory.annotation.Autowired; @@ -62,6 +69,9 @@ public class ProcessServiceImpl implements ProcessService { @Autowired private MetadataFieldService metadataFieldService; + @Autowired + private EPersonService ePersonService; + @Override public Process create(Context context, EPerson ePerson, String scriptName, List parameters) throws SQLException { @@ -245,4 +255,59 @@ public class ProcessServiceImpl implements ProcessService { return new ArrayList<>(fileTypesSet); } + @Override + public List search(Context context, ProcessQueryParameterContainer processQueryParameterContainer, + int limit, int offset) throws SQLException { + return processDAO.search(context, processQueryParameterContainer, limit, offset); + } + + @Override + public int countSearch(Context context, ProcessQueryParameterContainer processQueryParameterContainer) + throws SQLException { + return processDAO.countTotalWithParameters(context, processQueryParameterContainer); + } + + + @Override + public void appendLog(int processId, String scriptName, String output, ProcessLogLevel processLogLevel) + throws IOException { + File tmpDir = FileUtils.getTempDirectory(); + File tempFile = new File(tmpDir, scriptName + processId + ".log"); + FileWriter out = new FileWriter(tempFile, true); + try { + try (BufferedWriter writer = new BufferedWriter(out)) { + writer.append(formatLogLine(processId, scriptName, output, processLogLevel)); + writer.newLine(); + } + } finally { + out.close(); + } + } + + @Override + public void createLogBitstream(Context context, Process process) + throws IOException, SQLException, AuthorizeException { + File tmpDir = FileUtils.getTempDirectory(); + File tempFile = new File(tmpDir, process.getName() + process.getID() + ".log"); + FileInputStream inputStream = FileUtils.openInputStream(tempFile); + appendFile(context, process, inputStream, Process.OUTPUT_TYPE, process.getName() + process.getID() + ".log"); + inputStream.close(); + tempFile.delete(); + } + + private String formatLogLine(int processId, String scriptName, String output, ProcessLogLevel processLogLevel) { + SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS"); + StringBuilder sb = new StringBuilder(); + sb.append(sdf.format(new Date())); + 
sb.append(" "); + sb.append(processLogLevel); + sb.append(" "); + sb.append(scriptName); + sb.append(" - "); + sb.append(processId); + sb.append(" @ "); + sb.append(output); + return sb.toString(); + } + } diff --git a/dspace-api/src/main/java/org/dspace/scripts/service/ProcessService.java b/dspace-api/src/main/java/org/dspace/scripts/service/ProcessService.java index 80a6ec932b..27c0c75a35 100644 --- a/dspace-api/src/main/java/org/dspace/scripts/service/ProcessService.java +++ b/dspace-api/src/main/java/org/dspace/scripts/service/ProcessService.java @@ -18,6 +18,8 @@ import org.dspace.core.Context; import org.dspace.eperson.EPerson; import org.dspace.scripts.DSpaceCommandLineParameter; import org.dspace.scripts.Process; +import org.dspace.scripts.ProcessLogLevel; +import org.dspace.scripts.ProcessQueryParameterContainer; /** * An interface for the ProcessService with methods regarding the Process workload @@ -189,4 +191,48 @@ public interface ProcessService { */ public List getFileTypesForProcessBitstreams(Context context, Process process); + /** + * Returns a list of all Processes in the database which match the given field requirements. If the + * requirements are not null, they will be combined with an AND operation. + * @param context The relevant DSpace context + * @param processQueryParameterContainer The {@link ProcessQueryParameterContainer} containing all the values + * that the returned {@link Process} objects must adhere to + * @param limit The limit for the amount of Processes returned + * @param offset The offset for the Processes to be returned + * @return The list of all Processes which match the metadata requirements + * @throws SQLException If something goes wrong + */ + List search(Context context, ProcessQueryParameterContainer processQueryParameterContainer, int limit, + int offset) throws SQLException; + + /** + * Count all the processes which match the requirements. The requirements are evaluated like the search + * method. 
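A hedged sketch of how a caller could use the new process search support described here; the parameter key and script name are illustrative assumptions:

```java
import java.sql.SQLException;
import java.util.List;

import org.dspace.core.Context;
import org.dspace.scripts.Process;
import org.dspace.scripts.ProcessQueryParameterContainer;
import org.dspace.scripts.service.ProcessService;

public class ProcessSearchSketch {
    public static List<Process> findRecent(ProcessService processService, Context context) throws SQLException {
        ProcessQueryParameterContainer container = new ProcessQueryParameterContainer();
        container.addToQueryParameterMap("scriptName", "metadata-import"); // hypothetical field/value
        container.setSortProperty("startTime");
        container.setSortOrder("desc");
        // First page of at most 20 matching processes
        return processService.search(context, container, 20, 0);
    }
}
```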
+ * @param context The relevant DSpace context + * @param processQueryParameterContainer The {@link ProcessQueryParameterContainer} containing all the values + * that the returned {@link Process} objects must adhere to + * @return The number of results matching the query + * @throws SQLException If something goes wrong + */ + int countSearch(Context context, ProcessQueryParameterContainer processQueryParameterContainer) throws SQLException; + /** + * This method will append the given output to the {@link Process} its logs + * @param processId The ID of the {@link Process} to append the log for + * @param scriptName The name of the Script that Process runs + * @param output The output to append + * @param processLogLevel The loglevel of the output + * @throws IOException If something goes wrong + */ + void appendLog(int processId, String scriptName, String output, ProcessLogLevel processLogLevel) throws IOException; + + /** + * This method will create a {@link Bitstream} containing the logs for the given {@link Process} + * @param context The relevant DSpace context + * @param process The {@link Process} for which we're making the {@link Bitstream} + * @throws IOException If something goes wrong + * @throws SQLException If something goes wrong + * @throws AuthorizeException If something goes wrong + */ + void createLogBitstream(Context context, Process process) + throws IOException, SQLException, AuthorizeException; } diff --git a/dspace-api/src/main/java/org/dspace/statistics/SolrLoggerServiceImpl.java b/dspace-api/src/main/java/org/dspace/statistics/SolrLoggerServiceImpl.java index e1ff0c69b8..cd46f8dc8a 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/SolrLoggerServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/statistics/SolrLoggerServiceImpl.java @@ -252,8 +252,11 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea solr.add(doc1); - //commits are executed automatically using the solr autocommit -// solr.commit(false, false); + // commits are executed automatically using the solr autocommit + boolean useAutoCommit = configurationService.getBooleanProperty("solr-statistics.autoCommit", true); + if (!useAutoCommit) { + solr.commit(false, false); + } } catch (RuntimeException re) { throw re; @@ -289,7 +292,10 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea solr.add(doc1); // commits are executed automatically using the solr autocommit - // solr.commit(false, false); + boolean useAutoCommit = configurationService.getBooleanProperty("solr-statistics.autoCommit", true); + if (!useAutoCommit) { + solr.commit(false, false); + } } catch (RuntimeException re) { throw re; @@ -842,18 +848,18 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea } @Override - public void query(String query, int max) + public void query(String query, int max, int facetMinCount) throws SolrServerException, IOException { - query(query, null, null, 0, max, null, null, null, null, null, false); + query(query, null, null, 0, max, null, null, null, null, null, false, facetMinCount); } @Override public ObjectCount[] queryFacetField(String query, String filterQuery, String facetField, int max, boolean showTotal, - List facetQueries) + List facetQueries, int facetMinCount) throws SolrServerException, IOException { QueryResponse queryResponse = query(query, filterQuery, facetField, - 0, max, null, null, null, facetQueries, null, false); + 0, max, null, null, null, facetQueries, null, false, facetMinCount); if 
(queryResponse == null) { return new ObjectCount[0]; } @@ -887,50 +893,55 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea @Override public ObjectCount[] queryFacetDate(String query, String filterQuery, int max, String dateType, String dateStart, - String dateEnd, boolean showTotal, Context context) + String dateEnd, boolean showTotal, Context context, int facetMinCount) throws SolrServerException, IOException { QueryResponse queryResponse = query(query, filterQuery, null, 0, max, - dateType, dateStart, dateEnd, null, null, false); + dateType, dateStart, dateEnd, null, null, false, facetMinCount); if (queryResponse == null) { return new ObjectCount[0]; } - FacetField dateFacet = queryResponse.getFacetDate("time"); - // TODO: check if this cannot crash I checked it, it crashed!!! - // Create an array for our result - ObjectCount[] result = new ObjectCount[dateFacet.getValueCount() - + (showTotal ? 1 : 0)]; - // Run over our datefacet & store all the values - for (int i = 0; i < dateFacet.getValues().size(); i++) { - FacetField.Count dateCount = dateFacet.getValues().get(i); - result[i] = new ObjectCount(); - result[i].setCount(dateCount.getCount()); - result[i].setValue(getDateView(dateCount.getName(), dateType, context)); + List rangeFacets = queryResponse.getFacetRanges(); + for (RangeFacet rangeFacet: rangeFacets) { + if (rangeFacet.getName().equalsIgnoreCase("time")) { + RangeFacet timeFacet = rangeFacet; + // Create an array for our result + ObjectCount[] result = new ObjectCount[timeFacet.getCounts().size() + + (showTotal ? 1 : 0)]; + // Run over our datefacet & store all the values + for (int i = 0; i < timeFacet.getCounts().size(); i++) { + RangeFacet.Count dateCount = (RangeFacet.Count) timeFacet.getCounts().get(i); + result[i] = new ObjectCount(); + result[i].setCount(dateCount.getCount()); + result[i].setValue(getDateView(dateCount.getValue(), dateType, context)); + } + if (showTotal) { + result[result.length - 1] = new ObjectCount(); + result[result.length - 1].setCount(queryResponse.getResults() + .getNumFound()); + // TODO: Make sure that this total is gotten out of the msgs.xml + result[result.length - 1].setValue("total"); + } + return result; + } } - if (showTotal) { - result[result.length - 1] = new ObjectCount(); - result[result.length - 1].setCount(queryResponse.getResults() - .getNumFound()); - // TODO: Make sure that this total is gotten out of the msgs.xml - result[result.length - 1].setValue("total"); - } - return result; + return new ObjectCount[0]; } @Override - public Map queryFacetQuery(String query, - String filterQuery, List facetQueries) + public Map queryFacetQuery(String query, String filterQuery, List facetQueries, + int facetMinCount) throws SolrServerException, IOException { QueryResponse response = query(query, filterQuery, null, 0, 1, null, null, - null, facetQueries, null, false); + null, facetQueries, null, false, facetMinCount); return response.getFacetQuery(); } @Override - public ObjectCount queryTotal(String query, String filterQuery) + public ObjectCount queryTotal(String query, String filterQuery, int facetMinCount) throws SolrServerException, IOException { QueryResponse queryResponse = query(query, filterQuery, null, 0, -1, null, - null, null, null, null, false); + null, null, null, null, false, facetMinCount); ObjectCount objCount = new ObjectCount(); objCount.setCount(queryResponse.getResults().getNumFound()); @@ -985,7 +996,8 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, 
InitializingBea @Override public QueryResponse query(String query, String filterQuery, String facetField, int rows, int max, String dateType, String dateStart, - String dateEnd, List facetQueries, String sort, boolean ascending) + String dateEnd, List facetQueries, String sort, boolean ascending, + int facetMinCount) throws SolrServerException, IOException { if (solr == null) { return null; @@ -993,20 +1005,20 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea // System.out.println("QUERY"); SolrQuery solrQuery = new SolrQuery().setRows(rows).setQuery(query) - .setFacetMinCount(1); + .setFacetMinCount(facetMinCount); addAdditionalSolrYearCores(solrQuery); // Set the date facet if present if (dateType != null) { - solrQuery.setParam("facet.date", "time") + solrQuery.setParam("facet.range", "time") . // EXAMPLE: NOW/MONTH+1MONTH - setParam("facet.date.end", + setParam("f.time.facet.range.end", "NOW/" + dateType + dateEnd + dateType).setParam( - "facet.date.gap", "+1" + dateType) + "f.time.facet.range.gap", "+1" + dateType) . // EXAMPLE: NOW/MONTH-" + nbMonths + "MONTHS - setParam("facet.date.start", + setParam("f.time.facet.range.start", "NOW/" + dateType + dateStart + dateType + "S") .setFacet(true); } @@ -1555,7 +1567,8 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea * initialization at the same time. */ protected synchronized void initSolrYearCores() { - if (statisticYearCoresInit || !(solr instanceof HttpSolrClient)) { + if (statisticYearCoresInit || !(solr instanceof HttpSolrClient) || !configurationService.getBooleanProperty( + "usage-statistics.shardedByYear", false)) { return; } diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/DatasetTimeGenerator.java b/dspace-api/src/main/java/org/dspace/statistics/content/DatasetTimeGenerator.java index 12c8bab6d3..1152ee669c 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/DatasetTimeGenerator.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/DatasetTimeGenerator.java @@ -31,7 +31,7 @@ public class DatasetTimeGenerator extends DatasetGenerator { /** * Default constructor */ - private DatasetTimeGenerator() { } + public DatasetTimeGenerator() { } /** * Sets the date interval. 
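The queryFacetDate rework above replaces the old facet.date parameters with Solr range faceting. A minimal SolrJ sketch of the equivalent parameter set, with the six-month window chosen purely for illustration:

```java
import org.apache.solr.client.solrj.SolrQuery;

public class TimeRangeFacetSketch {
    public static SolrQuery monthlyTimeFacet(int facetMinCount) {
        // Equivalent of the facet.range parameters the reworked queryFacetDate builds
        SolrQuery solrQuery = new SolrQuery("*:*")
            .setRows(0)
            .setFacetMinCount(facetMinCount);
        solrQuery.setParam("facet.range", "time");
        solrQuery.setParam("f.time.facet.range.start", "NOW/MONTH-6MONTHS");
        solrQuery.setParam("f.time.facet.range.end", "NOW/MONTH+1MONTH");
        solrQuery.setParam("f.time.facet.range.gap", "+1MONTH");
        solrQuery.setFacet(true);
        return solrQuery;
    }
}
```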
diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsBSAdapter.java b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsBSAdapter.java index ec6aecde98..7fc2167e05 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsBSAdapter.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsBSAdapter.java @@ -68,11 +68,11 @@ public class StatisticsBSAdapter { switch (visitType) { case ITEM_VISITS: return solrLoggerService - .queryTotal("type: " + Constants.ITEM + " AND id: " + item.getID(), resolveFilterQueries()) + .queryTotal("type: " + Constants.ITEM + " AND id: " + item.getID(), resolveFilterQueries(), 0) .getCount(); case BITSTREAM_VISITS: return solrLoggerService.queryTotal("type: " + Constants.BITSTREAM + " AND owningItem: " + item.getID(), - resolveFilterQueries()).getCount(); + resolveFilterQueries(), 0).getCount(); case TOTAL_VISITS: return getNumberOfVisits(ITEM_VISITS, item) + getNumberOfVisits(BITSTREAM_VISITS, item); default: diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsData.java b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsData.java index 1b09859362..9e307ecb40 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsData.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsData.java @@ -115,13 +115,14 @@ public abstract class StatisticsData { * Run the accumulated query and return its results. * * @param context The relevant DSpace Context. + * @param facetMinCount Minimum count of results facet must have to return a result * @return accumulated query results * @throws SQLException An exception that provides information on a database access error or other errors. * @throws SolrServerException Exception from the Solr server to the solrj Java client. * @throws IOException A general class of exceptions produced by failed or interrupted I/O operations. * @throws ParseException if the dataset cannot be parsed */ - public abstract Dataset createDataset(Context context) throws SQLException, + public abstract Dataset createDataset(Context context, int facetMinCount) throws SQLException, SolrServerException, IOException, ParseException; } diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataSearches.java b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataSearches.java index 662108c1d7..b8c2a63c84 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataSearches.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataSearches.java @@ -50,7 +50,7 @@ public class StatisticsDataSearches extends StatisticsData { @Override - public Dataset createDataset(Context context) + public Dataset createDataset(Context context, int facetMinCount) throws SQLException, SolrServerException, IOException, ParseException { // Check if we already have one. // If we do then give it back. 
@@ -85,7 +85,7 @@ public class StatisticsDataSearches extends StatisticsData { ObjectCount[] topCounts = solrLoggerService .queryFacetField(query, fqBuffer.toString(), typeGenerator.getType(), typeGenerator.getMax(), - (typeGenerator.isPercentage() || typeGenerator.isIncludeTotal()), null); + (typeGenerator.isPercentage() || typeGenerator.isIncludeTotal()), null, 0); long totalCount = -1; if (typeGenerator.isPercentage() && 0 < topCounts.length) { //Retrieve the total required to calculate the percentage @@ -133,14 +133,15 @@ public class StatisticsDataSearches extends StatisticsData { queryString = "\"\""; } - ObjectCount totalPageViews = getTotalPageViews("query:" + queryString, defaultFilterQuery); + ObjectCount totalPageViews = getTotalPageViews("query:" + queryString, defaultFilterQuery + , facetMinCount); dataset.addValueToMatrix(i, 3, pageViewFormat .format((float) totalPageViews.getCount() / queryCount.getCount())); } } } else if (typeGenerator.getMode() == DatasetSearchGenerator.Mode.SEARCH_OVERVIEW_TOTAL) { //Retrieve the total counts ! - ObjectCount totalCount = solrLoggerService.queryTotal(query, getSearchFilterQuery()); + ObjectCount totalCount = solrLoggerService.queryTotal(query, getSearchFilterQuery(), facetMinCount); //Retrieve the filtered count by using the default filter query StringBuilder fqBuffer = new StringBuilder(defaultFilterQuery); @@ -149,7 +150,7 @@ public class StatisticsDataSearches extends StatisticsData { } fqBuffer.append(getSearchFilterQuery()); - ObjectCount totalFiltered = solrLoggerService.queryTotal(query, fqBuffer.toString()); + ObjectCount totalFiltered = solrLoggerService.queryTotal(query, fqBuffer.toString(), facetMinCount); fqBuffer = new StringBuilder(defaultFilterQuery); @@ -159,7 +160,7 @@ public class StatisticsDataSearches extends StatisticsData { fqBuffer.append("statistics_type:") .append(SolrLoggerServiceImpl.StatisticsType.SEARCH_RESULT.text()); - ObjectCount totalPageViews = getTotalPageViews(query, defaultFilterQuery); + ObjectCount totalPageViews = getTotalPageViews(query, defaultFilterQuery, facetMinCount); dataset = new Dataset(1, 3); dataset.setRowLabel(0, ""); @@ -221,7 +222,7 @@ public class StatisticsDataSearches extends StatisticsData { return query; } - protected ObjectCount getTotalPageViews(String query, String defaultFilterQuery) + protected ObjectCount getTotalPageViews(String query, String defaultFilterQuery, int facetMinCount) throws SolrServerException, IOException { StringBuilder fqBuffer; fqBuffer = new StringBuilder(defaultFilterQuery); @@ -232,7 +233,7 @@ public class StatisticsDataSearches extends StatisticsData { //Retrieve the number of page views by this query ! - return solrLoggerService.queryTotal(query, fqBuffer.toString()); + return solrLoggerService.queryTotal(query, fqBuffer.toString(), facetMinCount); } /** diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataVisits.java b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataVisits.java index 7ad9e9cf88..9010edacf3 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataVisits.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataVisits.java @@ -58,7 +58,7 @@ import org.dspace.statistics.util.LocationUtils; *
 *   <li>Add a {@link DatasetDSpaceObjectGenerator} for the appropriate object type.</li>
 *   <li>Add other generators as required to get the statistic you want.</li>
 *   <li>Add {@link org.dspace.statistics.content.filter filters} as required.</li>
- *   <li>{@link #createDataset(Context)} will run the query and return a result matrix.
+ *   <li>{@link #createDataset(Context, int)} will run the query and return a result matrix.
 *       Subsequent calls skip the query and return the same matrix.</li>
  • * * @@ -117,7 +117,7 @@ public class StatisticsDataVisits extends StatisticsData { } @Override - public Dataset createDataset(Context context) throws SQLException, + public Dataset createDataset(Context context, int facetMinCount) throws SQLException, SolrServerException, ParseException, IOException { // Check if we already have one. // If we do then give it back. @@ -214,7 +214,8 @@ public class StatisticsDataVisits extends StatisticsData { // We are asking from our current query all the visits faceted by date ObjectCount[] results = solrLoggerService .queryFacetDate(query, filterQuery, dataSetQuery.getMax(), dateFacet.getDateType(), - dateFacet.getStartDate(), dateFacet.getEndDate(), showTotal, context); + dateFacet.getStartDate(), dateFacet.getEndDate(), showTotal, context, + facetMinCount); dataset = new Dataset(1, results.length); // Now that we have our results put em in a matrix for (int j = 0; j < results.length; j++) { @@ -230,15 +231,15 @@ public class StatisticsDataVisits extends StatisticsData { // the datasettimequery ObjectCount[] maxObjectCounts = solrLoggerService .queryFacetField(query, filterQuery, dataSetQuery.getFacetField(), dataSetQuery.getMax(), - false, null); + false, null, facetMinCount); for (int j = 0; j < maxObjectCounts.length; j++) { ObjectCount firstCount = maxObjectCounts[j]; String newQuery = dataSetQuery.getFacetField() + ": " + ClientUtils .escapeQueryChars(firstCount.getValue()) + " AND " + query; ObjectCount[] maxDateFacetCounts = solrLoggerService .queryFacetDate(newQuery, filterQuery, dataSetQuery.getMax(), dateFacet.getDateType(), - dateFacet.getStartDate(), dateFacet.getEndDate(), showTotal, context); - + dateFacet.getStartDate(), dateFacet.getEndDate(), showTotal, context, + facetMinCount); // Make sure we have a dataSet if (dataset == null) { @@ -283,7 +284,8 @@ public class StatisticsDataVisits extends StatisticsData { ObjectCount[] topCounts1 = null; // if (firsDataset.getQueries().size() == 1) { - topCounts1 = queryFacetField(firsDataset, firsDataset.getQueries().get(0).getQuery(), filterQuery); + topCounts1 = + queryFacetField(firsDataset, firsDataset.getQueries().get(0).getQuery(), filterQuery, facetMinCount); // } else { // TODO: do this // } @@ -292,7 +294,7 @@ public class StatisticsDataVisits extends StatisticsData { DatasetQuery secondDataSet = datasetQueries.get(1); // Now do the second one ObjectCount[] topCounts2 = queryFacetField(secondDataSet, secondDataSet.getQueries().get(0).getQuery(), - filterQuery); + filterQuery, facetMinCount); // Now that have results for both of them lets do x.y queries List facetQueries = new ArrayList(); for (ObjectCount count2 : topCounts2) { @@ -325,7 +327,7 @@ public class StatisticsDataVisits extends StatisticsData { } Map facetResult = solrLoggerService - .queryFacetQuery(query, filterQuery, facetQueries); + .queryFacetQuery(query, filterQuery, facetQueries, facetMinCount); // TODO: the show total @@ -671,7 +673,7 @@ public class StatisticsDataVisits extends StatisticsData { case Constants.ITEM: Item item = itemService.findByIdOrLegacyId(context, dsoId); - if (item == null) { + if (item == null || item.getHandle() == null) { break; } @@ -680,7 +682,7 @@ public class StatisticsDataVisits extends StatisticsData { case Constants.COLLECTION: Collection coll = collectionService.findByIdOrLegacyId(context, dsoId); - if (coll == null) { + if (coll == null || coll.getHandle() == null) { break; } @@ -689,7 +691,7 @@ public class StatisticsDataVisits extends StatisticsData { case Constants.COMMUNITY: 
Community comm = communityService.findByIdOrLegacyId(context, dsoId); - if (comm == null) { + if (comm == null || comm.getHandle() == null) { break; } @@ -704,12 +706,12 @@ public class StatisticsDataVisits extends StatisticsData { protected ObjectCount[] queryFacetField(DatasetQuery dataset, String query, - String filterQuery) + String filterQuery, int facetMinCount) throws SolrServerException, IOException { String facetType = dataset.getFacetField() == null ? "id" : dataset .getFacetField(); return solrLoggerService.queryFacetField(query, filterQuery, facetType, - dataset.getMax(), false, null); + dataset.getMax(), false, null, facetMinCount); } public static class DatasetQuery { diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataWorkflow.java b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataWorkflow.java index 7d3a7ff37a..409b79cb69 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataWorkflow.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDataWorkflow.java @@ -65,7 +65,7 @@ public class StatisticsDataWorkflow extends StatisticsData { @Override - public Dataset createDataset(Context context) + public Dataset createDataset(Context context, int facetMinCount) throws SQLException, SolrServerException, IOException, ParseException { // Check if we already have one. // If we do then give it back. @@ -92,16 +92,16 @@ public class StatisticsDataWorkflow extends StatisticsData { DatasetTypeGenerator typeGenerator = (DatasetTypeGenerator) datasetGenerator; ObjectCount[] topCounts = solrLoggerService .queryFacetField(query, defaultFilterQuery, typeGenerator.getType(), typeGenerator.getMax(), - typeGenerator.isIncludeTotal(), null); + typeGenerator.isIncludeTotal(), null, facetMinCount); //Retrieve our total field counts Map totalFieldCounts = new HashMap(); if (averageMonths != -1) { - totalFieldCounts = getTotalFacetCounts(typeGenerator); + totalFieldCounts = getTotalFacetCounts(typeGenerator, facetMinCount); } long monthDifference = 1; - if (getOldestWorkflowItemDate() != null) { - monthDifference = getMonthsDifference(new Date(), getOldestWorkflowItemDate()); + if (getOldestWorkflowItemDate(facetMinCount) != null) { + monthDifference = getMonthsDifference(new Date(), getOldestWorkflowItemDate(facetMinCount)); } dataset = new Dataset(topCounts.length, (averageMonths != -1 ? 3 : 2)); @@ -168,10 +168,10 @@ public class StatisticsDataWorkflow extends StatisticsData { * @throws org.apache.solr.client.solrj.SolrServerException passed through. * @throws java.io.IOException passed through. 
*/ - protected Map getTotalFacetCounts(DatasetTypeGenerator typeGenerator) + protected Map getTotalFacetCounts(DatasetTypeGenerator typeGenerator, int facetMinCount) throws SolrServerException, IOException { ObjectCount[] objectCounts = solrLoggerService - .queryFacetField(getQuery(), null, typeGenerator.getType(), -1, false, null); + .queryFacetField(getQuery(), null, typeGenerator.getType(), -1, false, null, facetMinCount); Map result = new HashMap<>(); for (ObjectCount objectCount : objectCounts) { result.put(objectCount.getValue(), objectCount.getCount()); @@ -179,14 +179,14 @@ public class StatisticsDataWorkflow extends StatisticsData { return result; } - protected Date getOldestWorkflowItemDate() + protected Date getOldestWorkflowItemDate(int facetMinCount) throws SolrServerException, IOException { ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); String workflowStartDate = configurationService.getProperty("usage-statistics.workflow-start-date"); if (workflowStartDate == null) { //Query our solr for it ! QueryResponse oldestRecord = solrLoggerService - .query(getQuery(), null, null, 1, 0, null, null, null, null, "time", true); + .query(getQuery(), null, null, 1, 0, null, null, null, null, "time", true, facetMinCount); if (0 < oldestRecord.getResults().getNumFound()) { SolrDocument solrDocument = oldestRecord.getResults().get(0); Date oldestDate = (Date) solrDocument.getFieldValue("time"); diff --git a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDisplay.java b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDisplay.java index 9bd54c189f..a1058c907f 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDisplay.java +++ b/dspace-api/src/main/java/org/dspace/statistics/content/StatisticsDisplay.java @@ -83,8 +83,9 @@ public abstract class StatisticsDisplay { return statisticsData.getDataset(); } - public Dataset getDataset(Context context) throws SQLException, SolrServerException, IOException, ParseException { - return statisticsData.createDataset(context); + public Dataset getDataset(Context context, int facetMinCount) throws SQLException, SolrServerException, IOException, + ParseException { + return statisticsData.createDataset(context, facetMinCount); } public void addCss(String style) { diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/FailedOpenURLTrackerServiceImpl.java b/dspace-api/src/main/java/org/dspace/statistics/export/FailedOpenURLTrackerServiceImpl.java new file mode 100644 index 0000000000..cb8e64cc65 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/FailedOpenURLTrackerServiceImpl.java @@ -0,0 +1,59 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import java.sql.SQLException; +import java.util.List; + +import org.dspace.core.Context; +import org.dspace.statistics.export.dao.OpenURLTrackerDAO; +import org.dspace.statistics.export.service.FailedOpenURLTrackerService; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Implementation of the service that handles the OpenURLTracker database operations + */ +public class FailedOpenURLTrackerServiceImpl implements FailedOpenURLTrackerService { + + @Autowired(required = true) + protected OpenURLTrackerDAO openURLTrackerDAO; + + /** + 
* Removes an OpenURLTracker from the database + * @param context + * @param openURLTracker + * @throws SQLException + */ + @Override + public void remove(Context context, OpenURLTracker openURLTracker) throws SQLException { + openURLTrackerDAO.delete(context, openURLTracker); + } + + /** + * Returns all OpenURLTrackers from the database + * @param context + * @return all OpenURLTrackers + * @throws SQLException + */ + @Override + public List findAll(Context context) throws SQLException { + return openURLTrackerDAO.findAll(context, OpenURLTracker.class); + } + + /** + * Creates a new OpenURLTracker + * @param context + * @return the creatred OpenURLTracker + * @throws SQLException + */ + @Override + public OpenURLTracker create(Context context) throws SQLException { + OpenURLTracker openURLTracker = openURLTrackerDAO.create(context, new OpenURLTracker()); + return openURLTracker; + } +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/IrusExportUsageEventListener.java b/dspace-api/src/main/java/org/dspace/statistics/export/IrusExportUsageEventListener.java new file mode 100644 index 0000000000..a8101c51de --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/IrusExportUsageEventListener.java @@ -0,0 +1,74 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import java.util.UUID; + +import org.apache.log4j.Logger; +import org.dspace.content.Bitstream; +import org.dspace.content.Item; +import org.dspace.core.Context; +import org.dspace.core.LogManager; +import org.dspace.services.ConfigurationService; +import org.dspace.services.model.Event; +import org.dspace.statistics.export.processor.BitstreamEventProcessor; +import org.dspace.statistics.export.processor.ItemEventProcessor; +import org.dspace.usage.AbstractUsageEventListener; +import org.dspace.usage.UsageEvent; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Class to receive usage events and send corresponding data to IRUS + */ +public class IrusExportUsageEventListener extends AbstractUsageEventListener { + /* Log4j logger*/ + private static Logger log = Logger.getLogger(IrusExportUsageEventListener.class); + + @Autowired + ConfigurationService configurationService; + + /** + * Receives an event and processes to create a URL to send to IRUS when certain conditions are met + * + * @param event includes all the information related to the event that occurred + */ + public void receiveEvent(Event event) { + if (configurationService.getBooleanProperty("irus.statistics.tracker.enabled", false)) { + if (event instanceof UsageEvent) { + UsageEvent ue = (UsageEvent) event; + Context context = ue.getContext(); + + try { + //Check for item investigation + if (ue.getObject() instanceof Item) { + ItemEventProcessor itemEventProcessor = new ItemEventProcessor(context, ue.getRequest(), + (Item) ue.getObject()); + itemEventProcessor.processEvent(); + } else if (ue.getObject() instanceof Bitstream) { + + BitstreamEventProcessor bitstreamEventProcessor = + new BitstreamEventProcessor(context, ue.getRequest(), (Bitstream) ue.getObject()); + bitstreamEventProcessor.processEvent(); + } + } catch (Exception e) { + UUID id; + id = ue.getObject().getID(); + + int type; + try { + type = ue.getObject().getType(); + } catch (Exception e1) { + type = -1; + } + 
log.error(LogManager.getHeader(ue.getContext(), "Error while processing export of use event", + "Id: " + id + " type: " + type), e); + } + } + } + } +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/OpenURLTracker.java b/dspace-api/src/main/java/org/dspace/statistics/export/OpenURLTracker.java new file mode 100644 index 0000000000..b853f255e8 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/OpenURLTracker.java @@ -0,0 +1,121 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import java.util.Date; +import javax.persistence.Column; +import javax.persistence.Entity; +import javax.persistence.GeneratedValue; +import javax.persistence.GenerationType; +import javax.persistence.Id; +import javax.persistence.SequenceGenerator; +import javax.persistence.Table; +import javax.persistence.Temporal; +import javax.persistence.TemporalType; + +import org.dspace.core.ReloadableEntity; +import org.hibernate.proxy.HibernateProxyHelper; + +/** + * Class that represents an OpenURLTracker which tracks a failed transmission to IRUS + */ +@Entity +@Table(name = "OpenUrlTracker") +public class OpenURLTracker implements ReloadableEntity { + + @Id + @Column(name = "tracker_id") + @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "openurltracker_seq") + @SequenceGenerator(name = "openurltracker_seq", sequenceName = "openurltracker_seq", allocationSize = 1) + private Integer id; + + @Column(name = "tracker_url", length = 1000) + private String url; + + @Column(name = "uploaddate") + @Temporal(TemporalType.DATE) + private Date uploadDate; + + protected OpenURLTracker() { + } + + /** + * Gets the OpenURLTracker id + * @return the id + */ + @Override + public Integer getID() { + return id; + } + + /** + * Gets the OpenURLTracker url + * @return the url + */ + public String getUrl() { + return url; + } + + /** + * Sets the OpenURLTracker url + * @param url + */ + public void setUrl(String url) { + this.url = url; + } + + /** + * Returns the upload date + * @return upload date + */ + public Date getUploadDate() { + return uploadDate; + } + + /** + * Set the upload date + * @param uploadDate + */ + public void setUploadDate(Date uploadDate) { + this.uploadDate = uploadDate; + } + + /** + * Determines whether two objects of this class are equal by comparing the ID + * @param o - object to compare + * @return whether the objects are equal + */ + @Override + public boolean equals(Object o) { + if (this == o) { + return true; + } + Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(o); + if (getClass() != objClass) { + return false; + } + + final OpenURLTracker that = (OpenURLTracker) o; + if (!this.getID().equals(that.getID())) { + return false; + } + + return true; + } + + /** + * Returns the hash code value for the object + * @return hash code + */ + @Override + public int hashCode() { + int hash = 8; + hash = 74 * hash + this.getID(); + return hash; + } +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/RetryFailedOpenUrlTracker.java b/dspace-api/src/main/java/org/dspace/statistics/export/RetryFailedOpenUrlTracker.java new file mode 100644 index 0000000000..6b1bea0de1 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/RetryFailedOpenUrlTracker.java @@ -0,0 +1,84 @@ +/** + * The contents of 
this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import org.apache.commons.cli.ParseException; +import org.apache.commons.lang3.StringUtils; +import org.dspace.core.Context; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.statistics.export.factory.OpenURLTrackerLoggerServiceFactory; +import org.dspace.statistics.export.service.OpenUrlService; +import org.dspace.utils.DSpace; + +/** + * Script to retry the failed url transmissions to IRUS + * This script also has an option to add new failed urls for testing purposes + */ +public class RetryFailedOpenUrlTracker extends DSpaceRunnable { + + private String lineToAdd = null; + private boolean help = false; + private boolean retryFailed = false; + + private OpenUrlService openUrlService; + + /** + * Run the script + * When the -a option is used, a new "failed" url will be added to the database + * + * @throws Exception + */ + public void internalRun() throws Exception { + if (help) { + printHelp(); + return; + } + Context context = new Context(); + context.turnOffAuthorisationSystem(); + + if (StringUtils.isNotBlank(lineToAdd)) { + openUrlService.logfailed(context, lineToAdd); + handler.logInfo("Created dummy entry in OpenUrlTracker with URL: " + lineToAdd); + } + if (retryFailed) { + handler.logInfo("Reprocessing failed URLs stored in the db"); + openUrlService.reprocessFailedQueue(context); + } + context.restoreAuthSystemState(); + context.complete(); + } + + public RetryFailedOpenUrlTrackerScriptConfiguration getScriptConfiguration() { + return new DSpace().getServiceManager().getServiceByName("retry-tracker", + RetryFailedOpenUrlTrackerScriptConfiguration.class); + } + + /** + * Setups the parameters + * + * @throws ParseException + */ + public void setup() throws ParseException { + openUrlService = OpenURLTrackerLoggerServiceFactory.getInstance().getOpenUrlService(); + + if (!(commandLine.hasOption('a') || commandLine.hasOption('r') || commandLine.hasOption('h'))) { + throw new ParseException("At least one of the parameters (-a, -r, -h) is required!"); + } + + if (commandLine.hasOption('h')) { + help = true; + } + if (commandLine.hasOption('a')) { + lineToAdd = commandLine.getOptionValue('a'); + } + if (commandLine.hasOption('r')) { + retryFailed = true; + } + } + +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/RetryFailedOpenUrlTrackerScriptConfiguration.java b/dspace-api/src/main/java/org/dspace/statistics/export/RetryFailedOpenUrlTrackerScriptConfiguration.java new file mode 100644 index 0000000000..b5d65aa4e5 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/RetryFailedOpenUrlTrackerScriptConfiguration.java @@ -0,0 +1,74 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import java.sql.SQLException; + +import org.apache.commons.cli.Options; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.core.Context; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * The {@link ScriptConfiguration} for the {@link RetryFailedOpenUrlTracker} script + */ +public class 
RetryFailedOpenUrlTrackerScriptConfiguration + extends ScriptConfiguration { + + @Autowired + private AuthorizeService authorizeService; + + private Class dspaceRunnableClass; + + @Override + public Class getDspaceRunnableClass() { + return dspaceRunnableClass; + } + + /** + * Generic setter for the dspaceRunnableClass + * + * @param dspaceRunnableClass The dspaceRunnableClass to be set on this RetryFailedOpenUrlTrackerScriptConfiguration + */ + @Override + public void setDspaceRunnableClass(Class dspaceRunnableClass) { + this.dspaceRunnableClass = dspaceRunnableClass; + } + + @Override + public boolean isAllowedToExecute(Context context) { + try { + return authorizeService.isAdmin(context); + } catch (SQLException e) { + throw new RuntimeException("SQLException occurred when checking if the current user is an admin", e); + } + } + + @Override + public Options getOptions() { + if (options == null) { + Options options = new Options(); + + options.addOption("a", true, "Add a new \"failed\" row to the table with a url (test purposes only)"); + options.getOption("a").setType(String.class); + + options.addOption("r", false, + "Retry sending requests to all urls stored in the table with failed requests. " + + "This includes the url that can be added through the -a option."); + options.getOption("r").setType(boolean.class); + + options.addOption("h", "help", false, "print this help message"); + options.getOption("h").setType(boolean.class); + + super.options = options; + } + return options; + } + +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/dao/OpenURLTrackerDAO.java b/dspace-api/src/main/java/org/dspace/statistics/export/dao/OpenURLTrackerDAO.java new file mode 100644 index 0000000000..e3b957db1d --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/dao/OpenURLTrackerDAO.java @@ -0,0 +1,21 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.dao; + +import org.dspace.core.GenericDAO; +import org.dspace.statistics.export.OpenURLTracker; + +/** + * Database Access Object interface class for the OpenURLTracker object. + * The implementation of this class is responsible for all database calls for the OpenURLTracker object and is + * autowired by spring + * This class should only be accessed from a single service and should never be exposed outside of the API + */ +public interface OpenURLTrackerDAO extends GenericDAO { + +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/dao/impl/OpenURLTrackerDAOImpl.java b/dspace-api/src/main/java/org/dspace/statistics/export/dao/impl/OpenURLTrackerDAOImpl.java new file mode 100644 index 0000000000..d057f45bac --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/dao/impl/OpenURLTrackerDAOImpl.java @@ -0,0 +1,26 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.dao.impl; + +import org.dspace.core.AbstractHibernateDAO; +import org.dspace.statistics.export.OpenURLTracker; +import org.dspace.statistics.export.dao.OpenURLTrackerDAO; + +/** + * Hibernate implementation of the Database Access Object interface class for the OpenURLTracker object. 
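[Editorial note, not part of the patch] Assuming the "retry-tracker" bean name used in getScriptConfiguration() is exposed through the DSpace command-line launcher in the usual way, the script would typically be invoked as [dspace]/bin/dspace retry-tracker -r to resend every stored failed URL, or [dspace]/bin/dspace retry-tracker -a <url> to seed a dummy failed entry for testing; exact wiring depends on the script/process configuration of the installation.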
+ * This class is responsible for all database calls for the OpenURLTracker object and is autowired by spring + * This class should never be accessed directly. + * + */ +public class OpenURLTrackerDAOImpl extends AbstractHibernateDAO implements OpenURLTrackerDAO { + + protected OpenURLTrackerDAOImpl() { + super(); + } + +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/factory/OpenURLTrackerLoggerServiceFactory.java b/dspace-api/src/main/java/org/dspace/statistics/export/factory/OpenURLTrackerLoggerServiceFactory.java new file mode 100644 index 0000000000..b31b076f68 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/factory/OpenURLTrackerLoggerServiceFactory.java @@ -0,0 +1,41 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.factory; + +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.statistics.export.service.FailedOpenURLTrackerService; +import org.dspace.statistics.export.service.OpenUrlService; + +/** + * The service factory for the OpenUrlTracker related services + */ +public abstract class OpenURLTrackerLoggerServiceFactory { + + /** + * Returns the FailedOpenURLTrackerService + * @return FailedOpenURLTrackerService instance + */ + public abstract FailedOpenURLTrackerService getOpenUrlTrackerLoggerService(); + + /** + * Retrieve the OpenURLTrackerLoggerServiceFactory + * @return OpenURLTrackerLoggerServiceFactory instance + */ + public static OpenURLTrackerLoggerServiceFactory getInstance() { + return DSpaceServicesFactory.getInstance().getServiceManager() + .getServiceByName("openURLTrackerLoggerServiceFactory", + OpenURLTrackerLoggerServiceFactory.class); + + } + + /** + * Returns the OpenUrlService + * @return OpenUrlService instance + */ + public abstract OpenUrlService getOpenUrlService(); +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/factory/OpenURLTrackerLoggerServiceFactoryImpl.java b/dspace-api/src/main/java/org/dspace/statistics/export/factory/OpenURLTrackerLoggerServiceFactoryImpl.java new file mode 100644 index 0000000000..f585fdf376 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/factory/OpenURLTrackerLoggerServiceFactoryImpl.java @@ -0,0 +1,42 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.factory; + +import org.dspace.statistics.export.service.FailedOpenURLTrackerService; +import org.dspace.statistics.export.service.OpenUrlService; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * The service factory implementation for the OpenUrlTracker related services + */ +public class OpenURLTrackerLoggerServiceFactoryImpl extends OpenURLTrackerLoggerServiceFactory { + + @Autowired(required = true) + private FailedOpenURLTrackerService failedOpenURLTrackerService; + + @Autowired(required = true) + private OpenUrlService openUrlService; + + /** + * Returns the FailedOpenURLTrackerService + * @return FailedOpenURLTrackerService instance + */ + @Override + public FailedOpenURLTrackerService getOpenUrlTrackerLoggerService() { + return failedOpenURLTrackerService; + } + + /** + * Returns the OpenUrlService + * 
@return OpenUrlService instance + */ + @Override + public OpenUrlService getOpenUrlService() { + return openUrlService; + } +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/processor/BitstreamEventProcessor.java b/dspace-api/src/main/java/org/dspace/statistics/export/processor/BitstreamEventProcessor.java new file mode 100644 index 0000000000..85cb7bc14c --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/processor/BitstreamEventProcessor.java @@ -0,0 +1,129 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.processor; + +import java.io.IOException; +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; +import java.sql.SQLException; +import javax.servlet.http.HttpServletRequest; + +import org.dspace.content.Bitstream; +import org.dspace.content.Bundle; +import org.dspace.content.Item; +import org.dspace.core.Context; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.statistics.util.SpiderDetector; + +/** + * Processor that handles Bitstream events from the IrusExportUsageEventListener + */ +public class BitstreamEventProcessor extends ExportEventProcessor { + + private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + + + private Item item; + private Bitstream bitstream; + + /** + * Creates a new BitstreamEventProcessor that will set the params and obtain the parent item of the bitstream + * + * @param context + * @param request + * @param bitstream + * @throws SQLException + */ + public BitstreamEventProcessor(Context context, HttpServletRequest request, Bitstream bitstream) + throws SQLException { + super(context, request); + this.bitstream = bitstream; + this.item = getItem(request); + } + + /** + * Returns the parent item of the bitsream + * + * @return parent item of the bitstream + * @throws SQLException + */ + private Item getItem(HttpServletRequest request) throws SQLException { + if (0 < bitstream.getBundles().size()) { + if (!SpiderDetector.isSpider(request)) { + Bundle bundle = bitstream.getBundles().get(0); + if (bundle.getName() == null || !bundle.getName().equals("ORIGINAL")) { + return null; + } + + if (0 < bundle.getItems().size()) { + Item item = bundle.getItems().get(0); + return item; + } + } + } + return null; + } + + /** + * Process the event + * Check if the item should be processed + * Create the url to be transmitted based on item and bitstream data + * + * @throws SQLException + * @throws IOException + */ + public void processEvent() throws SQLException, IOException { + if (shouldProcessItem(item)) { + String baseParam = getBaseParameters(item); + String fullParam = addObjectSpecificData(baseParam, bitstream); + processObject(fullParam); + } + } + + /** + * Adds additional item and bitstream data to the url + * + * @param string to which the additional data needs to be added + * @param bitstream + * @return the string with additional data + * @throws UnsupportedEncodingException + */ + protected String addObjectSpecificData(final String string, Bitstream bitstream) + throws UnsupportedEncodingException { + StringBuilder data = new StringBuilder(string); + + String bitstreamInfo = getBitstreamInfo(bitstream); + 
data.append("&").append(URLEncoder.encode("svc_dat", UTF_8)).append("=") + .append(URLEncoder.encode(bitstreamInfo, UTF_8)); + data.append("&").append(URLEncoder.encode("rft_dat", UTF_8)).append("=") + .append(URLEncoder.encode(BITSTREAM_DOWNLOAD, UTF_8)); + + return data.toString(); + } + + /** + * Get Bitstream info used for the url + * + * @param bitstream + * @return bitstream info + */ + private String getBitstreamInfo(final Bitstream bitstream) { + + String dspaceRestUrl = configurationService.getProperty("dspace.server.url"); + + StringBuilder sb = new StringBuilder(); + + sb.append(dspaceRestUrl); + sb.append("/api/core/bitstreams/"); + sb.append(bitstream.getID()); + sb.append("/content"); + + return sb.toString(); + } +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/processor/ExportEventProcessor.java b/dspace-api/src/main/java/org/dspace/statistics/export/processor/ExportEventProcessor.java new file mode 100644 index 0000000000..021481c54a --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/processor/ExportEventProcessor.java @@ -0,0 +1,258 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.processor; + +import java.io.IOException; +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Date; +import java.util.List; +import javax.servlet.http.HttpServletRequest; + +import org.apache.commons.codec.CharEncoding; +import org.apache.commons.lang3.StringUtils; +import org.apache.log4j.Logger; +import org.dspace.content.DCDate; +import org.dspace.content.Entity; +import org.dspace.content.EntityType; +import org.dspace.content.Item; +import org.dspace.content.MetadataValue; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.EntityService; +import org.dspace.content.service.ItemService; +import org.dspace.core.Context; +import org.dspace.core.Utils; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.statistics.export.factory.OpenURLTrackerLoggerServiceFactory; +import org.dspace.statistics.export.service.OpenUrlService; + +/** + * Abstract export event processor that contains all shared logic to handle both Items and Bitstreams + * from the IrusExportUsageEventListener + */ +public abstract class ExportEventProcessor { + + private static Logger log = Logger.getLogger(ExportEventProcessor.class); + + protected static final String ENTITY_TYPE_DEFAULT = "Publication"; + + protected static final String ITEM_VIEW = "Investigation"; + protected static final String BITSTREAM_DOWNLOAD = "Request"; + + protected final static String UTF_8 = CharEncoding.UTF_8; + + private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + private EntityService entityService = ContentServiceFactory.getInstance().getEntityService(); + private ItemService itemService = ContentServiceFactory.getInstance().getItemService(); + private OpenUrlService openUrlService = OpenURLTrackerLoggerServiceFactory.getInstance().getOpenUrlService(); + + + private Context context; + private HttpServletRequest request; + + /** + * Creates a new ExportEventProcessor based on the params and 
initializes the services + * + * @param context + * @param request + */ + ExportEventProcessor(Context context, HttpServletRequest request) { + this.context = context; + this.request = request; + } + + /** + * Processes the event + * + * @throws SQLException + * @throws IOException + */ + public abstract void processEvent() throws SQLException, IOException; + + /** + * Process the url obtained from the object to be transmitted + * + * @param urlParameters + * @throws IOException + * @throws SQLException + */ + protected void processObject(String urlParameters) throws IOException, SQLException { + String baseUrl; + if (StringUtils.equals(configurationService.getProperty("irus.statistics.tracker.environment"), "production")) { + baseUrl = configurationService.getProperty("irus.statistics.tracker.produrl"); + } else { + baseUrl = configurationService.getProperty("irus.statistics.tracker.testurl"); + } + + openUrlService.processUrl(context, baseUrl + "?" + urlParameters); + } + + /** + * Get the base parameters for the url to be transmitted + * + * @param item + * @return the parameter string to be used in the url + * @throws UnsupportedEncodingException + */ + protected String getBaseParameters(Item item) + throws UnsupportedEncodingException { + + //We have a valid url collect the rest of the data + String clientIP = request.getRemoteAddr(); + if (configurationService.getBooleanProperty("useProxies", false) && request + .getHeader("X-Forwarded-For") != null) { + /* This header is a comma delimited list */ + for (String xfip : request.getHeader("X-Forwarded-For").split(",")) { + /* proxy itself will sometime populate this header with the same value in + remote address. ordering in spec is vague, we'll just take the last + not equal to the proxy + */ + if (!request.getHeader("X-Forwarded-For").contains(clientIP)) { + clientIP = xfip.trim(); + } + } + } + String clientUA = StringUtils.defaultIfBlank(request.getHeader("USER-AGENT"), ""); + String referer = StringUtils.defaultIfBlank(request.getHeader("referer"), ""); + + //Start adding our data + StringBuilder data = new StringBuilder(); + data.append(URLEncoder.encode("url_ver", UTF_8) + "=" + + URLEncoder.encode(configurationService.getProperty("irus.statistics.tracker.urlversion"), UTF_8)); + data.append("&").append(URLEncoder.encode("req_id", UTF_8)).append("=") + .append(URLEncoder.encode(clientIP, UTF_8)); + data.append("&").append(URLEncoder.encode("req_dat", UTF_8)).append("=") + .append(URLEncoder.encode(clientUA, UTF_8)); + + String hostName = Utils.getHostName(configurationService.getProperty("dspace.ui.url")); + + data.append("&").append(URLEncoder.encode("rft.artnum", UTF_8)).append("="). 
+ append(URLEncoder.encode("oai:" + hostName + ":" + item + .getHandle(), UTF_8)); + data.append("&").append(URLEncoder.encode("rfr_dat", UTF_8)).append("=") + .append(URLEncoder.encode(referer, UTF_8)); + data.append("&").append(URLEncoder.encode("rfr_id", UTF_8)).append("=") + .append(URLEncoder.encode(hostName, UTF_8)); + data.append("&").append(URLEncoder.encode("url_tim", UTF_8)).append("=") + .append(URLEncoder.encode(getCurrentDateString(), UTF_8)); + + return data.toString(); + } + + /** + * Get the current date + * + * @return the current date as a string + */ + protected String getCurrentDateString() { + return new DCDate(new Date()).toString(); + } + + /** + * Checks if an item should be processed + * + * @param item to be checked + * @return whether the item should be processed + * @throws SQLException + */ + protected boolean shouldProcessItem(Item item) throws SQLException { + if (item == null) { + return false; + } + if (!item.isArchived()) { + return false; + } + if (itemService.canEdit(context, item)) { + return false; + } + if (!shouldProcessItemType(item)) { + return false; + } + if (!shouldProcessEntityType(item)) { + return false; + } + return true; + } + + /** + * Checks if the item's entity type should be processed + * + * @param item to be checked + * @return whether the item should be processed + * @throws SQLException + */ + protected boolean shouldProcessEntityType(Item item) throws SQLException { + Entity entity = entityService.findByItemId(context, item.getID()); + EntityType type = entityService.getType(context, entity); + + String[] entityTypeStrings = configurationService.getArrayProperty("irus.statistics.tracker.entity-types"); + List entityTypes = new ArrayList<>(); + + if (entityTypeStrings.length != 0) { + entityTypes.addAll(Arrays.asList(entityTypeStrings)); + } else { + entityTypes.add(ENTITY_TYPE_DEFAULT); + } + + if (type != null && entityTypes.contains(type.getLabel())) { + return true; + } + return false; + } + + /** + * Checks if the item should be excluded based on the its type + * + * @param item to be checked + * @return whether the item should be processed + */ + protected boolean shouldProcessItemType(Item item) { + String trackerTypeMetadataField = configurationService.getProperty("irus.statistics.tracker.type-field"); + String[] metadataValues = configurationService.getArrayProperty("irus.statistics.tracker.type-value"); + List trackerTypeMetadataValues; + if (metadataValues.length > 0) { + trackerTypeMetadataValues = new ArrayList<>(); + for (String metadataValue : metadataValues) { + trackerTypeMetadataValues.add(metadataValue.toLowerCase()); + } + } else { + trackerTypeMetadataValues = null; + } + + if (trackerTypeMetadataField != null && trackerTypeMetadataValues != null) { + + // Contains the schema, element and if present qualifier of the metadataField + String[] metadataFieldSplit = trackerTypeMetadataField.split("\\."); + + List types = itemService + .getMetadata(item, metadataFieldSplit[0], metadataFieldSplit[1], + metadataFieldSplit.length == 2 ? 
null : metadataFieldSplit[2], Item.ANY); + + if (!types.isEmpty()) { + //Find out if we have a type that needs to be excluded + for (MetadataValue type : types) { + if (trackerTypeMetadataValues.contains(type.getValue().toLowerCase())) { + //We have found no type so process this item + return false; + } + } + return true; + } else { + // No types in this item, so not excluded + return true; + } + } else { + // No types to be excluded + return true; + } + } +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/processor/ItemEventProcessor.java b/dspace-api/src/main/java/org/dspace/statistics/export/processor/ItemEventProcessor.java new file mode 100644 index 0000000000..507ca92382 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/processor/ItemEventProcessor.java @@ -0,0 +1,91 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.processor; + +import java.io.IOException; +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; +import java.sql.SQLException; +import javax.servlet.http.HttpServletRequest; + +import org.dspace.content.Item; +import org.dspace.core.Context; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; + + +/** + * Processor that handles Item events from the IrusExportUsageEventListener + */ +public class ItemEventProcessor extends ExportEventProcessor { + + private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + + private Item item; + + /** + * Creates a new ItemEventProcessor that will set the params + * + * @param context + * @param request + * @param item + */ + public ItemEventProcessor(Context context, HttpServletRequest request, Item item) { + super(context, request); + this.item = item; + } + + /** + * Process the event + * Check if the item should be processed + * Create the url to be transmitted based on item data + * + * @throws SQLException + * @throws IOException + */ + public void processEvent() throws SQLException, IOException { + if (shouldProcessItem(item)) { + String baseParam = getBaseParameters(item); + String fullParam = addObjectSpecificData(baseParam, item); + processObject(fullParam); + } + } + + /** + * Adds additional item data to the url + * + * @param string to which the additional data needs to be added + * @param item + * @return the string with additional data + * @throws UnsupportedEncodingException + */ + protected String addObjectSpecificData(final String string, Item item) throws UnsupportedEncodingException { + StringBuilder data = new StringBuilder(string); + String itemInfo = getItemInfo(item); + data.append("&").append(URLEncoder.encode("svc_dat", UTF_8)).append("=") + .append(URLEncoder.encode(itemInfo, UTF_8)); + data.append("&").append(URLEncoder.encode("rft_dat", UTF_8)).append("=") + .append(URLEncoder.encode(ITEM_VIEW, UTF_8)); + return data.toString(); + } + + /** + * Get Item info used for the url + * + * @param item + * @return item info + */ + private String getItemInfo(final Item item) { + StringBuilder sb = new StringBuilder(configurationService.getProperty("dspace.ui.url")); + sb.append("/handle/").append(item.getHandle()); + + return sb.toString(); + } + + +} diff --git 
a/dspace-api/src/main/java/org/dspace/statistics/export/service/FailedOpenURLTrackerService.java b/dspace-api/src/main/java/org/dspace/statistics/export/service/FailedOpenURLTrackerService.java new file mode 100644 index 0000000000..9b482e3d54 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/service/FailedOpenURLTrackerService.java @@ -0,0 +1,44 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.service; + +import java.sql.SQLException; +import java.util.List; + +import org.dspace.core.Context; +import org.dspace.statistics.export.OpenURLTracker; + +/** + * Interface of the service that handles the OpenURLTracker database operations + */ +public interface FailedOpenURLTrackerService { + + /** + * Removes an OpenURLTracker from the database + * @param context + * @param openURLTracker + * @throws SQLException + */ + void remove(Context context, OpenURLTracker openURLTracker) throws SQLException; + + /** + * Returns all OpenURLTrackers from the database + * @param context + * @return all OpenURLTrackers + * @throws SQLException + */ + List findAll(Context context) throws SQLException; + + /** + * Creates a new OpenURLTracker + * @param context + * @return the creatred OpenURLTracker + * @throws SQLException + */ + OpenURLTracker create(Context context) throws SQLException; +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/service/OpenUrlService.java b/dspace-api/src/main/java/org/dspace/statistics/export/service/OpenUrlService.java new file mode 100644 index 0000000000..881cfb62d3 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/service/OpenUrlService.java @@ -0,0 +1,44 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.service; + +import java.io.IOException; +import java.sql.SQLException; + +import org.dspace.core.Context; + +/** + * The Service responsible for processing urls + */ +public interface OpenUrlService { + /** + * Process the url + * @param c - the context + * @param urlStr - the url to be processed + * @throws IOException + * @throws SQLException + */ + void processUrl(Context c, String urlStr) throws SQLException; + + /** + * Will process all urls stored in the database and try contacting them again + * @param context + * @throws SQLException + */ + void reprocessFailedQueue(Context context) throws SQLException; + + /** + * Will log the failed url in the database + * @param context + * @param url + * @throws SQLException + */ + void logfailed(Context context, String url) throws SQLException; + + +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/export/service/OpenUrlServiceImpl.java b/dspace-api/src/main/java/org/dspace/statistics/export/service/OpenUrlServiceImpl.java new file mode 100644 index 0000000000..8555bb0986 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/statistics/export/service/OpenUrlServiceImpl.java @@ -0,0 +1,139 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package 
org.dspace.statistics.export.service; + +import java.io.IOException; +import java.net.HttpURLConnection; +import java.net.URL; +import java.net.URLConnection; +import java.sql.SQLException; +import java.util.Date; +import java.util.List; + +import org.apache.commons.lang.StringUtils; +import org.apache.log4j.Logger; +import org.dspace.core.Context; +import org.dspace.statistics.export.OpenURLTracker; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Implementation of the OpenUrlService interface + */ +public class OpenUrlServiceImpl implements OpenUrlService { + + private Logger log = Logger.getLogger(OpenUrlService.class); + + @Autowired + protected FailedOpenURLTrackerService failedOpenUrlTrackerService; + + /** + * Processes the url + * When the contacting the url fails, the url will be logged in a db table + * @param c - the context + * @param urlStr - the url to be processed + * @throws SQLException + */ + public void processUrl(Context c, String urlStr) throws SQLException { + log.debug("Prepared to send url to tracker URL: " + urlStr); + + try { + int responseCode = getResponseCodeFromUrl(urlStr); + if (responseCode != HttpURLConnection.HTTP_OK) { + logfailed(c, urlStr); + } else if (log.isDebugEnabled()) { + log.debug("Successfully posted " + urlStr + " on " + new Date()); + } + } catch (Exception e) { + log.error("Failed to send url to tracker URL: " + urlStr); + logfailed(c, urlStr); + } + } + + /** + * Returns the response code from accessing the url + * @param urlStr + * @return response code from the url + * @throws IOException + */ + protected int getResponseCodeFromUrl(final String urlStr) throws IOException { + URLConnection conn; + URL url = new URL(urlStr); + conn = url.openConnection(); + + HttpURLConnection httpURLConnection = (HttpURLConnection) conn; + int responseCode = httpURLConnection.getResponseCode(); + httpURLConnection.disconnect(); + + return responseCode; + } + + /** + * Retry to send a failed url + * @param context + * @param tracker - db object containing the failed url + * @throws SQLException + */ + protected void tryReprocessFailed(Context context, OpenURLTracker tracker) throws SQLException { + boolean success = false; + try { + + int responseCode = getResponseCodeFromUrl(tracker.getUrl()); + + if (responseCode == HttpURLConnection.HTTP_OK) { + success = true; + } + } catch (Exception e) { + success = false; + } finally { + if (success) { + failedOpenUrlTrackerService + .remove(context, tracker); + // If the tracker was able to post successfully, we remove it from the database + log.info("Successfully posted " + tracker.getUrl() + " from " + tracker.getUploadDate()); + } else { + // Still no luck - write an error msg but keep the entry in the table for future executions + log.error("Failed attempt from " + tracker.getUrl() + " originating from " + tracker.getUploadDate()); + } + } + } + + /** + * Reprocess all url trackers present in the database + * @param context + * @throws SQLException + */ + public void reprocessFailedQueue(Context context) throws SQLException { + if (failedOpenUrlTrackerService == null) { + log.error("Error retrieving the \"failedOpenUrlTrackerService\" instance, aborting the processing"); + return; + } + List openURLTrackers = failedOpenUrlTrackerService.findAll(context); + for (OpenURLTracker openURLTracker : openURLTrackers) { + tryReprocessFailed(context, openURLTracker); + } + } + + /** + * Log a failed url in the database + * @param context + * @param url + * @throws SQLException + */ + public void 
logfailed(Context context, String url) throws SQLException { + Date now = new Date(); + if (StringUtils.isBlank(url)) { + return; + } + + OpenURLTracker tracker = failedOpenUrlTrackerService.create(context); + tracker.setUploadDate(now); + tracker.setUrl(url); + } + + +} diff --git a/dspace-api/src/main/java/org/dspace/statistics/service/SolrLoggerService.java b/dspace-api/src/main/java/org/dspace/statistics/service/SolrLoggerService.java index 53c94f2668..5db2d9f7df 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/service/SolrLoggerService.java +++ b/dspace-api/src/main/java/org/dspace/statistics/service/SolrLoggerService.java @@ -116,7 +116,7 @@ public interface SolrLoggerService { List fieldNames, List> fieldValuesList) throws SolrServerException, IOException; - public void query(String query, int max) + public void query(String query, int max, int facetMinCount) throws SolrServerException, IOException; /** @@ -130,13 +130,14 @@ public interface SolrLoggerService { * @param showTotal a boolean determining whether the total amount should be given * back as the last element of the array * @param facetQueries list of facet queries + * @param facetMinCount Minimum count of results facet must have to return a result * @return an array containing our results * @throws SolrServerException Exception from the Solr server to the solrj Java client. * @throws java.io.IOException passed through. */ public ObjectCount[] queryFacetField(String query, String filterQuery, String facetField, int max, boolean showTotal, - List facetQueries) + List facetQueries, int facetMinCount) throws SolrServerException, IOException; /** @@ -154,25 +155,27 @@ public interface SolrLoggerService { * @param showTotal a boolean determining whether the total amount should be given * back as the last element of the array * @param context The relevant DSpace Context. + * @param facetMinCount Minimum count of results facet must have to return a result * @return and array containing our results * @throws SolrServerException Exception from the Solr server to the solrj Java client. * @throws java.io.IOException passed through. 
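[Editorial note, not part of the patch] A rough usage sketch of the OpenUrlService added earlier in this changeset; the URL is a placeholder and Context handling is simplified. processUrl() contacts the tracker and falls back to logfailed() on a non-200 response; reprocessFailedQueue() can later be called (as the retry-tracker script does) to retry everything stored in the table.

    // Sketch only; assumes a valid Context and the Spring wiring introduced in this patch.
    OpenUrlService openUrlService = OpenURLTrackerLoggerServiceFactory.getInstance().getOpenUrlService();
    openUrlService.processUrl(context, "https://tracker.example.org/handler?example=1");
    // Later, retry anything that could not be delivered on the first attempt:
    openUrlService.reprocessFailedQueue(context);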
*/ public ObjectCount[] queryFacetDate(String query, String filterQuery, int max, String dateType, String dateStart, - String dateEnd, boolean showTotal, Context context) + String dateEnd, boolean showTotal, Context context, int facetMinCount) throws SolrServerException, IOException; - public Map queryFacetQuery(String query, - String filterQuery, List facetQueries) + public Map queryFacetQuery(String query, String filterQuery, List facetQueries, + int facetMinCount) throws SolrServerException, IOException; - public ObjectCount queryTotal(String query, String filterQuery) + public ObjectCount queryTotal(String query, String filterQuery, int facetMinCount) throws SolrServerException, IOException; public QueryResponse query(String query, String filterQuery, String facetField, int rows, int max, String dateType, String dateStart, - String dateEnd, List facetQueries, String sort, boolean ascending) + String dateEnd, List facetQueries, String sort, boolean ascending, + int facetMinCount) throws SolrServerException, IOException; /** diff --git a/dspace-api/src/main/java/org/dspace/statistics/util/LocationUtils.java b/dspace-api/src/main/java/org/dspace/statistics/util/LocationUtils.java index 0b08085f52..073dc45551 100644 --- a/dspace-api/src/main/java/org/dspace/statistics/util/LocationUtils.java +++ b/dspace-api/src/main/java/org/dspace/statistics/util/LocationUtils.java @@ -8,7 +8,9 @@ package org.dspace.statistics.util; import java.io.IOException; +import java.util.HashMap; import java.util.Locale; +import java.util.Map; import java.util.MissingResourceException; import java.util.Properties; import java.util.ResourceBundle; @@ -34,7 +36,8 @@ public class LocationUtils { /** * Default constructor */ - private LocationUtils() { } + private LocationUtils() { + } /** * Map DSpace continent codes onto ISO country codes. @@ -53,7 +56,7 @@ public class LocationUtils { if (countryToContinent.isEmpty()) { try { countryToContinent.load(LocationUtils.class - .getResourceAsStream("country-continent-codes.properties")); + .getResourceAsStream("country-continent-codes.properties")); } catch (IOException e) { logger.error("Could not load country/continent map file", e); } @@ -105,7 +108,7 @@ public class LocationUtils { names = ResourceBundle.getBundle(CONTINENT_NAMES_BUNDLE, locale); } catch (MissingResourceException e) { logger.error("Could not load continent code/name resource bundle", - e); + e); return I18nUtil .getMessage("org.dspace.statistics.util.LocationUtils.unknown-continent"); } @@ -115,7 +118,7 @@ public class LocationUtils { name = names.getString(continentCode); } catch (MissingResourceException e) { logger.info("No continent code " + continentCode + " in bundle " - + names.getLocale().getDisplayName()); + + names.getLocale().getDisplayName()); return I18nUtil .getMessage("org.dspace.statistics.util.LocationUtils.unknown-continent"); } @@ -134,6 +137,36 @@ public class LocationUtils { return getCountryName(countryCode, Locale.getDefault()); } + /** + * Revert a country name back into a country code (iso2) + * Source: https://stackoverflow.com/a/38588988 + * + * @param countryName Name of country (according to Locale) + * @return Corresponding iso2 country code + */ + static public String getCountryCode(String countryName) { + // Get all country codes in a string array. 
+ String[] isoCountryCodes = Locale.getISOCountries(); + Map countryMap = new HashMap<>(); + Locale locale; + String name; + + // Iterate through all country codes: + for (String code : isoCountryCodes) { + // Create a locale using each country code + locale = new Locale("", code); + // Get country name for each code. + name = locale.getDisplayCountry(); + // Map all country names and codes in key - value pairs. + countryMap.put(name, code); + } + + // Return the country code for the given country name using the map. + // Here you will need some validation or better yet + // a list of countries to give to user to choose from. + return countryMap.get(countryName); // "NL" for Netherlands. + } + /** * Map ISO country codes onto localized country names. * diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseRegistryUpdater.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseRegistryUpdater.java index c9e126f8fa..74653d8996 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseRegistryUpdater.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseRegistryUpdater.java @@ -9,7 +9,6 @@ package org.dspace.storage.rdbms; import java.io.File; import java.io.IOException; -import java.sql.Connection; import java.sql.SQLException; import javax.xml.parsers.ParserConfigurationException; import javax.xml.transform.TransformerException; @@ -24,8 +23,8 @@ import org.dspace.services.ConfigurationService; import org.dspace.services.factory.DSpaceServicesFactory; import org.dspace.workflow.factory.WorkflowServiceFactory; import org.dspace.xmlworkflow.service.XmlWorkflowService; -import org.flywaydb.core.api.MigrationInfo; -import org.flywaydb.core.api.callback.FlywayCallback; +import org.flywaydb.core.api.callback.Callback; +import org.flywaydb.core.api.callback.Event; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.xml.sax.SAXException; @@ -50,7 +49,7 @@ import org.xml.sax.SAXException; * * @author Tim Donohue */ -public class DatabaseRegistryUpdater implements FlywayCallback { +public class DatabaseRegistryUpdater implements Callback { /** * logging category */ @@ -107,73 +106,38 @@ public class DatabaseRegistryUpdater implements FlywayCallback { } } + + /** + * Events supported by this callback. + * @param event Flyway event + * @param context Flyway context + * @return true if AFTER_MIGRATE event + */ @Override - public void beforeClean(Connection connection) { - - } - - @Override - public void afterClean(Connection connection) { - - } - - @Override - public void beforeMigrate(Connection connection) { - - } - - @Override - public void afterMigrate(Connection connection) { + public boolean supports(Event event, org.flywaydb.core.api.callback.Context context) { // Must run AFTER all migrations complete, since it is dependent on Hibernate + return event.equals(Event.AFTER_MIGRATE); + } + + /** + * Whether event can be handled in a transaction or whether it must be handle outside of transaction. + * @param event Flyway event + * @param context Flyway context + * @return true + */ + @Override + public boolean canHandleInTransaction(Event event, org.flywaydb.core.api.callback.Context context) { + // Always return true, as our handle() method is updating the database. + return true; + } + + /** + * What to run when the callback is triggered. 
+ * @param event Flyway event + * @param context Flyway context + */ + @Override + public void handle(Event event, org.flywaydb.core.api.callback.Context context) { updateRegistries(); } - - @Override - public void beforeEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void afterEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void beforeValidate(Connection connection) { - - } - - @Override - public void afterValidate(Connection connection) { - - } - - @Override - public void beforeBaseline(Connection connection) { - - } - - @Override - public void afterBaseline(Connection connection) { - - } - - @Override - public void beforeRepair(Connection connection) { - - } - - @Override - public void afterRepair(Connection connection) { - - } - - @Override - public void beforeInfo(Connection connection) { - - } - - @Override - public void afterInfo(Connection connection) { - } } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseUtils.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseUtils.java index cadd3eee52..4432949a85 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseUtils.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/DatabaseUtils.java @@ -36,11 +36,13 @@ import org.dspace.workflow.factory.WorkflowServiceFactory; import org.flywaydb.core.Flyway; import org.flywaydb.core.api.FlywayException; import org.flywaydb.core.api.MigrationInfo; -import org.flywaydb.core.api.callback.FlywayCallback; -import org.flywaydb.core.internal.dbsupport.DbSupport; -import org.flywaydb.core.internal.dbsupport.DbSupportFactory; -import org.flywaydb.core.internal.dbsupport.SqlScript; +import org.flywaydb.core.api.callback.Callback; +import org.flywaydb.core.api.configuration.FluentConfiguration; import org.flywaydb.core.internal.info.MigrationInfoDumper; +import org.flywaydb.core.internal.license.VersionPrinter; +import org.springframework.dao.DataAccessException; +import org.springframework.jdbc.core.JdbcTemplate; +import org.springframework.jdbc.datasource.SingleConnectionDataSource; /** * Utility class used to manage the Database. This class is used by the @@ -58,9 +60,6 @@ public class DatabaseUtils { */ private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(DatabaseUtils.class); - // Our Flyway DB object (initialized by setupFlyway()) - private static Flyway flywaydb; - // When this temp file exists, the "checkReindexDiscovery()" method will auto-reindex Discovery // Reindex flag file is at [dspace]/solr/search/conf/reindex.flag // See also setReindexDiscovery()/getReindexDiscover() @@ -76,6 +75,9 @@ public class DatabaseUtils { public static final String DBMS_ORACLE = "oracle"; public static final String DBMS_H2 = "h2"; + // Name of the table that Flyway uses for its migration history + public static final String FLYWAY_TABLE = "schema_version"; + /** * Default constructor */ @@ -100,8 +102,13 @@ public class DatabaseUtils { // Get a reference to our configured DataSource DataSource dataSource = getDataSource(); - // Point Flyway API to our database - Flyway flyway = setupFlyway(dataSource); + // Initialize Flyway against our database + FluentConfiguration flywayConfiguration = setupFlyway(dataSource); + Flyway flyway = flywayConfiguration.load(); + + // Now, check our Flyway database table to see if it needs upgrading + // *before* any other Flyway commands can be run. This is a safety check. 
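[Editorial note, not part of the patch] To make the Flyway 4-to-5 API shift easier to follow, a minimal, self-contained sketch of the configure-then-load pattern this patch adopts; the data source and migration location are illustrative values, not DSpace's actual configuration.

    import org.flywaydb.core.Flyway;
    import org.flywaydb.core.api.configuration.FluentConfiguration;

    // Flyway 5+ fluent API: build an immutable Flyway instance from a configuration object.
    FluentConfiguration configuration = Flyway.configure()
            .dataSource(dataSource)              // javax.sql.DataSource obtained elsewhere
            .encoding("UTF-8")
            .cleanDisabled(true)                 // disallow 'database clean' by default
            .locations("classpath:org/dspace/storage/rdbms/sqlmigration/postgres");
    Flyway flyway = configuration.load();
    flyway.migrate();                            // run any pending migrations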
+ FlywayUpgradeUtils.upgradeFlywayTable(flyway, dataSource.getConnection()); // "test" = Test Database Connection if (argv[0].equalsIgnoreCase("test")) { @@ -140,7 +147,7 @@ public class DatabaseUtils { // If Flyway is NOT yet initialized, also print the determined version information // NOTE: search is case sensitive, as flyway table name is ALWAYS lowercase, // See: http://flywaydb.org/documentation/faq.html#case-sensitive - if (!tableExists(connection, flyway.getTable(), true)) { + if (!tableExists(connection, flyway.getConfiguration().getTable(), true)) { System.out .println("\nNOTE: This database is NOT yet initialized for auto-migrations (via Flyway)."); // Determine which version of DSpace this looks like @@ -265,7 +272,7 @@ public class DatabaseUtils { // "clean" = Run Flyway clean script // If clean is disabled, return immediately - if (flyway.isCleanDisabled()) { + if (flyway.getConfiguration().isCleanDisabled()) { System.out.println( "\nWARNING: 'clean' command is currently disabled, as it is dangerous to run in Production " + "scenarios!"); @@ -413,6 +420,8 @@ public class DatabaseUtils { "PostgreSQL '" + PostgresUtils.PGCRYPTO + "' extension installed/up-to-date? " + pgcryptoUpToDate + "" + " " + ((pgcryptoVersion != null) ? "(version=" + pgcryptoVersion + ")" : "(not installed)")); } + // Finally, print out our version of Flyway + System.out.println("FlywayDB Version: " + VersionPrinter.getVersion()); } /** @@ -505,70 +514,79 @@ public class DatabaseUtils { } /** - * Setup/Initialize the Flyway API to run against our DSpace database + * Setup/Initialize the Flyway Configuration to run against our DSpace database * and point at our migration scripts. * * @param datasource DataSource object initialized by DatabaseManager - * @return initialized Flyway object + * @return initialized FluentConfiguration (Flyway configuration object) */ - private synchronized static Flyway setupFlyway(DataSource datasource) { + private synchronized static FluentConfiguration setupFlyway(DataSource datasource) { ConfigurationService config = DSpaceServicesFactory.getInstance().getConfigurationService(); - if (flywaydb == null) { - try (Connection connection = datasource.getConnection()) { - // Initialize Flyway DB API (http://flywaydb.org/), used to perform DB migrations - flywaydb = new Flyway(); - flywaydb.setDataSource(datasource); - flywaydb.setEncoding("UTF-8"); + // Initialize Flyway Configuration (http://flywaydb.org/), used to perform DB migrations + FluentConfiguration flywayConfiguration = Flyway.configure(); - // Default cleanDisabled to "true" (which disallows the ability to run 'database clean') - flywaydb.setCleanDisabled(config.getBooleanProperty("db.cleanDisabled", true)); + try (Connection connection = datasource.getConnection()) { + flywayConfiguration.dataSource(datasource); + flywayConfiguration.encoding("UTF-8"); - // Migration scripts are based on DBMS Keyword (see full path below) - String dbType = getDbType(connection); - connection.close(); + // Default cleanDisabled to "true" (which disallows the ability to run 'database clean') + flywayConfiguration.cleanDisabled(config.getBooleanProperty("db.cleanDisabled", true)); - // Determine location(s) where Flyway will load all DB migrations - ArrayList scriptLocations = new ArrayList(); + // Migration scripts are based on DBMS Keyword (see full path below) + String dbType = getDbType(connection); + connection.close(); - // First, add location for custom SQL migrations, if any (based on DB Type) - // e.g. 
[dspace.dir]/etc/[dbtype]/ - // (We skip this for H2 as it's only used for unit testing) - if (!dbType.equals(DBMS_H2)) { - scriptLocations.add("filesystem:" + config.getProperty("dspace.dir") + - "/etc/" + dbType); - } + // Determine location(s) where Flyway will load all DB migrations + ArrayList scriptLocations = new ArrayList<>(); - // Also add the Java package where Flyway will load SQL migrations from (based on DB Type) - scriptLocations.add("classpath:org.dspace.storage.rdbms.sqlmigration." + dbType); - - // Also add the Java package where Flyway will load Java migrations from - // NOTE: this also loads migrations from any sub-package - scriptLocations.add("classpath:org.dspace.storage.rdbms.migration"); - - //Add all potential workflow migration paths - List workflowFlywayMigrationLocations = WorkflowServiceFactory.getInstance() - .getWorkflowService() - .getFlywayMigrationLocations(); - scriptLocations.addAll(workflowFlywayMigrationLocations); - - // Now tell Flyway which locations to load SQL / Java migrations from - log.info("Loading Flyway DB migrations from: " + StringUtils.join(scriptLocations, ", ")); - flywaydb.setLocations(scriptLocations.toArray(new String[scriptLocations.size()])); - - // Set flyway callbacks (i.e. classes which are called post-DB migration and similar) - // In this situation, we have a Registry Updater that runs PRE-migration - // NOTE: DatabaseLegacyReindexer only indexes in Legacy Lucene & RDBMS indexes. It can be removed - // once those are obsolete. - List flywayCallbacks = DSpaceServicesFactory.getInstance().getServiceManager() - .getServicesByType(FlywayCallback.class); - flywaydb.setCallbacks(flywayCallbacks.toArray(new FlywayCallback[flywayCallbacks.size()])); - } catch (SQLException e) { - log.error("Unable to setup Flyway against DSpace database", e); + // First, add location for custom SQL migrations, if exists (based on DB Type) + // e.g. [dspace.dir]/etc/[dbtype]/ + // (We skip this for H2 as it's only used for unit testing) + String etcDirPath = config.getProperty("dspace.dir") + "/etc/" + dbType; + File etcDir = new File(etcDirPath); + if (etcDir.exists() && !dbType.equals(DBMS_H2)) { + scriptLocations.add("filesystem:" + etcDirPath); } + + // Also add the Java package where Flyway will load SQL migrations from (based on DB Type) + scriptLocations.add("classpath:org/dspace/storage/rdbms/sqlmigration/" + dbType); + + // Also add the Java package where Flyway will load Java migrations from + // NOTE: this also loads migrations from any sub-package + scriptLocations.add("classpath:org/dspace/storage/rdbms/migration"); + + //Add all potential workflow migration paths + List workflowFlywayMigrationLocations = WorkflowServiceFactory.getInstance() + .getWorkflowService() + .getFlywayMigrationLocations(); + scriptLocations.addAll(workflowFlywayMigrationLocations); + + // Now tell Flyway which locations to load SQL / Java migrations from + log.info("Loading Flyway DB migrations from: " + StringUtils.join(scriptLocations, ", ")); + flywayConfiguration.locations(scriptLocations.toArray(new String[scriptLocations.size()])); + + // Tell Flyway NOT to throw a validation error if it finds older "Ignored" migrations. + // For DSpace, we sometimes have to insert "old" migrations in after a major release + // if further development/bug fixes are needed in older versions. 
So, "Ignored" migrations are + // nothing to worry about...you can always trigger them to run using "database migrate ignored" from CLI + flywayConfiguration.ignoreIgnoredMigrations(true); + + // Set Flyway callbacks (i.e. classes which are called post-DB migration and similar) + List flywayCallbacks = DSpaceServicesFactory.getInstance().getServiceManager() + .getServicesByType(Callback.class); + + flywayConfiguration.callbacks(flywayCallbacks.toArray(new Callback[flywayCallbacks.size()])); + + // Tell Flyway to use the "schema_version" table in the database to manage its migration history + // As of Flyway v5, the default table is named "flyway_schema_history" + // We are using the older name ("schema_version") for backwards compatibility. + flywayConfiguration.table(FLYWAY_TABLE); + } catch (SQLException e) { + log.error("Unable to setup Flyway against DSpace database", e); } - return flywaydb; + return flywayConfiguration; } /** @@ -645,36 +663,48 @@ public class DatabaseUtils { try { // Setup Flyway API against our database - Flyway flyway = setupFlyway(datasource); + FluentConfiguration flywayConfiguration = setupFlyway(datasource); - // Set whethe Flyway will run migrations "out of order". By default, this is false, + // Set whether Flyway will run migrations "out of order". By default, this is false, // and Flyway ONLY runs migrations that have a higher version number. - flyway.setOutOfOrder(outOfOrder); + flywayConfiguration.outOfOrder(outOfOrder); // If a target version was specified, tell Flyway to ONLY migrate to that version // (i.e. all later migrations are left as "pending"). By default we always migrate to latest version. if (!StringUtils.isBlank(targetVersion)) { - flyway.setTargetAsString(targetVersion); + flywayConfiguration.target(targetVersion); } + // Initialized Flyway object (will be created by flywayConfiguration.load() below) + Flyway flyway; + // Does the necessary Flyway table ("schema_version") exist in this database? // If not, then this is the first time Flyway has run, and we need to initialize // NOTE: search is case sensitive, as flyway table name is ALWAYS lowercase, // See: http://flywaydb.org/documentation/faq.html#case-sensitive - if (!tableExists(connection, flyway.getTable(), true)) { + if (!tableExists(connection, flywayConfiguration.getTable(), true)) { // Try to determine our DSpace database version, so we know what to tell Flyway to do - String dbVersion = determineDBVersion(connection); + String dspaceVersion = determineDBVersion(connection); - // If this is a fresh install, dbVersion will be null - if (dbVersion == null) { - // Initialize the Flyway database table with defaults (version=1) - flyway.baseline(); - } else { - // Otherwise, pass our determined DB version to Flyway to initialize database table - flyway.setBaselineVersionAsString(dbVersion); - flyway.setBaselineDescription("Initializing from DSpace " + dbVersion + " database schema"); - flyway.baseline(); + // If this is NOT a fresh install (i.e. 
dspaceVersion is not null) + if (dspaceVersion != null) { + // Pass our determined DSpace version to Flyway to initialize database table + flywayConfiguration.baselineVersion(dspaceVersion); + flywayConfiguration.baselineDescription( + "Initializing from DSpace " + dspaceVersion + " database schema"); } + + // Initialize Flyway in DB with baseline version (either dspaceVersion or default of 1) + flyway = flywayConfiguration.load(); + flyway.baseline(); + } else { + // Otherwise, this database already ran Flyway before + // So, just load our Flyway configuration, initializing latest Flyway. + flyway = flywayConfiguration.load(); + + // Now, check our Flyway database table to see if it needs upgrading + // *before* any other Flyway commands can be run. + FlywayUpgradeUtils.upgradeFlywayTable(flyway, connection); } // Determine pending Database migrations @@ -1049,16 +1079,13 @@ public class DatabaseUtils { */ public static void executeSql(Connection connection, String sqlToExecute) throws SQLException { try { - // Create a Flyway DbSupport object (based on our connection) - // This is how Flyway determines the database *type* (e.g. Postgres vs Oracle) - DbSupport dbSupport = DbSupportFactory.createDbSupport(connection, false); - - // Load our SQL string & execute via Flyway's SQL parser - SqlScript script = new SqlScript(sqlToExecute, dbSupport); - script.execute(dbSupport.getJdbcTemplate()); - } catch (FlywayException fe) { - // If any FlywayException (Runtime) is thrown, change it to a SQLException - throw new SQLException("Flyway executeSql() error occurred", fe); + // Run the SQL using Spring JDBC as documented in Flyway's guide for using Spring JDBC directly + // https://flywaydb.org/documentation/migrations#spring + new JdbcTemplate(new SingleConnectionDataSource(connection, true)) + .execute(sqlToExecute); + } catch (DataAccessException dae) { + // If any Exception is thrown, change it to a SQLException + throw new SQLException("Flyway executeSql() error occurred", dae); } } @@ -1329,13 +1356,6 @@ public class DatabaseUtils { return dataSource; } - /** - * In case of a unit test the flyway db is cached to long leading to exceptions, we need to clear the object - */ - public static void clearFlywayDBCache() { - flywaydb = null; - } - /** * Returns the current Flyway schema_version being used by the given database. * (i.e. 
the version of the highest numbered migration that this database has run) @@ -1346,7 +1366,7 @@ public class DatabaseUtils { */ public static String getCurrentFlywayState(Connection connection) throws SQLException { PreparedStatement statement = connection - .prepareStatement("SELECT \"version\" FROM \"schema_version\" ORDER BY \"version\" desc"); + .prepareStatement("SELECT \"version\" FROM \"" + FLYWAY_TABLE + "\" ORDER BY \"version\" desc"); ResultSet resultSet = statement.executeQuery(); resultSet.next(); return resultSet.getString("version"); diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/FlywayUpgradeUtils.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/FlywayUpgradeUtils.java new file mode 100644 index 0000000000..7bd524b612 --- /dev/null +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/FlywayUpgradeUtils.java @@ -0,0 +1,117 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.storage.rdbms; + +import static org.dspace.storage.rdbms.DatabaseUtils.FLYWAY_TABLE; +import static org.dspace.storage.rdbms.DatabaseUtils.executeSql; +import static org.dspace.storage.rdbms.DatabaseUtils.getCurrentFlywayState; +import static org.dspace.storage.rdbms.DatabaseUtils.getDbType; +import static org.dspace.storage.rdbms.DatabaseUtils.getSchemaName; + +import java.sql.Connection; +import java.sql.SQLException; +import java.util.HashMap; +import java.util.Map; + +import org.apache.commons.text.StringSubstitutor; +import org.apache.logging.log4j.Logger; +import org.dspace.storage.rdbms.migration.MigrationUtils; +import org.flywaydb.core.Flyway; + +/** + * Utility class used to detect issues with the Flyway migration history table and attempt to correct/fix them. + * These issues can occur when attempting to upgrade your database across multiple versions/releases of Flyway. + *
    + * As documented in this issue ticket, Flyway does not normally support skipping over any + * major release (for example, going from v3 to v5 is unsupported): https://github.com/flyway/flyway/issues/2126 + *
    + * This class allows us to do a migration (where needed) through multiple major versions of Flyway. + * + * @author Tim Donohue + */ +public class FlywayUpgradeUtils { + /** + * log4j category + */ + private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(FlywayUpgradeUtils.class); + + // Resource path of all Flyway upgrade scripts + private static final String UPGRADE_SCRIPT_PATH = "org/dspace/storage/rdbms/flywayupgrade/"; + + + /** + * Default constructor + */ + private FlywayUpgradeUtils() { } + + /** + * Ensures the Flyway migration history table (FLYWAY_TABLE) is upgraded to the latest version of Flyway safely. + *
    + * Unfortunately, Flyway does not always support skipping major versions (e.g. upgrading directly from Flyway + * v3.x to 5.x is not possible, see https://github.com/flyway/flyway/issues/2126). + *
    + * While sometimes it's possible to do so, other times you MUST upgrade through each major version. This method + * ensures we upgrade the Flyway history table through each version of Flyway where deemed necessary. + * + * @param flyway initialized/configured Flyway object + * @param connection current database connection + */ + protected static synchronized void upgradeFlywayTable(Flyway flyway, Connection connection) + throws SQLException { + // Whether the Flyway table needs updating or not + boolean needsUpgrade = false; + + // Determine if Flyway needs updating by running a simple info() command. + // This command will not run any pending migrations, but it will throw an exception + // if the Flyway migration history table is NOT valid for the current version of Flyway + try { + flyway.info(); + } catch (Exception e) { + // ignore error, but log info statement to say we will try to upgrade to fix problem + log.info("Flyway table '{}' appears to be outdated. Will attempt to upgrade it automatically. " + + "Flyway Exception was '{}'", FLYWAY_TABLE, e.toString()); + needsUpgrade = true; + } + + if (needsUpgrade) { + // Get the DSpace version info from the LAST migration run. + String lastMigration = getCurrentFlywayState(connection); + // If this is an older DSpace 5.x compatible database, then it used Flyway 3.x. + // Because we cannot upgrade directly from Flyway 3.x -> 6.x, we need to FIRST update this + // database to be compatible with Flyway 4.2.0 (which can be upgraded directly to Flyway 6.x) + if (lastMigration.startsWith("5.")) { + // Based on type of DB, get path to our Flyway 4.x upgrade script + String dbtype = getDbType(connection); + String scriptPath = UPGRADE_SCRIPT_PATH + dbtype + "/upgradeToFlyway4x.sql"; + + log.info("Attempting to upgrade Flyway table '{}' using script at '{}'", + FLYWAY_TABLE, scriptPath); + // Load the Flyway v4.2.0 upgrade SQL script as a String + String flywayUpgradeSQL = MigrationUtils.getResourceAsString(scriptPath); + + // As this Flyway upgrade SQL was borrowed from Flyway v4.2.0 directly, it contains some inline + // variables which need replacing, namely ${schema} and ${table} variables. + // We'll use the StringSubstitutor to replace those variables with their proper values. + Map valuesMap = new HashMap<>(); + valuesMap.put("schema", getSchemaName(connection)); + valuesMap.put("table", FLYWAY_TABLE); + StringSubstitutor sub = new StringSubstitutor(valuesMap); + flywayUpgradeSQL = sub.replace(flywayUpgradeSQL); + + // Run the script to update the Flyway table to be compatible with FLyway v4.x + executeSql(connection, flywayUpgradeSQL); + } + // NOTE: no other DSpace versions require a specialized Flyway upgrade script at this time. + // DSpace 4 didn't use Flyway. DSpace 6 used Flyway v4, which Flyway v6 can update automatically. 
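// Editor's note -- illustrative sketch only, not part of this patch: the upgrade script borrowed
// from Flyway 4.2.0 contains ${schema} and ${table} placeholders, which the code above resolves
// with Apache Commons Text's StringSubstitutor before running it via executeSql(). The standalone
// snippet below shows that substitution in isolation; the schema name and SQL string are
// hypothetical example values, not taken from the actual upgrade script.
// (requires: java.util.HashMap, java.util.Map, org.apache.commons.text.StringSubstitutor)
Map<String, String> exampleValues = new HashMap<>();
exampleValues.put("schema", "public");          // in the patch: getSchemaName(connection)
exampleValues.put("table", "schema_version");   // in the patch: FLYWAY_TABLE
String exampleSql = "ALTER TABLE \"${schema}\".\"${table}\" ALTER COLUMN \"version\" DROP NOT NULL;";
String resolvedSql = new StringSubstitutor(exampleValues).replace(exampleSql);
// resolvedSql is now: ALTER TABLE "public"."schema_version" ALTER COLUMN "version" DROP NOT NULL;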
+ + // After any Flyway table upgrade, we MUST run a Flyway repair() to cleanup migration checksums if needed + log.info("Repairing Flyway table '{}' after upgrade...", FLYWAY_TABLE); + flyway.repair(); + } + } +} diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/GroupServiceInitializer.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/GroupServiceInitializer.java index 11018d37e0..7338dd75bc 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/GroupServiceInitializer.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/GroupServiceInitializer.java @@ -7,13 +7,11 @@ */ package org.dspace.storage.rdbms; -import java.sql.Connection; - import org.apache.logging.log4j.Logger; import org.dspace.core.Context; import org.dspace.eperson.service.GroupService; -import org.flywaydb.core.api.MigrationInfo; -import org.flywaydb.core.api.callback.FlywayCallback; +import org.flywaydb.core.api.callback.Callback; +import org.flywaydb.core.api.callback.Event; import org.springframework.beans.factory.annotation.Autowired; /** @@ -22,7 +20,7 @@ import org.springframework.beans.factory.annotation.Autowired; * * @author kevinvandevelde at atmire.com */ -public class GroupServiceInitializer implements FlywayCallback { +public class GroupServiceInitializer implements Callback { private final Logger log = org.apache.logging.log4j.LogManager.getLogger(GroupServiceInitializer.class); @@ -53,73 +51,36 @@ public class GroupServiceInitializer implements FlywayCallback { } + /** + * Events supported by this callback. + * @param event Flyway event + * @param context Flyway context + * @return true if AFTER_MIGRATE event + */ @Override - public void beforeClean(Connection connection) { - + public boolean supports(Event event, org.flywaydb.core.api.callback.Context context) { + // Must run AFTER all migrations complete, since it is dependent on Hibernate + return event.equals(Event.AFTER_MIGRATE); } + /** + * Whether event can be handled in a transaction or whether it must be handle outside of transaction. + * @param event Flyway event + * @param context Flyway context + * @return true + */ @Override - public void afterClean(Connection connection) { - + public boolean canHandleInTransaction(Event event, org.flywaydb.core.api.callback.Context context) { + return true; } + /** + * What to run when the callback is triggered. 
+ * @param event Flyway event + * @param context Flyway context + */ @Override - public void beforeMigrate(Connection connection) { - - } - - @Override - public void afterMigrate(Connection connection) { + public void handle(Event event, org.flywaydb.core.api.callback.Context context) { initGroups(); } - - @Override - public void beforeEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void afterEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void beforeValidate(Connection connection) { - - } - - @Override - public void afterValidate(Connection connection) { - - } - - @Override - public void beforeBaseline(Connection connection) { - - } - - @Override - public void afterBaseline(Connection connection) { - - } - - @Override - public void beforeRepair(Connection connection) { - - } - - @Override - public void afterRepair(Connection connection) { - - } - - @Override - public void beforeInfo(Connection connection) { - - } - - @Override - public void afterInfo(Connection connection) { - - } } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/PostgreSQLCryptoChecker.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/PostgreSQLCryptoChecker.java index 48f2e4e6f0..5798f4254c 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/PostgreSQLCryptoChecker.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/PostgreSQLCryptoChecker.java @@ -13,8 +13,9 @@ import java.sql.Statement; import org.apache.logging.log4j.Logger; import org.flywaydb.core.api.FlywayException; -import org.flywaydb.core.api.MigrationInfo; -import org.flywaydb.core.api.callback.FlywayCallback; +import org.flywaydb.core.api.callback.Callback; +import org.flywaydb.core.api.callback.Context; +import org.flywaydb.core.api.callback.Event; /** * This is a FlywayCallback class which automatically verifies that "pgcrypto" @@ -28,7 +29,7 @@ import org.flywaydb.core.api.callback.FlywayCallback; * * @author Tim Donohue */ -public class PostgreSQLCryptoChecker implements FlywayCallback { +public class PostgreSQLCryptoChecker implements Callback { private Logger log = org.apache.logging.log4j.LogManager.getLogger(PostgreSQLCryptoChecker.class); /** @@ -96,76 +97,43 @@ public class PostgreSQLCryptoChecker implements FlywayCallback { } } + /** + * Events supported by this callback. + * @param event Flyway event + * @param context Flyway context + * @return true if BEFORE_BASELINE, BEFORE_MIGRATE or BEFORE_CLEAN + */ @Override - public void beforeClean(Connection connection) { - // If pgcrypto is installed, remove it - removePgCrypto(connection); + public boolean supports(Event event, Context context) { + return event.equals(Event.BEFORE_BASELINE) || event.equals(Event.BEFORE_MIGRATE) || + event.equals(Event.BEFORE_CLEAN); } + /** + * Whether event can be handled in a transaction or whether it must be handle outside of transaction. + * @param event Flyway event + * @param context Flyway context + * @return true + */ @Override - public void afterClean(Connection connection) { - + public boolean canHandleInTransaction(Event event, Context context) { + return true; } + /** + * What to run when the callback is triggered. 
+ * @param event Flyway event + * @param context Flyway context + */ @Override - public void beforeMigrate(Connection connection) { - // Before migrating database, check for pgcrypto - checkPgCrypto(connection); - } - - @Override - public void afterMigrate(Connection connection) { - - } - - @Override - public void beforeEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void afterEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void beforeValidate(Connection connection) { - - } - - @Override - public void afterValidate(Connection connection) { - - } - - @Override - public void beforeBaseline(Connection connection) { - // Before initializing database, check for pgcrypto - checkPgCrypto(connection); - } - - @Override - public void afterBaseline(Connection connection) { - - } - - @Override - public void beforeRepair(Connection connection) { - - } - - @Override - public void afterRepair(Connection connection) { - - } - - @Override - public void beforeInfo(Connection connection) { - - } - - @Override - public void afterInfo(Connection connection) { + public void handle(Event event, Context context) { + // If, before initializing or migrating database, check for pgcrypto + // Else, before Cleaning database, remove pgcrypto (if exists) + if (event.equals(Event.BEFORE_BASELINE) || event.equals(Event.BEFORE_MIGRATE)) { + checkPgCrypto(context.getConnection()); + } else if (event.equals(Event.BEFORE_CLEAN)) { + removePgCrypto(context.getConnection()); + } } } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/SiteServiceInitializer.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/SiteServiceInitializer.java index a4b7129546..d755150f79 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/SiteServiceInitializer.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/SiteServiceInitializer.java @@ -7,13 +7,16 @@ */ package org.dspace.storage.rdbms; -import java.sql.Connection; - import org.apache.logging.log4j.Logger; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.Site; import org.dspace.content.service.SiteService; +import org.dspace.core.Constants; import org.dspace.core.Context; -import org.flywaydb.core.api.MigrationInfo; -import org.flywaydb.core.api.callback.FlywayCallback; +import org.dspace.eperson.Group; +import org.dspace.eperson.service.GroupService; +import org.flywaydb.core.api.callback.Callback; +import org.flywaydb.core.api.callback.Event; import org.springframework.beans.factory.annotation.Autowired; /** @@ -22,29 +25,44 @@ import org.springframework.beans.factory.annotation.Autowired; * * @author kevinvandevelde at atmire.com */ -public class SiteServiceInitializer implements FlywayCallback { +public class SiteServiceInitializer implements Callback { private Logger log = org.apache.logging.log4j.LogManager.getLogger(SiteServiceInitializer.class); @Autowired(required = true) protected SiteService siteService; + @Autowired + private AuthorizeService authorizeService; + + @Autowired + private GroupService groupService; + public void initializeSiteObject() { // After every migrate, ensure default Site is setup correctly. 
Context context = null; try { context = new Context(); context.turnOffAuthorisationSystem(); - // While it's not really a formal "registry", we need to ensure the - // default, required Groups exist in the DSpace database + // Create Site object if it doesn't exist in database + Site site = null; if (siteService.findSite(context) == null) { - siteService.createSite(context); + site = siteService.createSite(context); } context.restoreAuthSystemState(); + // Give Anonymous users READ permissions on the Site Object (if doesn't exist) + if (!authorizeService.authorizeActionBoolean(context, site, Constants.READ)) { + context.turnOffAuthorisationSystem(); + Group anonGroup = groupService.findByName(context, Group.ANONYMOUS); + if (anonGroup != null) { + authorizeService.addPolicy(context, site, Constants.READ, anonGroup); + } + context.restoreAuthSystemState(); + } // Commit changes and close context context.complete(); } catch (Exception e) { - log.error("Error attempting to add/update default DSpace Groups", e); + log.error("Error attempting to add/update default Site object", e); } finally { // Clean up our context, if it still exists & it was never completed if (context != null && context.isValid()) { @@ -55,73 +73,36 @@ public class SiteServiceInitializer implements FlywayCallback { } + /** + * Events supported by this callback. + * @param event Flyway event + * @param context Flyway context + * @return true if AFTER_MIGRATE event + */ @Override - public void beforeClean(Connection connection) { - + public boolean supports(Event event, org.flywaydb.core.api.callback.Context context) { + // Must run AFTER all migrations complete, since it is dependent on Hibernate + return event.equals(Event.AFTER_MIGRATE); } + /** + * Whether event can be handled in a transaction or whether it must be handle outside of transaction. + * @param event Flyway event + * @param context Flyway context + * @return true + */ @Override - public void afterClean(Connection connection) { - + public boolean canHandleInTransaction(Event event, org.flywaydb.core.api.callback.Context context) { + return true; } + /** + * What to run when the callback is triggered. 
+ * @param event Flyway event + * @param context Flyway context + */ @Override - public void beforeMigrate(Connection connection) { - - } - - @Override - public void afterMigrate(Connection connection) { + public void handle(Event event, org.flywaydb.core.api.callback.Context context) { initializeSiteObject(); } - - @Override - public void beforeEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void afterEachMigrate(Connection connection, MigrationInfo migrationInfo) { - - } - - @Override - public void beforeValidate(Connection connection) { - - } - - @Override - public void afterValidate(Connection connection) { - - } - - @Override - public void beforeBaseline(Connection connection) { - - } - - @Override - public void afterBaseline(Connection connection) { - - } - - @Override - public void beforeRepair(Connection connection) { - - } - - @Override - public void afterRepair(Connection connection) { - - } - - @Override - public void beforeInfo(Connection connection) { - - } - - @Override - public void afterInfo(Connection connection) { - - } } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/MigrationUtils.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/MigrationUtils.java index ce481d0caf..624d0cb55a 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/MigrationUtils.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/MigrationUtils.java @@ -7,12 +7,19 @@ */ package org.dspace.storage.rdbms.migration; +import java.io.IOException; +import java.io.InputStreamReader; +import java.io.Reader; +import java.io.UncheckedIOException; import java.sql.Connection; import java.sql.PreparedStatement; import java.sql.ResultSet; import java.sql.SQLException; +import java.util.Objects; import org.apache.commons.lang3.StringUtils; +import org.dspace.core.Constants; +import org.springframework.util.FileCopyUtils; /** * This Utility class offers utility methods which may be of use to perform @@ -270,4 +277,25 @@ public class MigrationUtils { return checksum; } + + /** + * Read a given Resource, converting to a String. This is used by several Java-based + * migrations to read a SQL migration into a string, so that it can be executed under + * specific scenarios. 
+ * @param resourcePath relative path of resource to read + * @return String contents of Resource + */ + public static String getResourceAsString(String resourcePath) { + // Read the resource, copying to a string + try (Reader reader = + new InputStreamReader( + Objects.requireNonNull(MigrationUtils.class.getClassLoader().getResourceAsStream(resourcePath)), + Constants.DEFAULT_ENCODING)) { + return FileCopyUtils.copyToString(reader); + } catch (IOException e) { + throw new UncheckedIOException(e); + } catch (NullPointerException e) { + throw new IllegalStateException("Resource at " + resourcePath + " was not found", e); + } + } } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_3_9__Drop_constraint_for_DSpace_1_4_schema.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_3_9__Drop_constraint_for_DSpace_1_4_schema.java index c3a79783ad..56c5b474d9 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_3_9__Drop_constraint_for_DSpace_1_4_schema.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_3_9__Drop_constraint_for_DSpace_1_4_schema.java @@ -8,11 +8,10 @@ package org.dspace.storage.rdbms.migration; import java.io.IOException; -import java.sql.Connection; import java.sql.SQLException; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * This class is in support of the "V1.4__Upgrade_to_DSpace_1.4_schema.sql" @@ -38,22 +37,22 @@ import org.flywaydb.core.api.migration.jdbc.JdbcMigration; * @author Tim Donohue */ public class V1_3_9__Drop_constraint_for_DSpace_1_4_schema - implements JdbcMigration, MigrationChecksumProvider { + extends BaseJavaMigration { /* The checksum to report for this migration (when successful) */ private int checksum = -1; /** * Actually migrate the existing database * - * @param connection SQL Connection object + * @param context Flyway Migration Context * @throws IOException A general class of exceptions produced by failed or interrupted I/O operations. * @throws SQLException An exception that provides information on a database access error or other errors. 
*/ @Override - public void migrate(Connection connection) + public void migrate(Context context) throws IOException, SQLException { // Drop the constraint associated with "name" column of "community" - checksum = MigrationUtils.dropDBConstraint(connection, "community", "name", "key"); + checksum = MigrationUtils.dropDBConstraint(context.getConnection(), "community", "name", "key"); } /** diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_5_9__Drop_constraint_for_DSpace_1_6_schema.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_5_9__Drop_constraint_for_DSpace_1_6_schema.java index 77eb7a070d..6d82055e53 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_5_9__Drop_constraint_for_DSpace_1_6_schema.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V1_5_9__Drop_constraint_for_DSpace_1_6_schema.java @@ -8,11 +8,10 @@ package org.dspace.storage.rdbms.migration; import java.io.IOException; -import java.sql.Connection; import java.sql.SQLException; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * This class is in support of the "V1.6__Upgrade_to_DSpace_1.6_schema.sql" @@ -38,26 +37,29 @@ import org.flywaydb.core.api.migration.jdbc.JdbcMigration; * @author Tim Donohue */ public class V1_5_9__Drop_constraint_for_DSpace_1_6_schema - implements JdbcMigration, MigrationChecksumProvider { + extends BaseJavaMigration { /* The checksum to report for this migration (when successful) */ private int checksum = -1; /** * Actually migrate the existing database * - * @param connection SQL Connection object + * @param context Flyway Migration Context * @throws IOException A general class of exceptions produced by failed or interrupted I/O operations. * @throws SQLException An exception that provides information on a database access error or other errors. 
*/ @Override - public void migrate(Connection connection) + public void migrate(Context context) throws IOException, SQLException { // Drop the constraint associated with "collection_id" column of "community2collection" table - int return1 = MigrationUtils.dropDBConstraint(connection, "community2collection", "collection_id", "pkey"); + int return1 = MigrationUtils.dropDBConstraint(context.getConnection(), "community2collection", + "collection_id", "pkey"); // Drop the constraint associated with "child_comm_id" column of "community2community" table - int return2 = MigrationUtils.dropDBConstraint(connection, "community2community", "child_comm_id", "pkey"); + int return2 = MigrationUtils.dropDBConstraint(context.getConnection(), "community2community", + "child_comm_id", "pkey"); // Drop the constraint associated with "item_id" column of "collection2item" table - int return3 = MigrationUtils.dropDBConstraint(connection, "collection2item", "item_id", "pkey"); + int return3 = MigrationUtils.dropDBConstraint(context.getConnection(), "collection2item", + "item_id", "pkey"); // Checksum will just be the sum of those three return values checksum = return1 + return2 + return3; diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_0_2014_09_25__DS_1582_Metadata_For_All_Objects_drop_constraint.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_0_2014_09_25__DS_1582_Metadata_For_All_Objects_drop_constraint.java index 17598ade6c..ea72d99b6e 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_0_2014_09_25__DS_1582_Metadata_For_All_Objects_drop_constraint.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_0_2014_09_25__DS_1582_Metadata_For_All_Objects_drop_constraint.java @@ -8,11 +8,10 @@ package org.dspace.storage.rdbms.migration; import java.io.IOException; -import java.sql.Connection; import java.sql.SQLException; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * This class is in support of the DS-1582 Metadata for All Objects feature. @@ -39,22 +38,22 @@ import org.flywaydb.core.api.migration.jdbc.JdbcMigration; * @author Tim Donohue */ public class V5_0_2014_09_25__DS_1582_Metadata_For_All_Objects_drop_constraint - implements JdbcMigration, MigrationChecksumProvider { + extends BaseJavaMigration { /* The checksum to report for this migration (when successful) */ private int checksum = -1; /** * Actually migrate the existing database * - * @param connection SQL Connection object + * @param context Flyway Migration Context * @throws IOException A general class of exceptions produced by failed or interrupted I/O operations. * @throws SQLException An exception that provides information on a database access error or other errors. 
*/ @Override - public void migrate(Connection connection) + public void migrate(Context context) throws IOException, SQLException { // Drop the constraint associated with "item_id" column of "metadatavalue" - checksum = MigrationUtils.dropDBConstraint(connection, "metadatavalue", "item_id", "fkey"); + checksum = MigrationUtils.dropDBConstraint(context.getConnection(), "metadatavalue", "item_id", "fkey"); } /** diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_7_2017_05_05__DS_3431_Add_Policies_for_BasicWorkflow.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_7_2017_05_05__DS_3431_Add_Policies_for_BasicWorkflow.java index 3b8e551b12..58fdc78d06 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_7_2017_05_05__DS_3431_Add_Policies_for_BasicWorkflow.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V5_7_2017_05_05__DS_3431_Add_Policies_for_BasicWorkflow.java @@ -7,25 +7,21 @@ */ package org.dspace.storage.rdbms.migration; -import java.sql.Connection; - -import org.dspace.core.Constants; import org.dspace.storage.rdbms.DatabaseUtils; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; -import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; public class V5_7_2017_05_05__DS_3431_Add_Policies_for_BasicWorkflow - implements JdbcMigration, MigrationChecksumProvider { + extends BaseJavaMigration { // Size of migration script run Integer migration_file_size = -1; @Override - public void migrate(Connection connection) throws Exception { + public void migrate(Context context) throws Exception { // Based on type of DB, get path to SQL migration script - String dbtype = DatabaseUtils.getDbType(connection); + String dbtype = DatabaseUtils.getDbType(context.getConnection()); String dataMigrateSQL; String sqlMigrationPath = "org/dspace/storage/rdbms/sqlmigration/workflow/" + dbtype + "/"; @@ -33,19 +29,18 @@ public class V5_7_2017_05_05__DS_3431_Add_Policies_for_BasicWorkflow // If XMLWorkflow Table does NOT exist in this database, then lets do the migration! 
// If XMLWorkflow Table ALREADY exists, then this migration is a noop, we assume you manually ran the sql // scripts - if (DatabaseUtils.tableExists(connection, "cwf_workflowitem")) { + if (DatabaseUtils.tableExists(context.getConnection(), "cwf_workflowitem")) { return; } else { //Migrate the basic workflow // Get the contents of our data migration script, based on path & DB type - dataMigrateSQL = new ClassPathResource(sqlMigrationPath + "basicWorkflow" + "/V5.7_2017.05.05__DS-3431.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + dataMigrateSQL = MigrationUtils.getResourceAsString( + sqlMigrationPath + "basicWorkflow/V5.7_2017.05.05__DS-3431.sql"); } // Actually execute the Data migration SQL // This will migrate all existing traditional workflows to the new XMLWorkflow system & tables - DatabaseUtils.executeSql(connection, dataMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dataMigrateSQL); migration_file_size = dataMigrateSQL.length(); } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_03_06__DS_2701_Dso_Uuid_Migration.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_03_06__DS_2701_Dso_Uuid_Migration.java index 98ac8752be..7aa0dc50a7 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_03_06__DS_2701_Dso_Uuid_Migration.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_03_06__DS_2701_Dso_Uuid_Migration.java @@ -7,30 +7,29 @@ */ package org.dspace.storage.rdbms.migration; -import java.sql.Connection; - -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * Migration class that will drop the public key for the dspace objects, the integer based key will be moved to a UUID * * @author kevinvandevelde at atmire.com */ -public class V6_0_2015_03_06__DS_2701_Dso_Uuid_Migration implements JdbcMigration, MigrationChecksumProvider { +public class V6_0_2015_03_06__DS_2701_Dso_Uuid_Migration extends BaseJavaMigration { private int checksum = -1; @Override - public void migrate(Connection connection) throws Exception { - checksum += MigrationUtils.dropDBConstraint(connection, "eperson", "eperson_id", "pkey"); - checksum += MigrationUtils.dropDBConstraint(connection, "epersongroup", "eperson_group_id", "pkey"); - checksum += MigrationUtils.dropDBConstraint(connection, "community", "community_id", "pkey"); - checksum += MigrationUtils.dropDBConstraint(connection, "collection", "collection_id", "pkey"); - checksum += MigrationUtils.dropDBConstraint(connection, "item", "item_id", "pkey"); - checksum += MigrationUtils.dropDBConstraint(connection, "bundle", "bundle_id", "pkey"); - checksum += MigrationUtils.dropDBConstraint(connection, "bitstream", "bitstream_id", "pkey"); + public void migrate(Context context) throws Exception { + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "eperson", "eperson_id", "pkey"); + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "epersongroup", + "eperson_group_id", "pkey"); + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "community", "community_id", "pkey"); + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "collection", "collection_id", "pkey"); + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "item", 
"item_id", "pkey"); + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "bundle", "bundle_id", "pkey"); + checksum += MigrationUtils.dropDBConstraint(context.getConnection(), "bitstream", "bitstream_id", "pkey"); } @Override diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration.java index 62f4b126ba..dd01aa8d2c 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration.java @@ -7,29 +7,25 @@ */ package org.dspace.storage.rdbms.migration; -import java.sql.Connection; - -import org.dspace.core.Constants; import org.dspace.storage.rdbms.DatabaseUtils; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; -import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * User: kevin (kevin at atmire.com) * Date: 1/09/15 * Time: 12:08 */ -public class V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration implements JdbcMigration, MigrationChecksumProvider { +public class V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration extends BaseJavaMigration { // Size of migration script run Integer migration_file_size = -1; @Override - public void migrate(Connection connection) throws Exception { + public void migrate(Context context) throws Exception { // Based on type of DB, get path to SQL migration script - String dbtype = DatabaseUtils.getDbType(connection); + String dbtype = DatabaseUtils.getDbType(context.getConnection()); String dataMigrateSQL; String sqlMigrationPath = "org/dspace/storage/rdbms/sqlmigration/workflow/" + dbtype + "/"; @@ -37,24 +33,20 @@ public class V6_0_2015_08_31__DS_2701_Hibernate_Workflow_Migration implements Jd // If XMLWorkflow Table does NOT exist in this database, then lets do the migration! 
// If XMLWorkflow Table ALREADY exists, then this migration is a noop, we assume you manually ran the sql // scripts - if (DatabaseUtils.tableExists(connection, "cwf_workflowitem")) { + if (DatabaseUtils.tableExists(context.getConnection(), "cwf_workflowitem")) { // Get the contents of our data migration script, based on path & DB type - dataMigrateSQL = new ClassPathResource(sqlMigrationPath + "xmlworkflow" + - "/V6.0_2015.08.11__DS-2701_Xml_Workflow_Migration.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + dataMigrateSQL = MigrationUtils.getResourceAsString(sqlMigrationPath + "xmlworkflow" + + "/V6.0_2015.08.11__DS-2701_Xml_Workflow_Migration.sql"); } else { //Migrate the basic workflow // Get the contents of our data migration script, based on path & DB type - dataMigrateSQL = new ClassPathResource(sqlMigrationPath + "basicWorkflow" + - "/V6.0_2015.08.11__DS-2701_Basic_Workflow_Migration.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + dataMigrateSQL = MigrationUtils.getResourceAsString(sqlMigrationPath + "basicWorkflow" + + "/V6.0_2015.08.11__DS-2701_Basic_Workflow_Migration.sql"); } // Actually execute the Data migration SQL // This will migrate all existing traditional workflows to the new XMLWorkflow system & tables - DatabaseUtils.executeSql(connection, dataMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dataMigrateSQL); migration_file_size = dataMigrateSQL.length(); } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables.java index 2b614b5356..daf2269e92 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables.java @@ -14,8 +14,8 @@ import org.apache.logging.log4j.Logger; import org.dspace.browse.BrowseException; import org.dspace.browse.BrowseIndex; import org.dspace.storage.rdbms.DatabaseUtils; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * This Flyway Java migration deletes any legacy DBMS browse tables found in @@ -23,7 +23,7 @@ import org.flywaydb.core.api.migration.jdbc.JdbcMigration; * * @author Tim Donohue */ -public class V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables implements JdbcMigration, MigrationChecksumProvider { +public class V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables extends BaseJavaMigration { /** * log4j category */ @@ -34,8 +34,8 @@ public class V6_0_2016_01_26__DS_2188_Remove_DBMS_Browse_Tables implements JdbcM private int checksum = -1; @Override - public void migrate(Connection connection) throws Exception, SQLException { - removeDBMSBrowseTables(connection); + public void migrate(Context context) throws Exception, SQLException { + removeDBMSBrowseTables(context.getConnection()); } /** diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_1_2017_01_03__DS_3431_Add_Policies_for_BasicWorkflow.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_1_2017_01_03__DS_3431_Add_Policies_for_BasicWorkflow.java index 51e401b400..13636b311e 100644 --- 
a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_1_2017_01_03__DS_3431_Add_Policies_for_BasicWorkflow.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V6_1_2017_01_03__DS_3431_Add_Policies_for_BasicWorkflow.java @@ -7,25 +7,21 @@ */ package org.dspace.storage.rdbms.migration; -import java.sql.Connection; - -import org.dspace.core.Constants; import org.dspace.storage.rdbms.DatabaseUtils; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; -import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; public class V6_1_2017_01_03__DS_3431_Add_Policies_for_BasicWorkflow - implements JdbcMigration, MigrationChecksumProvider { + extends BaseJavaMigration { // Size of migration script run Integer migration_file_size = -1; @Override - public void migrate(Connection connection) throws Exception { + public void migrate(Context context) throws Exception { // Based on type of DB, get path to SQL migration script - String dbtype = DatabaseUtils.getDbType(connection); + String dbtype = DatabaseUtils.getDbType(context.getConnection()); String dataMigrateSQL; String sqlMigrationPath = "org/dspace/storage/rdbms/sqlmigration/workflow/" + dbtype + "/"; @@ -33,19 +29,18 @@ public class V6_1_2017_01_03__DS_3431_Add_Policies_for_BasicWorkflow // If XMLWorkflow Table does NOT exist in this database, then lets do the migration! // If XMLWorkflow Table ALREADY exists, then this migration is a noop, we assume you manually ran the sql // scripts - if (DatabaseUtils.tableExists(connection, "cwf_workflowitem")) { + if (DatabaseUtils.tableExists(context.getConnection(), "cwf_workflowitem")) { return; } else { //Migrate the basic workflow // Get the contents of our data migration script, based on path & DB type - dataMigrateSQL = new ClassPathResource(sqlMigrationPath + "basicWorkflow" + "/V6.1_2017.01.03__DS-3431.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + dataMigrateSQL = MigrationUtils.getResourceAsString(sqlMigrationPath + "basicWorkflow" + + "/V6.1_2017.01.03__DS-3431.sql"); } // Actually execute the Data migration SQL // This will migrate all existing traditional workflows to the new XMLWorkflow system & tables - DatabaseUtils.executeSql(connection, dataMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dataMigrateSQL); migration_file_size = dataMigrateSQL.length(); } diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V7_0_2018_04_03__Upgrade_Workflow_Policy.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V7_0_2018_04_03__Upgrade_Workflow_Policy.java index 100e345df3..3da7f8b40f 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V7_0_2018_04_03__Upgrade_Workflow_Policy.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/migration/V7_0_2018_04_03__Upgrade_Workflow_Policy.java @@ -7,15 +7,11 @@ */ package org.dspace.storage.rdbms.migration; -import java.sql.Connection; - -import org.dspace.core.Constants; import org.dspace.storage.rdbms.DatabaseUtils; import org.dspace.workflow.factory.WorkflowServiceFactory; import org.dspace.xmlworkflow.service.XmlWorkflowService; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; -import 
org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * This class automatically adding rptype to the resource policy created with a migration into XML-based Configurable @@ -23,30 +19,27 @@ import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; * * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) */ -public class V7_0_2018_04_03__Upgrade_Workflow_Policy implements JdbcMigration, MigrationChecksumProvider { +public class V7_0_2018_04_03__Upgrade_Workflow_Policy extends BaseJavaMigration { // Size of migration script run protected Integer migration_file_size = -1; @Override - public void migrate(Connection connection) throws Exception { + public void migrate(Context context) throws Exception { // Make sure XML Workflow is enabled, shouldn't even be needed since this class is only loaded if the service // is enabled. if (WorkflowServiceFactory.getInstance().getWorkflowService() instanceof XmlWorkflowService) { // Now, check if the XMLWorkflow table (cwf_workflowitem) already exists in this database - if (DatabaseUtils.tableExists(connection, "cwf_workflowitem")) { - String dbtype = DatabaseUtils.getDbType(connection); + if (DatabaseUtils.tableExists(context.getConnection(), "cwf_workflowitem")) { + String dbtype = DatabaseUtils.getDbType(context.getConnection()); String sqlMigrationPath = "org/dspace/storage/rdbms/sqlmigration/workflow/" + dbtype + "/"; - String dataMigrateSQL = new ClassPathResource(sqlMigrationPath + - "xmlworkflow" + - "/V7.0_2018.04.03__upgrade_workflow_policy.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + String dataMigrateSQL = MigrationUtils.getResourceAsString( + sqlMigrationPath + "xmlworkflow/V7.0_2018.04.03__upgrade_workflow_policy.sql"); // Actually execute the Data migration SQL // This will migrate all existing traditional workflows to the new XMLWorkflow system & tables - DatabaseUtils.executeSql(connection, dataMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dataMigrateSQL); // Assuming both succeeded, save the size of the scripts for getChecksum() below migration_file_size = dataMigrateSQL.length(); diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V5_0_2014_11_04__Enable_XMLWorkflow_Migration.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V5_0_2014_11_04__Enable_XMLWorkflow_Migration.java index 8f6a305bb9..dc8c7c22df 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V5_0_2014_11_04__Enable_XMLWorkflow_Migration.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V5_0_2014_11_04__Enable_XMLWorkflow_Migration.java @@ -8,16 +8,14 @@ package org.dspace.storage.rdbms.xmlworkflow; import java.io.IOException; -import java.sql.Connection; import java.sql.SQLException; -import org.dspace.core.Constants; import org.dspace.storage.rdbms.DatabaseUtils; +import org.dspace.storage.rdbms.migration.MigrationUtils; import org.dspace.workflow.factory.WorkflowServiceFactory; import org.dspace.xmlworkflow.service.XmlWorkflowService; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; -import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; import org.slf4j.Logger; 
import org.slf4j.LoggerFactory; @@ -38,7 +36,7 @@ import org.slf4j.LoggerFactory; * @author Tim Donohue */ public class V5_0_2014_11_04__Enable_XMLWorkflow_Migration - implements JdbcMigration, MigrationChecksumProvider { + extends BaseJavaMigration { /** * logging category */ @@ -50,12 +48,12 @@ public class V5_0_2014_11_04__Enable_XMLWorkflow_Migration /** * Actually migrate the existing database * - * @param connection SQL Connection object + * @param context Flyway Migration Context * @throws IOException A general class of exceptions produced by failed or interrupted I/O operations. * @throws SQLException An exception that provides information on a database access error or other errors. */ @Override - public void migrate(Connection connection) + public void migrate(Context context) throws IOException, SQLException { // Make sure XML Workflow is enabled, shouldn't even be needed since this class is only loaded if the service // is enabled. @@ -64,13 +62,13 @@ public class V5_0_2014_11_04__Enable_XMLWorkflow_Migration // migration, as it is incompatible // with a 6.x database. In that scenario the corresponding 6.x XML Workflow migration will create // necessary tables. - && DatabaseUtils.getCurrentFlywayDSpaceState(connection) < 6) { + && DatabaseUtils.getCurrentFlywayDSpaceState(context.getConnection()) < 6) { // Now, check if the XMLWorkflow table (cwf_workflowitem) already exists in this database // If XMLWorkflow Table does NOT exist in this database, then lets do the migration! // If XMLWorkflow Table ALREADY exists, then this migration is a noop, we assume you manually ran the sql // scripts - if (!DatabaseUtils.tableExists(connection, "cwf_workflowitem")) { - String dbtype = connection.getMetaData().getDatabaseProductName(); + if (!DatabaseUtils.tableExists(context.getConnection(), "cwf_workflowitem")) { + String dbtype = context.getConnection().getMetaData().getDatabaseProductName(); String dbFileLocation = null; if (dbtype.toLowerCase().contains("postgres")) { dbFileLocation = "postgres"; @@ -88,27 +86,21 @@ public class V5_0_2014_11_04__Enable_XMLWorkflow_Migration // Get the contents of our DB Schema migration script, based on path & DB type // (e.g. /src/main/resources/[path-to-this-class]/postgres/xml_workflow_migration.sql) - String dbMigrateSQL = new ClassPathResource(packagePath + "/" + - dbFileLocation + - "/xml_workflow_migration.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + String dbMigrateSQL = MigrationUtils.getResourceAsString(packagePath + "/" + dbFileLocation + + "/xml_workflow_migration.sql"); // Actually execute the Database schema migration SQL // This will create the necessary tables for the XMLWorkflow feature - DatabaseUtils.executeSql(connection, dbMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dbMigrateSQL); // Get the contents of our data migration script, based on path & DB type // (e.g. 
/src/main/resources/[path-to-this-class]/postgres/data_workflow_migration.sql) - String dataMigrateSQL = new ClassPathResource(packagePath + "/" + - dbFileLocation + - "/data_workflow_migration.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + String dataMigrateSQL = MigrationUtils.getResourceAsString(packagePath + "/" + dbFileLocation + + "/data_workflow_migration.sql"); // Actually execute the Data migration SQL // This will migrate all existing traditional workflows to the new XMLWorkflow system & tables - DatabaseUtils.executeSql(connection, dataMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dataMigrateSQL); // Assuming both succeeded, save the size of the scripts for getChecksum() below migration_file_size = dbMigrateSQL.length() + dataMigrateSQL.length(); diff --git a/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration.java b/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration.java index ab090f5bf1..b70b19f3a5 100644 --- a/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration.java +++ b/dspace-api/src/main/java/org/dspace/storage/rdbms/xmlworkflow/V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration.java @@ -7,15 +7,13 @@ */ package org.dspace.storage.rdbms.xmlworkflow; -import java.sql.Connection; -import org.dspace.core.Constants; import org.dspace.storage.rdbms.DatabaseUtils; +import org.dspace.storage.rdbms.migration.MigrationUtils; import org.dspace.workflow.factory.WorkflowServiceFactory; import org.dspace.xmlworkflow.service.XmlWorkflowService; -import org.flywaydb.core.api.migration.MigrationChecksumProvider; -import org.flywaydb.core.api.migration.jdbc.JdbcMigration; -import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; +import org.flywaydb.core.api.migration.BaseJavaMigration; +import org.flywaydb.core.api.migration.Context; /** * This class automatically migrates your DSpace Database to use the @@ -34,13 +32,13 @@ import org.flywaydb.core.internal.util.scanner.classpath.ClassPathResource; * Date: 1/09/15 * Time: 11:34 */ -public class V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration implements JdbcMigration, MigrationChecksumProvider { +public class V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration extends BaseJavaMigration { // Size of migration script run protected Integer migration_file_size = -1; @Override - public void migrate(Connection connection) throws Exception { + public void migrate(Context context) throws Exception { // Make sure XML Workflow is enabled, shouldn't even be needed since this class is only loaded if the service // is enabled. if (WorkflowServiceFactory.getInstance().getWorkflowService() instanceof XmlWorkflowService) { @@ -48,8 +46,8 @@ public class V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration implements Jd // If XMLWorkflow Table does NOT exist in this database, then lets do the migration! 
// If XMLWorkflow Table ALREADY exists, then this migration is a noop, we assume you manually ran the sql // scripts - if (!DatabaseUtils.tableExists(connection, "cwf_workflowitem")) { - String dbtype = connection.getMetaData().getDatabaseProductName(); + if (!DatabaseUtils.tableExists(context.getConnection(), "cwf_workflowitem")) { + String dbtype = context.getConnection().getMetaData().getDatabaseProductName(); String dbFileLocation = null; if (dbtype.toLowerCase().contains("postgres")) { dbFileLocation = "postgres"; @@ -67,27 +65,21 @@ public class V6_0_2015_09_01__DS_2701_Enable_XMLWorkflow_Migration implements Jd // Get the contents of our DB Schema migration script, based on path & DB type // (e.g. /src/main/resources/[path-to-this-class]/postgres/xml_workflow_migration.sql) - String dbMigrateSQL = new ClassPathResource(packagePath + "/" + - dbFileLocation + - "/v6.0__DS-2701_xml_workflow_migration.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + String dbMigrateSQL = MigrationUtils.getResourceAsString(packagePath + "/" + dbFileLocation + + "/v6.0__DS-2701_xml_workflow_migration.sql"); // Actually execute the Database schema migration SQL // This will create the necessary tables for the XMLWorkflow feature - DatabaseUtils.executeSql(connection, dbMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dbMigrateSQL); // Get the contents of our data migration script, based on path & DB type // (e.g. /src/main/resources/[path-to-this-class]/postgres/data_workflow_migration.sql) - String dataMigrateSQL = new ClassPathResource(packagePath + "/" + - dbFileLocation + - "/v6.0__DS-2701_data_workflow_migration.sql", - getClass().getClassLoader()) - .loadAsString(Constants.DEFAULT_ENCODING); + String dataMigrateSQL = MigrationUtils.getResourceAsString(packagePath + "/" + dbFileLocation + + "/v6.0__DS-2701_data_workflow_migration.sql"); // Actually execute the Data migration SQL // This will migrate all existing traditional workflows to the new XMLWorkflow system & tables - DatabaseUtils.executeSql(connection, dataMigrateSQL); + DatabaseUtils.executeSql(context.getConnection(), dataMigrateSQL); // Assuming both succeeded, save the size of the scripts for getChecksum() below migration_file_size = dbMigrateSQL.length() + dataMigrateSQL.length(); diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivFileDataLoader.java b/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivFileDataLoader.java deleted file mode 100644 index ebc898e4cf..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivFileDataLoader.java +++ /dev/null @@ -1,146 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ - -package org.dspace.submit.lookup; - -import java.io.File; -import java.io.FileInputStream; -import java.io.FileNotFoundException; -import java.io.IOException; -import java.io.InputStream; -import java.util.List; -import java.util.Map; -import javax.xml.parsers.DocumentBuilder; -import javax.xml.parsers.DocumentBuilderFactory; -import javax.xml.parsers.ParserConfigurationException; - -import gr.ekt.bte.core.DataLoadingSpec; -import gr.ekt.bte.core.Record; -import gr.ekt.bte.core.RecordSet; -import gr.ekt.bte.core.Value; -import gr.ekt.bte.dataloader.FileDataLoader; -import gr.ekt.bte.exceptions.MalformedSourceException; -import org.apache.commons.lang3.StringUtils; 
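The four Flyway migration classes above all follow the same conversion: extend org.flywaydb.core.api.migration.BaseJavaMigration (replacing the older JdbcMigration/MigrationChecksumProvider pair), accept a Context instead of a raw java.sql.Connection, and load their SQL through MigrationUtils.getResourceAsString rather than Flyway's internal ClassPathResource. A minimal sketch of that pattern is below; the class name, table name, and script path are illustrative only, and the only DSpace helpers relied on are DatabaseUtils and MigrationUtils as shown in the diff.

    package org.dspace.storage.rdbms.migration;

    import org.dspace.storage.rdbms.DatabaseUtils;
    import org.flywaydb.core.api.migration.BaseJavaMigration;
    import org.flywaydb.core.api.migration.Context;

    // Hypothetical example migration following the pattern used in this patch.
    public class V0_0_0000_00_00__Example_Migration extends BaseJavaMigration {

        // Size of the migration script that was run (reported via getChecksum())
        private Integer migrationFileSize = -1;

        @Override
        public void migrate(Context context) throws Exception {
            // If the target table already exists, this migration is a no-op
            if (DatabaseUtils.tableExists(context.getConnection(), "example_table")) {
                return;
            }
            // Resolve the SQL script for the current database type
            String dbtype = DatabaseUtils.getDbType(context.getConnection());
            String sql = MigrationUtils.getResourceAsString(
                "org/dspace/storage/rdbms/sqlmigration/example/" + dbtype + "/example.sql");
            // Execute the script through the Flyway-supplied connection
            DatabaseUtils.executeSql(context.getConnection(), sql);
            migrationFileSize = sql.length();
        }

        @Override
        public Integer getChecksum() {
            return migrationFileSize;
        }
    }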
-import org.apache.logging.log4j.Logger; -import org.dspace.app.util.XMLUtils; -import org.w3c.dom.Document; -import org.w3c.dom.Element; -import org.xml.sax.SAXException; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class ArXivFileDataLoader extends FileDataLoader { - - private static Logger log = org.apache.logging.log4j.LogManager.getLogger(ArXivFileDataLoader.class); - - Map fieldMap; // mapping between service fields and local - // intermediate fields - - /** - * Empty constructor - */ - public ArXivFileDataLoader() { - } - - /** - * @param filename Name of file to load ArXiv data from. - */ - public ArXivFileDataLoader(String filename) { - super(filename); - } - - /* - * {@see gr.ekt.bte.core.DataLoader#getRecords()} - * - * @throws MalformedSourceException - */ - @Override - public RecordSet getRecords() throws MalformedSourceException { - - RecordSet recordSet = new RecordSet(); - - try { - InputStream inputStream = new FileInputStream(new File(filename)); - - DocumentBuilderFactory factory = DocumentBuilderFactory - .newInstance(); - factory.setValidating(false); - factory.setIgnoringComments(true); - factory.setIgnoringElementContentWhitespace(true); - - DocumentBuilder db = factory.newDocumentBuilder(); - Document inDoc = db.parse(inputStream); - - Element xmlRoot = inDoc.getDocumentElement(); - List dataRoots = XMLUtils.getElementList(xmlRoot, "entry"); - - for (Element dataRoot : dataRoots) { - Record record = ArxivUtils.convertArxixDomToRecord(dataRoot); - if (record != null) { - recordSet.addRecord(convertFields(record)); - } - } - } catch (FileNotFoundException e) { - log.error(e.getMessage(), e); - } catch (ParserConfigurationException e) { - log.error(e.getMessage(), e); - } catch (SAXException e) { - log.error(e.getMessage(), e); - } catch (IOException e) { - log.error(e.getMessage(), e); - } - - return recordSet; - } - - /* - * (non-Javadoc) - * - * @see - * gr.ekt.bte.core.DataLoader#getRecords(gr.ekt.bte.core.DataLoadingSpec) - */ - @Override - public RecordSet getRecords(DataLoadingSpec spec) - throws MalformedSourceException { - if (spec.getOffset() > 0) { - return new RecordSet(); - } - return getRecords(); - } - - public Record convertFields(Record publication) { - for (String fieldName : fieldMap.keySet()) { - String md = null; - if (fieldMap != null) { - md = this.fieldMap.get(fieldName); - } - - if (StringUtils.isBlank(md)) { - continue; - } else { - md = md.trim(); - } - - if (publication.isMutable()) { - List values = publication.getValues(fieldName); - publication.makeMutable().removeField(fieldName); - publication.makeMutable().addField(md, values); - } - } - - return publication; - } - - public void setFieldMap(Map fieldMap) { - this.fieldMap = fieldMap; - } -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivOnlineDataLoader.java b/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivOnlineDataLoader.java deleted file mode 100644 index e477412621..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivOnlineDataLoader.java +++ /dev/null @@ -1,84 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.submit.lookup; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; 
-import java.util.Map; -import java.util.Set; - -import gr.ekt.bte.core.Record; -import org.apache.http.HttpException; -import org.dspace.core.Context; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class ArXivOnlineDataLoader extends NetworkSubmissionLookupDataLoader { - protected ArXivService arXivService = new ArXivService(); - - protected boolean searchProvider = true; - - public void setArXivService(ArXivService arXivService) { - this.arXivService = arXivService; - } - - @Override - public List getSupportedIdentifiers() { - return Arrays.asList(new String[] {ARXIV, DOI}); - } - - public void setSearchProvider(boolean searchProvider) { - this.searchProvider = searchProvider; - } - - @Override - public boolean isSearchProvider() { - return searchProvider; - } - - @Override - public List getByIdentifier(Context context, - Map> keys) throws HttpException, IOException { - List results = new ArrayList(); - if (keys != null) { - Set dois = keys.get(DOI); - Set arxivids = keys.get(ARXIV); - List items = new ArrayList(); - if (dois != null && dois.size() > 0) { - items.addAll(arXivService.getByDOIs(dois)); - } - if (arxivids != null && arxivids.size() > 0) { - for (String arxivid : arxivids) { - items.add(arXivService.getByArXivIDs(arxivid)); - } - } - - for (Record item : items) { - results.add(convertFields(item)); - } - } - return results; - } - - @Override - public List search(Context context, String title, String author, - int year) throws HttpException, IOException { - List results = new ArrayList(); - List items = arXivService.searchByTerm(title, author, year); - for (Record item : items) { - results.add(convertFields(item)); - } - return results; - } -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivService.java b/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivService.java deleted file mode 100644 index 0a32871758..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/ArXivService.java +++ /dev/null @@ -1,159 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.submit.lookup; - -import java.io.IOException; -import java.net.URISyntaxException; -import java.util.ArrayList; -import java.util.List; -import java.util.Set; -import javax.xml.parsers.DocumentBuilder; -import javax.xml.parsers.DocumentBuilderFactory; - -import gr.ekt.bte.core.Record; -import org.apache.commons.lang3.StringUtils; -import org.apache.http.HttpException; -import org.apache.http.HttpResponse; -import org.apache.http.HttpStatus; -import org.apache.http.StatusLine; -import org.apache.http.client.HttpClient; -import org.apache.http.client.methods.HttpGet; -import org.apache.http.client.utils.URIBuilder; -import org.apache.http.impl.client.DefaultHttpClient; -import org.apache.http.params.CoreConnectionPNames; -import org.apache.http.params.HttpParams; -import org.dspace.app.util.XMLUtils; -import org.w3c.dom.Document; -import org.w3c.dom.Element; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class ArXivService { - private int timeout = 1000; - - /** - * How long to wait for a connection to be established. 
- * - * @param timeout milliseconds - */ - public void setTimeout(int timeout) { - this.timeout = timeout; - } - - public List getByDOIs(Set dois) throws HttpException, - IOException { - if (dois != null && dois.size() > 0) { - String doisQuery = StringUtils.join(dois.iterator(), " OR "); - return search(doisQuery, null, 100); - } - return null; - } - - public List searchByTerm(String title, String author, int year) - throws HttpException, IOException { - StringBuffer query = new StringBuffer(); - if (StringUtils.isNotBlank(title)) { - query.append("ti:\"").append(title).append("\""); - } - if (StringUtils.isNotBlank(author)) { - // [FAU] - if (query.length() > 0) { - query.append(" AND "); - } - query.append("au:\"").append(author).append("\""); - } - return search(query.toString(), "", 10); - } - - protected List search(String query, String arxivid, int max_result) - throws IOException, HttpException { - List results = new ArrayList(); - HttpGet method = null; - try { - HttpClient client = new DefaultHttpClient(); - HttpParams params = client.getParams(); - params.setIntParameter(CoreConnectionPNames.CONNECTION_TIMEOUT, timeout); - - try { - URIBuilder uriBuilder = new URIBuilder("http://export.arxiv.org/api/query"); - uriBuilder.addParameter("id_list", arxivid); - uriBuilder.addParameter("search_query", query); - uriBuilder.addParameter("max_results", String.valueOf(max_result)); - method = new HttpGet(uriBuilder.build()); - } catch (URISyntaxException ex) { - throw new HttpException(ex.getMessage()); - } - - // Execute the method. - HttpResponse response = client.execute(method); - StatusLine responseStatus = response.getStatusLine(); - int statusCode = responseStatus.getStatusCode(); - - if (statusCode != HttpStatus.SC_OK) { - if (statusCode == HttpStatus.SC_BAD_REQUEST) { - throw new RuntimeException("arXiv query is not valid"); - } else { - throw new RuntimeException("Http call failed: " - + responseStatus); - } - } - - try { - DocumentBuilderFactory factory = DocumentBuilderFactory - .newInstance(); - factory.setValidating(false); - factory.setIgnoringComments(true); - factory.setIgnoringElementContentWhitespace(true); - - DocumentBuilder db = factory.newDocumentBuilder(); - Document inDoc = db.parse(response.getEntity().getContent()); - - Element xmlRoot = inDoc.getDocumentElement(); - List dataRoots = XMLUtils.getElementList(xmlRoot, - "entry"); - - for (Element dataRoot : dataRoots) { - Record crossitem = ArxivUtils - .convertArxixDomToRecord(dataRoot); - if (crossitem != null) { - results.add(crossitem); - } - } - } catch (Exception e) { - throw new RuntimeException( - "ArXiv identifier is not valid or not exist"); - } - } finally { - if (method != null) { - method.releaseConnection(); - } - } - - return results; - } - - public Record getByArXivIDs(String raw) throws HttpException, IOException { - if (StringUtils.isNotBlank(raw)) { - raw = raw.trim(); - if (raw.startsWith("http://arxiv.org/abs/")) { - raw = raw.substring("http://arxiv.org/abs/".length()); - } else if (raw.toLowerCase().startsWith("arxiv:")) { - raw = raw.substring("arxiv:".length()); - } - List result = search("", raw, 1); - if (result != null && result.size() > 0) { - return result.get(0); - } - } - return null; - } -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/ArxivUtils.java b/dspace-api/src/main/java/org/dspace/submit/lookup/ArxivUtils.java deleted file mode 100644 index 4caa0a957b..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/ArxivUtils.java +++ /dev/null @@ -1,151 
+0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -/** - * - */ -package org.dspace.submit.lookup; - -import java.util.LinkedList; -import java.util.List; - -import gr.ekt.bte.core.MutableRecord; -import gr.ekt.bte.core.Record; -import gr.ekt.bte.core.StringValue; -import gr.ekt.bte.core.Value; -import org.dspace.app.util.XMLUtils; -import org.dspace.submit.util.SubmissionLookupPublication; -import org.w3c.dom.Element; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class ArxivUtils { - - /** - * Default constructor - */ - private ArxivUtils() { } - - public static Record convertArxixDomToRecord(Element dataRoot) { - MutableRecord record = new SubmissionLookupPublication(""); - - String articleTitle = XMLUtils.getElementValue(dataRoot, "title"); - if (articleTitle != null) { - record.addValue("title", new StringValue(articleTitle)); - } - String summary = XMLUtils.getElementValue(dataRoot, "summary"); - if (summary != null) { - record.addValue("summary", new StringValue(summary)); - } - String year = XMLUtils.getElementValue(dataRoot, "published"); - if (year != null) { - record.addValue("published", new StringValue(year)); - } - String splashPageUrl = XMLUtils.getElementValue(dataRoot, "id"); - if (splashPageUrl != null) { - record.addValue("id", new StringValue(splashPageUrl)); - } - String comment = XMLUtils.getElementValue(dataRoot, "arxiv:comment"); - if (comment != null) { - record.addValue("comment", new StringValue(comment)); - } - - List links = XMLUtils.getElementList(dataRoot, "link"); - if (links != null) { - for (Element link : links) { - if ("related".equals(link.getAttribute("rel")) - && "pdf".equals(link.getAttribute("title"))) { - String pdfUrl = link.getAttribute("href"); - if (pdfUrl != null) { - record.addValue("pdfUrl", new StringValue(pdfUrl)); - } - } - } - } - - String doi = XMLUtils.getElementValue(dataRoot, "arxiv:doi"); - if (doi != null) { - record.addValue("doi", new StringValue(doi)); - } - String journalRef = XMLUtils.getElementValue(dataRoot, - "arxiv:journal_ref"); - if (journalRef != null) { - record.addValue("journalRef", new StringValue(journalRef)); - } - - List primaryCategory = new LinkedList(); - List primaryCategoryList = XMLUtils.getElementList(dataRoot, - "arxiv:primary_category"); - if (primaryCategoryList != null) { - for (Element primaryCategoryElement : primaryCategoryList) { - primaryCategory - .add(primaryCategoryElement.getAttribute("term")); - } - } - - if (primaryCategory.size() > 0) { - List values = new LinkedList(); - for (String s : primaryCategory) { - values.add(new StringValue(s)); - } - record.addField("primaryCategory", values); - } - - List category = new LinkedList(); - List categoryList = XMLUtils.getElementList(dataRoot, - "category"); - if (categoryList != null) { - for (Element categoryElement : categoryList) { - category.add(categoryElement.getAttribute("term")); - } - } - - if (category.size() > 0) { - List values = new LinkedList(); - for (String s : category) { - values.add(new StringValue(s)); - } - record.addField("category", values); - } - - List authors = new LinkedList(); - List authorsWithAffiliations = new LinkedList(); - List authorList = XMLUtils.getElementList(dataRoot, "author"); - if (authorList != null) { - for (Element 
authorElement : authorList) { - String authorName = XMLUtils.getElementValue(authorElement, "name"); - String authorAffiliation = XMLUtils.getElementValue(authorElement, "arxiv:affiliation"); - - authors.add(authorName); - authorsWithAffiliations.add(authorName + ": " + authorAffiliation); - } - } - - if (authors.size() > 0) { - List values = new LinkedList(); - for (String sArray : authors) { - values.add(new StringValue(sArray)); - } - record.addField("author", values); - } - - if (authorsWithAffiliations.size() > 0) { - List values = new LinkedList(); - for (String sArray : authorsWithAffiliations) { - values.add(new StringValue(sArray)); - } - record.addField("authorWithAffiliation", values); - } - - return record; - } - -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/CiNiiService.java b/dspace-api/src/main/java/org/dspace/submit/lookup/CiNiiService.java index 23026353fd..bb59043e52 100644 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/CiNiiService.java +++ b/dspace-api/src/main/java/org/dspace/submit/lookup/CiNiiService.java @@ -102,6 +102,9 @@ public class CiNiiService { factory.setValidating(false); factory.setIgnoringComments(true); factory.setIgnoringElementContentWhitespace(true); + // disallow DTD parsing to ensure no XXE attacks can occur. + // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); DocumentBuilder db = factory.newDocumentBuilder(); Document inDoc = db.parse(response.getEntity().getContent()); @@ -178,6 +181,9 @@ public class CiNiiService { factory.setValidating(false); factory.setIgnoringComments(true); factory.setIgnoringElementContentWhitespace(true); + // disallow DTD parsing to ensure no XXE attacks can occur. + // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); DocumentBuilder db = factory.newDocumentBuilder(); Document inDoc = db.parse(response.getEntity().getContent()); diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/CrossRefService.java b/dspace-api/src/main/java/org/dspace/submit/lookup/CrossRefService.java index f73e9c0352..4b99cf1f8b 100644 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/CrossRefService.java +++ b/dspace-api/src/main/java/org/dspace/submit/lookup/CrossRefService.java @@ -99,6 +99,9 @@ public class CrossRefService { factory.setValidating(false); factory.setIgnoringComments(true); factory.setIgnoringElementContentWhitespace(true); + // disallow DTD parsing to ensure no XXE attacks can occur. 
+ // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html + factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true); DocumentBuilder db = factory .newDocumentBuilder(); diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedFileDataLoader.java b/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedFileDataLoader.java deleted file mode 100644 index 05a37e64d6..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedFileDataLoader.java +++ /dev/null @@ -1,148 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ - -package org.dspace.submit.lookup; - -import java.io.File; -import java.io.FileInputStream; -import java.io.FileNotFoundException; -import java.io.IOException; -import java.io.InputStream; -import java.util.List; -import java.util.Map; -import javax.xml.parsers.DocumentBuilder; -import javax.xml.parsers.DocumentBuilderFactory; -import javax.xml.parsers.ParserConfigurationException; - -import gr.ekt.bte.core.DataLoadingSpec; -import gr.ekt.bte.core.Record; -import gr.ekt.bte.core.RecordSet; -import gr.ekt.bte.core.Value; -import gr.ekt.bte.dataloader.FileDataLoader; -import gr.ekt.bte.exceptions.MalformedSourceException; -import org.apache.commons.lang3.StringUtils; -import org.dspace.app.util.XMLUtils; -import org.w3c.dom.Document; -import org.w3c.dom.Element; -import org.xml.sax.SAXException; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class PubmedFileDataLoader extends FileDataLoader { - - Map fieldMap; // mapping between service fields and local - // intermediate fields - - /** - * - */ - public PubmedFileDataLoader() { - } - - /** - * @param filename Name of file to load CiNii data from. 
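The XXE hardening applied to CiNiiService and CrossRefService above works by setting the Xerces feature http://apache.org/xml/features/disallow-doctype-decl, which makes the parser reject any document containing a DOCTYPE declaration and therefore blocks external-entity payloads before they can be resolved. A standalone sketch of that parser configuration follows; the helper class and method names are illustrative and not part of the patch.

    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.ParserConfigurationException;

    public final class SecureParserExample {

        private SecureParserExample() { }

        // Build a DocumentBuilder configured the same way as the patched services.
        public static DocumentBuilder newSecureDocumentBuilder() throws ParserConfigurationException {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setValidating(false);
            factory.setIgnoringComments(true);
            factory.setIgnoringElementContentWhitespace(true);
            // Disallow DTDs entirely so no external entities can be declared or resolved.
            // See https://cheatsheetseries.owasp.org/cheatsheets/XML_External_Entity_Prevention_Cheat_Sheet.html
            factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
            return factory.newDocumentBuilder();
        }
    }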
- */ - public PubmedFileDataLoader(String filename) { - super(filename); - } - - /* - * {@see gr.ekt.bte.core.DataLoader#getRecords()} - * - * @throws MalformedSourceException - */ - @Override - public RecordSet getRecords() throws MalformedSourceException { - - RecordSet recordSet = new RecordSet(); - - try { - InputStream inputStream = new FileInputStream(new File(filename)); - - DocumentBuilderFactory factory = DocumentBuilderFactory - .newInstance(); - factory.setValidating(false); - factory.setIgnoringComments(true); - factory.setIgnoringElementContentWhitespace(true); - - DocumentBuilder builder = factory.newDocumentBuilder(); - Document inDoc = builder.parse(inputStream); - - Element xmlRoot = inDoc.getDocumentElement(); - List pubArticles = XMLUtils.getElementList(xmlRoot, - "PubmedArticle"); - - for (Element xmlArticle : pubArticles) { - Record record = null; - try { - record = PubmedUtils.convertPubmedDomToRecord(xmlArticle); - recordSet.addRecord(convertFields(record)); - } catch (Exception e) { - throw new RuntimeException(e.getMessage(), e); - } - } - } catch (FileNotFoundException e) { - e.printStackTrace(); - } catch (ParserConfigurationException e) { - e.printStackTrace(); - } catch (SAXException e) { - e.printStackTrace(); - } catch (IOException e) { - e.printStackTrace(); - } - - return recordSet; - - } - - /* - * (non-Javadoc) - * - * @see - * gr.ekt.bte.core.DataLoader#getRecords(gr.ekt.bte.core.DataLoadingSpec) - */ - @Override - public RecordSet getRecords(DataLoadingSpec spec) - throws MalformedSourceException { - if (spec.getOffset() > 0) { - return new RecordSet(); - } - return getRecords(); - } - - public Record convertFields(Record publication) { - for (String fieldName : fieldMap.keySet()) { - String md = null; - if (fieldMap != null) { - md = this.fieldMap.get(fieldName); - } - - if (StringUtils.isBlank(md)) { - continue; - } else { - md = md.trim(); - } - - if (publication.isMutable()) { - List values = publication.getValues(fieldName); - publication.makeMutable().removeField(fieldName); - publication.makeMutable().addField(md, values); - } - } - - return publication; - } - - public void setFieldMap(Map fieldMap) { - this.fieldMap = fieldMap; - } -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedOnlineDataLoader.java b/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedOnlineDataLoader.java deleted file mode 100644 index 094ce4e21d..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedOnlineDataLoader.java +++ /dev/null @@ -1,116 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.submit.lookup; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; -import java.util.Map; -import java.util.Set; - -import gr.ekt.bte.core.Record; -import org.apache.http.HttpException; -import org.apache.logging.log4j.Logger; -import org.dspace.core.Context; -import org.dspace.core.LogManager; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class PubmedOnlineDataLoader extends NetworkSubmissionLookupDataLoader { - protected boolean searchProvider = true; - - private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(PubmedOnlineDataLoader.class); - - protected 
PubmedService pubmedService = new PubmedService(); - - public void setPubmedService(PubmedService pubmedService) { - this.pubmedService = pubmedService; - } - - @Override - public List getSupportedIdentifiers() { - return Arrays.asList(new String[] {PUBMED, DOI}); - } - - public void setSearchProvider(boolean searchProvider) { - this.searchProvider = searchProvider; - } - - @Override - public boolean isSearchProvider() { - return searchProvider; - } - - @Override - public List getByIdentifier(Context context, - Map> keys) throws HttpException, IOException { - Set pmids = keys != null ? keys.get(PUBMED) : null; - Set dois = keys != null ? keys.get(DOI) : null; - List results = new ArrayList(); - if (pmids != null && pmids.size() > 0 - && (dois == null || dois.size() == 0)) { - for (String pmid : pmids) { - Record p = null; - try { - p = pubmedService.getByPubmedID(pmid); - } catch (Exception e) { - log.error(LogManager.getHeader(context, "getByIdentifier", - "pmid=" + pmid), e); - } - if (p != null) { - results.add(convertFields(p)); - } - } - } else if (dois != null && dois.size() > 0 - && (pmids == null || pmids.size() == 0)) { - StringBuffer query = new StringBuffer(); - for (String d : dois) { - if (query.length() > 0) { - query.append(" OR "); - } - query.append(d).append("[AI]"); - } - - List pubmedResults = pubmedService.search(query.toString()); - for (Record p : pubmedResults) { - results.add(convertFields(p)); - } - } else if (dois != null && dois.size() > 0 && pmids != null - && pmids.size() > 0) { - // EKT:ToDo: support list of dois and pmids in the search method of - // pubmedService - List pubmedResults = pubmedService.search(dois.iterator() - .next(), pmids.iterator().next()); - if (pubmedResults != null) { - for (Record p : pubmedResults) { - results.add(convertFields(p)); - } - } - } - - return results; - } - - @Override - public List search(Context context, String title, String author, - int year) throws HttpException, IOException { - List pubmedResults = pubmedService.search(title, author, year); - List results = new ArrayList(); - if (pubmedResults != null) { - for (Record p : pubmedResults) { - results.add(convertFields(p)); - } - } - return results; - } -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedService.java b/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedService.java deleted file mode 100644 index fa30ee8ea5..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedService.java +++ /dev/null @@ -1,265 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.submit.lookup; - -import java.io.File; -import java.io.FileInputStream; -import java.io.IOException; -import java.io.InputStream; -import java.net.URISyntaxException; -import java.util.ArrayList; -import java.util.List; -import javax.xml.parsers.DocumentBuilder; -import javax.xml.parsers.DocumentBuilderFactory; -import javax.xml.parsers.ParserConfigurationException; - -import gr.ekt.bte.core.Record; -import org.apache.commons.lang3.StringUtils; -import org.apache.http.HttpException; -import org.apache.http.HttpResponse; -import org.apache.http.HttpStatus; -import org.apache.http.StatusLine; -import org.apache.http.client.HttpClient; -import org.apache.http.client.methods.HttpGet; -import org.apache.http.client.utils.URIBuilder; -import 
org.apache.http.impl.client.DefaultHttpClient; -import org.apache.http.params.CoreConnectionPNames; -import org.apache.logging.log4j.Logger; -import org.dspace.app.util.XMLUtils; -import org.dspace.core.ConfigurationManager; -import org.w3c.dom.Document; -import org.w3c.dom.Element; -import org.xml.sax.SAXException; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class PubmedService { - - private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(PubmedService.class); - - protected int timeout = 1000; - - public void setTimeout(int timeout) { - this.timeout = timeout; - } - - public Record getByPubmedID(String pubmedid) throws HttpException, - IOException, ParserConfigurationException, SAXException { - List ids = new ArrayList(); - ids.add(pubmedid.trim()); - List items = getByPubmedIDs(ids); - if (items != null && items.size() > 0) { - return items.get(0); - } - return null; - } - - public List search(String title, String author, int year) - throws HttpException, IOException { - StringBuffer query = new StringBuffer(); - if (StringUtils.isNotBlank(title)) { - query.append("((").append(title).append("[TI]) OR ("); - // [TI] does not always work, book chapter title - query.append("(").append(title).append("[book]))"); - } - if (StringUtils.isNotBlank(author)) { - // [FAU] - if (query.length() > 0) { - query.append(" AND "); - } - query.append("(").append(author).append("[AU])"); - } - if (year != -1) { - // [DP] - if (query.length() > 0) { - query.append(" AND "); - } - query.append(year).append("[DP]"); - } - return search(query.toString()); - } - - public List search(String query) throws IOException, HttpException { - List results = new ArrayList<>(); - if (!ConfigurationManager.getBooleanProperty(SubmissionLookupService.CFG_MODULE, "remoteservice.demo")) { - HttpGet method = null; - try { - HttpClient client = new DefaultHttpClient(); - client.getParams().setIntParameter(CoreConnectionPNames.CONNECTION_TIMEOUT, timeout); - - URIBuilder uriBuilder = new URIBuilder( - "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"); - uriBuilder.addParameter("db", "pubmed"); - uriBuilder.addParameter("datetype", "edat"); - uriBuilder.addParameter("retmax", "10"); - uriBuilder.addParameter("term", query); - method = new HttpGet(uriBuilder.build()); - - // Execute the method. 
- HttpResponse response = client.execute(method); - StatusLine statusLine = response.getStatusLine(); - int statusCode = statusLine.getStatusCode(); - - if (statusCode != HttpStatus.SC_OK) { - throw new RuntimeException("WS call failed: " - + statusLine); - } - - DocumentBuilderFactory factory = DocumentBuilderFactory - .newInstance(); - factory.setValidating(false); - factory.setIgnoringComments(true); - factory.setIgnoringElementContentWhitespace(true); - - DocumentBuilder builder; - try { - builder = factory.newDocumentBuilder(); - - Document inDoc = builder.parse(response.getEntity().getContent()); - - Element xmlRoot = inDoc.getDocumentElement(); - Element idList = XMLUtils.getSingleElement(xmlRoot, - "IdList"); - List pubmedIDs = XMLUtils.getElementValueList( - idList, "Id"); - results = getByPubmedIDs(pubmedIDs); - } catch (ParserConfigurationException e1) { - log.error(e1.getMessage(), e1); - } catch (SAXException e1) { - log.error(e1.getMessage(), e1); - } - } catch (Exception e1) { - log.error(e1.getMessage(), e1); - } finally { - if (method != null) { - method.releaseConnection(); - } - } - } else { - InputStream stream = null; - try { - File file = new File( - ConfigurationManager.getProperty("dspace.dir") - + "/config/crosswalks/demo/pubmed-search.xml"); - stream = new FileInputStream(file); - DocumentBuilderFactory factory = DocumentBuilderFactory - .newInstance(); - factory.setValidating(false); - factory.setIgnoringComments(true); - factory.setIgnoringElementContentWhitespace(true); - - DocumentBuilder builder = factory.newDocumentBuilder(); - Document inDoc = builder.parse(stream); - - Element xmlRoot = inDoc.getDocumentElement(); - Element idList = XMLUtils.getSingleElement(xmlRoot, "IdList"); - List pubmedIDs = XMLUtils.getElementValueList(idList, - "Id"); - results = getByPubmedIDs(pubmedIDs); - } catch (Exception e) { - throw new RuntimeException(e.getMessage(), e); - } finally { - if (stream != null) { - try { - stream.close(); - } catch (IOException e) { - e.printStackTrace(); - } - } - } - } - return results; - } - - public List getByPubmedIDs(List pubmedIDs) - throws HttpException, IOException, ParserConfigurationException, - SAXException { - List results = new ArrayList(); - HttpGet method = null; - try { - HttpClient client = new DefaultHttpClient(); - client.getParams().setIntParameter(CoreConnectionPNames.CONNECTION_TIMEOUT, 5 * timeout); - - try { - URIBuilder uriBuilder = new URIBuilder( - "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"); - uriBuilder.addParameter("db", "pubmed"); - uriBuilder.addParameter("retmode", "xml"); - uriBuilder.addParameter("rettype", "full"); - uriBuilder.addParameter("id", StringUtils.join( - pubmedIDs.iterator(), ",")); - method = new HttpGet(uriBuilder.build()); - } catch (URISyntaxException ex) { - throw new RuntimeException("Request not sent", ex); - } - - // Execute the method. 
- HttpResponse response = client.execute(method); - StatusLine statusLine = response.getStatusLine(); - int statusCode = statusLine.getStatusCode(); - - if (statusCode != HttpStatus.SC_OK) { - throw new RuntimeException("WS call failed: " + statusLine); - } - - DocumentBuilderFactory factory = DocumentBuilderFactory - .newInstance(); - factory.setValidating(false); - factory.setIgnoringComments(true); - factory.setIgnoringElementContentWhitespace(true); - - DocumentBuilder builder = factory.newDocumentBuilder(); - Document inDoc = builder - .parse(response.getEntity().getContent()); - - Element xmlRoot = inDoc.getDocumentElement(); - List pubArticles = XMLUtils.getElementList(xmlRoot, - "PubmedArticle"); - - for (Element xmlArticle : pubArticles) { - Record pubmedItem = null; - try { - pubmedItem = PubmedUtils - .convertPubmedDomToRecord(xmlArticle); - results.add(pubmedItem); - } catch (Exception e) { - throw new RuntimeException( - "PubmedID is not valid or not exist: " - + e.getMessage(), e); - } - } - - return results; - } finally { - if (method != null) { - method.releaseConnection(); - } - } - } - - public List search(String doi, String pmid) throws HttpException, - IOException { - StringBuffer query = new StringBuffer(); - if (StringUtils.isNotBlank(doi)) { - query.append(doi); - query.append("[AID]"); - } - if (StringUtils.isNotBlank(pmid)) { - // [FAU] - if (query.length() > 0) { - query.append(" OR "); - } - query.append(pmid).append("[PMID]"); - } - return search(query.toString()); - } -} diff --git a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedUtils.java b/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedUtils.java deleted file mode 100644 index bca34de295..0000000000 --- a/dspace-api/src/main/java/org/dspace/submit/lookup/PubmedUtils.java +++ /dev/null @@ -1,316 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -/** - * - */ -package org.dspace.submit.lookup; - -import java.util.HashMap; -import java.util.LinkedList; -import java.util.List; -import java.util.Map; - -import gr.ekt.bte.core.MutableRecord; -import gr.ekt.bte.core.Record; -import gr.ekt.bte.core.StringValue; -import gr.ekt.bte.core.Value; -import org.apache.commons.lang3.StringUtils; -import org.dspace.app.util.XMLUtils; -import org.dspace.submit.util.SubmissionLookupPublication; -import org.w3c.dom.Element; - -/** - * @author Andrea Bollini - * @author Kostas Stamatis - * @author Luigi Andrea Pascarelli - * @author Panagiotis Koutsourakis - */ -public class PubmedUtils { - - /** - * Default constructor - */ - private PubmedUtils() { } - - public static Record convertPubmedDomToRecord(Element pubArticle) { - MutableRecord record = new SubmissionLookupPublication(""); - - Map monthToNum = new HashMap(); - monthToNum.put("Jan", "01"); - monthToNum.put("Feb", "02"); - monthToNum.put("Mar", "03"); - monthToNum.put("Apr", "04"); - monthToNum.put("May", "05"); - monthToNum.put("Jun", "06"); - monthToNum.put("Jul", "07"); - monthToNum.put("Aug", "08"); - monthToNum.put("Sep", "09"); - monthToNum.put("Oct", "10"); - monthToNum.put("Nov", "11"); - monthToNum.put("Dec", "12"); - - Element medline = XMLUtils.getSingleElement(pubArticle, - "MedlineCitation"); - - Element article = XMLUtils.getSingleElement(medline, "Article"); - Element pubmed = XMLUtils.getSingleElement(pubArticle, "PubmedData"); - - Element identifierList = 
XMLUtils.getSingleElement(pubmed, - "ArticleIdList"); - if (identifierList != null) { - List identifiers = XMLUtils.getElementList(identifierList, - "ArticleId"); - if (identifiers != null) { - for (Element id : identifiers) { - if ("pubmed".equals(id.getAttribute("IdType"))) { - String pubmedID = id.getTextContent().trim(); - if (pubmedID != null) { - record.addValue("pubmedID", new StringValue( - pubmedID)); - } - } else if ("doi".equals(id.getAttribute("IdType"))) { - String doi = id.getTextContent().trim(); - if (doi != null) { - record.addValue("doi", new StringValue(doi)); - } - } - } - } - } - - String status = XMLUtils.getElementValue(pubmed, "PublicationStatus"); - if (status != null) { - record.addValue("publicationStatus", new StringValue(status)); - } - - String pubblicationModel = XMLUtils.getElementAttribute(medline, - "Article", "PubModel"); - if (pubblicationModel != null) { - record.addValue("pubModel", new StringValue( - pubblicationModel)); - } - - String title = XMLUtils.getElementValue(article, "ArticleTitle"); - if (title != null) { - record.addValue("articleTitle", new StringValue(title)); - } - - Element abstractElement = XMLUtils - .getSingleElement(article, "Abstract"); - if (abstractElement == null) { - abstractElement = XMLUtils.getSingleElement(medline, - "OtherAbstract"); - } - if (abstractElement != null) { - String summary = XMLUtils.getElementValue(abstractElement, - "AbstractText"); - if (summary != null) { - record.addValue("abstractText", new StringValue(summary)); - } - } - - List authors = new LinkedList(); - Element authorList = XMLUtils.getSingleElement(article, "AuthorList"); - if (authorList != null) { - List authorsElement = XMLUtils.getElementList(authorList, - "Author"); - if (authorsElement != null) { - for (Element author : authorsElement) { - if (StringUtils.isBlank(XMLUtils.getElementValue(author, - "CollectiveName"))) { - authors.add(new String[] { - XMLUtils.getElementValue(author, "ForeName"), - XMLUtils.getElementValue(author, "LastName")}); - } - } - } - } - if (authors.size() > 0) { - List values = new LinkedList(); - for (String[] sArray : authors) { - values.add(new StringValue(sArray[1] + ", " + sArray[0])); - } - record.addField("author", values); - } - - Element journal = XMLUtils.getSingleElement(article, "Journal"); - if (journal != null) { - List jnumbers = XMLUtils.getElementList(journal, "ISSN"); - if (jnumbers != null) { - for (Element jnumber : jnumbers) { - if ("Print".equals(jnumber.getAttribute("IssnType"))) { - String issn = jnumber.getTextContent().trim(); - if (issn != null) { - record.addValue("printISSN", new StringValue(issn)); - } - } else { - String eissn = jnumber.getTextContent().trim(); - if (eissn != null) { - record.addValue("electronicISSN", new StringValue(eissn)); - } - } - } - } - - String journalTitle = XMLUtils.getElementValue(journal, "Title"); - if (journalTitle != null) { - record.addValue("journalTitle", new StringValue(journalTitle)); - } - - Element journalIssueElement = XMLUtils.getSingleElement(journal, - "JournalIssue"); - if (journalIssueElement != null) { - String volume = XMLUtils.getElementValue(journalIssueElement, - "Volume"); - if (volume != null) { - record.addValue("journalVolume", new StringValue(volume)); - } - - String issue = XMLUtils.getElementValue(journalIssueElement, - "Issue"); - if (issue != null) { - record.addValue("journalIssue", new StringValue(issue)); - } - - Element pubDateElement = XMLUtils.getSingleElement( - journalIssueElement, "PubDate"); - - String pubDate = 
null; - if (pubDateElement != null) { - pubDate = XMLUtils.getElementValue(pubDateElement, "Year"); - - String mounth = XMLUtils.getElementValue(pubDateElement, - "Month"); - String day = XMLUtils - .getElementValue(pubDateElement, "Day"); - if (StringUtils.isNotBlank(mounth) - && monthToNum.containsKey(mounth)) { - pubDate += "-" + monthToNum.get(mounth); - if (StringUtils.isNotBlank(day)) { - pubDate += "-" + (day.length() == 1 ? "0" + day : day); - } - } - } - if (pubDate == null) { - pubDate = XMLUtils.getElementValue(pubDateElement, "MedlineDate"); - } - if (pubDate != null) { - record.addValue("pubDate", new StringValue(pubDate)); - } - } - - String language = XMLUtils.getElementValue(article, "Language"); - if (language != null) { - record.addValue("language", new StringValue(language)); - } - - List type = new LinkedList(); - Element publicationTypeList = XMLUtils.getSingleElement(article, - "PublicationTypeList"); - if (publicationTypeList != null) { - List publicationTypes = XMLUtils.getElementList( - publicationTypeList, "PublicationType"); - for (Element publicationType : publicationTypes) { - type.add(publicationType.getTextContent().trim()); - } - } - if (type.size() > 0) { - List values = new LinkedList(); - for (String s : type) { - values.add(new StringValue(s)); - } - record.addField("publicationType", values); - } - - List primaryKeywords = new LinkedList(); - List secondaryKeywords = new LinkedList(); - Element keywordsList = XMLUtils.getSingleElement(medline, - "KeywordList"); - if (keywordsList != null) { - List keywords = XMLUtils.getElementList(keywordsList, - "Keyword"); - for (Element keyword : keywords) { - if ("Y".equals(keyword.getAttribute("MajorTopicYN"))) { - primaryKeywords.add(keyword.getTextContent().trim()); - } else { - secondaryKeywords.add(keyword.getTextContent().trim()); - } - } - } - if (primaryKeywords.size() > 0) { - List values = new LinkedList(); - for (String s : primaryKeywords) { - values.add(new StringValue(s)); - } - record.addField("primaryKeyword", values); - } - if (secondaryKeywords.size() > 0) { - List values = new LinkedList(); - for (String s : secondaryKeywords) { - values.add(new StringValue(s)); - } - record.addField("secondaryKeyword", values); - } - - List primaryMeshHeadings = new LinkedList(); - List secondaryMeshHeadings = new LinkedList(); - Element meshHeadingsList = XMLUtils.getSingleElement(medline, - "MeshHeadingList"); - if (meshHeadingsList != null) { - List meshHeadings = XMLUtils.getElementList( - meshHeadingsList, "MeshHeading"); - for (Element meshHeading : meshHeadings) { - if ("Y".equals(XMLUtils.getElementAttribute(meshHeading, - "DescriptorName", "MajorTopicYN"))) { - primaryMeshHeadings.add(XMLUtils.getElementValue( - meshHeading, "DescriptorName")); - } else { - secondaryMeshHeadings.add(XMLUtils.getElementValue( - meshHeading, "DescriptorName")); - } - } - } - if (primaryMeshHeadings.size() > 0) { - List values = new LinkedList(); - for (String s : primaryMeshHeadings) { - values.add(new StringValue(s)); - } - record.addField("primaryMeshHeading", values); - } - if (secondaryMeshHeadings.size() > 0) { - List values = new LinkedList(); - for (String s : secondaryMeshHeadings) { - values.add(new StringValue(s)); - } - record.addField("secondaryMeshHeading", values); - } - - Element paginationElement = XMLUtils.getSingleElement(article, - "Pagination"); - if (paginationElement != null) { - String startPage = XMLUtils.getElementValue(paginationElement, - "StartPage"); - String endPage = 
XMLUtils.getElementValue(paginationElement, - "EndPage"); - if (StringUtils.isBlank(startPage)) { - startPage = XMLUtils.getElementValue(paginationElement, - "MedlinePgn"); - } - - if (startPage != null) { - record.addValue("startPage", new StringValue(startPage)); - } - if (endPage != null) { - record.addValue("endPage", new StringValue(endPage)); - } - } - } - - return record; - } -} diff --git a/dspace-api/src/main/java/org/dspace/versioning/Version.java b/dspace-api/src/main/java/org/dspace/versioning/Version.java index a926fba0f8..2d4d359545 100644 --- a/dspace-api/src/main/java/org/dspace/versioning/Version.java +++ b/dspace-api/src/main/java/org/dspace/versioning/Version.java @@ -135,12 +135,12 @@ public class Version implements ReloadableEntity { return true; } Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(o); - if (getClass() != objClass) { + if (!getClass().equals(objClass)) { return false; } final Version that = (Version) o; - if (this.getID() != that.getID()) { + if (!this.getID().equals(that.getID())) { return false; } diff --git a/dspace-api/src/main/java/org/dspace/versioning/VersionHistory.java b/dspace-api/src/main/java/org/dspace/versioning/VersionHistory.java index 0f5b9384bd..1acacc7838 100644 --- a/dspace-api/src/main/java/org/dspace/versioning/VersionHistory.java +++ b/dspace-api/src/main/java/org/dspace/versioning/VersionHistory.java @@ -93,12 +93,12 @@ public class VersionHistory implements ReloadableEntity { return true; } Class objClass = HibernateProxyHelper.getClassWithoutInitializingProxy(o); - if (getClass() != objClass) { + if (!getClass().equals(objClass)) { return false; } final VersionHistory that = (VersionHistory) o; - if (this.getID() != that.getID()) { + if (!this.getID().equals(that.getID())) { return false; } diff --git a/dspace-api/src/main/java/org/dspace/workflow/WorkflowService.java b/dspace-api/src/main/java/org/dspace/workflow/WorkflowService.java index e190a56bc1..ced074d71d 100644 --- a/dspace-api/src/main/java/org/dspace/workflow/WorkflowService.java +++ b/dspace-api/src/main/java/org/dspace/workflow/WorkflowService.java @@ -80,6 +80,20 @@ public interface WorkflowService { */ public WorkspaceItem abort(Context c, T wi, EPerson e) throws SQLException, AuthorizeException, IOException; + /** + * Deletes workflow task item in correct order. + * + * @param c The relevant DSpace Context. + * @param wi The WorkflowItem that shall be deleted. + * @param e Admin that deletes this workflow task and item (for logging + * @throws SQLException An exception that provides information on a database access error or other errors. + * @throws AuthorizeException Exception indicating the current user of the context does not have permission + * to perform a particular action. + * @throws IOException A general class of exceptions produced by failed or interrupted I/O operations. 
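The Version and VersionHistory equals() changes above replace reference comparison (!=) of the boxed Integer IDs with equals(), and compare classes with equals() instead of ==. Reference comparison only happens to behave like value comparison for Integers inside the small autoboxing cache, so the old check could report two equal IDs as different. A quick JDK-only illustration of the pitfall:

    public class BoxedIdComparison {
        public static void main(String[] args) {
            Integer small = 100, smallCopy = 100;   // inside the Integer cache (-128..127)
            Integer large = 1000, largeCopy = 1000; // outside the cache: usually distinct objects

            System.out.println(small == smallCopy);      // true, cached instance is reused
            System.out.println(large == largeCopy);      // false with default JVM settings
            System.out.println(large.equals(largeCopy)); // true, value comparison as intended
        }
    }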
+ */ + public void deleteWorkflowByWorkflowItem(Context c, T wi, EPerson e) + throws SQLException, AuthorizeException, IOException; + public WorkspaceItem sendWorkflowItemBackSubmission(Context c, T workflowItem, EPerson e, String provenance, String rejection_message) throws SQLException, AuthorizeException, IOException; diff --git a/dspace-api/src/main/java/org/dspace/workflowbasic/BasicWorkflowServiceImpl.java b/dspace-api/src/main/java/org/dspace/workflowbasic/BasicWorkflowServiceImpl.java index 7c13211d08..f97e5d9e4a 100644 --- a/dspace-api/src/main/java/org/dspace/workflowbasic/BasicWorkflowServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/workflowbasic/BasicWorkflowServiceImpl.java @@ -798,33 +798,37 @@ public class BasicWorkflowServiceImpl implements BasicWorkflowService { try { // Get submitter EPerson ep = item.getSubmitter(); - // Get the Locale - Locale supportedLocale = I18nUtil.getEPersonLocale(ep); - Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_archive")); - // Get the item handle to email to user - String handle = handleService.findHandle(context, item); + // send the notification to the submitter unless the submitter eperson has been deleted + if (ep != null) { + // Get the Locale + Locale supportedLocale = I18nUtil.getEPersonLocale(ep); + Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_archive")); - // Get title - String title = item.getName(); - if (StringUtils.isBlank(title)) { - try { - title = I18nUtil.getMessage("org.dspace.workflow.WorkflowManager.untitled"); - } catch (MissingResourceException e) { - title = "Untitled"; + // Get the item handle to email to user + String handle = handleService.findHandle(context, item); + + // Get title + String title = item.getName(); + if (StringUtils.isBlank(title)) { + try { + title = I18nUtil.getMessage("org.dspace.workflow.WorkflowManager.untitled"); + } catch (MissingResourceException e) { + title = "Untitled"; + } } + + email.addRecipient(ep.getEmail()); + email.addArgument(title); + email.addArgument(coll.getName()); + email.addArgument(handleService.getCanonicalForm(handle)); + + email.send(); } - - email.addRecipient(ep.getEmail()); - email.addArgument(title); - email.addArgument(coll.getName()); - email.addArgument(handleService.getCanonicalForm(handle)); - - email.send(); } catch (MessagingException e) { log.warn(LogManager.getHeader(context, "notifyOfArchive", - "cannot email user; item_id=" + item.getID() - + ": " + e.getMessage())); + "cannot email user; item_id=" + item.getID() + + ": " + e.getMessage())); } } @@ -866,6 +870,22 @@ public class BasicWorkflowServiceImpl implements BasicWorkflowService { return workspaceItem; } + @Override + public void deleteWorkflowByWorkflowItem(Context context, BasicWorkflowItem wi, EPerson e) + throws SQLException, AuthorizeException, IOException { + Item myitem = wi.getItem(); + UUID itemID = myitem.getID(); + Integer workflowID = wi.getID(); + UUID collID = wi.getCollection().getID(); + // stop workflow + taskListItemService.deleteByWorkflowItem(context, wi); + // Now remove the workflow object manually from the database + workflowItemService.deleteWrapper(context, wi); + // Now delete the item + itemService.delete(context, myitem); + log.info(LogManager.getHeader(context, "delete_workflow", String.format("workflow_item_id=%s " + + "item_id=%s collection_id=%s eperson_id=%s", workflowID, itemID, collID, e.getID()))); + } @Override public WorkspaceItem sendWorkflowItemBackSubmission(Context context, 
BasicWorkflowItem workflowItem, @@ -1047,25 +1067,30 @@ public class BasicWorkflowServiceImpl implements BasicWorkflowService { protected void notifyOfReject(Context context, BasicWorkflowItem workflowItem, EPerson e, String reason) { try { - // Get the item title - String title = getItemTitle(workflowItem); + // Get submitter + EPerson ep = workflowItem.getSubmitter(); + // send the notification only if the person was not deleted in the meantime + if (ep != null) { + // Get the item title + String title = getItemTitle(workflowItem); - // Get the collection - Collection coll = workflowItem.getCollection(); + // Get the collection + Collection coll = workflowItem.getCollection(); - // Get rejector's name - String rejector = getEPersonName(e); - Locale supportedLocale = I18nUtil.getEPersonLocale(e); - Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_reject")); + // Get rejector's name + String rejector = getEPersonName(e); + Locale supportedLocale = I18nUtil.getEPersonLocale(e); + Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_reject")); - email.addRecipient(workflowItem.getSubmitter().getEmail()); - email.addArgument(title); - email.addArgument(coll.getName()); - email.addArgument(rejector); - email.addArgument(reason); - email.addArgument(getMyDSpaceLink()); + email.addRecipient(ep.getEmail()); + email.addArgument(title); + email.addArgument(coll.getName()); + email.addArgument(rejector); + email.addArgument(reason); + email.addArgument(getMyDSpaceLink()); - email.send(); + email.send(); + } } catch (RuntimeException re) { // log this email error log.warn(LogManager.getHeader(context, "notify_of_reject", @@ -1101,7 +1126,9 @@ public class BasicWorkflowServiceImpl implements BasicWorkflowService { @Override public String getSubmitterName(BasicWorkflowItem wi) throws SQLException { EPerson e = wi.getSubmitter(); - + if (e == null) { + return null; + } return getEPersonName(e); } diff --git a/dspace-api/src/main/java/org/dspace/workflowbasic/TaskListItemServiceImpl.java b/dspace-api/src/main/java/org/dspace/workflowbasic/TaskListItemServiceImpl.java index d064600191..ba55ca9460 100644 --- a/dspace-api/src/main/java/org/dspace/workflowbasic/TaskListItemServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/workflowbasic/TaskListItemServiceImpl.java @@ -46,6 +46,17 @@ public class TaskListItemServiceImpl implements TaskListItemService { taskListItemDAO.deleteByWorkflowItem(context, workflowItem); } + @Override + public void deleteByWorkflowItemAndEPerson(Context context, BasicWorkflowItem workflowItem, EPerson ePerson) + throws SQLException { + taskListItemDAO.deleteByWorkflowItemAndEPerson(context, workflowItem, ePerson); + } + + @Override + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException { + taskListItemDAO.deleteByEPerson(context, ePerson); + } + @Override public void update(Context context, TaskListItem taskListItem) throws SQLException { taskListItemDAO.save(context, taskListItem); diff --git a/dspace-api/src/main/java/org/dspace/workflowbasic/dao/TaskListItemDAO.java b/dspace-api/src/main/java/org/dspace/workflowbasic/dao/TaskListItemDAO.java index b09cac72e0..5cdf3e0611 100644 --- a/dspace-api/src/main/java/org/dspace/workflowbasic/dao/TaskListItemDAO.java +++ b/dspace-api/src/main/java/org/dspace/workflowbasic/dao/TaskListItemDAO.java @@ -28,5 +28,10 @@ public interface TaskListItemDAO extends GenericDAO { public void deleteByWorkflowItem(Context context, BasicWorkflowItem workflowItem) 
throws SQLException; + public void deleteByWorkflowItemAndEPerson(Context context, BasicWorkflowItem workflowItem, EPerson ePerson) + throws SQLException; + + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException; + public List findByEPerson(Context context, EPerson ePerson) throws SQLException; } diff --git a/dspace-api/src/main/java/org/dspace/workflowbasic/dao/impl/TaskListItemDAOImpl.java b/dspace-api/src/main/java/org/dspace/workflowbasic/dao/impl/TaskListItemDAOImpl.java index ec92faec03..2f6448fb4e 100644 --- a/dspace-api/src/main/java/org/dspace/workflowbasic/dao/impl/TaskListItemDAOImpl.java +++ b/dspace-api/src/main/java/org/dspace/workflowbasic/dao/impl/TaskListItemDAOImpl.java @@ -42,6 +42,24 @@ public class TaskListItemDAOImpl extends AbstractHibernateDAO impl query.executeUpdate(); } + @Override + public void deleteByWorkflowItemAndEPerson(Context context, BasicWorkflowItem workflowItem, EPerson ePerson) + throws SQLException { + String queryString = "delete from TaskListItem where workflowItem = :workflowItem AND ePerson = :ePerson"; + Query query = createQuery(context, queryString); + query.setParameter("workflowItem", workflowItem); + query.setParameter("ePerson", ePerson); + query.executeUpdate(); + } + + @Override + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException { + String queryString = "delete from TaskListItem where ePerson = :ePerson"; + Query query = createQuery(context, queryString); + query.setParameter("ePerson", ePerson); + query.executeUpdate(); + } + @Override public List findByEPerson(Context context, EPerson ePerson) throws SQLException { CriteriaBuilder criteriaBuilder = getCriteriaBuilder(context); diff --git a/dspace-api/src/main/java/org/dspace/workflowbasic/service/TaskListItemService.java b/dspace-api/src/main/java/org/dspace/workflowbasic/service/TaskListItemService.java index 4ce605f87f..3a8aac65fe 100644 --- a/dspace-api/src/main/java/org/dspace/workflowbasic/service/TaskListItemService.java +++ b/dspace-api/src/main/java/org/dspace/workflowbasic/service/TaskListItemService.java @@ -28,6 +28,11 @@ public interface TaskListItemService { public void deleteByWorkflowItem(Context context, BasicWorkflowItem workflowItem) throws SQLException; + public void deleteByWorkflowItemAndEPerson(Context context, BasicWorkflowItem workflowItem, EPerson ePerson) + throws SQLException; + + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException; + public void update(Context context, TaskListItem taskListItem) throws SQLException; public List findByEPerson(Context context, EPerson ePerson) throws SQLException; diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowFactoryImpl.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowFactoryImpl.java index ffc62dcddb..4150d84d04 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowFactoryImpl.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowFactoryImpl.java @@ -97,7 +97,7 @@ public class XmlWorkflowFactoryImpl implements XmlWorkflowFactory { } @Override - public List getCollectionHandlesMappedToWorklow(Context context, String workflowName) { + public List getCollectionHandlesMappedToWorkflow(Context context, String workflowName) { List collectionsMapped = new ArrayList<>(); for (String handle : this.workflowMapping.keySet()) { if (this.workflowMapping.get(handle).getID().equals(workflowName)) { @@ -107,7 +107,7 @@ public class XmlWorkflowFactoryImpl implements 
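Both new DAO methods are implemented as single HQL bulk deletes, so no TaskListItem rows need to be loaded first; it is worth noting that bulk HQL statements run directly against the database and bypass the Hibernate session, so TaskListItem objects already held by the current Context will not reflect the removal. A short usage sketch, with the service field and variable names assumed for illustration:

    // Release a single reviewer from the pool of one workflow item ...
    taskListItemService.deleteByWorkflowItemAndEPerson(context, workflowItem, reviewer);

    // ... or clear every pooled entry that still points at an account about to be removed.
    taskListItemService.deleteByEPerson(context, ePerson);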
XmlWorkflowFactory { collectionsMapped.add(collection); } } catch (SQLException e) { - log.error("SQLException in XmlWorkflowFactoryImpl.getCollectionHandlesMappedToWorklow trying to " + + log.error("SQLException in XmlWorkflowFactoryImpl.getCollectionHandlesMappedToWorkflow trying to " + "retrieve collection with handle: " + handle, e); } } diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowServiceImpl.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowServiceImpl.java index 74356d06c1..285a219cfc 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/XmlWorkflowServiceImpl.java @@ -188,7 +188,7 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { @Override public List getFlywayMigrationLocations() { - return Collections.singletonList("classpath:org.dspace.storage.rdbms.xmlworkflow"); + return Collections.singletonList("classpath:org/dspace/storage/rdbms/xmlworkflow"); } @Override @@ -269,19 +269,21 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { } protected void grantSubmitterReadPolicies(Context context, Item item) throws SQLException, AuthorizeException { - //A list of policies the user has for this item - List userHasPolicies = new ArrayList(); - List itempols = authorizeService.getPolicies(context, item); EPerson submitter = item.getSubmitter(); - for (ResourcePolicy resourcePolicy : itempols) { - if (submitter.equals(resourcePolicy.getEPerson())) { - //The user has already got this policy so add it to the list - userHasPolicies.add(resourcePolicy.getAction()); + if (null != submitter) { + //A list of policies the user has for this item + List userHasPolicies = new ArrayList<>(); + List itempols = authorizeService.getPolicies(context, item); + for (ResourcePolicy resourcePolicy : itempols) { + if (submitter.equals(resourcePolicy.getEPerson())) { + //The user has already got this policy so add it to the list + userHasPolicies.add(resourcePolicy.getAction()); + } + } + //Make sure we don't add duplicate policies + if (!userHasPolicies.contains(Constants.READ)) { + addPolicyToItem(context, item, Constants.READ, submitter, ResourcePolicy.TYPE_SUBMISSION); } - } - //Make sure we don't add duplicate policies - if (!userHasPolicies.contains(Constants.READ)) { - addPolicyToItem(context, item, Constants.READ, submitter, ResourcePolicy.TYPE_SUBMISSION); } } @@ -589,35 +591,38 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { try { // Get submitter EPerson ep = item.getSubmitter(); - // Get the Locale - Locale supportedLocale = I18nUtil.getEPersonLocale(ep); - Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_archive")); + // send the notification to the submitter unless the submitter eperson has been deleted + if (null != ep) { + // Get the Locale + Locale supportedLocale = I18nUtil.getEPersonLocale(ep); + Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_archive")); - // Get the item handle to email to user - String handle = handleService.findHandle(context, item); + // Get the item handle to email to user + String handle = handleService.findHandle(context, item); - // Get title - List titles = itemService - .getMetadata(item, MetadataSchemaEnum.DC.getName(), "title", null, Item.ANY); - String title = ""; - try { - title = I18nUtil.getMessage("org.dspace.workflow.WorkflowManager.untitled"); - } catch (MissingResourceException e) { - title = 
"Untitled"; + // Get title + List titles = itemService + .getMetadata(item, MetadataSchemaEnum.DC.getName(), "title", null, Item.ANY); + String title = ""; + try { + title = I18nUtil.getMessage("org.dspace.workflow.WorkflowManager.untitled"); + } catch (MissingResourceException e) { + title = "Untitled"; + } + if (titles.size() > 0) { + title = titles.iterator().next().getValue(); + } + + email.addRecipient(ep.getEmail()); + email.addArgument(title); + email.addArgument(coll.getName()); + email.addArgument(handleService.getCanonicalForm(handle)); + + email.send(); } - if (titles.size() > 0) { - title = titles.iterator().next().getValue(); - } - - email.addRecipient(ep.getEmail()); - email.addArgument(title); - email.addArgument(coll.getName()); - email.addArgument(handleService.getCanonicalForm(handle)); - - email.send(); } catch (MessagingException e) { log.warn(LogManager.getHeader(context, "notifyOfArchive", - "cannot email user" + " item_id=" + item.getID())); + "cannot email user" + " item_id=" + item.getID())); } } @@ -825,7 +830,7 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { } public void removeUserItemPolicies(Context context, Item item, EPerson e) throws SQLException, AuthorizeException { - if (e != null) { + if (e != null && item.getSubmitter() != null) { //Also remove any lingering authorizations from this user authorizeService.removeEPersonPolicies(context, item, e); //Remove the bundle rights @@ -847,7 +852,7 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { protected void removeGroupItemPolicies(Context context, Item item, Group e) throws SQLException, AuthorizeException { - if (e != null) { + if (e != null && item.getSubmitter() != null) { //Also remove any lingering authorizations from this user authorizeService.removeGroupPolicies(context, item, e); //Remove the bundle rights @@ -863,7 +868,32 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { } @Override - public WorkspaceItem sendWorkflowItemBackSubmission(Context context, XmlWorkflowItem wi, EPerson e, + public void deleteWorkflowByWorkflowItem(Context context, XmlWorkflowItem wi, EPerson e) + throws SQLException, AuthorizeException, IOException { + Item myitem = wi.getItem(); + UUID itemID = myitem.getID(); + Integer workflowID = wi.getID(); + UUID collID = wi.getCollection().getID(); + // stop workflow + deleteAllTasks(context, wi); + context.turnOffAuthorisationSystem(); + //Also clear all info for this step + workflowRequirementsService.clearInProgressUsers(context, wi); + // Remove (if any) the workflowItemroles for this item + workflowItemRoleService.deleteForWorkflowItem(context, wi); + // Now remove the workflow object manually from the database + xmlWorkflowItemService.deleteWrapper(context, wi); + // Now delete the item + itemService.delete(context, myitem); + log.info(LogManager.getHeader(context, "delete_workflow", "workflow_item_id=" + + workflowID + "item_id=" + itemID + + "collection_id=" + collID + "eperson_id=" + + e.getID())); + context.restoreAuthSystemState(); + } + + @Override + public WorkspaceItem sendWorkflowItemBackSubmission(Context context, XmlWorkflowItem wi, EPerson e, String provenance, String rejection_message) throws SQLException, AuthorizeException, @@ -1038,31 +1068,38 @@ public class XmlWorkflowServiceImpl implements XmlWorkflowService { protected void notifyOfReject(Context c, XmlWorkflowItem wi, EPerson e, String reason) { try { - // Get the item title - String title = wi.getItem().getName(); + // send the 
notification only if the person was not deleted in the + // meantime between submission and archiving. + EPerson eperson = wi.getSubmitter(); + if (eperson != null) { + // Get the item title + String title = wi.getItem().getName(); - // Get the collection - Collection coll = wi.getCollection(); + // Get the collection + Collection coll = wi.getCollection(); - // Get rejector's name - String rejector = getEPersonName(e); - Locale supportedLocale = I18nUtil.getEPersonLocale(e); - Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_reject")); + // Get rejector's name + String rejector = getEPersonName(e); + Locale supportedLocale = I18nUtil.getEPersonLocale(e); + Email email = Email.getEmail(I18nUtil.getEmailFilename(supportedLocale, "submit_reject")); - email.addRecipient(wi.getSubmitter().getEmail()); - email.addArgument(title); - email.addArgument(coll.getName()); - email.addArgument(rejector); - email.addArgument(reason); - email.addArgument(ConfigurationManager.getProperty("dspace.ui.url") + "/mydspace"); + email.addRecipient(eperson.getEmail()); + email.addArgument(title); + email.addArgument(coll.getName()); + email.addArgument(rejector); + email.addArgument(reason); + email.addArgument(ConfigurationManager.getProperty("dspace.ui.url") + "/mydspace"); - email.send(); + email.send(); + } else { + // DO nothing + } } catch (Exception ex) { // log this email error log.warn(LogManager.getHeader(c, "notify_of_reject", - "cannot email user" + " eperson_id" + e.getID() - + " eperson_email" + e.getEmail() - + " workflow_item_id" + wi.getID())); + "cannot email user" + " eperson_id" + e.getID() + + " eperson_email" + e.getEmail() + + " workflow_item_id" + wi.getID())); } } diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/factory/XmlWorkflowFactory.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/factory/XmlWorkflowFactory.java index 5d33843747..db856bb57b 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/factory/XmlWorkflowFactory.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/factory/XmlWorkflowFactory.java @@ -86,7 +86,7 @@ public interface XmlWorkflowFactory { * @param workflowName Name of workflow we want the collections of that are mapped to is * @return List of collections mapped to the requested workflow */ - public List getCollectionHandlesMappedToWorklow(Context context, String workflowName); + public List getCollectionHandlesMappedToWorkflow(Context context, String workflowName); /** * Returns list of collections that are not mapped to any configured workflow, and thus use the default workflow diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/Step.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/Step.java index a982107d78..16befc2626 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/Step.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/Step.java @@ -81,7 +81,7 @@ public class Step implements BeanNameAware { /** * Get the next step based on out the outcome * @param outcome the outcome of the previous step - * @return the next stepp or NULL if there is no step configured for this outcome + * @return the next step or NULL if there is no step configured for this outcome */ public Step getNextStep(int outcome) { return outcomes.get(outcome); diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/AcceptEditRejectAction.java 
b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/AcceptEditRejectAction.java index cb74bcf22d..743d00b2b6 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/AcceptEditRejectAction.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/AcceptEditRejectAction.java @@ -36,6 +36,7 @@ public class AcceptEditRejectAction extends ProcessingAction { private static final String SUBMIT_APPROVE = "submit_approve"; private static final String SUBMIT_REJECT = "submit_reject"; + private static final String SUBMITTER_IS_DELETED_PAGE = "submitter_deleted"; //TODO: rename to AcceptAndEditMetadataAction @@ -53,6 +54,8 @@ public class AcceptEditRejectAction extends ProcessingAction { return processAccept(c, wfi); case SUBMIT_REJECT: return processRejectPage(c, wfi, request); + case SUBMITTER_IS_DELETED_PAGE: + return processSubmitterIsDeletedPage(c, wfi, request); default: return new ActionResult(ActionResult.TYPE.TYPE_CANCEL); } @@ -93,6 +96,22 @@ public class AcceptEditRejectAction extends ProcessingAction { return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); } + public ActionResult processSubmitterIsDeletedPage(Context c, XmlWorkflowItem wfi, HttpServletRequest request) + throws SQLException, AuthorizeException, IOException { + if (request.getParameter("submit_delete") != null) { + XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService() + .deleteWorkflowByWorkflowItem(c, wfi, c.getCurrentUser()); + // Delete and send user back to myDspace page + return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); + } else if (request.getParameter("submit_keep_it") != null) { + // Do nothing, just send it back to myDspace page + return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); + } else { + //Cancel, go back to the main task page + return new ActionResult(ActionResult.TYPE.TYPE_PAGE); + } + } + private void addApprovedProvenance(Context c, XmlWorkflowItem wfi) throws SQLException, AuthorizeException { //Add the provenance for the accept String now = DCDate.getCurrent().toString(); diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/ReviewAction.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/ReviewAction.java index 5630087d57..8474757be6 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/ReviewAction.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/ReviewAction.java @@ -38,6 +38,8 @@ public class ReviewAction extends ProcessingAction { private static final String SUBMIT_APPROVE = "submit_approve"; private static final String SUBMIT_REJECT = "submit_reject"; + private static final String SUBMITTER_IS_DELETED_PAGE = "submitter_deleted"; + @Override public void activate(Context c, XmlWorkflowItem wfItem) { @@ -53,6 +55,8 @@ public class ReviewAction extends ProcessingAction { return processAccept(c, wfi); case SUBMIT_REJECT: return processRejectPage(c, wfi, step, request); + case SUBMITTER_IS_DELETED_PAGE: + return processSubmitterIsDeletedPage(c, wfi, request); default: return new ActionResult(ActionResult.TYPE.TYPE_CANCEL); } @@ -108,4 +112,21 @@ public class ReviewAction extends ProcessingAction { return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); } + + public ActionResult processSubmitterIsDeletedPage(Context c, XmlWorkflowItem wfi, HttpServletRequest request) + throws SQLException, 
AuthorizeException, IOException { + if (request.getParameter("submit_delete") != null) { + XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService() + .deleteWorkflowByWorkflowItem(c, wfi, c.getCurrentUser()); + // Delete and send user back to myDspace page + return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); + } else if (request.getParameter("submit_keep_it") != null) { + // Do nothing, just send it back to myDspace page + return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); + } else { + //Cancel, go back to the main task page + request.setAttribute("page", MAIN_PAGE); + return new ActionResult(ActionResult.TYPE.TYPE_PAGE); + } + } } diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/SingleUserReviewAction.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/SingleUserReviewAction.java index d115832389..9ef554821d 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/SingleUserReviewAction.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/processingaction/SingleUserReviewAction.java @@ -37,6 +37,7 @@ public class SingleUserReviewAction extends ProcessingAction { public static final int MAIN_PAGE = 0; public static final int REJECT_PAGE = 1; + public static final int SUBMITTER_IS_DELETED_PAGE = 2; public static final int OUTCOME_REJECT = 1; @@ -59,6 +60,8 @@ public class SingleUserReviewAction extends ProcessingAction { return processMainPage(c, wfi, step, request); case REJECT_PAGE: return processRejectPage(c, wfi, step, request); + case SUBMITTER_IS_DELETED_PAGE: + return processSubmitterIsDeletedPage(c, wfi, request); default: return new ActionResult(ActionResult.TYPE.TYPE_CANCEL); } @@ -82,7 +85,11 @@ public class SingleUserReviewAction extends ProcessingAction { return new ActionResult(ActionResult.TYPE.TYPE_OUTCOME, ActionResult.OUTCOME_COMPLETE); } else if (request.getParameter(SUBMIT_REJECT) != null) { // Make sure we indicate which page we want to process - request.setAttribute("page", REJECT_PAGE); + if (wfi.getSubmitter() == null) { + request.setAttribute("page", SUBMITTER_IS_DELETED_PAGE); + } else { + request.setAttribute("page", REJECT_PAGE); + } // We have pressed reject item, so take the user to a page where he can reject return new ActionResult(ActionResult.TYPE.TYPE_PAGE); } else if (request.getParameter(SUBMIT_DECLINE_TASK) != null) { @@ -135,4 +142,21 @@ public class SingleUserReviewAction extends ProcessingAction { return new ActionResult(ActionResult.TYPE.TYPE_PAGE); } } + + public ActionResult processSubmitterIsDeletedPage(Context c, XmlWorkflowItem wfi, HttpServletRequest request) + throws SQLException, AuthorizeException, IOException { + if (request.getParameter("submit_delete") != null) { + XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService() + .deleteWorkflowByWorkflowItem(c, wfi, c.getCurrentUser()); + // Delete and send user back to myDspace page + return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); + } else if (request.getParameter("submit_keep_it") != null) { + // Do nothing, just send it back to myDspace page + return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE); + } else { + //Cancel, go back to the main task page + request.setAttribute("page", MAIN_PAGE); + return new ActionResult(ActionResult.TYPE.TYPE_PAGE); + } + } } diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/AssignOriginalSubmitterAction.java 
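processSubmitterIsDeletedPage() is added with an essentially identical body to AcceptEditRejectAction, ReviewAction and SingleUserReviewAction. If further actions need the same behaviour, a shared helper on the common ProcessingAction base class would avoid the triplication; the following is only a sketch of that refactoring, not something this patch contains:

    // Possible shared default implementation; individual actions can still reset their own
    // "page" request attribute before returning to the task page.
    protected ActionResult processSubmitterIsDeletedPage(Context c, XmlWorkflowItem wfi,
            HttpServletRequest request) throws SQLException, AuthorizeException, IOException {
        if (request.getParameter("submit_delete") != null) {
            // Delete the orphaned submission and send the user back to the MyDSpace page.
            XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService()
                    .deleteWorkflowByWorkflowItem(c, wfi, c.getCurrentUser());
            return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE);
        } else if (request.getParameter("submit_keep_it") != null) {
            // Keep the task for later and return to the MyDSpace page.
            return new ActionResult(ActionResult.TYPE.TYPE_SUBMISSION_PAGE);
        }
        // Cancel: go back to the main task page.
        return new ActionResult(ActionResult.TYPE.TYPE_PAGE);
    }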
b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/AssignOriginalSubmitterAction.java index 01d995ccf6..3c8d85997a 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/AssignOriginalSubmitterAction.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/AssignOriginalSubmitterAction.java @@ -74,20 +74,22 @@ public class AssignOriginalSubmitterAction extends UserSelectionAction { @Override public void alertUsersOnActivation(Context c, XmlWorkflowItem wfi, RoleMembers roleMembers) throws IOException, SQLException { - try { - XmlWorkflowService xmlWorkflowService = XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService(); - xmlWorkflowService.alertUsersOnTaskActivation(c, wfi, "submit_task", Arrays.asList(wfi.getSubmitter()), - //The arguments - wfi.getItem().getName(), - wfi.getCollection().getName(), - wfi.getSubmitter().getFullName(), - //TODO: message - "New task available.", - xmlWorkflowService.getMyDSpaceLink() - ); - } catch (MessagingException e) { - log.info(LogManager.getHeader(c, "error emailing user(s) for claimed task", + if (wfi.getSubmitter() != null) { + try { + XmlWorkflowService xmlWorkflowService = XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService(); + xmlWorkflowService.alertUsersOnTaskActivation(c, wfi, "submit_task", Arrays.asList(wfi.getSubmitter()), + //The arguments + wfi.getItem().getName(), + wfi.getCollection().getName(), + wfi.getSubmitter().getFullName(), + //TODO: message + "New task available.", + xmlWorkflowService.getMyDSpaceLink() + ); + } catch (MessagingException e) { + log.info(LogManager.getHeader(c, "error emailing user(s) for claimed task", "step: " + getParent().getStep().getId() + " workflowitem: " + wfi.getID())); + } } } @@ -107,9 +109,9 @@ public class AssignOriginalSubmitterAction extends UserSelectionAction { .getId() + " to assign a submitter to. 
Aborting the action."); throw new IllegalStateException(); } - - createTaskForEPerson(c, wfi, step, nextAction, submitter); - + if (submitter != null) { + createTaskForEPerson(c, wfi, step, nextAction, submitter); + } //It is important that we return to the submission page since we will continue our actions with the submitter return new ActionResult(ActionResult.TYPE.TYPE_OUTCOME, ActionResult.OUTCOME_COMPLETE); } diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/ClaimAction.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/ClaimAction.java index 78742c6553..36b2aaeee5 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/ClaimAction.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/state/actions/userassignment/ClaimAction.java @@ -77,22 +77,25 @@ public class ClaimAction extends UserSelectionAction { public void alertUsersOnActivation(Context c, XmlWorkflowItem wfi, RoleMembers roleMembers) throws IOException, SQLException { try { + EPerson ep = wfi.getSubmitter(); + String submitterName = null; + if (ep != null) { + submitterName = ep.getFullName(); + } XmlWorkflowService xmlWorkflowService = XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService(); xmlWorkflowService.alertUsersOnTaskActivation(c, wfi, "submit_task", roleMembers.getAllUniqueMembers(c), - //The arguments - wfi.getItem().getName(), - wfi.getCollection().getName(), - wfi.getSubmitter().getFullName(), - //TODO: message - "New task available.", - xmlWorkflowService.getMyDSpaceLink() + //The arguments + wfi.getItem().getName(), + wfi.getCollection().getName(), + submitterName, + //TODO: message + "New task available.", + xmlWorkflowService.getMyDSpaceLink() ); } catch (MessagingException e) { log.info(LogManager.getHeader(c, "error emailing user(s) for claimed task", - "step: " + getParent().getStep().getId() + " workflowitem: " + wfi.getID())); + "step: " + getParent().getStep().getId() + " workflowitem: " + wfi.getID())); } - - } @Override diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/PoolTaskServiceImpl.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/PoolTaskServiceImpl.java index 25b4c79206..f64f1b3942 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/PoolTaskServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/PoolTaskServiceImpl.java @@ -125,6 +125,19 @@ public class PoolTaskServiceImpl implements PoolTaskService { } } + @Override + public void deleteByEperson(Context context, EPerson ePerson) + throws SQLException, AuthorizeException, IOException { + List tasks = findByEperson(context, ePerson); + //Use an iterator to remove the tasks ! 
+ Iterator iterator = tasks.iterator(); + while (iterator.hasNext()) { + PoolTask poolTask = iterator.next(); + iterator.remove(); + delete(context, poolTask); + } + } + @Override public List findByEPerson(Context context, EPerson ePerson) throws SQLException { return poolTaskDAO.findByEPerson(context, ePerson); diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/WorkflowItemRoleServiceImpl.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/WorkflowItemRoleServiceImpl.java index c96bcd032d..4204c7dcc3 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/WorkflowItemRoleServiceImpl.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/WorkflowItemRoleServiceImpl.java @@ -58,6 +58,16 @@ public class WorkflowItemRoleServiceImpl implements WorkflowItemRoleService { } } + @Override + public void deleteByEPerson(Context context, EPerson ePerson) throws SQLException, AuthorizeException { + Iterator workflowItemRoles = findByEPerson(context, ePerson).iterator(); + while (workflowItemRoles.hasNext()) { + WorkflowItemRole workflowItemRole = workflowItemRoles.next(); + workflowItemRoles.remove(); + delete(context, workflowItemRole); + } + } + @Override public List findByEPerson(Context context, EPerson ePerson) throws SQLException { return workflowItemRoleDAO.findByEPerson(context, ePerson); diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/PoolTaskService.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/PoolTaskService.java index 3dadcc7804..7f5ed5e6a0 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/PoolTaskService.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/PoolTaskService.java @@ -41,6 +41,8 @@ public interface PoolTaskService extends DSpaceCRUDService { public void deleteByWorkflowItem(Context context, XmlWorkflowItem xmlWorkflowItem) throws SQLException, AuthorizeException; + public void deleteByEperson(Context context, EPerson ePerson) throws SQLException, AuthorizeException, IOException; + public List findByEPerson(Context context, EPerson ePerson) throws SQLException; /** diff --git a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/WorkflowItemRoleService.java b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/WorkflowItemRoleService.java index 62c661f02a..9f909231f1 100644 --- a/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/WorkflowItemRoleService.java +++ b/dspace-api/src/main/java/org/dspace/xmlworkflow/storedcomponents/service/WorkflowItemRoleService.java @@ -33,5 +33,7 @@ public interface WorkflowItemRoleService extends DSpaceCRUDService findByEPerson(Context context, EPerson ePerson) throws SQLException; } diff --git a/dspace-api/src/main/resources/Messages.properties b/dspace-api/src/main/resources/Messages.properties index bf1b475375..974c71083d 100644 --- a/dspace-api/src/main/resources/Messages.properties +++ b/dspace-api/src/main/resources/Messages.properties @@ -6,6 +6,8 @@ # http://www.dspace.org/license/ # +admin.name = DSpace Administrator + browse.page-title = Browsing DSpace browse.et-al = et al @@ -252,6 +254,12 @@ jsp.dspace-admin.eperson-browse.phone = Telephone jsp.dspace-admin.eperson-browse.self = Self Registered jsp.dspace-admin.eperson-browse.title = E-People jsp.dspace-admin.eperson-confirm-delete.confirm = Are you sure this e-person should be deleted? 
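Taken together, TaskListItemService.deleteByEPerson(), PoolTaskService.deleteByEperson() and WorkflowItemRoleService.deleteByEPerson() let account-deletion code detach a person from all outstanding workflow state before the EPerson row itself is removed. The sequence might look like the sketch below; the enclosing method and service wiring are assumptions, since the calling side is not part of this hunk:

    // Detach an EPerson from workflow state prior to deleting the account (sketch only).
    protected void clearWorkflowStateFor(Context context, EPerson ePerson)
            throws SQLException, AuthorizeException, IOException {
        // Basic workflow: drop the person's task-pool entries.
        taskListItemService.deleteByEPerson(context, ePerson);
        // XML workflow: drop pooled tasks and any workflow-item roles held by the person.
        poolTaskService.deleteByEperson(context, ePerson);
        workflowItemRoleService.deleteByEPerson(context, ePerson);
    }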
+jsp.dspace-admin.eperson-confirm-delete.confirm.constraint = This EPerson +jsp.dspace-admin.eperson-confirm-delete.confirm.item = has submitted one or more items which will be kept +jsp.dspace-admin.eperson-confirm-delete.confirm.workspaceitem = has unsubmitted workspace items which will be deleted +jsp.dspace-admin.eperson-confirm-delete.confirm.workflowitem = has an active submission workflow which will be put back into the pool +jsp.dspace-admin.eperson-confirm-delete.confirm.resourcepolicy = has resource policies associated with them which will be deleted +jsp.dspace-admin.eperson-confirm-delete.confirm.tasklistitem = has a workflow task awaiting their attention jsp.dspace-admin.eperson-confirm-delete.heading = Delete e-person: {0} ({1}) jsp.dspace-admin.eperson-confirm-delete.title = Delete E-Person jsp.dspace-admin.eperson-deletion-error.errormsg = The EPerson {0} cannot be deleted because a reference to it exists in the following table(s): @@ -576,7 +584,6 @@ jsp.general.without-contributor jsp.general.without-date = No date given jsp.help = jsp.help.formats.contact1 = Please contact your -jsp.help.formats.contact2 = DSpace Administrator jsp.help.formats.contact3 = if you have questions about a particular format. jsp.help.formats.extensions = Extensions jsp.help.formats.here = (Your Site's Format Support Policy Here) @@ -741,6 +748,12 @@ jsp.mydspace.reject-reason.cancel.button = Cancel Rejecti jsp.mydspace.reject-reason.reject.button = Reject Item jsp.mydspace.reject-reason.text1 = Please enter the reason you are rejecting the submission into the box below. Please indicate in your message whether the submitter should fix a problem and resubmit. jsp.mydspace.reject-reason.title = Enter Reason for Rejection +jsp.mydspace.reject-deleted-submitter.title = The Submitter of this item has been deleted +jsp.mydspace.reject-deleted-submitter.message = Do you want to delete the document, keep it as a task and work on it later, or cancel the rejection process? +jsp.mydspace.reject-deleted-submitter-keep-it.button = Keep it, I will work on it later +jsp.mydspace.reject-deleted-submitter-delete.button = Delete Item +jsp.mydspace.reject-deleted-submitter-delete.title = Deleted successfully +jsp.mydspace.reject-deleted-submitter-delete.info = The review task is complete; the submitted item has been successfully removed from the system. jsp.mydspace.remove-item.cancel.button = Cancel Removal jsp.mydspace.remove-item.confirmation = Are you sure you want to remove the following incomplete item? jsp.mydspace.remove-item.remove.button = Remove the Item @@ -1643,6 +1656,7 @@ org.dspace.workflow.WorkflowManager.step1 org.dspace.workflow.WorkflowManager.step2 = The submission must be checked before inclusion in the archive. org.dspace.workflow.WorkflowManager.step3 = The metadata needs to be checked to ensure compliance with the collection's standards, and edited if necessary.
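The org.dspace.workflow.WorkflowManager.deleted-submitter key added just below pairs with the change that lets getSubmitterName() return null once the submitter account is gone: display and notification code now needs a placeholder label. A small sketch of that fallback (illustrative only; the workflowService and workflowItem variables are assumed):

    String submitterName = workflowService.getSubmitterName(workflowItem);
    if (submitterName == null) {
        // The submitter EPerson was deleted; show the localized placeholder instead.
        submitterName = I18nUtil.getMessage("org.dspace.workflow.WorkflowManager.deleted-submitter");
    }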
org.dspace.workflow.WorkflowManager.untitled = Untitled +org.dspace.workflow.WorkflowManager.deleted-submitter = Unknown (deleted submitter) search.order.asc = Ascending search.order.desc = Descending @@ -1824,6 +1838,11 @@ In response to your request I have the pleasure to send you in attachment a copy Best regards,\n\ {3} <{4}> +itemRequest.admin.response.body.approve = Dear {0},\n\ +In response to your request please see the attached copy of the file(s) related to the document: "{2}" ({1}).\n\n\ +Best regards,\n\ +{3} + itemRequest.response.subject.reject = Request copy of document itemRequest.response.body.reject = Dear {0},\n\ In response to your request I regret to inform you that it''s not possible to send you a copy of the file(s) you have requested, concerning the document: "{2}" ({1}), of which I am author (or co-author).\n\n\ @@ -1845,6 +1864,12 @@ itemRequest.response.body.contactRequester = Dear {0},\n\n\ Thanks for your interest! Since the author owns the copyright for this work, I will contact the author and ask permission to send you a copy. I''ll let you know as soon as I hear from the author.\n\n\ Thanks!\n\ {1} <{2}> + +itemRequest.admin.response.body.reject = Dear {0},\n\ +In response to your request I regret to inform you that it''s not possible to send you a copy of the file(s) you have requested, concerning the document: "{2}" ({1}).\n\n\ +Best regards,\n\ +{3} + jsp.request.item.request-form.info2 = Request a document copy: {0} jsp.request.item.request-form.problem = You must fill all the missing fields. jsp.request.item.request-form.reqname = Requester name: @@ -1886,7 +1911,8 @@ jsp.request.item.request-free-acess.free = Change to Open Access jsp.request.item.request-free-acess.name = Name: jsp.request.item.request-free-acess.email = E-mail: org.dspace.app.requestitem.RequestItemMetadataStrategy.unnamed = Corresponding Author -org.dspace.app.requestitem.RequestItemHelpdeskStrategy.helpdeskname = Help Desk +org.dspace.app.requestitem.helpdeskname = Help Desk +org.dspace.app.requestitem.default-author-name = DSpace User org.dspace.app.webui.jsptag.ItemTag.restrict = Request a copy jsp.layout.navbar-admin.batchimport = Batch import diff --git a/dspace-api/src/main/resources/org/dspace/storage/rdbms/flywayupgrade/oracle/upgradeToFlyway4x.sql b/dspace-api/src/main/resources/org/dspace/storage/rdbms/flywayupgrade/oracle/upgradeToFlyway4x.sql new file mode 100644 index 0000000000..7907fccc00 --- /dev/null +++ b/dspace-api/src/main/resources/org/dspace/storage/rdbms/flywayupgrade/oracle/upgradeToFlyway4x.sql @@ -0,0 +1,29 @@ +-- +-- Copyright 2010-2017 Boxfuse GmbH +-- +-- Licensed under the Apache License, Version 2.0 (the "License"); +-- you may not use this file except in compliance with the License. +-- You may obtain a copy of the License at +-- +-- http://www.apache.org/licenses/LICENSE-2.0 +-- +-- Unless required by applicable law or agreed to in writing, software +-- distributed under the License is distributed on an "AS IS" BASIS, +-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +-- See the License for the specific language governing permissions and +-- limitations under the License. 
+-- +----------------- +-- This is the Oracle upgrade script from Flyway v4.2.0, copied/borrowed from: +-- https://github.com/flyway/flyway/blob/flyway-4.2.0/flyway-core/src/main/resources/org/flywaydb/core/internal/dbsupport/oracle/upgradeMetaDataTable.sql +-- +-- The variables in this script are replaced in FlywayUpgradeUtils.upgradeFlywayTable() +------------------ + +DROP INDEX "${schema}"."${table}_vr_idx"; +DROP INDEX "${schema}"."${table}_ir_idx"; +ALTER TABLE "${schema}"."${table}" DROP COLUMN "version_rank"; +ALTER TABLE "${schema}"."${table}" DROP PRIMARY KEY DROP INDEX; +ALTER TABLE "${schema}"."${table}" MODIFY "version" NULL; +ALTER TABLE "${schema}"."${table}" ADD CONSTRAINT "${table}_pk" PRIMARY KEY ("installed_rank"); +UPDATE "${schema}"."${table}" SET "type"='BASELINE' WHERE "type"='INIT'; diff --git a/dspace-api/src/main/resources/org/dspace/storage/rdbms/flywayupgrade/postgres/upgradeToFlyway4x.sql b/dspace-api/src/main/resources/org/dspace/storage/rdbms/flywayupgrade/postgres/upgradeToFlyway4x.sql new file mode 100644 index 0000000000..7548fa4c6a --- /dev/null +++ b/dspace-api/src/main/resources/org/dspace/storage/rdbms/flywayupgrade/postgres/upgradeToFlyway4x.sql @@ -0,0 +1,29 @@ +-- +-- Copyright 2010-2017 Boxfuse GmbH +-- +-- Licensed under the Apache License, Version 2.0 (the "License"); +-- you may not use this file except in compliance with the License. +-- You may obtain a copy of the License at +-- +-- http://www.apache.org/licenses/LICENSE-2.0 +-- +-- Unless required by applicable law or agreed to in writing, software +-- distributed under the License is distributed on an "AS IS" BASIS, +-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +-- See the License for the specific language governing permissions and +-- limitations under the License. +-- +----------------- +-- This is the PostgreSQL upgrade script from Flyway v4.2.0, copied/borrowed from: +-- https://github.com/flyway/flyway/blob/flyway-4.2.0/flyway-core/src/main/resources/org/flywaydb/core/internal/dbsupport/oracle/upgradeMetaDataTable.sql +-- +-- The variables in this script are replaced in FlywayUpgradeUtils.upgradeFlywayTable() +------------------ + +DROP INDEX "${schema}"."${table}_vr_idx"; +DROP INDEX "${schema}"."${table}_ir_idx"; +ALTER TABLE "${schema}"."${table}" DROP COLUMN "version_rank"; +ALTER TABLE "${schema}"."${table}" DROP CONSTRAINT "${table}_pk"; +ALTER TABLE "${schema}"."${table}" ALTER COLUMN "version" DROP NOT NULL; +ALTER TABLE "${schema}"."${table}" ADD CONSTRAINT "${table}_pk" PRIMARY KEY ("installed_rank"); +UPDATE "${schema}"."${table}" SET "type"='BASELINE' WHERE "type"='INIT'; diff --git a/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/h2/V7.0_2020.01.08__DS-626-statistics-tracker.sql b/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/h2/V7.0_2020.01.08__DS-626-statistics-tracker.sql new file mode 100644 index 0000000000..48d182af61 --- /dev/null +++ b/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/h2/V7.0_2020.01.08__DS-626-statistics-tracker.sql @@ -0,0 +1,29 @@ +-- +-- The contents of this file are subject to the license and copyright +-- detailed in the LICENSE and NOTICE files at the root of the source +-- tree and available online at +-- +-- http://www.dspace.org/license/ +-- + +-- =============================================================== +-- WARNING WARNING WARNING WARNING WARNING WARNING WARNING WARNING +-- +-- DO NOT MANUALLY RUN THIS DATABASE MIGRATION. 
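Both upgradeToFlyway4x.sql scripts keep ${schema} and ${table} as placeholders which, per their header comments, are substituted by FlywayUpgradeUtils.upgradeFlywayTable() before the script is run. The sketch below only illustrates that substitution idea and is not the actual upgrade utility; the connection-derived schemaName and flywayTable variables are assumed:

    // Load the vendor-specific upgrade script and fill in the placeholders before executing it.
    String rawScript = IOUtils.toString(
            DatabaseUtils.class.getResourceAsStream(
                    "/org/dspace/storage/rdbms/flywayupgrade/postgres/upgradeToFlyway4x.sql"),
            StandardCharsets.UTF_8);
    String sql = rawScript
            .replace("${schema}", schemaName)
            .replace("${table}", flywayTable);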
IT WILL BE EXECUTED +-- AUTOMATICALLY (IF NEEDED) BY "FLYWAY" WHEN YOU STARTUP DSPACE. +-- http://flywaydb.org/ +-- =============================================================== + +------------------------------------------------------------- +-- This will create the setup for the IRUS statistics harvester +------------------------------------------------------------- + +CREATE SEQUENCE openurltracker_seq; + +CREATE TABLE openurltracker +( + tracker_id INTEGER, + tracker_url VARCHAR(1000), + uploaddate DATE, + CONSTRAINT openurltracker_PK PRIMARY KEY (tracker_id) +); \ No newline at end of file diff --git a/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/oracle/V7.0_2020.01.08__DS-626-statistics-tracker.sql b/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/oracle/V7.0_2020.01.08__DS-626-statistics-tracker.sql new file mode 100644 index 0000000000..a108fd74b4 --- /dev/null +++ b/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/oracle/V7.0_2020.01.08__DS-626-statistics-tracker.sql @@ -0,0 +1,29 @@ +-- +-- The contents of this file are subject to the license and copyright +-- detailed in the LICENSE and NOTICE files at the root of the source +-- tree and available online at +-- +-- http://www.dspace.org/license/ +-- + +-- =============================================================== +-- WARNING WARNING WARNING WARNING WARNING WARNING WARNING WARNING +-- +-- DO NOT MANUALLY RUN THIS DATABASE MIGRATION. IT WILL BE EXECUTED +-- AUTOMATICALLY (IF NEEDED) BY "FLYWAY" WHEN YOU STARTUP DSPACE. +-- http://flywaydb.org/ +-- =============================================================== + +------------------------------------------------------------- +-- This will create the setup for the IRUS statistics harvester +------------------------------------------------------------- + +CREATE SEQUENCE openurltracker_seq; + +CREATE TABLE openurltracker +( + tracker_id NUMBER, + tracker_url VARCHAR2(1000), + uploaddate DATE, + CONSTRAINT openurltracker_PK PRIMARY KEY (tracker_id) +); \ No newline at end of file diff --git a/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/postgres/V7.0_2020.01.08__DS-626-statistics-tracker.sql b/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/postgres/V7.0_2020.01.08__DS-626-statistics-tracker.sql new file mode 100644 index 0000000000..48d182af61 --- /dev/null +++ b/dspace-api/src/main/resources/org/dspace/storage/rdbms/sqlmigration/postgres/V7.0_2020.01.08__DS-626-statistics-tracker.sql @@ -0,0 +1,29 @@ +-- +-- The contents of this file are subject to the license and copyright +-- detailed in the LICENSE and NOTICE files at the root of the source +-- tree and available online at +-- +-- http://www.dspace.org/license/ +-- + +-- =============================================================== +-- WARNING WARNING WARNING WARNING WARNING WARNING WARNING WARNING +-- +-- DO NOT MANUALLY RUN THIS DATABASE MIGRATION. IT WILL BE EXECUTED +-- AUTOMATICALLY (IF NEEDED) BY "FLYWAY" WHEN YOU STARTUP DSPACE. 
+-- http://flywaydb.org/ +-- =============================================================== + +------------------------------------------------------------- +-- This will create the setup for the IRUS statistics harvester +------------------------------------------------------------- + +CREATE SEQUENCE openurltracker_seq; + +CREATE TABLE openurltracker +( + tracker_id INTEGER, + tracker_url VARCHAR(1000), + uploaddate DATE, + CONSTRAINT openurltracker_PK PRIMARY KEY (tracker_id) +); \ No newline at end of file diff --git a/dspace-api/src/main/resources/spring/spring-dspace-addon-import-services.xml b/dspace-api/src/main/resources/spring/spring-dspace-addon-import-services.xml index bbdf085619..0046366f2e 100644 --- a/dspace-api/src/main/resources/spring/spring-dspace-addon-import-services.xml +++ b/dspace-api/src/main/resources/spring/spring-dspace-addon-import-services.xml @@ -19,11 +19,6 @@ - - - - - - - - - + + + + + + + + + + + - + + + + + + + + + + + + + + xml + + + + + + + + + + ris + + + + + + + + bib + bibtex + + + + + + + + + + csv + + + + + + + + + + + + tsv + + + + + + + + + enl + enw + + + + + diff --git a/dspace-api/src/test/data/dspaceFolder/assetstore/curate.txt b/dspace-api/src/test/data/dspaceFolder/assetstore/curate.txt new file mode 100644 index 0000000000..ff2cb89ef6 --- /dev/null +++ b/dspace-api/src/test/data/dspaceFolder/assetstore/curate.txt @@ -0,0 +1,2 @@ +checklinks +requiredmetadata diff --git a/dspace-api/src/test/data/dspaceFolder/config/item-submission.xml b/dspace-api/src/test/data/dspaceFolder/config/item-submission.xml index d78b14c437..cd53a5c1c6 100644 --- a/dspace-api/src/test/data/dspaceFolder/config/item-submission.xml +++ b/dspace-api/src/test/data/dspaceFolder/config/item-submission.xml @@ -18,6 +18,7 @@ + @@ -108,6 +109,12 @@ org.dspace.submit.step.SampleStep sample + + + submit.progressbar.describe.stepone + org.dspace.app.rest.submit.step.DescribeStep + submission-form + @@ -149,6 +156,10 @@ + + + + diff --git a/dspace-api/src/test/data/dspaceFolder/config/local.cfg b/dspace-api/src/test/data/dspaceFolder/config/local.cfg index 51ce1a0165..5f32bd0919 100644 --- a/dspace-api/src/test/data/dspaceFolder/config/local.cfg +++ b/dspace-api/src/test/data/dspaceFolder/config/local.cfg @@ -109,6 +109,11 @@ plugin.sequence.java.util.Collection = \ java.util.Stack, \ java.util.TreeSet +# Enable a test authority control on dc.language.iso field +choices.plugin.dc.language.iso = common_iso_languages +choices.presentation.dc.language.iso = select +authority.controlled.dc.language.iso = true + ########################################### # PROPERTIES USED TO TEST CONFIGURATION # # PROPERTY EXPOSURE VIA REST # @@ -120,3 +125,7 @@ rest.properties.exposed = configuration.not.existing configuration.not.exposed = secret_value configuration.exposed.single.value = public_value configuration.exposed.array.value = public_value_1, public_value_2 + +# Test config for the authentication ip functionality +authentication-ip.Staff = 5.5.5.5 +authentication-ip.Student = 6.6.6.6 diff --git a/dspace-api/src/test/data/dspaceFolder/config/spring/api/event-service-listeners.xml b/dspace-api/src/test/data/dspaceFolder/config/spring/api/event-service-listeners.xml new file mode 100644 index 0000000000..15de9735d7 --- /dev/null +++ b/dspace-api/src/test/data/dspaceFolder/config/spring/api/event-service-listeners.xml @@ -0,0 +1,14 @@ + + + + + + + + + \ No newline at end of file diff --git a/dspace-api/src/test/data/dspaceFolder/config/spring/api/openurltracker.xml 
b/dspace-api/src/test/data/dspaceFolder/config/spring/api/openurltracker.xml new file mode 100644 index 0000000000..1d3be040a3 --- /dev/null +++ b/dspace-api/src/test/data/dspaceFolder/config/spring/api/openurltracker.xml @@ -0,0 +1,13 @@ + + + + + + + + + + \ No newline at end of file diff --git a/dspace-api/src/test/data/dspaceFolder/config/spring/api/scripts.xml b/dspace-api/src/test/data/dspaceFolder/config/spring/api/scripts.xml index 30fb69a3c4..c614a3158d 100644 --- a/dspace-api/src/test/data/dspaceFolder/config/spring/api/scripts.xml +++ b/dspace-api/src/test/data/dspaceFolder/config/spring/api/scripts.xml @@ -4,6 +4,9 @@ xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd"> + + + @@ -14,13 +17,29 @@ - + - + + + + + + + + + + + + + + + + + diff --git a/dspace-api/src/test/data/dspaceFolder/config/spring/api/solr-services.xml b/dspace-api/src/test/data/dspaceFolder/config/spring/api/solr-services.xml index 5ad031b688..80d45bdd58 100644 --- a/dspace-api/src/test/data/dspaceFolder/config/spring/api/solr-services.xml +++ b/dspace-api/src/test/data/dspaceFolder/config/spring/api/solr-services.xml @@ -19,19 +19,29 @@ - + - + - - + + - + - + - - + + diff --git a/dspace-api/src/test/data/dspaceFolder/config/submission-forms.xml b/dspace-api/src/test/data/dspaceFolder/config/submission-forms.xml index 6ddfef9b83..14e2affacb 100644 --- a/dspace-api/src/test/data/dspaceFolder/config/submission-forms.xml +++ b/dspace-api/src/test/data/dspaceFolder/config/submission-forms.xml @@ -237,7 +237,7 @@ it, please enter the types and the actual numbers or codes.

    - isVolumeOfJournal + isJournalOfVolume periodical creativework.publisher:somepublishername @@ -282,6 +282,70 @@ it, please enter the types and the actual numbers or codes.
    +
    + + + dc + contributor + author + + name + false + You must enter at least the author. + Enter the names of the authors of this item in the form Lastname, Firstname [i.e. Smith, Josh or Smith, J]. + + + + + person + affiliation + name + + onebox + false + + Enter the affiliation of the author as stated on the publication. + + +
    + +
    + + + dc + contributor + author + true + + onebox + Author field that can be associated with an authority providing suggestion + + + + + + dc + contributor + editor + false + + name + Editor field that can be associated with an authority providing the special name lookup + + + + + + dc + subject + true + + onebox + Subject field that can be associated with an authority providing lookup + + + +
    diff --git a/dspace-api/src/test/data/dspaceFolder/config/submission-forms_it.xml b/dspace-api/src/test/data/dspaceFolder/config/submission-forms_it.xml new file mode 100644 index 0000000000..66ed4a926c --- /dev/null +++ b/dspace-api/src/test/data/dspaceFolder/config/submission-forms_it.xml @@ -0,0 +1,169 @@ + + + + + + + + + + + + + + + + + + + + + + + +
    + + + dc + title + + false + + onebox + Inserisci nome del file + È necessario inserire un titolo principale per questo item + + + + + dc + description + true + + textarea + Inserisci descrizione per questo file + + + +
    + +
    + + + isAuthorOfPublication + person + true + + Aggiungi un autore + + dc + contributor + author + name + + È richiesto almeno un autore + + + + + dc + title + + false + + onebox + Inserisci titolo principale di questo item + È necessario inserire un titolo principale per questo item + + + + + + + + dc + language + iso + false + + dropdown + Selezionare la lingua del contenuto principale dell'item. Se la lingua non compare nell'elenco, selezionare (Altro). Se il contenuto non ha davvero una lingua (ad esempio, se è un set di dati o un'immagine) selezionare (N/A). + + + + +
    +
    + + + + + + + + + + + + + + + + + + + + + N/A + + + + Inglese (USA) + en_US + + + Inglese + en + + + Spagnolo + es + + + Tedesco + de + + + Francese + fr + + + Italiano + it + + + Giapponese + ja + + + Cinese + zh + + + Portogallo + pt + + + Ucraino + uk + + + (Altro) + other + + + + +
    \ No newline at end of file diff --git a/dspace-api/src/test/data/dspaceFolder/config/submission-forms_uk.xml b/dspace-api/src/test/data/dspaceFolder/config/submission-forms_uk.xml new file mode 100644 index 0000000000..49a2ccc1a9 --- /dev/null +++ b/dspace-api/src/test/data/dspaceFolder/config/submission-forms_uk.xml @@ -0,0 +1,166 @@ + + + + + + + + + + + + + + + + + + + + + + + +
    + + + dc + title + + false + + onebox + Ввести основний заголовок файла. + Заговолок файла обов'язковий ! + + + + + dc + description + true + + textarea + Ввести опис для цього файла + + + +
    + +
    + + + isAuthorOfPublication + person + true + + Додати автора + + dc + contributor + author + name + + Потрібно ввести хочаб одного автора! + + + + + dc + title + + false + + onebox + Ввести основний заголовок файла + Заговолок файла обов'язковий ! + + + + + + + dc + language + iso + false + + dropdown + Виберiть мову головного змiсту файлу, як що мови немає у списку, вибрати (Iнша). Як що вмiст вайлу не є текстовим, наприклад є фотографiєю, тодi вибрати (N/A) + + + +
    +
    + + + + + + + + + + + + + + + + + + + + N/A + + + + Американська (USA) + en_US + + + Англiйська + en + + + Iспанська + es + + + Нiмецька + de + + + Французька + fr + + + Iталiйська + it + + + Японська + ja + + + Китайська + zh + + + Португальська + pt + + + Турецька + tr + + + (Iнша) + other + + + + +
    \ No newline at end of file diff --git a/dspace-api/src/test/data/solr/solr.xml b/dspace-api/src/test/data/solr/solr.xml new file mode 100644 index 0000000000..8f3644098a --- /dev/null +++ b/dspace-api/src/test/data/solr/solr.xml @@ -0,0 +1,3 @@ + + + diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractDSpaceIntegrationTest.java b/dspace-api/src/test/java/org/dspace/AbstractDSpaceIntegrationTest.java similarity index 97% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractDSpaceIntegrationTest.java rename to dspace-api/src/test/java/org/dspace/AbstractDSpaceIntegrationTest.java index e3bb0a0500..1abc4e017d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractDSpaceIntegrationTest.java +++ b/dspace-api/src/test/java/org/dspace/AbstractDSpaceIntegrationTest.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.test; +package org.dspace; import static org.junit.Assert.fail; @@ -17,7 +17,7 @@ import java.util.TimeZone; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.dspace.app.rest.builder.AbstractBuilder; +import org.dspace.builder.AbstractBuilder; import org.dspace.servicemanager.DSpaceKernelImpl; import org.dspace.servicemanager.DSpaceKernelInit; import org.junit.AfterClass; @@ -90,8 +90,9 @@ public class AbstractDSpaceIntegrationTest { } /** - * This method will be run after all tests finish as per @AfterClass. It + * This method will be run after all tests finish as per @AfterClass. It * will clean resources initialized by the @BeforeClass methods. + * @throws java.sql.SQLException */ @AfterClass public static void destroyTestEnvironment() throws SQLException { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractIntegrationTestWithDatabase.java b/dspace-api/src/test/java/org/dspace/AbstractIntegrationTestWithDatabase.java similarity index 89% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractIntegrationTestWithDatabase.java rename to dspace-api/src/test/java/org/dspace/AbstractIntegrationTestWithDatabase.java index 0b4cb7791b..6d95b399ea 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractIntegrationTestWithDatabase.java +++ b/dspace-api/src/test/java/org/dspace/AbstractIntegrationTestWithDatabase.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.test; +package org.dspace; import static org.junit.Assert.fail; @@ -14,21 +14,20 @@ import java.sql.SQLException; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.dspace.app.launcher.ScriptLauncher; -import org.dspace.app.rest.builder.AbstractBuilder; import org.dspace.app.scripts.handler.impl.TestDSpaceRunnableHandler; -import org.dspace.authority.AuthoritySearchService; import org.dspace.authority.MockAuthoritySolrServiceImpl; import org.dspace.authorize.AuthorizeException; +import org.dspace.builder.AbstractBuilder; import org.dspace.content.Community; import org.dspace.core.Context; import org.dspace.core.I18nUtil; import org.dspace.discovery.MockSolrSearchCore; -import org.dspace.discovery.SolrSearchCore; import org.dspace.eperson.EPerson; import org.dspace.eperson.Group; import org.dspace.eperson.factory.EPersonServiceFactory; import org.dspace.eperson.service.EPersonService; import org.dspace.eperson.service.GroupService; +import org.dspace.kernel.ServiceManager; import 
org.dspace.services.factory.DSpaceServicesFactory; import org.dspace.statistics.MockSolrLoggerServiceImpl; import org.dspace.storage.rdbms.DatabaseUtils; @@ -84,11 +83,6 @@ public class AbstractIntegrationTestWithDatabase extends AbstractDSpaceIntegrati */ @BeforeClass public static void initDatabase() { - // Clear our old flyway object. Because this DB is in-memory, its - // data is lost when the last connection is closed. So, we need - // to (re)start Flyway from scratch for each Unit Test class. - DatabaseUtils.clearFlywayDBCache(); - try { // Update/Initialize the database to latest version (via Flyway) DatabaseUtils.updateDatabase(); @@ -181,21 +175,20 @@ public class AbstractIntegrationTestWithDatabase extends AbstractDSpaceIntegrati parentCommunity = null; cleanupContext(); + ServiceManager serviceManager = DSpaceServicesFactory.getInstance().getServiceManager(); // Clear the search core. - MockSolrSearchCore searchService = DSpaceServicesFactory.getInstance() - .getServiceManager() - .getServiceByName(SolrSearchCore.class.getName(), MockSolrSearchCore.class); + MockSolrSearchCore searchService = serviceManager + .getServiceByName(null, MockSolrSearchCore.class); searchService.reset(); - MockSolrLoggerServiceImpl statisticsService = DSpaceServicesFactory.getInstance() - .getServiceManager() - .getServiceByName("solrLoggerService", MockSolrLoggerServiceImpl.class); + MockSolrLoggerServiceImpl statisticsService = serviceManager + .getServiceByName(null, MockSolrLoggerServiceImpl.class); statisticsService.reset(); - MockAuthoritySolrServiceImpl authorityService = DSpaceServicesFactory.getInstance() - .getServiceManager() - .getServiceByName(AuthoritySearchService.class.getName(), MockAuthoritySolrServiceImpl.class); + MockAuthoritySolrServiceImpl authorityService = serviceManager + .getServiceByName(null, MockAuthoritySolrServiceImpl.class); authorityService.reset(); + // Reload our ConfigurationService (to reset configs to defaults again) DSpaceServicesFactory.getInstance().getConfigurationService().reloadConfig(); @@ -209,6 +202,7 @@ public class AbstractIntegrationTestWithDatabase extends AbstractDSpaceIntegrati /** * Utility method to cleanup a created Context object (to save memory). * This can also be used by individual tests to cleanup context objects they create. + * @throws java.sql.SQLException passed through. */ protected void cleanupContext() throws SQLException { // If context still valid, flush all database changes and close it @@ -253,4 +247,4 @@ public class AbstractIntegrationTestWithDatabase extends AbstractDSpaceIntegrati } } } -} \ No newline at end of file +} diff --git a/dspace-api/src/test/java/org/dspace/AbstractUnitTest.java b/dspace-api/src/test/java/org/dspace/AbstractUnitTest.java index cd3669b143..d91240d218 100644 --- a/dspace-api/src/test/java/org/dspace/AbstractUnitTest.java +++ b/dspace-api/src/test/java/org/dspace/AbstractUnitTest.java @@ -75,11 +75,6 @@ public class AbstractUnitTest extends AbstractDSpaceTest { */ @BeforeClass public static void initDatabase() { - // Clear our old flyway object. Because this DB is in-memory, its - // data is lost when the last connection is closed. So, we need - // to (re)start Flyway from scratch for each Unit Test class. 
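The teardown refactor above replaces per-service bean-name lookups with a single `ServiceManager` obtained once and queried by type. A minimal sketch of that lookup pattern follows; the helper class and method names are illustrative only, while the lookup and reset calls mirror the ones in the diff.

```java
import org.dspace.discovery.MockSolrSearchCore;
import org.dspace.kernel.ServiceManager;
import org.dspace.services.factory.DSpaceServicesFactory;
import org.dspace.statistics.MockSolrLoggerServiceImpl;

// Illustrative helper (not part of the patch): the type-only lookup used in destroy().
public class MockServiceResetExample {
    public static void resetMockSolrServices() throws Exception {
        ServiceManager serviceManager = DSpaceServicesFactory.getInstance().getServiceManager();
        // Passing null as the bean name resolves the service by its concrete type,
        // so tests no longer hard-code Spring bean names such as "solrLoggerService".
        MockSolrSearchCore searchCore = serviceManager.getServiceByName(null, MockSolrSearchCore.class);
        searchCore.reset();
        MockSolrLoggerServiceImpl statistics = serviceManager.getServiceByName(null, MockSolrLoggerServiceImpl.class);
        statistics.reset();
    }
}
```

Resolving the mocks by type keeps the shared test base class decoupled from the bean names the old code had to spell out.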
- DatabaseUtils.clearFlywayDBCache(); - try { // Update/Initialize the database to latest version (via Flyway) DatabaseUtils.updateDatabase(); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/ExitException.java b/dspace-api/src/test/java/org/dspace/ExitException.java similarity index 93% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/test/ExitException.java rename to dspace-api/src/test/java/org/dspace/ExitException.java index a377d42238..3e7ce2fdc2 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/ExitException.java +++ b/dspace-api/src/test/java/org/dspace/ExitException.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.test; +package org.dspace; public class ExitException extends SecurityException { private final int status; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/NoExitSecurityManager.java b/dspace-api/src/test/java/org/dspace/NoExitSecurityManager.java similarity index 95% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/test/NoExitSecurityManager.java rename to dspace-api/src/test/java/org/dspace/NoExitSecurityManager.java index 79d75dcaf1..7d98f688ef 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/NoExitSecurityManager.java +++ b/dspace-api/src/test/java/org/dspace/NoExitSecurityManager.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.test; +package org.dspace; import java.security.Permission; diff --git a/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataExportIT.java b/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataExportIT.java new file mode 100644 index 0000000000..d7379351e5 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataExportIT.java @@ -0,0 +1,103 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.bulkedit; + +import static junit.framework.TestCase.assertTrue; + +import java.io.File; +import java.io.FileInputStream; +import java.nio.charset.StandardCharsets; + +import org.apache.commons.cli.ParseException; +import org.apache.commons.io.IOUtils; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.app.launcher.ScriptLauncher; +import org.dspace.app.scripts.handler.impl.TestDSpaceRunnableHandler; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.dspace.scripts.factory.ScriptServiceFactory; +import org.dspace.scripts.service.ScriptService; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.junit.Rule; +import org.junit.Test; +import org.junit.rules.ExpectedException; + +public class MetadataExportIT + extends AbstractIntegrationTestWithDatabase { + + @Rule + public ExpectedException thrown = ExpectedException.none(); + + private final ConfigurationService configurationService + = DSpaceServicesFactory.getInstance().getConfigurationService(); + + @Test + public void metadataExportToCsvTest() throws Exception { + 
context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context) + .build(); + Collection collection = CollectionBuilder.createCollection(context, community) + .build(); + Item item = ItemBuilder.createItem(context, collection) + .withAuthor("Donald, Smith") + .build(); + context.restoreAuthSystemState(); + String fileLocation = configurationService.getProperty("dspace.dir") + + testProps.get("test.exportcsv").toString(); + + String[] args = new String[] {"metadata-export", + "-i", String.valueOf(item.getHandle()), + "-f", fileLocation}; + TestDSpaceRunnableHandler testDSpaceRunnableHandler + = new TestDSpaceRunnableHandler(); + + ScriptLauncher.handleScript(args, ScriptLauncher.getConfig(kernelImpl), + testDSpaceRunnableHandler, kernelImpl); + File file = new File(fileLocation); + String fileContent = IOUtils.toString(new FileInputStream(file), StandardCharsets.UTF_8); + assertTrue(fileContent.contains("Donald, Smith")); + assertTrue(fileContent.contains(String.valueOf(item.getID()))); + } + + @Test(expected = ParseException.class) + public void metadataExportWithoutFileParameter() + throws IllegalAccessException, InstantiationException, ParseException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context) + .build(); + Collection collection = CollectionBuilder.createCollection(context, community) + .build(); + Item item = ItemBuilder.createItem(context, collection) + .withAuthor("Donald, Smith") + .build(); + context.restoreAuthSystemState(); + + String[] args = new String[] {"metadata-export", + "-i", String.valueOf(item.getHandle())}; + TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler(); + + ScriptService scriptService = ScriptServiceFactory.getInstance().getScriptService(); + ScriptConfiguration scriptConfiguration = scriptService.getScriptConfiguration(args[0]); + + DSpaceRunnable script = null; + if (scriptConfiguration != null) { + script = scriptService.createDSpaceRunnableForScriptConfiguration(scriptConfiguration); + } + if (script != null) { + script.initialize(args, testDSpaceRunnableHandler, null); + script.run(); + } + } +} diff --git a/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataExportTest.java b/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataExportTest.java deleted file mode 100644 index 9594e2a2b1..0000000000 --- a/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataExportTest.java +++ /dev/null @@ -1,71 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.bulkedit; - -import static junit.framework.TestCase.assertTrue; - -import java.io.File; -import java.io.FileInputStream; -import java.nio.charset.StandardCharsets; - -import org.apache.commons.io.IOUtils; -import org.dspace.AbstractIntegrationTest; -import org.dspace.app.launcher.ScriptLauncher; -import org.dspace.app.scripts.handler.impl.TestDSpaceRunnableHandler; -import org.dspace.content.Collection; -import org.dspace.content.Community; -import org.dspace.content.Item; -import org.dspace.content.WorkspaceItem; -import org.dspace.content.factory.ContentServiceFactory; -import org.dspace.content.service.CollectionService; -import org.dspace.content.service.CommunityService; -import org.dspace.content.service.InstallItemService; -import 
org.dspace.content.service.ItemService; -import org.dspace.content.service.WorkspaceItemService; -import org.dspace.services.ConfigurationService; -import org.dspace.services.factory.DSpaceServicesFactory; -import org.junit.Test; - -public class MetadataExportTest extends AbstractIntegrationTest { - - private ItemService itemService = ContentServiceFactory.getInstance().getItemService(); - private CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService(); - private CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService(); - private WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance().getWorkspaceItemService(); - private InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService(); - private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); - - @Test - public void metadataExportToCsvTest() throws Exception { - context.turnOffAuthorisationSystem(); - Community community = communityService.create(null, context); - Collection collection = collectionService.create(context, community); - WorkspaceItem wi = workspaceItemService.create(context, collection, true); - Item item = wi.getItem(); - itemService.addMetadata(context, item, "dc", "contributor", "author", null, "Donald, Smith"); - item = installItemService.installItem(context, wi); - context.restoreAuthSystemState(); - String fileLocation = configurationService.getProperty("dspace.dir") + testProps.get("test.exportcsv") - .toString(); - - String[] args = new String[] {"metadata-export", "-i", String.valueOf(item.getHandle()), "-f", fileLocation}; - TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler(); - - ScriptLauncher.handleScript(args, ScriptLauncher.getConfig(kernelImpl), testDSpaceRunnableHandler, kernelImpl); - File file = new File(fileLocation); - String fileContent = IOUtils.toString(new FileInputStream(file), StandardCharsets.UTF_8); - assertTrue(fileContent.contains("Donald, Smith")); - assertTrue(fileContent.contains(String.valueOf(item.getID()))); - - context.turnOffAuthorisationSystem(); - itemService.delete(context, itemService.find(context, item.getID())); - collectionService.delete(context, collectionService.find(context, collection.getID())); - communityService.delete(context, communityService.find(context, community.getID())); - context.restoreAuthSystemState(); - } -} diff --git a/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataImportTest.java b/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataImportTest.java index c0eb2789bc..4a0043586b 100644 --- a/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataImportTest.java +++ b/dspace-api/src/test/java/org/dspace/app/bulkedit/MetadataImportTest.java @@ -7,10 +7,12 @@ */ package org.dspace.app.bulkedit; +import static junit.framework.TestCase.assertEquals; import static junit.framework.TestCase.assertTrue; import java.io.File; +import org.apache.commons.cli.ParseException; import org.apache.commons.lang3.StringUtils; import org.dspace.AbstractIntegrationTest; import org.dspace.app.launcher.ScriptLauncher; @@ -22,16 +24,25 @@ import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.CollectionService; import org.dspace.content.service.CommunityService; import org.dspace.content.service.ItemService; -import org.dspace.services.ConfigurationService; -import 
org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.dspace.scripts.factory.ScriptServiceFactory; +import org.dspace.scripts.service.ScriptService; +import org.junit.Rule; import org.junit.Test; +import org.junit.rules.ExpectedException; public class MetadataImportTest extends AbstractIntegrationTest { - private ItemService itemService = ContentServiceFactory.getInstance().getItemService(); - private CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService(); - private CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService(); - private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + private final ItemService itemService + = ContentServiceFactory.getInstance().getItemService(); + private final CollectionService collectionService + = ContentServiceFactory.getInstance().getCollectionService(); + private final CommunityService communityService + = ContentServiceFactory.getInstance().getCommunityService(); + + @Rule + public ExpectedException thrown = ExpectedException.none(); @Test public void metadataImportTest() throws Exception { @@ -50,6 +61,7 @@ public class MetadataImportTest extends AbstractIntegrationTest { StringUtils.equals( itemService.getMetadata(importedItem, "dc", "contributor", "author", Item.ANY).get(0).getValue(), "Donald, SmithImported")); + assertEquals(importedItem.getSubmitter(), eperson); context.turnOffAuthorisationSystem(); itemService.delete(context, itemService.find(context, importedItem.getID())); @@ -57,4 +69,24 @@ public class MetadataImportTest extends AbstractIntegrationTest { communityService.delete(context, communityService.find(context, community.getID())); context.restoreAuthSystemState(); } + + @Test(expected = ParseException.class) + public void metadataImportWithoutEPersonParameterTest() + throws IllegalAccessException, InstantiationException, ParseException { + String fileLocation = new File(testProps.get("test.importcsv").toString()).getAbsolutePath(); + String[] args = new String[] {"metadata-import", "-f", fileLocation, "-s"}; + TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler(); + + ScriptService scriptService = ScriptServiceFactory.getInstance().getScriptService(); + ScriptConfiguration scriptConfiguration = scriptService.getScriptConfiguration(args[0]); + + DSpaceRunnable script = null; + if (scriptConfiguration != null) { + script = scriptService.createDSpaceRunnableForScriptConfiguration(scriptConfiguration); + } + if (script != null) { + script.initialize(args, testDSpaceRunnableHandler, null); + script.run(); + } + } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CSVMetadataImportReferenceIT.java b/dspace-api/src/test/java/org/dspace/app/csv/CSVMetadataImportReferenceIT.java similarity index 65% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CSVMetadataImportReferenceIT.java rename to dspace-api/src/test/java/org/dspace/app/csv/CSVMetadataImportReferenceIT.java index 53f3966c7c..2dfe3a781f 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CSVMetadataImportReferenceIT.java +++ b/dspace-api/src/test/java/org/dspace/app/csv/CSVMetadataImportReferenceIT.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.csv; +package org.dspace.app.csv; import static 
junit.framework.TestCase.assertEquals; @@ -19,13 +19,18 @@ import java.util.Iterator; import java.util.List; import java.util.UUID; +import org.dspace.AbstractIntegrationTestWithDatabase; import org.dspace.app.bulkedit.MetadataImportException; import org.dspace.app.bulkedit.MetadataImportInvalidHeadingException; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.test.AbstractEntityIntegrationTest; +import org.dspace.app.scripts.handler.impl.TestDSpaceRunnableHandler; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EntityTypeBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.RelationshipTypeBuilder; import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.EntityType; import org.dspace.content.Item; import org.dspace.content.MetadataField; import org.dspace.content.MetadataValue; @@ -35,35 +40,57 @@ import org.dspace.content.service.ItemService; import org.dspace.content.service.MetadataFieldService; import org.dspace.content.service.MetadataValueService; import org.dspace.content.service.RelationshipService; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.dspace.scripts.factory.ScriptServiceFactory; +import org.dspace.scripts.service.ScriptService; import org.junit.Before; import org.junit.Test; -import org.springframework.beans.factory.annotation.Autowired; /** * Created by: Andrew Wood * Date: 26 Jul 2019 */ -public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest { +public class CSVMetadataImportReferenceIT extends AbstractIntegrationTestWithDatabase { //Common collection to utilize for test private Collection col1; - @Autowired - private RelationshipService relationshipService; + private RelationshipService relationshipService = ContentServiceFactory.getInstance().getRelationshipService(); + private ItemService itemService = ContentServiceFactory.getInstance().getItemService(); - @Autowired - private ItemService itemService; + + Community parentCommunity; /** * Setup testing enviorment */ @Before - public void setup() { + public void setup() throws SQLException { context.turnOffAuthorisationSystem(); parentCommunity = CommunityBuilder.createCommunity(context) .withName("Parent Community") .build(); + col1 = CollectionBuilder.createCollection(context, parentCommunity).withName("Collection 1").build(); + + + context.turnOffAuthorisationSystem(); + + EntityType publication = EntityTypeBuilder.createEntityTypeBuilder(context, "Publication").build(); + EntityType person = EntityTypeBuilder.createEntityTypeBuilder(context, "Person").build(); + EntityType project = EntityTypeBuilder.createEntityTypeBuilder(context, "Project").build(); + EntityType orgUnit = EntityTypeBuilder.createEntityTypeBuilder(context, "OrgUnit").build(); + + RelationshipTypeBuilder + .createRelationshipTypeBuilder(context, publication, person, "isAuthorOfPublication", + "isPublicationOfAuthor", 0, null, 0, + null).withCopyToLeft(false).withCopyToRight(true).build(); + + RelationshipTypeBuilder.createRelationshipTypeBuilder(context, publication, project, "isProjectOfPublication", + "isPublicationOfProject", 0, null, 0, + null).withCopyToRight(true).build(); + context.restoreAuthSystemState(); } @@ -102,8 +129,8 @@ public class CSVMetadataImportReferenceIT 
extends AbstractEntityIntegrationTest @Test public void testSingleMdRef() throws Exception { String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,dc.identifier.other", - "+,Person,," + col1.getHandle() + ",0", - "+,Publication,dc.identifier.other:0," + col1.getHandle() + ",1"}; + "+,Person,," + col1.getHandle() + ",0", + "+,Publication,dc.identifier.other:0," + col1.getHandle() + ",1"}; Item[] items = runImport(csv); assertRelationship(items[1], items[0], 1, "left", 0); } @@ -119,7 +146,7 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest performImportScript(csvLines, false); Item[] items = new Item[csvLines.length - 1]; for (int i = 0; i < items.length; i++) { - items[i] = itemService.findByIdOrLegacyId(context, getUUIDByIdentifierOther("" + i).toString()); + items[i] = itemService.findByIdOrLegacyId(context, getUUIDByIdentifierOther("" + i).toString()); } return items; } @@ -132,8 +159,8 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest public void testSingleRowNameRef() throws Exception { String[] csv = {"id,dc.title,relationship.type,relation.isAuthorOfPublication,collection,rowName," + "dc.identifier.other", - "+,Test Item 1,Person,," + col1.getHandle() + ",idVal,0", - "+,Test Item 2,Publication,rowName:idVal," + col1.getHandle() + ",anything,1"}; + "+,Test Item 1,Person,," + col1.getHandle() + ",idVal,0", + "+,Test Item 2,Publication,rowName:idVal," + col1.getHandle() + ",anything,1"}; Item[] items = runImport(csv); assertRelationship(items[1], items[0], 1, "left", 0); } @@ -145,9 +172,9 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest @Test public void testMultiMdRef() throws Exception { String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,dc.identifier.other", - "+,Person,," + col1.getHandle() + ",0", - "+,Person,," + col1.getHandle() + ",1", - "+,Publication,dc.identifier.other:0||dc.identifier.other:1," + col1.getHandle() + ",2"}; + "+,Person,," + col1.getHandle() + ",0", + "+,Person,," + col1.getHandle() + ",1", + "+,Publication,dc.identifier.other:0||dc.identifier.other:1," + col1.getHandle() + ",2"}; Item[] items = runImport(csv); assertRelationship(items[2], items[0], 1, "left", 0); assertRelationship(items[2], items[1], 1, "left", 1); @@ -160,9 +187,9 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest @Test public void testMultiRowNameRef() throws Exception { String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,dc.identifier.other,rowName", - "+,Person,," + col1.getHandle() + ",0,val1", - "+,Person,," + col1.getHandle() + ",1,val2", - "+,Publication,rowName:val1||rowName:val2," + col1.getHandle() + ",2,val3"}; + "+,Person,," + col1.getHandle() + ",0,val1", + "+,Person,," + col1.getHandle() + ",1,val2", + "+,Publication,rowName:val1||rowName:val2," + col1.getHandle() + ",2,val3"}; Item[] items = runImport(csv); assertRelationship(items[2], items[0], 1, "left", 0); assertRelationship(items[2], items[1], 1, "left", 1); @@ -176,11 +203,16 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest public void testSingleUUIDReference() throws Exception { context.turnOffAuthorisationSystem(); Item person = ItemBuilder.createItem(context, col1) - .withRelationshipType("Person") - .build(); + .withTitle("Author1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + 
.withPersonIdentifierFirstName("Donald") + .withRelationshipType("Person") + .build(); context.restoreAuthSystemState(); String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,rowName,dc.identifier.other", - "+,Publication," + person.getID().toString() + "," + col1.getHandle() + ",anything,0"}; + "+,Publication," + person.getID().toString() + "," + col1.getHandle() + ",anything,0"}; Item[] items = runImport(csv); assertRelationship(items[0], person, 1, "left", 0); } @@ -193,12 +225,21 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest public void testMultiUUIDReference() throws Exception { context.turnOffAuthorisationSystem(); Item person = ItemBuilder.createItem(context, col1) + .withTitle("Author1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("Donald") .withRelationshipType("Person") .build(); Item person2 = ItemBuilder.createItem(context, col1) - .withRelationshipType("Person") - .build(); - context.restoreAuthSystemState(); + .withTitle("Author2") + .withIssueDate("2017-10-17") + .withAuthor("Smith, John") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("John") + .withRelationshipType("Person") + .build(); String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,rowName,dc.identifier.other", "+,Publication," + person.getID().toString() + "||" + person2.getID().toString() + "," + col1.getHandle() + ",anything,0"}; @@ -216,12 +257,16 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest context.turnOffAuthorisationSystem(); Item person = ItemBuilder.createItem(context, col1) .withTitle("Person") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("Donald") .withRelationshipType("Person") .build(); String[] csv = {"id,dc.title,relationship.type,relation.isAuthorOfPublication,collection,rowName," + "dc.identifier.other", - "+,Person2,Person,," + col1.getHandle() + ",idVal,0", - "+,Pub1,Publication,dc.title:Person||dc.title:Person2," + col1.getHandle() + ",anything,1"}; + "+,Person2,Person,," + col1.getHandle() + ",idVal,0", + "+,Pub1,Publication,dc.title:Person||dc.title:Person2," + col1.getHandle() + ",anything,1"}; context.restoreAuthSystemState(); Item[] items = runImport(csv); assertRelationship(items[1], person, 1, "left", 0); @@ -238,16 +283,25 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest context.turnOffAuthorisationSystem(); Item person = ItemBuilder.createItem(context, col1) .withTitle("Person") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("Donald") .withRelationshipType("Person") .build(); Item person2 = ItemBuilder.createItem(context, col1) - .withTitle("Person2") - .withRelationshipType("Person") - .build(); + .withTitle("Person2") + .withIssueDate("2017-10-17") + .withAuthor("Smith, John") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("John") + .withRelationshipType("Person") + .build(); + context.restoreAuthSystemState(); String[] csv = {"id,dc.title,relationship.type,relation.isAuthorOfPublication,collection,rowName," + "dc.identifier.other", - "+,Person3,Person,," + col1.getHandle() + ",idVal,0", + "+,Person3,Person,," + col1.getHandle() + ",idVal,0", "+,Pub1,Publication," + person.getID() + 
"||dc.title:Person2||rowName:idVal," + col1.getHandle() + ",anything,1"}; Item[] items = runImport(csv); @@ -264,8 +318,8 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest public void testRefWithSpecialChar() throws Exception { String[] csv = {"id,dc.title,relationship.type,relation.isAuthorOfPublication,collection,rowName," + "dc.identifier.other", - "+,Person:,Person,," + col1.getHandle() + ",idVal,0", - "+,Pub1,Publication,dc.title:Person:," + col1.getHandle() + ",anything,1"}; + "+,Person:,Person,," + col1.getHandle() + ",idVal,0", + "+,Pub1,Publication,dc.title:Person:," + col1.getHandle() + ",anything,1"}; Item[] items = runImport(csv); assertRelationship(items[1], items[0], 1, "left", 0); } @@ -300,14 +354,25 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest @Test(expected = MetadataImportException.class) public void testNonUniqueMDRefInDb() throws Exception { context.turnOffAuthorisationSystem(); - ItemBuilder.createItem(context, col1) - .withRelationshipType("Person") - .withIdentifierOther("1") - .build(); - ItemBuilder.createItem(context, col1) - .withRelationshipType("Person") - .withIdentifierOther("1") - .build(); + Item person = ItemBuilder.createItem(context, col1) + .withTitle("Person") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("Donald") + .withRelationshipType("Person") + .withIdentifierOther("1") + .build(); + Item person2 = ItemBuilder.createItem(context, col1) + .withTitle("Person2") + .withIssueDate("2017-10-17") + .withAuthor("Smith, John") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("John") + .withRelationshipType("Person") + .withIdentifierOther("1") + .build(); + context.restoreAuthSystemState(); String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,dc.identifier.other", "+,Publication,dc.identifier.other:1," + col1.getHandle() + ",2"}; @@ -320,10 +385,15 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest @Test(expected = MetadataImportException.class) public void testNonUniqueMDRefInBoth() throws Exception { context.turnOffAuthorisationSystem(); - ItemBuilder.createItem(context, col1) - .withRelationshipType("Person") - .withIdentifierOther("1") - .build(); + Item person = ItemBuilder.createItem(context, col1) + .withTitle("Person") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("Donald") + .withRelationshipType("Person") + .withIdentifierOther("1") + .build(); context.restoreAuthSystemState(); String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,dc.identifier.other", "+,Person,," + col1.getHandle() + ",1", @@ -382,8 +452,10 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest public void testInvalidRelationshipArchivedOrigin() throws Exception { context.turnOffAuthorisationSystem(); Item testItem = ItemBuilder.createItem(context, col1) - .withRelationshipType("OrgUnit") - .build(); + .withTitle("OrgUnit") + .withIssueDate("2017-10-17") + .withRelationshipType("OrgUnit") + .build(); context.restoreAuthSystemState(); String[] csv = {"id,relationship.type,relation.isAuthorOfPublication,collection,rowName", "+,Person,," + col1.getHandle() + ",1" + @@ -398,6 +470,8 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest public void 
testInvalidRelationshipArchivedTarget() throws Exception { context.turnOffAuthorisationSystem(); Item testItem = ItemBuilder.createItem(context, col1) + .withTitle("OrgUnit") + .withIssueDate("2017-10-17") .withRelationshipType("OrgUnit") .build(); context.restoreAuthSystemState(); @@ -413,26 +487,42 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest @Test public void testValidRelationshipNoDefinedTypesInCSV() throws Exception { context.turnOffAuthorisationSystem(); - Item testItemOne = ItemBuilder.createItem(context, col1) - .withRelationshipType("Person") - .withIdentifierOther("testItemOne") - .build(); - Item testItemTwo = ItemBuilder.createItem(context, col1) - .withRelationshipType("Publication") - .withIdentifierOther("testItemTwo") - .build(); - Item testItemThree = ItemBuilder.createItem(context, col1) - .withRelationshipType("Project") - .withIdentifierOther("testItemThree") - .build(); + + Item testItem = ItemBuilder.createItem(context, col1) + .withTitle("Person") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald") + .withPersonIdentifierLastName("Smith") + .withPersonIdentifierFirstName("Donald") + .withRelationshipType("Person") + .withIdentifierOther("testItemOne") + .build(); + + + Item testItem2 = ItemBuilder.createItem(context, col1) + .withTitle("Publication") + .withIssueDate("2017-10-17") + .withRelationshipType("Publication") + .withIdentifierOther("testItemTwo") + .build(); + + + Item testItem3 = ItemBuilder.createItem(context, col1) + .withTitle("Project") + .withIssueDate("2017-10-17") + .withRelationshipType("Project") + .withIdentifierOther("testItemThree") + .build(); + + context.restoreAuthSystemState(); String[] csv = {"id,relation.isAuthorOfPublication,relation.isPublicationOfProject,collection", - testItemOne.getID().toString() + ",,," + col1.getHandle(), - testItemTwo.getID().toString() + ",dc.identifier.other:testItemOne,," + col1.getHandle(), - testItemThree.getID().toString() + ",,dc.identifier.other:testItemTwo," + col1.getHandle()}; + testItem.getID().toString() + ",,," + col1.getHandle(), + testItem2.getID().toString() + ",dc.identifier.other:testItemOne,," + col1.getHandle(), + testItem3.getID().toString() + ",,dc.identifier.other:testItemTwo," + col1.getHandle()}; performImportScript(csv, false); - assertRelationship(testItemTwo, testItemOne, 1, "left", 0); - assertRelationship(testItemTwo, testItemThree, 1, "left", 0); + assertRelationship(testItem2, testItem, 1, "left", 0); + assertRelationship(testItem2, testItem3, 1, "left", 0); } /** @@ -455,14 +545,17 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest @Test(expected = MetadataImportException.class) public void testInvalidTypeNameDefined() throws Exception { context.turnOffAuthorisationSystem(); + Item testItem = ItemBuilder.createItem(context, col1) - .withRelationshipType("Publication") - .build(); + .withTitle("Publication") + .withIssueDate("2017-10-17") + .withRelationshipType("Publication") + .build(); context.restoreAuthSystemState(); String[] csv = {"id,collection,relationship.type,dc.title," + "relation.isProjectOfPublication,relation.isPublicationOfProject", "+," + col1.getHandle() + ",Project,Title," + - testItem.getID().toString() + "," + testItem.getID().toString() }; + testItem.getID().toString() + "," + testItem.getID().toString()}; performImportScript(csv, true); } @@ -477,17 +570,34 @@ public class CSVMetadataImportReferenceIT extends AbstractEntityIntegrationTest } out.flush(); out.close(); + String 
fileLocation = csvFile.getAbsolutePath(); try { + String[] args = null; if (validateOnly) { - return runDSpaceScript("metadata-import", "-f", csvFile.getAbsolutePath(), "-e", "admin@email.com", - "-s", "-v"); + args = new String[] {"metadata-import", "-f", fileLocation, "-e", eperson.getEmail(), "-s", "-v"}; } else { - return runDSpaceScript("metadata-import", "-f", csvFile.getAbsolutePath(), "-e", "admin@email.com", - "-s"); + args = new String[] {"metadata-import", "-f", fileLocation, "-e", eperson.getEmail(), "-s",}; + } + TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler(); + + ScriptService scriptService = ScriptServiceFactory.getInstance().getScriptService(); + ScriptConfiguration scriptConfiguration = scriptService.getScriptConfiguration(args[0]); + + DSpaceRunnable script = null; + if (scriptConfiguration != null) { + script = scriptService.createDSpaceRunnableForScriptConfiguration(scriptConfiguration); + } + if (script != null) { + script.initialize(args, testDSpaceRunnableHandler, null); + script.run(); + } + if (testDSpaceRunnableHandler.getException() != null) { + throw testDSpaceRunnableHandler.getException(); } } finally { csvFile.delete(); } + return 0; } /** diff --git a/dspace-api/src/test/java/org/dspace/authenticate/IPMatcherTest.java b/dspace-api/src/test/java/org/dspace/authenticate/IPMatcherTest.java index 511ea0da25..6f73c3abc4 100644 --- a/dspace-api/src/test/java/org/dspace/authenticate/IPMatcherTest.java +++ b/dspace-api/src/test/java/org/dspace/authenticate/IPMatcherTest.java @@ -153,6 +153,14 @@ public class IPMatcherTest { assertFalse(ipMatcher.match("0:0:0:0:0:0:0:1")); } + @Test + public void testIPv6FullMaskMatching() throws Exception { + final IPMatcher ipMatcher = new IPMatcher("::2/128"); + + assertTrue(ipMatcher.match("0:0:0:0:0:0:0:2")); + assertFalse(ipMatcher.match("0:0:0:0:0:0:0:1")); + } + @Test public void testAsteriskMatchingSuccess() throws Exception { diff --git a/dspace-api/src/test/java/org/dspace/authority/MockAuthoritySolrServiceImpl.java b/dspace-api/src/test/java/org/dspace/authority/MockAuthoritySolrServiceImpl.java index e1e018ef33..6c0ad5ace8 100644 --- a/dspace-api/src/test/java/org/dspace/authority/MockAuthoritySolrServiceImpl.java +++ b/dspace-api/src/test/java/org/dspace/authority/MockAuthoritySolrServiceImpl.java @@ -21,4 +21,8 @@ public class MockAuthoritySolrServiceImpl extends AuthoritySolrServiceImpl imple //We don't use SOLR in the tests of this module solr = null; } + + public void reset() { + // This method intentionally left blank. 
+ } } diff --git a/dspace-api/src/test/java/org/dspace/authorize/AuthorizeConfigIntegrationTest.java b/dspace-api/src/test/java/org/dspace/authorize/AuthorizeConfigIT.java similarity index 97% rename from dspace-api/src/test/java/org/dspace/authorize/AuthorizeConfigIntegrationTest.java rename to dspace-api/src/test/java/org/dspace/authorize/AuthorizeConfigIT.java index d338bc6e2c..3218c14d7e 100644 --- a/dspace-api/src/test/java/org/dspace/authorize/AuthorizeConfigIntegrationTest.java +++ b/dspace-api/src/test/java/org/dspace/authorize/AuthorizeConfigIT.java @@ -20,7 +20,7 @@ import org.junit.Test; * @author Andrea Bollini (andrea.bollini at 4science.it) * */ -public class AuthorizeConfigIntegrationTest extends AbstractIntegrationTest { +public class AuthorizeConfigIT extends AbstractIntegrationTest { @Test public void testReloadConfiguration() { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractBuilder.java b/dspace-api/src/test/java/org/dspace/builder/AbstractBuilder.java similarity index 96% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/AbstractBuilder.java index faa8c473af..76fd02916f 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/AbstractBuilder.java @@ -5,18 +5,18 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.sql.SQLException; import java.util.List; import org.apache.commons.collections4.CollectionUtils; import org.apache.logging.log4j.Logger; -import org.dspace.app.rest.builder.util.AbstractBuilderCleanupUtil; import org.dspace.authorize.AuthorizeException; import org.dspace.authorize.factory.AuthorizeServiceFactory; import org.dspace.authorize.service.AuthorizeService; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.util.AbstractBuilderCleanupUtil; import org.dspace.content.Bitstream; import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.BitstreamFormatService; @@ -55,8 +55,8 @@ import org.dspace.xmlworkflow.storedcomponents.service.XmlWorkflowItemService; /** * Abstract builder class that holds references to all available services * - * @param This param represents the Model object for the Builder - * @param This param represents the Service object for the builder + * @param This parameter represents the Model object for the Builder + * @param This parameter represents the Service object for the builder * @author Jonas Van Goolen - (jonas@atmire.com) */ public abstract class AbstractBuilder { @@ -96,7 +96,8 @@ public abstract class AbstractBuilder { * This static class will make sure that the objects built with the builders are disposed of in a foreign-key * constraint safe manner by predefining an order */ - private static AbstractBuilderCleanupUtil abstractBuilderCleanupUtil = new AbstractBuilderCleanupUtil(); + private static final AbstractBuilderCleanupUtil abstractBuilderCleanupUtil + = new AbstractBuilderCleanupUtil(); /** * log4j category */ diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractCRUDBuilder.java b/dspace-api/src/test/java/org/dspace/builder/AbstractCRUDBuilder.java similarity index 90% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractCRUDBuilder.java rename to 
dspace-api/src/test/java/org/dspace/builder/AbstractCRUDBuilder.java index 884bcc9e3c..ff2bef51c2 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractCRUDBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/AbstractCRUDBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import org.dspace.core.Context; import org.dspace.core.ReloadableEntity; @@ -13,6 +13,8 @@ import org.dspace.service.DSpaceCRUDService; /** * @author Jonas Van Goolen - (jonas@atmire.com) + * + * @param A specific kind of ReloadableEntity. */ public abstract class AbstractCRUDBuilder extends AbstractBuilder { @@ -20,8 +22,10 @@ public abstract class AbstractCRUDBuilder extends Ab super(context); } + @Override protected abstract DSpaceCRUDService getService(); + @Override public abstract T build(); public void delete(T dso) throws Exception { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractDSpaceObjectBuilder.java b/dspace-api/src/test/java/org/dspace/builder/AbstractDSpaceObjectBuilder.java similarity index 87% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractDSpaceObjectBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/AbstractDSpaceObjectBuilder.java index 02b7e221e8..69cfd0e136 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/AbstractDSpaceObjectBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/AbstractDSpaceObjectBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.sql.SQLException; import java.util.Date; @@ -43,12 +43,15 @@ public abstract class AbstractDSpaceObjectBuilder this.context = context; } + @Override public abstract void cleanup() throws Exception; + @Override protected abstract DSpaceObjectService getService(); + @Override protected B handleException(final Exception e) { log.error(e.getMessage(), e); return null; @@ -143,6 +146,32 @@ public abstract class AbstractDSpaceObjectBuilder } return (B) this; } + /** + * Support method to grant the {@link Constants#READ} permission over an object only to a specific group. 
Any other + * READ permissions will be removed + * + * @param dso + * the DSpaceObject on which grant the permission + * @param eperson + * the eperson that will be granted of the permission + * @return the builder properly configured to build the object with the additional admin permission + */ + protected > B setAdminPermission(DSpaceObject dso, EPerson eperson, + Date startDate) { + try { + + ResourcePolicy rp = authorizeService.createOrModifyPolicy(null, context, null, null, + eperson, startDate, Constants.ADMIN, + "Integration Test", dso); + if (rp != null) { + resourcePolicyService.update(context, rp); + } + } catch (Exception e) { + return handleException(e); + } + return (B) this; + + } /** * Support method to grant {@link Constants#REMOVE} permission to a specific eperson @@ -231,13 +260,15 @@ public abstract class AbstractDSpaceObjectBuilder return (B) this; } + @Override public abstract T build() throws SQLException, AuthorizeException; + @Override public void delete(Context c, T dso) throws Exception { - if (dso != null) { - getService().delete(c, dso); - } - c.complete(); - indexingService.commit(); + if (dso != null) { + getService().delete(c, dso); + } + c.complete(); + indexingService.commit(); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BitstreamBuilder.java b/dspace-api/src/test/java/org/dspace/builder/BitstreamBuilder.java similarity index 98% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BitstreamBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/BitstreamBuilder.java index 42a375a58e..b8942a17d0 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BitstreamBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/BitstreamBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.io.InputStream; @@ -129,6 +129,7 @@ public class BitstreamBuilder extends AbstractDSpaceObjectBuilder { return this; } + @Override public Bitstream build() { try { bitstreamService.update(context, bitstream); @@ -152,7 +153,7 @@ public class BitstreamBuilder extends AbstractDSpaceObjectBuilder { @Override public void cleanup() throws Exception { - try (Context c = new Context()) { + try (Context c = new Context()) { c.turnOffAuthorisationSystem(); // Ensure object and any related objects are reloaded before checking to see what needs cleanup bitstream = c.reloadEntity(bitstream); @@ -163,6 +164,7 @@ public class BitstreamBuilder extends AbstractDSpaceObjectBuilder { } } + @Override protected DSpaceObjectService getService() { return bitstreamService; } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BitstreamFormatBuilder.java b/dspace-api/src/test/java/org/dspace/builder/BitstreamFormatBuilder.java similarity index 98% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BitstreamFormatBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/BitstreamFormatBuilder.java index 3cd7084577..1051712326 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BitstreamFormatBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/BitstreamFormatBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.sql.SQLException; @@ -71,7 +71,6 @@ public class BitstreamFormatBuilder extends 
AbstractCRUDBuilder log.error(e); } catch (AuthorizeException e) { log.error(e); - ; } return bitstreamFormat; } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BundleBuilder.java b/dspace-api/src/test/java/org/dspace/builder/BundleBuilder.java similarity index 95% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BundleBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/BundleBuilder.java index 76d4a90104..614cd54c6d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/BundleBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/BundleBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.sql.SQLException; @@ -25,7 +25,7 @@ public class BundleBuilder extends AbstractDSpaceObjectBuilder { private Bundle bundle; private Item item; private String name; - private List bitstreams = new ArrayList<>(); + private final List bitstreams = new ArrayList<>(); protected BundleBuilder(Context context) { super(context); @@ -52,6 +52,7 @@ public class BundleBuilder extends AbstractDSpaceObjectBuilder { return this; } + @Override public void cleanup() throws Exception { try (Context c = new Context()) { c.turnOffAuthorisationSystem(); @@ -64,10 +65,12 @@ public class BundleBuilder extends AbstractDSpaceObjectBuilder { } } + @Override protected DSpaceObjectService getService() { return bundleService; } + @Override public Bundle build() throws SQLException, AuthorizeException { bundle = bundleService.create(context, item, name); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/ClaimedTaskBuilder.java b/dspace-api/src/test/java/org/dspace/builder/ClaimedTaskBuilder.java similarity index 99% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/ClaimedTaskBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/ClaimedTaskBuilder.java index 72acd3f27d..338739285f 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/ClaimedTaskBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/ClaimedTaskBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.InputStream; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/CollectionBuilder.java b/dspace-api/src/test/java/org/dspace/builder/CollectionBuilder.java similarity index 99% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/CollectionBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/CollectionBuilder.java index d472316c74..da46281290 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/CollectionBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/CollectionBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.io.InputStream; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/CommunityBuilder.java b/dspace-api/src/test/java/org/dspace/builder/CommunityBuilder.java similarity index 99% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/CommunityBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/CommunityBuilder.java index f7b13e117f..5500697da4 100644 --- 
a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/CommunityBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/CommunityBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.io.InputStream; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/EPersonBuilder.java b/dspace-api/src/test/java/org/dspace/builder/EPersonBuilder.java similarity index 87% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/EPersonBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/EPersonBuilder.java index 26fc2b51c4..256b3432d4 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/EPersonBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/EPersonBuilder.java @@ -5,12 +5,14 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.sql.SQLException; import java.util.UUID; +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; import org.dspace.authorize.AuthorizeException; import org.dspace.content.service.DSpaceObjectService; import org.dspace.core.Context; @@ -19,6 +21,7 @@ import org.dspace.eperson.EPerson; import org.dspace.eperson.Group; public class EPersonBuilder extends AbstractDSpaceObjectBuilder { + private static final Logger LOG = LogManager.getLogger(EPersonBuilder.class); private EPerson ePerson; @@ -28,7 +31,7 @@ public class EPersonBuilder extends AbstractDSpaceObjectBuilder { @Override public void cleanup() throws Exception { - try (Context c = new Context()) { + try (Context c = new Context()) { c.turnOffAuthorisationSystem(); // Ensure object and any related objects are reloaded before checking to see what needs cleanup ePerson = c.reloadEntity(ePerson); @@ -36,23 +39,21 @@ public class EPersonBuilder extends AbstractDSpaceObjectBuilder { delete(c, ePerson); c.complete(); } - } + } } + @Override protected DSpaceObjectService getService() { return ePersonService; } + @Override public EPerson build() { try { ePersonService.update(context, ePerson); indexingService.commit(); - } catch (SearchServiceException e) { - e.printStackTrace(); - } catch (SQLException e) { - e.printStackTrace(); - } catch (AuthorizeException e) { - e.printStackTrace(); + } catch (SearchServiceException | SQLException | AuthorizeException e) { + LOG.warn("Failed to complete the EPerson", e); } return ePerson; } @@ -65,10 +66,8 @@ public class EPersonBuilder extends AbstractDSpaceObjectBuilder { private EPersonBuilder create() { try { ePerson = ePersonService.create(context); - } catch (SQLException e) { - e.printStackTrace(); - } catch (AuthorizeException e) { - e.printStackTrace(); + } catch (SQLException | AuthorizeException e) { + LOG.warn("Failed to create the EPerson", e); } return this; } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/EntityTypeBuilder.java b/dspace-api/src/test/java/org/dspace/builder/EntityTypeBuilder.java similarity index 96% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/EntityTypeBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/EntityTypeBuilder.java index 8a2efaffa6..ae0e807198 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/EntityTypeBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/EntityTypeBuilder.java @@ -5,7 +5,7 @@ * * 
http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.sql.SQLException; @@ -53,6 +53,7 @@ public class EntityTypeBuilder extends AbstractBuilder { @Override public void cleanup() throws Exception { - try (Context c = new Context()) { + try (Context c = new Context()) { c.turnOffAuthorisationSystem(); // Ensure object and any related objects are reloaded before checking to see what needs cleanup group = c.reloadEntity(group); @@ -42,7 +42,7 @@ public class GroupBuilder extends AbstractDSpaceObjectBuilder { delete(c, group); c.complete(); } - } + } } public static GroupBuilder createGroup(final Context context) { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/ItemBuilder.java b/dspace-api/src/test/java/org/dspace/builder/ItemBuilder.java similarity index 91% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/ItemBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/ItemBuilder.java index ddc2a1963c..27fdf10038 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/ItemBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/ItemBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.sql.SQLException; @@ -19,6 +19,7 @@ import org.dspace.content.MetadataSchemaEnum; import org.dspace.content.WorkspaceItem; import org.dspace.content.service.DSpaceObjectService; import org.dspace.core.Context; +import org.dspace.eperson.EPerson; import org.dspace.eperson.Group; /** @@ -89,6 +90,10 @@ public class ItemBuilder extends AbstractDSpaceObjectBuilder { return addMetadataValue(item, "relationship", "type", null, relationshipType); } + public ItemBuilder withType(final String type) { + return addMetadataValue(item, "dc", "type", null, type); + } + public ItemBuilder withPublicationIssueNumber(final String issueNumber) { return addMetadataValue(item, "publicationissue", "issueNumber", null, issueNumber); } @@ -126,6 +131,19 @@ public class ItemBuilder extends AbstractDSpaceObjectBuilder { return this; } + /** + * Create an admin group for the collection with the specified members + * + * @param members epersons to add to the admin group + * @return this builder + * @throws SQLException + * @throws AuthorizeException + */ + public ItemBuilder withAdminUser(EPerson ePerson) throws SQLException, AuthorizeException { + return setAdminPermission(item, ePerson, null); + } + + @Override public Item build() { try { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/MetadataFieldBuilder.java b/dspace-api/src/test/java/org/dspace/builder/MetadataFieldBuilder.java similarity index 90% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/MetadataFieldBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/MetadataFieldBuilder.java index 76d411cf14..dfc9112a3f 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/MetadataFieldBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/MetadataFieldBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.sql.SQLException; @@ -64,17 +64,9 @@ public class MetadataFieldBuilder extends AbstractBuilder { return this; } + public ProcessBuilder withProcessStatus(ProcessStatus processStatus) { + 
process.setProcessStatus(processStatus); + return this; + } + + public ProcessBuilder withStartAndEndTime(String startTime, String endTime) throws ParseException { + SimpleDateFormat simpleDateFormat = new SimpleDateFormat("dd/MM/yyyy"); + process.setStartTime(simpleDateFormat.parse(startTime)); + process.setFinishedTime(simpleDateFormat.parse(endTime)); + return this; + } + @Override public void cleanup() throws Exception { try (Context c = new Context()) { @@ -57,6 +71,7 @@ public class ProcessBuilder extends AbstractBuilder { } } + @Override public Process build() { try { processService.update(context, process); @@ -68,6 +83,7 @@ public class ProcessBuilder extends AbstractBuilder { return process; } + @Override protected ProcessService getService() { return processService; } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/RelationshipBuilder.java b/dspace-api/src/test/java/org/dspace/builder/RelationshipBuilder.java similarity index 97% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/RelationshipBuilder.java rename to dspace-api/src/test/java/org/dspace/builder/RelationshipBuilder.java index c054521569..773a4a8b8b 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/builder/RelationshipBuilder.java +++ b/dspace-api/src/test/java/org/dspace/builder/RelationshipBuilder.java @@ -5,7 +5,7 @@ * * http://www.dspace.org/license/ */ -package org.dspace.app.rest.builder; +package org.dspace.builder; import java.io.IOException; import java.sql.SQLException; @@ -56,6 +56,7 @@ public class RelationshipBuilder extends AbstractBuilder> map = new LinkedHashMap<>(); + private final LinkedHashMap> map + = new LinkedHashMap<>(); /** * Constructor that will initialize the Map with a predefined order for deletion diff --git a/dspace-api/src/test/java/org/dspace/content/CommunityTest.java b/dspace-api/src/test/java/org/dspace/content/CommunityTest.java index 812060d019..74f8c20cf2 100644 --- a/dspace-api/src/test/java/org/dspace/content/CommunityTest.java +++ b/dspace-api/src/test/java/org/dspace/content/CommunityTest.java @@ -1013,7 +1013,8 @@ public class CommunityTest extends AbstractDSpaceObjectTest { equalTo(c)); assertThat("testGetAdminObject 1", (Community) communityService.getAdminObject(context, c, Constants.ADD), equalTo(c)); - assertThat("testGetAdminObject 2", communityService.getAdminObject(context, c, Constants.DELETE), nullValue()); + assertThat("testGetAdminObject 2", (Community) communityService.getAdminObject(context, c, Constants.DELETE), + equalTo(c)); assertThat("testGetAdminObject 3", (Community) communityService.getAdminObject(context, c, Constants.ADMIN), equalTo(c)); } diff --git a/dspace-api/src/test/java/org/dspace/content/ItemTest.java b/dspace-api/src/test/java/org/dspace/content/ItemTest.java index 8c3cfa5a04..01a2bea0bc 100644 --- a/dspace-api/src/test/java/org/dspace/content/ItemTest.java +++ b/dspace-api/src/test/java/org/dspace/content/ItemTest.java @@ -490,8 +490,8 @@ public class ItemTest extends AbstractDSpaceObjectTest { // Set the item to have two pieces of metadata for dc.type and dc2.type String dcType = "DC-TYPE"; String testType = "TEST-TYPE"; - itemService.addMetadata(context, it, "dc", "type", null, null, dcType, "accepted", 0); - itemService.addMetadata(context, it, "test", "type", null, null, testType, "accepted", 0); + itemService.addMetadata(context, it, "dc", "type", null, null, dcType); + itemService.addMetadata(context, it, "test", "type", null, null, testType); // Check that only one 
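The new ItemBuilder and ProcessBuilder methods slot into the existing fluent chains. A usage sketch, assuming the builders' factory methods (not shown in this excerpt), an existing collection, an admin EPerson, and a ProcessStatus constant, inside a test method that declares throws Exception:

    // Stores dc.type and grants ADMIN on the item to the given EPerson.
    Item item = ItemBuilder.createItem(context, collection)   // factory method assumed
                           .withType("dataset")
                           .withAdminUser(adminEPerson)
                           .build();

    // Start and end times must match the "dd/MM/yyyy" pattern expected by withStartAndEndTime.
    Process process = processBuilder.withProcessStatus(ProcessStatus.COMPLETED) // builder instance and constant assumed
                                    .withStartAndEndTime("10/01/2020", "20/01/2020")
                                    .build();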
is returned when we ask for all dc.type values List values = itemService.getMetadata(it, "dc", "type", null, null); @@ -1598,8 +1598,8 @@ public class ItemTest extends AbstractDSpaceObjectTest { assertThat("testGetAdminObject 0", (Item) itemService.getAdminObject(context, it, Constants.REMOVE), equalTo(it)); assertThat("testGetAdminObject 1", (Item) itemService.getAdminObject(context, it, Constants.ADD), equalTo(it)); - assertThat("testGetAdminObject 2", (Collection) itemService.getAdminObject(context, it, Constants.DELETE), - equalTo(collection)); + assertThat("testGetAdminObject 2", (Item) itemService.getAdminObject(context, it, Constants.DELETE), + equalTo(it)); assertThat("testGetAdminObject 3", (Item) itemService.getAdminObject(context, it, Constants.ADMIN), equalTo(it)); } diff --git a/dspace-api/src/test/java/org/dspace/content/SiteTest.java b/dspace-api/src/test/java/org/dspace/content/SiteTest.java index 02e868e19b..8cc57410f1 100644 --- a/dspace-api/src/test/java/org/dspace/content/SiteTest.java +++ b/dspace-api/src/test/java/org/dspace/content/SiteTest.java @@ -14,13 +14,16 @@ import static org.junit.Assert.assertTrue; import static org.junit.Assert.fail; import java.sql.SQLException; +import java.util.List; +import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.AbstractUnitTest; import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.SiteService; import org.dspace.core.ConfigurationManager; import org.dspace.core.Constants; +import org.dspace.eperson.Group; import org.junit.After; import org.junit.Before; import org.junit.Test; @@ -143,4 +146,17 @@ public class SiteTest extends AbstractUnitTest { assertThat("testGetURL 0", s.getURL(), equalTo(ConfigurationManager.getProperty("dspace.ui.url"))); } + @Test + public void testAnonymousReadRights() throws Exception { + List groupList = authorizeService.getAuthorizedGroups(context, s, Constants.READ); + boolean foundAnonInList = false; + for (Group group : groupList) { + if (StringUtils.equalsIgnoreCase(group.getName(), "Anonymous")) { + foundAnonInList = true; + } + } + assertTrue(foundAnonInList); + + } + } diff --git a/dspace-api/src/test/java/org/dspace/content/authority/DSpaceControlledVocabularyTest.java b/dspace-api/src/test/java/org/dspace/content/authority/DSpaceControlledVocabularyTest.java index 0d431a5a5b..77cf105dd4 100644 --- a/dspace-api/src/test/java/org/dspace/content/authority/DSpaceControlledVocabularyTest.java +++ b/dspace-api/src/test/java/org/dspace/content/authority/DSpaceControlledVocabularyTest.java @@ -78,7 +78,7 @@ public class DSpaceControlledVocabularyTest extends AbstractDSpaceTest { String text = "north 40"; Collection collection = null; int start = 0; - int limit = 0; + int limit = 10; String locale = null; // This "farm" Controlled Vocab is included in TestEnvironment data // (under /src/test/data/dspaceFolder/) and it should be auto-loaded @@ -86,8 +86,7 @@ public class DSpaceControlledVocabularyTest extends AbstractDSpaceTest { DSpaceControlledVocabulary instance = (DSpaceControlledVocabulary) CoreServiceFactory.getInstance().getPluginService().getNamedPlugin(Class.forName(PLUGIN_INTERFACE), "farm"); assertNotNull(instance); - Choices result = instance.getMatches(field, text, collection, start, - limit, locale); + Choices result = instance.getMatches(text, start, limit, locale); assertEquals("the farm::north 40", result.values[0].value); } diff --git a/dspace-api/src/test/java/org/dspace/core/ContextTest.java 
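The controlled-vocabulary test now reflects the slimmed-down getMatches signature: the metadata field and collection arguments are gone, and the limit was raised from 0 to 10, presumably because a non-positive limit would return no hits. The call reduces to:

    // text, start, limit, locale; as exercised by the test above
    Choices result = instance.getMatches("north 40", 0, 10, null);
    assertEquals("the farm::north 40", result.values[0].value);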
b/dspace-api/src/test/java/org/dspace/core/ContextTest.java index f5697a72dc..0c29e053ec 100644 --- a/dspace-api/src/test/java/org/dspace/core/ContextTest.java +++ b/dspace-api/src/test/java/org/dspace/core/ContextTest.java @@ -130,7 +130,7 @@ public class ContextTest extends AbstractUnitTest { public void testGetCurrentLocale() { //NOTE: CurrentLocale is not initialized in AbstractUnitTest. So it should be DEFAULTLOCALE assertThat("testGetCurrentLocale 0", context.getCurrentLocale(), notNullValue()); - assertThat("testGetCurrentLocale 1", context.getCurrentLocale(), equalTo(I18nUtil.DEFAULTLOCALE)); + assertThat("testGetCurrentLocale 1", context.getCurrentLocale(), equalTo(I18nUtil.getDefaultLocale())); } /** diff --git a/dspace-api/src/test/java/org/dspace/curate/CurationTest.java b/dspace-api/src/test/java/org/dspace/curate/CurationTest.java new file mode 100644 index 0000000000..dadf131c38 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/curate/CurationTest.java @@ -0,0 +1,76 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.curate; + +import org.apache.commons.cli.ParseException; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.app.scripts.handler.impl.TestDSpaceRunnableHandler; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.dspace.scripts.factory.ScriptServiceFactory; +import org.dspace.scripts.service.ScriptService; +import org.junit.Test; + +public class CurationTest extends AbstractIntegrationTestWithDatabase { + + @Test(expected = ParseException.class) + public void curationWithoutEPersonParameterTest() throws Exception { + + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context) + .build(); + Collection collection = CollectionBuilder.createCollection(context, community) + .build(); + context.restoreAuthSystemState(); + String[] args = new String[] {"curate", "-t", CurationClientOptions.getTaskOptions().get(0), + "-i", collection.getHandle()}; + TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler(); + + ScriptService scriptService = ScriptServiceFactory.getInstance().getScriptService(); + ScriptConfiguration scriptConfiguration = scriptService.getScriptConfiguration(args[0]); + + DSpaceRunnable script = null; + if (scriptConfiguration != null) { + script = scriptService.createDSpaceRunnableForScriptConfiguration(scriptConfiguration); + } + if (script != null) { + script.initialize(args, testDSpaceRunnableHandler, null); + script.run(); + } + } + + @Test + public void curationWithEPersonParameterTest() throws Exception { + + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context) + .build(); + Collection collection = CollectionBuilder.createCollection(context, community) + .build(); + context.restoreAuthSystemState(); + String[] args = new String[] {"curate", "-e", "admin@email.com", "-t", + CurationClientOptions.getTaskOptions().get(0), "-i", collection.getHandle()}; + TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler(); + + ScriptService scriptService = 
ScriptServiceFactory.getInstance().getScriptService(); + ScriptConfiguration scriptConfiguration = scriptService.getScriptConfiguration(args[0]); + + DSpaceRunnable script = null; + if (scriptConfiguration != null) { + script = scriptService.createDSpaceRunnableForScriptConfiguration(scriptConfiguration); + } + if (script != null) { + script.initialize(args, testDSpaceRunnableHandler, null); + script.run(); + } + } +} diff --git a/dspace-api/src/test/java/org/dspace/curate/CuratorTest.java b/dspace-api/src/test/java/org/dspace/curate/CuratorTest.java index 8ca6b6c172..0abb3b48ac 100644 --- a/dspace-api/src/test/java/org/dspace/curate/CuratorTest.java +++ b/dspace-api/src/test/java/org/dspace/curate/CuratorTest.java @@ -8,23 +8,27 @@ package org.dspace.curate; import static org.junit.Assert.assertEquals; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; import java.util.HashMap; import java.util.Map; import org.dspace.AbstractUnitTest; import org.dspace.content.DSpaceObject; +import org.dspace.content.Item; import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.SiteService; +import org.dspace.core.factory.CoreServiceFactory; +import org.dspace.ctask.general.NoOpCurationTask; import org.dspace.services.ConfigurationService; import org.junit.Test; /** - * * @author mhwood */ -public class CuratorTest - extends AbstractUnitTest { +public class CuratorTest extends AbstractUnitTest { + private static final SiteService SITE_SERVICE = ContentServiceFactory.getInstance().getSiteService(); static final String RUN_PARAMETER_NAME = "runParameter"; @@ -32,28 +36,32 @@ public class CuratorTest static final String TASK_PROPERTY_NAME = "taskProperty"; static final String TASK_PROPERTY_VALUE = "a property"; - /** Value of a known runtime parameter, if any. */ + /** + * Value of a known runtime parameter, if any. + */ static String runParameter; - /** Value of a known task property, if any. */ + /** + * Value of a known task property, if any. + */ static String taskProperty; /** * Test of curate method, of class Curator. * Currently this just tests task properties and run parameters. + * * @throws java.lang.Exception passed through. */ @Test - public void testCurate_DSpaceObject() - throws Exception { - System.out.println("curate"); + public void testCurate_DSpaceObject() throws Exception { + CoreServiceFactory.getInstance().getPluginService().clearNamedPluginClasses(); final String TASK_NAME = "dummyTask"; // Configure the task to be run. ConfigurationService cfg = kernelImpl.getConfigurationService(); cfg.setProperty("plugin.named.org.dspace.curate.CurationTask", - DummyTask.class.getName() + " = " + TASK_NAME); + DummyTask.class.getName() + " = " + TASK_NAME); cfg.setProperty(TASK_NAME + '.' + TASK_PROPERTY_NAME, TASK_PROPERTY_VALUE); // Get and configure a Curator. @@ -72,12 +80,40 @@ public class CuratorTest // Check the result. 
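Both CurationTest cases repeat the same look-up-and-run sequence for the curate script. If more script tests follow, that block could be folded into a small private helper; a sketch using only the calls already present in the tests (illustrative, not part of this patch):

    private void runDSpaceScript(String[] args, TestDSpaceRunnableHandler handler) throws Exception {
        ScriptService scriptService = ScriptServiceFactory.getInstance().getScriptService();
        // The script name is the first argument ("curate" in these tests).
        ScriptConfiguration scriptConfiguration = scriptService.getScriptConfiguration(args[0]);
        if (scriptConfiguration == null) {
            return;
        }
        DSpaceRunnable script = scriptService.createDSpaceRunnableForScriptConfiguration(scriptConfiguration);
        if (script != null) {
            script.initialize(args, handler, null);
            script.run();
        }
    }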
System.out.format("Task %s result was '%s'%n", - TASK_NAME, instance.getResult(TASK_NAME)); + TASK_NAME, instance.getResult(TASK_NAME)); System.out.format("Task %s status was %d%n", - TASK_NAME, instance.getStatus(TASK_NAME)); + TASK_NAME, instance.getStatus(TASK_NAME)); assertEquals("Unexpected task status", - Curator.CURATE_SUCCESS, instance.getStatus(TASK_NAME)); + Curator.CURATE_SUCCESS, instance.getStatus(TASK_NAME)); assertEquals("Wrong run parameter", RUN_PARAMETER_VALUE, runParameter); assertEquals("Wrong task property", TASK_PROPERTY_VALUE, taskProperty); } + + @Test + public void testCurate_NoOpTask() throws Exception { + + CoreServiceFactory.getInstance().getPluginService().clearNamedPluginClasses(); + + final String TASK_NAME = "noop"; + + // Configure the noop task to be run. + ConfigurationService cfg = kernelImpl.getConfigurationService(); + cfg.setProperty("plugin.named.org.dspace.curate.CurationTask", + NoOpCurationTask.class.getName() + " = " + TASK_NAME); + + // Get and configure a Curator. + Curator curator = new Curator(); + + StringBuilder reporterOutput = new StringBuilder(); + curator.setReporter(reporterOutput); // Send any report to our StringBuilder. + + curator.addTask(TASK_NAME); + Item item = mock(Item.class); + when(item.getType()).thenReturn(2); + when(item.getHandle()).thenReturn("testHandle"); + curator.curate(context, item); + + assertEquals(Curator.CURATE_SUCCESS, curator.getStatus(TASK_NAME)); + assertEquals("No operation performed on testHandle", reporterOutput.toString()); + } } diff --git a/dspace-api/src/test/java/org/dspace/discovery/MetadataFieldIndexFactoryImplTest.java b/dspace-api/src/test/java/org/dspace/discovery/MetadataFieldIndexFactoryImplTest.java new file mode 100644 index 0000000000..b54158c002 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/discovery/MetadataFieldIndexFactoryImplTest.java @@ -0,0 +1,93 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.discovery; + +import static org.junit.Assert.assertTrue; + +import org.apache.solr.common.SolrInputDocument; +import org.dspace.AbstractUnitTest; +import org.dspace.content.MetadataField; +import org.dspace.content.MetadataSchema; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.MetadataFieldService; +import org.dspace.content.service.MetadataSchemaService; +import org.dspace.discovery.indexobject.IndexableMetadataField; +import org.dspace.discovery.indexobject.MetadataFieldIndexFactoryImpl; +import org.junit.Test; + +/** + * Test class for {@link MetadataFieldIndexFactoryImpl} + * + * @author Maria Verdonck (Atmire) on 23/07/2020 + */ +public class MetadataFieldIndexFactoryImplTest extends AbstractUnitTest { + private MetadataSchemaService metadataSchemaService = + ContentServiceFactory.getInstance().getMetadataSchemaService(); + private MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService(); + + private String schemaName = "schema1"; + private String elemName1 = "elem1"; + private String elemName2 = "elem2"; + private String qualName1 = "qual1"; + + private MetadataSchema schema; + private MetadataField field1; + private MetadataField field2; + + @Test + public void test_buildDocument_withQualifier() throws Exception { + context.turnOffAuthorisationSystem(); + schema = 
metadataSchemaService.create(context, schemaName, "htpp://test/schema/"); + field1 = metadataFieldService.create(context, schema, elemName1, qualName1, "note 1"); + + MetadataFieldIndexFactoryImpl fieldIndexFactory = new MetadataFieldIndexFactoryImpl(); + IndexableMetadataField indexableMetadataField = new IndexableMetadataField(this.field1); + SolrInputDocument solrInputDocument = fieldIndexFactory.buildDocument(context, indexableMetadataField); + + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.SCHEMA_FIELD_NAME + "_keyword") + .contains(this.field1.getMetadataSchema().getName())); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.ELEMENT_FIELD_NAME + "_keyword") + .contains(this.field1.getElement())); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.QUALIFIER_FIELD_NAME + "_keyword") + .contains(this.field1.getQualifier())); + + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS + "_keyword") + .contains(this.field1.getQualifier())); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS + "_keyword") + .contains(this.field1.getElement() + "." + this.field1.getQualifier())); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS + "_keyword") + .contains(this.field1.toString('.'))); + + metadataSchemaService.delete(context, schema); + metadataFieldService.delete(context, field1); + context.restoreAuthSystemState(); + } + + @Test + public void test_buildDocument_noQualifier() throws Exception { + context.turnOffAuthorisationSystem(); + schema = metadataSchemaService.create(context, schemaName, "htpp://test/schema/"); + field2 = metadataFieldService.create(context, schema, elemName2, null, "note 2"); + MetadataFieldIndexFactoryImpl fieldIndexFactory = new MetadataFieldIndexFactoryImpl(); + IndexableMetadataField indexableMetadataField = new IndexableMetadataField(this.field2); + SolrInputDocument solrInputDocument = fieldIndexFactory.buildDocument(context, indexableMetadataField); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.SCHEMA_FIELD_NAME + "_keyword") + .contains(this.field2.getMetadataSchema().getName())); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.ELEMENT_FIELD_NAME + "_keyword") + .contains(this.field2.getElement())); + + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS + "_keyword") + .contains(this.field2.getElement())); + assertTrue(solrInputDocument.getFieldValues(MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS + "_keyword") + .contains(this.field2.toString('.'))); + + metadataSchemaService.delete(context, schema); + metadataFieldService.delete(context, field2); + context.restoreAuthSystemState(); + } +} diff --git a/dspace-api/src/test/java/org/dspace/discovery/MockSolrSearchCore.java b/dspace-api/src/test/java/org/dspace/discovery/MockSolrSearchCore.java index 1934ba9f0f..b81e18a473 100644 --- a/dspace-api/src/test/java/org/dspace/discovery/MockSolrSearchCore.java +++ b/dspace-api/src/test/java/org/dspace/discovery/MockSolrSearchCore.java @@ -7,19 +7,35 @@ */ package org.dspace.discovery; +import org.dspace.solr.MockSolrServer; +import org.springframework.beans.factory.DisposableBean; import org.springframework.beans.factory.InitializingBean; import org.springframework.stereotype.Service; /** - * Mock SOLR service for the Search Core + * Mock SOLR service 
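The repeated containment checks on the FIELD_NAME_VARIATIONS field in MetadataFieldIndexFactoryImplTest could be expressed as one loop over the expected variations; an equivalent sketch using only identifiers from the test above:

    for (String expected : new String[] {
            field1.getQualifier(),
            field1.getElement() + "." + field1.getQualifier(),
            field1.toString('.')}) {
        assertTrue(solrInputDocument
            .getFieldValues(MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS + "_keyword")
            .contains(expected));
    }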
for the Search Core. Manages an in-process Solr server + * with an in-memory "search" core. */ @Service -public class MockSolrSearchCore extends SolrSearchCore implements InitializingBean { +public class MockSolrSearchCore extends SolrSearchCore + implements InitializingBean, DisposableBean { + private MockSolrServer mockSolrServer; @Override public void afterPropertiesSet() throws Exception { - //We don't use SOLR in the tests of this module - solr = null; + mockSolrServer = new MockSolrServer("search"); + solr = mockSolrServer.getSolrServer(); } + /** + * Reset the core for the next test. See {@link MockSolrServer#reset()}. + */ + public void reset() { + mockSolrServer.reset(); + } + + @Override + public void destroy() throws Exception { + mockSolrServer.destroy(); + } } diff --git a/dspace-api/src/test/java/org/dspace/eperson/EPersonInWorkflowIT.java b/dspace-api/src/test/java/org/dspace/eperson/EPersonInWorkflowIT.java new file mode 100644 index 0000000000..90b094ccbc --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/eperson/EPersonInWorkflowIT.java @@ -0,0 +1,1558 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.eperson; + +import static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertNotNull; +import static org.junit.Assert.assertNull; +import static org.junit.Assert.assertTrue; +import static org.junit.Assert.fail; + +import java.sql.SQLException; +import java.util.List; +import javax.servlet.http.HttpServletRequest; + +import org.apache.commons.lang3.StringUtils; +import org.apache.logging.log4j.Logger; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.WorkspaceItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.WorkspaceItem; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.CollectionService; +import org.dspace.content.service.CommunityService; +import org.dspace.content.service.InstallItemService; +import org.dspace.content.service.ItemService; +import org.dspace.content.service.WorkspaceItemService; +import org.dspace.eperson.factory.EPersonServiceFactory; +import org.dspace.eperson.service.EPersonService; +import org.dspace.eperson.service.GroupService; +import org.dspace.workflow.WorkflowService; +import org.dspace.workflow.factory.WorkflowServiceFactory; +import org.dspace.xmlworkflow.factory.XmlWorkflowServiceFactory; +import org.dspace.xmlworkflow.service.XmlWorkflowService; +import org.dspace.xmlworkflow.state.Workflow; +import org.dspace.xmlworkflow.storedcomponents.CollectionRole; +import org.dspace.xmlworkflow.storedcomponents.XmlWorkflowItem; +import org.dspace.xmlworkflow.storedcomponents.service.CollectionRoleService; +import org.junit.Before; +import org.junit.Test; +import org.springframework.mock.web.MockHttpServletRequest; + +/** + * Class to test interaction between EPerson deletion and tasks present in the workflow + */ +public class EPersonInWorkflowIT extends AbstractIntegrationTestWithDatabase { + + private final String REVIEW_STEP = "reviewstep"; + private final String CLAIM_ACTION = "claimaction"; + private final String REVIEW_ACTION = "reviewaction"; + private 
final String REVIEW_ROLE = "reviewer"; + private final String EDIT_STEP = "editstep"; + private final String EDIT_ACTION = "editaction"; + private final String FINAL_EDIT_ROLE = "finaleditor"; + private final String FINAL_EDIT_STEP = "finaleditstep"; + private final String FINAL_EDIT_ACTION = "finaleditaction"; + private final String EDIT_ROLE = "editor"; + protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); + protected GroupService groupService = EPersonServiceFactory.getInstance().getGroupService(); + protected CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService(); + protected CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService(); + protected ItemService itemService = ContentServiceFactory.getInstance().getItemService(); + protected InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService(); + protected WorkflowService workflowService = WorkflowServiceFactory.getInstance().getWorkflowService(); + protected WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance() + .getWorkspaceItemService(); + protected XmlWorkflowService xmlWorkflowService = XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService(); + protected CollectionRoleService collectionRoleService = XmlWorkflowServiceFactory.getInstance() + .getCollectionRoleService(); + + + private EPerson workflowUserA; + private EPerson workflowUserB; + private EPerson workflowUserC; + private EPerson workflowUserD; + + private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(EPersonInWorkflowIT.class); + + /** + * This method will be run before every test as per @Before. It will + * initialize resources required for the tests. + * + * Other methods can be annotated with @Before here or in subclasses but no + * execution order is guaranteed + */ + @Before + @Override + public void setUp() throws Exception { + super.setUp(); + + context.turnOffAuthorisationSystem(); + + workflowUserA = EPersonBuilder.createEPerson(context).withEmail("workflowUserA@example.org").build(); + workflowUserB = EPersonBuilder.createEPerson(context).withEmail("workflowUserB@example.org").build(); + workflowUserC = EPersonBuilder.createEPerson(context).withEmail("workflowUserC@example.org").build(); + workflowUserD = EPersonBuilder.createEPerson(context).withEmail("workflowUserD@example.org").build(); + + context.restoreAuthSystemState(); + + } + + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with the task claimed by the user to be deleted. + * This test also verifies that after the task has been passed and the user has been removed from the workflow + * group, the EPerson can be removed. 
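Every test in this class rebuilds the same collection-with-three-workflow-groups and workspace item before starting the workflow. A private helper along these lines (illustrative only, built from calls already used below) would remove much of that repetition:

    private XmlWorkflowItem startTestWorkflowItem(Collection collection) throws Exception {
        WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection)
                                                .withSubmitter(workflowUserA)
                                                .withTitle("Test item full workflow")
                                                .withIssueDate("2019-03-06")
                                                .withSubject("ExtraEntry")
                                                .build();
        return xmlWorkflowService.startWithoutNotify(context, wsi);
    }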
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup1() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - claim it by user B + * - delete user B + * - verify the delete is refused + * - remove user B from step 1 + * - verify that the removal is refused due to B being the last member in the workflow group and the group + * having a claimed item + * - approve it by user B and let it move to step 2 + * - remove user B from step 3 + * - approve it by user C + * - verify that the item is archived without any actions apart from removing user B + * - delete user B + * - verify the delete succeeds + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, false); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + + + assertDeletionOfEperson(workflowUserB, false); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, FINAL_EDIT_ROLE, true); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, true); + + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with a pooled task. + * This test also verifies that after the task has been passed and the user has been removed from the workflow + * group, the EPerson can be removed. 
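assertRemovalOfEpersonFromWorkflowGroup(...) is used throughout these tests but defined outside this excerpt. One plausible shape, shown only to make the expectations readable; the role-to-group lookup and the exact exception thrown on a refused removal are assumptions:

    private void assertRemovalOfEpersonFromWorkflowGroup(EPerson ePerson, Collection collection,
                                                         String roleId, boolean expectedSuccess) {
        try {
            context.turnOffAuthorisationSystem();
            CollectionRole role = collectionRoleService.find(context, collection, roleId); // lookup assumed
            Group group = role.getGroup();
            groupService.removeMember(context, group, ePerson); // assumed to throw when the removal is refused
            groupService.update(context, group);
            assertTrue(expectedSuccess);
        } catch (Exception e) {
            assertFalse(expectedSuccess);
        } finally {
            context.restoreAuthSystemState();
        }
    }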
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup2() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user B + * - verify the delete is refused + * - remove user B from step 1 + * - verify that the removal is refused due to B being the last member in the workflow group and the group + * having a pool task + * - approve it by user B and let it move to step 2 + * - remove user B from step 3 + * - delete user B + * - verify the delete succeeds + * - Approve it by user C + * - verify that the item is archived without any actions apart from the approving in step 2 + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserB, false); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, false); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, true); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, "finaleditor", true); + + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with a group without a task. + * This test also verifies that after user has been removed from the workflow + * group, the EPerson can be removed. 
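assertDeletionOfEperson(...) is likewise defined outside this excerpt; a compact sketch of its intent, again assuming that a refused delete surfaces as an exception from the EPerson service:

    private void assertDeletionOfEperson(EPerson ePerson, boolean expectedSuccess) {
        try {
            context.turnOffAuthorisationSystem();
            ePersonService.delete(context, ePerson); // refusal assumed to surface as an exception
            assertTrue(expectedSuccess);
        } catch (Exception e) {
            assertFalse(expectedSuccess);
        } finally {
            context.restoreAuthSystemState();
        }
    }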
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup3() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user C + * - verify the delete is refused + * - remove user C from step 2 + * - delete user C + * - verify the delete succeeds + * - Approve it by user B + * - verify that the item moved to step 3 without any actions apart from the approving in step 1 + * - Approve it by user B + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserC, false); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserC, collection, EDIT_ROLE, true); + + assertDeletionOfEperson(workflowUserC, true); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies a user can't be removed from a workflow step they have claimed + * items for that task. This test also verifies that the user can be removed from another workflow group where + * they have no claimed items for that task. This test also verifies that after user has performed the task, and the + * user has been removed from the workflow group, the EPerson can be removed. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup4() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - approve it by user B, and let it move to step 2 + * - approve it by user C, and let it move to step 3 + * - claim it by user B + * - remove user B from step 1 + * - delete user B + * - verify the delete is refused + * - remove user B from step 3, verify that the removal is refused due to user B having a claimed task and there + * being no other members in step 3 + * - approve it by user B + * - delete user B + * - verify the delete succeeds + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + + assertDeletionOfEperson(workflowUserB, false); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, true); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, FINAL_EDIT_ROLE, false); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, FINAL_EDIT_ROLE, true); + assertDeletionOfEperson(workflowUserB, true); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies a user can't be removed from a workflow step for which they + * have claimed items. This test also verifies that this verification is using both the step and the collection + * to determine whether the user can be removed from a workflow group. This test also verifies that after the user + * has been removed from the workflow group and the task has been passed, the EPerson can be removed.
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup5() throws Exception { + /* + * This test has the following setup: + * - Collection A - Step 1: user B + * - Collection A - Step 2: user C + * - Collection A - Step 3: user B + * + * - Collection B - Step 1: user B + * + * This test will perform the following checks: + * - create a workspace item in Collection A, and let it move to step 1 + * - claim it by user B + * - delete user B + * - verify the delete is refused + * - remove user B from Col A - step 3 + * - remove user B from Col B - step 1 + * - remove user B from Col A - step 1 + * - Verify that the removal from Col A - step 1 is refused because user B has a claimed task in that + * collection and no other user is present + * - approve it by user B, and let it move to step 2 + * - remove user B from Col A - step 1 + * - verify it succeeds + * - delete user B + * - verify it succeeds + * - approve it by user C + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collectionA = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + Collection collectionB = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collectionA) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collectionA); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collectionA, FINAL_EDIT_ROLE, true); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collectionB, REVIEW_ROLE, true); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collectionA, REVIEW_ROLE, false); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collectionA, REVIEW_ROLE, true); + assertDeletionOfEperson(workflowUserB, true); + + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that the submitter can be removed, and the workflow steps will still be supported + * if there's no submitter assigned to the item + * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup6() throws Exception { + /* + * This test has the following setup: + * - Submitter: user A + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * - create a workspace item, and let it move to step 1 + * - delete the submitter + * - verify it succeeds + * - Approve it by user B + * - verify that the item moved to step 2 + * - 
Approve it by user C + * - verify that the item moved to step 3 + * - Approve it by user B + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserA, true); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test also verifies the user can't be removed from a step with a pooled + * task if they are the only member. This test also verifies the user can be removed from a step with no tasks + * even if they are the only member. This test also verifies that after the task has been passed and the user has + * been removed from the workflow, the EPerson can be removed. This test also verifies that an item is correctly + * archived if the last step has no members left.
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenOnlyUserInGroup7() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user B + * - verify the delete is refused + * - remove user B from step 1 + * - verify the removal is refused + * - remove user B from step 3 + * - verify the removal succeeds + * - approve it by user B + * - verify that the item moved to step 2 + * - remove user B from step 1 + * - delete user B + * - verify the delete succeeds + * - approve it by user C + * - verify that the item is archived without any actions apart from removing user B + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserB, false); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, false); + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, FINAL_EDIT_ROLE, true); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + assertRemovalOfEpersonFromWorkflowGroup(workflowUserB, collection, REVIEW_ROLE, true); + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with a pooled task in the last workflow step. + * This test also verifies that after another user has been added to the workflow groups, the original EPerson + * can be removed.
+ * + * @throws Exception + */ + @Test + public void testDeleteUserAfterReplacingUser1() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - approve it by user B + * - verify that the item moved to step 2 + * - Approve it by user C + * - delete user B + * - verify the delete is refused + * - add user D to workflow step 3 + * - delete user B + * - verify the delete is refused + * - add user D to workflow step 1 + * - delete user B + * - verify the delete succeeds + * - Approve it by user D + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertDeletionOfEperson(workflowUserB, false); + + addUserToWorkflowGroup(workflowUserD, collection, FINAL_EDIT_ROLE); + assertDeletionOfEperson(workflowUserB, false); + addUserToWorkflowGroup(workflowUserD, collection, REVIEW_ROLE); + + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with a pooled task at the beginning of the workflow. + * This test also verifies that after another user has been added to the workflow groups from which the + * original user is being removed, the EPerson can be removed and the workflow process can be resumed with the newly + * added user.
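addUserToWorkflowGroup(...), first used in the test above, is also defined outside this excerpt. A plausible sketch, with the same caveat that the role-to-group lookup is an assumption:

    private void addUserToWorkflowGroup(EPerson ePerson, Collection collection, String roleId) throws Exception {
        context.turnOffAuthorisationSystem();
        CollectionRole role = collectionRoleService.find(context, collection, roleId); // lookup assumed
        Group group = role.getGroup();
        groupService.addMember(context, group, ePerson);
        groupService.update(context, group);
        context.restoreAuthSystemState();
    }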
+ * + * @throws Exception + */ + @Test + public void testDeleteUserAfterReplacingUser2() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user B + * - verify the delete is refused + * - add user D to workflow step 1 + * - add user D to workflow step 3 + * - delete user B + * - verify the delete succeeds + * - Approve it by user D + * - verify that the item moved to step 2 + * - Approve it by user C + * - Approve it by user D + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserB, false); + addUserToWorkflowGroup(workflowUserD, collection, REVIEW_ROLE); + addUserToWorkflowGroup(workflowUserD, collection, FINAL_EDIT_ROLE); + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has an + * item present in a workflow with a pooled task. This test verifies this with an item that has entered the workflow + * and is still to progress to the step where the user will be removed. + * This test also verifies that after a new user has been added to this step, the original user can be removed. This + * test then verifies that the item can proceed through the full workflow and is correctly archived at the end. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserAfterReplacingUser3() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user C + * - verify the delete is refused + * - add user D to workflow step 2 + * - delete user C + * - verify the delete succeeds + * - Approve it by user B + * - verify that the item moved to step 2 + * - Approve it by user D + * - verify that the item moved to step 3 + * - Approve it by user B + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserC, false); + addUserToWorkflowGroup(workflowUserD, collection, EDIT_ROLE); + assertDeletionOfEperson(workflowUserC, true); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has an + * item present in a workflow with a pooled task. This test verifies this with a claimed task in the first workflow + * step. + * This test also verifies that after another user has been added to the respective workflow groups, the original + * user can be deleted. The claimed task will then become available again in the workflow pool where the new user + * can claim it and approve it. + * This test will verify that the remainder of the workflow can be completed. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserAfterReplacingUser4() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - claim it by user B, but don’t approve it + * - delete user B + * - verify the delete is refused + * - add user D to workflow step 1 + * - add user D to workflow step 3 + * - delete user B + * - verify the delete succeeds + * - Verify user D can now claim and approve it + * - verify that the item moved to step 2 + * - claim it by user C + * - approve it by user C + * - verify that the item moved to step 3 + * - Verify user D can claim and approve it + * - verify that the item is archived successfully + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + + assertDeletionOfEperson(workflowUserB, false); + addUserToWorkflowGroup(workflowUserD, collection, REVIEW_ROLE); + addUserToWorkflowGroup(workflowUserD, collection, FINAL_EDIT_ROLE); + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with a claimed task in the middle of the workflow by the + * user to be deleted. + * This test also verifies that after another user is added to the middle workflow step, the original user can be + * deleted and that the task will become available in the pool tasks of the new user. + * This test then verifies that the workflow can be progressed by the new user and completed through the final step, + * and that the item will be archived. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserAfterReplacingUser5() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - Approve it by user B + * - verify that the item moved to step 2 + * - claim it by user C, but don’t approve it + * - delete user C + * - verify the delete is refused + * - add user D to workflow step 2 + * - delete user C + * - verify the delete succeeds + * - Verify user D can now claim and approve it + * - verify that the item moved to step 3 + * - verify that user B can claim and approve it + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + + assertDeletionOfEperson(workflowUserC, false); + addUserToWorkflowGroup(workflowUserD, collection, EDIT_ROLE); + assertDeletionOfEperson(workflowUserC, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson cannot be removed if they are the only member of a Workflow Group that has + * tasks currently assigned to it. This test verifies this with a claimed task by the user to be deleted in the + * final workflow step. + * This test also verifies that after another user has been added to the workflow groups of the to be deleted user, + * the original user can be successfully deleted. + * Afterwards the task can be claimed in the final step by the newly added user and the workflow can be completed. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserAfterReplacingUser6() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B + * - Step 2: user C + * - Step 3: user B + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - Approve it by user B + * - verify that the item moved to step 2 + * - Approve it by user C + * - verify that the item moved to step 3 + * - claim it by user B, but don’t approve it + * - delete user B + * - verify the delete is refused + * - add user D to workflow step 1 + * - add user D to workflow step 3 + * - delete user B + * - verify the delete succeeds + * - Verify user D can now claim and approve it + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB) + .withWorkflowGroup(2, workflowUserC) + .withWorkflowGroup(3, workflowUserB) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertDeletionOfEperson(workflowUserB, false); + addUserToWorkflowGroup(workflowUserD, collection, REVIEW_ROLE); + addUserToWorkflowGroup(workflowUserD, collection, FINAL_EDIT_ROLE); + assertDeletionOfEperson(workflowUserB, true); + + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson can be removed if there is another user is present in the Workflow Group. + * This test verifies this with a pool task in the final workflow step. + * This test also verifies that the other user can claim the task and complete the workflow process. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenMultipleUser1() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B and D + * - Step 2: user C and D + * - Step 3: user B and D + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - approve it by user B + * - verify that the item moved to step 2 + * - Approve it by user C + * - delete user B + * - verify the delete succeeds + * - Approve it by user D + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB, workflowUserD) + .withWorkflowGroup(2, workflowUserC, workflowUserD) + .withWorkflowGroup(3, workflowUserB, workflowUserD) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson can be removed if there is another user is present in the Workflow Group. + * This test verifies this with a pool task in the first workflow step. + * This test also verifies that the other user can claim the task and complete the workflow process the deleted + * user was part of. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenMultipleUser2() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B and D + * - Step 2: user C and D + * - Step 3: user B and D + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user B + * - verify the delete succeeds + * - Approve it by user D + * - verify that the item moved to step 2 + * - Approve it by user C + * - verify that the item moved to step 3 + * - Approve it by user D + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB, workflowUserD) + .withWorkflowGroup(2, workflowUserC, workflowUserD) + .withWorkflowGroup(3, workflowUserB, workflowUserD) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson can be removed if there is another user is present in the Workflow Group. + * This test verifies this with a pool task in the middle workflow step. + * This test also verifies that the other user can claim the task and complete the workflow process. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenMultipleUser3() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B and D + * - Step 2: user C and D + * - Step 3: user B and D + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - delete user C + * - verify the delete succeeds + * - Approve it by user B + * - verify that the item moved to step 2 + * - Approve it by user D + * - verify that the item moved to step 3 + * - Approve it by user B + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB, workflowUserD) + .withWorkflowGroup(2, workflowUserC, workflowUserD) + .withWorkflowGroup(3, workflowUserB, workflowUserD) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + assertDeletionOfEperson(workflowUserC, true); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson can be removed if there is another user is present in the Workflow Group. + * This test verifies this with a claimed task in the first workflow step. + * This test also verifies that the claimed task will return the first workflow's step task pool and that the other + * user can claim the task and progress it. + * This test then verifies that the workflow can be completed and the item will be archived. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenMultipleUser4() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B and D + * - Step 2: user C and D + * - Step 3: user B and D + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - claim it by user B, but don’t approve it + * - delete user B + * - verify the delete succeeds + * - Verify user D can now claim and approve it + * - verify that the item moved to step 2 + * - Approve it by user C + * - verify that the item moved to step 3 + * - Approve it by user D + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB, workflowUserD) + .withWorkflowGroup(2, workflowUserC, workflowUserD) + .withWorkflowGroup(3, workflowUserB, workflowUserD) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson can be removed if there is another user is present in the Workflow Group. + * This test verifies this with a claimed task in the middle workflow step. + * This test also verifies that the claimed task will return the middle workflow's step task pool and that the other + * user can claim the task and progress it. + * This test then verifies that the workflow can be completed and the item will be archived. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenMultipleUser5() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B and D + * - Step 2: user C and D + * - Step 3: user B and D + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - Approve it by user B + * - verify that the item moved to step 2 + * - claim it by user C, but don’t approve it + * - delete user C + * - verify the delete succeeds + * - Verify user D can now claim and approve it + * - verify that the item moved to step 3 + * - Approve it by user B + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB, workflowUserD) + .withWorkflowGroup(2, workflowUserC, workflowUserD) + .withWorkflowGroup(3, workflowUserB, workflowUserD) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + assertDeletionOfEperson(workflowUserC, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + /** + * This test verifies that an EPerson can be removed if there is another user is present in the Workflow Group. + * This test verifies this with a claimed task in the final workflow step. + * This test also verifies that the claimed task will return the final workflow's step task pool and that the other + * user can claim the task and complete workflow. + * This test then verifies that the item will be archived. 
+ * + * @throws Exception + */ + @Test + public void testDeleteUserWhenMultipleUser6() throws Exception { + /* + * This test has the following setup: + * - Step 1: user B and D + * - Step 2: user C and D + * - Step 3: user B and D + * + * This test will perform the following checks: + * - create a workspace item, and let it move to step 1 + * - Approve it by user B + * - verify that the item moved to step 2 + * - Approve it by user C + * - verify that the item moved to step 3 + * - claim it by user B, but don’t approve it + * - delete user B + * - verify the delete succeeds + * - Verify user D can now claim and approve it + * - verify that the item is archived + */ + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUserB, workflowUserD) + .withWorkflowGroup(2, workflowUserC, workflowUserD) + .withWorkflowGroup(3, workflowUserB, workflowUserD) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(workflowUserA) + .withTitle("Test item full workflow") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Workflow workflow = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory().getWorkflow(collection); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + MockHttpServletRequest httpServletRequest = new MockHttpServletRequest(); + httpServletRequest.setParameter("submit_approve", "submit_approve"); + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, REVIEW_STEP, REVIEW_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserC, workflow, workflowItem, EDIT_STEP, EDIT_ACTION); + + + executeWorkflowAction(httpServletRequest, workflowUserB, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + assertDeletionOfEperson(workflowUserB, true); + + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, CLAIM_ACTION); + executeWorkflowAction(httpServletRequest, workflowUserD, workflow, workflowItem, FINAL_EDIT_STEP, + FINAL_EDIT_ACTION); + + assertTrue(workflowItem.getItem().isArchived()); + + } + + + private void addUserToWorkflowGroup(EPerson ePerson, Collection collection, String roleName) throws SQLException { + List roles = collectionRoleService.findByCollection(context, collection); + for (CollectionRole role : roles) { + if (StringUtils.equals(role.getRoleId(), roleName)) { + Group group = role.getGroup(); + groupService.addMember(context, group, ePerson); + } + } + + } + + private void executeWorkflowAction(HttpServletRequest httpServletRequest, EPerson user, + Workflow workflow, XmlWorkflowItem workflowItem, String stepId, String actionId) + throws Exception { + context.setCurrentUser(user); + xmlWorkflowService.doState(context, user, httpServletRequest, workflowItem.getID(), workflow, + workflow.getStep(stepId).getActionConfig(actionId)); + context.setCurrentUser(null); + } + + private void assertRemovalOfEpersonFromWorkflowGroup(EPerson ePerson, Collection collection, String roleName, + boolean shouldSucceed) { + boolean deleteSuccess = false; + boolean deleteError = false; + + try { + List 
roles = collectionRoleService.findByCollection(context, collection); + for (CollectionRole role : roles) { + if (StringUtils.equals(role.getRoleId(), roleName)) { + Group group = role.getGroup(); + groupService.removeMember(context, group, ePerson); + deleteSuccess = true; + } + } + } catch (Exception ex) { + if (ex instanceof IllegalStateException) { + deleteSuccess = false; + deleteError = true; + } else { + deleteSuccess = false; + log.error("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + ": ", ex); + fail("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + } + if (shouldSucceed) { + assertTrue(deleteSuccess); + assertFalse(deleteError); + } else { + assertTrue(deleteError); + assertFalse(deleteSuccess); + } + } + + private void assertDeletionOfEperson(EPerson ePerson, boolean shouldSucceed) throws SQLException { + boolean deleteSuccess; + boolean deleteError = false; + try { + ePersonService.delete(context, ePerson); + deleteSuccess = true; + } catch (Exception ex) { + if (ex instanceof IllegalStateException) { + deleteSuccess = false; + deleteError = true; + } else { + deleteSuccess = false; + log.error("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + ": ", ex); + fail("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + } + + EPerson ePersonCheck = ePersonService.find(context, ePerson.getID()); + if (shouldSucceed) { + assertTrue(deleteSuccess); + assertFalse(deleteError); + assertNull(ePersonCheck); + } else { + assertTrue(deleteError); + assertFalse(deleteSuccess); + assertNotNull(ePerson); + } + } +} diff --git a/dspace-api/src/test/java/org/dspace/eperson/EPersonTest.java b/dspace-api/src/test/java/org/dspace/eperson/EPersonTest.java index 8950bfa409..24bc00cce4 100644 --- a/dspace-api/src/test/java/org/dspace/eperson/EPersonTest.java +++ b/dspace-api/src/test/java/org/dspace/eperson/EPersonTest.java @@ -5,21 +5,43 @@ * * http://www.dspace.org/license/ */ - package org.dspace.eperson; import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertNotNull; +import static org.junit.Assert.assertNull; import static org.junit.Assert.fail; +import java.io.IOException; import java.sql.SQLException; +import java.util.Iterator; +import java.util.List; +import javax.mail.MessagingException; import org.apache.commons.codec.DecoderException; +import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.AbstractUnitTest; import org.dspace.authorize.AuthorizeException; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.WorkspaceItem; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.CollectionService; +import org.dspace.content.service.CommunityService; +import org.dspace.content.service.InstallItemService; +import org.dspace.content.service.ItemService; +import org.dspace.content.service.WorkspaceItemService; import org.dspace.core.Constants; import org.dspace.eperson.factory.EPersonServiceFactory; import org.dspace.eperson.service.EPersonService; +import org.dspace.eperson.service.GroupService; +import org.dspace.workflow.WorkflowException; +import org.dspace.workflow.WorkflowItem; +import org.dspace.workflow.WorkflowItemService; +import org.dspace.workflow.WorkflowService; +import 
org.dspace.workflow.factory.WorkflowServiceFactory; import org.junit.Before; import org.junit.Test; @@ -27,9 +49,29 @@ import org.junit.Test; * @author mwood */ public class EPersonTest extends AbstractUnitTest { - protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); - private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(EPersonTest.class); + protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); + protected GroupService groupService = EPersonServiceFactory.getInstance().getGroupService(); + protected CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService(); + protected CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService(); + protected ItemService itemService = ContentServiceFactory.getInstance().getItemService(); + protected InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService(); + protected WorkflowItemService workflowItemService = WorkflowServiceFactory.getInstance().getWorkflowItemService(); + protected WorkflowService workflowService = WorkflowServiceFactory.getInstance().getWorkflowService(); + protected WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance() + .getWorkspaceItemService(); + + private Community community = null; + private Collection collection = null; + private Item item = null; + + private static final String EMAIL = "test@example.com"; + private static final String FIRSTNAME = "Kevin"; + private static final String LASTNAME = "Van de Velde"; + private static final String NETID = "1985"; + private static final String PASSWORD = "test"; + + private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(EPersonTest.class); public EPersonTest() { } @@ -38,8 +80,8 @@ public class EPersonTest extends AbstractUnitTest { * This method will be run before every test as per @Before. It will * initialize resources required for the tests. 
* - * Other methods can be annotated with @Before here or in subclasses - * but no execution order is guaranteed + * Other methods can be annotated with @Before here or in subclasses but no + * execution order is guaranteed */ @Before @Override @@ -49,12 +91,14 @@ public class EPersonTest extends AbstractUnitTest { context.turnOffAuthorisationSystem(); try { EPerson eperson = ePersonService.create(context); - eperson.setEmail("kevin@dspace.org"); - eperson.setFirstName(context, "Kevin"); - eperson.setLastName(context, "Van de Velde"); - eperson.setNetid("1985"); - eperson.setPassword("test"); + eperson.setEmail(EMAIL); + eperson.setFirstName(context, FIRSTNAME); + eperson.setLastName(context, LASTNAME); + eperson.setNetid(NETID); + eperson.setPassword(PASSWORD); ePersonService.update(context, eperson); + this.community = communityService.create(null, context); + this.collection = collectionService.create(context, this.community); } catch (SQLException | AuthorizeException ex) { log.error("Error in init", ex); fail("Error in init: " + ex.getMessage()); @@ -67,18 +111,76 @@ public class EPersonTest extends AbstractUnitTest { public void destroy() { context.turnOffAuthorisationSystem(); try { - EPerson testPerson = ePersonService.findByEmail(context, "kevin@dspace.org"); + EPerson testPerson = ePersonService.findByEmail(context, EMAIL); if (testPerson != null) { ePersonService.delete(context, testPerson); } - } catch (Exception ex) { + } catch (IOException | SQLException | AuthorizeException ex) { log.error("Error in destroy", ex); fail("Error in destroy: " + ex.getMessage()); } + if (item != null) { + try { + item = itemService.find(context, item.getID()); + itemService.delete(context, item); + } catch (SQLException | AuthorizeException | IOException ex) { + log.error("Error in destroy", ex); + fail("Error in destroy: " + ex.getMessage()); + } + } + if (this.collection != null) { + try { + this.collection = collectionService.find(context, this.collection.getID()); + collectionService.delete(context, this.collection); + } catch (SQLException | AuthorizeException | IOException ex) { + log.error("Error in destroy", ex); + fail("Error in destroy: " + ex.getMessage()); + } + } + if (this.community != null) { + try { + this.community = communityService.find(context, this.community.getID()); + communityService.delete(context, this.community); + } catch (SQLException | AuthorizeException | IOException ex) { + log.error("Error in destroy", ex); + fail("Error in destroy: " + ex.getMessage()); + } + } + context.restoreAuthSystemState(); + item = null; + this.collection = null; + this.community = null; super.destroy(); } + @Test + public void testPreferences() throws Exception { + + String cookies = + "{" + + "\"token_item\":true," + + "\"impersonation\":true," + + "\"redirect\":true," + + "\"language\":true," + + "\"klaro\":true," + + "\"google-analytics\":false" + + "}"; + + ePersonService.addMetadata(context, eperson, "dspace", "agreements", "cookies", null, cookies); + ePersonService.addMetadata(context, eperson, "dspace", "agreements", "end-user", null, "true"); + ePersonService.update(context, eperson); + + assertEquals( + cookies, + ePersonService.getMetadataFirstValue(eperson, "dspace", "agreements", "cookies", null) + ); + assertEquals( + "true", + ePersonService.getMetadataFirstValue(eperson, "dspace", "agreements", "end-user", null) + ); + } + /** * Test of equals method, of class EPerson. 
*/ @@ -684,36 +786,25 @@ public class EPersonTest extends AbstractUnitTest { /** * Test of checkPassword method, of class EPerson. + * + * @throws SQLException + * @throws DecoderException */ @Test public void testCheckPassword() - throws SQLException, DecoderException { - EPerson eperson = ePersonService.findByEmail(context, "kevin@dspace.org"); - ePersonService.checkPassword(context, eperson, "test"); + throws SQLException, DecoderException { + EPerson eperson = ePersonService.findByEmail(context, EMAIL); + ePersonService.checkPassword(context, eperson, PASSWORD); } - /** - * Test of update method, of class EPerson. - */ -/* - @Test - public void testUpdate() - throws Exception - { - System.out.println("update"); - EPerson instance = null; - instance.update(); - // TODO review the generated test code and remove the default call to fail. - fail("The test case is a prototype."); - } -*/ - /** * Test of getType method, of class EPerson. + * + * @throws SQLException */ @Test public void testGetType() - throws SQLException { + throws SQLException { System.out.println("getType"); int expResult = Constants.EPERSON; int result = eperson.getType(); @@ -721,37 +812,270 @@ public class EPersonTest extends AbstractUnitTest { } /** - * Test of getDeleteConstraints method, of class EPerson. + * Simple test if deletion of an EPerson throws any exceptions. + * + * @throws SQLException + * @throws AuthorizeException */ -/* @Test - public void testGetDeleteConstraints() - throws Exception - { - System.out.println("getDeleteConstraints"); - EPerson instance = null; - List expResult = null; - List result = instance.getDeleteConstraints(); - assertEquals(expResult, result); - // TODO review the generated test code and remove the default call to fail. - fail("The test case is a prototype."); + public void testDeleteEPerson() throws SQLException, AuthorizeException { + EPerson deleteEperson = ePersonService.findByEmail(context, EMAIL); + context.turnOffAuthorisationSystem(); + + try { + ePersonService.delete(context, deleteEperson); + } catch (AuthorizeException | IOException ex) { + log.error("Cannot delete EPersion, caught " + ex.getClass().getName() + ":", ex); + fail("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + context.restoreAuthSystemState(); + context.commit(); + EPerson findDeletedEperson = ePersonService.findByEmail(context, EMAIL); + assertNull("EPerson has not been deleted correctly!", findDeletedEperson); } -*/ /** - * Test of getName method, of class EPerson. + * Test that an EPerson has a delete constraint if it submitted an Item. + * + * @throws SQLException */ -/* @Test - public void testGetName() - { - System.out.println("getName"); - EPerson instance = null; - String expResult = ""; - String result = instance.getName(); - assertEquals(expResult, result); - // TODO review the generated test code and remove the default call to fail. - fail("The test case is a prototype."); + public void testDeletionConstraintOfSubmitter() + throws SQLException { + EPerson ep = ePersonService.findByEmail(context, EMAIL); + try { + item = prepareItem(ep); + } catch (SQLException | AuthorizeException | IOException ex) { + log.error("Caught an Exception while initializing an Item. " + ex.getClass().getName() + ": ", ex); + fail("Caught an Exception while initializing an Item. 
" + ex.getClass().getName() + + ": " + ex.getMessage()); + } + + context.turnOffAuthorisationSystem(); + + List tableList = ePersonService.getDeleteConstraints(context, ep); + Iterator iterator = tableList.iterator(); + while (iterator.hasNext()) { + String tableName = iterator.next(); + if (StringUtils.equalsIgnoreCase(tableName, "item")) { + return; + } + } + // if we did not get and EPersonDeletionException or it did not contain the item table, we should fail + // because it was not recognized that the EPerson is used as submitter. + fail("It was not recognized that a EPerson is referenced in the item table."); + } + + /** + * Test that the submitter is set to null if the specified EPerson was + * deleted using cascading. + * + * @throws SQLException + * @throws AuthorizeException + */ + @Test + public void testDeletionOfSubmitterWithAnItem() + throws SQLException, AuthorizeException { + EPerson ep = ePersonService.findByEmail(context, EMAIL); + try { + item = prepareItem(ep); + } catch (SQLException | AuthorizeException | IOException ex) { + log.error("Caught an Exception while initializing an Item. " + ex.getClass().getName() + ": ", ex); + fail("Caught an Exception while initializing an Item. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + assertNotNull(item); + context.turnOffAuthorisationSystem(); + try { + ePersonService.delete(context, ep); + } catch (SQLException | IOException | AuthorizeException ex) { + if (ex.getCause() instanceof EPersonDeletionException) { + fail("Caught an EPersonDeletionException while trying to cascading delete an EPerson: " + + ex.getMessage()); + } else { + log.error("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + ": ", ex); + fail("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + } + item = itemService.find(context, item.getID()); + assertNotNull("Could not load item after cascading deletion of the submitter.", item); + assertNull("Cascading deletion of an EPerson did not set the submitter of an submitted item null.", + item.getSubmitter()); + } + + /** + * Test that an unsubmitted workspace items get deleted when an EPerson gets + * deleted. + * + * @throws SQLException + * @throws IOException + * @throws AuthorizeException + */ + @Test + public void testCascadingDeletionOfUnsubmittedWorkspaceItem() + throws SQLException, AuthorizeException, IOException { + EPerson ep = ePersonService.findByEmail(context, EMAIL); + + context.turnOffAuthorisationSystem(); + WorkspaceItem wsi = prepareWorkspaceItem(ep); + Item item = wsi.getItem(); + itemService.addMetadata(context, item, "dc", "title", null, "en", "Testdocument 1"); + itemService.update(context, item); + context.restoreAuthSystemState(); + context.commit(); + context.turnOffAuthorisationSystem(); + + try { + ePersonService.delete(context, ep); + } catch (SQLException | IOException | AuthorizeException ex) { + if (ex.getCause() instanceof EPersonDeletionException) { + fail("Caught an EPersonDeletionException while trying to cascading delete an EPerson: " + + ex.getMessage()); + } else { + log.error("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": ", ex); + fail("Caught an Exception while deleting an EPerson. 
" + ex.getClass().getName() + + ": " + ex.getMessage()); + } + } + + context.restoreAuthSystemState(); + context.commit(); + + try { + WorkspaceItem restoredWsi = workspaceItemService.find(context, wsi.getID()); + Item restoredItem = itemService.find(context, item.getID()); + assertNull("An unsubmited WorkspaceItem wasn't deleted while cascading deleting the submitter.", + restoredWsi); + assertNull("An unsubmited Item wasn't deleted while cascading deleting the submitter.", restoredItem); + } catch (SQLException ex) { + log.error("SQLException while trying to load previously stored. " + ex); + } + } + + /** + * Test that submitted but not yet archived items do not get delete while + * cascading deletion of an EPerson. + * + * @throws SQLException + * @throws AuthorizeException + * @throws IOException + * @throws MessagingException + * @throws WorkflowException + */ + @Test + public void testCascadingDeleteSubmitterPreservesWorkflowItems() + throws SQLException, AuthorizeException, IOException, MessagingException, WorkflowException { + EPerson ep = ePersonService.findByEmail(context, EMAIL); + WorkspaceItem wsi = null; + + try { + wsi = prepareWorkspaceItem(ep); + } catch (SQLException | AuthorizeException | IOException ex) { + log.error("Caught an Exception while initializing an WorkspaceItem. " + ex.getClass().getName() + + ": ", ex); + fail("Caught an Exception while initializing an WorkspaceItem. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + assertNotNull(wsi); + context.turnOffAuthorisationSystem(); + + // for this test we need an workflow item that is not yet submitted. Currently the Workflow advance + // automatically if nobody is defined to perform a step (see comments of DS-1941). + // We need to configure a collection to have a workflow step and set a person to perform this step. Then we can + // create an item, start the workflow and delete the item's submitter. + Group wfGroup = collectionService.createWorkflowGroup(context, wsi.getCollection(), 1); + collectionService.update(context, wsi.getCollection()); + EPerson groupMember = ePersonService.create(context); + groupMember.setEmail("testCascadingDeleteSubmitterPreservesWorkflowItems2@example.org"); + ePersonService.update(context, groupMember); + wfGroup.addMember(groupMember); + groupService.update(context, wfGroup); + + // DSpace currently contains two workflow systems. The newer XMLWorfklow needs additional tables that are not + // part of the test database yet. While it is expected that it becomes the default workflow system (DS-2059) + // one day, this won't happen before it its backported to JSPUI (DS-2121). + // TODO: add tests using the configurable workflowsystem + int wfiID = workflowService.startWithoutNotify(context, wsi).getID(); + context.restoreAuthSystemState(); + context.commit(); + context.turnOffAuthorisationSystem(); + + // check that the workflow item exists. + assertNotNull("Cannot find currently created WorkflowItem!", workflowItemService.find(context, wfiID)); + + // delete the submitter + try { + ePersonService.delete(context, ep); + } catch (SQLException | IOException | AuthorizeException ex) { + if (ex.getCause() instanceof EPersonDeletionException) { + fail("Caught an EPersonDeletionException while trying to cascading delete an EPerson: " + + ex.getMessage()); + } else { + log.error("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": ", ex); + fail("Caught an Exception while deleting an EPerson. 
" + ex.getClass().getName() + + ": " + ex.getMessage()); + } + } + + context.restoreAuthSystemState(); + context.commit(); + context.turnOffAuthorisationSystem(); + + // check whether the workflow item still exists. + WorkflowItem wfi = workflowItemService.find(context, wfiID); + assertNotNull("Could not load WorkflowItem after cascading deletion of the submitter.", wfi); + assertNull("Cascading deletion of an EPerson did not set the submitter of an submitted WorkflowItem null.", + wfi.getSubmitter()); + } + + /** + * Creates an item, sets the specified submitter. + * + * This method is just an shortcut, so we must not use all the code again + * and again. + * + * @param submitter + * @return the created item. + * @throws SQLException + * @throws AuthorizeException + * @throws IOException + */ + private Item prepareItem(EPerson submitter) + throws SQLException, AuthorizeException, IOException { + context.turnOffAuthorisationSystem(); + WorkspaceItem wsi = prepareWorkspaceItem(submitter); + item = installItemService.installItem(context, wsi); + //we need to commit the changes so we don't block the table for testing + context.restoreAuthSystemState(); + return item; + } + + /** + * Creates a WorkspaceItem and sets the specified submitter. + * + * This method is just an shortcut, so we must not use all the code again + * and again. + * + * @param submitter + * @return the created WorkspaceItem. + * @throws SQLException + * @throws AuthorizeException + * @throws IOException + */ + private WorkspaceItem prepareWorkspaceItem(EPerson submitter) + throws SQLException, AuthorizeException, IOException { + context.turnOffAuthorisationSystem(); + // create a community, a collection and a WorkspaceItem + + WorkspaceItem wsi = workspaceItemService.create(context, this.collection, false); + // set the submitter + wsi.getItem().setSubmitter(submitter); + workspaceItemService.update(context, wsi); + context.restoreAuthSystemState(); + return wsi; } -*/ } diff --git a/dspace-api/src/test/java/org/dspace/eperson/GroupTest.java b/dspace-api/src/test/java/org/dspace/eperson/GroupTest.java index 744c1d666f..7fc3563bae 100644 --- a/dspace-api/src/test/java/org/dspace/eperson/GroupTest.java +++ b/dspace-api/src/test/java/org/dspace/eperson/GroupTest.java @@ -171,7 +171,6 @@ public class GroupTest extends AbstractUnitTest { public void findAll() throws SQLException { List groups = groupService.findAll(context, null); assertThat("findAll 1", groups, notNullValue()); - System.out.println("TEST GROUP OUTPUT " + groups); assertTrue("findAll 2", 0 < groups.size()); } diff --git a/dspace-api/src/test/java/org/dspace/license/MockCCLicenseConnectorServiceImpl.java b/dspace-api/src/test/java/org/dspace/license/MockCCLicenseConnectorServiceImpl.java index bb443ab4a4..bc687a43f5 100644 --- a/dspace-api/src/test/java/org/dspace/license/MockCCLicenseConnectorServiceImpl.java +++ b/dspace-api/src/test/java/org/dspace/license/MockCCLicenseConnectorServiceImpl.java @@ -29,6 +29,7 @@ public class MockCCLicenseConnectorServiceImpl extends CCLicenseConnectorService * @param language - the language * @return a map of mocked licenses with the id and the license */ + @Override public Map retrieveLicenses(String language) { Map ccLicenses = new HashMap<>(); CCLicense mockLicense1 = createMockLicense(1, new int[]{3, 2, 3}); @@ -89,6 +90,7 @@ public class MockCCLicenseConnectorServiceImpl extends CCLicenseConnectorService * @param answerMap - the answers to the different field questions * @return the CC License URI */ + @Override public 
String retrieveRightsByQuestion(final String licenseId, final String language, final Map answerMap) { @@ -105,6 +107,7 @@ public class MockCCLicenseConnectorServiceImpl extends CCLicenseConnectorService * @return a mock license RDF document or null when the URI contains invalid * @throws IOException */ + @Override public Document retrieveLicenseRDFDoc(String licenseURI) throws IOException { if (!StringUtils.contains(licenseURI, "invalid")) { InputStream cclicense = null; diff --git a/dspace-server-webapp/src/test/java/org/dspace/solr/MockSolrServer.java b/dspace-api/src/test/java/org/dspace/solr/MockSolrServer.java similarity index 97% rename from dspace-server-webapp/src/test/java/org/dspace/solr/MockSolrServer.java rename to dspace-api/src/test/java/org/dspace/solr/MockSolrServer.java index 237f35e63f..6faf9a7d1b 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/solr/MockSolrServer.java +++ b/dspace-api/src/test/java/org/dspace/solr/MockSolrServer.java @@ -19,7 +19,7 @@ import org.apache.solr.client.solrj.SolrClient; import org.apache.solr.client.solrj.SolrServerException; import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer; import org.apache.solr.core.CoreContainer; -import org.dspace.app.rest.test.AbstractDSpaceIntegrationTest; +import org.dspace.AbstractDSpaceIntegrationTest; /** * Factory of connections to an in-process embedded Solr service. @@ -110,7 +110,7 @@ public class MockSolrServer { server.deleteByQuery("*:*"); server.commit(); } catch (SolrServerException | IOException e) { - e.printStackTrace(System.err); + log.error("Failed to empty Solr index: {}", e.getMessage(), e); } loadedCores.put(coreName, server); diff --git a/dspace-api/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java b/dspace-api/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java index cca05a12cc..7cb20c23d1 100644 --- a/dspace-api/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java +++ b/dspace-api/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java @@ -16,6 +16,7 @@ import java.util.ArrayList; import java.util.Collections; import java.util.HashMap; import java.util.List; +import java.util.Map; import com.maxmind.geoip2.DatabaseReader; import com.maxmind.geoip2.model.CityResponse; @@ -27,27 +28,29 @@ import com.maxmind.geoip2.record.MaxMind; import com.maxmind.geoip2.record.Postal; import com.maxmind.geoip2.record.RepresentedCountry; import com.maxmind.geoip2.record.Traits; +import org.dspace.solr.MockSolrServer; +import org.springframework.beans.factory.DisposableBean; import org.springframework.beans.factory.InitializingBean; +import org.springframework.stereotype.Service; /** * Mock service that uses an embedded SOLR server for the statistics core. - *
    - * NOTE: this class is overridden by one of the same name - * defined in dspace-server-webapp and declared as a bean there. - * See {@code test/data/dspaceFolder/config/spring/api/solr-services.xml}. Some kind of classpath - * magic makes this work. */ +@Service public class MockSolrLoggerServiceImpl extends SolrLoggerServiceImpl - implements InitializingBean { + implements InitializingBean, DisposableBean { + + private MockSolrServer mockSolrServer; public MockSolrLoggerServiceImpl() { } @Override public void afterPropertiesSet() throws Exception { - //We don't use SOLR in the tests of this module - solr = null; + // Initialize our service with a Mock Solr statistics core + mockSolrServer = new MockSolrServer("statistics"); + solr = mockSolrServer.getSolrServer(); // Mock GeoIP's DatabaseReader DatabaseReader reader = mock(DatabaseReader.class); @@ -58,14 +61,18 @@ public class MockSolrLoggerServiceImpl } /** - * A mock/fake GeoIP CityResponse, which will be used for *all* test statistical requests + * A mock/fake GeoIP CityResponse, which will be used for *all* test + * statistical requests. + * * @return faked CityResponse */ private CityResponse mockCityResponse() { - List cityNames = new ArrayList(Collections.singleton("New York")); - City city = new City(cityNames, 1, 1, new HashMap()); + List cityLocales = new ArrayList(Collections.singleton("en")); + Map cityNames = new HashMap<>(); + cityNames.put("en", "New York"); + City city = new City(cityLocales, 1, 1, cityNames); - List countryNames = new ArrayList(Collections.singleton("United States")); + List countryNames = new ArrayList<>(Collections.singleton("United States")); Country country = new Country(countryNames, 1, 1, "US", new HashMap()); Location location = new Location(1, 1, 40.760498D, -73.9933D, 501, 1, "EST"); @@ -73,7 +80,17 @@ public class MockSolrLoggerServiceImpl Postal postal = new Postal("10036", 1); return new CityResponse(city, new Continent(), country, location, new MaxMind(), postal, - country, new RepresentedCountry(), new ArrayList<>(0), - new Traits()); + country, new RepresentedCountry(), new ArrayList<>(0), + new Traits()); + } + + /** Reset the core for the next test. See {@link MockSolrServer#reset()}. 
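+     * Clearing the core between tests ensures statistics written by one test do not leak into the next.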
*/ + public void reset() { + mockSolrServer.reset(); + } + + @Override + public void destroy() throws Exception { + mockSolrServer.destroy(); } } diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/FailedOpenURLTrackerServiceImplTest.java b/dspace-api/src/test/java/org/dspace/statistics/export/FailedOpenURLTrackerServiceImplTest.java new file mode 100644 index 0000000000..25c1a9b02b --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/statistics/export/FailedOpenURLTrackerServiceImplTest.java @@ -0,0 +1,85 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import static org.junit.Assert.assertEquals; +import static org.mockito.Matchers.any; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.when; + +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.List; + +import org.dspace.core.Context; +import org.dspace.statistics.export.dao.OpenURLTrackerDAO; +import org.junit.Test; +import org.junit.runner.RunWith; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.runners.MockitoJUnitRunner; + +/** + * Class to test the FailedOpenURLTrackerServiceImpl + */ +@RunWith(MockitoJUnitRunner.class) +public class FailedOpenURLTrackerServiceImplTest { + + @InjectMocks + private FailedOpenURLTrackerServiceImpl openURLTrackerLoggerService; + + @Mock + private Context context; + + @Mock + private OpenURLTracker openURLTracker; + + @Mock + private OpenURLTrackerDAO openURLTrackerDAO; + + /** + * Tests the remove method + * @throws SQLException + */ + @Test + public void testRemove() throws SQLException { + openURLTrackerLoggerService.remove(context, openURLTracker); + + Mockito.verify(openURLTrackerDAO, times(1)).delete(context, openURLTracker); + + } + + /** + * Tests the findAll method + * @throws SQLException + */ + @Test + public void testFindAll() throws SQLException { + List trackers = new ArrayList<>(); + + when(openURLTrackerDAO.findAll(context, OpenURLTracker.class)).thenReturn(trackers); + + assertEquals("TestFindAll 0", trackers, openURLTrackerLoggerService.findAll(context)); + } + + /** + * Tests the create method + * @throws SQLException + */ + @Test + public void testCreate() throws SQLException { + OpenURLTracker tracker = new OpenURLTracker(); + + when(openURLTrackerDAO.create(any(), any())).thenReturn(tracker); + + assertEquals("TestCreate 0", tracker, openURLTrackerLoggerService.create(context)); + } + + +} diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/ITIrusExportUsageEventListener.java b/dspace-api/src/test/java/org/dspace/statistics/export/ITIrusExportUsageEventListener.java new file mode 100644 index 0000000000..75ee6e4008 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/statistics/export/ITIrusExportUsageEventListener.java @@ -0,0 +1,418 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertTrue; +import static org.mockito.Matchers.anyString; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + 
+import java.io.File; +import java.io.FileInputStream; +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.List; +import java.util.regex.Pattern; +import javax.servlet.http.HttpServletRequest; + +import org.apache.commons.codec.CharEncoding; +import org.apache.log4j.Logger; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.authorize.AuthorizeException; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EntityTypeBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.content.Bitstream; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.EntityType; +import org.dspace.content.Item; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.BitstreamService; +import org.dspace.content.service.BundleService; +import org.dspace.content.service.CollectionService; +import org.dspace.content.service.CommunityService; +import org.dspace.content.service.EntityTypeService; +import org.dspace.content.service.InstallItemService; +import org.dspace.content.service.ItemService; +import org.dspace.content.service.WorkspaceItemService; +import org.dspace.core.Context; +import org.dspace.eperson.factory.EPersonServiceFactory; +import org.dspace.eperson.service.EPersonService; +import org.dspace.eperson.service.GroupService; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.statistics.export.factory.OpenURLTrackerLoggerServiceFactory; +import org.dspace.statistics.export.service.FailedOpenURLTrackerService; +import org.dspace.usage.UsageEvent; +import org.junit.After; +import org.junit.Before; +import org.junit.Test; + +/** + * Test class for the IrusExportUsageEventListener + */ +//@RunWith(MockitoJUnitRunner.class) +public class ITIrusExportUsageEventListener extends AbstractIntegrationTestWithDatabase { + + private static Logger log = Logger.getLogger(ITIrusExportUsageEventListener.class); + + + protected CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService(); + protected ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + protected CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService(); + protected ItemService itemService = ContentServiceFactory.getInstance().getItemService(); + protected InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService(); + protected WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance().getWorkspaceItemService(); + protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); + protected GroupService groupService = EPersonServiceFactory.getInstance().getGroupService(); + protected BundleService bundleService = ContentServiceFactory.getInstance().getBundleService(); + protected BitstreamService bitstreamService = ContentServiceFactory.getInstance().getBitstreamService(); + protected EntityTypeService entityTypeService = ContentServiceFactory.getInstance().getEntityTypeService(); + protected FailedOpenURLTrackerService failedOpenURLTrackerService = + OpenURLTrackerLoggerServiceFactory.getInstance().getOpenUrlTrackerLoggerService(); + + 
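+    // Collects every URL that MockOpenUrlServiceImpl reports as successfully sent, so the tests below can assert on it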
protected ArrayList testProcessedUrls = DSpaceServicesFactory.getInstance().getServiceManager() + .getServiceByName("testProcessedUrls", + ArrayList.class); + + private IrusExportUsageEventListener exportUsageEventListener = + DSpaceServicesFactory.getInstance() + .getServiceManager() + .getServicesByType(IrusExportUsageEventListener.class) + .get(0); + + private Item item; + private Item itemNotToBeProcessed; + private Bitstream bitstream; + private Bitstream bitstreamNotToBeProcessed; + private EntityType entityType; + private Community community; + private Collection collection; + + private String encodedUrl; + private String encodedUIUrl; + + + /** + * Initializes the test by setting up all objects needed to create a test item + */ + @Before() + public void setUp() throws Exception { + super.setUp(); + + configurationService.setProperty("irus.statistics.tracker.enabled", true); + configurationService.setProperty("irus.statistics.tracker.type-field", "dc.type"); + configurationService.setProperty("irus.statistics.tracker.type-value", "Excluded type"); + + + context.turnOffAuthorisationSystem(); + try { + + entityType = EntityTypeBuilder.createEntityTypeBuilder(context, "Publication").build(); + community = CommunityBuilder.createCommunity(context).build(); + collection = CollectionBuilder.createCollection(context, community).build(); + item = ItemBuilder.createItem(context, collection) + .withRelationshipType(entityType.getLabel()) + .build(); + + File f = new File(testProps.get("test.bitstream").toString()); + bitstream = BitstreamBuilder.createBitstream(context, item, new FileInputStream(f)).build(); + + itemNotToBeProcessed = ItemBuilder.createItem(context, collection) + .withRelationshipType(entityType.getLabel()) + .withType("Excluded type") + .build(); + File itemNotToBeProcessedFile = new File(testProps.get("test.bitstream").toString()); + bitstreamNotToBeProcessed = BitstreamBuilder + .createBitstream(context, itemNotToBeProcessed, new FileInputStream(itemNotToBeProcessedFile)) + .build(); + + String dspaceUrl = configurationService.getProperty("dspace.server.url"); + encodedUrl = URLEncoder.encode(dspaceUrl, CharEncoding.UTF_8); + String dspaceUIUrl = configurationService.getProperty("dspace.ui.url"); + encodedUIUrl = URLEncoder.encode(dspaceUIUrl, CharEncoding.UTF_8); + + + } catch (Exception e) { + log.error(e.getMessage(), e); + } finally { + context.restoreAuthSystemState(); + } + } + + /** + * Clean up the created objects + * Empty the testProcessedUrls used to store succeeded urls + * Empty the database table where the failed urls are logged + */ + @After + public void destroy() throws Exception { + try { + context.turnOffAuthorisationSystem(); + + List all = failedOpenURLTrackerService.findAll(context); + for (OpenURLTracker tracker : all) { + failedOpenURLTrackerService.remove(context, tracker); + } + + // Clear the list of processedUrls + testProcessedUrls.clear(); + + } catch (Exception e) { + log.error(e.getMessage(), e); + } finally { + try { + context.complete(); + } catch (SQLException e) { + log.error(e); + } + } + super.destroy(); + } + + /** + * Test whether the usage event of an item meeting all conditions is processed and succeeds + */ + @Test + public void testReceiveEventOnItemThatShouldBeProcessed() throws UnsupportedEncodingException, SQLException { + HttpServletRequest request = mock(HttpServletRequest.class); + when(request.getRemoteAddr()).thenReturn("client-ip"); + when(request.getHeader(anyString())).thenReturn(null); + + UsageEvent usageEvent = 
mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(item); + when(usageEvent.getRequest()).thenReturn(request); + when(usageEvent.getContext()).thenReturn(new Context()); + + exportUsageEventListener.receiveEvent(usageEvent); + + + List all = failedOpenURLTrackerService.findAll(context); + + + String regex = "https://irus.jisc.ac.uk/counter/test/\\?url_ver=Z39.88-2004&req_id=" + + URLEncoder.encode(request.getRemoteAddr(), "UTF-8") + "&req_dat=&rft" + + ".artnum=oai%3Alocalhost%3A" + URLEncoder.encode(item.getHandle(), "UTF-8") + "&rfr_dat=&rfr_id" + + "=localhost&url_tim=" + ".*" + "?&svc_dat=" + encodedUIUrl + "%2Fhandle%2F" + URLEncoder + .encode(item.getHandle(), "UTF-8") + "&rft_dat=Investigation"; + + boolean isMatch = matchesString(String.valueOf(testProcessedUrls.get(0)), regex); + + assertEquals(1, testProcessedUrls.size()); + assertTrue(isMatch); + assertEquals(0, all.size()); + + + } + + /** + * Test whether the usage event of an item meeting all conditions is processed but fails + */ + @Test + public void testReceiveEventOnItemThatShouldBeProcessedFailed() throws SQLException, UnsupportedEncodingException { + HttpServletRequest request = mock(HttpServletRequest.class); + when(request.getRemoteAddr()).thenReturn("client-ip-fail"); + when(request.getHeader(anyString())).thenReturn(null); + + UsageEvent usageEvent = mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(item); + when(usageEvent.getRequest()).thenReturn(request); + when(usageEvent.getContext()).thenReturn(new Context()); + + exportUsageEventListener.receiveEvent(usageEvent); + + + List all = failedOpenURLTrackerService.findAll(context); + + String regex = "https://irus.jisc.ac.uk/counter/test/\\?url_ver=Z39.88-2004&req_id=" + + URLEncoder.encode(request.getRemoteAddr(), "UTF-8") + "&req_dat=&rft" + + ".artnum=oai%3Alocalhost%3A" + URLEncoder.encode(item.getHandle(), "UTF-8") + "&rfr_dat=&rfr_id" + + "=localhost&url_tim=" + ".*" + "?&svc_dat=" + encodedUIUrl + "%2Fhandle%2F" + URLEncoder + .encode(item.getHandle(), "UTF-8") + "&rft_dat=Investigation"; + + boolean isMatch = matchesString(all.get(0).getUrl(), regex); + + assertEquals(0, testProcessedUrls.size()); + + assertEquals(1, all.size()); + assertTrue(isMatch); + } + + /** + * Test whether the usage event of an item that does not meet all conditions is not processed + */ + @Test + public void testReceiveEventOnItemThatShouldNotBeProcessed() throws SQLException, AuthorizeException { + context.turnOffAuthorisationSystem(); + + HttpServletRequest request = mock(HttpServletRequest.class); + + UsageEvent usageEvent = mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(itemNotToBeProcessed); + when(usageEvent.getRequest()).thenReturn(request); + when(usageEvent.getContext()).thenReturn(new Context()); + + itemService.clearMetadata(context, item, "relationship", "type", null, Item.ANY); + itemService.addMetadata(context, item, "relationship", "type", null, null, "OrgUnit"); + itemService.update(context, item); + + context.restoreAuthSystemState(); + + // doCallRealMethod().when(IrusExportUsageEventListener).receiveEvent(usageEvent); + exportUsageEventListener.receiveEvent(usageEvent); + + List all = failedOpenURLTrackerService.findAll(context); + + + assertEquals(0, testProcessedUrls.size()); + assertEquals(0, all.size()); + } + + /** + * Test whether the usage event of a bitstream meeting all conditions is processed and succeeds + */ + @Test + public void testReceiveEventOnBitstreamThatShouldBeProcessed() throws SQLException, 
UnsupportedEncodingException { + HttpServletRequest request = mock(HttpServletRequest.class); + when(request.getRemoteAddr()).thenReturn("client-ip"); + when(request.getHeader(anyString())).thenReturn(null); + + UsageEvent usageEvent = mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(bitstream); + when(usageEvent.getRequest()).thenReturn(request); + when(usageEvent.getContext()).thenReturn(new Context()); + + exportUsageEventListener.receiveEvent(usageEvent); + + String regex = "https://irus.jisc.ac.uk/counter/test/\\?url_ver=Z39.88-2004&req_id=" + + URLEncoder.encode(request.getRemoteAddr(), "UTF-8") + "&req_dat=&rft" + + ".artnum=oai%3Alocalhost%3A" + URLEncoder.encode(item.getHandle(), "UTF-8") + "&rfr_dat=&rfr_id" + + "=localhost&url_tim=" + ".*" + "?&svc_dat=" + encodedUrl + "%2Fapi%2Fcore%2Fbitstreams" + + "%2F" + bitstream.getID() + "%2Fcontent" + "&rft_dat=Request"; + + boolean isMatch = matchesString(String.valueOf(testProcessedUrls.get(0)), regex); + + assertEquals(1, testProcessedUrls.size()); + assertTrue(isMatch); + + List all = failedOpenURLTrackerService.findAll(context); + assertEquals(0, all.size()); + } + + /** + * Test whether the usage event of a bitstream meeting all conditions is processed but fails + */ + @Test + public void testReceiveEventOnBitstreamThatShouldBeProcessedFail() throws UnsupportedEncodingException, + SQLException { + HttpServletRequest request = mock(HttpServletRequest.class); + when(request.getRemoteAddr()).thenReturn("client-ip-fail"); + when(request.getHeader(anyString())).thenReturn(null); + + UsageEvent usageEvent = mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(bitstream); + when(usageEvent.getRequest()).thenReturn(request); + when(usageEvent.getContext()).thenReturn(new Context()); + + exportUsageEventListener.receiveEvent(usageEvent); + + List all = failedOpenURLTrackerService.findAll(context); + + String regex = "https://irus.jisc.ac.uk/counter/test/\\?url_ver=Z39.88-2004&req_id=" + + URLEncoder.encode(request.getRemoteAddr(), "UTF-8") + "&req_dat=&rft" + + ".artnum=oai%3Alocalhost%3A" + URLEncoder.encode(item.getHandle(), "UTF-8") + "&rfr_dat=&rfr_id" + + "=localhost&url_tim=" + ".*" + "?&svc_dat=" + encodedUrl + "%2Fapi%2Fcore%2Fbitstreams" + + "%2F" + bitstream.getID() + "%2Fcontent" + "&rft_dat=Request"; + + + boolean isMatch = matchesString(all.get(0).getUrl(), regex); + + assertEquals(1, all.size()); + assertEquals(true, isMatch); + assertEquals(0, testProcessedUrls.size()); + + } + + /** + * Test whether the usage event of a bitstream that does not meet all conditions is not processed + */ + @Test + public void testReceiveEventOnBitstreamThatShouldNotBeProcessed() throws SQLException, AuthorizeException { + context.turnOffAuthorisationSystem(); + HttpServletRequest request = mock(HttpServletRequest.class); + when(request.getRemoteAddr()).thenReturn("client-ip-fail"); + when(request.getHeader(anyString())).thenReturn(null); + + UsageEvent usageEvent = mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(bitstreamNotToBeProcessed); + when(usageEvent.getRequest()).thenReturn(request); + when(usageEvent.getContext()).thenReturn(new Context()); + + itemService.clearMetadata(context, item, "relationship", "type", null, Item.ANY); + itemService.addMetadata(context, item, "relationship", "type", null, null, "OrgUnit"); + itemService.update(context, item); + + context.restoreAuthSystemState(); + + exportUsageEventListener.receiveEvent(usageEvent); + + List all = 
failedOpenURLTrackerService.findAll(context); + + + assertEquals(0, all.size()); + assertEquals(0, testProcessedUrls.size()); + + } + + /** + * Test that an object that is not an Item or Bitstream is not processed + */ + @Test + public void testReceiveEventOnNonRelevantObject() throws SQLException { + + HttpServletRequest request = mock(HttpServletRequest.class); + + UsageEvent usageEvent = mock(UsageEvent.class); + when(usageEvent.getObject()).thenReturn(community); + when(usageEvent.getContext()).thenReturn(new Context()); + + exportUsageEventListener.receiveEvent(usageEvent); + + List all = failedOpenURLTrackerService.findAll(context); + + + assertEquals(0, all.size()); + assertEquals(0, testProcessedUrls.size()); + + } + + /** + * Method to test if a string matches a regex + * + * @param string + * @param regex + * @return whether the regex matches the string + */ + private boolean matchesString(String string, String regex) { + + Pattern p = Pattern.compile(regex); + + if (p.matcher(string).matches()) { + return true; + } + return false; + } + + +} diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/ITRetryFailedOpenUrlTracker.java b/dspace-api/src/test/java/org/dspace/statistics/export/ITRetryFailedOpenUrlTracker.java new file mode 100644 index 0000000000..a445a6540f --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/statistics/export/ITRetryFailedOpenUrlTracker.java @@ -0,0 +1,182 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export; + +import static org.junit.Assert.assertEquals; + +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.List; + +import org.apache.log4j.Logger; +import org.dspace.AbstractIntegrationTest; +import org.dspace.app.scripts.handler.impl.TestDSpaceRunnableHandler; +import org.dspace.scripts.DSpaceRunnable; +import org.dspace.scripts.configuration.ScriptConfiguration; +import org.dspace.scripts.factory.ScriptServiceFactory; +import org.dspace.scripts.service.ScriptService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.statistics.export.factory.OpenURLTrackerLoggerServiceFactory; +import org.dspace.statistics.export.service.FailedOpenURLTrackerService; +import org.junit.After; +import org.junit.Test; + +/** + * Class to test the RetryFailedOpenUrlTracker + */ +public class ITRetryFailedOpenUrlTracker extends AbstractIntegrationTest { + + private static Logger log = Logger.getLogger(ITRetryFailedOpenUrlTracker.class); + + + protected FailedOpenURLTrackerService failedOpenURLTrackerService = + OpenURLTrackerLoggerServiceFactory.getInstance().getOpenUrlTrackerLoggerService(); + + protected ArrayList testProcessedUrls = DSpaceServicesFactory.getInstance().getServiceManager() + .getServiceByName("testProcessedUrls", + ArrayList.class); + + private ScriptService scriptService = ScriptServiceFactory.getInstance().getScriptService(); + + + /** + * Clean up the logged entries from the db after each test + */ + @After + @Override + public void destroy() { + try { + context.turnOffAuthorisationSystem(); + + List all = failedOpenURLTrackerService.findAll(context); + for (OpenURLTracker tracker : all) { + failedOpenURLTrackerService.remove(context, tracker); + } + + // Clear the list of processedUrls + testProcessedUrls.clear(); + + } catch (Exception e) { + log.error(e.getMessage(), e); + } 
finally {
+            try {
+                context.complete();
+            } catch (SQLException e) {
+                log.error(e);
+            }
+        }
+        super.destroy();
+    }
+
+    /**
+     * Test the mode of the script that allows the user to add a failed url to the database
+     *
+     * @throws Exception
+     */
+    @Test
+    public void testAddNewFailedUrl() throws Exception {
+
+        TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler();
+        ScriptConfiguration retryOpenUrlTrackerConfig = scriptService.getScriptConfiguration("retry-tracker");
+        DSpaceRunnable retryOpenUrlTracker =
+                scriptService.createDSpaceRunnableForScriptConfiguration(retryOpenUrlTrackerConfig);
+        String urlToAdd = "test-failed-url";
+        String[] args = {"-a", urlToAdd};
+
+        retryOpenUrlTracker.initialize(args, testDSpaceRunnableHandler, eperson);
+        retryOpenUrlTracker.internalRun();
+
+        List<OpenURLTracker> all = failedOpenURLTrackerService.findAll(context);
+
+        assertEquals(0, testProcessedUrls.size());
+        assertEquals(1, all.size());
+        assertEquals(urlToAdd, all.get(0).getUrl());
+    }
+
+    /**
+     * Test to check that all logged failed urls are reprocessed successfully and removed from the db
+     *
+     * @throws Exception
+     */
+    @Test
+    public void testReprocessAllUrls() throws Exception {
+
+        TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler();
+        ScriptConfiguration retryOpenUrlTrackerConfig = scriptService.getScriptConfiguration("retry-tracker");
+        DSpaceRunnable retryOpenUrlTracker =
+                scriptService.createDSpaceRunnableForScriptConfiguration(retryOpenUrlTrackerConfig);
+        String[] args = {"-r"};
+
+        OpenURLTracker tracker1 = failedOpenURLTrackerService.create(context);
+        tracker1.setUrl("test-url-1");
+        OpenURLTracker tracker2 = failedOpenURLTrackerService.create(context);
+        tracker2.setUrl("test-url-2");
+        OpenURLTracker tracker3 = failedOpenURLTrackerService.create(context);
+        tracker3.setUrl("test-url-3");
+
+
+        retryOpenUrlTracker.initialize(args, testDSpaceRunnableHandler, eperson);
+        retryOpenUrlTracker.internalRun();
+
+        List<OpenURLTracker> all = failedOpenURLTrackerService.findAll(context);
+
+        assertEquals(3, testProcessedUrls.size());
+        assertEquals(true, testProcessedUrls.contains("test-url-1"));
+        assertEquals(true, testProcessedUrls.contains("test-url-2"));
+        assertEquals(true, testProcessedUrls.contains("test-url-3"));
+
+        assertEquals(0, all.size());
+    }
+
+    /**
+     * Test to check that the successful retries are removed, but the failed retries remain in the db
+     *
+     * @throws Exception
+     */
+    @Test
+    public void testReprocessPartOfUrls() throws Exception {
+
+        TestDSpaceRunnableHandler testDSpaceRunnableHandler = new TestDSpaceRunnableHandler();
+        ScriptConfiguration retryOpenUrlTrackerConfig = scriptService.getScriptConfiguration("retry-tracker");
+        DSpaceRunnable retryOpenUrlTracker =
+                scriptService.createDSpaceRunnableForScriptConfiguration(retryOpenUrlTrackerConfig);
+        String[] args = {"-r"};
+
+        OpenURLTracker tracker1 = failedOpenURLTrackerService.create(context);
+        tracker1.setUrl("test-url-1");
+        OpenURLTracker tracker2 = failedOpenURLTrackerService.create(context);
+        tracker2.setUrl("test-url-2-fail");
+        OpenURLTracker tracker3 = failedOpenURLTrackerService.create(context);
+        tracker3.setUrl("test-url-3-fail");
+        OpenURLTracker tracker4 = failedOpenURLTrackerService.create(context);
+        tracker4.setUrl("test-url-4-fail");
+        OpenURLTracker tracker5 = failedOpenURLTrackerService.create(context);
+        tracker5.setUrl("test-url-5");
+
+
+        retryOpenUrlTracker.initialize(args, testDSpaceRunnableHandler, eperson);
+        retryOpenUrlTracker.internalRun();
+
+
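+        // Only the URLs containing "fail" are rejected by the mock OpenUrlService, so only they should remain logged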
List all = failedOpenURLTrackerService.findAll(context); + List storedTrackerUrls = new ArrayList<>(); + for (OpenURLTracker tracker : all) { + storedTrackerUrls.add(tracker.getUrl()); + } + + assertEquals(2, testProcessedUrls.size()); + assertEquals(true, testProcessedUrls.contains("test-url-1")); + assertEquals(true, testProcessedUrls.contains("test-url-5")); + + assertEquals(3, all.size()); + assertEquals(true, storedTrackerUrls.contains("test-url-2-fail")); + assertEquals(true, storedTrackerUrls.contains("test-url-3-fail")); + assertEquals(true, storedTrackerUrls.contains("test-url-4-fail")); + } + + +} diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/processor/BitstreamEventProcessorTest.java b/dspace-api/src/test/java/org/dspace/statistics/export/processor/BitstreamEventProcessorTest.java new file mode 100644 index 0000000000..62556d1594 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/statistics/export/processor/BitstreamEventProcessorTest.java @@ -0,0 +1,87 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.processor; + +import static org.hamcrest.core.Is.is; +import static org.junit.Assert.assertThat; +import static org.mockito.Mockito.mock; + +import java.io.File; +import java.io.FileInputStream; +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; +import javax.servlet.http.HttpServletRequest; + +import org.apache.commons.codec.CharEncoding; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.content.Bitstream; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.junit.Before; +import org.junit.Test; + +/** + * Test class for the BitstreamEventProcessor + */ +public class BitstreamEventProcessorTest extends AbstractIntegrationTestWithDatabase { + + private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + + + private String encodedUrl; + + + @Before + public void setUp() throws Exception { + super.setUp(); + configurationService.setProperty("irus.statistics.tracker.enabled", true); + + String dspaceUrl = configurationService.getProperty("dspace.server.url"); + try { + encodedUrl = URLEncoder.encode(dspaceUrl, CharEncoding.UTF_8); + } catch (UnsupportedEncodingException e) { + throw new AssertionError("Error occurred in setup()", e); + } + + } + + @Test + /** + * Test the method that adds data based on the object types + */ + public void testAddObectSpecificData() throws Exception { + HttpServletRequest request = mock(HttpServletRequest.class); + + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).build(); + + File f = new File(testProps.get("test.bitstream").toString()); + Bitstream bitstream = BitstreamBuilder.createBitstream(context, item, new FileInputStream(f)).build(); + + 
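+        // The bitstream built above supplies the ID expected in the svc_dat parameter asserted below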
context.restoreAuthSystemState(); + + BitstreamEventProcessor bitstreamEventProcessor = new BitstreamEventProcessor(context, request, bitstream); + + String result = bitstreamEventProcessor.addObjectSpecificData("existing-string", bitstream); + + assertThat(result, + is("existing-string&svc_dat=" + encodedUrl + "%2Fapi%2Fcore%2Fbitstreams%2F" + bitstream.getID() + + "%2Fcontent&rft_dat=Request")); + + } + +} diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/processor/ExportEventProcessorTest.java b/dspace-api/src/test/java/org/dspace/statistics/export/processor/ExportEventProcessorTest.java new file mode 100644 index 0000000000..5df7405ed4 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/statistics/export/processor/ExportEventProcessorTest.java @@ -0,0 +1,281 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.processor; + +import static org.hamcrest.CoreMatchers.startsWith; +import static org.hamcrest.core.Is.is; +import static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertThat; +import static org.junit.Assert.assertTrue; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.when; + +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; +import java.sql.SQLException; +import javax.servlet.http.HttpServletRequest; + +import org.apache.commons.codec.CharEncoding; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EntityTypeBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.WorkspaceItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.EntityType; +import org.dspace.content.Item; +import org.dspace.content.WorkspaceItem; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.junit.Before; +import org.junit.Test; +import org.mockito.Mock; + +/** + * Test for the ExportEventProcessor class + */ +public class ExportEventProcessorTest extends AbstractIntegrationTestWithDatabase { + + @Mock + private HttpServletRequest request = mock(HttpServletRequest.class); + + private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + + private EntityType publication; + private EntityType otherEntity; + private final String excluded_type = "Excluded type"; + + @Before + public void setUp() throws Exception { + super.setUp(); + + configurationService.setProperty("irus.statistics.tracker.urlversion", "Z39.88-2004"); + configurationService.setProperty("irus.statistics.tracker.enabled", true); + configurationService.setProperty("irus.statistics.tracker.type-field", "dc.type"); + configurationService.setProperty("irus.statistics.tracker.type-value", "Excluded type"); + + context.turnOffAuthorisationSystem(); + publication = EntityTypeBuilder.createEntityTypeBuilder(context, "Publication").build(); + otherEntity = EntityTypeBuilder.createEntityTypeBuilder(context, "Other").build(); + context.restoreAuthSystemState(); + + + } + + @Test + /** + * Test the getBaseParameters method + */ + public void testGetBaseParameters() throws UnsupportedEncodingException { + + 
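+        // Create a minimal archived item; its encoded handle must appear in the rft.artnum parameter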
context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).build(); + String encodedHandle = URLEncoder.encode(item.getHandle(), CharEncoding.UTF_8); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + when(request.getRemoteAddr()).thenReturn("test-client-ip"); + when(request.getHeader("USER-AGENT")).thenReturn("test-user-agent"); + when(request.getHeader("referer")).thenReturn("test-referer"); + + String result = exportEventProcessor.getBaseParameters(item); + String expected = "url_ver=Z39.88-2004&req_id=test-client-ip&req_dat=test-user-agent&rft.artnum=" + + "oai%3Alocalhost%3A" + encodedHandle + "&rfr_dat=test-referer&rfr_id=localhost&url_tim="; + + assertThat(result, startsWith(expected)); + + + } + + @Test + /** + * Test the ShouldProcessItem method where the item is null + */ + public void testShouldProcessItemWhenNull() throws SQLException { + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, null); + + boolean result = exportEventProcessor.shouldProcessItem(null); + assertThat(result, is(false)); + } + + @Test + /** + * Test the ShouldProcessItem method where the item is not archived + */ + public void testShouldProcessItemWhenNotArchived() throws SQLException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + WorkspaceItem workspaceItem = WorkspaceItemBuilder.createWorkspaceItem(context, collection).build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, workspaceItem.getItem()); + + boolean result = exportEventProcessor.shouldProcessItem(workspaceItem.getItem()); + assertFalse(result); + } + + @Test + /** + * Test the ShouldProcessItem method where the item can be edit by the current user + */ + public void testShouldProcessItemWhenCanEdit() throws SQLException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withRelationshipType(otherEntity.getLabel()).build(); + context.restoreAuthSystemState(); + + context.setCurrentUser(admin); + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessItem(item); + assertFalse(result); + + } + + @Test + /** + * Test the ShouldProcessItem method where the item type should be excluded + */ + public void testShouldProcessItemWhenShouldNotProcessType() throws Exception { + + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection) + .withType("Excluded type") + .withRelationshipType(publication.getLabel()) + .build(); + + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = 
exportEventProcessor.shouldProcessItem(item); + assertFalse(result); + + } + + @Test + /** + * Test the ShouldProcessItem method where the item entity type should not be processed + */ + public void testShouldProcessItemWhenShouldNotProcessEntity() throws SQLException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withRelationshipType(otherEntity.getLabel()).build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessItem(item); + assertFalse(result); + + } + + @Test + /** + * Test the ShouldProcessItem method where all conditions are met + */ + public void testShouldProcessItem() throws SQLException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withRelationshipType(publication.getLabel()).build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessItem(item); + assertTrue(result); + + } + + + @Test + /** + * Test the ShouldProcessEntityType method where all conditions are met + */ + public void testShouldProcessEntityType() throws SQLException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withRelationshipType(publication.getLabel()).build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessEntityType(item); + + assertTrue(result); + } + + @Test + /** + * Test the ShouldProcessEntityType method where the item entity type is not present in the configured list + */ + public void testShouldProcessEntityTypeWhenNotInList() throws SQLException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withRelationshipType(otherEntity.getLabel()).build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessEntityType(item); + + assertFalse(result); + + } + + + @Test + /** + * Test the shouldProcessItemType method where the item type is present in the list of excluded types + */ + public void testShouldProcessItemTypeInExcludeTrackerTypeList() { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withType(excluded_type).build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor 
= new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessItemType(item); + assertFalse(result); + + } + + @Test + /** + * Test the shouldProcessItemType method where the item type is not present in the list of excluded types + */ + public void testShouldProcessItemTypeNotInExcludeTrackerTypeList() { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).withType("Not excluded type").build(); + context.restoreAuthSystemState(); + + ExportEventProcessor exportEventProcessor = new ItemEventProcessor(context, request, item); + + boolean result = exportEventProcessor.shouldProcessItemType(item); + assertTrue(result); + + } + +} diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/processor/ItemEventProcessorTest.java b/dspace-api/src/test/java/org/dspace/statistics/export/processor/ItemEventProcessorTest.java new file mode 100644 index 0000000000..ded4546f26 --- /dev/null +++ b/dspace-api/src/test/java/org/dspace/statistics/export/processor/ItemEventProcessorTest.java @@ -0,0 +1,76 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.processor; + +import static org.hamcrest.core.Is.is; +import static org.junit.Assert.assertThat; + +import java.io.UnsupportedEncodingException; +import java.net.URLEncoder; + +import org.apache.commons.codec.CharEncoding; +import org.dspace.AbstractIntegrationTestWithDatabase; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.services.ConfigurationService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.junit.Before; +import org.junit.Test; + +/** + * Test class for the ItemEventProcessor + */ +public class ItemEventProcessorTest extends AbstractIntegrationTestWithDatabase { + + + private ConfigurationService configurationService = DSpaceServicesFactory.getInstance().getConfigurationService(); + + private String encodedUrl; + + @Before + public void setUp() throws Exception { + super.setUp(); + configurationService.setProperty("irus.statistics.tracker.enabled", true); + + String dspaceUrl = configurationService.getProperty("dspace.ui.url"); + try { + encodedUrl = URLEncoder.encode(dspaceUrl, CharEncoding.UTF_8); + } catch (UnsupportedEncodingException e) { + throw new AssertionError("Error occurred in setup()", e); + } + + } + + @Test + /** + * Test the method that adds data based on the object types + */ + public void testAddObectSpecificData() throws UnsupportedEncodingException { + context.turnOffAuthorisationSystem(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).build(); + context.restoreAuthSystemState(); + + String encodedHandle = URLEncoder.encode(item.getHandle(), CharEncoding.UTF_8); + + ItemEventProcessor itemEventProcessor = new ItemEventProcessor(context, null, item); + String 
result = itemEventProcessor.addObjectSpecificData("existing-string", item);
+
+        assertThat(result,
+                   is("existing-string&svc_dat=" + encodedUrl + "%2Fhandle%2F" + encodedHandle +
+                              "&rft_dat=Investigation"));
+
+    }
+
+
+}
diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/service/MockOpenUrlServiceImpl.java b/dspace-api/src/test/java/org/dspace/statistics/export/service/MockOpenUrlServiceImpl.java
new file mode 100644
index 0000000000..14ac9d36d5
--- /dev/null
+++ b/dspace-api/src/test/java/org/dspace/statistics/export/service/MockOpenUrlServiceImpl.java
@@ -0,0 +1,41 @@
+/**
+ * The contents of this file are subject to the license and copyright
+ * detailed in the LICENSE and NOTICE files at the root of the source
+ * tree and available online at
+ *
+ * http://www.dspace.org/license/
+ */
+package org.dspace.statistics.export.service;
+
+import java.io.IOException;
+import java.net.HttpURLConnection;
+import java.util.ArrayList;
+
+import org.apache.commons.lang.StringUtils;
+import org.springframework.beans.factory.annotation.Autowired;
+
+/**
+ * Mock OpenUrlService that ensures the IRUS tracker does not need to be contacted in order to test the functionality
+ */
+public class MockOpenUrlServiceImpl extends OpenUrlServiceImpl {
+
+    @Autowired
+    ArrayList testProcessedUrls;
+
+    /**
+     * Returns a response code to simulate contact to the external url.
+     * When the url contains "fail", a fail code 500 will be returned.
+     * Otherwise the success code 200 will be returned.
+     * @param urlStr
+     * @return 200 or 500 depending on whether the "fail" keyword is present in the url
+     * @throws IOException
+     */
+    protected int getResponseCodeFromUrl(final String urlStr) throws IOException {
+        if (StringUtils.contains(urlStr, "fail")) {
+            return HttpURLConnection.HTTP_INTERNAL_ERROR;
+        } else {
+            testProcessedUrls.add(urlStr);
+            return HttpURLConnection.HTTP_OK;
+        }
+    }
+}
diff --git a/dspace-api/src/test/java/org/dspace/statistics/export/service/OpenUrlServiceImplTest.java b/dspace-api/src/test/java/org/dspace/statistics/export/service/OpenUrlServiceImplTest.java
new file mode 100644
index 0000000000..192b771458
--- /dev/null
+++ b/dspace-api/src/test/java/org/dspace/statistics/export/service/OpenUrlServiceImplTest.java
@@ -0,0 +1,134 @@
+/**
+ * The contents of this file are subject to the license and copyright
+ * detailed in the LICENSE and NOTICE files at the root of the source
+ * tree and available online at
+ *
+ * http://www.dspace.org/license/
+ */
+package org.dspace.statistics.export.service;
+
+import static org.hamcrest.CoreMatchers.is;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.doCallRealMethod;
+import static org.mockito.Mockito.doNothing;
+import static org.mockito.Mockito.doReturn;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
+
+import java.io.IOException;
+import java.net.HttpURLConnection;
+import java.sql.SQLException;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.dspace.core.Context;
+import org.dspace.statistics.export.OpenURLTracker;
+import org.junit.Assert;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.mockito.InjectMocks;
+import org.mockito.Mock;
+import org.mockito.Spy;
+import org.mockito.junit.MockitoJUnitRunner;
+
+/**
+ * Test class for the OpenUrlServiceImpl
+ */
+@RunWith(MockitoJUnitRunner.class) +public class OpenUrlServiceImplTest { + + @InjectMocks + @Spy + private OpenUrlServiceImpl openUrlService; + + @Mock + private FailedOpenURLTrackerService failedOpenURLTrackerService; + + /** + * Test the processUrl method + * @throws IOException + * @throws SQLException + */ + @Test + public void testProcessUrl() throws IOException, SQLException { + Context context = mock(Context.class); + + doReturn(HttpURLConnection.HTTP_OK).when(openUrlService) + .getResponseCodeFromUrl(anyString()); + openUrlService.processUrl(context, "test-url"); + + verify(openUrlService, times(0)).logfailed(context, "test-url"); + + + } + + /** + * Test the processUrl method when the url connection fails + * @throws IOException + * @throws SQLException + */ + @Test + public void testProcessUrlOnFail() throws IOException, SQLException { + Context context = mock(Context.class); + + doReturn(HttpURLConnection.HTTP_INTERNAL_ERROR).when(openUrlService) + .getResponseCodeFromUrl(anyString()); + doNothing().when(openUrlService).logfailed(any(Context.class), anyString()); + + openUrlService.processUrl(context, "test-url"); + + verify(openUrlService, times(1)).logfailed(context, "test-url"); + + + } + + /** + * Test the ReprocessFailedQueue method + * @throws SQLException + */ + @Test + public void testReprocessFailedQueue() throws SQLException { + Context context = mock(Context.class); + + List trackers = new ArrayList<>(); + OpenURLTracker tracker1 = mock(OpenURLTracker.class); + OpenURLTracker tracker2 = mock(OpenURLTracker.class); + OpenURLTracker tracker3 = mock(OpenURLTracker.class); + + trackers.add(tracker1); + trackers.add(tracker2); + trackers.add(tracker3); + + when(failedOpenURLTrackerService.findAll(any(Context.class))).thenReturn(trackers); + doNothing().when(openUrlService).tryReprocessFailed(any(Context.class), any(OpenURLTracker.class)); + + openUrlService.reprocessFailedQueue(context); + + verify(openUrlService, times(3)).tryReprocessFailed(any(Context.class), any(OpenURLTracker.class)); + + } + + /** + * Test the method that logs the failed urls in the db + * @throws SQLException + */ + @Test + public void testLogfailed() throws SQLException { + Context context = mock(Context.class); + OpenURLTracker tracker1 = mock(OpenURLTracker.class); + + doCallRealMethod().when(tracker1).setUrl(anyString()); + when(tracker1.getUrl()).thenCallRealMethod(); + + when(failedOpenURLTrackerService.create(any(Context.class))).thenReturn(tracker1); + + String failedUrl = "failed-url"; + openUrlService.logfailed(context, failedUrl); + + Assert.assertThat(tracker1.getUrl(), is(failedUrl)); + + } +} diff --git a/dspace-api/src/test/java/org/dspace/xmlworkflow/XmlWorkflowFactoryTest.java b/dspace-api/src/test/java/org/dspace/xmlworkflow/XmlWorkflowFactoryTest.java index a19e6a2622..03a6a0e949 100644 --- a/dspace-api/src/test/java/org/dspace/xmlworkflow/XmlWorkflowFactoryTest.java +++ b/dspace-api/src/test/java/org/dspace/xmlworkflow/XmlWorkflowFactoryTest.java @@ -10,8 +10,10 @@ package org.dspace.xmlworkflow; import static junit.framework.TestCase.assertEquals; import static org.junit.Assert.fail; +import java.io.IOException; import java.sql.SQLException; +import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.dspace.AbstractUnitTest; import org.dspace.authorize.AuthorizeException; @@ -35,9 +37,11 @@ import org.junit.Test; */ public class XmlWorkflowFactoryTest extends AbstractUnitTest { - private CollectionService collectionService = 
ContentServiceFactory.getInstance().getCollectionService(); - private CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService(); - private XmlWorkflowFactory xmlWorkflowFactory + private final CollectionService collectionService + = ContentServiceFactory.getInstance().getCollectionService(); + private final CommunityService communityService + = ContentServiceFactory.getInstance().getCommunityService(); + private final XmlWorkflowFactory xmlWorkflowFactory = new DSpace().getServiceManager().getServiceByName("xmlWorkflowFactory", XmlWorkflowFactoryImpl.class); private Community owningCommunity; @@ -47,7 +51,7 @@ public class XmlWorkflowFactoryTest extends AbstractUnitTest { /** * log4j category */ - private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(XmlWorkflowFactoryTest.class); + private static final Logger log = LogManager.getLogger(XmlWorkflowFactoryTest.class); /** * This method will be run before every test as per @Before. It will @@ -94,7 +98,7 @@ public class XmlWorkflowFactoryTest extends AbstractUnitTest { this.collectionService.delete(context, this.nonMappedCollection); this.collectionService.delete(context, this.mappedCollection); this.communityService.delete(context, this.owningCommunity); - } catch (Exception e) { + } catch (IOException | SQLException | AuthorizeException e) { log.error("Error in destroy", e); } @@ -112,12 +116,12 @@ public class XmlWorkflowFactoryTest extends AbstractUnitTest { @Test public void workflowMapping_NonMappedCollection() throws WorkflowConfigurationException { Workflow workflow = xmlWorkflowFactory.getWorkflow(this.nonMappedCollection); - assertEquals(workflow.getID(), "defaultWorkflow"); + assertEquals(XmlWorkflowFactoryImpl.LEGACY_WORKFLOW_NAME, workflow.getID()); } @Test public void workflowMapping_MappedCollection() throws WorkflowConfigurationException { Workflow workflow = xmlWorkflowFactory.getWorkflow(this.mappedCollection); - assertEquals(workflow.getID(), "selectSingleReviewer"); + assertEquals( "selectSingleReviewer", workflow.getID()); } } diff --git a/dspace-oai/pom.xml b/dspace-oai/pom.xml index d5a129c90a..05ee3dc7df 100644 --- a/dspace-oai/pom.xml +++ b/dspace-oai/pom.xml @@ -8,7 +8,7 @@ dspace-parent org.dspace - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. diff --git a/dspace-rdf/pom.xml b/dspace-rdf/pom.xml index 47fc5cd204..c483ea5a91 100644 --- a/dspace-rdf/pom.xml +++ b/dspace-rdf/pom.xml @@ -9,7 +9,7 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. diff --git a/dspace-rdf/src/main/java/org/dspace/rdf/providing/LocalURIRedirectionServlet.java b/dspace-rdf/src/main/java/org/dspace/rdf/providing/LocalURIRedirectionServlet.java index b6a6854938..7224bb9bfb 100644 --- a/dspace-rdf/src/main/java/org/dspace/rdf/providing/LocalURIRedirectionServlet.java +++ b/dspace-rdf/src/main/java/org/dspace/rdf/providing/LocalURIRedirectionServlet.java @@ -86,7 +86,8 @@ public class LocalURIRedirectionServlet extends HttpServlet { response.sendError(HttpServletResponse.SC_NOT_FOUND); return; } - + // use object's reported handle for redirect (just in case user provided handle had odd characters) + handle = dso.getHandle(); // close the context and send forward. 
context.abort(); Negotiator.sendRedirect(response, handle, "", requestedMimeType, true); diff --git a/dspace-rest/pom.xml b/dspace-rest/pom.xml index 1038617b49..be80c8c159 100644 --- a/dspace-rest/pom.xml +++ b/dspace-rest/pom.xml @@ -3,7 +3,7 @@ org.dspace dspace-rest war - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT DSpace (Deprecated) REST Webapp DSpace RESTful Web Services API. NOTE: this REST API is DEPRECATED. Please consider using the REST API in the dspace-server-webapp instead! @@ -12,7 +12,7 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. diff --git a/dspace-rest/src/main/java/org/dspace/rest/CollectionsResource.java b/dspace-rest/src/main/java/org/dspace/rest/CollectionsResource.java index af06792b7b..66919ad5c7 100644 --- a/dspace-rest/src/main/java/org/dspace/rest/CollectionsResource.java +++ b/dspace-rest/src/main/java/org/dspace/rest/CollectionsResource.java @@ -274,16 +274,16 @@ public class CollectionsResource extends Resource { headers, request, context); items = new ArrayList(); - Iterator dspaceItems = itemService.findByCollection(context, dspaceCollection); - for (int i = 0; (dspaceItems.hasNext()) && (i < (limit + offset)); i++) { + Iterator dspaceItems = itemService.findByCollection(context, dspaceCollection, + limit, offset); + + while (dspaceItems.hasNext()) { org.dspace.content.Item dspaceItem = dspaceItems.next(); - if (i >= offset) { - if (itemService.isItemListedForUser(context, dspaceItem)) { - items.add(new Item(dspaceItem, servletContext, expand, context)); - writeStats(dspaceItem, UsageEvent.Action.VIEW, user_ip, user_agent, xforwardedfor, - headers, request, context); - } + if (itemService.isItemListedForUser(context, dspaceItem)) { + items.add(new Item(dspaceItem, servletContext, expand, context)); + writeStats(dspaceItem, UsageEvent.Action.VIEW, user_ip, user_agent, xforwardedfor, + headers, request, context); } } diff --git a/dspace-rest/src/main/webapp/WEB-INF/applicationContext.xml b/dspace-rest/src/main/webapp/WEB-INF/applicationContext.xml index 62b660b86b..ec892fbaa4 100644 --- a/dspace-rest/src/main/webapp/WEB-INF/applicationContext.xml +++ b/dspace-rest/src/main/webapp/WEB-INF/applicationContext.xml @@ -28,7 +28,7 @@ - - test-environment - - false - - maven.test.skip - false - - - - - - - maven-dependency-plugin - - ${project.build.directory}/testing - - - org.dspace - dspace-parent - ${project.version} - zip - testEnvironment - - - - - - setupTestEnvironment - generate-test-resources - - unpack - - - - setupIntegrationTestEnvironment - pre-integration-test - - unpack - - - - - - - - org.codehaus.gmaven - groovy-maven-plugin - - - setproperty - generate-test-resources - - - execute - - - - project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); - println("Initializing Maven property 'agnostic.build.dir' to: " + project.properties['agnostic.build.dir']); - - - - - - - - - maven-surefire-plugin - - - - - ${agnostic.build.dir}/testing/dspace/ - - true - ${agnostic.build.dir}/testing/dspace/solr/ - - - - - - - maven-failsafe-plugin - - - - ${agnostic.build.dir}/testing/dspace/ - - true - ${agnostic.build.dir}/testing/dspace/solr/ - - - - - - - - - @@ -187,9 +72,160 @@ + + + org.codehaus.gmaven + groovy-maven-plugin + + + setproperty + initialize + + execute + + + + project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); + log.info("Initializing Maven property 'agnostic.build.dir' to: {}", project.properties['agnostic.build.dir']); + + + + + + + + + 
unit-test-environment + + false + + skipUnitTests + false + + + + + + + maven-dependency-plugin + + ${project.build.directory}/testing + + + org.dspace + dspace-parent + ${project.version} + zip + testEnvironment + + + + + + setupUnitTestEnvironment + generate-test-resources + + unpack + + + + + + + + maven-surefire-plugin + + + + + ${agnostic.build.dir}/testing/dspace/ + + true + ${agnostic.build.dir}/testing/dspace/solr/ + + + + + + + + + + + integration-test-environment + + false + + skipIntegrationTests + false + + + + + + + maven-dependency-plugin + + ${project.build.directory}/testing + + + org.dspace + dspace-parent + ${project.version} + zip + testEnvironment + + + + + + setupIntegrationTestEnvironment + pre-integration-test + + unpack + + + + + + + + maven-failsafe-plugin + + + + ${agnostic.build.dir}/testing/dspace/ + + true + ${agnostic.build.dir}/testing/dspace/solr/ + + + + + + + + + + @@ -307,6 +343,13 @@ dspace-api + + org.dspace + dspace-api + test-jar + test + + org.dspace dspace-services @@ -460,6 +503,14 @@ solr-cell test + + org.bouncycastle + bcpkix-jdk15on + + + org.bouncycastle + bcprov-jdk15on + org.eclipse.jetty jetty-continuation @@ -524,13 +575,11 @@ org.apache.lucene lucene-analyzers-smartcn - ${solr.client.version} test org.apache.lucene lucene-analyzers-stempel - ${solr.client.version} test diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/Application.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/Application.java index 18d06c87e8..a2ea0d1c3c 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/Application.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/Application.java @@ -7,6 +7,8 @@ */ package org.dspace.app.rest; +import java.io.IOException; +import java.sql.SQLException; import java.util.List; import javax.servlet.Filter; @@ -16,6 +18,7 @@ import org.dspace.app.rest.parameter.resolver.SearchFilterResolver; import org.dspace.app.rest.utils.ApplicationConfig; import org.dspace.app.rest.utils.DSpaceConfigurationInitializer; import org.dspace.app.rest.utils.DSpaceKernelInitializer; +import org.dspace.app.sitemap.GenerateSitemaps; import org.dspace.app.util.DSpaceContextListener; import org.dspace.utils.servlet.DSpaceWebappServletFilter; import org.slf4j.Logger; @@ -28,6 +31,8 @@ import org.springframework.context.annotation.Bean; import org.springframework.core.annotation.Order; import org.springframework.hateoas.server.LinkRelationProvider; import org.springframework.lang.NonNull; +import org.springframework.scheduling.annotation.EnableScheduling; +import org.springframework.scheduling.annotation.Scheduled; import org.springframework.web.context.request.RequestContextListener; import org.springframework.web.cors.CorsConfiguration; import org.springframework.web.method.support.HandlerMethodArgumentResolver; @@ -49,6 +54,7 @@ import org.springframework.web.servlet.config.annotation.WebMvcConfigurer; * @author Tim Donohue */ @SpringBootApplication +@EnableScheduling public class Application extends SpringBootServletInitializer { private static final Logger log = LoggerFactory.getLogger(Application.class); @@ -56,6 +62,11 @@ public class Application extends SpringBootServletInitializer { @Autowired private ApplicationConfig configuration; + @Scheduled(cron = "${sitemap.cron:-}") + public void generateSitemap() throws IOException, SQLException { + GenerateSitemaps.generateSitemapsScheduled(); + } + /** * Override the default SpringBootServletInitializer.configure() method, * passing it this 
Application class. @@ -140,11 +151,11 @@ public class Application extends SpringBootServletInitializer { // Set Access-Control-Allow-Credentials to "true" and specify which origins are valid // for our Access-Control-Allow-Origin header .allowCredentials(corsAllowCredentials).allowedOrigins(corsAllowedOrigins) - // Whitelist of request preflight headers allowed to be sent to us from the client + // Allow list of request preflight headers allowed to be sent to us from the client .allowedHeaders("Authorization", "Content-Type", "X-Requested-With", "accept", "Origin", "Access-Control-Request-Method", "Access-Control-Request-Headers", "X-On-Behalf-Of") - // Whitelist of response headers allowed to be sent by us (the server) + // Allow list of response headers allowed to be sent by us (the server) .exposedHeaders("Access-Control-Allow-Origin", "Access-Control-Allow-Credentials", "Authorization"); } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/AuthenticationRestController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/AuthenticationRestController.java index 68f9085e21..3038011009 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/AuthenticationRestController.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/AuthenticationRestController.java @@ -16,10 +16,13 @@ import org.dspace.app.rest.converter.ConverterService; import org.dspace.app.rest.converter.EPersonConverter; import org.dspace.app.rest.link.HalLinkService; import org.dspace.app.rest.model.AuthenticationStatusRest; +import org.dspace.app.rest.model.AuthenticationTokenRest; import org.dspace.app.rest.model.AuthnRest; import org.dspace.app.rest.model.EPersonRest; import org.dspace.app.rest.model.hateoas.AuthenticationStatusResource; +import org.dspace.app.rest.model.hateoas.AuthenticationTokenResource; import org.dspace.app.rest.model.hateoas.AuthnResource; +import org.dspace.app.rest.model.wrapper.AuthenticationToken; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.security.RestAuthenticationService; import org.dspace.app.rest.utils.ContextUtil; @@ -32,6 +35,7 @@ import org.springframework.beans.factory.annotation.Autowired; import org.springframework.hateoas.Link; import org.springframework.http.HttpStatus; import org.springframework.http.ResponseEntity; +import org.springframework.security.access.prepost.PreAuthorize; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RequestMethod; import org.springframework.web.bind.annotation.RequestParam; @@ -118,6 +122,30 @@ public class AuthenticationRestController implements InitializingBean { "valid."); } + /** + * This method will generate a short lived token to be used for bitstream downloads among other things. + * + * curl -v -X POST https://{dspace-server.url}/api/authn/shortlivedtokens -H "Authorization: Bearer eyJhbG...COdbo" + * + * Example: + *

+     * <pre>
+     * {@code
+     * curl -v -X POST https://{dspace-server.url}/api/authn/shortlivedtokens -H "Authorization: Bearer eyJhbG...COdbo"
+     * }
+     * </pre>
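+     *
+     * A hedged usage sketch, for context only: the returned token is meant to be sent on a follow-up request that
+     * cannot carry the Authorization header, such as a bitstream download. The query parameter name below comes
+     * from the REST contract and is an assumption here, not something enforced by this method:
+     * <pre>
+     * {@code
+     * curl -v "https://{dspace-server.url}/api/core/bitstreams/{bitstream-uuid}/content?authentication-token=eyJhbG..."
+     * }
+     * </pre>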
    + * @param request The StandardMultipartHttpServletRequest + * @return The created short lived token + */ + @PreAuthorize("hasAuthority('AUTHENTICATED')") + @RequestMapping(value = "/shortlivedtokens", method = RequestMethod.POST) + public AuthenticationTokenResource shortLivedToken(HttpServletRequest request) { + Projection projection = utils.obtainProjection(); + AuthenticationToken shortLivedToken = + restAuthenticationService.getShortLivedAuthenticationToken(ContextUtil.obtainContext(request), request); + AuthenticationTokenRest authenticationTokenRest = converter.toRest(shortLivedToken, projection); + return converter.toResource(authenticationTokenRest); + } + @RequestMapping(value = "/login", method = { RequestMethod.GET, RequestMethod.PUT, RequestMethod.PATCH, RequestMethod.DELETE }) public ResponseEntity login() { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/DiscoveryRestController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/DiscoveryRestController.java index df1598b96a..d167d2a84d 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/DiscoveryRestController.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/DiscoveryRestController.java @@ -7,6 +7,8 @@ */ package org.dspace.app.rest; +import static org.apache.commons.collections4.ListUtils.emptyIfNull; + import java.util.Arrays; import java.util.List; import java.util.Objects; @@ -100,51 +102,55 @@ public class DiscoveryRestController implements InitializingBean { @RequestMapping(method = RequestMethod.GET, value = "/search/facets") public FacetsResource getFacets(@RequestParam(name = "query", required = false) String query, - @RequestParam(name = "dsoType", required = false) String dsoType, + @RequestParam(name = "dsoType", required = false) List dsoTypes, @RequestParam(name = "scope", required = false) String dsoScope, @RequestParam(name = "configuration", required = false) String configuration, List searchFilters, Pageable page) throws Exception { + dsoTypes = emptyIfNull(dsoTypes); + if (log.isTraceEnabled()) { log.trace("Searching with scope: " + StringUtils.trimToEmpty(dsoScope) - + ", configuration name: " + StringUtils.trimToEmpty(configuration) - + ", dsoType: " + StringUtils.trimToEmpty(dsoType) - + ", query: " + StringUtils.trimToEmpty(query) - + ", filters: " + Objects.toString(searchFilters)); + + ", configuration name: " + StringUtils.trimToEmpty(configuration) + + ", dsoTypes: " + String.join(", ", dsoTypes) + + ", query: " + StringUtils.trimToEmpty(query) + + ", filters: " + Objects.toString(searchFilters)); } SearchResultsRest searchResultsRest = discoveryRestRepository - .getAllFacets(query, dsoType, dsoScope, configuration, searchFilters); + .getAllFacets(query, dsoTypes, dsoScope, configuration, searchFilters); FacetsResource facetsResource = new FacetsResource(searchResultsRest, page); halLinkService.addLinks(facetsResource, page); return facetsResource; - - } @RequestMapping(method = RequestMethod.GET, value = "/search/objects") public SearchResultsResource getSearchObjects(@RequestParam(name = "query", required = false) String query, - @RequestParam(name = "dsoType", required = false) String dsoType, + @RequestParam(name = "dsoType", required = false) + List dsoTypes, @RequestParam(name = "scope", required = false) String dsoScope, @RequestParam(name = "configuration", required = false) String configuration, List searchFilters, Pageable page) throws Exception { + + dsoTypes = emptyIfNull(dsoTypes); + if (log.isTraceEnabled()) { 
log.trace("Searching with scope: " + StringUtils.trimToEmpty(dsoScope) - + ", configuration name: " + StringUtils.trimToEmpty(configuration) - + ", dsoType: " + StringUtils.trimToEmpty(dsoType) - + ", query: " + StringUtils.trimToEmpty(query) - + ", filters: " + Objects.toString(searchFilters) - + ", page: " + Objects.toString(page)); + + ", configuration name: " + StringUtils.trimToEmpty(configuration) + + ", dsoTypes: " + String.join(", ", dsoTypes) + + ", query: " + StringUtils.trimToEmpty(query) + + ", filters: " + Objects.toString(searchFilters) + + ", page: " + Objects.toString(page)); } //Get the Search results in JSON format SearchResultsRest searchResultsRest = discoveryRestRepository - .getSearchObjects(query, dsoType, dsoScope, configuration, searchFilters, page, utils.obtainProjection()); + .getSearchObjects(query, dsoTypes, dsoScope, configuration, searchFilters, page, utils.obtainProjection()); //Convert the Search JSON results to paginated HAL resources SearchResultsResource searchResultsResource = new SearchResultsResource(searchResultsRest, utils, page); @@ -174,15 +180,18 @@ public class DiscoveryRestController implements InitializingBean { public RepresentationModel getFacetValues(@PathVariable("name") String facetName, @RequestParam(name = "prefix", required = false) String prefix, @RequestParam(name = "query", required = false) String query, - @RequestParam(name = "dsoType", required = false) String dsoType, + @RequestParam(name = "dsoType", required = false) List dsoTypes, @RequestParam(name = "scope", required = false) String dsoScope, @RequestParam(name = "configuration", required = false) String configuration, List searchFilters, Pageable page) throws Exception { + + dsoTypes = emptyIfNull(dsoTypes); + if (log.isTraceEnabled()) { log.trace("Facetting on facet " + facetName + " with scope: " + StringUtils.trimToEmpty(dsoScope) - + ", dsoType: " + StringUtils.trimToEmpty(dsoType) + + ", dsoTypes: " + String.join(", ", dsoTypes) + ", prefix: " + StringUtils.trimToEmpty(prefix) + ", query: " + StringUtils.trimToEmpty(query) + ", filters: " + Objects.toString(searchFilters) @@ -190,7 +199,7 @@ public class DiscoveryRestController implements InitializingBean { } FacetResultsRest facetResultsRest = discoveryRestRepository - .getFacetObjects(facetName, prefix, query, dsoType, dsoScope, configuration, searchFilters, page); + .getFacetObjects(facetName, prefix, query, dsoTypes, dsoScope, configuration, searchFilters, page); FacetResultsResource facetResultsResource = converter.toResource(facetResultsRest); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/OpenSearchController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/OpenSearchController.java index 42ad173f2e..62c6a9c573 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/OpenSearchController.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/OpenSearchController.java @@ -34,6 +34,7 @@ import org.dspace.content.service.CollectionService; import org.dspace.content.service.CommunityService; import org.dspace.core.Context; import org.dspace.core.LogManager; +import org.dspace.core.Utils; import org.dspace.discovery.DiscoverQuery; import org.dspace.discovery.DiscoverResult; import org.dspace.discovery.IndexableObject; @@ -103,7 +104,8 @@ public class OpenSearchController { // do some sanity checking if (!openSearchService.getFormats().contains(format)) { - String err = "Format " + format + " is not supported."; + // Since we are returning error response as 
HTML, escape any HTML in "format" param + String err = "Format " + Utils.addEntities(format) + " is not supported."; response.setContentType("text/html"); response.setContentLength(err.length()); response.getWriter().write(err); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/RestResourceController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/RestResourceController.java index b501ba4406..9e14df2ec3 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/RestResourceController.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/RestResourceController.java @@ -149,7 +149,7 @@ public class RestResourceController implements InitializingBean { * @return single DSpaceResource */ @RequestMapping(method = RequestMethod.GET, value = REGEX_REQUESTMAPPING_IDENTIFIER_AS_DIGIT) - public DSpaceResource findOne(@PathVariable String apiCategory, @PathVariable String model, + public HALResource findOne(@PathVariable String apiCategory, @PathVariable String model, @PathVariable Integer id) { return findOneInternal(apiCategory, model, id); } @@ -180,7 +180,7 @@ public class RestResourceController implements InitializingBean { * @return single DSpaceResource */ @RequestMapping(method = RequestMethod.GET, value = REGEX_REQUESTMAPPING_IDENTIFIER_AS_STRING_VERSION_STRONG) - public DSpaceResource findOne(@PathVariable String apiCategory, @PathVariable String model, + public HALResource findOne(@PathVariable String apiCategory, @PathVariable String model, @PathVariable String id) { return findOneInternal(apiCategory, model, id); } @@ -200,7 +200,7 @@ public class RestResourceController implements InitializingBean { * @return single DSpaceResource */ @RequestMapping(method = RequestMethod.GET, value = REGEX_REQUESTMAPPING_IDENTIFIER_AS_UUID) - public DSpaceResource findOne(@PathVariable String apiCategory, @PathVariable String model, + public HALResource findOne(@PathVariable String apiCategory, @PathVariable String model, @PathVariable UUID uuid) { return findOneInternal(apiCategory, model, uuid); } @@ -213,7 +213,7 @@ public class RestResourceController implements InitializingBean { * @param id Identifier from request * @return single DSpaceResource */ - private DSpaceResource findOneInternal(String apiCategory, + private HALResource findOneInternal(String apiCategory, String model, ID id) { DSpaceRestRepository repository = utils.getResourceRepository(apiCategory, model); Optional modelObject = Optional.empty(); @@ -624,7 +624,7 @@ public class RestResourceController implements InitializingBean { HttpServletRequest request, @PathVariable String apiCategory, @PathVariable String model, - @RequestParam("file") MultipartFile uploadfile) + @RequestParam("file") List uploadfile) throws SQLException, FileNotFoundException, IOException, AuthorizeException { checkModelPluralForm(apiCategory, model); @@ -799,7 +799,7 @@ public class RestResourceController implements InitializingBean { Method linkMethod = utils.requireMethod(linkRepository.getClass(), linkRest.method()); try { if (Page.class.isAssignableFrom(linkMethod.getReturnType())) { - Page pageResult = (Page) linkMethod + Page pageResult = (Page) linkMethod .invoke(linkRepository, request, uuid, page, utils.obtainProjection()); if (pageResult == null) { @@ -823,8 +823,8 @@ public class RestResourceController implements InitializingBean { return new EntityModel(new EmbeddedPage(link.getHref(), pageResult.map(converter::toResource), null, subpath)); } else { - RestModel object = (RestModel) 
linkMethod.invoke(linkRepository, request, uuid, page, - utils.obtainProjection()); + RestModel object = (RestModel) linkMethod.invoke(linkRepository, request, + uuid, page, utils.obtainProjection()); if (object == null) { response.setStatus(HttpServletResponse.SC_NO_CONTENT); return null; @@ -846,7 +846,7 @@ public class RestResourceController implements InitializingBean { throw new RuntimeException(e); } } - RestAddressableModel modelObject = repository.findById(uuid).orElse(null); + RestModel modelObject = repository.findById(uuid).orElse(null); if (modelObject == null) { throw new ResourceNotFoundException(apiCategory + "." + model + " with id: " + uuid + " not found"); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/ShibbolethRestController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/ShibbolethRestController.java index 7355bab2a8..159170f8b2 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/ShibbolethRestController.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/ShibbolethRestController.java @@ -8,10 +8,13 @@ package org.dspace.app.rest; import java.io.IOException; +import java.util.ArrayList; import java.util.Arrays; import javax.servlet.http.HttpServletResponse; +import org.apache.commons.lang3.StringUtils; import org.dspace.app.rest.model.AuthnRest; +import org.dspace.core.Utils; import org.dspace.services.ConfigurationService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -47,14 +50,35 @@ public class ShibbolethRestController implements InitializingBean { .register(this, Arrays.asList(new Link("/api/" + AuthnRest.CATEGORY, "shibboleth"))); } + // LGTM.com thinks this method has an unvalidated URL redirect (https://lgtm.com/rules/4840088/) in `redirectUrl`, + // even though we are clearly validating the hostname of `redirectUrl` and test it in ShibbolethRestControllerIT + @SuppressWarnings("lgtm[java/unvalidated-url-redirection]") @RequestMapping(method = RequestMethod.GET) public void shibboleth(HttpServletResponse response, @RequestParam(name = "redirectUrl", required = false) String redirectUrl) throws IOException { if (redirectUrl == null) { redirectUrl = configurationService.getProperty("dspace.ui.url"); } - log.info("Redirecting to " + redirectUrl); - response.sendRedirect(redirectUrl); + + // Validate that the redirectURL matches either the server or UI hostname. It *cannot* be an arbitrary URL. + String redirectHostName = Utils.getHostName(redirectUrl); + String serverHostName = Utils.getHostName(configurationService.getProperty("dspace.server.url")); + ArrayList allowedHostNames = new ArrayList(); + allowedHostNames.add(serverHostName); + String[] allowedUrls = configurationService.getArrayProperty("rest.cors.allowed-origins"); + for (String url : allowedUrls) { + allowedHostNames.add(Utils.getHostName(url)); + } + + if (StringUtils.equalsAnyIgnoreCase(redirectHostName, allowedHostNames.toArray(new String[0]))) { + log.debug("Shibboleth redirecting to " + redirectUrl); + response.sendRedirect(redirectUrl); + } else { + log.error("Invalid Shibboleth redirectURL=" + redirectUrl + + ". URL doesn't match hostname of server or UI!"); + response.sendError(HttpServletResponse.SC_BAD_REQUEST, + "Invalid redirectURL! 
Must match server or ui hostname."); + } } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/SitemapRestController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/SitemapRestController.java new file mode 100644 index 0000000000..4eef1ba34b --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/SitemapRestController.java @@ -0,0 +1,148 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import java.io.File; +import java.io.FileInputStream; +import java.io.IOException; +import java.io.InputStream; +import java.nio.file.Files; +import java.sql.SQLException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; + +import org.apache.catalina.connector.ClientAbortException; +import org.apache.logging.log4j.Logger; +import org.dspace.app.rest.utils.ContextUtil; +import org.dspace.app.rest.utils.MultipartFileSender; +import org.dspace.core.Context; +import org.dspace.services.ConfigurationService; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.rest.webmvc.ResourceNotFoundException; +import org.springframework.stereotype.Controller; +import org.springframework.web.bind.annotation.GetMapping; +import org.springframework.web.bind.annotation.PathVariable; +import org.springframework.web.bind.annotation.RequestMapping; + +/** + * This is a specialized controller to provide access to the sitemap files, generated by + * {@link org.dspace.app.sitemap.GenerateSitemaps} + * + * The mapping for requested endpoint try to resolve a valid sitemap file name, for example + *
+ * <pre>
+ * {@code
+ * https://<dspace.server.url>/sitemaps/26453b4d-e513-44e8-8d5b-395f62972eff/sitemap0.html
+ * }
+ * </pre>
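+ *
+ * As a hedged pointer: the files served here are produced by the generate-sitemaps script, or by the scheduled
+ * task enabled through the {@code sitemap.cron} property, and are read from the directory named by
+ * {@code sitemap.dir}. Both property names appear elsewhere in this patch; the values below are only
+ * illustrative, not defaults asserted by this class:
+ * <pre>
+ * {@code
+ * # local.cfg (example values)
+ * sitemap.dir = ${dspace.dir}/sitemaps
+ * sitemap.cron = 0 15 1 * * ?
+ * }
+ * </pre>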
    + * + * @author Maria Verdonck (Atmire) on 08/07/2020 + */ +@Controller +@RequestMapping("/${sitemap.path:sitemaps}") +public class SitemapRestController { + + private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(SitemapRestController.class); + + @Autowired + ConfigurationService configurationService; + + // Most file systems are configured to use block sizes of 4096 or 8192 and our buffer should be a multiple of that. + private static final int BUFFER_SIZE = 4096 * 10; + + /** + * Tries to retrieve a matching sitemap file in configured location + * + * @param name the name of the requested sitemap file + * @param response the HTTP response + * @param request the HTTP request + * @throws SQLException if db error while completing DSpace context + * @throws IOException if IO error surrounding sitemap file + */ + @GetMapping("/{name}") + public void retrieve(@PathVariable String name, HttpServletResponse response, + HttpServletRequest request) throws IOException, SQLException { + // Find sitemap with given name in dspace/sitemaps + File foundSitemapFile = null; + File sitemapOutputDir = new File(configurationService.getProperty("sitemap.dir")); + if (sitemapOutputDir.exists() && sitemapOutputDir.isDirectory()) { + // List of all files and directories inside sitemapOutputDir + File sitemapFilesList[] = sitemapOutputDir.listFiles(); + for (File sitemapFile : sitemapFilesList) { + if (name.equalsIgnoreCase(sitemapFile.getName())) { + if (sitemapFile.isFile()) { + foundSitemapFile = sitemapFile; + } else { + throw new ResourceNotFoundException( + "Directory with name " + name + " in " + sitemapOutputDir.getAbsolutePath() + + " found, but no file."); + } + } + } + } else { + throw new ResourceNotFoundException( + "Sitemap directory in " + sitemapOutputDir.getAbsolutePath() + " does not " + + "exist, either sitemaps have not been generated (./dspace generate-sitemaps)," + + " or are located elsewhere (config used: sitemap.dir)."); + } + if (foundSitemapFile == null) { + throw new ResourceNotFoundException( + "Could not find sitemap file with name " + name + " in " + sitemapOutputDir.getAbsolutePath()); + } else { + // return found sitemap file + this.returnSitemapFile(foundSitemapFile, response, request); + } + } + + /** + * Sends back the matching sitemap file as a MultipartFile, with the headers set with details of the file + * (content, size, name, last modified) + * + * @param foundSitemapFile the found sitemap file, with matching name as in request path + * @param response the HTTP response + * @param request the HTTP request + * @throws SQLException if db error while completing DSpace context + * @throws IOException if IO error surrounding sitemap file + */ + private void returnSitemapFile(File foundSitemapFile, HttpServletResponse response, HttpServletRequest request) + throws SQLException, IOException { + // Pipe the bits + try (InputStream is = new FileInputStream(foundSitemapFile)) { + MultipartFileSender sender = MultipartFileSender + .fromInputStream(is) + .withBufferSize(BUFFER_SIZE) + .withFileName(foundSitemapFile.getName()) + .withLength(foundSitemapFile.length()) + .withMimetype(Files.probeContentType(foundSitemapFile.toPath())) + .with(request) + .with(response); + + sender.withLastModified(foundSitemapFile.lastModified()); + + // Determine if we need to send the file as a download or if the browser can open it inline + long dispositionThreshold = configurationService.getLongProperty("webui.content_disposition_threshold"); + if (dispositionThreshold 
>= 0 && foundSitemapFile.length() > dispositionThreshold) { + sender.withDisposition(MultipartFileSender.CONTENT_DISPOSITION_ATTACHMENT); + } + + Context context = ContextUtil.obtainContext(request); + + // We have all the data we need, close the connection to the database so that it doesn't stay open during + // download/streaming + context.complete(); + + // Send the data + if (sender.isValid()) { + sender.serveResource(); + } + + } catch (ClientAbortException e) { + log.debug("Client aborted the request before the download was completed. " + + "Client is probably switching to a Range request.", e); + } + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/StatisticsRestController.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/StatisticsRestController.java index 5fbf053588..77cae6f596 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/StatisticsRestController.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/StatisticsRestController.java @@ -12,7 +12,6 @@ import java.util.UUID; import org.dspace.app.rest.converter.ConverterService; import org.dspace.app.rest.exception.RepositoryMethodNotImplementedException; -import org.dspace.app.rest.link.HalLinkService; import org.dspace.app.rest.model.RestAddressableModel; import org.dspace.app.rest.model.StatisticsSupportRest; import org.dspace.app.rest.model.hateoas.SearchEventResource; @@ -46,9 +45,6 @@ public class StatisticsRestController implements InitializingBean { @Autowired private DiscoverableEndpointsService discoverableEndpointsService; - @Autowired - private HalLinkService halLinkService; - @Autowired private ConverterService converter; diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/WorkflowDefinitionCollectionsLinkRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/WorkflowDefinitionCollectionsLinkRepository.java index 7ae5f5ecc0..fd1192e0bb 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/WorkflowDefinitionCollectionsLinkRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/WorkflowDefinitionCollectionsLinkRepository.java @@ -12,13 +12,11 @@ import java.util.List; import javax.annotation.Nullable; import javax.servlet.http.HttpServletRequest; -import org.dspace.app.rest.converter.ConverterService; import org.dspace.app.rest.model.CollectionRest; import org.dspace.app.rest.model.WorkflowDefinitionRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.repository.AbstractDSpaceRestRepository; import org.dspace.app.rest.repository.LinkRestRepository; -import org.dspace.app.rest.utils.Utils; import org.dspace.content.Collection; import org.dspace.core.Context; import org.dspace.xmlworkflow.factory.XmlWorkflowFactory; @@ -43,12 +41,6 @@ public class WorkflowDefinitionCollectionsLinkRepository extends AbstractDSpaceR @Autowired protected XmlWorkflowFactory xmlWorkflowFactory; - @Autowired - protected ConverterService converter; - - @Autowired - protected Utils utils; - /** * GET endpoint that returns the list of collections that make an explicit use of the workflow-definition. 
* If a collection doesn't specify the workflow-definition to be used, the default mapping applies, @@ -69,10 +61,10 @@ public class WorkflowDefinitionCollectionsLinkRepository extends AbstractDSpaceR if (xmlWorkflowFactory.isDefaultWorkflow(workflowName)) { collectionsMappedToWorkflow.addAll(xmlWorkflowFactory.getAllNonMappedCollectionsHandles(context)); } - collectionsMappedToWorkflow.addAll(xmlWorkflowFactory.getCollectionHandlesMappedToWorklow(context, + collectionsMappedToWorkflow.addAll(xmlWorkflowFactory.getCollectionHandlesMappedToWorkflow(context, workflowName)); Pageable pageable = optionalPageable != null ? optionalPageable : PageRequest.of(0, 20); - return converter.toRestPage(collectionsMappedToWorkflow, pageable, + return super.converter.toRestPage(collectionsMappedToWorkflow, pageable, projection); } else { throw new ResourceNotFoundException("No workflow with name " + workflowName + " is configured"); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/AuthorizeServiceRestUtil.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/AuthorizeServiceRestUtil.java new file mode 100644 index 0000000000..6b34479ec0 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/AuthorizeServiceRestUtil.java @@ -0,0 +1,72 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization; + +import java.sql.SQLException; + +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.DSpaceObject; +import org.dspace.content.Item; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.DSpaceObjectService; +import org.dspace.core.Context; +import org.dspace.eperson.EPerson; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * This class is a wrapper around the AuthorizationService which takes Rest objects instead of dspace objects + */ +@Component +public class AuthorizeServiceRestUtil { + @Autowired + private AuthorizeService authorizeService; + @Autowired + private Utils utils; + @Autowired + private ContentServiceFactory contentServiceFactory; + + /** + * Checks that the specified eperson can perform the given action on the rest given object. + * + * @param context DSpace context + * @param object The Rest object to test the action against + * @param dSpaceRestPermission The permission to check + * @return A boolean indicating if the action is allowed by the logged in ePerson on the given object + * @throws SQLException + */ + public boolean authorizeActionBoolean(Context context, BaseObjectRest object, + DSpaceRestPermission dSpaceRestPermission) + throws SQLException { + + DSpaceObject dSpaceObject = (DSpaceObject)utils.getDSpaceAPIObjectFromRest(context, object); + if (dSpaceObject == null) { + return false; + } + + DSpaceObjectService dSpaceObjectService = + contentServiceFactory.getDSpaceObjectService(dSpaceObject.getType()); + + EPerson ePerson = context.getCurrentUser(); + + // If the item is still inprogress we can process here only the READ permission. 
+ // Other actions need to be evaluated against the wrapper object (workspace or workflow item) + if (dSpaceObject instanceof Item) { + if (!DSpaceRestPermission.READ.equals(dSpaceRestPermission) + && !((Item) dSpaceObject).isArchived() && !((Item) dSpaceObject).isWithdrawn()) { + return false; + } + } + + return authorizeService.authorizeActionBoolean(context, ePerson, dSpaceObject, + dSpaceRestPermission.getDspaceApiActionId(), true); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/AdministratorOfFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/AdministratorOfFeature.java index 5ce4977b5a..6cfee12751 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/AdministratorOfFeature.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/AdministratorOfFeature.java @@ -14,11 +14,13 @@ import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; import org.dspace.app.rest.model.BaseObjectRest; import org.dspace.app.rest.model.CollectionRest; import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.ItemRest; import org.dspace.app.rest.model.SiteRest; import org.dspace.app.rest.utils.Utils; import org.dspace.authorize.service.AuthorizeService; import org.dspace.content.Collection; import org.dspace.content.Community; +import org.dspace.content.Item; import org.dspace.core.Context; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; @@ -53,6 +55,10 @@ public class AdministratorOfFeature implements AuthorizationFeature { Collection collection = (Collection) utils.getDSpaceAPIObjectFromRest(context, object); return authService.isAdmin(context, collection); } + if (object instanceof ItemRest) { + Item item = (Item) utils.getDSpaceAPIObjectFromRest(context, object); + return authService.isAdmin(context, item); + } } return authService.isAdmin(context); } @@ -62,7 +68,8 @@ public class AdministratorOfFeature implements AuthorizationFeature { return new String[]{ SiteRest.CATEGORY + "." + SiteRest.NAME, CommunityRest.CATEGORY + "." + CommunityRest.NAME, - CollectionRest.CATEGORY + "." + CollectionRest.NAME + CollectionRest.CATEGORY + "." + CollectionRest.NAME, + ItemRest.CATEGORY + "." 
+ ItemRest.NAME }; } } \ No newline at end of file diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateBitstreamFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateBitstreamFeature.java new file mode 100644 index 0000000000..7c26383ca9 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateBitstreamFeature.java @@ -0,0 +1,89 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.apache.log4j.Logger; +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.BundleRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.Bundle; +import org.dspace.content.DSpaceObject; +import org.dspace.content.Item; +import org.dspace.content.service.BundleService; +import org.dspace.core.Constants; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The create bitstream feature. It can be used to verify if bitstreams can be created in a specific bundle. + * + * Authorization is granted if the current user has ADD & WRITE permissions on the given bundle AND the item + */ +@Component +@AuthorizationFeatureDocumentation(name = CreateBitstreamFeature.NAME, + description = "It can be used to verify if bitstreams can be created in a specific bundle") +public class CreateBitstreamFeature implements AuthorizationFeature { + + Logger log = Logger.getLogger(CreateBitstreamFeature.class); + + public final static String NAME = "canCreateBitstream"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + @Autowired + private BundleService bundleService; + @Autowired + private Utils utils; + @Autowired + private AuthorizeService authorizeService; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof BundleRest) { + if (!authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE)) { + return false; + } + if (!authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.ADD)) { + return false; + } + + DSpaceObject owningObject = bundleService.getParentObject(context, + (Bundle)utils.getDSpaceAPIObjectFromRest(context, object)); + + // Safety check. In case this is ever not true, this method should be revised. 
+ if (!(owningObject instanceof Item)) { + log.error("The parent object of bundle " + object.getType() + " is not an item"); + return false; + } + + if (!authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), owningObject, + Constants.WRITE, true)) { + return false; + } + + return authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), owningObject, + Constants.ADD, true); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + BundleRest.CATEGORY + "." + BundleRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateBundleFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateBundleFeature.java new file mode 100644 index 0000000000..0dc97446aa --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateBundleFeature.java @@ -0,0 +1,54 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The create bundle feature. It can be used to verify if bundles can be created in a specific item. + * + * Authorization is granted if the current user has ADD & WRITE permissions on the given item + */ +@Component +@AuthorizationFeatureDocumentation(name = CreateBundleFeature.NAME, + description = "It can be used to verify if bundles can be created in a specific item") +public class CreateBundleFeature implements AuthorizationFeature { + + public final static String NAME = "canCreateBundle"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof ItemRest) { + if (!authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE)) { + return false; + } + return authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.ADD); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + ItemRest.CATEGORY + "." 
+ ItemRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateCollectionFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateCollectionFeature.java new file mode 100644 index 0000000000..57b92aa425 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateCollectionFeature.java @@ -0,0 +1,59 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.Community; +import org.dspace.core.Constants; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The canCreateCollections feature. + * It can be used to verify if a user has access to create a new collection within a specific community. + */ +@Component +@AuthorizationFeatureDocumentation(name = CreateCollectionFeature.NAME, + description = "It can be used to verify if a user has access to create a new collection within a specific " + + "community") +public class CreateCollectionFeature implements AuthorizationFeature { + + public static final String NAME = "canCreateCollections"; + + @Autowired + AuthorizeService authService; + + @Autowired + Utils utils; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object != null) { + if (object instanceof CommunityRest) { + Community community = (Community) utils.getDSpaceAPIObjectFromRest(context, object); + return authService.authorizeActionBoolean(context, community, Constants.ADD); + } + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + CommunityRest.CATEGORY + "." 
+ CommunityRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateCommunityFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateCommunityFeature.java new file mode 100644 index 0000000000..aefc2b0a42 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/CreateCommunityFeature.java @@ -0,0 +1,66 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.Community; +import org.dspace.content.Site; +import org.dspace.core.Constants; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The canCreateCommunities feature. + * It can be used to verify if a user has access to create a new community within a specific parent community or site. + */ +@Component +@AuthorizationFeatureDocumentation(name = CreateCommunityFeature.NAME, + description = "It can be used to verify if a user has access to create a new community within a specific parent" + + " community or site") +public class CreateCommunityFeature implements AuthorizationFeature { + + public static final String NAME = "canCreateCommunities"; + + @Autowired + AuthorizeService authService; + + @Autowired + Utils utils; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object != null) { + if (object instanceof CommunityRest) { + Community community = (Community) utils.getDSpaceAPIObjectFromRest(context, object); + return authService.authorizeActionBoolean(context, community, Constants.ADD); + } + if (object instanceof SiteRest) { + Site site = (Site) utils.getDSpaceAPIObjectFromRest(context, object); + return authService.authorizeActionBoolean(context, site, Constants.ADD); + } + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + SiteRest.CATEGORY + "." + SiteRest.NAME, + CommunityRest.CATEGORY + "." 
+ CommunityRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/DeleteFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/DeleteFeature.java new file mode 100644 index 0000000000..02ca816290 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/DeleteFeature.java @@ -0,0 +1,138 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.BitstreamRest; +import org.dspace.app.rest.model.BundleRest; +import org.dspace.app.rest.model.CollectionRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.EPersonRest; +import org.dspace.app.rest.model.GroupRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.model.WorkspaceItemRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.DSpaceObject; +import org.dspace.content.Item; +import org.dspace.content.WorkspaceItem; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.core.Constants; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The delete feature. It can be used to verify if specific content can be deleted/expunged. + * + * Authorization is granted + * - for a bitstream if the current used has REMOVE permissions on both the Item and the Bundle + * - for a bundle if the current user has REMOVE permissions on the Item + * - for an item if the current user has REMOVE permissions on the collection AND and DELETE permissions on the item + * - for a collection if the current user has REMOVE permissions on the community + * - for a community with a parent community if the current user has REMOVE permissions on the parent community + * - for a community without a parent community if the current user has DELETE permissions on the current community + * - for other objects if the current user has REMOVE permissions on the parent object if there is one. 
Otherwise if the + * current user has DELETE permissions on the current object + */ +@Component +@AuthorizationFeatureDocumentation(name = DeleteFeature.NAME, + description = "It can be used to verify if specific content can be deleted/expunged") +public class DeleteFeature implements AuthorizationFeature { + + public final static String NAME = "canDelete"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + @Autowired + private Utils utils; + @Autowired + private AuthorizeService authorizeService; + @Autowired + private ContentServiceFactory contentServiceFactory; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof BaseObjectRest) { + if (object.getType().equals(WorkspaceItemRest.NAME)) { + object = ((WorkspaceItemRest)object).getItem(); + } + + DSpaceObject dSpaceObject = (DSpaceObject) utils.getDSpaceAPIObjectFromRest(context, object); + DSpaceObject parentObject = getParentObject(context, dSpaceObject); + + switch (object.getType()) { + case BitstreamRest.NAME: + return ( + authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), parentObject, + Constants.REMOVE, true) + && authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), dSpaceObject, + Constants.REMOVE, true) + ); + case ItemRest.NAME: + return ( + authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), parentObject, + Constants.REMOVE, true) + && authorizeServiceRestUtil.authorizeActionBoolean(context, object, + DSpaceRestPermission.DELETE) + ); + case CollectionRest.NAME: + case CommunityRest.NAME: + case BundleRest.NAME: + case WorkspaceItemRest.NAME: + case EPersonRest.NAME: + case GroupRest.NAME: + default: + if (parentObject != null) { + return authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), parentObject, + Constants.REMOVE, true); + } + + return authorizeServiceRestUtil.authorizeActionBoolean(context, object, + DSpaceRestPermission.DELETE); + } + } + return false; + } + + private DSpaceObject getParentObject(Context context, DSpaceObject object) throws SQLException { + DSpaceObject parentObject + = contentServiceFactory.getDSpaceObjectService(object.getType()).getParentObject(context, object); + if (object.getType() == Constants.ITEM && parentObject == null) { + Item item = (Item) object; + parentObject = item.getOwningCollection(); + WorkspaceItem byItem = ContentServiceFactory.getInstance() + .getWorkspaceItemService() + .findByItem(context, item); + if (byItem != null) { + parentObject = byItem.getCollection(); + } + } + return parentObject; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + CommunityRest.CATEGORY + "." + CommunityRest.NAME, + CollectionRest.CATEGORY + "." + CollectionRest.NAME, + ItemRest.CATEGORY + "." + ItemRest.NAME, + BundleRest.CATEGORY + "." + BundleRest.NAME, + BitstreamRest.CATEGORY + "." + BitstreamRest.NAME, + WorkspaceItemRest.CATEGORY + "." + WorkspaceItemRest.NAME, + EPersonRest.CATEGORY + "." + EPersonRest.NAME, + GroupRest.CATEGORY + "." 
+ GroupRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/EPersonRegistrationFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/EPersonRegistrationFeature.java new file mode 100644 index 0000000000..a03d68fcc9 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/EPersonRegistrationFeature.java @@ -0,0 +1,52 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.util.AuthorizeUtil; +import org.dspace.core.Context; +import org.dspace.services.RequestService; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The EPerson Registration feature. It's able to be used on site objects if the user.registration property is set to + * true. If it's set to true, it'll check if the current context is allowed to set the password. + */ +@Component +@AuthorizationFeatureDocumentation(name = EPersonRegistrationFeature.NAME, + description = "It can be used to register an eperson") +public class EPersonRegistrationFeature implements AuthorizationFeature { + + public static final String NAME = "epersonRegistration"; + + @Autowired + private RequestService requestService; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (!(object instanceof SiteRest)) { + return false; + } + if (!AuthorizeUtil.authorizeNewAccountRegistration(context, + requestService.getCurrentRequest().getHttpServletRequest())) { + return false; + } + return true; + } + + @Override + public String[] getSupportedTypes() { + return new String[] {SiteRest.CATEGORY + "." 
+ SiteRest.NAME}; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/EditMetadataFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/EditMetadataFeature.java new file mode 100644 index 0000000000..06961bc33d --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/EditMetadataFeature.java @@ -0,0 +1,67 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.BitstreamRest; +import org.dspace.app.rest.model.BundleRest; +import org.dspace.app.rest.model.CollectionRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The edit metadata feature. It can be used to verify if the metadata of the specified objects can be edited. + * + * Authorization is granted if the current user has WRITE permissions on the given DSO + */ +@Component +@AuthorizationFeatureDocumentation(name = EditMetadataFeature.NAME, + description = "It can be used to verify if the metadata of the specified objects can be edited") +public class EditMetadataFeature implements AuthorizationFeature { + + public final static String NAME = "canEditMetadata"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof CommunityRest + || object instanceof CollectionRest + || object instanceof ItemRest + || object instanceof BundleRest + || object instanceof BitstreamRest + || object instanceof SiteRest + ) { + return authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + CommunityRest.CATEGORY + "." + CommunityRest.NAME, + CollectionRest.CATEGORY + "." + CollectionRest.NAME, + ItemRest.CATEGORY + "." + ItemRest.NAME, + BundleRest.CATEGORY + "." + BundleRest.NAME, + BitstreamRest.CATEGORY + "." + BitstreamRest.NAME, + SiteRest.CATEGORY + "." 
+ SiteRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MakeDiscoverableFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MakeDiscoverableFeature.java new file mode 100644 index 0000000000..76fd190ec9 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MakeDiscoverableFeature.java @@ -0,0 +1,51 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The make discoverable feature. It can be used to verify if an item can be made discoverable. + * + * Authorization is granted if the current user has WRITE permissions on the given item + */ +@Component +@AuthorizationFeatureDocumentation(name = MakeDiscoverableFeature.NAME, + description = "It can be used to verify if an item can be made discoverable") +public class MakeDiscoverableFeature implements AuthorizationFeature { + + public final static String NAME = "canMakeDiscoverable"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof ItemRest) { + return authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + ItemRest.CATEGORY + "." + ItemRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MakePrivateFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MakePrivateFeature.java new file mode 100644 index 0000000000..d48f52fb3d --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MakePrivateFeature.java @@ -0,0 +1,51 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The make private feature. It can be used to verify if an item can be made private. 
+ * + * Authorization is granted if the current user has WRITE permissions on the given item + */ +@Component +@AuthorizationFeatureDocumentation(name = MakePrivateFeature.NAME, + description = "It can be used to verify if an item can be made private") +public class MakePrivateFeature implements AuthorizationFeature { + + public final static String NAME = "canMakePrivate"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof ItemRest) { + return authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + ItemRest.CATEGORY + "." + ItemRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MoveFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MoveFeature.java new file mode 100644 index 0000000000..a364e0c0ae --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/MoveFeature.java @@ -0,0 +1,81 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.apache.log4j.Logger; +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.Collection; +import org.dspace.content.DSpaceObject; +import org.dspace.content.Item; +import org.dspace.content.service.ItemService; +import org.dspace.core.Constants; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The move feature. It can be used to verify if item can be moved to a different collection. 
+ * + * Authorization is granted if the current user has WRITE permissions on the given item and REMOVE permissions on the + * item’s owning collection + */ +@Component +@AuthorizationFeatureDocumentation(name = MoveFeature.NAME, + description = "It can be used to verify if item can be moved to a different collection") +public class MoveFeature implements AuthorizationFeature { + + Logger log = Logger.getLogger(MoveFeature.class); + + public final static String NAME = "canMove"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + @Autowired + private Utils utils; + @Autowired + private ItemService itemService; + @Autowired + private AuthorizeService authorizeService; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof ItemRest) { + if (!authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE)) { + return false; + } + + DSpaceObject owningObject = itemService.getParentObject(context, + (Item)utils.getDSpaceAPIObjectFromRest(context, object)); + + if (!(owningObject instanceof Collection)) { + log.error("The partent object of item " + object.getType() + " is not a collection"); + return false; + } + + return authorizeService.authorizeActionBoolean(context, context.getCurrentUser(), owningObject, + Constants.REMOVE, true); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + ItemRest.CATEGORY + "." + ItemRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/PolicyFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/PolicyFeature.java new file mode 100644 index 0000000000..741d265cae --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/PolicyFeature.java @@ -0,0 +1,104 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.BitstreamRest; +import org.dspace.app.rest.model.BundleRest; +import org.dspace.app.rest.model.CollectionRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.ItemRest; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.rest.utils.Utils; +import org.dspace.app.util.AuthorizeUtil; +import org.dspace.authorize.AuthorizeException; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.Bitstream; +import org.dspace.content.Bundle; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The policy feature. 
It can be used by administrators (or community/collection delegate) to manage resource policies + * + * Authorization is granted + * - for the site if the current user is administrator + * - for other objects if the current user has ADMIN permissions on the object + */ +@Component +@AuthorizationFeatureDocumentation(name = PolicyFeature.NAME, + description = "It can be used to verify if the resourcepolicies of the specified objects can be managed") +public class PolicyFeature implements AuthorizationFeature { + + public static final String NAME = "canManagePolicies"; + + @Autowired + AuthorizeService authService; + @Autowired + private Utils utils; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object != null) { + try { + if (object instanceof SiteRest) { + return authService.isAdmin(context); + } + if (object instanceof CommunityRest) { + AuthorizeUtil.authorizeManageCommunityPolicy(context, + (Community)utils.getDSpaceAPIObjectFromRest(context, object)); + return true; + } + if (object instanceof CollectionRest) { + AuthorizeUtil.authorizeManageCollectionPolicy(context, + (Collection) utils.getDSpaceAPIObjectFromRest(context, object)); + return true; + } + if (object instanceof ItemRest) { + AuthorizeUtil.authorizeManageItemPolicy(context, + (Item)utils.getDSpaceAPIObjectFromRest(context, object)); + return true; + } + if (object instanceof BundleRest) { + AuthorizeUtil.authorizeManageBundlePolicy(context, + (Bundle)utils.getDSpaceAPIObjectFromRest(context, object)); + return true; + } + if (object instanceof BitstreamRest) { + AuthorizeUtil.authorizeManageBitstreamPolicy(context, + (Bitstream)utils.getDSpaceAPIObjectFromRest(context, object)); + return true; + } + } catch (AuthorizeException e) { + return false; + } + } + + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + SiteRest.CATEGORY + "." + SiteRest.NAME, + CommunityRest.CATEGORY + "." + CommunityRest.NAME, + CollectionRest.CATEGORY + "." + CollectionRest.NAME, + ItemRest.CATEGORY + "." + ItemRest.NAME, + BundleRest.CATEGORY + "." + BundleRest.NAME, + BitstreamRest.CATEGORY + "." + BitstreamRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/ReorderBitstreamFeature.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/ReorderBitstreamFeature.java new file mode 100644 index 0000000000..568d8e1319 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/authorization/impl/ReorderBitstreamFeature.java @@ -0,0 +1,51 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization.impl; + +import java.sql.SQLException; + +import org.dspace.app.rest.authorization.AuthorizationFeature; +import org.dspace.app.rest.authorization.AuthorizationFeatureDocumentation; +import org.dspace.app.rest.authorization.AuthorizeServiceRestUtil; +import org.dspace.app.rest.model.BaseObjectRest; +import org.dspace.app.rest.model.BundleRest; +import org.dspace.app.rest.security.DSpaceRestPermission; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Component; + +/** + * The reorder bitstream feature. 
It can be used to verify if bitstreams can be reordered in a specific bundle. + * + * Authorization is granted if the current user has WRITE permissions on the given bundle + */ +@Component +@AuthorizationFeatureDocumentation(name = ReorderBitstreamFeature.NAME, + description = "It can be used to verify if bitstreams can be reordered in a specific bundle") +public class ReorderBitstreamFeature implements AuthorizationFeature { + + public final static String NAME = "canReorderBitstreams"; + + @Autowired + private AuthorizeServiceRestUtil authorizeServiceRestUtil; + + @Override + public boolean isAuthorized(Context context, BaseObjectRest object) throws SQLException { + if (object instanceof BundleRest) { + return authorizeServiceRestUtil.authorizeActionBoolean(context, object, DSpaceRestPermission.WRITE); + } + return false; + } + + @Override + public String[] getSupportedTypes() { + return new String[]{ + BundleRest.CATEGORY + "." + BundleRest.NAME + }; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthenticationTokenConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthenticationTokenConverter.java new file mode 100644 index 0000000000..ea64bc8bc8 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthenticationTokenConverter.java @@ -0,0 +1,31 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.converter; + +import org.dspace.app.rest.model.AuthenticationTokenRest; +import org.dspace.app.rest.model.wrapper.AuthenticationToken; +import org.dspace.app.rest.projection.Projection; +import org.springframework.stereotype.Component; + +/** + * This is the converter from the AuthenticationToken to the REST data model + */ +@Component +public class AuthenticationTokenConverter implements DSpaceConverter { + @Override + public AuthenticationTokenRest convert(AuthenticationToken modelObject, Projection projection) { + AuthenticationTokenRest token = new AuthenticationTokenRest(); + token.setToken(modelObject.getToken()); + return token; + } + + @Override + public Class getModelClass() { + return AuthenticationToken.class; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/ConverterService.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/ConverterService.java index fc786bfc85..84ce1a0032 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/ConverterService.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/ConverterService.java @@ -34,6 +34,7 @@ import org.dspace.app.rest.security.DSpacePermissionEvaluator; import org.dspace.app.rest.security.WebSecurityExpressionEvaluator; import org.dspace.app.rest.utils.Utils; import org.dspace.services.RequestService; +import org.springframework.aop.support.AopUtils; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.beans.factory.config.BeanDefinition; import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider; @@ -51,6 +52,8 @@ import org.springframework.stereotype.Service; /** * Converts domain objects from the DSpace service layer to rest objects, and from rest objects to resource * objects, applying {@link Projection}s where applicable. 
+ * + * @author Luca Giamminonni (luca.giamminonni at 4science dot it) */ @Service public class ConverterService { @@ -149,14 +152,30 @@ public class ConverterService { DSpaceRestRepository repositoryToUse = utils .getResourceRepositoryByCategoryAndModel(baseObjectRest.getCategory(), baseObjectRest.getType()); Annotation preAuthorize = null; - for (Method m : repositoryToUse.getClass().getMethods()) { + int maxDepth = 0; + // DS-4530 exclude the AOP Proxy from determining the annotations + for (Method m : AopUtils.getTargetClass(repositoryToUse).getMethods()) { if (StringUtils.equalsIgnoreCase(m.getName(), "findOne")) { - preAuthorize = AnnotationUtils.findAnnotation(m, PreAuthorize.class); + int depth = howManySuperclass(m.getDeclaringClass()); + if (depth > maxDepth) { + preAuthorize = AnnotationUtils.findAnnotation(m, PreAuthorize.class); + maxDepth = depth; + } } } return preAuthorize; } + private int howManySuperclass(Class declaringClass) { + Class curr = declaringClass; + int count = 0; + while (curr != Object.class) { + curr = curr.getSuperclass(); + count++; + } + return count; + } + private Annotation getDefaultFindOnePreAuthorize() { for (Method m : DSpaceRestRepository.class.getMethods()) { if (StringUtils.equalsIgnoreCase(m.getName(), "findOne")) { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetResultsConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetResultsConverter.java index 1532502b86..5feb375801 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetResultsConverter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetResultsConverter.java @@ -35,13 +35,14 @@ public class DiscoverFacetResultsConverter { @Autowired private SearchFilterToAppliedFilterConverter searchFilterToAppliedFilterConverter; - public FacetResultsRest convert(Context context, String facetName, String prefix, String query, String dsoType, - String dsoScope, List searchFilters, DiscoverResult searchResult, - DiscoveryConfiguration configuration, Pageable page, Projection projection) { + public FacetResultsRest convert(Context context, String facetName, String prefix, String query, + List dsoTypes, String dsoScope, List searchFilters, + DiscoverResult searchResult, DiscoveryConfiguration configuration, Pageable page, + Projection projection) { FacetResultsRest facetResultsRest = new FacetResultsRest(); facetResultsRest.setProjection(projection); - setRequestInformation(context, facetName, prefix, query, dsoType, dsoScope, searchFilters, searchResult, + setRequestInformation(context, facetName, prefix, query, dsoTypes, dsoScope, searchFilters, searchResult, configuration, facetResultsRest, page, projection); addToFacetResultList(facetName, searchResult, facetResultsRest, configuration, page, projection); @@ -72,14 +73,14 @@ public class DiscoverFacetResultsConverter { return facetValueConverter.convert(value, projection); } - private void setRequestInformation(Context context, String facetName, String prefix, String query, String dsoType, - String dsoScope, List searchFilters, DiscoverResult searchResult, - DiscoveryConfiguration configuration, FacetResultsRest facetResultsRest, - Pageable page, Projection projection) { + private void setRequestInformation(Context context, String facetName, String prefix, String query, + List dsoTypes, String dsoScope, List searchFilters, + DiscoverResult searchResult, DiscoveryConfiguration configuration, + 
FacetResultsRest facetResultsRest, Pageable page, Projection projection) { facetResultsRest.setQuery(query); facetResultsRest.setPrefix(prefix); facetResultsRest.setScope(dsoScope); - facetResultsRest.setDsoType(dsoType); + facetResultsRest.setDsoTypes(dsoTypes); facetResultsRest.setFacetEntry(convertFacetEntry(facetName, searchResult, configuration, page, projection)); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetsConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetsConverter.java index ad52f9b002..a519e8b5b8 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetsConverter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverFacetsConverter.java @@ -38,7 +38,7 @@ public class DiscoverFacetsConverter { @Autowired private SearchService searchService; - public SearchResultsRest convert(Context context, String query, String dsoType, String configurationName, + public SearchResultsRest convert(Context context, String query, List dsoTypes, String configurationName, String dsoScope, List searchFilters, final Pageable page, DiscoveryConfiguration configuration, DiscoverResult searchResult, Projection projection) { @@ -46,7 +46,7 @@ public class DiscoverFacetsConverter { SearchResultsRest searchResultsRest = new SearchResultsRest(); searchResultsRest.setProjection(projection); - setRequestInformation(context, query, dsoType, configurationName, dsoScope, searchFilters, page, + setRequestInformation(context, query, dsoTypes, configurationName, dsoScope, searchFilters, page, searchResultsRest); addFacetValues(context, searchResult, searchResultsRest, configuration, projection); @@ -129,13 +129,13 @@ public class DiscoverFacetsConverter { } } - private void setRequestInformation(final Context context, final String query, final String dsoType, + private void setRequestInformation(final Context context, final String query, final List dsoTypes, final String configurationName, final String scope, final List searchFilters, final Pageable page, final SearchResultsRest resultsRest) { resultsRest.setQuery(query); resultsRest.setConfiguration(configurationName); - resultsRest.setDsoType(dsoType); + resultsRest.setDsoTypes(dsoTypes); resultsRest.setSort(SearchResultsRest.Sorting.fromPage(page)); resultsRest.setScope(scope); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverResultConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverResultConverter.java index 772961a50f..6b289ec962 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverResultConverter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/DiscoverResultConverter.java @@ -43,7 +43,7 @@ public class DiscoverResultConverter { @Autowired private SearchFilterToAppliedFilterConverter searchFilterToAppliedFilterConverter; - public SearchResultsRest convert(final Context context, final String query, final String dsoType, + public SearchResultsRest convert(final Context context, final String query, final List dsoTypes, final String configurationName, final String scope, final List searchFilters, final Pageable page, final DiscoverResult searchResult, final DiscoveryConfiguration configuration, @@ -52,7 +52,7 @@ public class DiscoverResultConverter { SearchResultsRest resultsRest = new SearchResultsRest(); resultsRest.setProjection(projection); - setRequestInformation(context, 
query, dsoType, configurationName, scope, searchFilters, page, resultsRest); + setRequestInformation(context, query, dsoTypes, configurationName, scope, searchFilters, page, resultsRest); addSearchResults(searchResult, resultsRest, projection); @@ -101,13 +101,13 @@ public class DiscoverResultConverter { return null; } - private void setRequestInformation(final Context context, final String query, final String dsoType, + private void setRequestInformation(final Context context, final String query, final List dsoTypes, final String configurationName, final String scope, final List searchFilters, final Pageable page, final SearchResultsRest resultsRest) { resultsRest.setQuery(query); resultsRest.setConfiguration(configurationName); - resultsRest.setDsoType(dsoType); + resultsRest.setDsoTypes(dsoTypes); resultsRest.setScope(scope); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/SubmissionFormConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/SubmissionFormConverter.java index f8f34612eb..339f601dc4 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/SubmissionFormConverter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/SubmissionFormConverter.java @@ -115,14 +115,16 @@ public class SubmissionFormConverter implements DSpaceConverter getModelClass() { return DCInputSet.class; diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/UsageReportConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/UsageReportConverter.java new file mode 100644 index 0000000000..cdea2f52fe --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/UsageReportConverter.java @@ -0,0 +1,31 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.converter; + +import org.dspace.app.rest.model.UsageReportRest; +import org.dspace.app.rest.projection.Projection; +import org.springframework.stereotype.Component; + +/** + * Converter so list of UsageReportRest can be converted in to a rest page + * + * @author Maria Verdonck (Atmire) on 11/06/2020 + */ +@Component +public class UsageReportConverter implements DSpaceConverter { + @Override + public UsageReportRest convert(UsageReportRest modelObject, Projection projection) { + modelObject.setProjection(projection); + return modelObject; + } + + @Override + public Class getModelClass() { + return UsageReportRest.class; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthorityEntryRestConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/VocabularyEntryDetailsRestConverter.java similarity index 74% rename from dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthorityEntryRestConverter.java rename to dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/VocabularyEntryDetailsRestConverter.java index aa5e196dc1..358a71455d 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthorityEntryRestConverter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/VocabularyEntryDetailsRestConverter.java @@ -7,7 +7,7 @@ */ package org.dspace.app.rest.converter; -import org.dspace.app.rest.model.AuthorityEntryRest; +import 
org.dspace.app.rest.model.VocabularyEntryDetailsRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.utils.AuthorityUtils; import org.dspace.content.authority.Choice; @@ -22,16 +22,17 @@ import org.springframework.stereotype.Component; * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) */ @Component -public class AuthorityEntryRestConverter implements DSpaceConverter { +public class VocabularyEntryDetailsRestConverter implements DSpaceConverter { @Override - public AuthorityEntryRest convert(Choice choice, Projection projection) { - AuthorityEntryRest entry = new AuthorityEntryRest(); + public VocabularyEntryDetailsRest convert(Choice choice, Projection projection) { + VocabularyEntryDetailsRest entry = new VocabularyEntryDetailsRest(); entry.setProjection(projection); entry.setValue(choice.value); entry.setDisplay(choice.label); entry.setId(choice.authority); entry.setOtherInformation(choice.extras); + entry.setSelectable(choice.selectable); return entry; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthorityRestConverter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/VocabularyRestConverter.java similarity index 67% rename from dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthorityRestConverter.java rename to dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/VocabularyRestConverter.java index 7e78ef7f14..5dcb05a23e 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/AuthorityRestConverter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/converter/VocabularyRestConverter.java @@ -7,7 +7,7 @@ */ package org.dspace.app.rest.converter; -import org.dspace.app.rest.model.AuthorityRest; +import org.dspace.app.rest.model.VocabularyRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.utils.AuthorityUtils; import org.dspace.content.authority.ChoiceAuthority; @@ -23,15 +23,15 @@ import org.springframework.stereotype.Component; * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) */ @Component -public class AuthorityRestConverter implements DSpaceConverter { +public class VocabularyRestConverter implements DSpaceConverter { @Override - public AuthorityRest convert(ChoiceAuthority step, Projection projection) { - AuthorityRest authorityRest = new AuthorityRest(); + public VocabularyRest convert(ChoiceAuthority authority, Projection projection) { + VocabularyRest authorityRest = new VocabularyRest(); authorityRest.setProjection(projection); - authorityRest.setHierarchical(step.isHierarchical()); - authorityRest.setScrollable(step.isScrollable()); - authorityRest.setIdentifier(step.hasIdentifier()); + authorityRest.setHierarchical(authority.isHierarchical()); + authorityRest.setScrollable(authority.isScrollable()); + authorityRest.setPreloadLevel(authority.getPreloadLevel()); return authorityRest; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/DSpaceApiExceptionControllerAdvice.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/DSpaceApiExceptionControllerAdvice.java index d255b6fe27..04aa626153 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/DSpaceApiExceptionControllerAdvice.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/DSpaceApiExceptionControllerAdvice.java @@ -14,6 +14,8 @@ import java.sql.SQLException; import javax.servlet.http.HttpServletRequest; 
import javax.servlet.http.HttpServletResponse; +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; import org.dspace.app.rest.security.RestAuthenticationService; import org.dspace.authorize.AuthorizeException; import org.springframework.beans.TypeMismatchException; @@ -38,9 +40,12 @@ import org.springframework.web.servlet.mvc.method.annotation.ResponseEntityExcep * @author Tom Desair (tom dot desair at atmire dot com) * @author Frederic Van Reet (frederic dot vanreet at atmire dot com) * @author Andrea Bollini (andrea.bollini at 4science.it) + * @author Pasquale Cavallo (pasquale.cavallo at 4science dot it) + * */ @ControllerAdvice public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionHandler { + private static final Logger log = LogManager.getLogger(DSpaceApiExceptionControllerAdvice.class); @Autowired private RestAuthenticationService restAuthenticationService; @@ -49,16 +54,16 @@ public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionH protected void handleAuthorizeException(HttpServletRequest request, HttpServletResponse response, Exception ex) throws IOException { if (restAuthenticationService.hasAuthenticationData(request)) { - sendErrorResponse(request, response, ex, ex.getMessage(), HttpServletResponse.SC_FORBIDDEN); + sendErrorResponse(request, response, ex, "Access is denied", HttpServletResponse.SC_FORBIDDEN); } else { - sendErrorResponse(request, response, ex, ex.getMessage(), HttpServletResponse.SC_UNAUTHORIZED); + sendErrorResponse(request, response, ex, "Authentication is required", HttpServletResponse.SC_UNAUTHORIZED); } } @ExceptionHandler({IllegalArgumentException.class, MultipartException.class}) protected void handleWrongRequestException(HttpServletRequest request, HttpServletResponse response, Exception ex) throws IOException { - sendErrorResponse(request, response, ex, ex.getMessage(), HttpServletResponse.SC_BAD_REQUEST); + sendErrorResponse(request, response, ex, "Request is invalid or incorrect", HttpServletResponse.SC_BAD_REQUEST); } @ExceptionHandler(SQLException.class) @@ -72,24 +77,24 @@ public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionH protected void handleIOException(HttpServletRequest request, HttpServletResponse response, Exception ex) throws IOException { sendErrorResponse(request, response, ex, - "An internal read or write operation failed (IO Exception)", + "An internal read or write operation failed", HttpServletResponse.SC_INTERNAL_SERVER_ERROR); } @ExceptionHandler(MethodNotAllowedException.class) protected void methodNotAllowedException(HttpServletRequest request, HttpServletResponse response, Exception ex) throws IOException { - sendErrorResponse(request, response, ex, ex.getMessage(), HttpServletResponse.SC_METHOD_NOT_ALLOWED); + sendErrorResponse(request, response, ex, "Method is not allowed or supported", + HttpServletResponse.SC_METHOD_NOT_ALLOWED); } @ExceptionHandler( {UnprocessableEntityException.class}) protected void handleUnprocessableEntityException(HttpServletRequest request, HttpServletResponse response, Exception ex) throws IOException { - //422 is not defined in HttpServletResponse. Its meaning is "Unprocessable Entity". //Using the value from HttpStatus. 
sendErrorResponse(request, response, null, - ex.getMessage(), + "Unprocessable or invalid entity", HttpStatus.UNPROCESSABLE_ENTITY.value()); } @@ -98,7 +103,7 @@ public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionH throws IOException { // we want the 400 status for missing parameters, see https://jira.lyrasis.org/browse/DS-4428 sendErrorResponse(request, response, null, - ex.getMessage(), + "A required parameter is invalid", HttpStatus.BAD_REQUEST.value()); } @@ -107,7 +112,7 @@ public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionH throws IOException { // we want the 400 status for missing parameters, see https://jira.lyrasis.org/browse/DS-4428 sendErrorResponse(request, response, null, - ex.getMessage(), + "A required parameter is missing", HttpStatus.BAD_REQUEST.value()); } @@ -137,7 +142,7 @@ public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionH } else { returnCode = HttpServletResponse.SC_INTERNAL_SERVER_ERROR; } - sendErrorResponse(request, response, ex, "An Exception has occured", returnCode); + sendErrorResponse(request, response, ex, "An exception has occurred", returnCode); } @@ -147,6 +152,13 @@ public class DSpaceApiExceptionControllerAdvice extends ResponseEntityExceptionH //Make sure Spring picks up this exception request.setAttribute(EXCEPTION_ATTRIBUTE, ex); + // For now, just logging server errors. + // We don't want to fill logs with bad/invalid REST API requests. + if (statusCode == HttpServletResponse.SC_INTERNAL_SERVER_ERROR) { + // Log the full error and status code + log.error("{} (status:{})", message, statusCode, ex); + } + //Exception properties will be set by org.springframework.boot.web.support.ErrorPageFilter response.sendError(statusCode, message); } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/LinkNotFoundException.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/LinkNotFoundException.java new file mode 100644 index 0000000000..5710b7a176 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/exception/LinkNotFoundException.java @@ -0,0 +1,30 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.exception; + +import org.springframework.http.HttpStatus; +import org.springframework.web.bind.annotation.ResponseStatus; + +/** + * This is the exception to capture details about a not existing linked resource + * + * @author Andrea Bollini (andrea.bollini at 4science.it) + */ +@ResponseStatus(value = HttpStatus.NOT_FOUND, reason = "This link is not found in the system") +public class LinkNotFoundException extends RuntimeException { + String apiCategory; + String model; + String id; + + public LinkNotFoundException(String apiCategory, String model, String id) { + this.apiCategory = apiCategory; + this.model = model; + this.id = id; + } + +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/filter/ContentLanguageHeaderResponseFilter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/filter/ContentLanguageHeaderResponseFilter.java new file mode 100644 index 0000000000..74ffd73ad4 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/filter/ContentLanguageHeaderResponseFilter.java @@ -0,0 +1,58 @@ +/** + * The contents of this file are subject to the license and 
copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.filter; + +import java.io.IOException; +import java.util.Locale; +import javax.servlet.Filter; +import javax.servlet.FilterChain; +import javax.servlet.FilterConfig; +import javax.servlet.ServletException; +import javax.servlet.ServletRequest; +import javax.servlet.ServletResponse; +import javax.servlet.http.HttpServletResponse; + +import org.dspace.core.I18nUtil; +import org.springframework.stereotype.Component; + +/** + * This filter ensures that, when the DSpace instance supports multiple languages, + * they are noted in the Content-Language header of the response. Where + * appropriate, an individual endpoint can set the Content-Language header directly + * to note that the response is specific to a language. + * + * @author Mykhaylo Boychuk (at 4science.it) + */ +@Component +public class ContentLanguageHeaderResponseFilter implements Filter { + + @Override + public void init(FilterConfig filterConfig) throws ServletException { + } + + @Override + public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) + throws IOException, ServletException { + HttpServletResponse httpServletResponse = (HttpServletResponse) response; + Locale[] locales = I18nUtil.getSupportedLocales(); + StringBuilder locsStr = new StringBuilder(); + for (Locale locale : locales) { + if (locsStr.length() > 0) { + locsStr.append(","); + } + locsStr.append(locale.getLanguage()); + } + httpServletResponse.setHeader("Content-Language", locsStr.toString()); + chain.doFilter(request, response); + } + + @Override + public void destroy() { + } + +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/AuthenticationTokenHalLinkFactory.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/AuthenticationTokenHalLinkFactory.java new file mode 100644 index 0000000000..ea70f08923 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/AuthenticationTokenHalLinkFactory.java @@ -0,0 +1,42 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.link; + +import java.util.LinkedList; + +import org.dspace.app.rest.AuthenticationRestController; +import org.dspace.app.rest.model.hateoas.AuthenticationTokenResource; +import org.springframework.data.domain.Pageable; +import org.springframework.hateoas.IanaLinkRelations; +import org.springframework.hateoas.Link; +import org.springframework.stereotype.Component; + +/** + * This class adds the self link to the AuthenticationTokenResource. 
+ */ +@Component +public class AuthenticationTokenHalLinkFactory + extends HalLinkFactory { + + @Override + protected void addLinks(AuthenticationTokenResource halResource, Pageable pageable, LinkedList list) + throws Exception { + + list.add(buildLink(IanaLinkRelations.SELF.value(), getMethodOn().shortLivedToken(null))); + } + + @Override + protected Class getControllerClass() { + return AuthenticationRestController.class; + } + + @Override + protected Class getResourceClass() { + return AuthenticationTokenResource.class; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/AuthorityEntryHalLinkFactory.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/AuthorityEntryHalLinkFactory.java deleted file mode 100644 index e24d70a526..0000000000 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/AuthorityEntryHalLinkFactory.java +++ /dev/null @@ -1,66 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.rest.link; - -import static org.springframework.hateoas.server.mvc.WebMvcLinkBuilder.linkTo; - -import java.util.LinkedList; - -import org.atteo.evo.inflector.English; -import org.dspace.app.rest.RestResourceController; -import org.dspace.app.rest.model.AuthorityEntryRest; -import org.dspace.app.rest.model.AuthorityRest; -import org.dspace.app.rest.model.hateoas.AuthorityEntryResource; -import org.dspace.app.rest.utils.AuthorityUtils; -import org.springframework.data.domain.Pageable; -import org.springframework.hateoas.IanaLinkRelations; -import org.springframework.hateoas.Link; -import org.springframework.stereotype.Component; -import org.springframework.web.util.UriComponentsBuilder; - -/** - * This class' purpose is to provide a factory to add links to the AuthorityEntryResource. The addLinks factory will - * be called - * from the HalLinkService class addLinks method. 
- */ -@Component -public class AuthorityEntryHalLinkFactory extends HalLinkFactory { - - protected void addLinks(final AuthorityEntryResource halResource, final Pageable pageable, - final LinkedList list) throws Exception { - AuthorityEntryRest entry = halResource.getContent(); - - if (entry.getOtherInformation() != null) { - if (entry.getOtherInformation().containsKey(AuthorityUtils.RESERVED_KEYMAP_PARENT)) { - UriComponentsBuilder uriComponentsBuilder = linkTo( - getMethodOn(AuthorityRest.CATEGORY, AuthorityRest.NAME) - .findRel(null, null, AuthorityRest.CATEGORY, - English.plural(AuthorityRest.NAME), - entry.getAuthorityName() + "/" + AuthorityRest.ENTRY, - entry.getOtherInformation().get(AuthorityUtils.RESERVED_KEYMAP_PARENT), null, null)) - .toUriComponentsBuilder(); - - list.add(buildLink(AuthorityUtils.RESERVED_KEYMAP_PARENT, uriComponentsBuilder.build().toString())); - } - } - String selfLinkString = linkTo( - getMethodOn().findOne(entry.getCategory(), English.plural(entry.getType()), entry.getAuthorityName())) - .toUriComponentsBuilder().build().toString() + "/entryValues/" + entry.getId(); - list.add(buildLink(IanaLinkRelations.SELF.value(), selfLinkString)); - } - - protected Class getControllerClass() { - return RestResourceController.class; - } - - protected Class getResourceClass() { - return AuthorityEntryResource.class; - } - -} - diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/CollectionResourceWorkflowGroupHalLinkFactory.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/CollectionResourceWorkflowGroupHalLinkFactory.java index 7d0256cfc4..c049a74c0d 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/CollectionResourceWorkflowGroupHalLinkFactory.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/CollectionResourceWorkflowGroupHalLinkFactory.java @@ -47,9 +47,9 @@ public class CollectionResourceWorkflowGroupHalLinkFactory Map roles = WorkflowUtils.getCollectionRoles(collection); UUID resourceUuid = UUID.fromString(halResource.getContent().getUuid()); for (Map.Entry entry : roles.entrySet()) { - list.add(buildLink("workflowGroups/" + entry.getKey(), getMethodOn() + list.add(buildLink("workflowGroups", getMethodOn() .getWorkflowGroupForRole(resourceUuid, null, null, - entry.getKey()))); + entry.getKey())).withName(entry.getKey())); } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/search/DiscoveryRestHalLinkFactory.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/search/DiscoveryRestHalLinkFactory.java index 0479322881..8e744e9fd5 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/search/DiscoveryRestHalLinkFactory.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/link/search/DiscoveryRestHalLinkFactory.java @@ -28,7 +28,7 @@ public abstract class DiscoveryRestHalLinkFactory extends HalLinkFactory extends HalLinkFactory extends HalLinkFactory dsoType = searchData == null ? null : searchData.getDsoTypes(); String scope = searchData == null ? null : searchData.getScope(); String configuration = searchData == null ? 
null : searchData.getConfiguration(); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationStatusRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationStatusRest.java index cddfe34a22..a137620e6b 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationStatusRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationStatusRest.java @@ -18,7 +18,7 @@ public class AuthenticationStatusRest extends BaseObjectRest { private boolean authenticated; public static final String NAME = "status"; - public static final String CATEGORY = "authn"; + public static final String CATEGORY = RestAddressableModel.AUTHENTICATION; @Override public String getCategory() { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationTokenRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationTokenRest.java new file mode 100644 index 0000000000..0599e09565 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthenticationTokenRest.java @@ -0,0 +1,44 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +import org.dspace.app.rest.RestResourceController; + +/** + * The authentication token REST HAL Resource. The HAL Resource wraps the REST Resource + * adding support for the links and embedded resources + */ +public class AuthenticationTokenRest extends RestAddressableModel { + public static final String NAME = "shortlivedtoken"; + public static final String CATEGORY = RestAddressableModel.AUTHENTICATION; + + private String token; + + @Override + public String getCategory() { + return CATEGORY; + } + + @Override + public Class getController() { + return RestResourceController.class; + } + + @Override + public String getType() { + return NAME; + } + + public String getToken() { + return token; + } + + public void setToken(String token) { + this.token = token; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthnRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthnRest.java index dd225de1c7..fade90fe4d 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthnRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthnRest.java @@ -18,7 +18,7 @@ import org.dspace.app.rest.AuthenticationRestController; public class AuthnRest extends BaseObjectRest { public static final String NAME = "authn"; - public static final String CATEGORY = "authn"; + public static final String CATEGORY = RestAddressableModel.AUTHENTICATION; public String getCategory() { return CATEGORY; diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/BundleRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/BundleRest.java index 71f05c8333..dd4a80d488 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/BundleRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/BundleRest.java @@ -16,6 +16,10 @@ import com.fasterxml.jackson.annotation.JsonProperty; * @author Jelle Pelgrims (jelle.pelgrims at atmire.com) */ @LinksRest(links = { + @LinkRest( + name = BundleRest.ITEM, + method = "getItem" + ), @LinkRest( name = BundleRest.BITSTREAMS, method = 
"getBitstreams" @@ -30,6 +34,7 @@ public class BundleRest extends DSpaceObjectRest { public static final String PLURAL_NAME = "bundles"; public static final String CATEGORY = RestAddressableModel.CORE; + public static final String ITEM = "item"; public static final String BITSTREAMS = "bitstreams"; public static final String PRIMARY_BITSTREAM = "primaryBitstream"; diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/DiscoveryResultsRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/DiscoveryResultsRest.java index d45d948fa6..bf1d513a81 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/DiscoveryResultsRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/DiscoveryResultsRest.java @@ -27,7 +27,7 @@ public abstract class DiscoveryResultsRest extends BaseObjectRest { private List appliedFilters; private SearchResultsRest.Sorting sort; @JsonIgnore - private String dsoType; + private List dsoTypes; @JsonIgnore private List searchFilters; private String configuration; @@ -52,12 +52,12 @@ public abstract class DiscoveryResultsRest extends BaseObjectRest { this.query = query; } - public String getDsoType() { - return dsoType; + public List getDsoTypes() { + return dsoTypes; } - public void setDsoType(final String dsoType) { - this.dsoType = dsoType; + public void setDsoTypes(final List dsoTypes) { + this.dsoTypes = dsoTypes; } public String getScope() { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/EPersonRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/EPersonRest.java index 00881b9fd1..7b4c683322 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/EPersonRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/EPersonRest.java @@ -41,7 +41,7 @@ public class EPersonRest extends DSpaceObjectRest { private boolean requireCertificate = false; - private boolean selfRegistered = false; + private Boolean selfRegistered; @JsonProperty(access = Access.WRITE_ONLY) private String password; @@ -92,11 +92,11 @@ public class EPersonRest extends DSpaceObjectRest { this.requireCertificate = requireCertificate; } - public boolean isSelfRegistered() { + public Boolean isSelfRegistered() { return selfRegistered; } - public void setSelfRegistered(boolean selfRegistered) { + public void setSelfRegistered(Boolean selfRegistered) { this.selfRegistered = selfRegistered; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/MetadataFieldRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/MetadataFieldRest.java index 966b3afbbe..4524f82a68 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/MetadataFieldRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/MetadataFieldRest.java @@ -18,6 +18,7 @@ import org.dspace.app.rest.RestResourceController; */ public class MetadataFieldRest extends BaseObjectRest { public static final String NAME = "metadatafield"; + public static final String NAME_PLURAL = "metadatafields"; public static final String CATEGORY = RestAddressableModel.CORE; @JsonIgnore diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/ProcessRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/ProcessRest.java index 399d880f3b..8216e16171 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/ProcessRest.java +++ 
b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/ProcessRest.java @@ -28,6 +28,10 @@ import org.dspace.scripts.Process; @LinkRest( name = ProcessRest.FILE_TYPES, method = "getFileTypesFromProcess" + ), + @LinkRest( + name = ProcessRest.OUTPUT, + method = "getOutputFromProcess" ) }) public class ProcessRest extends BaseObjectRest { @@ -37,6 +41,8 @@ public class ProcessRest extends BaseObjectRest { public static final String FILES = "files"; public static final String FILE_TYPES = "filetypes"; + public static final String OUTPUT = "output"; + public String getCategory() { return CATEGORY; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RegistrationRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RegistrationRest.java new file mode 100644 index 0000000000..e8397f8ca7 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RegistrationRest.java @@ -0,0 +1,77 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +import java.util.UUID; + +import com.fasterxml.jackson.annotation.JsonProperty; +import org.dspace.app.rest.RestResourceController; + + +/** + * This class acts as the REST representation of the RegistrationData model class. + * This class acts as a data holder for the RegistrationResource + * Refer to {@link org.dspace.eperson.RegistrationData} for explanation about the properties + */ +public class RegistrationRest extends RestAddressableModel { + + public static final String NAME = "registration"; + public static final String NAME_PLURAL = "registrations"; + public static final String CATEGORY = EPERSON; + + private String email; + private UUID user; + + /** + * Generic getter for the email + * @return the email value of this RegisterRest + */ + public String getEmail() { + return email; + } + + /** + * Generic setter for the email + * @param email The email to be set on this RegisterRest + */ + public void setEmail(String email) { + this.email = email; + } + + /** + * Generic getter for the user + * @return the user value of this RegisterRest + */ + public UUID getUser() { + return user; + } + + /** + * Generic setter for the user + * @param user The user to be set on this RegisterRest + */ + public void setUser(UUID user) { + this.user = user; + } + + @Override + public String getCategory() { + return CATEGORY; + } + + @Override + public Class getController() { + return RestResourceController.class; + } + + @Override + @JsonProperty(access = JsonProperty.Access.READ_ONLY) + public String getType() { + return NAME; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RelationshipRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RelationshipRest.java index 16fcbbd0bf..e1aeb3ff6f 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RelationshipRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RelationshipRest.java @@ -27,7 +27,6 @@ public class RelationshipRest extends BaseObjectRest { @JsonIgnore private UUID rightId; - private int relationshipTypeId; private RelationshipTypeRest relationshipType; private int leftPlace; private int rightPlace; @@ -90,14 +89,6 @@ public class RelationshipRest extends BaseObjectRest { this.rightPlace = rightPlace; } - public int getRelationshipTypeId() { - 
return relationshipTypeId; - } - - public void setRelationshipTypeId(int relationshipTypeId) { - this.relationshipTypeId = relationshipTypeId; - } - public String getRightwardValue() { return rightwardValue; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RestModel.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RestModel.java index 940f4a4deb..0b32aedf92 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RestModel.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/RestModel.java @@ -32,6 +32,7 @@ public interface RestModel extends Serializable { public static final String WORKFLOW = "workflow"; public static final String AUTHORIZATION = "authz"; public static final String VERSIONING = "versioning"; + public static final String AUTHENTICATION = "authn"; public String getType(); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/SubmissionFormInputTypeRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/SubmissionFormInputTypeRest.java index 594d715b22..ff5481443b 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/SubmissionFormInputTypeRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/SubmissionFormInputTypeRest.java @@ -21,7 +21,6 @@ import com.fasterxml.jackson.annotation.JsonInclude.Include; public class SubmissionFormInputTypeRest { private String type; private String regex; - private AuthorityRest authority; public String getType() { return type; @@ -39,11 +38,4 @@ public class SubmissionFormInputTypeRest { this.regex = regex; } - public AuthorityRest getAuthority() { - return authority; - } - - public void setAuthority(AuthorityRest authority) { - this.authority = authority; - } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointCityRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointCityRest.java new file mode 100644 index 0000000000..369bcce4d1 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointCityRest.java @@ -0,0 +1,29 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +/** + * This class serves as a REST representation of a City data Point of a {@link UsageReportRest} from the DSpace + * statistics + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +public class UsageReportPointCityRest extends UsageReportPointRest { + public static final String NAME = "city"; + + @Override + public String getType() { + return NAME; + } + + @Override + public void setId(String id) { + super.id = id; + super.label = id; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointCountryRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointCountryRest.java new file mode 100644 index 0000000000..7189c48983 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointCountryRest.java @@ -0,0 +1,37 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +import 
org.dspace.statistics.util.LocationUtils; + +/** + * This class serves as a REST representation of a Country data Point of a {@link UsageReportRest} from the DSpace + * statistics + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +public class UsageReportPointCountryRest extends UsageReportPointRest { + public static final String NAME = "country"; + + @Override + public void setLabel(String label) { + super.label = label; + super.id = LocationUtils.getCountryCode(label); + } + + @Override + public void setId(String id) { + super.id = id; + super.label = LocationUtils.getCountryName(id); + } + + @Override + public String getType() { + return NAME; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointDateRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointDateRest.java new file mode 100644 index 0000000000..e9b4ddea15 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointDateRest.java @@ -0,0 +1,29 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +/** + * This class serves as a REST representation of a Date (month) data Point of a {@link UsageReportRest} from the DSpace + * statistics + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +public class UsageReportPointDateRest extends UsageReportPointRest { + public static final String NAME = "date"; + + @Override + public String getType() { + return NAME; + } + + @Override + public void setId(String id) { + super.id = id; + super.label = id; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointDsoTotalVisitsRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointDsoTotalVisitsRest.java new file mode 100644 index 0000000000..fd8d334786 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointDsoTotalVisitsRest.java @@ -0,0 +1,36 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +/** + * This class serves as a REST representation of a TotalVisit data Point of a DSO's {@link UsageReportRest} from the + * DSpace statistics + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +public class UsageReportPointDsoTotalVisitsRest extends UsageReportPointRest { + + /** + * Type of dso a UsageReport is being requested of (e.g. item, bitstream, ...) + */ + private String type; + + @Override + public String getType() { + return this.type; + } + + /** + * Sets the type of this {@link UsageReportPointRest} object, should be type of dso concerned (e.g. item, bitstream, ...) + * + * @param type Type of dso a {@link UsageReportRest} object is being requested of (e.g. item, bitstream, ...) 
+ */ + public void setType(String type) { + this.type = type; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointRest.java new file mode 100644 index 0000000000..feb006486f --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportPointRest.java @@ -0,0 +1,123 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +import java.util.HashMap; +import java.util.Map; + +import org.dspace.app.rest.RestResourceController; + +/** + * This class serves as a REST representation of a Point of a {@link UsageReportRest} from the DSpace statistics + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +public class UsageReportPointRest extends BaseObjectRest { + public static final String NAME = "point"; + public static final String CATEGORY = RestModel.STATISTICS; + protected String id; + protected String label; + private Map values; + + /** + * Returns the category of this Rest object, {@link #CATEGORY} + * + * @return The category of this Rest object, {@link #CATEGORY} + */ + @Override + public String getCategory() { + return CATEGORY; + } + + /** + * Return controller class responsible for this Rest object + * + * @return Controller class responsible for this Rest object + */ + @Override + public Class getController() { + return RestResourceController.class; + } + + /** + * Returns the type of this {@link UsageReportPointRest} object + * + * @return Type of this {@link UsageReportPointRest} object + */ + @Override + public String getType() { + return NAME; + } + + /** + * Returns the values of this {@link UsageReportPointRest} object, containing the amount of views + * + * @return The values of this {@link UsageReportPointRest} object, containing the amount of views + */ + public Map getValues() { + return values; + } + + /** + * Returns the id of this {@link UsageReportPointRest} object, of the form: type of UsageReport_dso uuid + * + * @return The id of this {@link UsageReportPointRest} object, of the form: type of UsageReport_dso uuid + */ + public String getId() { + return id; + } + + /** + * Set the id of this {@link UsageReportPointRest} object, of the form: type of UsageReport_dso uuid + * + * @param id The id of this {@link UsageReportPointRest} object, of the form: type of UsageReport_dso uuid + */ + public void setId(String id) { + this.id = id; + } + + /** + * Add a value pair to this {@link UsageReportPointRest} object's values + * + * @param key Key of new value pair + * @param value Value of new value pair + */ + public void addValue(String key, Integer value) { + if (values == null) { + values = new HashMap<>(); + } + values.put(key, value); + } + + /** + * Sets all values of this {@link UsageReportPointRest} object + * + * @param values All values of this {@link UsageReportPointRest} object + */ + public void setValues(Map values) { + this.values = values; + } + + /** + * Returns label of this {@link UsageReportPointRest} object, e.g. the dso's name + * + * @return Label of this {@link UsageReportPointRest} object, e.g. the dso's name + */ + public String getLabel() { + return label; + } + + /** + * Sets the label of this {@link UsageReportPointRest} object, e.g. 
the dso's name + * + * @param label Label of this {@link UsageReportPointRest} object, e.g. the dso's name + */ + public void setLabel(String label) { + this.label = label; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportRest.java new file mode 100644 index 0000000000..a59535fb94 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/UsageReportRest.java @@ -0,0 +1,119 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +import java.util.ArrayList; +import java.util.List; + +import com.fasterxml.jackson.annotation.JsonProperty; +import org.dspace.app.rest.RestResourceController; + +/** + * This class serves as a REST representation of a Usage Report from the DSpace statistics + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +public class UsageReportRest extends BaseObjectRest { + public static final String NAME = "usagereport"; + public static final String CATEGORY = RestModel.STATISTICS; + + @JsonProperty(value = "report-type") + private String reportType; + private List points; + + /** + * Returns the category of this Rest object, {@link #CATEGORY} + * + * @return The category of this Rest object, {@link #CATEGORY} + */ + @Override + public String getCategory() { + return CATEGORY; + } + + /** + * Return controller class responsible for this Rest object + * + * @return Controller class responsible for this Rest object + */ + @Override + public Class getController() { + return RestResourceController.class; + } + + /** + * Returns the type of this {@link UsageReportRest} object + * + * @return Type of this {@link UsageReportRest} object + */ + @Override + public String getType() { + return NAME; + } + + /** + * Returns the report type of this UsageReport, options listed in + * {@link org.dspace.app.rest.utils.UsageReportUtils}, e.g. + * {@link org.dspace.app.rest.utils.UsageReportUtils#TOTAL_VISITS_REPORT_ID} + * + * @return The report type of this UsageReport, options listed in + * {@link org.dspace.app.rest.utils.UsageReportUtils}, e.g. + * {@link org.dspace.app.rest.utils.UsageReportUtils#TOTAL_VISITS_REPORT_ID} + */ + public String getReportType() { + return reportType; + } + + /** + * Sets the report type of this UsageReport, options listed in + * {@link org.dspace.app.rest.utils.UsageReportUtils}, e.g. + * {@link org.dspace.app.rest.utils.UsageReportUtils#TOTAL_VISITS_REPORT_ID} + * + * @param reportType The report type of this UsageReport, options listed in + * {@link org.dspace.app.rest.utils.UsageReportUtils}, e.g. 
+ * {@link org.dspace.app.rest.utils.UsageReportUtils#TOTAL_VISITS_REPORT_ID} */ + public void setReportType(String reportType) { + this.reportType = reportType; + } + + /** + * Returns the list of {@link UsageReportPointRest} objects attached to this {@link UsageReportRest} object, or + * empty list if none + * + * @return The list of {@link UsageReportPointRest} objects attached to this {@link UsageReportRest} object, or + * empty list if none + */ + public List<UsageReportPointRest> getPoints() { + if (points == null) { + points = new ArrayList<>(); + } + return points; + } + + /** + * Adds a {@link UsageReportPointRest} object to this {@link UsageReportRest} object + * + * @param point {@link UsageReportPointRest} to add to this {@link UsageReportRest} object + */ + public void addPoint(UsageReportPointRest point) { + if (points == null) { + points = new ArrayList<>(); + } + points.add(point); + } + + /** + * Set all {@link UsageReportPointRest} objects on this {@link UsageReportRest} object + * + * @param points All {@link UsageReportPointRest} objects on this {@link UsageReportRest} object + */ + public void setPoints(List<UsageReportPointRest> points) { + this.points = points; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyEntryDetailsRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyEntryDetailsRest.java new file mode 100644 index 0000000000..42644c8c85 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyEntryDetailsRest.java @@ -0,0 +1,104 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model; + +import java.util.Map; + +import com.fasterxml.jackson.annotation.JsonIgnore; +import org.dspace.app.rest.RestResourceController; + +/** + * The Vocabulary Entry Details REST Resource + * + * @author Andrea Bollini (andrea.bollini at 4science.it) + */ +@LinksRest(links = { + @LinkRest(name = VocabularyEntryDetailsRest.PARENT, method = "getParent"), + @LinkRest(name = VocabularyEntryDetailsRest.CHILDREN, method = "getChildren") + }) +public class VocabularyEntryDetailsRest extends BaseObjectRest<String> { + public static final String NAME = "vocabularyEntryDetail"; + public static final String PARENT = "parent"; + public static final String CHILDREN = "children"; + private String display; + private String value; + private Map<String, String> otherInformation; + private boolean selectable; + @JsonIgnore + private boolean inHierarchicalVocabulary = false; + + @JsonIgnore + private String vocabularyName; + + public String getDisplay() { + return display; + } + + public void setDisplay(String value) { + this.display = value; + } + + public Map<String, String> getOtherInformation() { + return otherInformation; + } + + public void setOtherInformation(Map<String, String> otherInformation) { + this.otherInformation = otherInformation; + } + + public String getValue() { + return value; + } + + public void setValue(String value) { + this.value = value; + } + + public static String getName() { + return NAME; + } + + public String getVocabularyName() { + return vocabularyName; + } + + public void setVocabularyName(String vocabularyName) { + this.vocabularyName = vocabularyName; + } + + @Override + public String getCategory() { + return VocabularyRest.CATEGORY; + } + + @Override + public String getType() { + return VocabularyEntryDetailsRest.NAME; + } + + @Override + public Class
getController() { + return RestResourceController.class; + } + + public Boolean isSelectable() { + return selectable; + } + + public void setSelectable(Boolean selectable) { + this.selectable = selectable; + } + + public void setInHierarchicalVocabulary(boolean isInHierarchicalVocabulary) { + this.inHierarchicalVocabulary = isInHierarchicalVocabulary; + } + + public boolean isInHierarchicalVocabulary() { + return inHierarchicalVocabulary; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthorityEntryRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyEntryRest.java similarity index 53% rename from dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthorityEntryRest.java rename to dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyEntryRest.java index 9fcc01d972..713d4c5209 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthorityEntryRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyEntryRest.java @@ -10,30 +10,28 @@ package org.dspace.app.rest.model; import java.util.Map; import com.fasterxml.jackson.annotation.JsonIgnore; -import org.dspace.app.rest.RestResourceController; +import com.fasterxml.jackson.annotation.JsonInclude; +import com.fasterxml.jackson.annotation.JsonInclude.Include; /** - * The Authority Entry REST Resource + * An entry in a Vocabulary * * @author Andrea Bollini (andrea.bollini at 4science.it) */ -public class AuthorityEntryRest extends RestAddressableModel { - public static final String NAME = "authorityEntry"; - private String id; +public class VocabularyEntryRest implements RestModel { + public static final String NAME = "vocabularyEntry"; + + @JsonInclude(Include.NON_NULL) + private String authority; private String display; private String value; private Map otherInformation; + /** + * The Vocabulary Entry Details resource if available related to this entry + */ @JsonIgnore - private String authorityName; - - public String getId() { - return id; - } - - public void setId(String id) { - this.id = id; - } + private VocabularyEntryDetailsRest vocabularyEntryDetailsRest; public String getDisplay() { return display; @@ -59,31 +57,24 @@ public class AuthorityEntryRest extends RestAddressableModel { this.value = value; } - public static String getName() { - return NAME; + public void setAuthority(String authority) { + this.authority = authority; } - public String getAuthorityName() { - return authorityName; + public String getAuthority() { + return authority; } - public void setAuthorityName(String authorityName) { - this.authorityName = authorityName; + public void setVocabularyEntryDetailsRest(VocabularyEntryDetailsRest vocabularyEntryDetailsRest) { + this.vocabularyEntryDetailsRest = vocabularyEntryDetailsRest; } - @Override - public String getCategory() { - return AuthorityRest.CATEGORY; + public VocabularyEntryDetailsRest getVocabularyEntryDetailsRest() { + return vocabularyEntryDetailsRest; } @Override public String getType() { - return AuthorityRest.NAME; + return VocabularyEntryRest.NAME; } - - @Override - public Class getController() { - return RestResourceController.class; - } - } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthorityRest.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyRest.java similarity index 68% rename from dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthorityRest.java rename to 
dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyRest.java index 3245e6f877..cc848b945b 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/AuthorityRest.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/VocabularyRest.java @@ -10,25 +10,20 @@ package org.dspace.app.rest.model; import org.dspace.app.rest.RestResourceController; /** - * The authority REST resource + * The vocabulary REST resource * * @author Andrea Bollini (andrea.bollini at 4science.it) */ @LinksRest(links = { - @LinkRest(name = AuthorityRest.ENTRIES, - method = "query" + @LinkRest(name = VocabularyRest.ENTRIES, + method = "filter" ), - @LinkRest( - name = AuthorityRest.ENTRY, - method = "getResource" - ) }) -public class AuthorityRest extends BaseObjectRest { +public class VocabularyRest extends BaseObjectRest { - public static final String NAME = "authority"; - public static final String CATEGORY = RestAddressableModel.INTEGRATION; + public static final String NAME = "vocabulary"; + public static final String CATEGORY = RestAddressableModel.SUBMISSION; public static final String ENTRIES = "entries"; - public static final String ENTRY = "entryValues"; private String name; @@ -36,7 +31,7 @@ public class AuthorityRest extends BaseObjectRest { private boolean hierarchical; - private boolean identifier; + private Integer preloadLevel; @Override public String getId() { @@ -67,6 +62,14 @@ public class AuthorityRest extends BaseObjectRest { this.hierarchical = hierarchical; } + public Integer getPreloadLevel() { + return preloadLevel; + } + + public void setPreloadLevel(Integer preloadLevel) { + this.preloadLevel = preloadLevel; + } + @Override public String getType() { return NAME; @@ -81,12 +84,4 @@ public class AuthorityRest extends BaseObjectRest { public String getCategory() { return CATEGORY; } - - public boolean hasIdentifier() { - return identifier; - } - - public void setIdentifier(boolean identifier) { - this.identifier = identifier; - } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthenticationTokenResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthenticationTokenResource.java new file mode 100644 index 0000000000..e46831b2f7 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthenticationTokenResource.java @@ -0,0 +1,20 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model.hateoas; + +import org.dspace.app.rest.model.AuthenticationTokenRest; + +/** + * Token resource, wraps the AuthenticationToken object + */ +public class AuthenticationTokenResource extends HALResource { + + public AuthenticationTokenResource(AuthenticationTokenRest content) { + super(content); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthorityEntryResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthorityEntryResource.java deleted file mode 100644 index c99ebd6f2e..0000000000 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthorityEntryResource.java +++ /dev/null @@ -1,26 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online 
at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.rest.model.hateoas; - -import org.dspace.app.rest.model.AuthorityEntryRest; -import org.dspace.app.rest.model.hateoas.annotations.RelNameDSpaceResource; - -/** - * Authority Rest HAL Resource. The HAL Resource wraps the REST Resource adding - * support for the links and embedded resources - * - * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) - */ -@RelNameDSpaceResource(AuthorityEntryRest.NAME) -public class AuthorityEntryResource extends HALResource { - - - public AuthorityEntryResource(AuthorityEntryRest entry) { - super(entry); - } -} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/HALResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/HALResource.java index 31e2c672e3..2631b63417 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/HALResource.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/HALResource.java @@ -8,11 +8,13 @@ package org.dspace.app.rest.model.hateoas; import java.util.HashMap; +import java.util.List; import java.util.Map; import com.fasterxml.jackson.annotation.JsonInclude; import com.fasterxml.jackson.annotation.JsonProperty; import com.fasterxml.jackson.annotation.JsonUnwrapped; +import org.apache.commons.lang3.StringUtils; import org.springframework.hateoas.EntityModel; import org.springframework.hateoas.Link; @@ -49,6 +51,15 @@ public abstract class HALResource extends EntityModel { public EntityModel add(Link link) { if (!hasLink(link.getRel())) { return super.add(link); + } else { + String name = link.getName(); + if (StringUtils.isNotBlank(name)) { + List list = this.getLinks(link.getRel()); + // If a link of this name doesn't already exist in the list, add it + if (!list.stream().anyMatch((l -> StringUtils.equalsIgnoreCase(l.getName(), name)))) { + super.add(link); + } + } } return this; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/RegistrationResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/RegistrationResource.java new file mode 100644 index 0000000000..da53b680d6 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/RegistrationResource.java @@ -0,0 +1,22 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model.hateoas; + +import org.dspace.app.rest.model.RegistrationRest; +import org.dspace.app.rest.model.hateoas.annotations.RelNameDSpaceResource; + +/** + * Registration HAL Resource. 
This resource adds the data from the REST object together with embedded objects + * and a set of links if applicable + */ +@RelNameDSpaceResource(RegistrationRest.NAME) +public class RegistrationResource extends HALResource { + public RegistrationResource(RegistrationRest content) { + super(content); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/UsageReportResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/UsageReportResource.java new file mode 100644 index 0000000000..a6a3397e25 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/UsageReportResource.java @@ -0,0 +1,24 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model.hateoas; + +import org.dspace.app.rest.model.UsageReportRest; +import org.dspace.app.rest.model.hateoas.annotations.RelNameDSpaceResource; +import org.dspace.app.rest.utils.Utils; + +/** + * The Resource representation of a {@link UsageReportRest} object + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +@RelNameDSpaceResource(UsageReportRest.NAME) +public class UsageReportResource extends DSpaceResource { + public UsageReportResource(UsageReportRest content, Utils utils) { + super(content, utils); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyEntryDetailsResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyEntryDetailsResource.java new file mode 100644 index 0000000000..0467b29cef --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyEntryDetailsResource.java @@ -0,0 +1,30 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model.hateoas; + +import org.dspace.app.rest.model.VocabularyEntryDetailsRest; +import org.dspace.app.rest.model.hateoas.annotations.RelNameDSpaceResource; +import org.dspace.app.rest.utils.Utils; + +/** + * Vocabulary Entry Details Rest HAL Resource. 
The HAL Resource wraps the REST Resource adding + * support for the links and embedded resources + * + * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) + */ +@RelNameDSpaceResource(VocabularyEntryDetailsRest.NAME) +public class VocabularyEntryDetailsResource extends DSpaceResource { + + public VocabularyEntryDetailsResource(VocabularyEntryDetailsRest entry, Utils utils) { + super(entry, utils); + if (entry.isInHierarchicalVocabulary()) { + add(utils.linkToSubResource(entry, VocabularyEntryDetailsRest.PARENT)); + add(utils.linkToSubResource(entry, VocabularyEntryDetailsRest.CHILDREN)); + } + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyEntryResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyEntryResource.java new file mode 100644 index 0000000000..c29baa9fcd --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyEntryResource.java @@ -0,0 +1,24 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model.hateoas; + +import org.dspace.app.rest.model.VocabularyEntryRest; +import org.dspace.app.rest.model.hateoas.annotations.RelNameDSpaceResource; + +/** + * Vocabulary Entry Rest HAL Resource. The HAL Resource wraps the REST Resource + * adding support for the links and embedded resources + * + * @author Mykhaylo Boychuk (mykhaylo.boychuk at 4science.it) + */ +@RelNameDSpaceResource(VocabularyEntryRest.NAME) +public class VocabularyEntryResource extends HALResource { + public VocabularyEntryResource(VocabularyEntryRest sd) { + super(sd); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthorityResource.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyResource.java similarity index 61% rename from dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthorityResource.java rename to dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyResource.java index 0e153097b4..4a2ec01d33 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/AuthorityResource.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/hateoas/VocabularyResource.java @@ -7,7 +7,7 @@ */ package org.dspace.app.rest.model.hateoas; -import org.dspace.app.rest.model.AuthorityRest; +import org.dspace.app.rest.model.VocabularyRest; import org.dspace.app.rest.model.hateoas.annotations.RelNameDSpaceResource; import org.dspace.app.rest.utils.Utils; @@ -17,13 +17,10 @@ import org.dspace.app.rest.utils.Utils; * * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) */ -@RelNameDSpaceResource(AuthorityRest.NAME) -public class AuthorityResource extends DSpaceResource { - public AuthorityResource(AuthorityRest sd, Utils utils) { +@RelNameDSpaceResource(VocabularyRest.NAME) +public class VocabularyResource extends DSpaceResource { + public VocabularyResource(VocabularyRest sd, Utils utils) { super(sd, utils); - if (sd.hasIdentifier()) { - add(utils.linkToSubResource(sd, AuthorityRest.ENTRY)); - } - add(utils.linkToSubResource(sd, AuthorityRest.ENTRIES)); + add(utils.linkToSubResource(sd, VocabularyRest.ENTRIES)); } } diff --git 
a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/submit/SelectableMetadata.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/submit/SelectableMetadata.java index 06f2cfc459..a203bb2721 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/submit/SelectableMetadata.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/submit/SelectableMetadata.java @@ -7,6 +7,9 @@ */ package org.dspace.app.rest.model.submit; +import com.fasterxml.jackson.annotation.JsonInclude; +import com.fasterxml.jackson.annotation.JsonInclude.Include; + /** * The SelectableMetadata REST Resource. It is not addressable directly, only * used as inline object in the InputForm resource. @@ -24,7 +27,9 @@ package org.dspace.app.rest.model.submit; public class SelectableMetadata { private String metadata; private String label; - private String authority; + @JsonInclude(Include.NON_NULL) + private String controlledVocabulary; + @JsonInclude(Include.NON_NULL) private Boolean closed = false; public String getMetadata() { @@ -43,12 +48,12 @@ public class SelectableMetadata { this.label = label; } - public void setAuthority(String authority) { - this.authority = authority; + public void setControlledVocabulary(String vocabularyName) { + this.controlledVocabulary = vocabularyName; } - public String getAuthority() { - return authority; + public String getControlledVocabulary() { + return controlledVocabulary; } public Boolean isClosed() { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/wrapper/AuthenticationToken.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/wrapper/AuthenticationToken.java new file mode 100644 index 0000000000..30301ffd77 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/model/wrapper/AuthenticationToken.java @@ -0,0 +1,28 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.model.wrapper; + +/** + * This class represents an authentication token. 
It acts as a wrapper for a String object to differentiate between + * actual Strings and AuthenticationToken + */ +public class AuthenticationToken { + private String token; + + public AuthenticationToken(String token) { + this.token = token; + } + + public String getToken() { + return token; + } + + public void setToken(String token) { + this.token = token; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AbstractDSpaceRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AbstractDSpaceRestRepository.java index a64f8af5df..f5ef703700 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AbstractDSpaceRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AbstractDSpaceRestRepository.java @@ -7,10 +7,15 @@ */ package org.dspace.app.rest.repository; +import java.util.Enumeration; +import java.util.Locale; + +import org.apache.commons.lang3.StringUtils; import org.dspace.app.rest.converter.ConverterService; import org.dspace.app.rest.utils.ContextUtil; import org.dspace.app.rest.utils.Utils; import org.dspace.core.Context; +import org.dspace.core.I18nUtil; import org.dspace.services.RequestService; import org.dspace.services.model.Request; import org.dspace.utils.DSpace; @@ -33,11 +38,47 @@ public abstract class AbstractDSpaceRestRepository { protected RequestService requestService = new DSpace().getRequestService(); protected Context obtainContext() { + Context context = null; Request currentRequest = requestService.getCurrentRequest(); - return ContextUtil.obtainContext(currentRequest.getServletRequest()); + context = ContextUtil.obtainContext(currentRequest.getServletRequest()); + Locale currentLocale = getLocale(context, currentRequest); + context.setCurrentLocale(currentLocale); + return context; } public RequestService getRequestService() { return requestService; } + + private Locale getLocale(Context context, Request request) { + Locale userLocale = null; + Locale supportedLocale = null; + + // Locales requested from client + String locale = request.getHttpServletRequest().getHeader("Accept-Language"); + if (StringUtils.isNotBlank(locale)) { + Enumeration locales = request.getHttpServletRequest().getLocales(); + if (locales != null) { + while (locales.hasMoreElements()) { + Locale current = locales.nextElement(); + if (I18nUtil.isSupportedLocale(current)) { + userLocale = current; + break; + } + } + } + } + if (userLocale == null && context.getCurrentUser() != null) { + String userLanguage = context.getCurrentUser().getLanguage(); + if (userLanguage != null) { + userLocale = new Locale(userLanguage); + } + } + if (userLocale == null) { + return I18nUtil.getDefaultLocale(); + } + supportedLocale = I18nUtil.getSupportedLocale(userLocale); + return supportedLocale; + } + } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityEntryLinkRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityEntryLinkRepository.java deleted file mode 100644 index 0c3ec16299..0000000000 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityEntryLinkRepository.java +++ /dev/null @@ -1,82 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.rest.repository; - -import 
java.sql.SQLException; -import java.util.ArrayList; -import java.util.List; -import java.util.UUID; -import javax.annotation.Nullable; -import javax.servlet.http.HttpServletRequest; - -import org.apache.commons.lang3.StringUtils; -import org.dspace.app.rest.model.AuthorityEntryRest; -import org.dspace.app.rest.model.AuthorityRest; -import org.dspace.app.rest.projection.Projection; -import org.dspace.app.rest.utils.AuthorityUtils; -import org.dspace.content.Collection; -import org.dspace.content.authority.Choice; -import org.dspace.content.authority.Choices; -import org.dspace.content.authority.service.ChoiceAuthorityService; -import org.dspace.content.service.CollectionService; -import org.dspace.core.Context; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.data.domain.Page; -import org.springframework.data.domain.PageImpl; -import org.springframework.data.domain.Pageable; -import org.springframework.security.access.prepost.PreAuthorize; -import org.springframework.stereotype.Component; - -/** - * Controller for exposition of authority services - * - * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) - */ -@Component(AuthorityRest.CATEGORY + "." + AuthorityRest.NAME + "." + AuthorityRest.ENTRIES) -public class AuthorityEntryLinkRepository extends AbstractDSpaceRestRepository - implements LinkRestRepository { - - @Autowired - private ChoiceAuthorityService cas; - - @Autowired - private CollectionService cs; - - @Autowired - private AuthorityUtils authorityUtils; - - @PreAuthorize("hasAuthority('AUTHENTICATED')") - public Page query(@Nullable HttpServletRequest request, String name, - @Nullable Pageable optionalPageable, Projection projection) { - Context context = obtainContext(); - String query = request == null ? null : request.getParameter("query"); - String metadata = request == null ? null : request.getParameter("metadata"); - String uuidCollectìon = request == null ? 
null : request.getParameter("uuid"); - Collection collection = null; - if (StringUtils.isNotBlank(uuidCollectìon)) { - try { - collection = cs.find(context, UUID.fromString(uuidCollectìon)); - } catch (SQLException e) { - throw new RuntimeException(e); - } - } - List results = new ArrayList<>(); - Pageable pageable = utils.getPageable(optionalPageable); - if (StringUtils.isNotBlank(metadata)) { - String[] tokens = org.dspace.core.Utils.tokenize(metadata); - String fieldKey = org.dspace.core.Utils.standardize(tokens[0], tokens[1], tokens[2], "_"); - Choices choices = cas.getMatches(fieldKey, query, collection, Math.toIntExact(pageable.getOffset()), - pageable.getPageSize(), - context.getCurrentLocale().toString()); - for (Choice value : choices.values) { - results.add(authorityUtils.convertEntry(value, name, projection)); - } - } - return new PageImpl<>(results, pageable, results.size()); - } -} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityEntryValueLinkRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityEntryValueLinkRepository.java deleted file mode 100644 index c2e3c557d4..0000000000 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityEntryValueLinkRepository.java +++ /dev/null @@ -1,60 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.rest.repository; - -import javax.servlet.http.HttpServletRequest; - -import org.dspace.app.rest.model.AuthorityEntryRest; -import org.dspace.app.rest.model.AuthorityRest; -import org.dspace.app.rest.projection.Projection; -import org.dspace.app.rest.utils.AuthorityUtils; -import org.dspace.content.authority.Choice; -import org.dspace.content.authority.ChoiceAuthority; -import org.dspace.content.authority.service.ChoiceAuthorityService; -import org.dspace.core.Context; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.data.domain.Pageable; -import org.springframework.data.rest.webmvc.ResourceNotFoundException; -import org.springframework.security.access.prepost.PreAuthorize; -import org.springframework.stereotype.Component; - -/** - * Controller for exposition of authority services - * - * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) - */ -@Component(AuthorityRest.CATEGORY + "." + AuthorityRest.NAME + "." + AuthorityRest.ENTRY) -public class AuthorityEntryValueLinkRepository extends AbstractDSpaceRestRepository - implements LinkRestRepository { - - @Autowired - private ChoiceAuthorityService cas; - - @Autowired - private AuthorityUtils authorityUtils; - - @PreAuthorize("hasAuthority('AUTHENTICATED')") - public AuthorityEntryRest getResource(HttpServletRequest request, String name, String relId, - Pageable pageable, Projection projection) { - Context context = obtainContext(); - ChoiceAuthority choiceAuthority = cas.getChoiceAuthorityByAuthorityName(name); - Choice choice = choiceAuthority.getChoice(null, relId, context.getCurrentLocale().toString()); - if (choice == null) { - throw new ResourceNotFoundException("The authority was not found"); - } - return authorityUtils.convertEntry(choice, name, projection); - } - - /** - * Not embeddable because this is not currently a pageable subresource. 
- */ - @Override - public boolean isEmbeddableRelation(Object data, String name) { - return false; - } -} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityRestRepository.java deleted file mode 100644 index d5dda5a0bc..0000000000 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorityRestRepository.java +++ /dev/null @@ -1,82 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.rest.repository; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; -import java.util.Set; - -import org.dspace.app.rest.DiscoverableEndpointsService; -import org.dspace.app.rest.model.AuthorityRest; -import org.dspace.app.rest.model.AuthorizationRest; -import org.dspace.app.rest.projection.Projection; -import org.dspace.app.rest.utils.AuthorityUtils; -import org.dspace.content.authority.ChoiceAuthority; -import org.dspace.content.authority.service.ChoiceAuthorityService; -import org.dspace.core.Context; -import org.springframework.beans.factory.InitializingBean; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.data.domain.Page; -import org.springframework.data.domain.PageImpl; -import org.springframework.data.domain.Pageable; -import org.springframework.hateoas.Link; -import org.springframework.security.access.prepost.PreAuthorize; -import org.springframework.stereotype.Component; - -/** - * Controller for exposition of authority services - * - * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) - */ -@Component(AuthorityRest.CATEGORY + "." 
+ AuthorityRest.NAME) -public class AuthorityRestRepository extends DSpaceRestRepository - implements InitializingBean { - - @Autowired - private ChoiceAuthorityService cas; - - @Autowired - private AuthorityUtils authorityUtils; - - @Autowired - DiscoverableEndpointsService discoverableEndpointsService; - - @PreAuthorize("hasAuthority('AUTHENTICATED')") - @Override - public AuthorityRest findOne(Context context, String name) { - ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(name); - return authorityUtils.convertAuthority(source, name, utils.obtainProjection()); - } - - @PreAuthorize("hasAuthority('AUTHENTICATED')") - @Override - public Page findAll(Context context, Pageable pageable) { - Set authoritiesName = cas.getChoiceAuthoritiesNames(); - List results = new ArrayList<>(); - Projection projection = utils.obtainProjection(); - for (String authorityName : authoritiesName) { - ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(authorityName); - AuthorityRest result = authorityUtils.convertAuthority(source, authorityName, projection); - results.add(result); - } - return new PageImpl<>(results, pageable, results.size()); - } - - @Override - public Class getDomainClass() { - return AuthorityRest.class; - } - - @Override - public void afterPropertiesSet() throws Exception { - discoverableEndpointsService.register(this, Arrays.asList( - new Link("/api/" + AuthorizationRest.CATEGORY + "/" + AuthorizationRest.NAME + "/search", - AuthorizationRest.NAME + "-search"))); - } -} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorizationRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorizationRestRepository.java index 175fcd49f9..704a4191dd 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorizationRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/AuthorizationRestRepository.java @@ -7,6 +7,10 @@ */ package org.dspace.app.rest.repository; +import static java.util.Collections.emptyList; +import static java.util.Collections.singletonList; +import static org.apache.commons.lang3.StringUtils.isNotBlank; + import java.sql.SQLException; import java.util.ArrayList; import java.util.List; @@ -145,9 +149,11 @@ public class AuthorizationRestRepository extends DSpaceRestRepository findByObject(@Parameter(value = "uri", required = true) String uri, - @Parameter(value = "eperson") UUID epersonUuid, + @Parameter(value = "eperson") UUID epersonUuid, @Parameter(value = "feature") String featureName, Pageable pageable) throws AuthorizeException, SQLException { + Context context = obtainContext(); + BaseObjectRest obj = utils.getBaseObjectRestFromUri(context, uri); if (obj == null) { return null; @@ -162,11 +168,16 @@ public class AuthorizationRestRepository extends DSpaceRestRepository features = authorizationFeatureService.findByResourceType(obj.getUniqueType()); - List authorizations = new ArrayList(); - for (AuthorizationFeature f : features) { - if (authorizationFeatureService.isAuthorized(context, f, obj)) { - authorizations.add(new Authorization(user, f, obj)); + List authorizations; + if (isNotBlank(featureName)) { + authorizations = findByObjectAndFeature(context, user, obj, featureName); + } else { + List features = authorizationFeatureService.findByResourceType(obj.getUniqueType()); + authorizations = new ArrayList<>(); + for (AuthorizationFeature f : features) { + if (authorizationFeatureService.isAuthorized(context, f, obj)) { 
+ authorizations.add(new Authorization(user, f, obj)); + } } } @@ -177,57 +188,17 @@ public class AuthorizationRestRepository extends DSpaceRestRepository findByObjectAndFeature( + Context context, EPerson user, BaseObjectRest obj, String featureName + ) throws SQLException { + + AuthorizationFeature feature = authorizationFeatureService.find(featureName); + + if (!authorizationFeatureService.isAuthorized(context, feature, obj)) { + return emptyList(); } - EPerson currUser = context.getCurrentUser(); - // get the user specified in the requested parameters, can be null for anonymous - EPerson user = getUserFromRequestParameter(context, epersonUuid); - if (currUser != user) { - // Temporarily change the Context's current user in order to retrieve - // authorizations based on that user - context.switchContextUser(user); - } - AuthorizationFeature feature = authorizationFeatureService.find(featureName); - AuthorizationRest authorizationRest = null; - if (authorizationFeatureService.isAuthorized(context, feature, obj)) { - Authorization authz = new Authorization(); - authz.setEperson(user); - authz.setFeature(feature); - authz.setObject(obj); - authorizationRest = converter.toRest(authz, utils.obtainProjection()); - } - if (currUser != user) { - // restore the real current user - context.restoreContextUser(); - } - return authorizationRest; + return singletonList(new Authorization(user, feature, obj)); } /** @@ -242,25 +213,27 @@ public class AuthorizationRestRepository extends DSpaceRestRepository upload(HttpServletRequest request, MultipartFile uploadfile) + public Iterable upload(HttpServletRequest request, List uploadfile) throws SQLException, FileNotFoundException, IOException, AuthorizeException { Context context = obtainContext(); Iterable entity = upload(context, request, uploadfile); @@ -486,7 +486,7 @@ public abstract class DSpaceRestRepository upload(Context context, HttpServletRequest request, - MultipartFile uploadfile) + List uploadfile) throws SQLException, FileNotFoundException, IOException, AuthorizeException { throw new RepositoryMethodNotImplementedException("No implementation found; Method not allowed!", ""); } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/DiscoveryRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/DiscoveryRestRepository.java index c0d40c4f3e..682ca834b8 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/DiscoveryRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/DiscoveryRestRepository.java @@ -89,7 +89,7 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { return discoverConfigurationConverter.convert(discoveryConfiguration, utils.obtainProjection()); } - public SearchResultsRest getSearchObjects(final String query, final String dsoType, final String dsoScope, + public SearchResultsRest getSearchObjects(final String query, final List dsoTypes, final String dsoScope, final String configuration, final List searchFilters, final Pageable page, final Projection projection) { @@ -103,7 +103,7 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { try { discoverQuery = queryBuilder - .buildQuery(context, scopeObject, discoveryConfiguration, query, searchFilters, dsoType, page); + .buildQuery(context, scopeObject, discoveryConfiguration, query, searchFilters, dsoTypes, page); searchResult = searchService.search(context, scopeObject, discoverQuery); } catch 
(SearchServiceException e) { @@ -112,7 +112,7 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { } return discoverResultConverter - .convert(context, query, dsoType, configuration, dsoScope, searchFilters, page, searchResult, + .convert(context, query, dsoTypes, configuration, dsoScope, searchFilters, page, searchResult, discoveryConfiguration, projection); } @@ -130,7 +130,7 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { return discoverSearchSupportConverter.convert(); } - public FacetResultsRest getFacetObjects(String facetName, String prefix, String query, String dsoType, + public FacetResultsRest getFacetObjects(String facetName, String prefix, String query, List dsoTypes, String dsoScope, final String configuration, List searchFilters, Pageable page) { Context context = obtainContext(); @@ -143,7 +143,7 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { DiscoverQuery discoverQuery = null; try { discoverQuery = queryBuilder.buildFacetQuery(context, scopeObject, discoveryConfiguration, prefix, query, - searchFilters, dsoType, page, facetName); + searchFilters, dsoTypes, page, facetName); searchResult = searchService.search(context, scopeObject, discoverQuery); } catch (SearchServiceException e) { @@ -152,12 +152,12 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { } FacetResultsRest facetResultsRest = discoverFacetResultsConverter.convert(context, facetName, prefix, query, - dsoType, dsoScope, searchFilters, searchResult, discoveryConfiguration, page, + dsoTypes, dsoScope, searchFilters, searchResult, discoveryConfiguration, page, utils.obtainProjection()); return facetResultsRest; } - public SearchResultsRest getAllFacets(String query, String dsoType, String dsoScope, String configuration, + public SearchResultsRest getAllFacets(String query, List dsoTypes, String dsoScope, String configuration, List searchFilters) { Context context = obtainContext(); @@ -171,14 +171,14 @@ public class DiscoveryRestRepository extends AbstractDSpaceRestRepository { try { discoverQuery = queryBuilder - .buildQuery(context, scopeObject, discoveryConfiguration, query, searchFilters, dsoType, page); + .buildQuery(context, scopeObject, discoveryConfiguration, query, searchFilters, dsoTypes, page); searchResult = searchService.search(context, scopeObject, discoverQuery); } catch (SearchServiceException e) { log.error("Error while searching with Discovery", e); } - SearchResultsRest searchResultsRest = discoverFacetsConverter.convert(context, query, dsoType, + SearchResultsRest searchResultsRest = discoverFacetsConverter.convert(context, query, dsoTypes, configuration, dsoScope, searchFilters, page, discoveryConfiguration, searchResult, utils.obtainProjection()); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/EPersonRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/EPersonRestRepository.java index e044346c2b..cb14f639e0 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/EPersonRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/EPersonRestRepository.java @@ -16,22 +16,34 @@ import javax.servlet.http.HttpServletRequest; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.lang3.StringUtils; +import org.apache.log4j.Logger; import org.dspace.app.rest.DiscoverableEndpointsService; import org.dspace.app.rest.Parameter; import 
org.dspace.app.rest.SearchRestMethod; +import org.dspace.app.rest.authorization.AuthorizationFeatureService; +import org.dspace.app.rest.exception.DSpaceBadRequestException; import org.dspace.app.rest.exception.UnprocessableEntityException; import org.dspace.app.rest.model.EPersonRest; +import org.dspace.app.rest.model.MetadataRest; +import org.dspace.app.rest.model.MetadataValueRest; +import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.Patch; +import org.dspace.app.util.AuthorizeUtil; import org.dspace.authorize.AuthorizeException; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.service.SiteService; import org.dspace.core.Context; import org.dspace.eperson.EPerson; +import org.dspace.eperson.RegistrationData; +import org.dspace.eperson.service.AccountService; import org.dspace.eperson.service.EPersonService; +import org.dspace.eperson.service.RegistrationDataService; import org.springframework.beans.factory.InitializingBean; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import org.springframework.hateoas.Link; +import org.springframework.security.access.AccessDeniedException; import org.springframework.security.access.prepost.PreAuthorize; import org.springframework.stereotype.Component; @@ -46,12 +58,26 @@ import org.springframework.stereotype.Component; public class EPersonRestRepository extends DSpaceObjectRestRepository implements InitializingBean { + private static final Logger log = Logger.getLogger(EPersonRestRepository.class); + @Autowired AuthorizeService authorizeService; @Autowired DiscoverableEndpointsService discoverableEndpointsService; + @Autowired + private AccountService accountService; + + @Autowired + private AuthorizationFeatureService authorizationFeatureService; + + @Autowired + private SiteService siteService; + + @Autowired + private RegistrationDataService registrationDataService; + private final EPersonService es; @@ -72,7 +98,23 @@ public class EPersonRestRepository extends DSpaceObjectRestRepository epersonFirstName = metadataRest.getMap().get("eperson.firstname"); + List epersonLastName = metadataRest.getMap().get("eperson.lastname"); + if (epersonFirstName == null || epersonLastName == null || + epersonFirstName.isEmpty() || epersonLastName.isEmpty()) { + throw new UnprocessableEntityException("The eperson.firstname and eperson.lastname values need to be " + + "filled in"); + } + } + String password = epersonRest.getPassword(); + if (!accountService.verifyPasswordStructure(password)) { + throw new DSpaceBadRequestException("The given password is invalid"); + } } @Override @@ -175,6 +290,18 @@ public class EPersonRestRepository extends DSpaceObjectRestRepository constraints = es.getDeleteConstraints(context, eperson); - if (constraints != null && constraints.size() > 0) { - throw new UnprocessableEntityException( - "The eperson cannot be deleted due to the following constraints: " - + StringUtils.join(constraints, ", ")); - } - } catch (SQLException e) { - throw new RuntimeException(e.getMessage(), e); - } - try { es.delete(context, eperson); } catch (SQLException | IOException e) { throw new RuntimeException(e.getMessage(), e); + } catch (IllegalStateException e) { + throw new UnprocessableEntityException(e.getMessage(), e); } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ExternalSourceRestRepository.java 
b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ExternalSourceRestRepository.java index 49a128cd85..948e25e364 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ExternalSourceRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ExternalSourceRestRepository.java @@ -89,10 +89,10 @@ public class ExternalSourceRestRepository extends DSpaceRestRepository findAll(Context context, Pageable pageable) { List externalSources = externalDataService.getExternalDataProviders(); - return converter.toRestPage(externalSources, pageable, externalSources.size(), - utils.obtainProjection()); + return converter.toRestPage(externalSources, pageable, utils.obtainProjection()); } public Class getDomainClass() { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/MetadataFieldRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/MetadataFieldRestRepository.java index b7764b81dc..b0a5f526f0 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/MetadataFieldRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/MetadataFieldRestRepository.java @@ -9,9 +9,11 @@ package org.dspace.app.rest.repository; import static java.lang.Integer.parseInt; import static org.apache.commons.lang3.StringUtils.isBlank; +import static org.dspace.app.rest.model.SearchConfigurationRest.Filter.OPERATOR_EQUALS; import java.io.IOException; import java.sql.SQLException; +import java.util.ArrayList; import java.util.List; import java.util.Objects; import javax.servlet.http.HttpServletRequest; @@ -19,6 +21,8 @@ import javax.servlet.http.HttpServletRequest; import com.fasterxml.jackson.databind.JsonNode; import com.fasterxml.jackson.databind.ObjectMapper; import com.google.gson.Gson; +import org.apache.commons.lang3.StringUtils; +import org.apache.logging.log4j.Logger; import org.dspace.app.rest.Parameter; import org.dspace.app.rest.SearchRestMethod; import org.dspace.app.rest.exception.DSpaceBadRequestException; @@ -31,6 +35,13 @@ import org.dspace.content.NonUniqueMetadataException; import org.dspace.content.service.MetadataFieldService; import org.dspace.content.service.MetadataSchemaService; import org.dspace.core.Context; +import org.dspace.discovery.DiscoverQuery; +import org.dspace.discovery.DiscoverResult; +import org.dspace.discovery.IndexableObject; +import org.dspace.discovery.SearchService; +import org.dspace.discovery.SearchServiceException; +import org.dspace.discovery.indexobject.IndexableMetadataField; +import org.dspace.discovery.indexobject.MetadataFieldIndexFactoryImpl; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; @@ -45,6 +56,10 @@ import org.springframework.stereotype.Component; */ @Component(MetadataFieldRest.CATEGORY + "." 
+ MetadataFieldRest.NAME) public class MetadataFieldRestRepository extends DSpaceRestRepository { + /** + * log4j logger + */ + private static Logger log = org.apache.logging.log4j.LogManager.getLogger(MetadataFieldRestRepository.class); @Autowired MetadataFieldService metadataFieldService; @@ -52,6 +67,9 @@ public class MetadataFieldRestRepository extends DSpaceRestRepository findBySchema(@Parameter(value = "schema", required = true) String schemaName, - Pageable pageable) { + Pageable pageable) { try { Context context = obtainContext(); MetadataSchema schema = metadataSchemaService.find(context, schemaName); @@ -93,6 +111,108 @@ public class MetadataFieldRestRepository extends DSpaceRestRepository findByFieldName(@Parameter(value = "schema", required = false) String schemaName, + @Parameter(value = "element", required = false) String elementName, + @Parameter(value = "qualifier", required = false) String qualifierName, + @Parameter(value = "query", required = false) String query, + @Parameter(value = "exactName", required = false) String exactName, + Pageable pageable) throws SQLException { + Context context = obtainContext(); + + List matchingMetadataFields = new ArrayList<>(); + + if (StringUtils.isBlank(exactName)) { + // Find matches in Solr Search core + DiscoverQuery discoverQuery = + this.createDiscoverQuery(context, schemaName, elementName, qualifierName, query); + try { + DiscoverResult searchResult = searchService.search(context, null, discoverQuery); + for (IndexableObject object : searchResult.getIndexableObjects()) { + if (object instanceof IndexableMetadataField) { + matchingMetadataFields.add(((IndexableMetadataField) object).getIndexedObject()); + } + } + } catch (SearchServiceException e) { + log.error("Error while searching with Discovery", e); + throw new IllegalArgumentException("Error while searching with Discovery: " + e.getMessage()); + } + } else { + if (StringUtils.isNotBlank(elementName) || StringUtils.isNotBlank(qualifierName) || + StringUtils.isNotBlank(schemaName) || StringUtils.isNotBlank(query)) { + throw new UnprocessableEntityException("Use either exactName or a combination of element, qualifier " + + "and schema to search discovery for metadata fields"); + } + // Find at most one match with exactName query param in DB + MetadataField exactMatchingMdField = metadataFieldService.findByString(context, exactName, '.'); + if (exactMatchingMdField != null) { + matchingMetadataFields.add(exactMatchingMdField); + } + } + + return converter.toRestPage(matchingMetadataFields, pageable, utils.obtainProjection()); + } + + /** + * Creates a discovery query containing the filter queries derived from the request params + * + * @param context Context request + * @param schemaName an exact match of the prefix of the metadata schema (e.g. "dc", "dcterms", "eperson") + * @param elementName an exact match of the field's element (e.g. "contributor", "title") + * @param qualifierName an exact match of the field's qualifier (e.g. "author", "alternative") + * @param query part of the fully qualified field, should start with the start of the schema, element or + * qualifier (e.g. 
"dc.ti", "contributor", "auth", "contributor.ot") + * @return Discover query containing the filter queries derived from the request params + * @throws SQLException If DB error + */ + private DiscoverQuery createDiscoverQuery(Context context, String schemaName, String elementName, + String qualifierName, String query) throws SQLException { + List filterQueries = new ArrayList<>(); + if (StringUtils.isNotBlank(query)) { + if (query.split("\\.").length > 3) { + throw new IllegalArgumentException("Query param should not contain more than 2 dot (.) separators, " + + "forming schema.element.qualifier metadata field name"); + } + filterQueries.add(searchService.toFilterQuery(context, MetadataFieldIndexFactoryImpl.FIELD_NAME_VARIATIONS, + OPERATOR_EQUALS, query).getFilterQuery() + "*"); + } + if (StringUtils.isNotBlank(schemaName)) { + filterQueries.add( + searchService.toFilterQuery(context, MetadataFieldIndexFactoryImpl.SCHEMA_FIELD_NAME, OPERATOR_EQUALS, + schemaName).getFilterQuery()); + } + if (StringUtils.isNotBlank(elementName)) { + filterQueries.add( + searchService.toFilterQuery(context, MetadataFieldIndexFactoryImpl.ELEMENT_FIELD_NAME, OPERATOR_EQUALS, + elementName).getFilterQuery()); + } + if (StringUtils.isNotBlank(qualifierName)) { + filterQueries.add(searchService + .toFilterQuery(context, MetadataFieldIndexFactoryImpl.QUALIFIER_FIELD_NAME, OPERATOR_EQUALS, + qualifierName).getFilterQuery()); + } + + DiscoverQuery discoverQuery = new DiscoverQuery(); + discoverQuery.addFilterQueries(filterQueries.toArray(new String[filterQueries.size()])); + return discoverQuery; + } + @Override public Class getDomainClass() { return MetadataFieldRest.class; @@ -101,15 +221,15 @@ public class MetadataFieldRestRepository extends DSpaceRestRepository findProcessesByProperty(@Parameter(value = "userId") UUID ePersonUuid, + @Parameter(value = "scriptName") String scriptName, + @Parameter(value = "processStatus") String processStatusString, + Pageable pageable) + throws SQLException { + if (StringUtils.isBlank(scriptName) && ePersonUuid == null && StringUtils.isBlank(processStatusString)) { + throw new DSpaceBadRequestException("Either a name, user UUID or ProcessStatus should be provided"); + } + + Context context = obtainContext(); + EPerson ePerson = null; + if (ePersonUuid != null) { + ePerson = epersonService.find(context, ePersonUuid); + if (ePerson == null) { + throw new DSpaceBadRequestException("No EPerson with the given UUID is found"); + } + } + + ProcessStatus processStatus = StringUtils.isBlank(processStatusString) ? 
null : + ProcessStatus.valueOf(processStatusString); + ProcessQueryParameterContainer processQueryParameterContainer = createProcessQueryParameterContainer(scriptName, + ePerson, processStatus); + handleSearchSort(pageable, processQueryParameterContainer); + List processes = processService.search(context, processQueryParameterContainer, pageable.getPageSize(), + Math.toIntExact(pageable.getOffset())); + return converterService.toRestPage(processes, pageable, + processService.countSearch(context, processQueryParameterContainer), + utils.obtainProjection()); + + + } + + /** + * This method will retrieve the {@link Sort} from the given {@link Pageable} and it'll create the sortOrder and + * sortProperty Strings on the {@link ProcessQueryParameterContainer} object so that we can store how the sorting + * should be done + * @param pageable The pageable object + * @param processQueryParameterContainer The object in which the sorting will be filled in + */ + private void handleSearchSort(Pageable pageable, ProcessQueryParameterContainer processQueryParameterContainer) { + Sort sort = pageable.getSort(); + if (sort != null) { + Iterator iterator = sort.iterator(); + if (iterator.hasNext()) { + Sort.Order order = iterator.next(); + if (StringUtils.equalsIgnoreCase(order.getProperty(), "startTime")) { + processQueryParameterContainer.setSortProperty(Process_.START_TIME); + processQueryParameterContainer.setSortOrder(order.getDirection().name()); + } else if (StringUtils.equalsIgnoreCase(order.getProperty(), "endTime")) { + processQueryParameterContainer.setSortProperty(Process_.FINISHED_TIME); + processQueryParameterContainer.setSortOrder(order.getDirection().name()); + } else { + throw new DSpaceBadRequestException("The given sort option was invalid: " + order.getProperty()); + } + if (iterator.hasNext()) { + throw new DSpaceBadRequestException("Only one sort method is supported, can't give multiples"); + } + } + } + } + + /** + * This method will create a new {@link ProcessQueryParameterContainer} object and return it. 
+ * This object will contain a map which is filled in with the database column reference as key and the value that + * it should contain when searching as the value of the entry + * @param scriptName The name that the script of the process should have + * @param ePerson The eperson that the process should have + * @param processStatus The status that the process should have + * @return The newly created {@link ProcessQueryParameterContainer} + */ + private ProcessQueryParameterContainer createProcessQueryParameterContainer(String scriptName, EPerson ePerson, + ProcessStatus processStatus) { + ProcessQueryParameterContainer processQueryParameterContainer = + new ProcessQueryParameterContainer(); + if (StringUtils.isNotBlank(scriptName)) { + processQueryParameterContainer.addToQueryParameterMap(Process_.NAME, scriptName); + } + if (ePerson != null) { + processQueryParameterContainer.addToQueryParameterMap(Process_.E_PERSON, ePerson); + } + if (processStatus != null) { + processQueryParameterContainer.addToQueryParameterMap(Process_.PROCESS_STATUS, processStatus); + } + return processQueryParameterContainer; + } + @Override public Class getDomainClass() { return ProcessRest.class; diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/RegistrationRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/RegistrationRestRepository.java new file mode 100644 index 0000000000..ba7583f1c5 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/RegistrationRestRepository.java @@ -0,0 +1,145 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.repository; + +import java.io.IOException; +import java.sql.SQLException; +import javax.mail.MessagingException; +import javax.servlet.ServletInputStream; +import javax.servlet.http.HttpServletRequest; + +import com.fasterxml.jackson.databind.ObjectMapper; +import org.apache.commons.lang3.StringUtils; +import org.apache.logging.log4j.Logger; +import org.dspace.app.rest.Parameter; +import org.dspace.app.rest.SearchRestMethod; +import org.dspace.app.rest.exception.DSpaceBadRequestException; +import org.dspace.app.rest.exception.RepositoryMethodNotImplementedException; +import org.dspace.app.rest.exception.UnprocessableEntityException; +import org.dspace.app.rest.model.RegistrationRest; +import org.dspace.app.util.AuthorizeUtil; +import org.dspace.authorize.AuthorizeException; +import org.dspace.core.Context; +import org.dspace.eperson.EPerson; +import org.dspace.eperson.RegistrationData; +import org.dspace.eperson.service.AccountService; +import org.dspace.eperson.service.EPersonService; +import org.dspace.eperson.service.RegistrationDataService; +import org.dspace.services.RequestService; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.Pageable; +import org.springframework.data.rest.webmvc.ResourceNotFoundException; +import org.springframework.security.access.AccessDeniedException; +import org.springframework.stereotype.Component; + +/** + * This is the repository that is responsible for managing Registration Rest objects + */ +@Component(RegistrationRest.CATEGORY + "." 
+ RegistrationRest.NAME) +public class RegistrationRestRepository extends DSpaceRestRepository { + + private static Logger log = org.apache.logging.log4j.LogManager.getLogger(RegistrationRestRepository.class); + + @Autowired + private EPersonService ePersonService; + + @Autowired + private AccountService accountService; + + @Autowired + private RequestService requestService; + + @Autowired + private RegistrationDataService registrationDataService; + + @Override + public RegistrationRest findOne(Context context, Integer integer) { + throw new RepositoryMethodNotImplementedException("No implementation found; Method not allowed!", ""); + } + + @Override + public Page findAll(Context context, Pageable pageable) { + throw new RepositoryMethodNotImplementedException("No implementation found; Method not allowed!", ""); + } + + @Override + public RegistrationRest createAndReturn(Context context) { + HttpServletRequest request = requestService.getCurrentRequest().getHttpServletRequest(); + ObjectMapper mapper = new ObjectMapper(); + RegistrationRest registrationRest; + try { + ServletInputStream input = request.getInputStream(); + registrationRest = mapper.readValue(input, RegistrationRest.class); + } catch (IOException e1) { + throw new UnprocessableEntityException("Error parsing request body.", e1); + } + if (StringUtils.isBlank(registrationRest.getEmail())) { + throw new UnprocessableEntityException("The email cannot be omitted from the Registration endpoint"); + } + EPerson eperson = null; + try { + eperson = ePersonService.findByEmail(context, registrationRest.getEmail()); + } catch (SQLException e) { + log.error("Something went wrong retrieving EPerson for email: " + registrationRest.getEmail(), e); + } + if (eperson != null) { + try { + if (!AuthorizeUtil.authorizeUpdatePassword(context, eperson.getEmail())) { + throw new DSpaceBadRequestException("Password cannot be updated for the given EPerson with email: " + + eperson.getEmail()); + } + accountService.sendForgotPasswordInfo(context, registrationRest.getEmail()); + } catch (SQLException | IOException | MessagingException | AuthorizeException e) { + log.error("Something went wrong with sending forgot password info for email: " + + registrationRest.getEmail(), e); + } + } else { + try { + if (!AuthorizeUtil.authorizeNewAccountRegistration(context, request)) { + throw new AccessDeniedException( + "Registration is disabled, you are not authorized to create a new Authorization"); + } + accountService.sendRegistrationInfo(context, registrationRest.getEmail()); + } catch (SQLException | IOException | MessagingException | AuthorizeException e) { + log.error("Something went wrong with sending registration info for email: " + + registrationRest.getEmail()); + } + } + return null; + } + + @Override + public Class getDomainClass() { + return RegistrationRest.class; + } + + /** + * This method will find the RegistrationRest object that is associated with the token given + * @param token The token to be found and for which a RegistrationRest object will be found + * @return A RegistrationRest object for the given token + * @throws SQLException If something goes wrong + * @throws AuthorizeException If something goes wrong + */ + @SearchRestMethod(name = "findByToken") + public RegistrationRest findByToken(@Parameter(value = "token", required = true) String token) + throws SQLException, AuthorizeException { + Context context = obtainContext(); + RegistrationData registrationData = registrationDataService.findByToken(context, token); + if
(registrationData == null) { + throw new ResourceNotFoundException("The token: " + token + " couldn't be found"); + } + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(registrationData.getEmail()); + EPerson ePerson = accountService.getEPerson(context, token); + if (ePerson != null) { + registrationRest.setUser(ePerson.getID()); + } + return registrationRest; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ScriptRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ScriptRestRepository.java index a3f6a6bb67..9151ae1976 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ScriptRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/ScriptRestRepository.java @@ -102,7 +102,7 @@ public class ScriptRestRepository extends DSpaceRestRepository args = constructArgs(dSpaceCommandLineParameters); runDSpaceScript(files, context, scriptToExecute, restDSpaceRunnableHandler, args); return converter.toRest(restDSpaceRunnableHandler.getProcess(context), utils.obtainProjection()); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/StatisticsRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/StatisticsRestRepository.java index 0838b65d18..4aa7572767 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/StatisticsRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/StatisticsRestRepository.java @@ -7,13 +7,89 @@ */ package org.dspace.app.rest.repository; +import java.io.IOException; +import java.sql.SQLException; +import java.text.ParseException; +import java.util.List; +import java.util.UUID; + +import org.apache.commons.lang3.StringUtils; +import org.apache.solr.client.solrj.SolrServerException; +import org.dspace.app.rest.Parameter; +import org.dspace.app.rest.SearchRestMethod; +import org.dspace.app.rest.exception.RepositoryMethodNotImplementedException; import org.dspace.app.rest.model.StatisticsSupportRest; +import org.dspace.app.rest.model.UsageReportRest; +import org.dspace.app.rest.utils.DSpaceObjectUtils; +import org.dspace.app.rest.utils.UsageReportUtils; +import org.dspace.content.DSpaceObject; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.Pageable; +import org.springframework.security.access.prepost.PreAuthorize; import org.springframework.stereotype.Component; -@Component(StatisticsSupportRest.CATEGORY + "." + StatisticsSupportRest.NAME) -public class StatisticsRestRepository extends AbstractDSpaceRestRepository { +@Component(StatisticsSupportRest.CATEGORY + "." 
+ UsageReportRest.NAME) +public class StatisticsRestRepository extends DSpaceRestRepository { + + @Autowired + private DSpaceObjectUtils dspaceObjectUtil; + + @Autowired + private UsageReportUtils usageReportUtils; public StatisticsSupportRest getStatisticsSupport() { return new StatisticsSupportRest(); } + + @Override + @PreAuthorize("hasPermission(#uuidObjectReportId, 'usagereport', 'READ')") + public UsageReportRest findOne(Context context, String uuidObjectReportId) { + UUID uuidObject = UUID.fromString(StringUtils.substringBefore(uuidObjectReportId, "_")); + String reportId = StringUtils.substringAfter(uuidObjectReportId, "_"); + + UsageReportRest usageReportRest = null; + try { + DSpaceObject dso = dspaceObjectUtil.findDSpaceObject(context, uuidObject); + if (dso == null) { + throw new IllegalArgumentException("No DSO found with uuid: " + uuidObject); + } + usageReportRest = usageReportUtils.createUsageReport(context, dso, reportId); + + } catch (ParseException | SolrServerException | IOException | SQLException e) { + throw new RuntimeException(e.getMessage(), e); + } + return converter.toRest(usageReportRest, utils.obtainProjection()); + } + + @PreAuthorize("hasPermission(#uri, 'usagereportsearch', 'READ')") + @SearchRestMethod(name = "object") + public Page findByObject(@Parameter(value = "uri", required = true) String uri, + Pageable pageable) { + UUID uuid = UUID.fromString(StringUtils.substringAfterLast(uri, "/")); + List usageReportsOfItem = null; + try { + Context context = obtainContext(); + DSpaceObject dso = dspaceObjectUtil.findDSpaceObject(context, uuid); + if (dso == null) { + throw new IllegalArgumentException("No DSO found with uuid: " + uuid); + } + usageReportsOfItem = usageReportUtils.getUsageReportsOfDSO(context, dso); + } catch (SQLException | ParseException | SolrServerException | IOException e) { + throw new RuntimeException(e.getMessage(), e); + } + + return converter.toRestPage(usageReportsOfItem, pageable, usageReportsOfItem.size(), utils.obtainProjection()); + } + + @Override + public Page findAll(Context context, Pageable pageable) { + throw new RepositoryMethodNotImplementedException("No implementation found; Method not allowed!", "findAll"); + } + + @Override + public Class getDomainClass() { + return UsageReportRest.class; + } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionFormRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionFormRestRepository.java index 48856aa163..76d680cf27 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionFormRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionFormRestRepository.java @@ -7,13 +7,17 @@ */ package org.dspace.app.rest.repository; +import java.util.HashMap; import java.util.List; +import java.util.Locale; +import java.util.Map; import org.dspace.app.rest.model.SubmissionFormRest; import org.dspace.app.util.DCInputSet; import org.dspace.app.util.DCInputsReader; import org.dspace.app.util.DCInputsReaderException; import org.dspace.core.Context; +import org.dspace.core.I18nUtil; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import org.springframework.security.access.prepost.PreAuthorize; @@ -26,32 +30,48 @@ import org.springframework.stereotype.Component; */ @Component(SubmissionFormRest.CATEGORY + "." 
+ SubmissionFormRest.NAME) public class SubmissionFormRestRepository extends DSpaceRestRepository { - - private DCInputsReader inputReader; + private Map inputReaders; + private DCInputsReader defaultInputReader; public SubmissionFormRestRepository() throws DCInputsReaderException { - inputReader = new DCInputsReader(); + defaultInputReader = new DCInputsReader(); + Locale[] locales = I18nUtil.getSupportedLocales(); + inputReaders = new HashMap(); + for (Locale locale : locales) { + inputReaders.put(locale, new DCInputsReader(I18nUtil.getInputFormsFileName(locale))); + } } @PreAuthorize("hasAuthority('AUTHENTICATED')") @Override - public SubmissionFormRest findOne(Context context, String submitName) { - DCInputSet inputConfig; + public SubmissionFormRest findOne(Context context, String submitName) { try { - inputConfig = inputReader.getInputsByFormName(submitName); + Locale currentLocale = context.getCurrentLocale(); + DCInputsReader inputReader = inputReaders.get(currentLocale); + if (inputReader == null) { + inputReader = defaultInputReader; + } + DCInputSet subConfs = inputReader.getInputsByFormName(submitName); + if (subConfs == null) { + return null; + } + return converter.toRest(subConfs, utils.obtainProjection()); } catch (DCInputsReaderException e) { throw new IllegalStateException(e.getMessage(), e); } - if (inputConfig == null) { - return null; - } - return converter.toRest(inputConfig, utils.obtainProjection()); } @PreAuthorize("hasAuthority('AUTHENTICATED')") @Override public Page findAll(Context context, Pageable pageable) { try { + Locale currentLocale = context.getCurrentLocale(); + DCInputsReader inputReader; + if (currentLocale != null) { + inputReader = inputReaders.get(currentLocale); + } else { + inputReader = defaultInputReader; + } long total = inputReader.countInputs(); List subConfs = inputReader.getAllInputs(pageable.getPageSize(), Math.toIntExact(pageable.getOffset())); @@ -65,4 +85,20 @@ public class SubmissionFormRestRepository extends DSpaceRestRepository getDomainClass() { return SubmissionFormRest.class; } + + /** + * Reload the current Submission Form configuration based on the currently + * supported locales. This method can be used to force a reload if the + * configured supported locales change. 
+ * + * @throws DCInputsReaderException + */ + public void reload() throws DCInputsReaderException { + this.defaultInputReader = new DCInputsReader(); + Locale[] locales = I18nUtil.getSupportedLocales(); + this.inputReaders = new HashMap(); + for (Locale locale : locales) { + inputReaders.put(locale, new DCInputsReader(I18nUtil.getInputFormsFileName(locale))); + } + } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionUploadRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionUploadRestRepository.java index 3ea5989f5a..25ac640d49 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionUploadRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/SubmissionUploadRestRepository.java @@ -7,7 +7,10 @@ */ package org.dspace.app.rest.repository; +import java.sql.SQLException; +import java.text.ParseException; import java.util.ArrayList; +import java.util.Collection; import java.util.List; import org.apache.commons.lang3.StringUtils; @@ -16,10 +19,6 @@ import org.dspace.app.rest.model.AccessConditionOptionRest; import org.dspace.app.rest.model.SubmissionUploadRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.utils.DateMathParser; -import org.dspace.app.util.SubmissionConfig; -import org.dspace.app.util.SubmissionConfigReader; -import org.dspace.app.util.SubmissionConfigReaderException; -import org.dspace.app.util.SubmissionStepConfig; import org.dspace.core.Context; import org.dspace.eperson.Group; import org.dspace.eperson.service.GroupService; @@ -28,7 +27,6 @@ import org.dspace.submit.model.UploadConfiguration; import org.dspace.submit.model.UploadConfigurationService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.domain.Page; -import org.springframework.data.domain.PageImpl; import org.springframework.data.domain.Pageable; import org.springframework.security.access.prepost.PreAuthorize; import org.springframework.stereotype.Component; @@ -45,8 +43,6 @@ public class SubmissionUploadRestRepository extends DSpaceRestRepository findAll(Context context, Pageable pageable) { - List subConfs = new ArrayList(); - subConfs = submissionConfigReader.getAllSubmissionConfigs(pageable.getPageSize(), - Math.toIntExact(pageable.getOffset())); + Collection uploadConfigs = uploadConfigurationService.getMap().values(); Projection projection = utils.obtainProjection(); List results = new ArrayList<>(); - for (SubmissionConfig config : subConfs) { - for (int i = 0; i < config.getNumberOfSteps(); i++) { - SubmissionStepConfig step = config.getStep(i); - if (SubmissionStepConfig.UPLOAD_STEP_NAME.equals(step.getType())) { - UploadConfiguration uploadConfig = uploadConfigurationService.getMap().get(step.getId()); - if (uploadConfig != null) { - try { - results.add(convert(context, uploadConfig, projection)); - } catch (Exception e) { - log.error(e.getMessage(), e); - } - } + List configNames = new ArrayList(); + for (UploadConfiguration uploadConfig : uploadConfigs) { + if (!configNames.contains(uploadConfig.getName())) { + configNames.add(uploadConfig.getName()); + try { + results.add(convert(context, uploadConfig, projection)); + } catch (Exception e) { + log.error(e.getMessage(), e); } } } - return new PageImpl(results, pageable, results.size()); + return utils.getPage(results, pageable); } @Override @@ -105,20 +91,31 @@ public class SubmissionUploadRestRepository extends 
DSpaceRestRepository getChildren(@Nullable HttpServletRequest request, String name, + @Nullable Pageable optionalPageable, Projection projection) { + + Context context = obtainContext(); + String[] parts = StringUtils.split(name, ":", 2); + if (parts.length != 2) { + return null; + } + String vocabularyName = parts[0]; + String id = parts[1]; + Pageable pageable = utils.getPageable(optionalPageable); + List results = new ArrayList(); + ChoiceAuthority authority = choiceAuthorityService.getChoiceAuthorityByAuthorityName(vocabularyName); + if (StringUtils.isNotBlank(id) && authority.isHierarchical()) { + Choices choices = choiceAuthorityService.getChoicesByParent(vocabularyName, id, (int) pageable.getOffset(), + pageable.getPageSize(), context.getCurrentLocale().toString()); + for (Choice value : choices.values) { + results.add(authorityUtils.convertEntryDetails(value, vocabularyName, authority.isHierarchical(), + utils.obtainProjection())); + } + Page resources = new PageImpl(results, pageable, + choices.total); + return resources; + } else { + throw new LinkNotFoundException(VocabularyRest.CATEGORY, VocabularyEntryDetailsRest.NAME, name); + } + } +} + diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryDetailsParentLinkRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryDetailsParentLinkRepository.java new file mode 100644 index 0000000000..379928d9cc --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryDetailsParentLinkRepository.java @@ -0,0 +1,64 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.repository; + +import javax.annotation.Nullable; +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.NotFoundException; + +import org.apache.commons.lang3.StringUtils; +import org.dspace.app.rest.model.VocabularyEntryDetailsRest; +import org.dspace.app.rest.model.VocabularyRest; +import org.dspace.app.rest.projection.Projection; +import org.dspace.app.rest.utils.AuthorityUtils; +import org.dspace.content.authority.Choice; +import org.dspace.content.authority.ChoiceAuthority; +import org.dspace.content.authority.service.ChoiceAuthorityService; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Pageable; +import org.springframework.security.access.prepost.PreAuthorize; +import org.springframework.stereotype.Component; + +/** + * Link repository to expose the parent of a vocabulary entry details in an hierarchical vocabulary + * + * @author Mykhaylo Boychuk ($science.it) + */ +@Component(VocabularyRest.CATEGORY + "." + VocabularyEntryDetailsRest.NAME + "." 
+ VocabularyEntryDetailsRest.PARENT) +public class VocabularyEntryDetailsParentLinkRepository extends AbstractDSpaceRestRepository + implements LinkRestRepository { + + @Autowired + private ChoiceAuthorityService choiceAuthorityService; + + @Autowired + private AuthorityUtils authorityUtils; + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + public VocabularyEntryDetailsRest getParent(@Nullable HttpServletRequest request, String name, + @Nullable Pageable optionalPageable, Projection projection) { + Context context = obtainContext(); + String[] parts = StringUtils.split(name, ":", 2); + if (parts.length != 2) { + return null; + } + String vocabularyName = parts[0]; + String id = parts[1]; + + ChoiceAuthority authority = choiceAuthorityService.getChoiceAuthorityByAuthorityName(vocabularyName); + Choice choice = null; + if (StringUtils.isNotBlank(id) && authority != null && authority.isHierarchical()) { + choice = choiceAuthorityService.getParentChoice(vocabularyName, id, context.getCurrentLocale().toString()); + } else { + throw new NotFoundException(); + } + return authorityUtils.convertEntryDetails(choice, vocabularyName, authority.isHierarchical(), + utils.obtainProjection()); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryDetailsRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryDetailsRestRepository.java new file mode 100644 index 0000000000..26e43cac8b --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryDetailsRestRepository.java @@ -0,0 +1,111 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.repository; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; + +import org.apache.commons.lang3.StringUtils; +import org.atteo.evo.inflector.English; +import org.dspace.app.rest.DiscoverableEndpointsService; +import org.dspace.app.rest.Parameter; +import org.dspace.app.rest.SearchRestMethod; +import org.dspace.app.rest.exception.LinkNotFoundException; +import org.dspace.app.rest.exception.RepositoryMethodNotImplementedException; +import org.dspace.app.rest.model.ResourcePolicyRest; +import org.dspace.app.rest.model.VocabularyEntryDetailsRest; +import org.dspace.app.rest.model.VocabularyRest; +import org.dspace.app.rest.utils.AuthorityUtils; +import org.dspace.content.authority.Choice; +import org.dspace.content.authority.ChoiceAuthority; +import org.dspace.content.authority.Choices; +import org.dspace.content.authority.service.ChoiceAuthorityService; +import org.dspace.core.Context; +import org.springframework.beans.factory.InitializingBean; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageImpl; +import org.springframework.data.domain.Pageable; +import org.springframework.hateoas.Link; +import org.springframework.security.access.prepost.PreAuthorize; +import org.springframework.stereotype.Component; + +/** + * Controller for exposition of vocabularies entry details for the submission + * + * @author Andrea Bollini (andrea.bollini at 4science.it) + */ +@Component(VocabularyRest.CATEGORY + "." 
+ VocabularyEntryDetailsRest.NAME) +public class VocabularyEntryDetailsRestRepository extends DSpaceRestRepository + implements InitializingBean { + + @Autowired + private ChoiceAuthorityService cas; + + @Autowired + private AuthorityUtils authorityUtils; + + @Autowired + private DiscoverableEndpointsService discoverableEndpointsService; + + @Override + public void afterPropertiesSet() throws Exception { + String models = English.plural(VocabularyEntryDetailsRest.NAME); + discoverableEndpointsService.register(this, Arrays.asList( + new Link("/api/" + VocabularyRest.CATEGORY + "/" + models + "/search", + models + "-search"))); + } + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + @Override + public Page findAll(Context context, Pageable pageable) { + throw new RepositoryMethodNotImplementedException(ResourcePolicyRest.NAME, "findAll"); + } + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + @Override + public VocabularyEntryDetailsRest findOne(Context context, String name) { + String[] parts = StringUtils.split(name, ":", 2); + if (parts.length != 2) { + return null; + } + String vocabularyName = parts[0]; + String vocabularyId = parts[1]; + ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(vocabularyName); + Choice choice = source.getChoice(vocabularyId, context.getCurrentLocale().toString()); + return authorityUtils.convertEntryDetails(choice, vocabularyName, source.isHierarchical(), + utils.obtainProjection()); + } + + @SearchRestMethod(name = "top") + @PreAuthorize("hasAuthority('AUTHENTICATED')") + public Page findAllTop(@Parameter(value = "vocabulary", required = true) + String vocabularyId, Pageable pageable) { + Context context = obtainContext(); + List results = new ArrayList(); + ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(vocabularyId); + if (source.isHierarchical()) { + Choices choices = cas.getTopChoices(vocabularyId, (int)pageable.getOffset(), pageable.getPageSize(), + context.getCurrentLocale().toString()); + for (Choice value : choices.values) { + results.add(authorityUtils.convertEntryDetails(value, vocabularyId, source.isHierarchical(), + utils.obtainProjection())); + } + Page resources = new PageImpl(results, pageable, + choices.total); + return resources; + } + throw new LinkNotFoundException(VocabularyRest.CATEGORY, VocabularyEntryDetailsRest.NAME, vocabularyId); + } + + @Override + public Class getDomainClass() { + return VocabularyEntryDetailsRest.class; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryLinkRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryLinkRepository.java new file mode 100644 index 0000000000..9d75ef87c3 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyEntryLinkRepository.java @@ -0,0 +1,97 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.repository; + +import java.util.ArrayList; +import java.util.List; +import javax.annotation.Nullable; +import javax.servlet.http.HttpServletRequest; + +import org.apache.commons.lang3.BooleanUtils; +import org.apache.commons.lang3.StringUtils; +import org.dspace.app.rest.exception.UnprocessableEntityException; +import org.dspace.app.rest.model.VocabularyEntryRest; +import org.dspace.app.rest.model.VocabularyRest; +import 
org.dspace.app.rest.projection.Projection; +import org.dspace.app.rest.utils.AuthorityUtils; +import org.dspace.content.authority.Choice; +import org.dspace.content.authority.ChoiceAuthority; +import org.dspace.content.authority.Choices; +import org.dspace.content.authority.service.ChoiceAuthorityService; +import org.dspace.content.service.CollectionService; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageImpl; +import org.springframework.data.domain.Pageable; +import org.springframework.data.rest.webmvc.ResourceNotFoundException; +import org.springframework.security.access.prepost.PreAuthorize; +import org.springframework.stereotype.Component; + +/** + * Link repository to expose the entries of a vocabulary + * + * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) + */ +@Component(VocabularyRest.CATEGORY + "." + VocabularyRest.NAME + "." + VocabularyRest.ENTRIES) +public class VocabularyEntryLinkRepository extends AbstractDSpaceRestRepository + implements LinkRestRepository { + + @Autowired + private ChoiceAuthorityService cas; + + @Autowired + private CollectionService cs; + + @Autowired + private AuthorityUtils authorityUtils; + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + public Page filter(@Nullable HttpServletRequest request, String name, + @Nullable Pageable optionalPageable, Projection projection) { + Context context = obtainContext(); + String exact = request == null ? null : request.getParameter("exact"); + String filter = request == null ? null : request.getParameter("filter"); + String entryID = request == null ? null : request.getParameter("entryID"); + + if (StringUtils.isNotBlank(filter) && StringUtils.isNotBlank(entryID)) { + throw new IllegalArgumentException("the filter and entryID parameters are mutually exclusive"); + } + + Pageable pageable = utils.getPageable(optionalPageable); + List results = new ArrayList<>(); + ChoiceAuthority ca = cas.getChoiceAuthorityByAuthorityName(name); + if (ca == null) { + throw new ResourceNotFoundException("the vocabulary named " + name + " doesn't exist"); + } + if (!ca.isScrollable() && StringUtils.isBlank(filter) && StringUtils.isBlank(entryID)) { + throw new UnprocessableEntityException( + "either the filter or the entryID parameter is required for non-scrollable vocabularies"); + } + Choices choices = null; + if (BooleanUtils.toBoolean(exact)) { + choices = ca.getBestMatch(filter, context.getCurrentLocale().toString()); + } else if (StringUtils.isNotBlank(entryID)) { + Choice choice = ca.getChoice(entryID, + context.getCurrentLocale().toString()); + if (choice != null) { + choices = new Choices(new Choice[] {choice}, 0, 1, Choices.CF_ACCEPTED, false); + } else { + choices = new Choices(false); + } + } else { + choices = ca.getMatches(filter, Math.toIntExact(pageable.getOffset()), + pageable.getPageSize(), context.getCurrentLocale().toString()); + } + boolean storeAuthority = ca.storeAuthorityInMetadata(); + for (Choice value : choices.values) { + results.add(authorityUtils.convertEntry(value, name, storeAuthority, projection)); + } + return new PageImpl<>(results, pageable, choices.total); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyRestRepository.java new file mode 100644 index 0000000000..dcdf71186b --- /dev/null +++
b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/VocabularyRestRepository.java @@ -0,0 +1,112 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.repository; + +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.List; +import java.util.Set; +import java.util.UUID; + +import org.dspace.app.rest.Parameter; +import org.dspace.app.rest.SearchRestMethod; +import org.dspace.app.rest.exception.UnprocessableEntityException; +import org.dspace.app.rest.model.VocabularyRest; +import org.dspace.app.rest.projection.Projection; +import org.dspace.app.rest.utils.AuthorityUtils; +import org.dspace.content.Collection; +import org.dspace.content.MetadataField; +import org.dspace.content.authority.ChoiceAuthority; +import org.dspace.content.authority.service.ChoiceAuthorityService; +import org.dspace.content.service.CollectionService; +import org.dspace.content.service.MetadataFieldService; +import org.dspace.core.Context; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.Pageable; +import org.springframework.security.access.prepost.PreAuthorize; +import org.springframework.stereotype.Component; + +/** + * Controller for exposition of vocabularies for the submission + * + * @author Luigi Andrea Pascarelli (luigiandrea.pascarelli at 4science.it) + * @author Andrea Bollini (andrea.bollini at 4science.it) + */ +@Component(VocabularyRest.CATEGORY + "." + VocabularyRest.NAME) +public class VocabularyRestRepository extends DSpaceRestRepository { + + @Autowired + private ChoiceAuthorityService cas; + + @Autowired + private AuthorityUtils authorityUtils; + + @Autowired + private CollectionService collectionService; + + @Autowired + private MetadataFieldService metadataFieldService; + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + @Override + public VocabularyRest findOne(Context context, String name) { + ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(name); + return authorityUtils.convertAuthority(source, name, utils.obtainProjection()); + } + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + @Override + public Page findAll(Context context, Pageable pageable) { + Set authoritiesName = cas.getChoiceAuthoritiesNames(); + List results = new ArrayList<>(); + Projection projection = utils.obtainProjection(); + for (String authorityName : authoritiesName) { + ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(authorityName); + VocabularyRest result = authorityUtils.convertAuthority(source, authorityName, projection); + results.add(result); + } + return utils.getPage(results, pageable); + } + + @PreAuthorize("hasAuthority('AUTHENTICATED')") + @SearchRestMethod(name = "byMetadataAndCollection") + public VocabularyRest findByMetadataAndCollection( + @Parameter(value = "metadata", required = true) String metadataField, + @Parameter(value = "collection", required = true) UUID collectionUuid) { + + Collection collection = null; + MetadataField metadata = null; + String[] tokens = org.dspace.core.Utils.tokenize(metadataField); + + try { + collection = collectionService.find(obtainContext(), collectionUuid); + metadata = metadataFieldService.findByElement(obtainContext(), tokens[0], tokens[1], tokens[2]); + } catch (SQLException e) { + throw new 
RuntimeException( + "A database error occurs retrieving the metadata and/or the collection information", e); + } + + if (metadata == null) { + throw new UnprocessableEntityException(metadataField + " is not a valid metadata"); + } + if (collection == null) { + throw new UnprocessableEntityException(collectionUuid + " is not a valid collection"); + } + + String authorityName = cas.getChoiceAuthorityName(tokens[0], tokens[1], tokens[2], collection); + ChoiceAuthority source = cas.getChoiceAuthorityByAuthorityName(authorityName); + return authorityUtils.convertAuthority(source, authorityName, utils.obtainProjection()); + } + + @Override + public Class getDomainClass() { + return VocabularyRest.class; + } + +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/WorkspaceItemRestRepository.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/WorkspaceItemRestRepository.java index a8b514ba0c..7ee157ab52 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/WorkspaceItemRestRepository.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/WorkspaceItemRestRepository.java @@ -16,10 +16,6 @@ import java.util.List; import java.util.UUID; import javax.servlet.http.HttpServletRequest; -import gr.ekt.bte.core.TransformationEngine; -import gr.ekt.bte.core.TransformationSpec; -import gr.ekt.bte.exceptions.BadTransformationSpec; -import gr.ekt.bte.exceptions.MalformedSourceException; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.Logger; import org.dspace.app.rest.Parameter; @@ -45,6 +41,7 @@ import org.dspace.authorize.AuthorizeException; import org.dspace.authorize.service.AuthorizeService; import org.dspace.content.Collection; import org.dspace.content.Item; +import org.dspace.content.MetadataValue; import org.dspace.content.WorkspaceItem; import org.dspace.content.service.BitstreamFormatService; import org.dspace.content.service.BitstreamService; @@ -56,14 +53,12 @@ import org.dspace.core.Context; import org.dspace.eperson.EPerson; import org.dspace.eperson.EPersonServiceImpl; import org.dspace.event.Event; +import org.dspace.importer.external.datamodel.ImportRecord; +import org.dspace.importer.external.exception.FileMultipleOccurencesException; +import org.dspace.importer.external.metadatamapping.MetadatumDTO; +import org.dspace.importer.external.service.ImportService; import org.dspace.services.ConfigurationService; import org.dspace.submit.AbstractProcessingStep; -import org.dspace.submit.lookup.DSpaceWorkspaceItemOutputGenerator; -import org.dspace.submit.lookup.MultipleSubmissionLookupDataLoader; -import org.dspace.submit.lookup.SubmissionItemDataLoader; -import org.dspace.submit.lookup.SubmissionLookupOutputGenerator; -import org.dspace.submit.lookup.SubmissionLookupService; -import org.dspace.submit.util.ItemSubmissionLookupDTO; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; @@ -73,10 +68,12 @@ import org.springframework.security.access.prepost.PreAuthorize; import org.springframework.stereotype.Component; import org.springframework.web.multipart.MultipartFile; + /** * This is the repository responsible to manage WorkspaceItem Rest object * * @author Andrea Bollini (andrea.bollini at 4science.it) + * @author Pasquale Cavallo (pasquale.cavallo at 4science.it) */ @Component(WorkspaceItemRest.CATEGORY + "." 
+ WorkspaceItemRest.NAME) public class WorkspaceItemRestRepository extends DSpaceRestRepository @@ -110,15 +107,15 @@ public class WorkspaceItemRestRepository extends DSpaceRestRepository upload(Context context, HttpServletRequest request, - MultipartFile uploadfile) + List uploadfiles) throws SQLException, FileNotFoundException, IOException, AuthorizeException { - File file = Utils.getFile(uploadfile, "upload-loader", "filedataloader"); List results = new ArrayList<>(); + String uuid = request.getParameter("owningCollection"); + if (StringUtils.isBlank(uuid)) { + uuid = configurationService.getProperty("submission.default.collection"); + } + Collection collection = null; + if (StringUtils.isNotBlank(uuid)) { + collection = collectionService.find(context, UUID.fromString(uuid)); + } else { + collection = collectionService.findAuthorizedOptimized(context, Constants.ADD).get(0); + } + + SubmissionConfig submissionConfig = + submissionConfigReader.getSubmissionConfigByCollection(collection.getHandle()); + List result = null; + List records = new ArrayList<>(); try { - String uuid = request.getParameter("collection"); - if (StringUtils.isBlank(uuid)) { - uuid = configurationService.getProperty("submission.default.collection"); - } - - Collection collection = null; - if (StringUtils.isNotBlank(uuid)) { - collection = collectionService.find(context, UUID.fromString(uuid)); - } else { - collection = collectionService.findAuthorizedOptimized(context, Constants.ADD).get(0); - } - - SubmissionConfig submissionConfig = - submissionConfigReader.getSubmissionConfigByCollection(collection.getHandle()); - - - List tmpResult = new ArrayList(); - - TransformationEngine transformationEngine1 = submissionLookupService.getPhase1TransformationEngine(); - TransformationSpec spec = new TransformationSpec(); - // FIXME this is mostly due to the need to test. The BTE framework has an assert statement that check if the - // number of found record is less than the requested and treat 0 as is, instead, the implementation assume - // 0=unlimited this lead to test failure. 
- // It is unclear if BTE really respect values other than 0/MAX allowing us to put a protection against heavy - // load - spec.setNumberOfRecords(Integer.MAX_VALUE); - if (transformationEngine1 != null) { - MultipleSubmissionLookupDataLoader dataLoader = - (MultipleSubmissionLookupDataLoader) transformationEngine1.getDataLoader(); - - List fileDataLoaders = submissionLookupService.getFileProviders(); - for (String fileDataLoader : fileDataLoaders) { - dataLoader.setFile(file.getAbsolutePath(), fileDataLoader); - - try { - SubmissionLookupOutputGenerator outputGenerator = - (SubmissionLookupOutputGenerator) transformationEngine1.getOutputGenerator(); - outputGenerator.setDtoList(new ArrayList()); - log.debug("BTE transformation is about to start!"); - transformationEngine1.transform(spec); - log.debug("BTE transformation finished!"); - tmpResult.addAll(outputGenerator.getDtoList()); - if (!tmpResult.isEmpty()) { - //exit with the results founded on the first data provided - break; - } - } catch (BadTransformationSpec e1) { - log.error(e1.getMessage(), e1); - } catch (MalformedSourceException e1) { - log.error(e1.getMessage(), e1); + for (MultipartFile mpFile : uploadfiles) { + File file = Utils.getFile(mpFile, "upload-loader", "filedataloader"); + try { + ImportRecord record = importService.getRecord(file, mpFile.getOriginalFilename()); + if (record != null) { + records.add(record); + break; } + } catch (Exception e) { + log.error("Error processing data", e); + throw e; + } finally { + file.delete(); } } + } catch (FileMultipleOccurencesException e) { + throw new UnprocessableEntityException("Too many entries in file"); + } catch (Exception e) { + log.error("Error importing metadata", e); + } + WorkspaceItem source = submissionService. + createWorkspaceItem(context, getRequestService().getCurrentRequest()); + merge(context, records, source); + result = new ArrayList<>(); + result.add(source); - List result = null; - - //try to ingest workspaceitems - if (!tmpResult.isEmpty()) { - TransformationEngine transformationEngine2 = submissionLookupService.getPhase2TransformationEngine(); - if (transformationEngine2 != null) { - SubmissionItemDataLoader dataLoader = - (SubmissionItemDataLoader) transformationEngine2.getDataLoader(); - dataLoader.setDtoList(tmpResult); - // dataLoader.setProviders() - - DSpaceWorkspaceItemOutputGenerator outputGenerator = - (DSpaceWorkspaceItemOutputGenerator) transformationEngine2.getOutputGenerator(); - outputGenerator.setCollection(collection); - outputGenerator.setContext(context); - outputGenerator.setFormName(submissionConfig.getSubmissionName()); - outputGenerator.setDto(tmpResult.get(0)); - - try { - transformationEngine2.transform(spec); - result = outputGenerator.getWitems(); - } catch (BadTransformationSpec e1) { - e1.printStackTrace(); - } catch (MalformedSourceException e1) { - e1.printStackTrace(); - } - } - } - - //we have to create the workspaceitem to push the file also if nothing found before - if (result == null) { - WorkspaceItem source = - submissionService.createWorkspaceItem(context, getRequestService().getCurrentRequest()); - result = new ArrayList<>(); - result.add(source); - } - - //perform upload of bitstream if there is exact one result and convert workspaceitem to entity rest - if (result != null && !result.isEmpty()) { - for (WorkspaceItem wi : result) { - - List errors = new ArrayList(); - - //load bitstream into bundle ORIGINAL only if there is one result (approximately this is the - // right behaviour for pdf file but not for other 
bibliographic format e.g. bibtex) - if (result.size() == 1) { - - for (int i = 0; i < submissionConfig.getNumberOfSteps(); i++) { - SubmissionStepConfig stepConfig = submissionConfig.getStep(i); - - ClassLoader loader = this.getClass().getClassLoader(); - Class stepClass; - try { - stepClass = loader.loadClass(stepConfig.getProcessingClassName()); - - Object stepInstance = stepClass.newInstance(); - if (UploadableStep.class.isAssignableFrom(stepClass)) { - UploadableStep uploadableStep = (UploadableStep) stepInstance; - ErrorRest err = uploadableStep.upload(context, submissionService, stepConfig, wi, - uploadfile); + //perform upload of bitstream if there is exact one result and convert workspaceitem to entity rest + if (!result.isEmpty()) { + for (WorkspaceItem wi : result) { + List errors = new ArrayList(); + wi.setMultipleFiles(uploadfiles.size() > 1); + //load bitstream into bundle ORIGINAL only if there is one result (approximately this is the + // right behaviour for pdf file but not for other bibliographic format e.g. bibtex) + if (result.size() == 1) { + for (int i = 0; i < submissionConfig.getNumberOfSteps(); i++) { + SubmissionStepConfig stepConfig = submissionConfig.getStep(i); + ClassLoader loader = this.getClass().getClassLoader(); + Class stepClass; + try { + stepClass = loader.loadClass(stepConfig.getProcessingClassName()); + Object stepInstance = stepClass.newInstance(); + if (UploadableStep.class.isAssignableFrom(stepClass)) { + UploadableStep uploadableStep = (UploadableStep) stepInstance; + for (MultipartFile mpFile : uploadfiles) { + ErrorRest err = uploadableStep.upload(context, + submissionService, stepConfig, wi, mpFile); if (err != null) { errors.add(err); } } - - } catch (Exception e) { - log.error(e.getMessage(), e); } + } catch (Exception e) { + log.error(e.getMessage(), e); } } - WorkspaceItemRest wsi = converter.toRest(wi, utils.obtainProjection()); - if (result.size() == 1) { - if (!errors.isEmpty()) { - wsi.getErrors().addAll(errors); - } - } - results.add(wsi); } + WorkspaceItemRest wsi = converter.toRest(wi, utils.obtainProjection()); + if (result.size() == 1) { + if (!errors.isEmpty()) { + wsi.getErrors().addAll(errors); + } + } + results.add(wsi); } - } finally { - file.delete(); } return results; } @@ -551,4 +491,24 @@ public class WorkspaceItemRestRepository extends DSpaceRestRepository getPKClass() { return Integer.class; } + + private void merge(Context context, List records, WorkspaceItem item) throws SQLException { + for (MetadataValue metadataValue : itemService.getMetadata( + item.getItem(), Item.ANY, Item.ANY, Item.ANY, Item.ANY)) { + itemService.clearMetadata(context, item.getItem(), + metadataValue.getMetadataField().getMetadataSchema().getNamespace(), + metadataValue.getMetadataField().getElement(), + metadataValue.getMetadataField().getQualifier(), + metadataValue.getLanguage()); + } + for (ImportRecord record : records) { + if (record != null && record.getValueList() != null) { + for (MetadatumDTO metadataValue : record.getValueList()) { + itemService.addMetadata(context, item.getItem(), metadataValue.getSchema(), + metadataValue.getElement(), metadataValue.getQualifier(), null, + metadataValue.getValue()); + } + } + } + } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/patch/operation/EPersonPasswordReplaceOperation.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/patch/operation/EPersonPasswordReplaceOperation.java index 00b30e24f1..5a30f26fc1 100644 --- 
a/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/patch/operation/EPersonPasswordReplaceOperation.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/repository/patch/operation/EPersonPasswordReplaceOperation.java @@ -7,12 +7,22 @@ */ package org.dspace.app.rest.repository.patch.operation; +import java.sql.SQLException; + +import org.apache.commons.lang3.StringUtils; +import org.apache.logging.log4j.Logger; import org.dspace.app.rest.exception.DSpaceBadRequestException; import org.dspace.app.rest.model.patch.Operation; +import org.dspace.app.util.AuthorizeUtil; +import org.dspace.authorize.AuthorizeException; import org.dspace.core.Context; import org.dspace.eperson.EPerson; import org.dspace.eperson.factory.EPersonServiceFactory; +import org.dspace.eperson.service.AccountService; import org.dspace.eperson.service.EPersonService; +import org.dspace.services.RequestService; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.security.access.AccessDeniedException; import org.springframework.stereotype.Component; /** @@ -27,18 +37,35 @@ import org.springframework.stereotype.Component; @Component public class EPersonPasswordReplaceOperation extends PatchOperation { + private static final Logger log = org.apache.logging.log4j.LogManager + .getLogger(EPersonPasswordReplaceOperation.class); + /** * Path in json body of patch that uses this operation */ public static final String OPERATION_PASSWORD_CHANGE = "/password"; protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); + @Autowired + private RequestService requestService; + + @Autowired + private AccountService accountService; + @Override public R perform(Context context, R object, Operation operation) { checkOperationValue(operation.getValue()); if (supports(object, operation)) { EPerson eperson = (EPerson) object; + if (!AuthorizeUtil.authorizeUpdatePassword(context, eperson.getEmail())) { + throw new DSpaceBadRequestException("Password cannot be updated for the given EPerson with email: " + + eperson.getEmail()); + } + String token = requestService.getCurrentRequest().getHttpServletRequest().getParameter("token"); checkModelForExistingValue(eperson); + if (StringUtils.isNotBlank(token)) { + verifyAndDeleteToken(context, eperson, token, operation); + } ePersonService.setPassword(eperson, (String) operation.getValue()); return object; } else { @@ -46,6 +73,24 @@ public class EPersonPasswordReplaceOperation extends PatchOperation { } } + private void verifyAndDeleteToken(Context context, EPerson eperson, String token, Operation operation) { + try { + EPerson ePersonFromToken = accountService.getEPerson(context, token); + if (ePersonFromToken == null) { + throw new AccessDeniedException("The token in the parameter: " + token + " couldn't" + + " be associated with an EPerson"); + } + if (!ePersonFromToken.getID().equals(eperson.getID())) { + throw new AccessDeniedException("The token in the parameter belongs to a different EPerson" + + " than the uri indicates"); + } + context.setCurrentUser(ePersonFromToken); + accountService.deleteToken(context, token); + } catch (SQLException | AuthorizeException e) { + log.error("Failed to verify or delete the token for an EPerson patch", e); + } + } + /** * Checks whether the ePerson has a password via the ePersonService to checking if it has a non null password hash * throws a DSpaceBadRequestException if not pw hash was present diff --git 
a/dspace-server-webapp/src/main/java/org/dspace/app/rest/scripts/handler/impl/RestDSpaceRunnableHandler.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/scripts/handler/impl/RestDSpaceRunnableHandler.java index f2080dcd84..8f56513749 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/scripts/handler/impl/RestDSpaceRunnableHandler.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/scripts/handler/impl/RestDSpaceRunnableHandler.java @@ -14,6 +14,7 @@ import java.io.StringWriter; import java.sql.SQLException; import java.util.List; import java.util.Optional; +import java.util.UUID; import org.apache.commons.cli.HelpFormatter; import org.apache.commons.cli.Options; @@ -26,14 +27,17 @@ import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.BitstreamService; import org.dspace.core.Context; import org.dspace.eperson.EPerson; +import org.dspace.eperson.factory.EPersonServiceFactory; +import org.dspace.eperson.service.EPersonService; import org.dspace.scripts.DSpaceCommandLineParameter; import org.dspace.scripts.DSpaceRunnable; import org.dspace.scripts.Process; +import org.dspace.scripts.ProcessLogLevel; import org.dspace.scripts.factory.ScriptServiceFactory; import org.dspace.scripts.handler.DSpaceRunnableHandler; import org.dspace.scripts.service.ProcessService; import org.dspace.utils.DSpace; -import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor; +import org.springframework.core.task.TaskExecutor; /** * The {@link DSpaceRunnableHandler} dealing with Scripts started from the REST api @@ -44,9 +48,11 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { private BitstreamService bitstreamService = ContentServiceFactory.getInstance().getBitstreamService(); private ProcessService processService = ScriptServiceFactory.getInstance().getProcessService(); + private EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); private Integer processId; private String scriptName; + private UUID ePersonId; /** * This constructor will initialise the handler with the process created from the parameters @@ -57,6 +63,7 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { public RestDSpaceRunnableHandler(EPerson ePerson, String scriptName, List parameters) { Context context = new Context(); try { + ePersonId = ePerson.getID(); Process process = processService.create(context, ePerson, scriptName, parameters); processId = process.getID(); this.scriptName = process.getName(); @@ -96,10 +103,18 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { try { Process process = processService.find(context, processId); processService.complete(context, process); - context.complete(); logInfo("The script has completed"); + + addLogBitstreamToProcess(context); + + context.complete(); } catch (SQLException e) { log.error("RestDSpaceRunnableHandler with process: " + processId + " could not be completed", e); + } catch (IOException | AuthorizeException e) { + log.error("RestDSpaceRunnableHandler with process: " + processId + " could not be completed due to an " + + "error with the logging bitstream", e); + } catch (Exception e) { + log.error(e.getMessage(), e); } finally { if (context.isValid()) { context.abort(); @@ -130,9 +145,17 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { try { Process process = processService.find(context, processId); processService.fail(context, process); + + + 
addLogBitstreamToProcess(context); context.complete(); } catch (SQLException sqlException) { log.error("SQL exception while handling another exception", e); + } catch (IOException | AuthorizeException ioException) { + log.error("RestDSpaceRunnableHandler with process: " + processId + " could not be completed due to an " + + "error with the logging bitstream", e); + } catch (Exception exception) { + log.error(exception.getMessage(), exception); } finally { if (context.isValid()) { context.abort(); @@ -156,18 +179,26 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { String logMessage = getLogMessage(message); log.info(logMessage); + appendLogToProcess(message, ProcessLogLevel.INFO); + } @Override public void logWarning(String message) { String logMessage = getLogMessage(message); log.warn(logMessage); + + appendLogToProcess(message, ProcessLogLevel.WARNING); + } @Override public void logError(String message) { String logMessage = getLogMessage(message); log.error(logMessage); + + appendLogToProcess(message, ProcessLogLevel.ERROR); + } @Override @@ -231,9 +262,8 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { * @param script The script to be ran */ public void schedule(DSpaceRunnable script) { - ThreadPoolTaskExecutor taskExecutor = new DSpace().getServiceManager() - .getServiceByName("dspaceRunnableThreadExecutor", - ThreadPoolTaskExecutor.class); + TaskExecutor taskExecutor = new DSpace().getServiceManager() + .getServiceByName("dspaceRunnableThreadExecutor", TaskExecutor.class); Context context = new Context(); try { Process process = processService.find(context, processId); @@ -249,4 +279,24 @@ public class RestDSpaceRunnableHandler implements DSpaceRunnableHandler { } taskExecutor.execute(script); } + + private void appendLogToProcess(String message, ProcessLogLevel error) { + try { + processService.appendLog(processId, scriptName, message, error); + } catch (IOException e) { + log.error("RestDSpaceRunnableHandler with process: " + processId + " could not write log to process", e); + } + } + + private void addLogBitstreamToProcess(Context context) throws SQLException, IOException, AuthorizeException { + try { + EPerson ePerson = ePersonService.find(context, ePersonId); + Process process = processService.find(context, processId); + + context.setCurrentUser(ePerson); + processService.createLogBitstream(context, process); + } catch (SQLException | IOException | AuthorizeException e) { + log.error("RestDSpaceRunnableHandler with process: " + processId + " could not write log to process", e); + } + } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AnonymousAdditionalAuthorizationFilter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AnonymousAdditionalAuthorizationFilter.java new file mode 100644 index 0000000000..3087a5850b --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AnonymousAdditionalAuthorizationFilter.java @@ -0,0 +1,69 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.security; + +import java.io.IOException; +import java.sql.SQLException; +import java.util.List; +import javax.servlet.FilterChain; +import javax.servlet.ServletException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; + 
+import org.apache.log4j.Logger; +import org.dspace.app.rest.utils.ContextUtil; +import org.dspace.authenticate.service.AuthenticationService; +import org.dspace.core.Context; +import org.dspace.eperson.Group; +import org.springframework.security.authentication.AuthenticationManager; +import org.springframework.security.web.authentication.www.BasicAuthenticationFilter; + +/** + * This is a Filter class that'll fetch special groups from the {@link AuthenticationService} and set these in the + * current DSpace Context. It'll do extra processing on anonymous requests to determine which authorizations they + * can implicitly be granted, and adds those. + * This allows us, for example, to map a specific Group to a specific IP so that any request from that + * IP is always treated as being part of the configured group. + * The configuration for authentication through IP can be found in authentication-ip.cfg. + * This can be enabled by uncommenting the IPAuthentication plugin in authentication.cfg. + */ +public class AnonymousAdditionalAuthorizationFilter extends BasicAuthenticationFilter { + + private static final Logger log = Logger.getLogger(AnonymousAdditionalAuthorizationFilter.class); + + private AuthenticationService authenticationService; + + /** + * Constructor for the class + * @param authenticationManager The relevant AuthenticationManager + * @param authenticationService The autowired AuthenticationService + */ + public AnonymousAdditionalAuthorizationFilter(AuthenticationManager authenticationManager, + AuthenticationService authenticationService) { + super(authenticationManager); + this.authenticationService = authenticationService; + } + + @Override + protected void doFilterInternal(HttpServletRequest req, + HttpServletResponse res, + FilterChain chain) throws IOException, ServletException { + + Context context = ContextUtil.obtainContext(req); + try { + List<Group> groups = authenticationService.getSpecialGroups(context, req); + for (Group group : groups) { + context.setSpecialGroup(group.getID()); + } + } catch (SQLException e) { + log.error("Something went wrong trying to fetch groups in IPAuthenticationFilter", e); + } + chain.doFilter(req, res); + } + +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AuthorizeServicePermissionEvaluatorPlugin.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AuthorizeServicePermissionEvaluatorPlugin.java index f4a78e7d9d..8e525c7e35 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AuthorizeServicePermissionEvaluatorPlugin.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/AuthorizeServicePermissionEvaluatorPlugin.java @@ -65,37 +65,39 @@ public class AuthorizeServicePermissionEvaluatorPlugin extends RestObjectPermiss Context context = ContextUtil.obtainContext(request.getServletRequest()); EPerson ePerson = null; try { - UUID dsoId = UUIDUtils.fromString(targetId.toString()); - DSpaceObjectService dSpaceObjectService; - try { - dSpaceObjectService = + if (targetId != null) { + UUID dsoId = UUIDUtils.fromString(targetId.toString()); + DSpaceObjectService dSpaceObjectService; + try { + dSpaceObjectService = contentServiceFactory.getDSpaceObjectService(Constants.getTypeID(targetType)); - } catch (UnsupportedOperationException e) { - // ok not a dspace object - return false; - } - - ePerson = ePersonService.findByEmail(context, (String) authentication.getPrincipal()); - - if (dSpaceObjectService != null && dsoId != null) { - DSpaceObject dSpaceObject = 
dSpaceObjectService.find(context, dsoId); - - //If the dso is null then we give permission so we can throw another status code instead - if (dSpaceObject == null) { - return true; + } catch (UnsupportedOperationException e) { + // ok not a dspace object + return false; } - // If the item is still inprogress we can process here only the READ permission. - // Other actions need to be evaluated against the wrapper object (workspace or workflow item) - if (dSpaceObject instanceof Item) { - if (!DSpaceRestPermission.READ.equals(restPermission) - && !((Item) dSpaceObject).isArchived() && !((Item) dSpaceObject).isWithdrawn()) { - return false; + ePerson = ePersonService.findByEmail(context, (String) authentication.getPrincipal()); + + if (dSpaceObjectService != null && dsoId != null) { + DSpaceObject dSpaceObject = dSpaceObjectService.find(context, dsoId); + + //If the dso is null then we give permission so we can throw another status code instead + if (dSpaceObject == null) { + return true; } - } - return authorizeService.authorizeActionBoolean(context, ePerson, dSpaceObject, + // If the item is still inprogress we can process here only the READ permission. + // Other actions need to be evaluated against the wrapper object (workspace or workflow item) + if (dSpaceObject instanceof Item) { + if (!DSpaceRestPermission.READ.equals(restPermission) + && !((Item) dSpaceObject).isArchived() && !((Item) dSpaceObject).isWithdrawn()) { + return false; + } + } + + return authorizeService.authorizeActionBoolean(context, ePerson, dSpaceObject, restPermission.getDspaceApiActionId(), true); + } } } catch (SQLException e) { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/CustomLogoutHandler.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/CustomLogoutHandler.java index 204eda62dc..b3f4a00d37 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/CustomLogoutHandler.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/CustomLogoutHandler.java @@ -10,7 +10,6 @@ package org.dspace.app.rest.security; import javax.servlet.http.HttpServletRequest; import javax.servlet.http.HttpServletResponse; -import org.dspace.app.rest.security.jwt.JWTTokenHandler; import org.dspace.app.rest.utils.ContextUtil; import org.dspace.core.Context; import org.slf4j.Logger; @@ -29,7 +28,7 @@ import org.springframework.stereotype.Component; @Component public class CustomLogoutHandler implements LogoutHandler { - private static final Logger log = LoggerFactory.getLogger(JWTTokenHandler.class); + private static final Logger log = LoggerFactory.getLogger(CustomLogoutHandler.class); @Autowired private RestAuthenticationService restAuthenticationService; diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/DSpace401AuthenticationEntryPoint.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/DSpace401AuthenticationEntryPoint.java index 68aea6a526..b70931336e 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/DSpace401AuthenticationEntryPoint.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/DSpace401AuthenticationEntryPoint.java @@ -35,7 +35,6 @@ public class DSpace401AuthenticationEntryPoint implements AuthenticationEntryPoi response.setHeader("WWW-Authenticate", restAuthenticationService.getWwwAuthenticateHeaderValue(request, response)); - response.sendError(HttpServletResponse.SC_UNAUTHORIZED, - authException.getMessage()); + 
response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Authentication is required"); } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPlugin.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPlugin.java index 00c2c60cb2..50f209cedc 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPlugin.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPlugin.java @@ -11,7 +11,9 @@ import java.io.Serializable; import java.sql.SQLException; import java.util.List; import java.util.UUID; +import javax.servlet.http.HttpServletRequest; +import org.apache.commons.lang3.StringUtils; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.Patch; import org.dspace.app.rest.repository.patch.operation.DSpaceObjectMetadataPatchUtils; @@ -22,7 +24,6 @@ import org.dspace.authorize.service.AuthorizeService; import org.dspace.core.Constants; import org.dspace.core.Context; import org.dspace.eperson.EPerson; -import org.dspace.eperson.service.EPersonService; import org.dspace.services.RequestService; import org.dspace.services.model.Request; import org.slf4j.Logger; @@ -46,9 +47,6 @@ public class EPersonRestPermissionEvaluatorPlugin extends RestObjectPermissionEv @Autowired private RequestService requestService; - @Autowired - private EPersonService ePersonService; - @Override public boolean hasDSpacePermission(Authentication authentication, Serializable targetId, String targetType, DSpaceRestPermission permission) { @@ -68,27 +66,27 @@ public class EPersonRestPermissionEvaluatorPlugin extends RestObjectPermissionEv EPerson ePerson = null; - try { - ePerson = ePersonService.findByEmail(context, (String) authentication.getPrincipal()); - UUID dsoId = UUID.fromString(targetId.toString()); + ePerson = context.getCurrentUser(); + UUID dsoId = UUID.fromString(targetId.toString()); - // anonymous user + // anonymous user + try { if (ePerson == null) { return false; } else if (dsoId.equals(ePerson.getID())) { return true; } else if (authorizeService.isCommunityAdmin(context, ePerson) - && AuthorizeUtil.canCommunityAdminManageAccounts()) { + && AuthorizeUtil.canCommunityAdminManageAccounts()) { return true; } else if (authorizeService.isCollectionAdmin(context, ePerson) - && AuthorizeUtil.canCollectionAdminManageAccounts()) { + && AuthorizeUtil.canCollectionAdminManageAccounts()) { return true; } - } catch (SQLException e) { log.error(e.getMessage(), e); } + return false; } @@ -96,6 +94,17 @@ public class EPersonRestPermissionEvaluatorPlugin extends RestObjectPermissionEv public boolean hasPatchPermission(Authentication authentication, Serializable targetId, String targetType, Patch patch) { + List operations = patch.getOperations(); + // If it's a password replace action, we can allow anon through provided that there's a token present + Request currentRequest = requestService.getCurrentRequest(); + if (currentRequest != null) { + HttpServletRequest httpServletRequest = currentRequest.getHttpServletRequest(); + if (operations.size() > 0 && StringUtils.equalsIgnoreCase(operations.get(0).getOp(), "replace") + && StringUtils.equalsIgnoreCase(operations.get(0).getPath(), "/password") + && StringUtils.isNotBlank(httpServletRequest.getParameter("token"))) { + return true; + } + } /** * First verify that the user has write permission on the eperson. 
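The new branch above lets an otherwise anonymous request replace an EPerson's password when a single `replace` operation on `/password` is accompanied by a `token` request parameter (e.g. the forgot-password flow). A rough client-side sketch; the endpoint path, port, media type and all values are assumptions here, only the patch shape and the `token` parameter come from this change:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AnonymousPasswordResetSketch {
    public static void main(String[] args) throws Exception {
        // Placeholders: EPerson UUID and the token received out of band (e.g. via e-mail)
        String ePersonUuid = "028dcbb8-0da2-4fa6-bf0b-3a8dd88bbc23";
        String token = "registration-token-from-email";
        // A single "replace" operation on /password, as required by the permission check above
        String patchBody = "[{\"op\":\"replace\",\"path\":\"/password\",\"value\":\"NewPassword1!\"}]";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/server/api/eperson/epersons/"
                        + ePersonUuid + "?token=" + token))
                .header("Content-Type", "application/json")
                .method("PATCH", HttpRequest.BodyPublishers.ofString(patchBody))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
    }
}
```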
*/ @@ -103,7 +112,6 @@ public class EPersonRestPermissionEvaluatorPlugin extends RestObjectPermissionEv return false; } - List operations = patch.getOperations(); /** * The entire Patch request should be denied if it contains operations that are diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/PoolTaskRestPermissionEvaluatorPlugin.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/PoolTaskRestPermissionEvaluatorPlugin.java index 73c43714ba..d369cffcf9 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/PoolTaskRestPermissionEvaluatorPlugin.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/PoolTaskRestPermissionEvaluatorPlugin.java @@ -31,7 +31,7 @@ import org.springframework.stereotype.Component; /** * An authenticated user is allowed to interact with a pool task only if it is in his list. - * + * * @author Andrea Bollini (andrea.bollini at 4science.it) */ @Component @@ -75,7 +75,7 @@ public class PoolTaskRestPermissionEvaluatorPlugin extends RestObjectPermissionE XmlWorkflowItem workflowItem = poolTask.getWorkflowItem(); PoolTask poolTask2 = poolTaskService.findByWorkflowIdAndEPerson(context, workflowItem, ePerson); - if (poolTask2 != null && poolTask2.getID() == poolTask.getID()) { + if (poolTask2 != null && poolTask2.getID().equals(poolTask.getID())) { return true; } } catch (SQLException | AuthorizeException | IOException e) { diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/RestAuthenticationService.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/RestAuthenticationService.java index b1a47336ba..88b1d26524 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/RestAuthenticationService.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/RestAuthenticationService.java @@ -11,6 +11,7 @@ import java.io.IOException; import javax.servlet.http.HttpServletRequest; import javax.servlet.http.HttpServletResponse; +import org.dspace.app.rest.model.wrapper.AuthenticationToken; import org.dspace.authenticate.service.AuthenticationService; import org.dspace.core.Context; import org.dspace.eperson.EPerson; @@ -28,6 +29,14 @@ public interface RestAuthenticationService { void addAuthenticationDataForUser(HttpServletRequest request, HttpServletResponse response, DSpaceAuthentication authentication, boolean addCookie) throws IOException; + /** + * Retrieve a short lived authentication token, this can be used (among other things) for file downloads + * @param context the DSpace context + * @param request The current client request + * @return An AuthenticationToken that contains a string with the token + */ + AuthenticationToken getShortLivedAuthenticationToken(Context context, HttpServletRequest request); + EPerson getAuthenticatedEPerson(HttpServletRequest request, Context context); boolean hasAuthenticationData(HttpServletRequest request); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/StatelessAuthenticationFilter.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/StatelessAuthenticationFilter.java index 4ab9fb5371..ff845b00f2 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/StatelessAuthenticationFilter.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/StatelessAuthenticationFilter.java @@ -77,20 +77,21 @@ public class StatelessAuthenticationFilter extends BasicAuthenticationFilter { HttpServletResponse res, 
FilterChain chain) throws IOException, ServletException { - Authentication authentication = null; + Authentication authentication; try { authentication = getAuthentication(req, res); } catch (AuthorizeException e) { - res.sendError(HttpServletResponse.SC_UNAUTHORIZED, e.getMessage()); - log.error(e.getMessage(), e); + // just return an error, but do not log + res.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Authentication is required"); return; } catch (IllegalArgumentException | SQLException e) { - res.sendError(HttpServletResponse.SC_BAD_REQUEST, e.getMessage()); - log.error(e.getMessage(), e); + res.sendError(HttpServletResponse.SC_BAD_REQUEST, "Authentication request is invalid or incorrect"); + log.error("Authentication request is invalid or incorrect (status:{})", + HttpServletResponse.SC_BAD_REQUEST, e); return; } catch (AccessDeniedException e) { - res.sendError(HttpServletResponse.SC_FORBIDDEN, e.getMessage()); - log.error(e.getMessage(), e); + res.sendError(HttpServletResponse.SC_FORBIDDEN, "Access is denied"); + log.error("Access is denied (status:{})", HttpServletResponse.SC_FORBIDDEN, e); return; } if (authentication != null) { @@ -134,7 +135,7 @@ public class StatelessAuthenticationFilter extends BasicAuthenticationFilter { if (configurationService.getBooleanProperty("webui.user.assumelogin")) { return getOnBehalfOfAuthentication(context, onBehalfOfParameterValue, res); } else { - throw new IllegalArgumentException("The login as feature is not allowed" + + throw new IllegalArgumentException("The 'login as' feature is not allowed" + " due to the current configuration"); } } @@ -146,7 +147,7 @@ public class StatelessAuthenticationFilter extends BasicAuthenticationFilter { } } else { if (request.getHeader(ON_BEHALF_OF_REQUEST_PARAM) != null) { - throw new AuthorizeException("Only admins are allowed to use the login as feature"); + throw new AuthorizeException("Must be logged in (as an admin) to use the 'login as' feature"); } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/UsageReportRestPermissionEvaluatorPlugin.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/UsageReportRestPermissionEvaluatorPlugin.java new file mode 100644 index 0000000000..c8b57d305f --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/UsageReportRestPermissionEvaluatorPlugin.java @@ -0,0 +1,91 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.security; + +import java.io.Serializable; +import java.sql.SQLException; +import java.util.UUID; + +import org.apache.commons.lang3.StringUtils; +import org.dspace.app.rest.model.UsageReportRest; +import org.dspace.app.rest.utils.ContextUtil; +import org.dspace.app.rest.utils.DSpaceObjectUtils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.content.DSpaceObject; +import org.dspace.core.Context; +import org.dspace.services.RequestService; +import org.dspace.services.model.Request; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.rest.webmvc.ResourceNotFoundException; +import org.springframework.security.core.Authentication; +import org.springframework.stereotype.Component; + +/** + * This class will handle Permissions for the {@link UsageReportRest} 
object and its calls + * + * @author Maria Verdonck (Atmire) on 11/06/2020 + */ +@Component +public class UsageReportRestPermissionEvaluatorPlugin extends RestObjectPermissionEvaluatorPlugin { + + private static final Logger log = LoggerFactory.getLogger(UsageReportRestPermissionEvaluatorPlugin.class); + + @Autowired + private RequestService requestService; + + @Autowired + private DSpaceObjectUtils dspaceObjectUtil; + + @Autowired + AuthorizeService authorizeService; + + + + /** + * Responsible for checking whether or not the user has used a valid request (valid UUID in /usagereports/{ + * UUID_ReportID} or in /usagereports/search/object?uri={uri-ending-in/UUID}) and whether or not the user has the + * given (READ) rights on the corresponding DSO. + * + * @param targetType usagereport or usagereportsearch, so we know how to extract the UUID + * @param targetId string to extract uuid from + */ + @Override + public boolean hasDSpacePermission(Authentication authentication, Serializable targetId, String targetType, + DSpaceRestPermission restPermission) { + Request request = requestService.getCurrentRequest(); + Context context = ContextUtil.obtainContext(request.getServletRequest()); + UUID uuidObject = null; + if (targetId != null) { + if (StringUtils.equalsIgnoreCase(UsageReportRest.NAME, targetType)) { + if (StringUtils.countMatches(targetId.toString(), "_") != 1) { + throw new IllegalArgumentException("Must end in objectUUID_reportId, example: " + + "1911e8a4-6939-490c-b58b-a5d70f8d91fb_TopCountries"); + } + // Get uuid from uuidDSO_reportId pathParam + uuidObject = UUID.fromString(StringUtils.substringBefore(targetId.toString(), "_")); + } else if (StringUtils.equalsIgnoreCase(UsageReportRest.NAME + "search", targetType)) { + // Get uuid from url (selfLink of dso) queryParam + uuidObject = UUID.fromString(StringUtils.substringAfterLast(targetId.toString(), "/")); + } else { + return false; + } + try { + DSpaceObject dso = dspaceObjectUtil.findDSpaceObject(context, uuidObject); + if (dso == null) { + throw new ResourceNotFoundException("No DSO found with this UUID: " + uuidObject); + } + return authorizeService.authorizeActionBoolean(context, dso, restPermission.getDspaceApiActionId()); + } catch (SQLException e) { + log.error(e.getMessage(), e); + } + } + return true; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/WebSecurityConfiguration.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/WebSecurityConfiguration.java index 32c0cdda00..4471262ef6 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/WebSecurityConfiguration.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/WebSecurityConfiguration.java @@ -7,6 +7,7 @@ */ package org.dspace.app.rest.security; +import org.dspace.authenticate.service.AuthenticationService; import org.dspace.services.RequestService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.autoconfigure.security.SecurityProperties; @@ -53,6 +54,9 @@ public class WebSecurityConfiguration extends WebSecurityConfigurerAdapter { @Autowired private CustomLogoutHandler customLogoutHandler; + @Autowired + private AuthenticationService authenticationService; + @Override public void configure(WebSecurity webSecurity) throws Exception { webSecurity @@ -103,7 +107,8 @@ public class WebSecurityConfiguration extends WebSecurityConfigurerAdapter { //Everyone can call GET on the status endpoint .antMatchers(HttpMethod.GET, 
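To make the two accepted `targetId` shapes of the evaluator above concrete, here is a standalone illustration of the UUID extraction it performs; the UUID and base URL are placeholders:

```java
import java.util.UUID;

import org.apache.commons.lang3.StringUtils;

public class UsageReportIdParsingExample {
    public static void main(String[] args) {
        // Shape used for /usagereports/{objectUUID_reportId}
        String reportId = "1911e8a4-6939-490c-b58b-a5d70f8d91fb_TopCountries";
        UUID fromReportId = UUID.fromString(StringUtils.substringBefore(reportId, "_"));

        // Shape used for /usagereports/search/object?uri={uri-ending-in/UUID}
        String uri = "http://localhost:8080/server/api/core/items/1911e8a4-6939-490c-b58b-a5d70f8d91fb";
        UUID fromUri = UUID.fromString(StringUtils.substringAfterLast(uri, "/"));

        System.out.println(fromReportId + " / " + fromUri);
    }
}
```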
"/api/authn/status").permitAll() .and() - + .addFilterBefore(new AnonymousAdditionalAuthorizationFilter(authenticationManager(), authenticationService), + StatelessAuthenticationFilter.class) //Add a filter before our login endpoints to do the authentication based on the data in the HTTP request .addFilterBefore(new StatelessLoginFilter("/api/authn/login", authenticationManager(), restAuthenticationService), diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenHandler.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenHandler.java index 2f55653a79..dcfb364fdd 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenHandler.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenHandler.java @@ -42,23 +42,24 @@ import org.dspace.service.ClientInfoService; import org.dspace.services.ConfigurationService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.InitializingBean; import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.security.access.AccessDeniedException; import org.springframework.security.crypto.keygen.BytesKeyGenerator; import org.springframework.security.crypto.keygen.KeyGenerators; -import org.springframework.stereotype.Component; /** * Class responsible for creating and parsing JSON Web Tokens (JWTs), supports both JWS and JWE - * https://jwt.io/ + * https://jwt.io/ . This abstract class needs to be extended with a class providing the + * configuration keys for the particular type of token. * * @author Frederic Van Reet (frederic dot vanreet at atmire dot com) * @author Tom Desair (tom dot desair at atmire dot com) */ -@Component -public class JWTTokenHandler implements InitializingBean { +public abstract class JWTTokenHandler { private static final int MAX_CLOCK_SKEW_SECONDS = 60; + private static final String AUTHORIZATION_TOKEN_PARAMETER = "authentication-token"; + private static final Logger log = LoggerFactory.getLogger(JWTTokenHandler.class); @Autowired @@ -76,24 +77,44 @@ public class JWTTokenHandler implements InitializingBean { @Autowired private ClientInfoService clientInfoService; - private String jwtKey; - private long expirationTime; - private boolean includeIP; - private boolean encryptionEnabled; - private boolean compressionEnabled; - private byte[] encryptionKey; + private String generatedJwtKey; + private String generatedEncryptionKey; + /** + * Get the configuration property key for the token secret. + * @return the configuration property key + */ + protected abstract String getTokenSecretConfigurationKey(); - @Override - public void afterPropertiesSet() throws Exception { - this.jwtKey = getSecret("jwt.token.secret"); - this.encryptionKey = getSecret("jwt.encryption.secret").getBytes(); + /** + * Get the configuration property key for the encryption secret. + * @return the configuration property key + */ + protected abstract String getEncryptionSecretConfigurationKey(); - this.expirationTime = configurationService.getLongProperty("jwt.token.expiration", 30) * 60 * 1000; - this.includeIP = configurationService.getBooleanProperty("jwt.token.include.ip", true); - this.encryptionEnabled = configurationService.getBooleanProperty("jwt.encryption.enabled", false); - this.compressionEnabled = configurationService.getBooleanProperty("jwt.compression.enabled", false); - } + /** + * Get the configuration property key for the expiration time. 
+ * @return the configuration property key + */ + protected abstract String getTokenExpirationConfigurationKey(); + + /** + * Get the configuration property key for the include ip. + * @return the configuration property key + */ + protected abstract String getTokenIncludeIPConfigurationKey(); + + /** + * Get the configuration property key for the encryption enable setting. + * @return the configuration property key + */ + protected abstract String getEncryptionEnabledConfigurationKey(); + + /** + * Get the configuration property key for the compression enable setting. + * @return the configuration property key + */ + protected abstract String getCompressionEnabledConfigurationKey(); /** * Retrieve EPerson from a JSON Web Token (JWT) @@ -147,6 +168,11 @@ public class JWTTokenHandler implements InitializingBean { public String createTokenForEPerson(Context context, HttpServletRequest request, Date previousLoginDate, List groups) throws JOSEException, SQLException { + // Verify that the user isn't trying to use a short lived token to generate another token + if (StringUtils.isNotBlank(request.getParameter(AUTHORIZATION_TOKEN_PARAMETER))) { + throw new AccessDeniedException("Short lived tokens can't be used to generate other tokens"); + } + // Update the saved session salt for the currently logged in user, returning the user object EPerson ePerson = updateSessionSalt(context, previousLoginDate); @@ -184,17 +210,54 @@ public class JWTTokenHandler implements InitializingBean { } } - public long getExpirationPeriod() { - return expirationTime; + /** + * Retrieve the token secret key from configuration. If not specified, generate and cache a random 32 byte key + * @return configuration value or random 32 byte key + */ + public String getJwtKey() { + String secret = configurationService.getProperty(getTokenSecretConfigurationKey()); + + if (StringUtils.isBlank(secret)) { + if (StringUtils.isBlank(generatedJwtKey)) { + generatedJwtKey = generateRandomKey(); + } + secret = generatedJwtKey; + } + + return secret; } + public boolean getIncludeIP() { + return configurationService.getBooleanProperty(getTokenIncludeIPConfigurationKey(), true); + } + + public long getExpirationPeriod() { + return configurationService.getLongProperty(getTokenExpirationConfigurationKey(), 1800000); + } public boolean isEncryptionEnabled() { - return encryptionEnabled; + return configurationService.getBooleanProperty(getEncryptionEnabledConfigurationKey(), false); } + public boolean getCompressionEnabled() { + return configurationService.getBooleanProperty(getCompressionEnabledConfigurationKey(), false); + } + + /** + * Retrieve the encryption secret key from configuration. 
If not specified, generate and cache a random 32 byte key + * @return configuration value or random 32 byte key + */ public byte[] getEncryptionKey() { - return encryptionKey; + String secretString = configurationService.getProperty(getEncryptionSecretConfigurationKey()); + + if (StringUtils.isBlank(secretString)) { + if (StringUtils.isBlank(generatedEncryptionKey)) { + generatedEncryptionKey = generateRandomKey(); + } + secretString = generatedEncryptionKey; + } + + return secretString.getBytes(); } private JWEObject encryptJWT(SignedJWT signedJWT) throws JOSEException { @@ -220,7 +283,7 @@ public class JWTTokenHandler implements InitializingBean { * @return true if valid, false otherwise * @throws JOSEException */ - private boolean isValidToken(HttpServletRequest request, SignedJWT signedJWT, JWTClaimsSet jwtClaimsSet, + protected boolean isValidToken(HttpServletRequest request, SignedJWT signedJWT, JWTClaimsSet jwtClaimsSet, EPerson ePerson) throws JOSEException { if (ePerson == null || StringUtils.isBlank(ePerson.getSessionSalt())) { return false; @@ -310,7 +373,7 @@ public class JWTTokenHandler implements InitializingBean { //This method makes compression configurable private JWEHeader.Builder compression(JWEHeader.Builder builder) { - if (compressionEnabled) { + if (getCompressionEnabled()) { return builder.compressionAlgorithm(CompressionAlgorithm.DEF); } return builder; @@ -326,12 +389,12 @@ public class JWTTokenHandler implements InitializingBean { * @param ePerson * @return */ - private String buildSigningKey(HttpServletRequest request, EPerson ePerson) { + protected String buildSigningKey(HttpServletRequest request, EPerson ePerson) { String ipAddress = ""; - if (includeIP) { + if (getIncludeIP()) { ipAddress = getIpAddress(request); } - return jwtKey + ePerson.getSessionSalt() + ipAddress; + return getJwtKey() + ePerson.getSessionSalt() + ipAddress; } private String getIpAddress(HttpServletRequest request) { @@ -348,7 +411,7 @@ public class JWTTokenHandler implements InitializingBean { * @return EPerson object of current user, with an updated session salt * @throws SQLException */ - private EPerson updateSessionSalt(final Context context, final Date previousLoginDate) throws SQLException { + protected EPerson updateSessionSalt(final Context context, final Date previousLoginDate) throws SQLException { EPerson ePerson; try { @@ -358,7 +421,7 @@ public class JWTTokenHandler implements InitializingBean { //This allows a user to login on multiple devices/browsers at the same time. if (StringUtils.isBlank(ePerson.getSessionSalt()) || previousLoginDate == null - || (ePerson.getLastActive().getTime() - previousLoginDate.getTime() > expirationTime)) { + || (ePerson.getLastActive().getTime() - previousLoginDate.getTime() > getExpirationPeriod())) { ePerson.setSessionSalt(generateRandomKey()); ePersonService.update(context, ePerson); @@ -371,21 +434,6 @@ public class JWTTokenHandler implements InitializingBean { return ePerson; } - /** - * Retrieve the given secret key from configuration. 
If not specified, generate a random 32 byte key - * @param property configuration property to check for - * @return configuration value or random 32 byte key - */ - private String getSecret(String property) { - String secret = configurationService.getProperty(property); - - if (StringUtils.isBlank(secret)) { - secret = generateRandomKey(); - } - - return secret; - } - /** * Generate a random 32 bytes key */ diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenRestAuthenticationServiceImpl.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenRestAuthenticationServiceImpl.java index 3dbab09174..8aff9cc884 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenRestAuthenticationServiceImpl.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/JWTTokenRestAuthenticationServiceImpl.java @@ -18,6 +18,7 @@ import javax.servlet.http.HttpServletResponse; import com.nimbusds.jose.JOSEException; import org.apache.commons.lang3.StringUtils; +import org.dspace.app.rest.model.wrapper.AuthenticationToken; import org.dspace.app.rest.security.DSpaceAuthentication; import org.dspace.app.rest.security.RestAuthenticationService; import org.dspace.app.rest.utils.ContextUtil; @@ -47,9 +48,13 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication private static final String AUTHORIZATION_COOKIE = "Authorization-cookie"; private static final String AUTHORIZATION_HEADER = "Authorization"; private static final String AUTHORIZATION_TYPE = "Bearer"; + private static final String AUTHORIZATION_TOKEN_PARAMETER = "authentication-token"; @Autowired - private JWTTokenHandler jwtTokenHandler; + private LoginJWTTokenHandler loginJWTTokenHandler; + + @Autowired + private ShortLivedJWTTokenHandler shortLivedJWTTokenHandler; @Autowired private EPersonService ePersonService; @@ -71,7 +76,7 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication List groups = authenticationService.getSpecialGroups(context, request); - String token = jwtTokenHandler.createTokenForEPerson(context, request, + String token = loginJWTTokenHandler.createTokenForEPerson(context, request, authentication.getPreviousLoginDate(), groups); addTokenToResponse(response, token, addCookie); @@ -84,11 +89,40 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication } } + /** + * Create a short-lived token for bitstream downloads among other things + * @param context The context for which to create the token + * @param request The request for which to create the token + * @return The token with a short lifespan + */ + @Override + public AuthenticationToken getShortLivedAuthenticationToken(Context context, HttpServletRequest request) { + try { + String token; + List groups = authenticationService.getSpecialGroups(context, request); + token = shortLivedJWTTokenHandler.createTokenForEPerson(context, request, null, groups); + context.commit(); + return new AuthenticationToken(token); + } catch (JOSEException e) { + log.error("JOSE Exception", e); + } catch (SQLException e) { + log.error("SQL error when adding authentication", e); + } + + return null; + } + @Override public EPerson getAuthenticatedEPerson(HttpServletRequest request, Context context) { - String token = getToken(request); try { - EPerson ePerson = jwtTokenHandler.parseEPersonFromToken(token, request, context); + String token = getLoginToken(request); + EPerson ePerson = null; + if (token == null) { 
+ token = getShortLivedToken(request); + ePerson = shortLivedJWTTokenHandler.parseEPersonFromToken(token, request, context); + } else { + ePerson = loginJWTTokenHandler.parseEPersonFromToken(token, request, context); + } return ePerson; } catch (JOSEException e) { log.error("Jose error", e); @@ -103,15 +137,16 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication @Override public boolean hasAuthenticationData(HttpServletRequest request) { return StringUtils.isNotBlank(request.getHeader(AUTHORIZATION_HEADER)) - || StringUtils.isNotBlank(getAuthorizationCookie(request)); + || StringUtils.isNotBlank(getAuthorizationCookie(request)) + || StringUtils.isNotBlank(request.getParameter(AUTHORIZATION_TOKEN_PARAMETER)); } @Override public void invalidateAuthenticationData(HttpServletRequest request, HttpServletResponse response, Context context) throws Exception { - String token = getToken(request); + String token = getLoginToken(request); invalidateAuthenticationCookie(response); - jwtTokenHandler.invalidateToken(token, request, context); + loginJWTTokenHandler.invalidateToken(token, request, context); } @Override @@ -119,6 +154,7 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication Cookie cookie = new Cookie(AUTHORIZATION_COOKIE, ""); cookie.setHttpOnly(true); cookie.setMaxAge(0); + cookie.setSecure(true); response.addCookie(cookie); } @@ -166,7 +202,7 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication response.setHeader(AUTHORIZATION_HEADER, String.format("%s %s", AUTHORIZATION_TYPE, token)); } - private String getToken(HttpServletRequest request) { + private String getLoginToken(HttpServletRequest request) { String tokenValue = null; String authHeader = request.getHeader(AUTHORIZATION_HEADER); String authCookie = getAuthorizationCookie(request); @@ -179,6 +215,15 @@ public class JWTTokenRestAuthenticationServiceImpl implements RestAuthentication return tokenValue; } + private String getShortLivedToken(HttpServletRequest request) { + String tokenValue = null; + if (StringUtils.isNotBlank(request.getParameter(AUTHORIZATION_TOKEN_PARAMETER))) { + tokenValue = request.getParameter(AUTHORIZATION_TOKEN_PARAMETER); + } + + return tokenValue; + } + private String getAuthorizationCookie(HttpServletRequest request) { String authCookie = ""; Cookie[] cookies = request.getCookies(); diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/LoginJWTTokenHandler.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/LoginJWTTokenHandler.java new file mode 100644 index 0000000000..9446834519 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/LoginJWTTokenHandler.java @@ -0,0 +1,47 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.security.jwt; + +import org.springframework.stereotype.Component; + +/** + * Class responsible for creating and parsing JSON Web Tokens (JWTs), supports both JWS and JWE + * https://jwt.io/ + */ +@Component +public class LoginJWTTokenHandler extends JWTTokenHandler { + @Override + protected String getTokenSecretConfigurationKey() { + return "jwt.login.token.secret"; + } + + @Override + protected String getEncryptionSecretConfigurationKey() { + return "jwt.login.encryption.secret"; + } + + @Override + protected 
String getTokenExpirationConfigurationKey() { + return "jwt.login.token.expiration"; + } + + @Override + protected String getTokenIncludeIPConfigurationKey() { + return "jwt.login.token.include.ip"; + } + + @Override + protected String getEncryptionEnabledConfigurationKey() { + return "jwt.login.encryption.enabled"; + } + + @Override + protected String getCompressionEnabledConfigurationKey() { + return "jwt.login.compression.enabled"; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/ShortLivedJWTTokenHandler.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/ShortLivedJWTTokenHandler.java new file mode 100644 index 0000000000..902e391c30 --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/security/jwt/ShortLivedJWTTokenHandler.java @@ -0,0 +1,99 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.security.jwt; + +import java.util.Date; +import javax.servlet.http.HttpServletRequest; + +import com.nimbusds.jose.JOSEException; +import com.nimbusds.jose.JWSVerifier; +import com.nimbusds.jose.crypto.MACVerifier; +import com.nimbusds.jwt.JWTClaimsSet; +import com.nimbusds.jwt.SignedJWT; +import com.nimbusds.jwt.util.DateUtils; +import org.apache.commons.lang3.StringUtils; +import org.dspace.core.Context; +import org.dspace.eperson.EPerson; +import org.springframework.stereotype.Component; + +/** + * Class responsible for creating and parsing JSON Web Tokens (JWTs) used for bitstream + * downloads among other things, supports both JWS and JWE https://jwt.io/ . + */ +@Component +public class ShortLivedJWTTokenHandler extends JWTTokenHandler { + + /** + * Determine if current JWT is valid for the given EPerson object. + * To be valid, current JWT *must* have been signed by the EPerson and not be expired. + * If EPerson is null or does not have a known active session, false is returned immediately. + * @param request current request + * @param signedJWT current signed JWT + * @param jwtClaimsSet claims set of current JWT + * @param ePerson EPerson parsed from current signed JWT + * @return true if valid, false otherwise + * @throws JOSEException + */ + @Override + protected boolean isValidToken(HttpServletRequest request, SignedJWT signedJWT, JWTClaimsSet jwtClaimsSet, + EPerson ePerson) throws JOSEException { + if (ePerson == null || StringUtils.isBlank(ePerson.getSessionSalt())) { + return false; + } else { + JWSVerifier verifier = new MACVerifier(buildSigningKey(request, ePerson)); + + //If token is valid and not expired return eperson in token + Date expirationTime = jwtClaimsSet.getExpirationTime(); + return signedJWT.verify(verifier) + && expirationTime != null + //Ensure expiration timestamp is after the current time + && DateUtils.isAfter(expirationTime, new Date(), 0); + } + } + + /** + * The session salt doesn't need to be updated for short lived tokens. 
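Taken together, the short-lived token pieces above (the `authentication-token` request parameter, the dedicated ShortLivedJWTTokenHandler and RestAuthenticationService.getShortLivedAuthenticationToken) let a logged-in client obtain a disposable token and pass it as a query parameter, for example for bitstream downloads where setting an Authorization header is awkward. A hedged sketch of the client side; the bitstream URL and token value are placeholders, and only the parameter name comes from this change set:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ShortLivedTokenDownloadSketch {
    public static void main(String[] args) throws Exception {
        // Token previously obtained from the REST layer while authenticated (placeholder value)
        String shortLivedToken = "eyJhbGciOiJIUzI1NiJ9.example.example";
        // Placeholder bitstream content URL; the token is passed as the "authentication-token" parameter
        String url = "http://localhost:8080/server/api/core/bitstreams/"
                + "9a2e8a7d-0f3c-4c2b-8f5e-0d6f3a1b2c3d/content"
                + "?authentication-token=" + shortLivedToken;

        HttpResponse<byte[]> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofByteArray());
        System.out.println("HTTP " + response.statusCode() + ", " + response.body().length + " bytes");
    }
}
```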
+ * @param context current DSpace Context + * @param previousLoginDate date of last login (prior to this one) + * @return EPerson object of current user, with an updated session salt + */ + @Override + protected EPerson updateSessionSalt(final Context context, final Date previousLoginDate) { + return context.getCurrentUser(); + } + + @Override + protected String getTokenSecretConfigurationKey() { + return "jwt.shortLived.token.secret"; + } + + @Override + protected String getEncryptionSecretConfigurationKey() { + return "jwt.shortLived.encryption.secret"; + } + + @Override + protected String getTokenExpirationConfigurationKey() { + return "jwt.shortLived.token.expiration"; + } + + @Override + protected String getTokenIncludeIPConfigurationKey() { + return "jwt.shortLived.token.include.ip"; + } + + @Override + protected String getEncryptionEnabledConfigurationKey() { + return "jwt.shortLived.encryption.enabled"; + } + + @Override + protected String getCompressionEnabledConfigurationKey() { + return "jwt.shortLived.compression.enabled"; + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/factory/impl/MetadataValueRemovePatchOperation.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/factory/impl/MetadataValueRemovePatchOperation.java index 5ea17cc3cd..1660a5455a 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/factory/impl/MetadataValueRemovePatchOperation.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/factory/impl/MetadataValueRemovePatchOperation.java @@ -8,6 +8,7 @@ package org.dspace.app.rest.submit.factory.impl; import java.sql.SQLException; +import java.util.Arrays; import java.util.List; import org.dspace.app.rest.model.MetadataValueRest; @@ -40,17 +41,10 @@ public abstract class MetadataValueRemovePatchOperation mm = getDSpaceObjectService().getMetadata(source, metadata[0], metadata[1], metadata[2], Item.ANY); - getDSpaceObjectService().clearMetadata(context, source, metadata[0], metadata[1], metadata[2], Item.ANY); if (index != -1) { - int idx = 0; - for (MetadataValue m : mm) { - if (idx != index) { - getDSpaceObjectService().addMetadata(context, source, metadata[0], metadata[1], metadata[2], - m.getLanguage(), m.getValue(), m.getAuthority(), - m.getConfidence()); - } - idx++; - } + getDSpaceObjectService().removeMetadataValues(context, source, Arrays.asList(mm.get(index))); + } else { + getDSpaceObjectService().clearMetadata(context, source, metadata[0], metadata[1], metadata[2], Item.ANY); } } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/step/validation/UploadValidation.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/step/validation/UploadValidation.java index f1e8172b8b..0c896db9aa 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/step/validation/UploadValidation.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/submit/step/validation/UploadValidation.java @@ -41,15 +41,11 @@ public class UploadValidation extends AbstractValidation { SubmissionStepConfig config) throws DCInputsReaderException, SQLException { //TODO MANAGE METADATA - for (String key : uploadConfigurationService.getMap().keySet()) { - if (getName().equals(key)) { - UploadConfiguration uploadConfig = uploadConfigurationService.getMap().get(key); - if (uploadConfig.isRequired() && !itemService.hasUploadedFiles(obj.getItem())) { - addError(ERROR_VALIDATION_FILEREQUIRED, - "/" + 
WorkspaceItemRestRepository.OPERATION_PATH_SECTIONS + "/" - + config.getId()); - } - } + UploadConfiguration uploadConfig = uploadConfigurationService.getMap().get(config.getId()); + if (uploadConfig.isRequired() && !itemService.hasUploadedFiles(obj.getItem())) { + addError(ERROR_VALIDATION_FILEREQUIRED, + "/" + WorkspaceItemRestRepository.OPERATION_PATH_SECTIONS + "/" + + config.getId()); } return getErrors(); } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/AuthorityUtils.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/AuthorityUtils.java index 97be32ecf9..1a2a071fe1 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/AuthorityUtils.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/AuthorityUtils.java @@ -7,9 +7,11 @@ */ package org.dspace.app.rest.utils; +import org.apache.commons.lang3.StringUtils; import org.dspace.app.rest.converter.ConverterService; -import org.dspace.app.rest.model.AuthorityEntryRest; -import org.dspace.app.rest.model.AuthorityRest; +import org.dspace.app.rest.model.VocabularyEntryDetailsRest; +import org.dspace.app.rest.model.VocabularyEntryRest; +import org.dspace.app.rest.model.VocabularyRest; import org.dspace.app.rest.projection.Projection; import org.dspace.content.authority.Choice; import org.dspace.content.authority.ChoiceAuthority; @@ -27,6 +29,8 @@ public class AuthorityUtils { public static final String PRESENTATION_TYPE_LOOKUP = "lookup"; + public static final String PRESENTATION_TYPE_AUTHORLOOKUP = "authorLookup"; + public static final String PRESENTATION_TYPE_SUGGEST = "suggest"; public static final String RESERVED_KEYMAP_PARENT = "parent"; @@ -39,11 +43,11 @@ public class AuthorityUtils { public boolean isChoice(String schema, String element, String qualifier) { - return cas.isChoicesConfigured(org.dspace.core.Utils.standardize(schema, element, qualifier, "_")); + return cas.isChoicesConfigured(org.dspace.core.Utils.standardize(schema, element, qualifier, "_"), null); } public String getAuthorityName(String schema, String element, String qualifier) { - return cas.getChoiceAuthorityName(schema, element, qualifier); + return cas.getChoiceAuthorityName(schema, element, qualifier, null); } public boolean isClosed(String schema, String element, String qualifier) { @@ -62,9 +66,45 @@ public class AuthorityUtils { * @param projection the name of the projection to use, or {@code null}. 
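For the MetadataValueRemovePatchOperation change shown a little earlier: instead of clearing the whole field and re-adding every value except one, the operation now removes only the value at the requested index. A small sketch of that strategy against the service API; the field and index are arbitrary examples, and persisting the item afterwards is left to the caller, as in the patch framework:

```java
import java.sql.SQLException;
import java.util.Arrays;
import java.util.List;

import org.dspace.content.Item;
import org.dspace.content.MetadataValue;
import org.dspace.content.service.ItemService;
import org.dspace.core.Context;

public class RemoveSingleMetadataValueSketch {

    /** Remove only the dc.subject value at the given index, leaving all other values untouched. */
    public static void removeSubjectAt(Context context, ItemService itemService, Item item, int index)
            throws SQLException {
        List<MetadataValue> values = itemService.getMetadata(item, "dc", "subject", Item.ANY, Item.ANY);
        if (index >= 0 && index < values.size()) {
            itemService.removeMetadataValues(context, item, Arrays.asList(values.get(index)));
        }
    }
}
```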
* @return */ - public AuthorityEntryRest convertEntry(Choice choice, String authorityName, Projection projection) { - AuthorityEntryRest entry = converter.toRest(choice, projection); - entry.setAuthorityName(authorityName); + public VocabularyEntryDetailsRest convertEntryDetails(Choice choice, String authorityName, + boolean isHierarchical, Projection projection) { + if (choice == null) { + return null; + } + VocabularyEntryDetailsRest entry = converter.toRest(choice, projection); + entry.setVocabularyName(authorityName); + entry.setId(authorityName + ":" + entry.getId()); + entry.setInHierarchicalVocabulary(isHierarchical); + return entry; + } + + /** + * This utility method is currently a workaround to enrich the REST object with + * information from the parent vocabulary that is not referenced by the Choice + * model. + * + * @param choice the dspace-api choice to expose as vocabulary entry + * @param authorityName the name of the vocabulary + * @param storeAuthority true if the entry id should be exposed as + * an authority for storing it in the metadatavalue + * @param projection the rest projection to apply + * @return the vocabulary entry rest representation of the provided choice + */ + public VocabularyEntryRest convertEntry(Choice choice, String authorityName, boolean storeAuthority, + Projection projection) { + if (choice == null) { + return null; + } + VocabularyEntryRest entry = new VocabularyEntryRest(); + entry.setDisplay(choice.label); + entry.setValue(choice.value); + entry.setOtherInformation(choice.extras); + if (storeAuthority) { + entry.setAuthority(choice.authority); + } + if (StringUtils.isNotBlank(choice.authority)) { + entry.setVocabularyEntryDetailsRest(converter.toRest(choice, projection)); + } return entry; } @@ -76,8 +116,8 @@ public class AuthorityUtils { * @param projection the projection to use. 
* @return */ - public AuthorityRest convertAuthority(ChoiceAuthority source, String authorityName, Projection projection) { - AuthorityRest result = converter.toRest(source, projection); + public VocabularyRest convertAuthority(ChoiceAuthority source, String authorityName, Projection projection) { + VocabularyRest result = converter.toRest(source, projection); result.setName(authorityName); return result; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/DiscoverQueryBuilder.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/DiscoverQueryBuilder.java index a6d3fd00dd..3afbdfb8a3 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/DiscoverQueryBuilder.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/DiscoverQueryBuilder.java @@ -7,6 +7,10 @@ */ package org.dspace.app.rest.utils; +import static java.util.Collections.emptyList; +import static java.util.Collections.singletonList; +import static org.apache.commons.collections4.CollectionUtils.isNotEmpty; + import java.sql.SQLException; import java.util.ArrayList; import java.util.List; @@ -65,14 +69,47 @@ public class DiscoverQueryBuilder implements InitializingBean { pageSizeLimit = configurationService.getIntProperty("rest.search.max.results", 100); } + /** + * Build a discovery query + * + * @param context the DSpace context + * @param scope the scope for this discovery query + * @param discoveryConfiguration the discovery configuration for this discovery query + * @param query the query string for this discovery query + * @param searchFilters the search filters for this discovery query + * @param dsoType only include search results with this type + * @param page the pageable for this discovery query + */ public DiscoverQuery buildQuery(Context context, IndexableObject scope, DiscoveryConfiguration discoveryConfiguration, String query, List searchFilters, String dsoType, Pageable page) throws DSpaceBadRequestException { + List dsoTypes = dsoType != null ? singletonList(dsoType) : emptyList(); + + return buildQuery(context, scope, discoveryConfiguration, query, searchFilters, dsoTypes, page); + } + + /** + * Build a discovery query + * + * @param context the DSpace context + * @param scope the scope for this discovery query + * @param discoveryConfiguration the discovery configuration for this discovery query + * @param query the query string for this discovery query + * @param searchFilters the search filters for this discovery query + * @param dsoTypes only include search results with one of these types + * @param page the pageable for this discovery query + */ + public DiscoverQuery buildQuery(Context context, IndexableObject scope, + DiscoveryConfiguration discoveryConfiguration, + String query, List searchFilters, + List dsoTypes, Pageable page) + throws DSpaceBadRequestException { + DiscoverQuery queryArgs = buildCommonDiscoverQuery(context, discoveryConfiguration, query, searchFilters, - dsoType); + dsoTypes); //When all search criteria are set, configure facet results addFaceting(context, scope, queryArgs, discoveryConfiguration); @@ -98,14 +135,52 @@ public class DiscoverQueryBuilder implements InitializingBean { } } + /** + * Create a discovery facet query. + * + * @param context the DSpace context + * @param scope the scope for this discovery query + * @param discoveryConfiguration the discovery configuration for this discovery query + * @param prefix limit the facets results to those starting with the given prefix. 
+ * @param query the query string for this discovery query + * @param searchFilters the search filters for this discovery query + * @param dsoType only include search results with this type + * @param page the pageable for this discovery query + * @param facetName the facet field + */ public DiscoverQuery buildFacetQuery(Context context, IndexableObject scope, DiscoveryConfiguration discoveryConfiguration, String prefix, String query, List searchFilters, String dsoType, Pageable page, String facetName) throws DSpaceBadRequestException { + List dsoTypes = dsoType != null ? singletonList(dsoType) : emptyList(); + + return buildFacetQuery( + context, scope, discoveryConfiguration, prefix, query, searchFilters, dsoTypes, page, facetName); + } + + /** + * Create a discovery facet query. + * + * @param context the DSpace context + * @param scope the scope for this discovery query + * @param discoveryConfiguration the discovery configuration for this discovery query + * @param prefix limit the facets results to those starting with the given prefix. + * @param query the query string for this discovery query + * @param searchFilters the search filters for this discovery query + * @param dsoTypes only include search results with one of these types + * @param page the pageable for this discovery query + * @param facetName the facet field + */ + public DiscoverQuery buildFacetQuery(Context context, IndexableObject scope, + DiscoveryConfiguration discoveryConfiguration, + String prefix, String query, List searchFilters, + List dsoTypes, Pageable page, String facetName) + throws DSpaceBadRequestException { + DiscoverQuery queryArgs = buildCommonDiscoverQuery(context, discoveryConfiguration, query, searchFilters, - dsoType); + dsoTypes); //When all search criteria are set, configure facet results addFacetingForFacets(context, scope, prefix, queryArgs, discoveryConfiguration, facetName, page); @@ -170,7 +245,7 @@ public class DiscoverQueryBuilder implements InitializingBean { private DiscoverQuery buildCommonDiscoverQuery(Context context, DiscoveryConfiguration discoveryConfiguration, String query, - List searchFilters, String dsoType) + List searchFilters, List dsoTypes) throws DSpaceBadRequestException { DiscoverQuery queryArgs = buildBaseQueryForConfiguration(discoveryConfiguration); @@ -182,10 +257,13 @@ public class DiscoverQueryBuilder implements InitializingBean { queryArgs.setQuery(query); } - //Limit results to DSO type - if (StringUtils.isNotBlank(dsoType)) { - queryArgs.setDSpaceObjectFilter(getDsoType(dsoType)); + //Limit results to DSO types + if (isNotEmpty(dsoTypes)) { + dsoTypes.stream() + .map(this::getDsoType) + .forEach(queryArgs::addDSpaceObjectFilter); } + return queryArgs; } diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/MultipartFileSender.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/MultipartFileSender.java index 4ae836bccf..284d0b87ab 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/MultipartFileSender.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/MultipartFileSender.java @@ -156,9 +156,13 @@ public class MultipartFileSender { // Initialize response. 
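The DiscoverQueryBuilder above gains overloads that accept a list of DSO types instead of a single one, each mapped onto the query via addDSpaceObjectFilter. A sketch of calling the new overload; the caller-supplied context, scope, configuration, pageable and the type name strings are assumptions:

```java
import java.util.Arrays;
import java.util.Collections;

import org.dspace.app.rest.utils.DiscoverQueryBuilder;
import org.dspace.core.Context;
import org.dspace.discovery.DiscoverQuery;
import org.dspace.discovery.IndexableObject;
import org.dspace.discovery.configuration.DiscoveryConfiguration;
import org.springframework.data.domain.PageRequest;

public class MultiTypeDiscoverQuerySketch {

    /** Build a query limited to results of either of two DSO types. */
    public static DiscoverQuery buildExampleQuery(DiscoverQueryBuilder builder, Context context,
            IndexableObject scope, DiscoveryConfiguration configuration) {
        return builder.buildQuery(context, scope, configuration,
                "dark matter",                       // free-text query
                Collections.emptyList(),             // no additional search filters
                Arrays.asList("Item", "Collection"), // include results of either type
                PageRequest.of(0, 20));
    }
}
```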
response.reset(); response.setBufferSize(bufferSize); - response.setHeader(CONTENT_TYPE, contentType); + if (contentType != null) { + response.setHeader(CONTENT_TYPE, contentType); + } response.setHeader(ACCEPT_RANGES, BYTES); - response.setHeader(ETAG, checksum); + if (checksum != null) { + response.setHeader(ETAG, checksum); + } response.setDateHeader(LAST_MODIFIED, lastModified); response.setDateHeader(EXPIRES, System.currentTimeMillis() + DEFAULT_EXPIRE_TIME); @@ -481,4 +485,4 @@ public class MultipartFileSender { return Arrays.binarySearch(matchValues, toMatch) > -1 || Arrays.binarySearch(matchValues, "*") > -1; } -} \ No newline at end of file +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/RegexUtils.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/RegexUtils.java index 8e887261db..ea0b793055 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/RegexUtils.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/RegexUtils.java @@ -28,7 +28,8 @@ public class RegexUtils { * identifier (digits or uuid) */ public static final String REGEX_REQUESTMAPPING_IDENTIFIER_AS_STRING_VERSION_STRONG = "/{id:^(?!^\\d+$)" + - "(?!^[0-9a-fxA-FX]{8}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{12}$)[\\w+\\-\\.]+$+}"; + "(?!^[0-9a-fxA-FX]{8}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{12}$)" + + "[\\w+\\-\\.:]+$+}"; /** * Regular expression in the request mapping to accept number as identifier diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/UsageReportUtils.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/UsageReportUtils.java new file mode 100644 index 0000000000..ca10b9face --- /dev/null +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/UsageReportUtils.java @@ -0,0 +1,352 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.utils; + +import java.io.IOException; +import java.sql.SQLException; +import java.text.ParseException; +import java.util.ArrayList; +import java.util.List; + +import org.apache.commons.lang3.StringUtils; +import org.apache.solr.client.solrj.SolrServerException; +import org.dspace.app.rest.model.UsageReportPointCityRest; +import org.dspace.app.rest.model.UsageReportPointCountryRest; +import org.dspace.app.rest.model.UsageReportPointDateRest; +import org.dspace.app.rest.model.UsageReportPointDsoTotalVisitsRest; +import org.dspace.app.rest.model.UsageReportRest; +import org.dspace.content.Bitstream; +import org.dspace.content.DSpaceObject; +import org.dspace.content.Item; +import org.dspace.content.Site; +import org.dspace.core.Constants; +import org.dspace.core.Context; +import org.dspace.handle.service.HandleService; +import org.dspace.statistics.Dataset; +import org.dspace.statistics.content.DatasetDSpaceObjectGenerator; +import org.dspace.statistics.content.DatasetTimeGenerator; +import org.dspace.statistics.content.DatasetTypeGenerator; +import org.dspace.statistics.content.StatisticsDataVisits; +import org.dspace.statistics.content.StatisticsListing; +import org.dspace.statistics.content.StatisticsTable; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.rest.webmvc.ResourceNotFoundException; +import org.springframework.stereotype.Component; + +/** 
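The RegexUtils tweak above widens the strong string-identifier pattern to also accept a colon, presumably so that composite ids such as the vocabularyName:entryId ids introduced for vocabulary entry details can be used as path variables. A quick standalone check of the pattern body (without the surrounding Spring `{id:...}` wrapper); the sample id is made up:

```java
import java.util.regex.Pattern;

public class IdentifierRegexCheck {
    public static void main(String[] args) {
        String regex = "^(?!^\\d+$)"
                + "(?!^[0-9a-fxA-FX]{8}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{4}-[0-9a-fxA-FX]{12}$)"
                + "[\\w+\\-\\.:]+$";
        System.out.println(Pattern.matches(regex, "srsc:SCB1922")); // true: colons are now accepted
        System.out.println(Pattern.matches(regex, "123456"));       // false: purely numeric ids are excluded
    }
}
```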
+ * This is the Service dealing with the {@link UsageReportRest} logic + * + * @author Maria Verdonck (Atmire) on 08/06/2020 + */ +@Component +public class UsageReportUtils { + + @Autowired + private HandleService handleService; + + public static final String TOTAL_VISITS_REPORT_ID = "TotalVisits"; + public static final String TOTAL_VISITS_PER_MONTH_REPORT_ID = "TotalVisitsPerMonth"; + public static final String TOTAL_DOWNLOADS_REPORT_ID = "TotalDownloads"; + public static final String TOP_COUNTRIES_REPORT_ID = "TopCountries"; + public static final String TOP_CITIES_REPORT_ID = "TopCities"; + + /** + * Get list of usage reports that are applicable to the DSO (of given UUID) + * + * @param context DSpace context + * @param dso DSpaceObject we want all available usage reports of + * @return List of usage reports, applicable to the given DSO + */ + public List<UsageReportRest> getUsageReportsOfDSO(Context context, DSpaceObject dso) + throws SQLException, ParseException, SolrServerException, IOException { + List<UsageReportRest> usageReports = new ArrayList<>(); + if (dso instanceof Site) { + UsageReportRest globalUsageStats = this.resolveGlobalUsageReport(context); + globalUsageStats.setId(dso.getID().toString() + "_" + TOTAL_VISITS_REPORT_ID); + usageReports.add(globalUsageStats); + } else { + usageReports.add(this.createUsageReport(context, dso, TOTAL_VISITS_REPORT_ID)); + usageReports.add(this.createUsageReport(context, dso, TOTAL_VISITS_PER_MONTH_REPORT_ID)); + usageReports.add(this.createUsageReport(context, dso, TOP_COUNTRIES_REPORT_ID)); + usageReports.add(this.createUsageReport(context, dso, TOP_CITIES_REPORT_ID)); + } + if (dso instanceof Item || dso instanceof Bitstream) { + usageReports.add(this.createUsageReport(context, dso, TOTAL_DOWNLOADS_REPORT_ID)); + } + return usageReports; + } + + /** + * Creates the usage report that corresponds to the given report id. + * If the report id or the object uuid is invalid, an exception is thrown. 
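+     * For example (illustrative, following the setters at the end of this method): requesting the
+     * {@code TotalVisits} report for an Item produces a report whose id takes the form
+     * {uuid}_TotalVisits and whose reportType is {@code TotalVisits}.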
+ * + * @param context DSpace context + * @param dso DSpace object we want a stat usage report on + * @param reportId Type of usage report requested + * @return Rest object containing the stat usage report, see {@link UsageReportRest} + */ + public UsageReportRest createUsageReport(Context context, DSpaceObject dso, String reportId) + throws ParseException, SolrServerException, IOException { + try { + UsageReportRest usageReportRest; + switch (reportId) { + case TOTAL_VISITS_REPORT_ID: + usageReportRest = resolveTotalVisits(context, dso); + usageReportRest.setReportType(TOTAL_VISITS_REPORT_ID); + break; + case TOTAL_VISITS_PER_MONTH_REPORT_ID: + usageReportRest = resolveTotalVisitsPerMonth(context, dso); + usageReportRest.setReportType(TOTAL_VISITS_PER_MONTH_REPORT_ID); + break; + case TOTAL_DOWNLOADS_REPORT_ID: + usageReportRest = resolveTotalDownloads(context, dso); + usageReportRest.setReportType(TOTAL_DOWNLOADS_REPORT_ID); + break; + case TOP_COUNTRIES_REPORT_ID: + usageReportRest = resolveTopCountries(context, dso); + usageReportRest.setReportType(TOP_COUNTRIES_REPORT_ID); + break; + case TOP_CITIES_REPORT_ID: + usageReportRest = resolveTopCities(context, dso); + usageReportRest.setReportType(TOP_CITIES_REPORT_ID); + break; + default: + throw new ResourceNotFoundException("The given report id can't be resolved: " + reportId + "; " + + "available reports: TotalVisits, TotalVisitsPerMonth, " + + "TotalDownloads, TopCountries, TopCities"); + } + usageReportRest.setId(dso.getID() + "_" + reportId); + return usageReportRest; + } catch (SQLException e) { + throw new SolrServerException("SQLException trying to receive statistics of: " + dso.getID()); + } + } + + /** + * Create stat usage report of the items most popular over entire site + * + * @param context DSpace context + * @return Usage report with top most popular items + */ + private UsageReportRest resolveGlobalUsageReport(Context context) + throws SQLException, IOException, ParseException, SolrServerException { + StatisticsListing statListing = new StatisticsListing( + new StatisticsDataVisits()); + + // Adding a new generator for our top 10 items without a name length delimiter + DatasetDSpaceObjectGenerator dsoAxis = new DatasetDSpaceObjectGenerator(); + // TODO make max nr of top items (views wise)? Must be set + dsoAxis.addDsoChild(Constants.ITEM, 10, false, -1); + statListing.addDatasetGenerator(dsoAxis); + + Dataset dataset = statListing.getDataset(context, 1); + + UsageReportRest usageReportRest = new UsageReportRest(); + for (int i = 0; i < dataset.getColLabels().size(); i++) { + UsageReportPointDsoTotalVisitsRest totalVisitPoint = new UsageReportPointDsoTotalVisitsRest(); + totalVisitPoint.setType("item"); + String urlOfItem = dataset.getColLabelsAttrs().get(i).get("url"); + if (urlOfItem != null) { + String handle = StringUtils.substringAfterLast(urlOfItem, "handle/"); + if (handle != null) { + DSpaceObject dso = handleService.resolveToObject(context, handle); + totalVisitPoint.setId(dso != null ? dso.getID().toString() : urlOfItem); + totalVisitPoint.setLabel(dso != null ? dso.getName() : urlOfItem); + totalVisitPoint.addValue("views", Integer.valueOf(dataset.getMatrix()[0][i])); + usageReportRest.addPoint(totalVisitPoint); + } + } + } + usageReportRest.setReportType(TOTAL_VISITS_REPORT_ID); + return usageReportRest; + } + + /** + * Create a stat usage report for the amount of TotalVisit on a DSO, containing one point with the amount of + * views on the DSO in. If there are no views on the DSO this point contains views=0. 
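+     * As an illustrative sketch (values hypothetical, field names taken from the point setters below):
+     * an Item with three recorded views yields one point of type "item", with the Item's UUID as id,
+     * the Item's name as label, and a single value entry views=3.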
+ * + * @param context DSpace context + * @param dso DSO we want usage report with TotalVisits on the DSO + * @return Rest object containing the TotalVisits usage report of the given DSO + */ + private UsageReportRest resolveTotalVisits(Context context, DSpaceObject dso) + throws SQLException, IOException, ParseException, SolrServerException { + Dataset dataset = this.getDSOStatsDataset(context, dso, 1, dso.getType()); + + UsageReportRest usageReportRest = new UsageReportRest(); + UsageReportPointDsoTotalVisitsRest totalVisitPoint = new UsageReportPointDsoTotalVisitsRest(); + totalVisitPoint.setType(StringUtils.substringAfterLast(dso.getClass().getName().toLowerCase(), ".")); + totalVisitPoint.setId(dso.getID().toString()); + if (dataset.getColLabels().size() > 0) { + totalVisitPoint.setLabel(dso.getName()); + totalVisitPoint.addValue("views", Integer.valueOf(dataset.getMatrix()[0][0])); + } else { + totalVisitPoint.setLabel(dso.getName()); + totalVisitPoint.addValue("views", 0); + } + + usageReportRest.addPoint(totalVisitPoint); + return usageReportRest; + } + + /** + * Create a stat usage report for the amount of TotalVisitPerMonth on a DSO, containing one point for each month + * with the views on that DSO in that month with the range -6 months to now. If there are no views on the DSO + * in a month, the point on that month contains views=0. + * + * @param context DSpace context + * @param dso DSO we want usage report with TotalVisitsPerMonth to the DSO + * @return Rest object containing the TotalVisits usage report on the given DSO + */ + private UsageReportRest resolveTotalVisitsPerMonth(Context context, DSpaceObject dso) + throws SQLException, IOException, ParseException, SolrServerException { + StatisticsTable statisticsTable = new StatisticsTable(new StatisticsDataVisits(dso)); + DatasetTimeGenerator timeAxis = new DatasetTimeGenerator(); + // TODO month start and end as request para? + timeAxis.setDateInterval("month", "-6", "+1"); + statisticsTable.addDatasetGenerator(timeAxis); + DatasetDSpaceObjectGenerator dsoAxis = new DatasetDSpaceObjectGenerator(); + dsoAxis.addDsoChild(dso.getType(), 10, false, -1); + statisticsTable.addDatasetGenerator(dsoAxis); + Dataset dataset = statisticsTable.getDataset(context, 0); + + UsageReportRest usageReportRest = new UsageReportRest(); + for (int i = 0; i < dataset.getColLabels().size(); i++) { + UsageReportPointDateRest monthPoint = new UsageReportPointDateRest(); + monthPoint.setId(dataset.getColLabels().get(i)); + monthPoint.addValue("views", Integer.valueOf(dataset.getMatrix()[0][i])); + usageReportRest.addPoint(monthPoint); + } + return usageReportRest; + } + + /** + * Create a stat usage report for the amount of TotalDownloads on the files of an Item or of a Bitstream, + * containing a point for each bitstream of the item that has been visited at least once or one point for the + * bitstream containing the amount of times that bitstream has been visited (even if 0) + * If the item has no bitstreams, or no bitstreams that have ever been downloaded/visited, then it contains an + * empty list of points=[] + * If the given UUID is for DSO that is neither a Bitstream nor an Item, an exception is thrown. 
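+     * Illustrative sketch (hypothetical file names): for an Item with bitstreams "article.pdf"
+     * (5 recorded downloads) and "data.csv" (never downloaded), the report would hold one point of
+     * type "bitstream" for "article.pdf" with views=5, since the facet minimum of 1 used below
+     * filters out bitstreams that were never visited.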
+ * + * @param context DSpace context + * @param dso Item/Bitstream we want usage report on with TotalDownloads of the Item's bitstreams or of the + * bitstream itself + * @return Rest object containing the TotalDownloads usage report on the given Item/Bitstream + */ + private UsageReportRest resolveTotalDownloads(Context context, DSpaceObject dso) + throws SQLException, SolrServerException, ParseException, IOException { + if (dso instanceof org.dspace.content.Bitstream) { + return this.resolveTotalVisits(context, dso); + } + + if (dso instanceof org.dspace.content.Item) { + Dataset dataset = this.getDSOStatsDataset(context, dso, 1, Constants.BITSTREAM); + + UsageReportRest usageReportRest = new UsageReportRest(); + for (int i = 0; i < dataset.getColLabels().size(); i++) { + UsageReportPointDsoTotalVisitsRest totalDownloadsPoint = new UsageReportPointDsoTotalVisitsRest(); + totalDownloadsPoint.setType("bitstream"); + totalDownloadsPoint.setId(dataset.getColLabels().get(i)); + totalDownloadsPoint.addValue("views", Integer.valueOf(dataset.getMatrix()[0][i])); + usageReportRest.addPoint(totalDownloadsPoint); + } + return usageReportRest; + } + throw new IllegalArgumentException("TotalDownloads report only available for items and bitstreams"); + } + + /** + * Create a stat usage report for the TopCountries that have visited the given DSO. If there have been no visits, or + * no visits with a valid Geolite determined country (based on IP), this report contains an empty list of points=[]. + * The list of points is limited to the top 100 countries, and each point contains the country name, its iso code + * and the amount of views on the given DSO from that country. + * + * @param context DSpace context + * @param dso DSO we want usage report of the TopCountries on the given DSO + * @return Rest object containing the TopCountries usage report on the given DSO + */ + private UsageReportRest resolveTopCountries(Context context, DSpaceObject dso) + throws SQLException, IOException, ParseException, SolrServerException { + Dataset dataset = this.getTypeStatsDataset(context, dso, "countryCode", 1); + + UsageReportRest usageReportRest = new UsageReportRest(); + for (int i = 0; i < dataset.getColLabels().size(); i++) { + UsageReportPointCountryRest countryPoint = new UsageReportPointCountryRest(); + countryPoint.setLabel(dataset.getColLabels().get(i)); + countryPoint.addValue("views", Integer.valueOf(dataset.getMatrix()[0][i])); + usageReportRest.addPoint(countryPoint); + } + return usageReportRest; + } + + /** + * Create a stat usage report for the TopCities that have visited the given DSO. If there have been no visits, or + * no visits with a valid Geolite determined city (based on IP), this report contains an empty list of points=[]. + * The list of points is limited to the top 100 cities, and each point contains the city name and the amount of + * views on the given DSO from that city. 
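+     * Illustrative sketch (hypothetical values): visits from "Ghent" (12 views) and "Boston" (4 views)
+     * yield two points, each carrying the city name as its id and a views value; the list is capped at
+     * 100 cities by the setMax(100) call in getTypeStatsDataset.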
+ * + * @param context DSpace context + * @param dso DSO we want usage report of the TopCities on the given DSO + * @return Rest object containing the TopCities usage report on the given DSO + */ + private UsageReportRest resolveTopCities(Context context, DSpaceObject dso) + throws SQLException, IOException, ParseException, SolrServerException { + Dataset dataset = this.getTypeStatsDataset(context, dso, "city", 1); + + UsageReportRest usageReportRest = new UsageReportRest(); + for (int i = 0; i < dataset.getColLabels().size(); i++) { + UsageReportPointCityRest cityPoint = new UsageReportPointCityRest(); + cityPoint.setId(dataset.getColLabels().get(i)); + cityPoint.addValue("views", Integer.valueOf(dataset.getMatrix()[0][i])); + usageReportRest.addPoint(cityPoint); + } + return usageReportRest; + } + + /** + * Retrieves the stats dataset of a given DSO, of given type, with a given facetMinCount limit (usually either 0 + * or 1, 0 if we want a data point even though the facet data point has 0 matching results). + * + * @param context DSpace context + * @param dso DSO we want the stats dataset of + * @param facetMinCount Minimum amount of results on a facet data point for it to be added to dataset + * @param dsoType Type of DSO we want the stats dataset of + * @return Stats dataset with the given filters. + */ + private Dataset getDSOStatsDataset(Context context, DSpaceObject dso, int facetMinCount, int dsoType) + throws SQLException, IOException, ParseException, SolrServerException { + StatisticsListing statsList = new StatisticsListing(new StatisticsDataVisits(dso)); + DatasetDSpaceObjectGenerator dsoAxis = new DatasetDSpaceObjectGenerator(); + dsoAxis.addDsoChild(dsoType, 10, false, -1); + statsList.addDatasetGenerator(dsoAxis); + return statsList.getDataset(context, facetMinCount); + } + + /** + * Retrieves the stats dataset of a given dso, with a given axisType (example countryCode, city), which + * corresponds to a solr field, and a given facetMinCount limit (usually either 0 or 1, 0 if we want a data point + * even though the facet data point has 0 matching results). + * + * @param context DSpace context + * @param dso DSO we want the stats dataset of + * @param typeAxisString String of the type we want on the axis of the dataset (corresponds to solr field), + * examples: countryCode, city + * @param facetMinCount Minimum amount of results on a facet data point for it to be added to dataset + * @return Stats dataset with the given type on the axis, of the given DSO and with given facetMinCount + */ + private Dataset getTypeStatsDataset(Context context, DSpaceObject dso, String typeAxisString, int facetMinCount) + throws SQLException, IOException, ParseException, SolrServerException { + StatisticsListing statListing = new StatisticsListing(new StatisticsDataVisits(dso)); + DatasetTypeGenerator typeAxis = new DatasetTypeGenerator(); + typeAxis.setType(typeAxisString); + // TODO make max nr of top countries/cities a request para? 
Must be set + typeAxis.setMax(100); + statListing.addDatasetGenerator(typeAxis); + return statListing.getDataset(context, facetMinCount); + } +} diff --git a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/Utils.java b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/Utils.java index fc3b5fb711..f8158d9887 100644 --- a/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/Utils.java +++ b/dspace-server-webapp/src/main/java/org/dspace/app/rest/utils/Utils.java @@ -47,7 +47,6 @@ import org.apache.log4j.Logger; import org.dspace.app.rest.converter.ConverterService; import org.dspace.app.rest.exception.PaginationException; import org.dspace.app.rest.exception.RepositoryNotFoundException; -import org.dspace.app.rest.model.AuthorityRest; import org.dspace.app.rest.model.BaseObjectRest; import org.dspace.app.rest.model.CommunityRest; import org.dspace.app.rest.model.LinkRest; @@ -58,6 +57,7 @@ import org.dspace.app.rest.model.ResourcePolicyRest; import org.dspace.app.rest.model.RestAddressableModel; import org.dspace.app.rest.model.RestModel; import org.dspace.app.rest.model.VersionHistoryRest; +import org.dspace.app.rest.model.VocabularyRest; import org.dspace.app.rest.model.hateoas.EmbeddedPage; import org.dspace.app.rest.model.hateoas.HALResource; import org.dspace.app.rest.projection.CompositeProjection; @@ -254,7 +254,7 @@ public class Utils { return CommunityRest.NAME; } if (modelPlural.equals("authorities")) { - return AuthorityRest.NAME; + return VocabularyRest.NAME; } if (modelPlural.equals("resourcepolicies")) { return ResourcePolicyRest.NAME; @@ -268,6 +268,9 @@ public class Utils { if (StringUtils.equals(modelPlural, "properties")) { return PropertyRest.NAME; } + if (StringUtils.equals(modelPlural, "vocabularies")) { + return VocabularyRest.NAME; + } return modelPlural.replaceAll("s$", ""); } @@ -523,7 +526,7 @@ public class Utils { * {@link CompositeProjection} and applied in order as described there. *

    * In addition, any number of embeds may be specified by rel name via the {@code embed} parameter. - * When provided, these act as a whitelist of embeds that may be included in the response, as described + * When provided, these act as an "allow list" of embeds that may be included in the response, as described * and implemented by {@link EmbedRelsProjection}. *
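     * For instance (illustrative rel name), a request such as /api/core/items/{uuid}?embed=bundles
     * would limit the embedded resources in the response to the "bundles" rel only.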

    * diff --git a/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/scripts.xml b/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/scripts.xml new file mode 100644 index 0000000000..6facf51941 --- /dev/null +++ b/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/scripts.xml @@ -0,0 +1,35 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/solr-services.xml b/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/solr-services.xml index b0eb2191c7..5bb4b97589 100644 --- a/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/solr-services.xml +++ b/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/api/solr-services.xml @@ -19,15 +19,22 @@ - + - + - + - + - - + + diff --git a/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/rest/scripts.xml b/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/rest/scripts.xml new file mode 100644 index 0000000000..294e197b70 --- /dev/null +++ b/dspace-server-webapp/src/test/data/dspaceFolder/config/spring/rest/scripts.xml @@ -0,0 +1,25 @@ + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/oai/OAIpmhIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/oai/OAIpmhIT.java index 3b277a937f..052c363771 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/oai/OAIpmhIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/oai/OAIpmhIT.java @@ -29,9 +29,9 @@ import com.lyncode.xoai.dataprovider.services.impl.BaseDateProvider; import com.lyncode.xoai.dataprovider.xml.xoaiconfig.Configuration; import com.lyncode.xoai.dataprovider.xml.xoaiconfig.ContextConfiguration; import org.apache.commons.lang3.time.DateUtils; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Community; import org.dspace.services.ConfigurationService; import org.dspace.xoai.services.api.EarliestDateResolver; @@ -76,7 +76,7 @@ public class OAIpmhIT extends AbstractControllerIntegrationTest { private EarliestDateResolver earliestDateResolver; // XOAI's BaseDateProvider (used for date-based testing below) - private static BaseDateProvider baseDateProvider = new BaseDateProvider(); + private static final BaseDateProvider baseDateProvider = new BaseDateProvider(); // Spy on the current XOAIManagerResolver bean, to allow us to change behavior of XOAIManager in tests // See also: createMockXOAIManager() method @@ -278,6 +278,6 @@ public class OAIpmhIT extends AbstractControllerIntegrationTest { * @throws ConfigurationException */ private XOAIManager createMockXOAIManager(Configuration xoaiConfig) throws ConfigurationException { - return new XOAIManager(filterResolver, resourceResolver, xoaiConfig); + return new XOAIManager(filterResolver, resourceResolver, xoaiConfig); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/opensearch/OpenSearchControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/opensearch/OpenSearchControllerIT.java index 58cd3d0a16..9f7e4e6610 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/opensearch/OpenSearchControllerIT.java +++ 
b/dspace-server-webapp/src/test/java/org/dspace/app/opensearch/OpenSearchControllerIT.java @@ -12,10 +12,10 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.xpath; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -210,7 +210,7 @@ public class OpenSearchControllerIT extends AbstractControllerIntegrationTest { .andExpect(xpath("OpenSearchDescription/LongName").string("DSpace at My University")) .andExpect(xpath("OpenSearchDescription/Description") .string("DSpace at My University DSpace repository") - ) + ) ; /* Expected response for the service document is: diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rdf/RdfIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rdf/RdfIT.java index 681b2ced81..85ab3dcadd 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rdf/RdfIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rdf/RdfIT.java @@ -14,8 +14,8 @@ import static org.mockito.Mockito.doReturn; import java.net.URI; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.test.AbstractWebClientIntegrationTest; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Community; import org.dspace.content.service.SiteService; import org.dspace.rdf.RDFUtil; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AnonymousAdditionalAuthorizationFilterIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AnonymousAdditionalAuthorizationFilterIT.java new file mode 100644 index 0000000000..106018ff9b --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AnonymousAdditionalAuthorizationFilterIT.java @@ -0,0 +1,164 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.eperson.Group; +import org.dspace.eperson.service.GroupService; +import org.dspace.services.ConfigurationService; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Testing class for the {@link org.dspace.app.rest.security.AnonymousAdditionalAuthorizationFilter} filter + */ +public class AnonymousAdditionalAuthorizationFilterIT extends AbstractControllerIntegrationTest { + + @Autowired + private 
ConfigurationService configurationService; + + @Autowired + private GroupService groupService; + + public static final String[] IP = {"org.dspace.authenticate.IPAuthentication"}; + public static final String[] IP_AND_PASS = + {"org.dspace.authenticate.IPAuthentication", + "org.dspace.authenticate.PasswordAuthentication"}; + public static final String[] PASS = {"org.dspace.authenticate.PasswordAuthentication"}; + + + Item staffAccessItem1; + Group staff; + + @Before + public void setup() { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + staff = GroupBuilder.createGroup(context).withName("Staff").build(); + + //2. Three public items that are readable by Anonymous with different subjects + staffAccessItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .withReaderGroup(staff) + .build(); + + } + + @Test + public void verifyIPAuthentication() throws Exception { + configurationService.setProperty("plugin.sequence.org.dspace.authenticate.AuthenticationMethod", IP); + + // Make sure that the item is not accessible for anonymous + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID())) + .andExpect(status().isUnauthorized()); + + // Test that we can access the item using the IP that's configured for the Staff group + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "5.5.5.5")) + .andExpect(status().isOk()); + + // Test that we can't access the item using the IP that's configured for the Students group + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-FORWARDED-FOR", "6.6.6.6")) + .andExpect(status().isUnauthorized()); + } + + @Test + public void verifyIPAndPasswordAuthentication() throws Exception { + configurationService.setProperty("plugin.sequence.org.dspace.authenticate.AuthenticationMethod", IP_AND_PASS); + + groupService.addMember(context, staff, eperson); + + // Make sure that the item is not accessible for anonymous + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID())) + .andExpect(status().isUnauthorized()); + + // Test that we can access the item using the IP that's configured for the Staff group + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "5.5.5.5")) + .andExpect(status().isOk()); + + // Test that we can't access the item using the IP that's configured for the Students group + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "6.6.6.6")) + .andExpect(status().isUnauthorized()); + + String token = getAuthToken(eperson.getEmail(), password); + + // Test that the user in the Staff group can access the Item with the normal password authentication + getClient(token).perform(get("/api/core/items/" + staffAccessItem1.getID())) + .andExpect(status().isOk()); + + // Test that the user in the Staff group can access the Item with the normal password authentication even + // when it's IP 
is configured to be part of the students group + getClient(getAuthTokenWithXForwardedForHeader(eperson.getEmail(), password, "6.6.6.6")) + .perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "6.6.6.6")) + .andExpect(status().isOk()); + } + + @Test + public void verifyPasswordAuthentication() throws Exception { + configurationService.setProperty("plugin.sequence.org.dspace.authenticate.AuthenticationMethod", PASS); + + groupService.addMember(context, staff, eperson); + + // Make sure that the item is not accessible for anonymous + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID())) + .andExpect(status().isUnauthorized()); + + // Test that the Item can't be accessed with the IP for the Staff group if the config is turned off and only + // allows password authentication + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "5.5.5.5")) + .andExpect(status().isUnauthorized()); + + // Test that the Item can't be accessed with the IP for the Students group if the config is turned off and only + // allows password authentication + getClient().perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "6.6.6.6")) + .andExpect(status().isUnauthorized()); + + String token = getAuthToken(eperson.getEmail(), password); + + // Test that the Item is accessible for a user in the Staff group by password login + getClient(token).perform(get("/api/core/items/" + staffAccessItem1.getID())) + .andExpect(status().isOk()); + + // Test that the Item is accessible for a user in the Staff group by password Login when the request + // is coming from the IP that's configured to be for the Student group + getClient(getAuthTokenWithXForwardedForHeader(eperson.getEmail(), password, "6.6.6.6")) + .perform(get("/api/core/items/" + staffAccessItem1.getID()) + .header("X-Forwarded-For", "6.6.6.6")) + .andExpect(status().isOk()); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthenticationRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthenticationRestControllerIT.java index 5a65447858..24a27b9fe8 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthenticationRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthenticationRestControllerIT.java @@ -11,7 +11,9 @@ import static java.lang.Thread.sleep; import static org.hamcrest.Matchers.containsString; import static org.hamcrest.Matchers.endsWith; import static org.hamcrest.Matchers.is; +import static org.hamcrest.Matchers.notNullValue; import static org.hamcrest.Matchers.startsWith; +import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertNotEquals; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post; @@ -20,20 +22,38 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. 
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; +import java.io.InputStream; import java.util.Base64; +import java.util.Map; import javax.servlet.http.Cookie; -import org.dspace.app.rest.builder.GroupBuilder; +import com.fasterxml.jackson.databind.ObjectMapper; +import org.apache.commons.codec.CharEncoding; +import org.apache.commons.io.IOUtils; import org.dspace.app.rest.matcher.AuthenticationStatusMatcher; import org.dspace.app.rest.matcher.EPersonMatcher; import org.dspace.app.rest.matcher.HalMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.content.Bitstream; +import org.dspace.content.Bundle; +import org.dspace.content.Collection; +import org.dspace.content.Item; +import org.dspace.eperson.EPerson; import org.dspace.eperson.Group; import org.dspace.services.ConfigurationService; +import org.hamcrest.Matchers; import org.junit.Before; import org.junit.Ignore; import org.junit.Test; import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.test.web.servlet.MvcResult; import org.springframework.test.web.servlet.result.MockMvcResultHandlers; /** @@ -50,12 +70,14 @@ public class AuthenticationRestControllerIT extends AbstractControllerIntegratio public static final String[] PASS_ONLY = {"org.dspace.authenticate.PasswordAuthentication"}; public static final String[] SHIB_ONLY = {"org.dspace.authenticate.ShibAuthentication"}; - public static final String[] SHIB_AND_PASS = - {"org.dspace.authenticate.ShibAuthentication", - "org.dspace.authenticate.PasswordAuthentication"}; - public static final String[] SHIB_AND_IP = - {"org.dspace.authenticate.IPAuthentication", - "org.dspace.authenticate.ShibAuthentication"}; + public static final String[] SHIB_AND_PASS = { + "org.dspace.authenticate.ShibAuthentication", + "org.dspace.authenticate.PasswordAuthentication" + }; + public static final String[] SHIB_AND_IP = { + "org.dspace.authenticate.IPAuthentication", + "org.dspace.authenticate.ShibAuthentication" + }; @Before public void setup() throws Exception { @@ -386,7 +408,7 @@ public class AuthenticationRestControllerIT extends AbstractControllerIntegratio @Test public void testLoginGetRequest() throws Exception { - getClient().perform(get("/api/authn/login") + getClient().perform(get("/api/authn/login") .param("user", eperson.getEmail()) .param("password", password)) .andExpect(status().isMethodNotAllowed()); @@ -701,8 +723,8 @@ public class AuthenticationRestControllerIT extends AbstractControllerIntegratio //Check if WWW-Authenticate header contains only password getClient().perform(get("/api/authn/status").header("Referer", "http://my.uni.edu")) - .andExpect(status().isOk()) - .andExpect(header().string("WWW-Authenticate", + .andExpect(status().isOk()) + .andExpect(header().string("WWW-Authenticate", "password realm=\"DSpace REST API\"")); //Check if a shibboleth authentication fails @@ -710,7 +732,6 @@ public class AuthenticationRestControllerIT extends AbstractControllerIntegratio .requestAttr("SHIB-MAIL", eperson.getEmail()) .requestAttr("SHIB-SCOPED-AFFILIATION", "faculty;staff")) 
.andExpect(status().isUnauthorized()); - } @Test @@ -757,4 +778,162 @@ public class AuthenticationRestControllerIT extends AbstractControllerIntegratio .andExpect(status().isUnauthorized()); } + + @Test + public void testShortLivedToken() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + + // Verify the main session salt doesn't change + String salt = eperson.getSessionSalt(); + + getClient(token).perform(post("/api/authn/shortlivedtokens")) + .andExpect(jsonPath("$.token", notNullValue())) + .andExpect(jsonPath("$.type", is("shortlivedtoken"))) + .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/authn/shortlivedtokens"))); + + assertEquals(salt, eperson.getSessionSalt()); + } + + @Test + public void testShortLivedTokenNotAuthenticated() throws Exception { + getClient().perform(post("/api/authn/shortlivedtokens")) + .andExpect(status().isUnauthorized()); + } + + @Test + public void testShortLivedTokenToDowloadBitstream() throws Exception { + Bitstream bitstream = createPrivateBitstream(); + String shortLivedToken = getShortLivedToken(eperson); + + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + + "/content?authentication-token=" + shortLivedToken)) + .andExpect(status().isOk()); + } + + @Test + public void testShortLivedTokenToDowloadBitstreamUnauthorized() throws Exception { + Bitstream bitstream = createPrivateBitstream(); + + context.turnOffAuthorisationSystem(); + EPerson testEPerson = EPersonBuilder.createEPerson(context) + .withNameInMetadata("John", "Doe") + .withEmail("UnauthorizedUser@example.com") + .withPassword(password) + .build(); + context.restoreAuthSystemState(); + + String shortLivedToken = getShortLivedToken(testEPerson); + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + + "/content?authentication-token=" + shortLivedToken)) + .andExpect(status().isForbidden()); + } + + @Test + public void testLoginTokenToDowloadBitstream() throws Exception { + Bitstream bitstream = createPrivateBitstream(); + + String loginToken = getAuthToken(eperson.getEmail(), password); + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + + "/content?authentication-token=" + loginToken)) + .andExpect(status().isForbidden()); + } + + @Test + public void testExpiredShortLivedTokenToDowloadBitstream() throws Exception { + Bitstream bitstream = createPrivateBitstream(); + configurationService.setProperty("jwt.shortLived.token.expiration", "1"); + String shortLivedToken = getShortLivedToken(eperson); + Thread.sleep(1); + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + + "/content?authentication-token=" + shortLivedToken)) + .andExpect(status().isForbidden()); + } + + @Test + public void testShortLivedAndLoginTokenSeparation() throws Exception { + configurationService.setProperty("jwt.shortLived.token.expiration", "1"); + + String token = getAuthToken(eperson.getEmail(), password); + Thread.sleep(2); + getClient(token).perform(get("/api/authn/status").param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.authenticated", is(true))); + } + + // TODO: fix the exception. 
For now we want to verify a short lived token can't be used to login + @Test(expected = Exception.class) + public void testLoginWithShortLivedToken() throws Exception { + String shortLivedToken = getShortLivedToken(eperson); + + getClient().perform(post("/api/authn/login?authentication-token=" + shortLivedToken)) + .andExpect(status().isInternalServerError()); + // TODO: This internal server error needs to be fixed. This should actually produce a forbidden status + //.andExpect(status().isForbidden()); + } + + @Test + public void testGenerateShortLivedTokenWithShortLivedToken() throws Exception { + String shortLivedToken = getShortLivedToken(eperson); + + getClient().perform(post("/api/authn/shortlivedtokens?authentication-token=" + shortLivedToken)) + .andExpect(status().isForbidden()); + } + + private String getShortLivedToken(EPerson requestUser) throws Exception { + ObjectMapper mapper = new ObjectMapper(); + + String token = getAuthToken(requestUser.getEmail(), password); + MvcResult mvcResult = getClient(token).perform(post("/api/authn/shortlivedtokens")) + .andReturn(); + + String content = mvcResult.getResponse().getContentAsString(); + Map map = mapper.readValue(content, Map.class); + return String.valueOf(map.get("token")); + } + + private Bitstream createPrivateBitstream() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and one collection. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, parentCommunity).withName("Collection 1").build(); + + //2. One public items that is readable by Anonymous + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Test") + .withIssueDate("2010-10-17") + .withAuthor("Smith, Donald") + .withSubject("ExtraEntry") + .build(); + + Bundle bundle1 = BundleBuilder.createBundle(context, publicItem1) + .withName("TEST BUNDLE") + .build(); + + //2. An item restricted to a specific internal group + Group staffGroup = GroupBuilder.createGroup(context) + .withName("Staff") + .addMember(eperson) + .build(); + + String bitstreamContent = "ThisIsSomeDummyText"; + Bitstream bitstream = null; + try (InputStream is = IOUtils.toInputStream(bitstreamContent, CharEncoding.UTF_8)) { + bitstream = BitstreamBuilder. 
+ createBitstream(context, bundle1, is) + .withName("Bitstream") + .withDescription("description") + .withMimeType("text/plain") + .withReaderGroup(staffGroup) + .build(); + } + + context.restoreAuthSystemState(); + + return bitstream; + } } + diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorityRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorityRestRepositoryIT.java deleted file mode 100644 index 089f781902..0000000000 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorityRestRepositoryIT.java +++ /dev/null @@ -1,233 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.app.rest; - -import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; -import static org.hamcrest.Matchers.is; -import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; -import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; -import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; - -import java.util.Date; -import java.util.UUID; - -import org.apache.solr.client.solrj.SolrQuery; -import org.apache.solr.client.solrj.response.QueryResponse; -import org.dspace.app.rest.matcher.AuthorityEntryMatcher; -import org.dspace.app.rest.test.AbstractControllerIntegrationTest; -import org.dspace.authority.PersonAuthorityValue; -import org.dspace.authority.factory.AuthorityServiceFactory; -import org.dspace.content.authority.service.ChoiceAuthorityService; -import org.dspace.core.service.PluginService; -import org.dspace.services.ConfigurationService; -import org.hamcrest.Matchers; -import org.junit.Before; -import org.junit.Ignore; -import org.junit.Test; -import org.springframework.beans.factory.annotation.Autowired; - -/** - * This class handles all Authority related IT. It alters some config to run the tests, but it gets cleared again - * after every test - */ -public class AuthorityRestRepositoryIT extends AbstractControllerIntegrationTest { - - @Autowired - ConfigurationService configurationService; - - @Autowired - private PluginService pluginService; - - @Autowired - private ChoiceAuthorityService cas; - - @Before - public void setup() throws Exception { - super.setUp(); - configurationService.setProperty("plugin.named.org.dspace.content.authority.ChoiceAuthority", - "org.dspace.content.authority.SolrAuthority = SolrAuthorAuthority"); - - configurationService.setProperty("solr.authority.server", - "${solr.server}/authority"); - configurationService.setProperty("choices.plugin.dc.contributor.author", - "SolrAuthorAuthority"); - configurationService.setProperty("choices.presentation.dc.contributor.author", - "authorLookup"); - configurationService.setProperty("authority.controlled.dc.contributor.author", - "true"); - - configurationService.setProperty("authority.author.indexer.field.1", - "dc.contributor.author"); - - - // These clears have to happen so that the config is actually reloaded in those classes. 
This is needed for - // the properties that we're altering above and this is only used within the tests - pluginService.clearNamedPluginClasses(); - cas.clearCache(); - - PersonAuthorityValue person1 = new PersonAuthorityValue(); - person1.setId(String.valueOf(UUID.randomUUID())); - person1.setLastName("Shirasaka"); - person1.setFirstName("Seiko"); - person1.setValue("Shirasaka, Seiko"); - person1.setField("dc_contributor_author"); - person1.setLastModified(new Date()); - person1.setCreationDate(new Date()); - AuthorityServiceFactory.getInstance().getAuthorityIndexingService().indexContent(person1); - - PersonAuthorityValue person2 = new PersonAuthorityValue(); - person2.setId(String.valueOf(UUID.randomUUID())); - person2.setLastName("Miller"); - person2.setFirstName("Tyler E"); - person2.setValue("Miller, Tyler E"); - person2.setField("dc_contributor_author"); - person2.setLastModified(new Date()); - person2.setCreationDate(new Date()); - AuthorityServiceFactory.getInstance().getAuthorityIndexingService().indexContent(person2); - - AuthorityServiceFactory.getInstance().getAuthorityIndexingService().commit(); - } - - @Test - public void correctSrscQueryTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/srsc/entries") - .param("metadata", "dc.subject") - .param("query", "Research") - .param("size", "1000")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(26))); - } - - @Test - public void noResultsSrscQueryTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/srsc/entries") - .param("metadata", "dc.subject") - .param("query", "Research2") - .param("size", "1000")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(0))); - } - - @Test - @Ignore - /** - * This functionality is currently broken, it returns all 22 values - */ - public void correctCommonTypesTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/common_types/entries") - .param("metadata", "dc.type") - .param("query", "Book") - .param("size", "1000")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(2))); - } - - @Test - public void correctSolrQueryTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/SolrAuthorAuthority/entries") - .param("metadata", "dc.contributor.author") - .param("query", "Shirasaka") - .param("size", "1000")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))); - } - - @Test - public void noResultsSolrQueryTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/SolrAuthorAuthority/entries") - .param("metadata", "dc.contributor.author") - .param("query", "Smith") - .param("size", "1000")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(0))); - } - - @Test - public void retrieveSrscValueTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - - // When full projection is requested, response should include expected properties, links, and embeds. 
- getClient(token).perform( - get("/api/integration/authorities/srsc/entryValues/SCB1922").param("projection", "full")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", AuthorityEntryMatcher.matchFullEmbeds())) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))); - } - - @Test - public void noResultsSrscValueTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/srsc/entryValues/DOESNTEXIST")) - .andExpect(status().isNotFound()); - } - - @Test - public void retrieveCommonTypesValueTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/common_types/entryValues/Book").param("projection", "full")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))) - ; - - } - - @Test - public void retrieveCommonTypesWithSpaceValueTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - getClient(token).perform( - get("/api/integration/authorities/common_types/entryValues/Learning+Object")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))); - } - - @Test - public void discoverableNestedLinkTest() throws Exception { - String token = getAuthToken(eperson.getEmail(), password); - getClient(token).perform(get("/api")) - .andExpect(status().isOk()) - .andExpect(jsonPath("$._links",Matchers.allOf( - hasJsonPath("$.authorizations.href", - is("http://localhost/api/authz/authorizations")), - hasJsonPath("$.authorization-search.href", - is("http://localhost/api/authz/authorization/search")) - ))); - } - - @Test - public void retrieveSolrValueTest() throws Exception { - String token = getAuthToken(admin.getEmail(), password); - - SolrQuery query = new SolrQuery(); - query.setQuery("*:*"); - QueryResponse queryResponse = AuthorityServiceFactory.getInstance().getAuthoritySearchService().search(query); - String id = String.valueOf(queryResponse.getResults().get(0).getFieldValue("id")); - - getClient(token).perform( - get("/api/integration/authorities/SolrAuthorAuthority/entryValues/" + id)) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))); - } - - @Override - public void destroy() throws Exception { - AuthorityServiceFactory.getInstance().getAuthorityIndexingService().cleanIndex(); - super.destroy(); - } -} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationFeatureServiceIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationFeatureServiceIT.java index d53bdc92c8..eba774345d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationFeatureServiceIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationFeatureServiceIT.java @@ -16,6 +16,7 @@ import java.util.List; import java.util.Set; import org.apache.commons.lang3.ArrayUtils; +import org.dspace.AbstractIntegrationTestWithDatabase; import org.dspace.app.rest.authorization.AlwaysFalseFeature; import org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature; import org.dspace.app.rest.authorization.AlwaysTrueFeature; @@ -26,7 +27,6 @@ import org.dspace.app.rest.converter.SiteConverter; import org.dspace.app.rest.model.CollectionRest; import org.dspace.app.rest.model.SiteRest; import org.dspace.app.rest.projection.DefaultProjection; -import org.dspace.app.rest.test.AbstractIntegrationTestWithDatabase; import 
org.dspace.app.rest.utils.DSpaceConfigurationInitializer; import org.dspace.app.rest.utils.DSpaceKernelInitializer; import org.dspace.content.Site; @@ -77,7 +77,7 @@ public class AuthorizationFeatureServiceIT extends AbstractIntegrationTestWithDa assertThat("We have at least our 7 mock features for testing", authzFeatureServiceFindAll.size(), greaterThanOrEqualTo(7)); - Set featureNames = new HashSet(); + Set featureNames = new HashSet<>(); for (AuthorizationFeature f : authzFeatureServiceFindAll) { featureNames.add(f.getName()); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationRestRepositoryIT.java index 05631790e3..08564852ee 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/AuthorizationRestRepositoryIT.java @@ -7,8 +7,12 @@ */ package org.dspace.app.rest; +import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; +import static org.hamcrest.Matchers.allOf; +import static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.greaterThanOrEqualTo; import static org.hamcrest.Matchers.is; +import static org.hamcrest.Matchers.nullValue; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; @@ -28,9 +32,6 @@ import org.dspace.app.rest.authorization.TrueForAdminsFeature; import org.dspace.app.rest.authorization.TrueForLoggedUsersFeature; import org.dspace.app.rest.authorization.TrueForTestUsersFeature; import org.dspace.app.rest.authorization.TrueForUsersInGroupTestFeature; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; import org.dspace.app.rest.converter.CommunityConverter; import org.dspace.app.rest.converter.EPersonConverter; import org.dspace.app.rest.converter.SiteConverter; @@ -43,6 +44,9 @@ import org.dspace.app.rest.model.SiteRest; import org.dspace.app.rest.projection.DefaultProjection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.utils.Utils; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; import org.dspace.content.Community; import org.dspace.content.Site; import org.dspace.content.factory.ContentServiceFactory; @@ -131,6 +135,8 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration trueForLoggedUsers = authorizationFeatureService.find(TrueForLoggedUsersFeature.NAME); trueForTestUsers = authorizationFeatureService.find(TrueForTestUsersFeature.NAME); trueForUsersInGroupTest = authorizationFeatureService.find(TrueForUsersInGroupTestFeature.NAME); + + configurationService.setProperty("webui.user.assumelogin", true); } @Test @@ -375,8 +381,10 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); - // verify that it works for administrators + String adminToken = getAuthToken(admin.getEmail(), password); + + // verify that it works for administrators - with eperson 
parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("projection", "full") .param("uri", siteUri) @@ -414,8 +422,46 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(3))); - // verify that it works for normal loggedin users + // verify that it works for administrators - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("projection", "full") + .param("uri", siteUri)) + .andExpect(status().isOk()) + // there are at least 3: alwaysTrue, trueForAdministrators and trueForLoggedUsers + .andExpect(jsonPath("$._embedded.authorizations", Matchers.hasSize(greaterThanOrEqualTo(3)))) + .andExpect(jsonPath("$._embedded.authorizations", Matchers.everyItem( + Matchers.anyOf( + JsonPathMatchers.hasJsonPath("$.type", is("authorization")), + JsonPathMatchers.hasJsonPath("$._embedded.feature", + Matchers.allOf( + is(alwaysTrue.getName()), + is(trueForAdmins.getName()), + is(trueForLoggedUsers.getName()) + )), + JsonPathMatchers.hasJsonPath("$._embedded.feature", + Matchers.not(Matchers.anyOf( + is(alwaysFalse.getName()), + is(alwaysException.getName()), + is(trueForTestUsers.getName()) + ) + )), + JsonPathMatchers.hasJsonPath("$._embedded.feature.resourcetypes", + Matchers.hasItem(is("authorization"))), + JsonPathMatchers.hasJsonPath("$.id", + Matchers.anyOf( + Matchers.startsWith(admin.getID().toString()), + Matchers.endsWith(siteRest.getUniqueType() + "_" + siteRest.getId())))) + ) + ) + ) + .andExpect(jsonPath("$._links.self.href", + Matchers.containsString("/api/authz/authorizations/search/object"))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(3))); + String epersonToken = getAuthToken(eperson.getEmail(), password); + + // verify that it works for normal loggedin users - with eperson parameter getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("projection", "full") .param("uri", siteUri) @@ -453,7 +499,44 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(2))); - // verify that it works for administators inspecting other users + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("projection", "full") + .param("uri", siteUri)) + .andExpect(status().isOk()) + // there are at least 2: alwaysTrue and trueForLoggedUsers + .andExpect(jsonPath("$._embedded.authorizations", Matchers.hasSize(greaterThanOrEqualTo(2)))) + .andExpect(jsonPath("$._embedded.authorizations", Matchers.everyItem( + Matchers.anyOf( + JsonPathMatchers.hasJsonPath("$.type", is("authorization")), + JsonPathMatchers.hasJsonPath("$._embedded.feature", + Matchers.allOf( + is(alwaysTrue.getName()), + is(trueForLoggedUsers.getName()) + )), + JsonPathMatchers.hasJsonPath("$._embedded.feature", + Matchers.not(Matchers.anyOf( + is(alwaysFalse.getName()), + is(alwaysException.getName()), + is(trueForTestUsers.getName()), + is(trueForAdmins.getName()) + ) + )), + JsonPathMatchers.hasJsonPath("$._embedded.feature.resourcetypes", + Matchers.hasItem(is("authorization"))), + JsonPathMatchers.hasJsonPath("$.id", + Matchers.anyOf( + 
Matchers.startsWith(eperson.getID().toString()), + Matchers.endsWith(siteRest.getUniqueType() + "_" + siteRest.getId())))) + ) + ) + ) + .andExpect(jsonPath("$._links.self.href", + Matchers.containsString("/api/authz/authorizations/search/object"))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(2))); + + // verify that it works for administrators inspecting other users - by using the eperson parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("projection", "full") .param("uri", siteUri) .param("eperson", eperson.getID().toString())) @@ -495,6 +578,48 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(2))); + // verify that it works for administrators inspecting other users - by assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("projection", "full") + .param("uri", siteUri) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isOk()) + // there are at least 2: alwaysTrue and trueForLoggedUsers + .andExpect(jsonPath("$._embedded.authorizations", Matchers.hasSize(greaterThanOrEqualTo(2)))) + .andExpect(jsonPath("$._embedded.authorizations", Matchers.everyItem( + Matchers.anyOf( + JsonPathMatchers.hasJsonPath("$.type", is("authorization")), + JsonPathMatchers.hasJsonPath("$._embedded.feature", + Matchers.allOf( + is(alwaysTrue.getName()), + is(trueForLoggedUsers.getName()) + )), + JsonPathMatchers.hasJsonPath("$._embedded.feature", + Matchers.not(Matchers.anyOf( + is(alwaysFalse.getName()), + is(alwaysException.getName()), + is(trueForTestUsers.getName()), + // this guarantees that we are looking at the eperson + // authz and not at the admin ones + is(trueForAdmins.getName()) + ) + )), + JsonPathMatchers.hasJsonPath("$._embedded.feature.resourcetypes", + Matchers.hasItem(is("authorization"))), + JsonPathMatchers.hasJsonPath("$.id", + Matchers.anyOf( + // this guarantees that we are looking at the eperson + // authz and not at the admin ones + Matchers.startsWith(eperson.getID().toString()), + Matchers.endsWith(siteRest.getUniqueType() + "_" + siteRest.getId())))) + ) + ) + ) + .andExpect(jsonPath("$._links.self.href", + Matchers.containsString("/api/authz/authorizations/search/object"))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(2))); + // verify that it works for anonymous users getClient().perform(get("/api/authz/authorizations/search/object") .param("projection", "full") @@ -529,41 +654,6 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration Matchers.containsString("/api/authz/authorizations/search/object"))) .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(1))); - - // verify that it works for administrators inspecting anonymous users - getClient(adminToken).perform(get("/api/authz/authorizations/search/object") - .param("projection", "full") - .param("uri", siteUri)) - .andExpect(status().isOk()) - .andExpect(jsonPath("$._embedded.authorizations", Matchers.hasSize(greaterThanOrEqualTo(1)))) - .andExpect(jsonPath("$._embedded.authorizations", Matchers.everyItem( - Matchers.anyOf( - JsonPathMatchers.hasJsonPath("$.type", is("authorization")), - JsonPathMatchers.hasJsonPath("$._embedded.feature", - Matchers.allOf( - is(alwaysTrue.getName()) - )), -
JsonPathMatchers.hasJsonPath("$._embedded.feature", - Matchers.not(Matchers.anyOf( - is(alwaysFalse.getName()), - is(alwaysException.getName()), - is(trueForTestUsers.getName()), - is(trueForAdmins.getName()) - ) - )), - JsonPathMatchers.hasJsonPath("$._embedded.feature.resourcetypes", - Matchers.hasItem(is("authorization"))), - JsonPathMatchers.hasJsonPath("$.id", - Matchers.anyOf( - Matchers.startsWith(eperson.getID().toString()), - Matchers.endsWith(siteRest.getUniqueType() + "_" + siteRest.getId())))) - ) - ) - ) - .andExpect(jsonPath("$._links.self.href", - Matchers.containsString("/api/authz/authorizations/search/object"))) - .andExpect(jsonPath("$.page.size", is(20))) - .andExpect(jsonPath("$.page.totalElements", greaterThanOrEqualTo(1))); } @Test @@ -578,8 +668,10 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); - // verify that it works for administrators, no result + String adminToken = getAuthToken(admin.getEmail(), password); + + // verify that it works for administrators, no result - with eperson parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("eperson", admin.getID().toString())) @@ -590,8 +682,19 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for normal loggedin users + // verify that it works for administrators, no result - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", wrongSiteUri)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", JsonPathMatchers.hasNoJsonPath("$._embedded.authorizations"))) + .andExpect(jsonPath("$._links.self.href", + Matchers.containsString("/api/authz/authorizations/search/object"))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(0))); + String epersonToken = getAuthToken(eperson.getEmail(), password); + + // verify that it works for normal loggedin users - with eperson parameter getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("eperson", eperson.getID().toString())) @@ -602,7 +705,17 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for administators inspecting other users + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", wrongSiteUri)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", JsonPathMatchers.hasNoJsonPath("$._embedded.authorizations"))) + .andExpect(jsonPath("$._links.self.href", + Matchers.containsString("/api/authz/authorizations/search/object"))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administators inspecting other users - by using the eperson parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("eperson", eperson.getID().toString())) @@ -613,9 +726,10 @@ public class 
AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for anonymous users - getClient().perform(get("/api/authz/authorizations/search/object") - .param("uri", wrongSiteUri)) + // verify that it works for administators inspecting other users - by assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", wrongSiteUri) + .header("X-On-Behalf-Of", eperson.getID())) .andExpect(status().isOk()) .andExpect(jsonPath("$", JsonPathMatchers.hasNoJsonPath("$._embedded.authorizations"))) .andExpect(jsonPath("$._links.self.href", @@ -623,8 +737,8 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for administrators inspecting anonymous users - getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + // verify that it works for anonymous users + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri)) .andExpect(status().isOk()) .andExpect(jsonPath("$", JsonPathMatchers.hasNoJsonPath("$._embedded.authorizations"))) @@ -655,31 +769,45 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration String epersonToken = getAuthToken(eperson.getEmail(), password); for (String invalidUri : invalidUris) { log.debug("findByObjectBadRequestTest - Testing the URI: " + invalidUri); - // verify that it works for administrators with an invalid or missing uri + + // verify that it works for administrators with an invalid or missing uri - with eperson parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri) .param("eperson", admin.getID().toString())) .andExpect(status().isBadRequest()); - // verify that it works for normal loggedin users with an invalid or missing uri + // verify that it works for administrators with an invalid or missing uri - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri)) + .andExpect(status().isBadRequest()); + + // verify that it works for normal loggedin users with an invalid or missing uri - with eperson parameter getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri) .param("eperson", eperson.getID().toString())) .andExpect(status().isBadRequest()); - // verify that it works for administators inspecting other users with an invalid or missing uri + // verify that it works for normal loggedin users with an invalid or missing uri - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri)) + .andExpect(status().isBadRequest()); + + // verify that it works for administators inspecting other users with an invalid or missing uri - by + // using the eperson parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri) .param("eperson", eperson.getID().toString())) .andExpect(status().isBadRequest()); - // verify that it works for anonymous users with an invalid or missing uri - getClient().perform(get("/api/authz/authorizations/search/object") - .param("uri", invalidUri)) + // verify that it works for administators inspecting other users with an invalid or missing 
uri - by + // assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri) + .header("X-On-Behalf-Of", eperson.getID())) .andExpect(status().isBadRequest()); - // verify that it works for administrators inspecting anonymous users with an invalid or missing uri - getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + // verify that it works for anonymous users with an invalid or missing uri + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri)) .andExpect(status().isBadRequest()); } @@ -712,16 +840,29 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); + // verify that it works for an anonymous user inspecting an admin user - by using the eperson parameter getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("eperson", admin.getID().toString())) .andExpect(status().isUnauthorized()); - // verify that it works for normal loggedin users with an invalid or missing uri + // verify that it works for an anonymous user inspecting an admin user - by assuming login + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .header("X-On-Behalf-Of", admin.getID())) + .andExpect(status().isUnauthorized()); + + // verify that it works for an anonymous user inspecting another user - by using the eperson parameter getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("eperson", eperson.getID().toString())) .andExpect(status().isUnauthorized()); + + // verify that it works for an anonymous user inspecting another user - by assuming login + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isUnauthorized()); } @Test @@ -742,17 +883,30 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); String anotherToken = getAuthToken(anotherEperson.getEmail(), password); - // verify that he cannot search the admin authorizations + + // verify that he cannot search the admin authorizations - by using the eperson parameter getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("eperson", admin.getID().toString())) .andExpect(status().isForbidden()); - // verify that he cannot search the authorizations of another "normal" eperson + // verify that he cannot search the admin authorizations - by assuming login + getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .header("X-On-Behalf-Of", admin.getID())) + .andExpect(status().isForbidden()); + + // verify that he cannot search the authorizations of another "normal" eperson - by using the eperson parameter getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("eperson", eperson.getID().toString())) .andExpect(status().isForbidden()); + + // verify that he cannot search the authorizations of another "normal" eperson - by assuming login + 
getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isForbidden()); } @Test @@ -765,8 +919,9 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); - // verify that it works for administrators String adminToken = getAuthToken(admin.getEmail(), password); + + // verify that it works for administrators - with eperson parameter getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) // use a large page so that the alwaysThrowExceptionFeature is invoked @@ -775,8 +930,17 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .param("eperson", admin.getID().toString())) .andExpect(status().isInternalServerError()); - // verify that it works for normal loggedin users + // verify that it works for administrators - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + // use a large page so that the alwaysThrowExceptionFeature is invoked + // this could become insufficient at some point + .param("size", "100")) + .andExpect(status().isInternalServerError()); + String epersonToken = getAuthToken(eperson.getEmail(), password); + + // verify that it works for normal loggedin users - with eperson parameter getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) // use a large page so that the alwaysThrowExceptionFeature is invoked @@ -785,6 +949,14 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration .param("eperson", eperson.getID().toString())) .andExpect(status().isInternalServerError()); + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + // use a large page so that the alwaysThrowExceptionFeature is invoked + // this could become insufficient at some point + .param("size", "100")) + .andExpect(status().isInternalServerError()); + // verify that it works for anonymous users getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) @@ -811,70 +983,156 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration String comUri = utils.linkToSingleResource(comRest, "self").getHref(); context.restoreAuthSystemState(); - // verify that it works for administrators String adminToken = getAuthToken(admin.getEmail(), password); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for administrators - with eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", comUri) .param("projection", "level") .param("embedLevelDepth", "1") .param("feature", alwaysTrue.getName()) .param("eperson", admin.getID().toString())) .andExpect(status().isOk()) - .andExpect(jsonPath("$.type", is("authorization"))) - .andExpect(jsonPath("$._embedded.feature.id", is(alwaysTrue.getName()))) - .andExpect(jsonPath("$.id", Matchers.is(admin.getID().toString() + "_" + alwaysTrue.getName() + "_" - + comRest.getUniqueType() + "_" + comRest.getId()))); + .andExpect(jsonPath("$.page.totalElements", is(1))) + 
.andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is(admin.getID().toString() + "_" + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId())), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(admin.getID().toString())) + ) + ))); + + // verify that it works for administrators - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("projection", "level") + .param("embedLevelDepth", "1") + .param("feature", alwaysTrue.getName())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is( + admin.getID().toString() + "_" + + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId() + )), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(admin.getID().toString())) + ) + ))); - // verify that it works for normal loggedin users String epersonToken = getAuthToken(eperson.getEmail(), password); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", comUri) - .param("projection", "level") - .param("embedLevelDepth", "1") - .param("feature", alwaysTrue.getName()) - .param("eperson", eperson.getID().toString())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.type", is("authorization"))) - .andExpect(jsonPath("$._embedded.feature.id", is(alwaysTrue.getName()))) - .andExpect(jsonPath("$.id", Matchers.is(eperson.getID().toString() + "_" + alwaysTrue.getName() + "_" - + comRest.getUniqueType() + "_" + comRest.getId()))); - // verify that it works for administators inspecting other users - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for normal loggedin users - with eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", comUri) .param("projection", "level") .param("embedLevelDepth", "1") .param("feature", alwaysTrue.getName()) .param("eperson", eperson.getID().toString())) .andExpect(status().isOk()) - .andExpect(jsonPath("$.type", is("authorization"))) - .andExpect(jsonPath("$._embedded.feature.id", is(alwaysTrue.getName()))) - .andExpect(jsonPath("$.id", Matchers.is(eperson.getID().toString() + "_" + alwaysTrue.getName() + "_" - + comRest.getUniqueType() + "_" + comRest.getId()))); + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is( + eperson.getID().toString() + "_" + + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId() + )), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(eperson.getID().toString())) + ) + ))); + + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("projection", "level") + .param("embedLevelDepth", "1") + .param("feature", alwaysTrue.getName())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + 
.andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is( + eperson.getID().toString() + "_" + + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId() + )), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(eperson.getID().toString())) + ) + ))); + + // verify that it works for administators inspecting other users - by using the eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("projection", "level") + .param("embedLevelDepth", "1") + .param("feature", alwaysTrue.getName()) + .param("eperson", eperson.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is( + eperson.getID().toString() + "_" + + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId() + )), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(eperson.getID().toString())) + ) + ))); + + // verify that it works for administators inspecting other users - by assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("projection", "level") + .param("embedLevelDepth", "1") + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is( + eperson.getID().toString() + "_" + + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId() + )), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(eperson.getID().toString())) + ) + ))); // verify that it works for anonymous users - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", comUri) .param("projection", "level") .param("embedLevelDepth", "1") .param("feature", alwaysTrue.getName())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.type", is("authorization"))) - .andExpect(jsonPath("$._embedded.feature.id", is(alwaysTrue.getName()))) - .andExpect(jsonPath("$.id",Matchers.is(alwaysTrue.getName() + "_" - + comRest.getUniqueType() + "_" + comRest.getId()))); - - // verify that it works for administrators inspecting anonymous users - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", comUri) - .param("projection", "level") - .param("embedLevelDepth", "1") - .param("feature", alwaysTrue.getName())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$.type", is("authorization"))) - .andExpect(jsonPath("$._embedded.feature.id", is(alwaysTrue.getName()))) - .andExpect(jsonPath("$.id",Matchers.is(alwaysTrue.getName() + "_" - + comRest.getUniqueType() + "_" + comRest.getId()))); + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is( + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + 
comRest.getId() + )), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson", nullValue()) + ) + ))); } @Test @@ -888,40 +1146,55 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); - // verify that it works for administrators String adminToken = getAuthToken(admin.getEmail(), password); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for administrators - with eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysFalse.getName()) .param("eperson", admin.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administrators - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysFalse.getName())) + .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for normal loggedin users String epersonToken = getAuthToken(eperson.getEmail(), password); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri) - .param("feature", trueForAdmins.getName()) - .param("eperson", eperson.getID().toString())) - .andExpect(status().isNoContent()); - // verify that it works for administators inspecting other users - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for normal loggedin users - with eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForAdmins.getName()) .param("eperson", eperson.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", trueForAdmins.getName())) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administators inspecting other users - by using the eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", trueForAdmins.getName()) + .param("eperson", eperson.getID().toString())) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administators inspecting other users - by assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", trueForAdmins.getName()) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(jsonPath("$.page.totalElements", is(0))); // verify that it works for anonymous users - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForLoggedUsers.getName())) - .andExpect(status().isNoContent()); - - // verify that it works for administrators inspecting anonymous users - 
getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri) - .param("feature", trueForLoggedUsers.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -939,75 +1212,103 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); - // verify that it works for administrators, no result + String adminToken = getAuthToken(admin.getEmail(), password); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for administrators, no result - with eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("feature", alwaysTrue.getName()) .param("eperson", admin.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(0))); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", "not-existing-feature") .param("eperson", admin.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administrators, no result - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", wrongSiteUri) + .param("feature", alwaysTrue.getName())) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", "not-existing-feature")) + .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for normal loggedin users String epersonToken = getAuthToken(eperson.getEmail(), password); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for normal loggedin users - with eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("feature", alwaysTrue.getName()) .param("eperson", eperson.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", "not-existing-feature") .param("eperson", eperson.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); - // verify that it works for administators inspecting other users - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", wrongSiteUri) + .param("feature", alwaysTrue.getName())) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", "not-existing-feature")) + 
.andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administators inspecting other users - by using the eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("feature", alwaysTrue.getName()) .param("eperson", eperson.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", "not-existing-feature") .param("eperson", eperson.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); + + // verify that it works for administators inspecting other users - by assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", wrongSiteUri) + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", "not-existing-feature") + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(jsonPath("$.page.totalElements", is(0))); // verify that it works for anonymous users - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", wrongSiteUri) .param("feature", alwaysTrue.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", "not-existing-feature")) - .andExpect(status().isNoContent()); - - // verify that it works for administrators inspecting anonymous users - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", wrongSiteUri) - .param("feature", alwaysTrue.getName())) - .andExpect(status().isNoContent()); - - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri) - .param("feature", "not-existing-feature")) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test /** * Verify that the findByObject return the 400 Bad Request response for invalid or missing URI or feature (required * parameters) - * + * * @throws Exception */ public void findByObjectAndFeatureBadRequestTest() throws Exception { @@ -1027,72 +1328,55 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration String epersonToken = getAuthToken(eperson.getEmail(), password); for (String invalidUri : invalidUris) { log.debug("findByObjectAndFeatureBadRequestTest - Testing the URI: " + invalidUri); - // verify that it works for administrators with an invalid or missing uri - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for administrators with an invalid or missing uri - with eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri) .param("feature", alwaysTrue.getName()) .param("eperson", admin.getID().toString())) 
.andExpect(status().isBadRequest()); - // verify that it works for normal loggedin users with an invalid or missing uri - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", invalidUri) - .param("feature", alwaysTrue.getName()) - .param("eperson", eperson.getID().toString())) - .andExpect(status().isBadRequest()); - - // verify that it works for administators inspecting other users with an invalid or missing uri - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", invalidUri) - .param("feature", alwaysTrue.getName()) - .param("eperson", eperson.getID().toString())) - .andExpect(status().isBadRequest()); - - // verify that it works for anonymous users with an invalid or missing uri - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for administrators with an invalid or missing uri - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri) .param("feature", alwaysTrue.getName())) .andExpect(status().isBadRequest()); - // verify that it works for administrators inspecting anonymous users with an invalid or missing uri - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for normal loggedin users with an invalid or missing uri - with eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri) + .param("feature", alwaysTrue.getName()) + .param("eperson", eperson.getID().toString())) + .andExpect(status().isBadRequest()); + + // verify that it works for normal loggedin users with an invalid or missing uri - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri) + .param("feature", alwaysTrue.getName())) + .andExpect(status().isBadRequest()); + + // verify that it works for administators inspecting other users with an invalid or missing uri - by + // using the eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri) + .param("feature", alwaysTrue.getName()) + .param("eperson", eperson.getID().toString())) + .andExpect(status().isBadRequest()); + + // verify that it works for administators inspecting other users with an invalid or missing uri - by + // assuming login + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", invalidUri) + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isBadRequest()); + + // verify that it works for anonymous users with an invalid or missing uri + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", invalidUri) .param("feature", alwaysTrue.getName())) .andExpect(status().isBadRequest()); } - - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("eperson", admin.getID().toString())) - .andExpect(status().isBadRequest()); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("eperson", eperson.getID().toString())) - .andExpect(status().isBadRequest()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("eperson", eperson.getID().toString())) - .andExpect(status().isBadRequest()); - 
getClient().perform(get("/api/authz/authorizations/search/objectAndFeature")) - .andExpect(status().isBadRequest()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature")) - .andExpect(status().isBadRequest()); - - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri) - .param("eperson", admin.getID().toString())) - .andExpect(status().isBadRequest()); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri) - .param("eperson", eperson.getID().toString())) - .andExpect(status().isBadRequest()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri) - .param("eperson", eperson.getID().toString())) - .andExpect(status().isBadRequest()); - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri)) - .andExpect(status().isBadRequest()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") - .param("uri", siteUri.toString())) - .andExpect(status().isBadRequest()); } @Test @@ -1109,18 +1393,33 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for an anonymous user inspecting an admin user - by using the eperson parameter + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysTrue.getName()) .param("eperson", admin.getID().toString())) .andExpect(status().isUnauthorized()); - // verify that it works for normal loggedin users with an invalid or missing uri - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that it works for an anonymous user inspecting an admin user - by assuming login + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", admin.getID())) + .andExpect(status().isUnauthorized()); + + // verify that it works for an anonymous user inspecting a normal user - by using the eperson parameter + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysTrue.getName()) .param("eperson", eperson.getID().toString())) .andExpect(status().isUnauthorized()); + + // verify that it works for an anonymous user inspecting a normal user - by assuming login + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isUnauthorized()); } @Test @@ -1141,19 +1440,34 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // disarm the alwaysThrowExceptionFeature configurationService.setProperty("org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", true); String anotherToken = getAuthToken(anotherEperson.getEmail(), password); - // verify that he cannot search the admin authorizations - getClient(anotherToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that he cannot search the admin authorizations - by using the eperson parameter + 
getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysTrue.getName()) .param("eperson", admin.getID().toString())) .andExpect(status().isForbidden()); - // verify that he cannot search the authorizations of another "normal" eperson - getClient(anotherToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + // verify that he cannot search the admin authorizations - by assuming login + getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", admin.getID())) + .andExpect(status().isForbidden()); + + // verify that he cannot search the authorizations of another "normal" eperson - by using the eperson parameter + getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysTrue.getName()) .param("eperson", eperson.getID().toString())) .andExpect(status().isForbidden()); + + // verify that he cannot search the authorizations of another "normal" eperson - by assuming login + getClient(anotherToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysTrue.getName()) + .header("X-On-Behalf-Of", eperson.getID())) + .andExpect(status().isForbidden()); } @Test @@ -1166,24 +1480,38 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); - // verify that it works for administrators String adminToken = getAuthToken(admin.getEmail(), password); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for administrators - with eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysException.getName()) .param("eperson", admin.getID().toString())) .andExpect(status().isInternalServerError()); - // verify that it works for normal loggedin users + // verify that it works for administrators - without eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysException.getName())) + .andExpect(status().isInternalServerError()); + String epersonToken = getAuthToken(eperson.getEmail(), password); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + + // verify that it works for normal loggedin users - with eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysException.getName()) .param("eperson", eperson.getID().toString())) .andExpect(status().isInternalServerError()); + // verify that it works for normal loggedin users - without eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", alwaysException.getName())) + .andExpect(status().isInternalServerError()); + // verify that it works for anonymous users - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", alwaysException.getName())) .andExpect(status().isInternalServerError()); @@ 
-1223,31 +1551,31 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // check both via direct access than via a search method getClient(adminToken).perform(get("/api/authz/authorizations/" + authAdminSite.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", admin.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); // nor the normal user both directly than if checked by the admin getClient(adminToken).perform(get("/api/authz/authorizations/" + authNormalUserSite.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", normalUser.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); getClient(normalUserToken).perform(get("/api/authz/authorizations/" + authNormalUserSite.getID())) .andExpect(status().isNotFound()); - getClient(normalUserToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(normalUserToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", normalUser.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); // instead the member user has getClient(adminToken).perform(get("/api/authz/authorizations/" + authMemberSite.getID())) .andExpect(status().isOk()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", memberOfTestGroup.getID().toString())) @@ -1255,7 +1583,7 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // so it can also check itself the permission getClient(memberToken).perform(get("/api/authz/authorizations/" + authMemberSite.getID())) .andExpect(status().isOk()); - getClient(memberToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(memberToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", memberOfTestGroup.getID().toString())) @@ -1271,7 +1599,7 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // our admin now should have the authorization getClient(adminToken).perform(get("/api/authz/authorizations/" + authAdminSite.getID())) .andExpect(status().isOk()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", admin.getID().toString())) @@ -1279,15 +1607,15 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // our normal user when checked via the admin should still not have the authorization 
getClient(adminToken).perform(get("/api/authz/authorizations/" + authNormalUserSite.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", normalUser.getID().toString())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); // but he should have the authorization if loggedin directly getClient(normalUserToken).perform(get("/api/authz/authorizations/" + authNormalUserSite.getID())) .andExpect(status().isOk()); - getClient(normalUserToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(normalUserToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", normalUser.getID().toString())) @@ -1295,20 +1623,79 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration // for our direct member user we don't expect differences getClient(adminToken).perform(get("/api/authz/authorizations/" + authMemberSite.getID())) .andExpect(status().isOk()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", memberOfTestGroup.getID().toString())) .andExpect(status().isOk()); getClient(memberToken).perform(get("/api/authz/authorizations/" + authMemberSite.getID())) .andExpect(status().isOk()); - getClient(memberToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(memberToken).perform(get("/api/authz/authorizations/search/object") .param("uri", siteUri) .param("feature", trueForUsersInGroupTest.getName()) .param("eperson", memberOfTestGroup.getID().toString())) .andExpect(status().isOk()); } + @Test + public void findByObjectAndFeatureFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + Community com = CommunityBuilder.createCommunity(context).withName("A test community").build(); + CommunityRest comRest = communityConverter.convert(com, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + context.restoreAuthSystemState(); + + String adminToken = getAuthToken(admin.getEmail(), password); + + // verify that it works for administrators - with eperson parameter + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("projection", "full") + .param("feature", alwaysTrue.getName()) + .param("eperson", admin.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", is(admin.getID().toString() + "_" + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId())), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(admin.getID().toString())), + hasJsonPath("$._embedded.object.id", is(com.getID().toString())) + ) + ))) + // This is the Full Projection data not visible to eperson's full projection + 
.andExpect(jsonPath("$._embedded.authorizations[0]._embedded.object._embedded.adminGroup", + nullValue())); + + String epersonToken = getAuthToken(eperson.getEmail(), password); + + // verify that it works for administrators - with eperson parameter + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("projection", "full") + .param("feature", alwaysTrue.getName()) + .param("eperson", eperson.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(1))) + .andExpect(jsonPath("$._embedded.authorizations", contains( + allOf( + hasJsonPath("$.id", + is(eperson.getID().toString() + "_" + alwaysTrue.getName() + "_" + + comRest.getUniqueType() + "_" + comRest.getId())), + hasJsonPath("$.type", is("authorization")), + hasJsonPath("$._embedded.feature.id", is(alwaysTrue.getName())), + hasJsonPath("$._embedded.eperson.id", is(eperson.getID().toString())), + hasJsonPath("$._embedded.object.id", is(com.getID().toString())) + ) + ))) + // This is the Full Projection data not visible to eperson's full projection + .andExpect( + jsonPath("$._embedded.authorizations[0]._embedded.object._embedded.adminGroup") + .doesNotExist()); + } + // utility methods to build authorization ID without having an authorization object private String getAuthorizationID(EPerson eperson, AuthorizationFeature feature, BaseObjectRest obj) { return getAuthorizationID(eperson != null ? eperson.getID().toString() : null, feature.getName(), @@ -1335,4 +1722,6 @@ public class AuthorizationRestRepositoryIT extends AbstractControllerIntegration return (epersonUuid != null ? epersonUuid + "_" : "") + featureName + "_" + type + "_" + id.toString(); } + + } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamControllerIT.java index 608232ef5d..ca3c05ec30 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamControllerIT.java @@ -22,15 +22,15 @@ import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.BundleBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.matcher.BundleMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Bundle; import org.dspace.content.Collection; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamFormatRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamFormatRestRepositoryIT.java index 744e673912..48ad410d00 100644 --- 
a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamFormatRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamFormatRestRepositoryIT.java @@ -25,14 +25,14 @@ import java.util.Random; import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.BitstreamFormatBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; import org.dspace.app.rest.converter.BitstreamFormatConverter; import org.dspace.app.rest.matcher.BitstreamFormatMatcher; import org.dspace.app.rest.matcher.HalMatcher; import org.dspace.app.rest.model.BitstreamFormatRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamFormatBuilder; +import org.dspace.builder.EPersonBuilder; import org.dspace.content.BitstreamFormat; import org.dspace.content.service.BitstreamFormatService; import org.dspace.core.I18nUtil; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestControllerIT.java index 96176d0a77..2a68c2e887 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestControllerIT.java @@ -11,13 +11,9 @@ import static java.util.UUID.randomUUID; import static org.apache.commons.codec.CharEncoding.UTF_8; import static org.apache.commons.collections.CollectionUtils.isEmpty; import static org.apache.commons.io.IOUtils.toInputStream; -import static org.dspace.app.rest.builder.BitstreamBuilder.createBitstream; -import static org.dspace.app.rest.builder.BitstreamFormatBuilder.createBitstreamFormat; -import static org.dspace.app.rest.builder.CollectionBuilder.createCollection; -import static org.dspace.app.rest.builder.CommunityBuilder.createCommunity; -import static org.dspace.app.rest.builder.ItemBuilder.createItem; -import static org.dspace.app.rest.builder.ResourcePolicyBuilder.createResourcePolicy; import static org.dspace.app.rest.matcher.BitstreamFormatMatcher.matchBitstreamFormat; +import static org.dspace.builder.BitstreamFormatBuilder.createBitstreamFormat; +import static org.dspace.builder.ResourcePolicyBuilder.createResourcePolicy; import static org.dspace.content.BitstreamFormat.KNOWN; import static org.dspace.content.BitstreamFormat.SUPPORTED; import static org.dspace.core.Constants.READ; @@ -53,14 +49,14 @@ import org.apache.commons.lang3.StringUtils; import org.apache.pdfbox.pdmodel.PDDocument; import org.apache.pdfbox.text.PDFTextStripper; import org.apache.solr.client.solrj.SolrServerException; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Bitstream; import org.dspace.content.BitstreamFormat; 
import org.dspace.content.Collection; @@ -129,22 +125,22 @@ public class BitstreamRestControllerIT extends AbstractControllerIntegrationTest context.turnOffAuthorisationSystem(); - Community community = createCommunity(context).build(); - Collection collection = createCollection(context, community).build(); - Item item = createItem(context, collection).build(); + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + Item item = ItemBuilder.createItem(context, collection).build(); - bitstream = createBitstream(context, item, toInputStream("test", UTF_8)) + bitstream = BitstreamBuilder.createBitstream(context, item, toInputStream("test", UTF_8)) .withFormat("test format") .build(); unknownFormat = bitstreamFormatService.findUnknown(context); knownFormat = createBitstreamFormat(context) - .withMimeType("known test mime type") - .withDescription("known test description") - .withShortDescription("known test short description") - .withSupportLevel(KNOWN) - .build(); + .withMimeType("known test mime type") + .withDescription("known test description") + .withShortDescription("known test short description") + .withSupportLevel(KNOWN) + .build(); supportedFormat = createBitstreamFormat(context) .withMimeType("supported mime type") @@ -729,7 +725,7 @@ public class BitstreamRestControllerIT extends AbstractControllerIntegrationTest // Find all hits/views of bitstream ObjectCount objectCount = solrLoggerService.queryTotal("type:" + Constants.BITSTREAM + - " AND id:" + bitstream.getID(), null); + " AND id:" + bitstream.getID(), null, 1); assertEquals(expectedNumberOfStatsRecords, objectCount.getCount()); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestRepositoryIT.java index 34bb56cbec..684eceb639 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BitstreamRestRepositoryIT.java @@ -23,18 +23,21 @@ import java.util.UUID; import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.matcher.BitstreamFormatMatcher; import org.dspace.app.rest.matcher.BitstreamMatcher; +import org.dspace.app.rest.matcher.BundleMatcher; import org.dspace.app.rest.matcher.HalMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.test.MetadataPatchSuite; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Bitstream; +import org.dspace.content.Bundle; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -1143,5 +1146,138 @@ public class BitstreamRestRepositoryIT extends AbstractControllerIntegrationTest } + @Test + public void getEmbeddedBundleForBitstream() throws Exception { + 
//We turn off the authorization system in order to create the structure as defined below + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and one collection. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + //2. One public item that is readable by Anonymous + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Test") + .withIssueDate("2010-10-17") + .withAuthor("Smith, Donald") + .withSubject("ExtraEntry") + .build(); + + String bitstreamContent = "ThisIsSomeDummyText"; + + //Add a bitstream to an item + Bitstream bitstream = null; + try (InputStream is = IOUtils.toInputStream(bitstreamContent, CharEncoding.UTF_8)) { + bitstream = BitstreamBuilder. + createBitstream(context, publicItem1, is) + .withName("Bitstream") + .withDescription("Description") + .withMimeType("text/plain") + .build(); + } + + Bundle bundle = bitstream.getBundles().get(0); + + //Get the bitstream with embedded bundle + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + "?embed=bundle")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.bundle", + BundleMatcher.matchProperties( + bundle.getName(), + bundle.getID(), + bundle.getHandle(), + bundle.getType() + ) + )); + } + + @Test + /** + * This test proves that, if a bitstream is linked to multiple bundles, we only ever return the first bundle. + * **NOTE: DSpace does NOT support or expect to have a bitstream linked to multiple bundles**. + * But, because the database does allow for it, this test simply proves the REST API will respond without an error. + */ + public void linksToFirstBundleWhenMultipleBundles() throws Exception { + //We turn off the authorization system in order to create the structure as defined below + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and one collection. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + //2. One public item that is readable by Anonymous + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Test") + .withIssueDate("2010-10-17") + .withAuthor("Smith, Donald") + .withSubject("ExtraEntry") + .build(); + + String bitstreamContent = "ThisIsSomeDummyText"; + + //Add a bitstream to an item + Bitstream bitstream = null; + try (InputStream is = IOUtils.toInputStream(bitstreamContent, CharEncoding.UTF_8)) { + bitstream = BitstreamBuilder.
+ createBitstream(context, publicItem1, is) + .withName("Bitstream") + .withDescription("Description") + .withMimeType("text/plain") + .build(); + } + + Bundle secondBundle = BundleBuilder.createBundle(context, publicItem1) + .withName("second bundle") + .withBitstream(bitstream).build(); + + Bundle bundle = bitstream.getBundles().get(0); + + //Get bundle should contain the first bundle in the list + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + "/bundle")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$", + BundleMatcher.matchProperties( + bundle.getName(), + bundle.getID(), + bundle.getHandle(), + bundle.getType() + ) + )); + } + + @Test + public void linksToEmptyWhenNoBundle() throws Exception { + // We turn off the authorization system in order to create the structure as defined below + context.turnOffAuthorisationSystem(); + + // ** GIVEN ** + // 1. A community with a logo + parentCommunity = CommunityBuilder.createCommunity(context).withName("Community").withLogo("logo_community") + .build(); + + // 2. A collection with a logo + Collection col = CollectionBuilder.createCollection(context, parentCommunity).withName("Collection") + .withLogo("logo_collection").build(); + + Bitstream bitstream = parentCommunity.getLogo(); + + //Get bundle should contain an empty response + getClient().perform(get("/api/core/bitstreams/" + bitstream.getID() + "/bundle")) + .andExpect(status().isNoContent()); + } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BrowsesResourceControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BrowsesResourceControllerIT.java index 88ca72b08a..0cf282b1ab 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BrowsesResourceControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BrowsesResourceControllerIT.java @@ -14,19 +14,20 @@ import static org.hamcrest.Matchers.containsString; import static org.hamcrest.Matchers.hasSize; import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.not; +import static org.hamcrest.Matchers.nullValue; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.BrowseEntryResourceMatcher; import org.dspace.app.rest.matcher.BrowseIndexMatcher; import org.dspace.app.rest.matcher.ItemMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -958,4 +959,168 @@ public class BrowsesResourceControllerIT extends AbstractControllerIntegrationTe ))); } + @Test + public void findBrowseByTitleItemsFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. 
A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + Collection col2 = CollectionBuilder.createCollection(context, child1).withName("Collection 2").build(); + + //2. One public item that is readable by Anonymous + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("Java").withSubject("Unit Testing") + .build(); + + + context.restoreAuthSystemState(); + + getClient().perform(get("/api/discover/browses/title/items") + .param("projection", "full")) + + //** THEN ** + //The status has to be 200 OK + .andExpect(status().isOk()) + //We expect the content type to be "application/hal+json;charset=UTF-8" + .andExpect(content().contentType(contentType)) + // The full projection for anon shouldn't show the adminGroup in the response + .andExpect( + jsonPath("$._embedded.items[0]._embedded.owningCollection._embedded.adminGroup").doesNotExist()); + + + String adminToken = getAuthToken(admin.getEmail(), password); + getClient(adminToken).perform(get("/api/discover/browses/title/items") + .param("projection", "full")) + + //** THEN ** + //The status has to be 200 OK + .andExpect(status().isOk()) + //We expect the content type to be "application/hal+json;charset=UTF-8" + .andExpect(content().contentType(contentType)) + // The full projection for admin should show the adminGroup in the response + .andExpect(jsonPath("$._embedded.items[0]._embedded.owningCollection._embedded.adminGroup", + nullValue())); + } + + @Test + public void browseByAuthorFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community and one collection. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, parentCommunity).withName("Collection 1").build(); + + //2.
Twenty-one public items that are readable by Anonymous + for (int i = 0; i <= 20; i++) { + ItemBuilder.createItem(context, col1) + .withTitle("Public item " + String.format("%02d", i)) + .withIssueDate("2017-10-17") + .withAuthor("Test, Author" + String.format("%02d", i)) + .withSubject("Java").withSubject("Unit Testing") + .build(); + } + + context.restoreAuthSystemState(); + + + getClient().perform(get("/api/discover/browses/author/entries") + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(21))) + .andExpect(jsonPath("$.page.totalPages", is(2))) + .andExpect(jsonPath("$.page.number", is(0))) + .andExpect( + jsonPath("$._links.next.href", Matchers.containsString("/api/discover/browses/author/entries"))) + .andExpect( + jsonPath("$._links.last.href", Matchers.containsString("/api/discover/browses/author/entries"))) + .andExpect( + jsonPath("$._links.self.href", Matchers.endsWith("/api/discover/browses/author/entries"))); + + String adminToken = getAuthToken(admin.getEmail(), password); + getClient(adminToken).perform(get("/api/discover/browses/author/entries") + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(21))) + .andExpect(jsonPath("$.page.totalPages", is(2))) + .andExpect(jsonPath("$.page.number", is(0))) + .andExpect(jsonPath("$._links.next.href", + Matchers.containsString("/api/discover/browses/author/entries"))) + .andExpect(jsonPath("$._links.last.href", + Matchers.containsString("/api/discover/browses/author/entries"))) + .andExpect( + jsonPath("$._links.self.href", + Matchers.endsWith("/api/discover/browses/author/entries"))); + + getClient().perform(get("/api/discover/browses/author/entries")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(21))) + .andExpect(jsonPath("$.page.totalPages", is(2))) + .andExpect(jsonPath("$.page.number", is(0))) + .andExpect( + jsonPath("$._links.next.href", Matchers.containsString("/api/discover/browses/author/entries"))) + .andExpect( + jsonPath("$._links.last.href", Matchers.containsString("/api/discover/browses/author/entries"))) + .andExpect( + jsonPath("$._links.self.href", Matchers.endsWith("/api/discover/browses/author/entries"))); + + } + + @Test + public void testBrowseByDateIssuedItemsFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + Collection col2 = CollectionBuilder.createCollection(context, child1).withName("Collection 2").build(); + + Item item1 = ItemBuilder.createItem(context, col1) + .withTitle("Item 1") + .withIssueDate("2017-10-17") + .build(); + + context.restoreAuthSystemState(); + + getClient().perform(get("/api/discover/browses/dateissued/items") + .param("projection", "full")) + + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect( + jsonPath("$._embedded.items[0]._embedded.owningCollection._embedded.adminGroup").doesNotExist()); + + String adminToken = getAuthToken(admin.getEmail(), password); + getClient(adminToken).perform(get("/api/discover/browses/dateissued/items") + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.items[0]._embedded.owningCollection._embedded.adminGroup", + nullValue())); + } + + } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleRestRepositoryIT.java index 07d6645c00..b99dd5bf99 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleRestRepositoryIT.java @@ -27,16 +27,10 @@ import javax.ws.rs.core.MediaType; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.BundleBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.matcher.BitstreamMatcher; import org.dspace.app.rest.matcher.BundleMatcher; import org.dspace.app.rest.matcher.HalMatcher; +import org.dspace.app.rest.matcher.ItemMatcher; import org.dspace.app.rest.matcher.MetadataMatcher; import org.dspace.app.rest.model.BundleRest; import org.dspace.app.rest.model.MetadataRest; @@ -46,10 +40,18 @@ import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.ResourcePolicy; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Bundle; import org.dspace.content.Collection; import org.dspace.content.Item; +import org.dspace.content.service.ItemService; import org.dspace.core.Constants; import org.dspace.eperson.EPerson; import org.hamcrest.Matchers; @@ -63,6 +65,9 @@ public class BundleRestRepositoryIT extends AbstractControllerIntegrationTest { @Autowired ResourcePolicyService resourcePolicyService; + @Autowired + ItemService 
itemService; + private Collection collection; private Item item; private Bundle bundle1; @@ -71,6 +76,7 @@ public class BundleRestRepositoryIT extends AbstractControllerIntegrationTest { private Bitstream bitstream2; @Before + @Override public void setUp() throws Exception { super.setUp(); @@ -636,4 +642,54 @@ public class BundleRestRepositoryIT extends AbstractControllerIntegrationTest { .andExpect(status().isOk()); } + @Test + public void getEmbeddedItemForBundle() throws Exception { + context.turnOffAuthorisationSystem(); + + bundle1 = BundleBuilder.createBundle(context, item) + .withName("testname") + .build(); + + context.restoreAuthSystemState(); + + getClient().perform(get("/api/core/bundles/" + bundle1.getID() + "?embed=item")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.item", + ItemMatcher.matchItemWithTitleAndDateIssued(item, "Public item 1", "2017-10-17") + )); + } + + @Test + /** + * This test proves that, if a bundle is linked to multiple items, we only ever return the first item. + * **NOTE: DSpace does NOT support or expect to have a bundle linked to multiple items**. + * But, because the database does allow for it, this test simply proves the REST API will respond without an error + */ + public void linksToFirstItemWhenMultipleItems() throws Exception { + context.turnOffAuthorisationSystem(); + + bundle1 = BundleBuilder.createBundle(context, item) + .withName("testname") + .build(); + + Item item2 = ItemBuilder.createItem(context, collection) + .withTitle("Public item 2") + .withIssueDate("2020-07-08") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("SecondEntry") + .build(); + + itemService.addBundle(context, item2, bundle1); + + context.restoreAuthSystemState(); + + getClient().perform(get("/api/core/bundles/" + bundle1.getID() + "/item")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$", + ItemMatcher.matchItemWithTitleAndDateIssued(item, "Public item 1", "2017-10-17") + )); + } + } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleUploadBitstreamControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleUploadBitstreamControllerIT.java index efb07b3f1d..eefcb81656 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleUploadBitstreamControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/BundleUploadBitstreamControllerIT.java @@ -18,10 +18,6 @@ import java.util.Map; import java.util.UUID; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.BundleBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.BitstreamMatcher; import org.dspace.app.rest.matcher.MetadataMatcher; import org.dspace.app.rest.model.BitstreamRest; @@ -29,6 +25,10 @@ import org.dspace.app.rest.model.MetadataRest; import org.dspace.app.rest.model.MetadataValueRest; import org.dspace.app.rest.test.AbstractEntityIntegrationTest; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Bundle; import org.dspace.content.Collection; import org.dspace.content.Community; diff --git 
a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseAddPatchOperationIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseAddPatchOperationIT.java index 4f9c753047..fcb814d82d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseAddPatchOperationIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseAddPatchOperationIT.java @@ -20,12 +20,12 @@ import java.util.ArrayList; import java.util.List; import javax.ws.rs.core.MediaType; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.model.patch.AddOperation; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.WorkspaceItem; @@ -61,7 +61,7 @@ public class CCLicenseAddPatchOperationIT extends AbstractControllerIntegrationT String adminToken = getAuthToken(admin.getEmail(), password); - List ops = new ArrayList(); + List ops = new ArrayList<>(); AddOperation addOperation = new AddOperation("/sections/cclicense/uri", "http://creativecommons.org/licenses/by-nc-sa/4.0/"); @@ -102,7 +102,7 @@ public class CCLicenseAddPatchOperationIT extends AbstractControllerIntegrationT String adminToken = getAuthToken(admin.getEmail(), password); - List ops = new ArrayList(); + List ops = new ArrayList<>(); AddOperation addOperation = new AddOperation("/sections/cclicense/uri", "invalid-license-uri"); ops.add(addOperation); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseRemovePatchOperationIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseRemovePatchOperationIT.java index 3b05621f08..c003cf2809 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseRemovePatchOperationIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CCLicenseRemovePatchOperationIT.java @@ -19,13 +19,13 @@ import java.util.ArrayList; import java.util.List; import javax.ws.rs.core.MediaType; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.model.patch.AddOperation; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.RemoveOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.WorkspaceItem; @@ -62,7 +62,7 @@ public class CCLicenseRemovePatchOperationIT extends AbstractControllerIntegrati String epersonToken = getAuthToken(eperson.getEmail(), password); // First add a license and verify it is added - List ops = new ArrayList(); + List ops = new ArrayList<>(); AddOperation addOperation = new AddOperation("/sections/cclicense/uri", "http://creativecommons.org/licenses/by-nc-sa/4.0/"); @@ -84,7 +84,7 @@ public class CCLicenseRemovePatchOperationIT extends AbstractControllerIntegrati // Remove the license again and verify it is removed - List removeOps 
= new ArrayList(); + List removeOps = new ArrayList<>(); RemoveOperation removeOperation = new RemoveOperation("/sections/cclicense/uri"); removeOps.add(removeOperation); @@ -120,7 +120,7 @@ public class CCLicenseRemovePatchOperationIT extends AbstractControllerIntegrati String epersonToken = getAuthToken(eperson.getEmail(), password); - List removeOps = new ArrayList(); + List removeOps = new ArrayList<>(); RemoveOperation removeOperation = new RemoveOperation("/sections/cclicense/uri"); removeOps.add(removeOperation); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionGroupRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionGroupRestControllerIT.java index 7464e9c38c..767ea5f565 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionGroupRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionGroupRestControllerIT.java @@ -20,14 +20,14 @@ import java.util.UUID; import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.matcher.GroupMatcher; import org.dspace.app.rest.model.GroupRest; import org.dspace.app.rest.model.MetadataRest; import org.dspace.app.rest.model.MetadataValueRest; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Collection; import org.dspace.content.service.CollectionService; import org.dspace.core.Constants; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionHarvestSettingsControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionHarvestSettingsControllerIT.java index dd5f9b6b83..e7479786e7 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionHarvestSettingsControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionHarvestSettingsControllerIT.java @@ -21,15 +21,15 @@ import java.sql.SQLException; import java.util.List; import java.util.Map; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; import org.dspace.app.rest.matcher.HarvesterMetadataMatcher; import org.dspace.app.rest.matcher.MetadataConfigsMatcher; import org.dspace.app.rest.model.HarvestTypeEnum; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.AuthorizeException; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.core.Constants; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionLogoControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionLogoControllerIT.java index e82f845697..f093156000 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionLogoControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionLogoControllerIT.java @@ -14,9 +14,9 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. 
import java.util.Map; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Collection; import org.junit.Before; import org.junit.Test; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionRestRepositoryIT.java index 62dc114c6e..26bead3a02 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CollectionRestRepositoryIT.java @@ -27,11 +27,6 @@ import java.util.UUID; import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.converter.CollectionConverter; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.CommunityMatcher; @@ -46,6 +41,11 @@ import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.test.MetadataPatchSuite; import org.dspace.authorize.service.AuthorizeService; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.core.Constants; @@ -305,6 +305,44 @@ public class CollectionRestRepositoryIT extends AbstractControllerIntegrationTes col1.getName(), col1.getID(), col1.getHandle()))); } + @Test + public void findOneCollectionFullProjectionTest() throws Exception { + + //We turn off the authorization system in order to create the structure as defined below + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and one collection. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Community child2 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community Two") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + context.restoreAuthSystemState(); + + + String adminToken = getAuthToken(admin.getEmail(), password); + getClient(adminToken).perform(get("/api/core/collections/" + col1.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$", CollectionMatcher.matchCollectionEntryFullProjection( + col1.getName(), col1.getID(), col1.getHandle()))); + + getClient().perform(get("/api/core/collections/" + col1.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$", Matchers.not(CollectionMatcher.matchCollectionEntryFullProjection( + col1.getName(), col1.getID(), col1.getHandle())))); + } + @Test public void findOneCollectionUnAuthenticatedTest() throws Exception { @@ -435,7 +473,7 @@ public class CollectionRestRepositoryIT extends AbstractControllerIntegrationTes CollectionMatcher.matchCollectionEntrySpecificEmbedProjection(col2.getName(), col2.getID(), col2.getHandle()) ))) - ) + ) ; getClient().perform(get("/api/core/collections/" + col1.getID() + "/logo")) diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityAdminGroupRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityAdminGroupRestControllerIT.java index fb00219a4d..37548553b1 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityAdminGroupRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityAdminGroupRestControllerIT.java @@ -21,10 +21,6 @@ import java.util.UUID; import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; import org.dspace.app.rest.matcher.EPersonMatcher; import org.dspace.app.rest.matcher.GroupMatcher; import org.dspace.app.rest.model.GroupRest; @@ -32,6 +28,10 @@ import org.dspace.app.rest.model.MetadataRest; import org.dspace.app.rest.model.MetadataValueRest; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.service.CollectionService; @@ -157,7 +157,7 @@ public class CommunityAdminGroupRestControllerIT extends AbstractControllerInteg .andExpect(status().isCreated()) .andDo(result -> idRef .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id"))) - ); + ); // no needs to explicitly cleanup the group created as the community comes // from a CommunityBuilder that will cleanup also related groups Group adminGroup = groupService.find(context, idRef.get()); @@ -188,7 
+188,7 @@ public class CommunityAdminGroupRestControllerIT extends AbstractControllerInteg .andExpect(status().isCreated()) .andDo(result -> idRef .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id"))) - ); + ); // no needs to explicitly cleanup the group created as the community comes // from a CommunityBuilder that will cleanup also related groups Group adminGroup = groupService.find(context, idRef.get()); @@ -249,7 +249,7 @@ public class CommunityAdminGroupRestControllerIT extends AbstractControllerInteg .andExpect(status().isCreated()) .andDo(result -> idRef .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id"))) - ); + ); // no needs to explicitly cleanup the group created as the community comes // from a CommunityBuilder that will cleanup also related groups Group adminGroup = groupService.find(context, idRef.get()); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityCollectionItemParentIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityCollectionItemParentIT.java index d85cf34d6a..a29d9494c9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityCollectionItemParentIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityCollectionItemParentIT.java @@ -16,14 +16,14 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import java.sql.SQLException; import java.util.UUID; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.CommunityMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.AuthorizeException; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityLogoControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityLogoControllerIT.java index 22174c4c0c..1d34a99dd9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityLogoControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityLogoControllerIT.java @@ -14,8 +14,8 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. 
import java.util.Map; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CommunityBuilder; import org.junit.Before; import org.junit.Test; import org.springframework.http.MediaType; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityRestRepositoryIT.java index 56ab3c1972..9d1553ab70 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/CommunityRestRepositoryIT.java @@ -30,9 +30,6 @@ import java.util.stream.Collectors; import java.util.stream.StreamSupport; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; import org.dspace.app.rest.converter.CommunityConverter; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.CommunityMatcher; @@ -47,6 +44,9 @@ import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.test.MetadataPatchSuite; import org.dspace.authorize.service.AuthorizeService; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.service.CommunityService; @@ -115,16 +115,16 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest String authToken = getAuthToken(admin.getEmail(), password); // Capture the UUID of the created Community (see andDo() below) - AtomicReference idRef = new AtomicReference(); - AtomicReference idRefNoEmbeds = new AtomicReference(); + AtomicReference idRef = new AtomicReference<>(); + AtomicReference idRefNoEmbeds = new AtomicReference<>(); try { getClient(authToken).perform(post("/api/core/communities") .content(mapper.writeValueAsBytes(comm)) .contentType(contentType) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isCreated()) .andExpect(content().contentType(contentType)) - .andExpect(jsonPath("$", CommunityMatcher.matchFullEmbeds())) + .andExpect(jsonPath("$", CommunityMatcher.matchNonAdminEmbeds())) .andExpect(jsonPath("$", Matchers.allOf( hasJsonPath("$.id", not(empty())), hasJsonPath("$.uuid", not(empty())), @@ -234,7 +234,7 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest new MetadataValueRest("Title Text"))); // Capture the UUID of the created Community (see andDo() below) - AtomicReference idRef = new AtomicReference(); + AtomicReference idRef = new AtomicReference<>(); try { getClient(authToken).perform(post("/api/core/communities") .content(mapper.writeValueAsBytes(comm)) @@ -326,15 +326,15 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) 
.andExpect(jsonPath("$._embedded.communities", Matchers.containsInAnyOrder( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()), CommunityMatcher - .matchCommunityEntryFullProjection(child1.getName(), child1.getID(), child1.getHandle()) + .matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ))) .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/core/communities"))) .andExpect(jsonPath("$.page.size", is(20))) @@ -360,13 +360,13 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities").param("size", "2") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.containsInAnyOrder( CommunityMatcher.matchCommunityEntryMultipleTitles(titles, parentCommunity.getID(), parentCommunity.getHandle()), - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ))) .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/core/communities"))) @@ -392,13 +392,13 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities").param("size", "2") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.containsInAnyOrder( CommunityMatcher.matchCommunityEntryMultipleTitles(titles, parentCommunity.getID(), parentCommunity.getHandle()), - CommunityMatcher.matchCommunityEntryFullProjection(childCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(childCommunity.getName(), childCommunity.getID(), childCommunity.getHandle()) ))) @@ -408,14 +408,14 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest 2, 4))); getClient().perform(get("/api/core/communities").param("size", "2").param("page", "1") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.containsInAnyOrder( - CommunityMatcher.matchCommunityEntryFullProjection(secondParentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(secondParentCommunity.getName(), secondParentCommunity.getID(), secondParentCommunity.getHandle()), - CommunityMatcher.matchCommunityEntryFullProjection(thirdParentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(thirdParentCommunity.getName(), thirdParentCommunity.getID(), thirdParentCommunity.getHandle()) ))) @@ -433,11 +433,11 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", 
CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.contains( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) @@ -499,17 +499,17 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest getClient().perform(get("/api/core/communities") .param("size", "1") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.contains( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) .andExpect(jsonPath("$._embedded.communities", Matchers.not( Matchers.contains( - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ) ))) @@ -519,16 +519,16 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest getClient().perform(get("/api/core/communities") .param("size", "1") .param("page", "1") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.contains( - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ))) .andExpect(jsonPath("$._embedded.communities", Matchers.not( Matchers.contains( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ) @@ -662,10 +662,10 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest // When full projection is requested, response should include expected properties, links, and embeds. getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) - .andExpect(jsonPath("$", CommunityMatcher.matchFullEmbeds())) + .andExpect(jsonPath("$", CommunityMatcher.matchNonAdminEmbeds())) .andExpect(jsonPath("$", CommunityMatcher.matchCommunityEntry( parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()))); @@ -679,6 +679,39 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()))); } + @Test + public void findOneFullProjectionTest() throws Exception { + //We turn off the authorization system in order to create the structure as defined below + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and one collection. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + context.restoreAuthSystemState(); + + String adminToken = getAuthToken(admin.getEmail(), password); + getClient(adminToken).perform(get("/api/core/communities/" + parentCommunity.getID().toString()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$", CommunityMatcher.matchCommunityEntryFullProjection( + parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()))); + + getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$", Matchers.not(CommunityMatcher.matchCommunityEntryFullProjection( + parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle())))); + } + @Test public void findOneUnAuthenticatedTest() throws Exception { context.turnOffAuthorisationSystem(); @@ -778,17 +811,17 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) .andExpect(jsonPath("$", Matchers.not( Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ) ))) @@ -860,21 +893,21 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities/search/top") - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.communities", Matchers.containsInAnyOrder( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()), - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity2.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity2.getName(), parentCommunity2.getID(), parentCommunity2.getHandle()) ))) .andExpect(jsonPath("$._embedded.communities", Matchers.not(Matchers.containsInAnyOrder( - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()), - CommunityMatcher.matchCommunityEntryFullProjection(child12.getName(), child12.getID(), + 
CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child12.getName(), child12.getID(), child12.getHandle()) )))) .andExpect( @@ -1337,17 +1370,17 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) .andExpect(jsonPath("$", Matchers.not( Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ) ))) @@ -1374,13 +1407,13 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest ; getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection("Electronic theses and dissertations", - parentCommunity.getID(), - parentCommunity.getHandle()) + CommunityMatcher.matchCommunityEntryNonAdminEmbeds("Electronic theses and dissertations", + parentCommunity.getID(), + parentCommunity.getHandle()) ))) .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/core/communities"))) ; @@ -1429,11 +1462,11 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient(token).perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) @@ -1492,11 +1525,11 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) @@ -1526,11 +1559,11 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient(token).perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", 
CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) @@ -1563,17 +1596,17 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(parentCommunity.getName(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(parentCommunity.getName(), parentCommunity.getID(), parentCommunity.getHandle()) ))) .andExpect(jsonPath("$", Matchers.not( Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection(child1.getName(), child1.getID(), + CommunityMatcher.matchCommunityEntryNonAdminEmbeds(child1.getName(), child1.getID(), child1.getHandle()) ) ))) @@ -1603,13 +1636,13 @@ public class CommunityRestRepositoryIT extends AbstractControllerIntegrationTest ; getClient().perform(get("/api/core/communities/" + parentCommunity.getID().toString()) - .param("embed", CommunityMatcher.getFullEmbedsParameters())) + .param("embed", CommunityMatcher.getNonAdminEmbeds())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( - CommunityMatcher.matchCommunityEntryFullProjection("Electronic theses and dissertations", - parentCommunity.getID(), - parentCommunity.getHandle()) + CommunityMatcher.matchCommunityEntryNonAdminEmbeds("Electronic theses and dissertations", + parentCommunity.getID(), + parentCommunity.getHandle()) ))) .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/core/communities"))) ; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/DiscoveryRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/DiscoveryRestControllerIT.java index 5b8022120d..bd5ceb75d9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/DiscoveryRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/DiscoveryRestControllerIT.java @@ -25,15 +25,6 @@ import java.util.UUID; import com.jayway.jsonpath.matchers.JsonPathMatchers; import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.ClaimedTaskBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.WorkflowItemBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.matcher.AppliedFilterMatcher; import org.dspace.app.rest.matcher.FacetEntryMatcher; import org.dspace.app.rest.matcher.FacetValueMatcher; @@ -45,6 +36,15 @@ import org.dspace.app.rest.matcher.SortOptionMatcher; import org.dspace.app.rest.matcher.WorkflowItemMatcher; import org.dspace.app.rest.matcher.WorkspaceItemMatcher; import 
org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.ClaimedTaskBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.WorkflowItemBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; @@ -95,11 +95,11 @@ public class DiscoveryRestControllerIT extends AbstractControllerIntegrationTest //We have 4 facets in the default configuration, they need to all be present in the embedded section .andExpect(jsonPath("$._embedded.facets", containsInAnyOrder( FacetEntryMatcher.authorFacet(false), - FacetEntryMatcher.entityTypeFacet(false), + FacetEntryMatcher.entityTypeFacet(false), FacetEntryMatcher.dateIssuedFacet(false), FacetEntryMatcher.subjectFacet(false), FacetEntryMatcher.hasContentInOriginalBundleFacet(false))) - ); + ); } @Test @@ -1351,9 +1351,10 @@ public class DiscoveryRestControllerIT extends AbstractControllerIntegrationTest context.restoreAuthSystemState(); - //** WHEN ** - //An anonymous user browses this endpoint to find the the objects in the system - //With a dsoType 'item' + // ** WHEN ** + // An anonymous user browses this endpoint to find the objects in the system + + // With dsoType 'item' getClient().perform(get("/api/discover/search/objects") .param("dsoType", "Item")) @@ -1384,8 +1385,118 @@ public class DiscoveryRestControllerIT extends AbstractControllerIntegrationTest FacetEntryMatcher.hasContentInOriginalBundleFacet(false) ))) //There always needs to be a self link available - .andExpect(jsonPath("$._links.self.href", containsString("/api/discover/search/objects"))) - ; + .andExpect(jsonPath("$._links.self.href", containsString("/api/discover/search/objects"))); + + // With dsoTypes 'community' and 'collection' + getClient().perform(get("/api/discover/search/objects") + .param("dsoType", "Community") + .param("dsoType", "Collection")) + + //** THEN ** + //The status has to be 200 OK + .andExpect(status().isOk()) + //The type has to be 'discover' + .andExpect(jsonPath("$.type", is("discover"))) + // The page element needs to look like this and only have four totalElements because we only want + // the communities and the collections (dsoType) and we only created two of both types + .andExpect(jsonPath("$._embedded.searchResult.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 4) + ))) + // Only the two communities and the two collections can be present in the embedded.objects section + // as that's what we specified in the dsoType parameter + .andExpect(jsonPath("$._embedded.searchResult._embedded.objects", Matchers.containsInAnyOrder( + SearchResultMatcher.match("core", "community", "communities"), + SearchResultMatcher.match("core", "community", "communities"), + SearchResultMatcher.match("core", "collection", "collections"), + SearchResultMatcher.match("core", "collection", "collections") + ))) + //These facets have to show up in the embedded.facets section as well with the given hasMore + // property because we don't exceed their default limit for a hasMore true (the default is 10) + .andExpect(jsonPath("$._embedded.facets", Matchers.containsInAnyOrder( + FacetEntryMatcher.authorFacet(false), + FacetEntryMatcher.entityTypeFacet(false), +
FacetEntryMatcher.subjectFacet(false), + FacetEntryMatcher.dateIssuedFacet(false), + FacetEntryMatcher.hasContentInOriginalBundleFacet(false) + ))) + //There always needs to be a self link available + .andExpect(jsonPath("$._links.self.href", containsString("/api/discover/search/objects"))); + + // With dsoTypes 'collection' and 'item' + getClient().perform(get("/api/discover/search/objects") + .param("dsoType", "Collection") + .param("dsoType", "Item")) + + //** THEN ** + //The status has to be 200 OK + .andExpect(status().isOk()) + //The type has to be 'discover' + .andExpect(jsonPath("$.type", is("discover"))) + // The page element needs to look like this and only have five totalElements because we only want + // the collections and the items (dsoType) and we only created two collections and three items + .andExpect(jsonPath("$._embedded.searchResult.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 5) + ))) + // Only the two collections and the three items can be present in the embedded.objects section + // as that's what we specified in the dsoType parameter + .andExpect(jsonPath("$._embedded.searchResult._embedded.objects", Matchers.containsInAnyOrder( + SearchResultMatcher.match("core", "collection", "collections"), + SearchResultMatcher.match("core", "collection", "collections"), + SearchResultMatcher.match("core", "item", "items"), + SearchResultMatcher.match("core", "item", "items"), + SearchResultMatcher.match("core", "item", "items") + ))) + //These facets have to show up in the embedded.facets section as well with the given hasMore + // property because we don't exceed their default limit for a hasMore true (the default is 10) + .andExpect(jsonPath("$._embedded.facets", Matchers.containsInAnyOrder( + FacetEntryMatcher.authorFacet(false), + FacetEntryMatcher.entityTypeFacet(false), + FacetEntryMatcher.subjectFacet(false), + FacetEntryMatcher.dateIssuedFacet(false), + FacetEntryMatcher.hasContentInOriginalBundleFacet(false) + ))) + //There always needs to be a self link available + .andExpect(jsonPath("$._links.self.href", containsString("/api/discover/search/objects"))); + + // With dsoTypes 'community', 'collection' and 'item' + getClient().perform(get("/api/discover/search/objects") + .param("dsoType", "Community") + .param("dsoType", "Collection") + .param("dsoType", "Item")) + + //** THEN ** + //The status has to be 200 OK + .andExpect(status().isOk()) + //The type has to be 'discover' + .andExpect(jsonPath("$.type", is("discover"))) + // The page element needs to look like this and have seven totalElements because we want + // the communities, the collections and the items (dsoType) and we created two communities, + // two collections and three items + .andExpect(jsonPath("$._embedded.searchResult.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 7) + ))) + // The two communities, the two collections and the three items can be present in the embedded.objects + // section as that's what we specified in the dsoType parameter + .andExpect(jsonPath("$._embedded.searchResult._embedded.objects", Matchers.containsInAnyOrder( + SearchResultMatcher.match("core", "community", "communities"), + SearchResultMatcher.match("core", "community", "communities"), + SearchResultMatcher.match("core", "collection", "collections"), + SearchResultMatcher.match("core", "collection", "collections"), + SearchResultMatcher.match("core", "item", "items"), + SearchResultMatcher.match("core", "item", "items"), + SearchResultMatcher.match("core", "item", "items") 
+ ))) + //These facets have to show up in the embedded.facets section as well with the given hasMore + // property because we don't exceed their default limit for a hasMore true (the default is 10) + .andExpect(jsonPath("$._embedded.facets", Matchers.containsInAnyOrder( + FacetEntryMatcher.authorFacet(false), + FacetEntryMatcher.entityTypeFacet(false), + FacetEntryMatcher.subjectFacet(false), + FacetEntryMatcher.dateIssuedFacet(false), + FacetEntryMatcher.hasContentInOriginalBundleFacet(false) + ))) + //There always needs to be a self link available + .andExpect(jsonPath("$._links.self.href", containsString("/api/discover/search/objects"))); } @Test diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/EPersonRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/EPersonRestRepositoryIT.java index bdadb8c8d2..f2924467b5 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/EPersonRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/EPersonRestRepositoryIT.java @@ -16,6 +16,12 @@ import static org.hamcrest.Matchers.empty; import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.not; import static org.hamcrest.Matchers.nullValue; +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertNotEquals; +import static org.junit.Assert.assertNotNull; +import static org.junit.Assert.assertNull; +import static org.junit.Assert.assertTrue; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.delete; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.patch; @@ -32,11 +38,8 @@ import java.util.concurrent.atomic.AtomicReference; import javax.ws.rs.core.MediaType; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; +import org.apache.commons.lang3.StringUtils; +import org.dspace.app.rest.jackson.IgnoreJacksonWriteOnlyAccess; import org.dspace.app.rest.matcher.EPersonMatcher; import org.dspace.app.rest.matcher.GroupMatcher; import org.dspace.app.rest.matcher.HalMatcher; @@ -44,15 +47,25 @@ import org.dspace.app.rest.matcher.MetadataMatcher; import org.dspace.app.rest.model.EPersonRest; import org.dspace.app.rest.model.MetadataRest; import org.dspace.app.rest.model.MetadataValueRest; +import org.dspace.app.rest.model.RegistrationRest; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.ReplaceOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.test.MetadataPatchSuite; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.WorkflowItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; -import org.dspace.content.Item; import org.dspace.eperson.EPerson; import org.dspace.eperson.Group; +import org.dspace.eperson.PasswordHash; +import org.dspace.eperson.dao.RegistrationDataDAO; +import org.dspace.eperson.service.AccountService; +import org.dspace.eperson.service.EPersonService; 
+import org.dspace.eperson.service.RegistrationDataService; import org.dspace.services.ConfigurationService; import org.hamcrest.Matchers; import org.junit.Test; @@ -61,6 +74,17 @@ import org.springframework.beans.factory.annotation.Autowired; public class EPersonRestRepositoryIT extends AbstractControllerIntegrationTest { + @Autowired + private AccountService accountService; + + @Autowired + private RegistrationDataService registrationDataService; + + @Autowired + private EPersonService ePersonService; + + @Autowired + private RegistrationDataDAO registrationDataDAO; @Autowired private ConfigurationService configurationService; @@ -128,6 +152,39 @@ public class EPersonRestRepositoryIT extends AbstractControllerIntegrationTest { } } + @Test + public void createAnonAccessDeniedTest() throws Exception { + context.turnOffAuthorisationSystem(); + // we should check how to get it from Spring + ObjectMapper mapper = new ObjectMapper(); + EPersonRest data = new EPersonRest(); + EPersonRest dataFull = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + data.setEmail("createtest@fake-email.com"); + data.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + data.setMetadata(metadataRest); + dataFull.setEmail("createtestFull@fake-email.com"); + dataFull.setCanLogIn(true); + dataFull.setMetadata(metadataRest); + + context.restoreAuthSystemState(); + + getClient().perform(post("/api/eperson/epersons") + .content(mapper.writeValueAsBytes(data)) + .contentType(contentType) + .param("projection", "full")) + .andExpect(status().isUnauthorized()); + getClient().perform(get("/api/eperson/epersons/search/byEmail") + .param("email", data.getEmail())) + .andExpect(status().isNoContent()); + } + @Test public void findAllTest() throws Exception { context.turnOffAuthorisationSystem(); @@ -740,7 +797,7 @@ public class EPersonRestRepositoryIT extends AbstractControllerIntegrationTest { } @Test - public void deleteViolatingConstraints() throws Exception { + public void deleteViolatingWorkFlowConstraints() throws Exception { // We turn off the authorization system in order to create the structure as defined below context.turnOffAuthorisationSystem(); @@ -759,12 +816,13 @@ public class EPersonRestRepositoryIT extends AbstractControllerIntegrationTest { // 2. A collection with a logo Collection col = CollectionBuilder.createCollection(context, parentCommunity).withName("Collection") - .withLogo("logo_collection").build(); + .withLogo("logo_collection") + .withWorkflowGroup(1, ePerson) + .build(); // 3. 
Create an item that will prevent the deletion of the eperson account (it is the submitter) - Item item = ItemBuilder.createItem(context, col).build(); - + WorkflowItemBuilder.createWorkflowItem(context, col); context.restoreAuthSystemState(); String token = getAuthToken(admin.getEmail(), password); @@ -1793,6 +1851,818 @@ public class EPersonRestRepositoryIT extends AbstractControllerIntegrationTest { } + @Test + public void patchReplacePasswordWithToken() throws Exception { + context.turnOffAuthorisationSystem(); + + EPerson ePerson = EPersonBuilder.createEPerson(context) + .withNameInMetadata("John", "Doe") + .withEmail("Johndoe@fake-email.com") + .withPassword(password) + .build(); + + String newPassword = "newpassword"; + + context.restoreAuthSystemState(); + + List ops = new ArrayList(); + ReplaceOperation replaceOperation = new ReplaceOperation("/password", newPassword); + ops.add(replaceOperation); + String patchBody = getPatchContent(ops); + accountService.sendRegistrationInfo(context, ePerson.getEmail()); + String tokenForEPerson = registrationDataService.findByEmail(context, ePerson.getEmail()).getToken(); + PasswordHash oldPassword = ePersonService.getPasswordHash(ePerson); + // updates password + getClient().perform(patch("/api/eperson/epersons/" + ePerson.getID()) + .content(patchBody) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON) + .param("token", tokenForEPerson)) + .andExpect(status().isOk()); + + PasswordHash newPasswordHash = ePersonService.getPasswordHash(ePerson); + assertNotEquals(oldPassword, newPasswordHash); + assertTrue(registrationDataService.findByEmail(context, ePerson.getEmail()) == null); + + assertNull(registrationDataService.findByToken(context, tokenForEPerson)); + } + + + @Test + public void patchReplacePasswordWithRandomTokenPatchFail() throws Exception { + context.turnOffAuthorisationSystem(); + + EPerson ePerson = EPersonBuilder.createEPerson(context) + .withNameInMetadata("John", "Doe") + .withEmail("Johndoe@fake-email.com") + .withPassword(password) + .build(); + + String newPassword = "newpassword"; + + context.restoreAuthSystemState(); + + List ops = new ArrayList(); + ReplaceOperation replaceOperation = new ReplaceOperation("/password", newPassword); + ops.add(replaceOperation); + String patchBody = getPatchContent(ops); + accountService.sendRegistrationInfo(context, ePerson.getEmail()); + String tokenForEPerson = registrationDataService.findByEmail(context, ePerson.getEmail()).getToken(); + PasswordHash oldPassword = ePersonService.getPasswordHash(ePerson); + // updates password + getClient().perform(patch("/api/eperson/epersons/" + ePerson.getID()) + .content(patchBody) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON) + .param("token", "RandomToken")) + .andExpect(status().isUnauthorized()); + + PasswordHash newPasswordHash = ePersonService.getPasswordHash(ePerson); + assertEquals(oldPassword.getHashString(),newPasswordHash.getHashString()); + assertNotNull(registrationDataService.findByEmail(context, ePerson.getEmail())); + assertEquals(registrationDataService.findByEmail(context, ePerson.getEmail()).getToken(), tokenForEPerson); + + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, tokenForEPerson); + context.restoreAuthSystemState(); + } + + @Test + public void patchReplacePasswordWithOtherUserTokenFail() throws Exception { + context.turnOffAuthorisationSystem(); + + EPerson ePerson = EPersonBuilder.createEPerson(context) + .withNameInMetadata("John", "Doe") + 
.withEmail("Johndoe@fake-email.com") + .withPassword(password) + .build(); + + + EPerson ePersonTwo = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Smith", "Donald") + .withEmail("donaldSmith@fake-email.com") + .withPassword(password) + .build(); + + String newPassword = "newpassword"; + + context.restoreAuthSystemState(); + + List ops = new ArrayList(); + ReplaceOperation replaceOperation = new ReplaceOperation("/password", newPassword); + ops.add(replaceOperation); + String patchBody = getPatchContent(ops); + accountService.sendRegistrationInfo(context, ePerson.getEmail()); + accountService.sendRegistrationInfo(context, ePersonTwo.getEmail()); + String tokenForEPerson = registrationDataService.findByEmail(context, ePerson.getEmail()).getToken(); + String tokenForEPersonTwo = registrationDataService.findByEmail(context, ePersonTwo.getEmail()).getToken(); + + PasswordHash oldPassword = ePersonService.getPasswordHash(ePerson); + // updates password + getClient().perform(patch("/api/eperson/epersons/" + ePerson.getID()) + .content(patchBody) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON) + .param("token", tokenForEPersonTwo)) + .andExpect(status().isUnauthorized()); + + PasswordHash newPasswordHash = ePersonService.getPasswordHash(ePerson); + assertEquals(oldPassword.getHashString(),newPasswordHash.getHashString()); + assertNotNull(registrationDataService.findByEmail(context, ePerson.getEmail())); + + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, tokenForEPerson); + registrationDataService.deleteByToken(context, tokenForEPersonTwo); + context.restoreAuthSystemState(); + } + + @Test + public void patchReplaceEmailWithTokenFail() throws Exception { + context.turnOffAuthorisationSystem(); + + String originalEmail = "johndoe@fake-email.com"; + EPerson ePerson = EPersonBuilder.createEPerson(context) + .withNameInMetadata("John", "Doe") + .withEmail(originalEmail) + .withPassword(password) + .build(); + + String newEmail = "johnyandmaria@fake-email.com"; + + context.restoreAuthSystemState(); + + List ops = new ArrayList(); + ReplaceOperation replaceOperation = new ReplaceOperation("/email", newEmail); + ops.add(replaceOperation); + String patchBody = getPatchContent(ops); + accountService.sendRegistrationInfo(context, ePerson.getEmail()); + String tokenForEPerson = registrationDataService.findByEmail(context, ePerson.getEmail()).getToken(); + PasswordHash oldPassword = ePersonService.getPasswordHash(ePerson); + // updates password + getClient().perform(patch("/api/eperson/epersons/" + ePerson.getID()) + .content(patchBody) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON) + .param("token", tokenForEPerson)) + .andExpect(status().isUnauthorized()); + + PasswordHash newPasswordHash = ePersonService.getPasswordHash(ePerson); + assertEquals(oldPassword.getHashString(),newPasswordHash.getHashString()); + assertNotNull(registrationDataService.findByEmail(context, ePerson.getEmail())); + assertEquals(ePerson.getEmail(), originalEmail); + + context.turnOffAuthorisationSystem(); + registrationDataService.delete(context, registrationDataService.findByEmail(context, ePerson.getEmail())); + registrationDataService.deleteByToken(context, tokenForEPerson); + context.restoreAuthSystemState(); + + } + + @Test + public void registerNewAccountPatchUpdatePasswordRandomUserUuidFail() throws Exception { + context.turnOffAuthorisationSystem(); + + ObjectMapper mapper = new ObjectMapper(); + String newRegisterEmail = "new-register@fake-email.com"; + 
RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + + EPerson ePerson = EPersonBuilder.createEPerson(context) + .withNameInMetadata("John", "Doe") + .withEmail("Johndoe@fake-email.com") + .withPassword(password) + .build(); + + String newPassword = "newpassword"; + + context.restoreAuthSystemState(); + + List ops = new ArrayList(); + ReplaceOperation replaceOperation = new ReplaceOperation("/password", newPassword); + ops.add(replaceOperation); + String patchBody = getPatchContent(ops); + accountService.sendRegistrationInfo(context, ePerson.getEmail()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + PasswordHash oldPassword = ePersonService.getPasswordHash(ePerson); + try { + // updates password + getClient().perform(patch("/api/eperson/epersons/" + ePerson.getID()) + .content(patchBody) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON) + .param("token", newRegisterToken)) + .andExpect(status().isUnauthorized()); + + PasswordHash newPasswordHash = ePersonService.getPasswordHash(ePerson); + assertTrue(StringUtils.equalsIgnoreCase(oldPassword.getHashString(),newPasswordHash.getHashString())); + assertFalse(registrationDataService.findByEmail(context, ePerson.getEmail()) == null); + assertFalse(registrationDataService.findByEmail(context, newRegisterEmail) == null); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.delete(context, registrationDataService.findByEmail(context, ePerson.getEmail())); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + } + } + + @Test + public void postEPersonWithTokenWithoutEmailProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + AtomicReference idRef = new AtomicReference(); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isCreated()) + .andExpect(jsonPath("$", Matchers.allOf( + hasJsonPath("$.uuid", not(empty())), + // is it what you expect? EPerson.getName() returns the email... 
+ //hasJsonPath("$.name", is("Doe John")), + hasJsonPath("$.type", is("eperson")), + hasJsonPath("$._links.self.href", not(empty())), + hasJsonPath("$.metadata", Matchers.allOf( + matchMetadata("eperson.firstname", "John"), + matchMetadata("eperson.lastname", "Doe") + ))))) + .andDo(result -> idRef + .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id")))); + + + + String epersonUuid = String.valueOf(idRef.get()); + EPerson createdEPerson = ePersonService.find(context, UUID.fromString(epersonUuid)); + assertTrue(ePersonService.checkPassword(context, createdEPerson, "somePassword")); + + assertNull(registrationDataService.findByToken(context, newRegisterToken)); + + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + EPersonBuilder.deleteEPerson(idRef.get()); + } + } + + @Test + public void postEPersonWithTokenWithEmailProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + AtomicReference idRef = new AtomicReference(); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isCreated()) + .andExpect(jsonPath("$", Matchers.allOf( + hasJsonPath("$.uuid", not(empty())), + // is it what you expect? EPerson.getName() returns the email... 
+ //hasJsonPath("$.name", is("Doe John")), + hasJsonPath("$.email", is(newRegisterEmail)), + hasJsonPath("$.type", is("eperson")), + hasJsonPath("$._links.self.href", not(empty())), + hasJsonPath("$.metadata", Matchers.allOf( + matchMetadata("eperson.firstname", "John"), + matchMetadata("eperson.lastname", "Doe") + ))))).andDo(result -> idRef + .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id")))); + + String epersonUuid = String.valueOf(idRef.get()); + EPerson createdEPerson = ePersonService.find(context, UUID.fromString(epersonUuid)); + assertTrue(ePersonService.checkPassword(context, createdEPerson, "somePassword")); + assertNull(registrationDataService.findByToken(context, newRegisterToken)); + + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + EPersonBuilder.deleteEPerson(idRef.get()); + } + + } + + @Test + public void postEPersonWithTokenWithEmailAndSelfRegisteredProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + ePersonRest.setSelfRegistered(true); + AtomicReference idRef = new AtomicReference(); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isCreated()) + .andExpect(jsonPath("$", Matchers.allOf( + hasJsonPath("$.uuid", not(empty())), + // is it what you expect? EPerson.getName() returns the email... 
+ //hasJsonPath("$.name", is("Doe John")), + hasJsonPath("$.email", is(newRegisterEmail)), + hasJsonPath("$.type", is("eperson")), + hasJsonPath("$._links.self.href", not(empty())), + hasJsonPath("$.metadata", Matchers.allOf( + matchMetadata("eperson.firstname", "John"), + matchMetadata("eperson.lastname", "Doe") + ))))).andDo(result -> idRef + .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id")))); + + + String epersonUuid = String.valueOf(idRef.get()); + EPerson createdEPerson = ePersonService.find(context, UUID.fromString(epersonUuid)); + assertTrue(ePersonService.checkPassword(context, createdEPerson, "somePassword")); + assertNull(registrationDataService.findByToken(context, newRegisterToken)); + + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + EPersonBuilder.deleteEPerson(idRef.get()); + } + + } + + @Test + public void postEPersonWithTokenWithTwoTokensDifferentEmailProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + String newRegisterEmailTwo = "new-register-two@fake-email.com"; + RegistrationRest registrationRestTwo = new RegistrationRest(); + registrationRestTwo.setEmail(newRegisterEmailTwo); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRestTwo))) + .andExpect(status().isCreated()); + String newRegisterTokenTwo = registrationDataService.findByEmail(context, newRegisterEmailTwo).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmailTwo); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isBadRequest()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newRegisterEmailTwo); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, newRegisterToken)); + assertNotNull(registrationDataService.findByToken(context, newRegisterTokenTwo)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + registrationDataService.deleteByToken(context, newRegisterTokenTwo); + context.restoreAuthSystemState(); + + } + } + + @Test + public void postEPersonWithRandomTokenWithEmailProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + 
String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", "randomToken") + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isBadRequest()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newRegisterEmail); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, newRegisterToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + } + + } + + @Test + public void postEPersonWithTokenWithEmailAndSelfRegisteredFalseProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + ePersonRest.setSelfRegistered(false); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isBadRequest()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newRegisterEmail); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, newRegisterToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + } + + } + + @Test + public void 
postEPersonWithTokenWithoutLastNameProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + ePersonRest.setSelfRegistered(true); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isUnprocessableEntity()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newRegisterEmail); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, newRegisterToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + } + + } + + @Test + public void postEPersonWithTokenWithoutFirstNameProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + ePersonRest.setSelfRegistered(true); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isUnprocessableEntity()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newRegisterEmail); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, newRegisterToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + } + + } + + @Test + public void postEPersonWithTokenWithoutPasswordProperty() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String 
newRegisterEmail = "new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isBadRequest()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newRegisterEmail); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, newRegisterToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + } + + } + + @Test + public void postEPersonWithWrongToken() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + String newEmail = "new-email@fake-email.com"; + + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(eperson.getEmail()); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String forgotPasswordToken = registrationDataService.findByEmail(context, eperson.getEmail()).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + ePersonRest.setSelfRegistered(true); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", forgotPasswordToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isBadRequest()); + + EPerson createdEPerson = ePersonService.findByEmail(context, newEmail); + assertNull(createdEPerson); + assertNotNull(registrationDataService.findByToken(context, forgotPasswordToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, forgotPasswordToken); + context.restoreAuthSystemState(); + } + + + } + + @Test + public void postEPersonWithTokenWithEmailPropertyAnonUser() throws Exception { + + ObjectMapper mapper = new ObjectMapper(); + + String newRegisterEmail = 
"new-register@fake-email.com"; + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(newRegisterEmail); + getClient().perform(post("/api/eperson/registrations") + .contentType(MediaType.APPLICATION_JSON) + .content(mapper.writeValueAsBytes(registrationRest))) + .andExpect(status().isCreated()); + String newRegisterToken = registrationDataService.findByEmail(context, newRegisterEmail).getToken(); + + + EPersonRest ePersonRest = new EPersonRest(); + MetadataRest metadataRest = new MetadataRest(); + ePersonRest.setEmail(newRegisterEmail); + ePersonRest.setCanLogIn(true); + MetadataValueRest surname = new MetadataValueRest(); + surname.setValue("Doe"); + metadataRest.put("eperson.lastname", surname); + MetadataValueRest firstname = new MetadataValueRest(); + firstname.setValue("John"); + metadataRest.put("eperson.firstname", firstname); + ePersonRest.setMetadata(metadataRest); + ePersonRest.setPassword("somePassword"); + + mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + + AtomicReference idRef = new AtomicReference(); + + try { + getClient().perform(post("/api/eperson/epersons") + .param("token", newRegisterToken) + .content(mapper.writeValueAsBytes(ePersonRest)) + .contentType(MediaType.APPLICATION_JSON)) + .andExpect(status().isCreated()) + .andExpect(jsonPath("$", Matchers.allOf( + hasJsonPath("$.uuid", not(empty())), + // is it what you expect? EPerson.getName() returns the email... + //hasJsonPath("$.name", is("Doe John")), + hasJsonPath("$.email", is(newRegisterEmail)), + hasJsonPath("$.type", is("eperson")), + hasJsonPath("$._links.self.href", not(empty())), + hasJsonPath("$.metadata", Matchers.allOf( + matchMetadata("eperson.firstname", "John"), + matchMetadata("eperson.lastname", "Doe") + ))))).andDo(result -> idRef + .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id")))); + + String epersonUuid = String.valueOf(idRef.get()); + EPerson createdEPerson = ePersonService.find(context, UUID.fromString(epersonUuid)); + assertTrue(ePersonService.checkPassword(context, createdEPerson, "somePassword")); + assertNull(registrationDataService.findByToken(context, newRegisterToken)); + } finally { + context.turnOffAuthorisationSystem(); + registrationDataService.deleteByToken(context, newRegisterToken); + context.restoreAuthSystemState(); + EPersonBuilder.deleteEPerson(idRef.get()); + } + } + @Test public void findByMetadataByCommAdminAndByColAdminTest() throws Exception { context.turnOffAuthorisationSystem(); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/GroupRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/GroupRestRepositoryIT.java index 868b5d271e..7789bc5d7b 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/GroupRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/GroupRestRepositoryIT.java @@ -31,10 +31,6 @@ import java.util.concurrent.atomic.AtomicReference; import javax.ws.rs.core.MediaType; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; import org.dspace.app.rest.matcher.EPersonMatcher; import org.dspace.app.rest.matcher.GroupMatcher; import org.dspace.app.rest.matcher.HalMatcher; @@ -47,6 +43,10 @@ import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import 
org.dspace.app.rest.test.MetadataPatchSuite; import org.dspace.authorize.service.AuthorizeService; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.factory.ContentServiceFactory; @@ -97,7 +97,7 @@ public class GroupRestRepositoryIT extends AbstractControllerIntegrationTest { // hold the id of the created workflow item AtomicReference idRef = new AtomicReference<>(); - AtomicReference idRefNoEmbeds = new AtomicReference(); + AtomicReference idRefNoEmbeds = new AtomicReference<>(); try { ObjectMapper mapper = new ObjectMapper(); GroupRest groupRest = new GroupRest(); @@ -121,7 +121,7 @@ public class GroupRestRepositoryIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", GroupMatcher.matchFullEmbeds())) .andDo(result -> idRef .set(UUID.fromString(read(result.getResponse().getContentAsString(), "$.id"))) - ); + ); getClient(authToken).perform(get("/api/eperson/groups")) //The status has to be 200 OK diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/IdentifierRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/IdentifierRestControllerIT.java index 2956e90513..bd67289330 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/IdentifierRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/IdentifierRestControllerIT.java @@ -11,8 +11,8 @@ import static org.springframework.test.web.servlet.request.MockMvcRequestBuilder import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.header; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CommunityBuilder; import org.junit.Before; import org.junit.Ignore; import org.junit.Test; @@ -27,7 +27,7 @@ public class IdentifierRestControllerIT extends AbstractControllerIntegrationTes @Before public void setup() throws Exception { super.setUp(); - } + } @Test public void testValidIdentifier() throws Exception { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemOwningCollectionUpdateRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemOwningCollectionUpdateRestControllerIT.java index 98014cc3a0..73c2c8a3fe 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemOwningCollectionUpdateRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemOwningCollectionUpdateRestControllerIT.java @@ -15,14 +15,14 @@ import static org.springframework.test.web.servlet.request.MockMvcRequestBuilder import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.ResourcePolicy; 
+import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Collection; import org.dspace.content.Item; import org.dspace.core.Constants; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemRestRepositoryIT.java index b510ffeb28..3e8889796d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemRestRepositoryIT.java @@ -13,6 +13,7 @@ import static org.dspace.app.rest.matcher.MetadataMatcher.matchMetadata; import static org.dspace.app.rest.matcher.MetadataMatcher.matchMetadataDoesNotExist; import static org.dspace.core.Constants.WRITE; import static org.hamcrest.Matchers.is; +import static org.hamcrest.Matchers.nullValue; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.delete; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.patch; @@ -33,17 +34,6 @@ import javax.ws.rs.core.MediaType; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.io.IOUtils; import org.apache.commons.lang3.CharEncoding; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.EntityTypeBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.RelationshipBuilder; -import org.dspace.app.rest.builder.RelationshipTypeBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.matcher.BitstreamMatcher; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.HalMatcher; @@ -55,6 +45,14 @@ import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.ReplaceOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.test.MetadataPatchSuite; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; @@ -294,6 +292,47 @@ public class ItemRestRepositoryIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", publicItem1Matcher)); } + @Test + public void findOneFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + Collection col2 = CollectionBuilder.createCollection(context, child1).withName("Collection 2").build(); + + //2. Three public items that are readable by Anonymous with different subjects + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + context.restoreAuthSystemState(); + Matcher publicItem1Matcher = ItemMatcher.matchItemWithTitleAndDateIssued(publicItem1, + "Public item 1", + "2017-10-17"); + + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/core/items/" + publicItem1.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.owningCollection._embedded.adminGroup", nullValue())); + + + getClient().perform(get("/api/core/items/" + publicItem1.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.owningCollection._embedded.adminGroup").doesNotExist()); + + } + @Test public void findOneRelsTest() throws Exception { context.turnOffAuthorisationSystem(); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemTemplateRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemTemplateRestControllerIT.java index 58cca9c414..55e82831f3 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemTemplateRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ItemTemplateRestControllerIT.java @@ -22,8 +22,6 @@ import java.util.List; import java.util.Map; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.matcher.MetadataMatcher; import org.dspace.app.rest.model.MetadataRest; import org.dspace.app.rest.model.MetadataValueRest; @@ -33,6 +31,8 @@ import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.ReplaceOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Collection; import org.dspace.core.Constants; import org.hamcrest.Matchers; @@ -249,7 +249,7 @@ public class ItemTemplateRestControllerIT extends AbstractControllerIntegrationT String itemId = installTestTemplate(); - List ops = new ArrayList(); + List ops = new ArrayList<>(); ReplaceOperation replaceOperation = new ReplaceOperation("/inArchive", true); ops.add(replaceOperation); String illegalPatchBody = getPatchContent(ops); @@ -266,7 +266,7 @@ public class ItemTemplateRestControllerIT extends AbstractControllerIntegrationT String itemId = installTestTemplate(); - List ops = new ArrayList(); + List ops = new ArrayList<>(); ReplaceOperation replaceOperation = new ReplaceOperation("/discoverable", true); ops.add(replaceOperation); String illegalPatchBody = getPatchContent(ops); @@ -283,7 +283,7 @@ public class ItemTemplateRestControllerIT extends 
AbstractControllerIntegrationT String itemId = installTestTemplate(); - List ops = new ArrayList(); + List ops = new ArrayList<>(); ReplaceOperation replaceOperation = new ReplaceOperation("/withdrawn", true); ops.add(replaceOperation); String illegalPatchBody = getPatchContent(ops); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/LanguageSupportIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/LanguageSupportIT.java new file mode 100644 index 0000000000..379744ed22 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/LanguageSupportIT.java @@ -0,0 +1,87 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.header; + +import java.util.Locale; + +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.EPersonBuilder; +import org.dspace.content.authority.ChoiceAuthorityServiceImpl; +import org.dspace.core.LegacyPluginServiceImpl; +import org.dspace.eperson.EPerson; +import org.dspace.services.ConfigurationService; +import org.junit.Ignore; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Integration test class for supported languages + * + * @author Mykhaylo Boychuk (at 4science) + */ +public class LanguageSupportIT extends AbstractControllerIntegrationTest { + + @Autowired + private ConfigurationService configurationService; + @Autowired + private LegacyPluginServiceImpl legacyPluginService; + @Autowired + private ChoiceAuthorityServiceImpl choiceAuthorityServiceImpl; + + @Test + public void checkDefaultLanguageAnonymousTest() throws Exception { + getClient().perform(get("/api")) + .andExpect(header().stringValues("Content-Language","en")); + } + + @Test + @Ignore("This test fails due to a bug in the MockHttpResponseServlet," + + " see https://github.com/spring-projects/spring-framework/issues/25281") + public void checkEnabledMultipleLanguageSupportTest() throws Exception { + context.turnOffAuthorisationSystem(); + String[] supportedLanguage = {"uk","it"}; + configurationService.setProperty("webui.supported.locales",supportedLanguage); + legacyPluginService.clearNamedPluginClasses(); + choiceAuthorityServiceImpl.clearCache(); + + Locale it = new Locale("it"); + + EPerson epersonUK = EPersonBuilder.createEPerson(context) + .withEmail("epersonUK@example.com") + .withPassword(password) + .withLanguage("uk") + .build(); + + EPerson epersonFR = EPersonBuilder.createEPerson(context) + .withEmail("epersonFR@example.com") + .withPassword(password) + .withLanguage("fr") + .build(); + + context.restoreAuthSystemState(); + + String tokenEPersonUK = getAuthToken(epersonUK.getEmail(), password); + String tokenEPersonFR = getAuthToken(epersonFR.getEmail(), password); + + getClient(tokenEPersonUK).perform(get("/api")) + .andExpect(header().stringValues("Content-Language","uk, it")); + + getClient(tokenEPersonUK).perform(get("/api").locale(it)) + .andExpect(header().stringValues("Content-Language","uk, it")); + + getClient(tokenEPersonFR).perform(get("/api").locale(it)) + .andExpect(header().stringValues("Content-Language","uk, it")); + + 
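+        // Restore the default configuration (no explicit supported locales) and clear the cached plugin and
+        // authority state so that later tests are not affected by the locales configured above.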
configurationService.setProperty("webui.supported.locales",null); + legacyPluginService.clearNamedPluginClasses(); + choiceAuthorityServiceImpl.clearCache(); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/LoginAsEPersonIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/LoginAsEPersonIT.java index 22ec3ebb1d..95412b514d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/LoginAsEPersonIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/LoginAsEPersonIT.java @@ -23,15 +23,15 @@ import java.util.UUID; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.io.IOUtils; import org.apache.commons.lang3.CharEncoding; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.PoolTaskBuilder; import org.dspace.app.rest.matcher.EPersonMatcher; import org.dspace.app.rest.matcher.WorkflowItemMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.PoolTaskBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/MappedCollectionRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/MappedCollectionRestRepositoryIT.java index b3b6513cdd..26d5e87b1d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/MappedCollectionRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/MappedCollectionRestRepositoryIT.java @@ -15,12 +15,12 @@ import static org.springframework.test.web.servlet.request.MockMvcRequestBuilder import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.ItemMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -406,13 +406,13 @@ public class MappedCollectionRestRepositoryIT extends AbstractControllerIntegrat .andExpect(jsonPath("$._embedded.mappedItems", Matchers.not(Matchers.contains( ItemMatcher.matchItemProperties(publicItem1)) ))) - .andExpect(jsonPath("$._embedded.mappedItems", Matchers.hasSize(0)));; + .andExpect(jsonPath("$._embedded.mappedItems", Matchers.hasSize(0))); getClient().perform(get("/api/core/collections/" + col3.getID() + "/mappedItems")) .andExpect(status().isOk()) .andExpect(jsonPath("$._embedded.mappedItems", Matchers.contains( ItemMatcher.matchItemProperties(publicItem1)) )) - .andExpect(jsonPath("$._embedded.mappedItems", Matchers.hasSize(1)));; + 
.andExpect(jsonPath("$._embedded.mappedItems", Matchers.hasSize(1))); } @Test diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadataSchemaRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadataSchemaRestRepositoryIT.java index 940a077f4c..4a094d7dc9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadataSchemaRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadataSchemaRestRepositoryIT.java @@ -21,13 +21,13 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.MetadataSchemaBuilder; import org.dspace.app.rest.converter.MetadataSchemaConverter; import org.dspace.app.rest.matcher.HalMatcher; import org.dspace.app.rest.matcher.MetadataschemaMatcher; import org.dspace.app.rest.model.MetadataSchemaRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.MetadataSchemaBuilder; import org.dspace.content.MetadataSchema; import org.hamcrest.Matchers; import org.junit.Test; @@ -99,17 +99,17 @@ public class MetadataSchemaRestRepositoryIT extends AbstractControllerIntegratio try { - getClient(authToken) - .perform(post("/api/core/metadataschemas") - .content(new ObjectMapper().writeValueAsBytes(metadataSchemaRest)) - .contentType(contentType)) - .andExpect(status().isCreated()) - .andExpect(jsonPath("$", HalMatcher.matchNoEmbeds())) - .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), "$.id"))); + getClient(authToken) + .perform(post("/api/core/metadataschemas") + .content(new ObjectMapper().writeValueAsBytes(metadataSchemaRest)) + .contentType(contentType)) + .andExpect(status().isCreated()) + .andExpect(jsonPath("$", HalMatcher.matchNoEmbeds())) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), "$.id"))); - getClient().perform(get("/api/core/metadataschemas/" + idRef.get())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", MetadataschemaMatcher.matchEntry(TEST_NAME, TEST_NAMESPACE))); + getClient().perform(get("/api/core/metadataschemas/" + idRef.get())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", MetadataschemaMatcher.matchEntry(TEST_NAME, TEST_NAMESPACE))); } finally { MetadataSchemaBuilder.deleteMetadataSchema(idRef.get()); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadatafieldRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadatafieldRestRepositoryIT.java index 067a496de7..49b9045aac 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadatafieldRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/MetadatafieldRestRepositoryIT.java @@ -8,6 +8,7 @@ package org.dspace.app.rest; import static com.jayway.jsonpath.JsonPath.read; +import static org.hamcrest.Matchers.hasItem; import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.notNullValue; import static org.hamcrest.Matchers.nullValue; @@ -23,11 +24,11 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. 
import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.MetadataFieldBuilder; -import org.dspace.app.rest.builder.MetadataSchemaBuilder; import org.dspace.app.rest.matcher.MetadataFieldMatcher; import org.dspace.app.rest.model.MetadataFieldRest; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.MetadataFieldBuilder; +import org.dspace.builder.MetadataSchemaBuilder; import org.dspace.content.MetadataField; import org.dspace.content.MetadataFieldServiceImpl; import org.dspace.content.MetadataSchema; @@ -54,6 +55,9 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration private MetadataSchema metadataSchema; + public static final String METADATAFIELDS_ENDPOINT = "/api/core/metadatafields/"; + private static final String SEARCH_BYFIELDNAME_ENDPOINT = METADATAFIELDS_ENDPOINT + "search/byFieldName"; + @Autowired private MetadataSchemaService metadataSchemaService; @@ -74,13 +78,13 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration context.restoreAuthSystemState(); getClient().perform(get("/api/core/metadatafields") - .param("size", String.valueOf(100))) + .param("size", String.valueOf(100))) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItems( MetadataFieldMatcher.matchMetadataFieldByKeys("dc", "title", null), MetadataFieldMatcher.matchMetadataFieldByKeys("dc", "date", "issued")) - )) + )) .andExpect(jsonPath("$._links.first.href", Matchers.containsString("/api/core/metadatafields"))) .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/core/metadatafields"))) .andExpect(jsonPath("$._links.next.href", Matchers.containsString("/api/core/metadatafields"))) @@ -102,7 +106,7 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$", Matchers.is( MetadataFieldMatcher.matchMetadataField(metadataField) - ))); + ))); } @Test @@ -122,30 +126,30 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration context.turnOffAuthorisationSystem(); MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", - "http://www.dspace.org/ns/aschema").build(); + "http://www.dspace.org/ns/aschema").build(); MetadataField metadataField = MetadataFieldBuilder .createMetadataField(context, schema, "AnElement", "AQualifier", "AScopeNote").build(); context.restoreAuthSystemState(); getClient().perform(get("/api/core/metadatafields/search/bySchema") - .param("schema", "dc") - .param("size", String.valueOf(100))) + .param("schema", "dc") + .param("size", String.valueOf(100))) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItems( MetadataFieldMatcher.matchMetadataFieldByKeys("dc", "title", null), MetadataFieldMatcher.matchMetadataFieldByKeys("dc", "date", "issued")) - )) + )) .andExpect(jsonPath("$.page.size", is(100))); getClient().perform(get("/api/core/metadatafields/search/bySchema") - .param("schema", schema.getName())) + .param("schema", schema.getName())) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( MetadataFieldMatcher.matchMetadataField(metadataField)) - )) + )) .andExpect(jsonPath("$.page.size", 
is(20))) .andExpect(jsonPath("$.page.totalElements", is(1))); } @@ -154,7 +158,7 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration public void findByUndefinedSchema() throws Exception { getClient().perform(get("/api/core/metadatafields/search/bySchema") - .param("schema", "undefined")) + .param("schema", "undefined")) .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$.page.size", is(20))) @@ -168,6 +172,394 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration .andExpect(status().isBadRequest()); } + @Test + public void findByFieldName_schema() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement", "AQualifier", "AScopeNote").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", schema.getName())) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + )) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(1))); + } + + @Test + public void findByFieldName_element() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema2", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement", "AQualifier", "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement", "AQualifier2", "AScopeNote2").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("element", "AnElement")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + )) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(2))); + } + + @Test + public void findByFieldName_elementAndQualifier() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema2", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", "AQualifier", "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement2", "AQualifier", "AScopeNote2").build(); + + MetadataField metadataField3 = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement2", "AQualifier", "AScopeNote2").build(); 
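+        // metadataField2 and metadataField3 share element "AnElement2" and qualifier "AQualifier" (in two
+        // different schemas), so the element+qualifier search below should return both of them and exclude
+        // metadataField, whose element is "AnElement1".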
+ + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("element", "AnElement2") + .param("qualifier", "AQualifier")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + ))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(2))); + } + + @Test + public void findByFieldName_schemaAndQualifier() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema2", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", "AQualifier", "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement2", "AQualifier", "AScopeNote2").build(); + + MetadataField metadataField3 = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement3", "AQualifier", "AScopeNote3").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", schema.getName()) + .param("qualifier", "AQualifier")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + ))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(2))); + } + + @Test + public void findByFieldName_schemaElementAndQualifier() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema2", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", "AQualifier", "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement2", "AQualifier", "AScopeNote2").build(); + + MetadataField metadataField3 = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement3", "AQualifier", "AScopeNote3").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", schema.getName()) + .param("element", metadataField3.getElement()) + .param("qualifier", metadataField3.getQualifier())) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + 
.andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + ))) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + ))) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(1))); + } + + @Test + public void findByFieldName_query() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema2", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", "AQualifier", "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement2", "AQualifier", "AScopeNote2").build(); + + MetadataField metadataField3 = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement3", "AQualifier", "AScopeNote2").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("query", schema.getName())) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + )) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(3))); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("query", schema.getName() + ".AnElement3")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + ))) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + ))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(1))); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("query", "AnElement3.AQual")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + ))) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + ))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(1))); + } + + @Test + public void findByFieldName_query_noQualifier() throws Exception { + 
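+        // The partial "query" value used below ("test") should match both the schema literally named "test"
+        // and the field whose element is "test", even though neither matching field defines a qualifier.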
context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "test", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", null, "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement2", null, "AScopeNote2").build(); + + MetadataField metadataField3 = MetadataFieldBuilder + .createMetadataField(context, schema, "test", null, "AScopeNote2").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("query", "test")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + ))) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + )) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(2))); + } + + @Test + public void findByFieldName_invalidQuery() throws Exception { + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("query", "schema.element.qualifier.morestuff")) + .andExpect(status().isBadRequest()); + } + + @Test + public void findByFieldName_exactName() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataSchema schema2 = MetadataSchemaBuilder.createMetadataSchema(context, "test", + "http://www.dspace.org/ns/aschema2").build(); + + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", null, "AScopeNote").build(); + + MetadataField metadataField2 = MetadataFieldBuilder + .createMetadataField(context, schema2, "AnElement2", null, "AScopeNote2").build(); + + MetadataField metadataField3 = MetadataFieldBuilder + .createMetadataField(context, schema, "test", null, "AScopeNote2").build(); + + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("exactName", metadataField.toString('.'))) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + )) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField3)) + ))) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.not(hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField2)) + ))) + .andExpect(jsonPath("$.page.size", is(20))) + .andExpect(jsonPath("$.page.totalElements", is(1))); + } + + @Test + public void findByFieldName_exactName_NoResult() throws Exception { + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("exactName", "not.valid.mdstring")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void 
findByFieldName_exactName_combinedDiscoveryQueryParams_query() throws Exception { + context.turnOffAuthorisationSystem(); + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", null, "AScopeNote").build(); + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("exactName", metadataField.toString('.')) + .param("query", "query")) + .andExpect(status().isUnprocessableEntity()); + } + + @Test + public void findByFieldName_exactName_combinedDiscoveryQueryParams_schema() throws Exception { + context.turnOffAuthorisationSystem(); + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", null, "AScopeNote").build(); + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("exactName", metadataField.toString('.')) + .param("schema", "schema")) + .andExpect(status().isUnprocessableEntity()); + } + + @Test + public void findByFieldName_exactName_combinedDiscoveryQueryParams_element() throws Exception { + context.turnOffAuthorisationSystem(); + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", null, "AScopeNote").build(); + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("exactName", metadataField.toString('.')) + .param("element", "element")) + .andExpect(status().isUnprocessableEntity()); + } + + @Test + public void findByFieldName_exactName_combinedDiscoveryQueryParams_qualifier() throws Exception { + context.turnOffAuthorisationSystem(); + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + MetadataField metadataField = MetadataFieldBuilder + .createMetadataField(context, schema, "AnElement1", null, "AScopeNote").build(); + context.restoreAuthSystemState(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("exactName", metadataField.toString('.')) + .param("qualifier", "qualifier")) + .andExpect(status().isUnprocessableEntity()); + } + @Test public void createSuccess() throws Exception { @@ -179,21 +571,64 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration String authToken = getAuthToken(admin.getEmail(), password); AtomicReference idRef = new AtomicReference<>(); try { - assertThat(metadataFieldService.findByElement(context, metadataSchema, ELEMENT, QUALIFIER), nullValue()); + assertThat(metadataFieldService.findByElement(context, metadataSchema, ELEMENT, QUALIFIER), nullValue()); - getClient(authToken) - .perform(post("/api/core/metadatafields") - .param("schemaId", metadataSchema.getID() + "") - .param("projection", "full") - .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) - .contentType(contentType)) - .andExpect(status().isCreated()) - .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), "$.id"))); + getClient(authToken) + .perform(post("/api/core/metadatafields") + .param("schemaId", metadataSchema.getID() + "") + 
.param("projection", "full") + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) + .andExpect(status().isCreated()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), "$.id"))); - getClient(authToken).perform(get("/api/core/metadatafields/" + idRef.get())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", MetadataFieldMatcher.matchMetadataFieldByKeys( - metadataSchema.getName(), "testElementForCreate", "testQualifierForCreate"))); + getClient(authToken).perform(get("/api/core/metadatafields/" + idRef.get())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", MetadataFieldMatcher.matchMetadataFieldByKeys( + metadataSchema.getName(), "testElementForCreate", "testQualifierForCreate"))); + } finally { + MetadataFieldBuilder.deleteMetadataField(idRef.get()); + } + } + + @Test + public void create_checkAddedToIndex() throws Exception { + + MetadataFieldRest metadataFieldRest = new MetadataFieldRest(); + metadataFieldRest.setElement("testElementForCreate"); + metadataFieldRest.setQualifier("testQualifierForCreate"); + metadataFieldRest.setScopeNote(SCOPE_NOTE); + + String authToken = getAuthToken(admin.getEmail(), password); + AtomicReference idRef = new AtomicReference<>(); + try { + assertThat(metadataFieldService.findByElement(context, metadataSchema, ELEMENT, QUALIFIER), nullValue()); + + getClient(authToken) + .perform(post("/api/core/metadatafields") + .param("schemaId", metadataSchema.getID() + "") + .param("projection", "full") + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) + .andExpect(status().isCreated()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), "$.id"))); + + getClient(authToken).perform(get("/api/core/metadatafields/" + idRef.get())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", MetadataFieldMatcher.matchMetadataFieldByKeys( + metadataSchema.getName(), "testElementForCreate", "testQualifierForCreate"))); + + // new metadata field found in index + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", metadataSchema.getName()) + .param("element", metadataFieldRest.getElement()) + .param("qualifier", metadataFieldRest.getQualifier())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataFieldByKeys(metadataSchema.getName(), + metadataFieldRest.getElement(), metadataFieldRest.getQualifier())) + )) + .andExpect(jsonPath("$.page.totalElements", is(1))); } finally { MetadataFieldBuilder.deleteMetadataField(idRef.get()); } @@ -209,9 +644,9 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration getClient() .perform(post("/api/core/metadatafields") - .param("schemaId", metadataSchema.getID() + "") - .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) - .contentType(contentType)) + .param("schemaId", metadataSchema.getID() + "") + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) .andExpect(status().isUnauthorized()); } @@ -227,9 +662,9 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration getClient(token) .perform(post("/api/core/metadatafields") - .param("schemaId", metadataSchema.getID() + "") - .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) - .contentType(contentType)) + .param("schemaId", metadataSchema.getID() + "") + .content(new 
ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) .andExpect(status().isForbidden()); } @@ -315,6 +750,44 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration .andExpect(status().isNotFound()); } + @Test + public void delete_checkDeletedFromIndex() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataSchema schema = MetadataSchemaBuilder.createMetadataSchema(context, "ASchema", + "http://www.dspace.org/ns/aschema").build(); + + MetadataField metadataField = MetadataFieldBuilder.createMetadataField(context, schema, ELEMENT, QUALIFIER, + SCOPE_NOTE).build(); + + context.restoreAuthSystemState(); + + Integer id = metadataField.getID(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", schema.getName()) + .param("element", metadataField.getElement()) + .param("qualifier", metadataField.getQualifier())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataField(metadataField)) + )); + + getClient(getAuthToken(admin.getEmail(), password)) + .perform(delete("/api/core/metadatafields/" + id)) + .andExpect(status().isNoContent()); + + assertThat(metadataFieldService.find(context, id), nullValue()); + + // deleted metadata field not found in index + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", schema.getName()) + .param("element", metadataField.getElement()) + .param("qualifier", metadataField.getQualifier())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + @Test public void update() throws Exception { context.turnOffAuthorisationSystem(); @@ -332,15 +805,68 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration getClient(getAuthToken(admin.getEmail(), password)) .perform(put("/api/core/metadatafields/" + metadataField.getID()) - .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) - .contentType(contentType)) + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) .andExpect(status().isOk()); getClient().perform(get("/api/core/metadatafields/" + metadataField.getID())) .andExpect(status().isOk()) .andExpect(jsonPath("$", MetadataFieldMatcher.matchMetadataFieldByKeys( metadataSchema.getName(), ELEMENT_UPDATED, QUALIFIER_UPDATED) - )); + )); + } + + @Test + public void update_checkUpdatedInIndex() throws Exception { + context.turnOffAuthorisationSystem(); + + MetadataField metadataField = MetadataFieldBuilder.createMetadataField(context, ELEMENT, QUALIFIER, SCOPE_NOTE) + .build(); + + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", metadataSchema.getName()) + .param("element", metadataField.getElement()) + .param("qualifier", metadataField.getQualifier())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataFieldByKeys(metadataSchema.getName(), + metadataField.getElement(), metadataField.getQualifier())) + )) + .andExpect(jsonPath("$.page.totalElements", is(1))); + + context.restoreAuthSystemState(); + + MetadataFieldRest metadataFieldRest = new MetadataFieldRest(); + metadataFieldRest.setId(metadataField.getID()); + metadataFieldRest.setElement(ELEMENT_UPDATED); + metadataFieldRest.setQualifier(QUALIFIER_UPDATED); + metadataFieldRest.setScopeNote(SCOPE_NOTE_UPDATED); + + getClient(getAuthToken(admin.getEmail(), password)) + 
.perform(put("/api/core/metadatafields/" + metadataField.getID()) + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) + .andExpect(status().isOk()); + + // new metadata field found in index + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", metadataSchema.getName()) + .param("element", ELEMENT_UPDATED) + .param("qualifier", QUALIFIER_UPDATED)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.metadatafields", Matchers.hasItem( + MetadataFieldMatcher.matchMetadataFieldByKeys(metadataSchema.getName(), + ELEMENT_UPDATED, QUALIFIER_UPDATED)) + )) + .andExpect(jsonPath("$.page.totalElements", is(1))); + + // original metadata field not found in index + getClient().perform(get(SEARCH_BYFIELDNAME_ENDPOINT) + .param("schema", metadataSchema.getName()) + .param("element", metadataField.getElement()) + .param("qualifier", metadataField.getQualifier())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -360,15 +886,15 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration getClient() .perform(put("/api/core/metadatafields/" + metadataField.getID()) - .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) - .contentType(contentType)) + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) .andExpect(status().isUnauthorized()); getClient().perform(get("/api/core/metadatafields/" + metadataField.getID())) .andExpect(status().isOk()) .andExpect(jsonPath("$", MetadataFieldMatcher.matchMetadataFieldByKeys( metadataSchema.getName(), ELEMENT, QUALIFIER) - )); + )); } @@ -390,15 +916,15 @@ public class MetadatafieldRestRepositoryIT extends AbstractControllerIntegration getClient(getAuthToken(eperson.getEmail(), password)) .perform(put("/api/core/metadatafields/" + metadataField.getID()) - .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) - .contentType(contentType)) + .content(new ObjectMapper().writeValueAsBytes(metadataFieldRest)) + .contentType(contentType)) .andExpect(status().isForbidden()); getClient().perform(get("/api/core/metadatafields/" + metadataField.getID())) .andExpect(status().isOk()) .andExpect(jsonPath("$", MetadataFieldMatcher.matchMetadataFieldByKeys( metadataSchema.getName(), ELEMENT, QUALIFIER) - )); + )); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/PatchMetadataIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/PatchMetadataIT.java new file mode 100644 index 0000000000..a9178defc2 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/PatchMetadataIT.java @@ -0,0 +1,1319 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static com.jayway.jsonpath.JsonPath.read; +import static org.hamcrest.CoreMatchers.equalTo; +import static org.hamcrest.CoreMatchers.not; +import static org.hamcrest.CoreMatchers.startsWith; +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertThat; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.patch; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post; +import static 
org.springframework.test.web.servlet.result.MockMvcResultMatchers.content; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.io.IOException; +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.List; +import java.util.concurrent.atomic.AtomicReference; + +import org.dspace.app.rest.matcher.MetadataMatcher; +import org.dspace.app.rest.model.MetadataValueRest; +import org.dspace.app.rest.model.patch.AddOperation; +import org.dspace.app.rest.model.patch.MoveOperation; +import org.dspace.app.rest.model.patch.Operation; +import org.dspace.app.rest.model.patch.RemoveOperation; +import org.dspace.app.rest.model.patch.ReplaceOperation; +import org.dspace.app.rest.test.AbstractEntityIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.RelationshipBuilder; +import org.dspace.builder.WorkspaceItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.MetadataValue; +import org.dspace.content.RelationshipType; +import org.dspace.content.WorkspaceItem; +import org.dspace.content.service.EntityTypeService; +import org.dspace.content.service.ItemService; +import org.dspace.content.service.RelationshipTypeService; +import org.dspace.content.service.WorkspaceItemService; +import org.hamcrest.Matchers; +import org.junit.After; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.http.MediaType; + +/** + * Created by kristof on 20/02/2020 + */ +public class PatchMetadataIT extends AbstractEntityIntegrationTest { + + @Autowired + private RelationshipTypeService relationshipTypeService; + + @Autowired + private EntityTypeService entityTypeService; + + @Autowired + private ItemService itemService; + + @Autowired + private WorkspaceItemService workspaceItemService; + + private Collection collection; + private WorkspaceItem publicationItem; + private Item personItem1; + private Item personItem2; + private RelationshipType publicationPersonRelationshipType; + + private List authorsOriginalOrder; + + private AtomicReference idRef1; + private AtomicReference idRef2; + + private String addedAuthor; + private String replacedAuthor; + + @Before + @Override + public void setUp() throws Exception { + super.setUp(); + context.turnOffAuthorisationSystem(); + + Community community = CommunityBuilder.createCommunity(context) + .withName("Parent community") + .build(); + collection = CollectionBuilder.createCollection(context, community) + .withName("Collection") + .build(); + + context.restoreAuthSystemState(); + } + + @After + @Override + public void destroy() throws Exception { + super.destroy(); + cleanupPersonRelations(); + } + + /** + * A method to create a workspace publication containing 5 authors: 3 regular authors and 2 related Person items. 
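+     * Two of the authors ("Dahlen, Sarah" and "Linton, Oliver") are added via "isAuthorOfPublication"
+     * relationships, so they surface as virtual metadata (their authority values start with "virtual::").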
+ * The authors are added in a specific order: + * - "Whyte, William": Regular author + * - "Dahlen, Sarah": Related Person + * - "Peterson, Karrie": Regular author + * - "Perotti, Enrico": Regular author + * - "Linton, Oliver": Related Person + */ + private void initPersonPublicationWorkspace() throws Exception { + // Setup the original order of authors + authorsOriginalOrder = new ArrayList<>(); + authorsOriginalOrder.add("Whyte, William"); + // Second one will be virtual metadata + authorsOriginalOrder.add("Dahlen, Sarah"); + authorsOriginalOrder.add("Peterson, Karrie"); + authorsOriginalOrder.add("Perotti, Enrico"); + // 5th one will be virtual metadata + authorsOriginalOrder.add("Linton, Oliver"); + + addedAuthor = "Semple, Robert"; + replacedAuthor = "New Value"; + + context.turnOffAuthorisationSystem(); + + personItem1 = ItemBuilder.createItem(context, collection) + .withTitle("Person 1") + .withPersonIdentifierFirstName("Sarah") + .withPersonIdentifierLastName("Dahlen") + .withRelationshipType("Person") + .build(); + personItem2 = ItemBuilder.createItem(context, collection) + .withTitle("Person 2") + .withPersonIdentifierFirstName("Oliver") + .withPersonIdentifierLastName("Linton") + .withRelationshipType("Person") + .build(); + publicationItem = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withTitle("Publication 1") + .withRelationshipType("Publication") + .build(); + publicationPersonRelationshipType = relationshipTypeService.findbyTypesAndTypeName(context, + entityTypeService.findByEntityType(context, "Publication"), + entityTypeService.findByEntityType(context, "Person"), + "isAuthorOfPublication", + "isPublicationOfAuthor"); + + String adminToken = getAuthToken(admin.getEmail(), password); + + // Make sure we grab the latest instance of the Item from the database before adding a regular author + WorkspaceItem publication = workspaceItemService.find(context, publicationItem.getID()); + itemService.addMetadata(context, publication.getItem(), + "dc", "contributor", "author", Item.ANY, authorsOriginalOrder.get(0)); + workspaceItemService.update(context, publication); + + context.restoreAuthSystemState(); + + // Create a relationship between publication and person 1 + idRef1 = new AtomicReference<>(); + getClient(adminToken).perform(post("/api/core/relationships") + .param("relationshipType", publicationPersonRelationshipType.getID().toString()) + .contentType(MediaType.parseMediaType + (org.springframework.data.rest.webmvc.RestMediaTypes.TEXT_URI_LIST_VALUE)) + .content("https://localhost:8080/server/api/core/items/" + publicationItem.getItem().getID() + "\n" + + "https://localhost:8080/server/api/core/items/" + personItem1.getID())) + .andExpect(status().isCreated()) + .andDo(result -> idRef1.set(read(result.getResponse().getContentAsString(), "$.id"))); + context.turnOffAuthorisationSystem(); + + // Add two more regular authors + List regularMetadata = new ArrayList<>(); + publication = workspaceItemService.find(context, publicationItem.getID()); + regularMetadata.add(authorsOriginalOrder.get(2)); + regularMetadata.add(authorsOriginalOrder.get(3)); + itemService.addMetadata(context, publication.getItem(), + "dc", "contributor", "author", null, regularMetadata); + workspaceItemService.update(context, publication); + + context.restoreAuthSystemState(); + + // Create a relationship between publication and person 2 + AtomicReference idRef2 = new AtomicReference<>(); + getClient(adminToken).perform(post("/api/core/relationships") + .param("relationshipType", 
publicationPersonRelationshipType.getID().toString()) + .contentType(MediaType.parseMediaType + (org.springframework.data.rest.webmvc.RestMediaTypes.TEXT_URI_LIST_VALUE)) + .content("https://localhost:8080/server/api/core/items/" + publicationItem.getItem().getID() + "\n" + + "https://localhost:8080/server/api/core/items/" + personItem2.getID())) + .andExpect(status().isCreated()) + .andDo(result -> idRef2.set(read(result.getResponse().getContentAsString(), "$.id"))); + + publication = workspaceItemService.find(context, publicationItem.getID()); + List publicationAuthorList = + itemService.getMetadata(publication.getItem(), "dc", "contributor", "author", Item.ANY); + assertEquals(publicationAuthorList.size(), 5); + assertThat(publicationAuthorList.get(0).getValue(), equalTo(authorsOriginalOrder.get(0))); + assertThat(publicationAuthorList.get(0).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(1).getValue(), equalTo(authorsOriginalOrder.get(1))); + assertThat(publicationAuthorList.get(1).getAuthority(), startsWith("virtual::")); + assertThat(publicationAuthorList.get(2).getValue(), equalTo(authorsOriginalOrder.get(2))); + assertThat(publicationAuthorList.get(2).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(3).getValue(), equalTo(authorsOriginalOrder.get(3))); + assertThat(publicationAuthorList.get(3).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(4).getValue(), equalTo(authorsOriginalOrder.get(4))); + assertThat(publicationAuthorList.get(4).getAuthority(), startsWith("virtual::")); + } + + /** + * Clean up created Person Relationshipts + * @throws IOException + * @throws SQLException + */ + private void cleanupPersonRelations() throws IOException, SQLException { + if (idRef1 != null) { + RelationshipBuilder.deleteRelationship(idRef1.get()); + idRef1 = null; + } + if (idRef2 != null) { + RelationshipBuilder.deleteRelationship(idRef2.get()); + idRef2 = null; + } + } + + /** + * A method to create a workspace publication containing 5 authors: 3 regular authors and 2 related Person items. 
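+     * (Unlike initPersonPublicationWorkspace above, all five authors here are added as plain-text
+     * dc.contributor.author values; no Person relationships are created, so none of them are virtual.)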
+ * The authors are added in a specific order: + * - "Whyte, William": Regular author + * - "Dahlen, Sarah": Regular Person + * - "Peterson, Karrie": Regular author + * - "Perotti, Enrico": Regular author + * - "Linton, Oliver": Regular Person + */ + private void initPlainTextPublicationWorkspace() throws Exception { + authorsOriginalOrder = new ArrayList<>(); + authorsOriginalOrder.add("Whyte, William"); + authorsOriginalOrder.add("Dahlen, Sarah"); + authorsOriginalOrder.add("Peterson, Karrie"); + authorsOriginalOrder.add("Perotti, Enrico"); + authorsOriginalOrder.add("Linton, Oliver"); + + addedAuthor = "Semple, Robert"; + replacedAuthor = "New Value"; + + context.turnOffAuthorisationSystem(); + + publicationItem = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withTitle("Publication 1") + .withRelationshipType("Publication") + .build(); + + String adminToken = getAuthToken(admin.getEmail(), password); + + // Make sure we grab the latest instance of the Item from the database before adding a regular author + WorkspaceItem publication = workspaceItemService.find(context, publicationItem.getID()); + itemService.addMetadata(context, publication.getItem(), + "dc", "contributor", "author", Item.ANY, authorsOriginalOrder); + workspaceItemService.update(context, publication); + + context.restoreAuthSystemState(); + + publication = workspaceItemService.find(context, publicationItem.getID()); + List publicationAuthorList = + itemService.getMetadata(publication.getItem(), "dc", "contributor", "author", Item.ANY); + assertEquals(publicationAuthorList.size(), 5); + assertThat(publicationAuthorList.get(0).getValue(), equalTo(authorsOriginalOrder.get(0))); + assertThat(publicationAuthorList.get(0).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(1).getValue(), equalTo(authorsOriginalOrder.get(1))); + assertThat(publicationAuthorList.get(1).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(2).getValue(), equalTo(authorsOriginalOrder.get(2))); + assertThat(publicationAuthorList.get(2).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(3).getValue(), equalTo(authorsOriginalOrder.get(3))); + assertThat(publicationAuthorList.get(3).getAuthority(), not(startsWith("virtual::"))); + assertThat(publicationAuthorList.get(4).getValue(), equalTo(authorsOriginalOrder.get(4))); + assertThat(publicationAuthorList.get(4).getAuthority(), not(startsWith("virtual::"))); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 1 to 0 using a PATCH request and verify the order of the authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 1,0,2,3,4 + */ + @Test + public void moveTraditionalPageOneAuthorOneToZeroTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(1, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 2 to 0 using a PATCH request and verify the order of the authors within the section. 
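+     * The helper presumably issues a JSON Patch "move" along the lines of
+     * [{"op": "move", "from": "/sections/traditionalpageone/dc.contributor.author/2",
+     *   "path": "/sections/traditionalpageone/dc.contributor.author/0"}]
+     * (the exact path is an assumption; see moveTraditionalPageOneAuthorTest below).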
+ * Original Order: 0,1,2,3,4 + * Expected Order: 2,0,1,3,4 + */ + @Test + public void moveTraditionalPageOneAuthorTwoToZeroTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(2, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 3 to 0 using a PATCH request and verify the order of the authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 3,0,1,2,4 + */ + @Test + public void moveTraditionalPageOneAuthorThreeToZeroTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(3, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 4 to 0 using a PATCH request and verify the order of the authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 4,0,1,2,3 + */ + @Test + public void moveTraditionalPageOneAuthorFourToZeroTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(4)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + + moveTraditionalPageOneAuthorTest(4, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 1 to 3 using a PATCH request and verify the order of the authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 0,2,3,1,4 + */ + @Test + public void moveTraditionalPageOneAuthorOneToThreeTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(1, 3, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 1 to 4 using a PATCH request and verify the order of the authors within the section. 
+ * Original Order: 0,1,2,3,4 + * Expected Order: 0,2,3,4,1 + */ + @Test + public void moveTraditionalPageOneAuthorOneToFourTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + expectedOrder.add(authorsOriginalOrder.get(1)); + + moveTraditionalPageOneAuthorTest(1, 4, expectedOrder); + } + + /** + * This test will replace an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 0 using a PATCH request and verify the order and value of the authors within the section. + * @throws Exception + */ + @Test + public void replaceTraditionalPageOneAuthorZeroTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(replacedAuthor); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + replaceTraditionalPageOneAuthorTest(0, expectedOrder); + } + + /** + * This test will replace an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 2 using a PATCH request and verify the order and value of the authors within the section. + * @throws Exception + */ + @Test + public void replaceTraditionalPageOneAuthorTwoTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(replacedAuthor); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + replaceTraditionalPageOneAuthorTest(2, expectedOrder); + } + + /** + * This test will replace an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 3 using a PATCH request and verify the order and value of the authors within the section. + * @throws Exception + */ + @Test + public void replaceTraditionalPageOneAuthorThreeTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(replacedAuthor); + expectedOrder.add(authorsOriginalOrder.get(4)); + + replaceTraditionalPageOneAuthorTest(3, expectedOrder); + } + + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 0 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. 
+ * Original Order: 0,1,2,3,4 + * Expected Order: +,0,1,2,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlaceZeroTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("0", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 1 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 0,+,1,2,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlaceOneTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("1", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 2 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 0,1,+,2,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlaceTwoTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("2", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 3 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 0,1,2,+,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlaceThreeTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("3", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 4 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. 
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,3,+,4 (with + being the new author)
+     */
+    @Test
+    public void addAuthorOnTraditionalPageOnePlaceFourTest() throws Exception {
+        initPersonPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(addedAuthor);
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        addTraditionalPageOneAuthorTest("4", expectedOrder);
+
+    }
+
+    /**
+     * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section at the last position (path "-") using a PATCH request and verify the place of the new author and the
+     * order of the authors within the section.
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,3,4,+ (with + being the new author)
+     */
+    @Test
+    public void addAuthorOnTraditionalPageOneLastPlaceTest() throws Exception {
+        initPersonPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+        expectedOrder.add(addedAuthor);
+
+        addTraditionalPageOneAuthorTest("-", expectedOrder);
+
+    }
+
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 0 using a PATCH request and verify the order of the remaining authors within the section.
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 1,2,3,4
+     */
+    @Test
+    public void removeAuthorOnTraditionalPageFromPlaceZeroTest() throws Exception {
+        initPersonPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        removeTraditionalPageOneAuthorTest(0, expectedOrder);
+    }
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 1 using a PATCH request and verify the order of the remaining authors within the section.
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,3,4
+     */
+    @Test
+    public void removeAuthorOnTraditionalPageFromPlaceOneTest() throws Exception {
+        initPersonPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        // The author at the first place is linked through a relationship and cannot be deleted through a PATCH request
+        removeTraditionalPageOneAuthorTest(1, expectedOrder);
+    }
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 2 using a PATCH request and verify the order of the remaining authors within the section.
+ * Original Order: 0,1,2,3,4 + * Expected Order: 0,1,3,4 + */ + @Test + public void removeAuthorOnTraditionalPageFromPlaceTwoTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + removeTraditionalPageOneAuthorTest(2, expectedOrder); + } + /** + * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone" + * section at position 3 using a PATCH request and verify the order of the remaining authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 0,1,2,4 + */ + @Test + public void removeAuthorOnTraditionalPageFromPlaceThreeTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + removeTraditionalPageOneAuthorTest(3, expectedOrder); + } + /** + * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone" + * section at position 4 using a PATCH request and verify the order of the remaining authors within the section. + * Original Order: 0,1,2,3,4 + * Expected Order: 0,1,2,3,4 + */ + @Test + public void removeAuthorOnTraditionalPageFromPlaceFourTest() throws Exception { + initPersonPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + // The author at the fourth place is linked through a relationship and cannot be deleted through a PATCH request + removeTraditionalPageOneAuthorTest(4, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 1 to 0 using a PATCH request and verify the order of the authors within the section. + * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 1,0,2,3,4 + */ + @Test + public void moveTraditionalPageOnePlainTextAuthorOneToZeroTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(1, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 2 to 0 using a PATCH request and verify the order of the authors within the section. 
+ * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 2,0,1,3,4 + */ + @Test + public void moveTraditionalPageOnePlainTextAuthorTwoToZeroTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(2, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 3 to 0 using a PATCH request and verify the order of the authors within the section. + * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 3,0,1,2,4 + */ + @Test + public void moveTraditionalPageOnePlainTextAuthorThreeToZeroTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(3, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 4 to 0 using a PATCH request and verify the order of the authors within the section. + * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 4,0,1,2,3 + */ + @Test + public void moveTraditionalPageOnePlainTextAuthorFourToZeroTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(4)); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + + moveTraditionalPageOneAuthorTest(4, 0, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 1 to 3 using a PATCH request and verify the order of the authors within the section. + * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 0,2,3,1,4 + */ + @Test + public void moveTraditionalPageOnePlainTextAuthorOneToThreeTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + moveTraditionalPageOneAuthorTest(1, 3, expectedOrder); + } + + /** + * This test will move an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section from position 1 to 4 using a PATCH request and verify the order of the authors within the section. 
+ * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 0,2,3,4,1 + */ + @Test + public void moveTraditionalPageOnePlainTextAuthorOneToFourTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + expectedOrder.add(authorsOriginalOrder.get(1)); + + moveTraditionalPageOneAuthorTest(1, 4, expectedOrder); + } + + /** + * This test will replace an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 0 using a PATCH request and verify the order and value of the authors within the section. + * This test uses only plain text authors + * @throws Exception + */ + @Test + public void replaceTraditionalPagePlainTextOneAuthorZeroTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(replacedAuthor); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + replaceTraditionalPageOneAuthorTest(0, expectedOrder); + } + + /** + * This test will replace an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 2 using a PATCH request and verify the order and value of the authors within the section. + * This test uses only plain text authors + * @throws Exception + */ + @Test + public void replaceTraditionalPagePlainTextOneAuthorTwoTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(replacedAuthor); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + replaceTraditionalPageOneAuthorTest(2, expectedOrder); + } + + /** + * This test will replace an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 3 using a PATCH request and verify the order and value of the authors within the section. + * This test uses only plain text authors + * @throws Exception + */ + @Test + public void replaceTraditionalPageOnePlainTextAuthorThreeTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(replacedAuthor); + expectedOrder.add(authorsOriginalOrder.get(4)); + + replaceTraditionalPageOneAuthorTest(3, expectedOrder); + } + + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 0 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. 
+ * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: +,0,1,2,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlainTextPlaceZeroTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("0", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 1 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. + * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 0,+,1,2,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlainTextPlaceOneTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("1", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 2 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. + * This test uses only plain text authors + * Original Order: 0,1,2,3,4 + * Expected Order: 0,1,+,2,3,4 (with + being the new author) + */ + @Test + public void addAuthorOnTraditionalPageOnePlainTextPlaceTwoTest() throws Exception { + initPlainTextPublicationWorkspace(); + + List expectedOrder = new ArrayList<>(); + expectedOrder.add(authorsOriginalOrder.get(0)); + expectedOrder.add(authorsOriginalOrder.get(1)); + expectedOrder.add(addedAuthor); + expectedOrder.add(authorsOriginalOrder.get(2)); + expectedOrder.add(authorsOriginalOrder.get(3)); + expectedOrder.add(authorsOriginalOrder.get(4)); + + addTraditionalPageOneAuthorTest("2", expectedOrder); + + } + + /** + * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone" + * section at position 3 using a PATCH request and verify the place of the new author and the order of the + * authors within the section. 
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,+,3,4 (with + being the new author)
+     */
+    @Test
+    public void addAuthorOnTraditionalPageOnePlainTextPlaceThreeTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(addedAuthor);
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        addTraditionalPageOneAuthorTest("3", expectedOrder);
+
+    }
+
+    /**
+     * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section at position 4 using a PATCH request and verify the place of the new author and the order of the
+     * authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,3,+,4 (with + being the new author)
+     */
+    @Test
+    public void addAuthorOnTraditionalPageOnePlainTextPlaceFourTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(addedAuthor);
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        addTraditionalPageOneAuthorTest("4", expectedOrder);
+
+    }
+
+    /**
+     * This test will add an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section at the last position (path "-") using a PATCH request and verify the place of the new author and the
+     * order of the authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,3,4,+ (with + being the new author)
+     */
+    @Test
+    public void addAuthorOnTraditionalPageOneLastPlainTextPlaceTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+        expectedOrder.add(addedAuthor);
+
+        addTraditionalPageOneAuthorTest("-", expectedOrder);
+
+    }
+
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 0 using a PATCH request and verify the order of the remaining authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 1,2,3,4
+     */
+    @Test
+    public void removeAuthorOnTraditionalPagePlainTextFromPlaceZeroTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        removeTraditionalPageOneAuthorTest(0, expectedOrder);
+    }
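+
+    // Illustrative sketch: the "-" path used by the "LastPlace" add tests relies on JSON Patch (RFC 6902) append
+    // semantics. Assuming the standard serialization of AddOperation and MetadataValueRest, a request body built by
+    // addTraditionalPageOneAuthorTest("-", ...) would look roughly like:
+    //   [ { "op": "add",
+    //       "path": "/sections/traditionalpageone/dc.contributor.author/-",
+    //       "value": { "value": "Semple, Robert" } } ]
+    // whereas a numeric index such as "3" inserts the new author at that place and shifts the later authors down.
+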
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 1 using a PATCH request and verify the order of the remaining authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,2,3,4
+     */
+    @Test
+    public void removeAuthorOnTraditionalPagePlainTextFromPlaceOneTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        // All authors are plain text here (no relationships), so this author can be deleted through a PATCH request
+        removeTraditionalPageOneAuthorTest(1, expectedOrder);
+    }
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 2 using a PATCH request and verify the order of the remaining authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,3,4
+     */
+    @Test
+    public void removeAuthorOnTraditionalPagePlainTextFromPlaceTwoTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        removeTraditionalPageOneAuthorTest(2, expectedOrder);
+    }
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 3 using a PATCH request and verify the order of the remaining authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,4
+     */
+    @Test
+    public void removeAuthorOnTraditionalPagePlainTextFromPlaceThreeTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(4));
+
+        removeTraditionalPageOneAuthorTest(3, expectedOrder);
+    }
+    /**
+     * This test will remove the author (dc.contributor.author) from a workspace publication's "traditionalpageone"
+     * section at position 4 using a PATCH request and verify the order of the remaining authors within the section.
+     * This test uses only plain text authors
+     * Original Order: 0,1,2,3,4
+     * Expected Order: 0,1,2,3
+     */
+    @Test
+    public void removeAuthorOnTraditionalPagePlainTextFromPlaceFourTest() throws Exception {
+        initPlainTextPublicationWorkspace();
+
+        List expectedOrder = new ArrayList<>();
+        expectedOrder.add(authorsOriginalOrder.get(0));
+        expectedOrder.add(authorsOriginalOrder.get(1));
+        expectedOrder.add(authorsOriginalOrder.get(2));
+        expectedOrder.add(authorsOriginalOrder.get(3));
+
+        // All authors are plain text here (no relationships), so this author can be deleted through a PATCH request
+        removeTraditionalPageOneAuthorTest(4, expectedOrder);
+    }
+
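+
+    // Illustrative sketch: the test below removes the whole metadata field, so its RemoveOperation carries no index.
+    // Assuming standard RFC 6902 serialization, the request body would look roughly like:
+    //   [ { "op": "remove", "path": "/sections/traditionalpageone/dc.contributor.author" } ]
+    // while the single-author remove tests above target one value, e.g. ".../dc.contributor.author/2".
+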
+    /**
+     * This test will remove all authors (dc.contributor.author) that are not linked through a relationship from a
+     * workspace publication's "traditionalpageone" section using a PATCH request and verify that the only remaining
+     * authors are those coming from a relationship.
+     */
+    @Test
+    public void removeAllAuthorsOnTraditionalPageTest() throws Exception {
+        initPersonPublicationWorkspace();
+
+        List<Operation> ops = new ArrayList<Operation>();
+        RemoveOperation removeOperation = new RemoveOperation("/sections/traditionalpageone/dc.contributor.author");
+        ops.add(removeOperation);
+        String patchBody = getPatchContent(ops);
+
+        String token = getAuthToken(admin.getEmail(), password);
+
+        getClient(token).perform(patch("/api/submission/workspaceitems/" + publicationItem.getID())
+            .content(patchBody)
+            .contentType(javax.ws.rs.core.MediaType.APPLICATION_JSON_PATCH_JSON))
+                        .andExpect(status().isOk());
+
+        String authorField = "dc.contributor.author";
+        getClient(token).perform(get("/api/submission/workspaceitems/" + publicationItem.getID()))
+                        .andExpect(status().isOk())
+                        .andExpect(content().contentType(contentType))
+                        .andExpect(jsonPath("$.sections.traditionalpageone",
+                            // The authors at the first and fourth place are linked through a relationship
+                            // and cannot be deleted through a PATCH request
+                            Matchers.allOf(
+                                Matchers.is(MetadataMatcher.matchMetadata(authorField, authorsOriginalOrder.get(1), 0)),
+                                Matchers.is(MetadataMatcher.matchMetadata(authorField, authorsOriginalOrder.get(4), 1))
+                            )));
+    }
+
+    /**
+     * This method moves an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section from position "from" to "path" using a PATCH request and verifies the order of the authors within the
+     * section using an ordered list of expected author names.
+     * @param from The "from" index to use for the Move operation
+     * @param path The "path" index to use for the Move operation
+     * @param expectedOrder A list of author names sorted in the expected order
+     */
+    private void moveTraditionalPageOneAuthorTest(int from, int path, List<String> expectedOrder) throws Exception {
+        List<Operation> ops = new ArrayList<Operation>();
+        MoveOperation moveOperation = getTraditionalPageOneMoveAuthorOperation(from, path);
+        ops.add(moveOperation);
+        String patchBody = getPatchContent(ops);
+
+        String token = getAuthToken(admin.getEmail(), password);
+
+        getClient(token).perform(patch("/api/submission/workspaceitems/" + publicationItem.getID())
+            .content(patchBody)
+            .contentType(javax.ws.rs.core.MediaType.APPLICATION_JSON_PATCH_JSON))
+                        .andExpect(status().isOk());
+
+        String authorField = "dc.contributor.author";
+        getClient(token).perform(get("/api/submission/workspaceitems/" + publicationItem.getID()))
+                        .andExpect(status().isOk())
+                        .andExpect(content().contentType(contentType))
+                        .andExpect(jsonPath("$.sections.traditionalpageone", Matchers.allOf(
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(0), 0)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(1), 1)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(2), 2)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(3), 3)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(4), 4))
+                        )));
+    }
+
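+
+    // Illustrative sketch: getTraditionalPageOneMoveAuthorOperation(from, path) (defined at the end of this class)
+    // passes the target "path" as the first constructor argument and "from" as the second. Assuming standard
+    // RFC 6902 serialization, moveTraditionalPageOneAuthorTest(1, 0, ...) would send a body roughly like:
+    //   [ { "op": "move",
+    //       "from": "/sections/traditionalpageone/dc.contributor.author/1",
+    //       "path": "/sections/traditionalpageone/dc.contributor.author/0" } ]
+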
+    /**
+     * This method replaces an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section at position "path" using a PATCH request and verifies the order of the authors within the
+     * section using an ordered list of expected author names.
+     * @param path The "path" index to use for the Replace operation
+     * @param expectedOrder A list of author names sorted in the expected order
+     */
+    private void replaceTraditionalPageOneAuthorTest(int path, List<String> expectedOrder) throws Exception {
+        List<Operation> ops = new ArrayList<Operation>();
+        MetadataValueRest value = new MetadataValueRest();
+        value.setValue(replacedAuthor);
+
+        ReplaceOperation replaceOperation = new ReplaceOperation("/sections/traditionalpageone/dc.contributor.author/"
+            + path, value);
+        ops.add(replaceOperation);
+        String patchBody = getPatchContent(ops);
+
+        String token = getAuthToken(admin.getEmail(), password);
+
+        getClient(token).perform(patch("/api/submission/workspaceitems/" + publicationItem.getID())
+            .content(patchBody)
+            .contentType(javax.ws.rs.core.MediaType.APPLICATION_JSON_PATCH_JSON))
+                        .andExpect(status().isOk());
+
+        String authorField = "dc.contributor.author";
+        getClient(token).perform(get("/api/submission/workspaceitems/" + publicationItem.getID()))
+                        .andExpect(status().isOk())
+                        .andExpect(content().contentType(contentType))
+                        .andExpect(jsonPath("$.sections.traditionalpageone", Matchers.allOf(
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(0), 0)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(1), 1)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(2), 2)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(3), 3)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(4), 4))
+                        )));
+    }
+
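+
+    // Illustrative sketch: the replace helper above wraps the new author name in a MetadataValueRest, so, assuming
+    // its default JSON serialization, replaceTraditionalPageOneAuthorTest(0, ...) would send a body roughly like:
+    //   [ { "op": "replace",
+    //       "path": "/sections/traditionalpageone/dc.contributor.author/0",
+    //       "value": { "value": "New Value" } } ]
+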
+    /**
+     * This method adds an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section to the position "path" using a PATCH request and verifies the place of the new author and the
+     * order of the previous authors within the section using an ordered list of expected author names.
+     * @param path The "path" index to use for the Add operation
+     * @param expectedOrder A list of author names sorted in the expected order
+     */
+    private void addTraditionalPageOneAuthorTest(String path, List<String> expectedOrder) throws Exception {
+        List<Operation> ops = new ArrayList<Operation>();
+        MetadataValueRest value = new MetadataValueRest();
+        value.setValue(addedAuthor);
+        AddOperation addOperation = new AddOperation("/sections/traditionalpageone/dc.contributor.author/" + path,
+            value);
+        ops.add(addOperation);
+        String patchBody = getPatchContent(ops);
+
+        String token = getAuthToken(admin.getEmail(), password);
+
+        getClient(token).perform(patch("/api/submission/workspaceitems/" + publicationItem.getID())
+            .content(patchBody)
+            .contentType(javax.ws.rs.core.MediaType.APPLICATION_JSON_PATCH_JSON))
+                        .andExpect(status().isOk());
+
+        String authorField = "dc.contributor.author";
+        getClient(token).perform(get("/api/submission/workspaceitems/" + publicationItem.getID()))
+                        .andExpect(status().isOk())
+                        .andExpect(content().contentType(contentType))
+                        .andExpect(jsonPath("$.sections.traditionalpageone", Matchers.allOf(
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(0), 0)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(1), 1)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(2), 2)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(3), 3)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(4), 4)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(5), 5))
+                        )));
+    }
+
+    /**
+     * This method removes an author (dc.contributor.author) within a workspace publication's "traditionalpageone"
+     * section from the position "path" using a PATCH request and verifies the order of the remaining authors
+     * within the section using an ordered list of expected author names.
+     * @param path The "path" index to use for the Remove operation
+     * @param expectedOrder A list of author names sorted in the expected order
+     */
+    private void removeTraditionalPageOneAuthorTest(int path, List<String> expectedOrder) throws Exception {
+        List<Operation> ops = new ArrayList<Operation>();
+        RemoveOperation removeOperation = new RemoveOperation("/sections/traditionalpageone/dc.contributor.author/"
+            + path);
+        ops.add(removeOperation);
+        String patchBody = getPatchContent(ops);
+
+        String token = getAuthToken(admin.getEmail(), password);
+
+        getClient(token).perform(patch("/api/submission/workspaceitems/" + publicationItem.getID())
+            .content(patchBody)
+            .contentType(javax.ws.rs.core.MediaType.APPLICATION_JSON_PATCH_JSON))
+                        .andExpect(status().isOk());
+
+        String authorField = "dc.contributor.author";
+        getClient(token).perform(get("/api/submission/workspaceitems/" + publicationItem.getID()))
+                        .andExpect(status().isOk())
+                        .andExpect(content().contentType(contentType))
+                        .andExpect(jsonPath("$.sections.traditionalpageone", Matchers.allOf(
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(0), 0)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(1), 1)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(2), 2)),
+                            Matchers.is(MetadataMatcher.matchMetadata(authorField, expectedOrder.get(3), 3))
+                        )));
+    }
+
+    /**
+     * Create a move operation on a workspace item's "traditionalpageone" section for
+     * metadata field "dc.contributor.author".
+ * @param from The "from" index to use for the Move operation + * @param path The "path" index to use for the Move operation + */ + private MoveOperation getTraditionalPageOneMoveAuthorOperation(int from, int path) { + return new MoveOperation("/sections/traditionalpageone/dc.contributor.author/" + path, + "/sections/traditionalpageone/dc.contributor.author/" + from); + } + +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ProcessRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ProcessRestRepositoryIT.java index fdca31d07b..ab7e675d48 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ProcessRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ProcessRestRepositoryIT.java @@ -7,6 +7,7 @@ */ package org.dspace.app.rest; +import static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.containsInAnyOrder; import static org.hamcrest.Matchers.is; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; @@ -18,19 +19,21 @@ import java.sql.SQLException; import java.util.LinkedList; import java.util.List; import java.util.Random; +import java.util.UUID; import org.apache.commons.codec.CharEncoding; import org.apache.commons.collections4.CollectionUtils; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.ProcessBuilder; import org.dspace.app.rest.matcher.PageMatcher; import org.dspace.app.rest.matcher.ProcessFileTypesMatcher; import org.dspace.app.rest.matcher.ProcessMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.ProcessBuilder; import org.dspace.content.Bitstream; import org.dspace.content.ProcessStatus; import org.dspace.scripts.DSpaceCommandLineParameter; import org.dspace.scripts.Process; +import org.dspace.scripts.ProcessLogLevel; import org.dspace.scripts.service.ProcessService; import org.hamcrest.Matchers; import org.junit.After; @@ -72,7 +75,7 @@ public class ProcessRestRepositoryIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is( ProcessMatcher.matchProcess(process.getName(), String.valueOf(process.getEPerson().getID()), process.getID(), parameters, ProcessStatus.SCHEDULED))) - ); + ); } @Test @@ -86,7 +89,7 @@ public class ProcessRestRepositoryIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is( ProcessMatcher.matchProcess(process.getName(), String.valueOf(process.getEPerson().getID()), process.getID(), new LinkedList<>(), ProcessStatus.SCHEDULED))) - ); + ); } @Test @@ -157,39 +160,40 @@ public class ProcessRestRepositoryIT extends AbstractControllerIntegrationTest { getClient(token).perform(get("/api/system/processes/")) .andExpect(status().isOk()) - .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( - ProcessMatcher.matchProcess(process.getName(), String.valueOf(process.getEPerson().getID()), - process.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess.getName(), - String.valueOf(newProcess.getEPerson().getID()), - newProcess.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess1.getName(), - String.valueOf(newProcess1.getEPerson().getID()), - newProcess1.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess2.getName(), - String.valueOf(newProcess2.getEPerson().getID()), - newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess3.getName(), 
- String.valueOf(newProcess3.getEPerson().getID()), - newProcess3.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess4.getName(), - String.valueOf(newProcess4.getEPerson().getID()), - newProcess4.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess5.getName(), - String.valueOf(newProcess5.getEPerson().getID()), - newProcess5.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess6.getName(), - String.valueOf(newProcess6.getEPerson().getID()), - newProcess6.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess7.getName(), - String.valueOf(newProcess7.getEPerson().getID()), - newProcess7.getID(), parameters, ProcessStatus.SCHEDULED), + // Expect all processes to be returned, newest to oldest + .andExpect(jsonPath("$._embedded.processes", contains( + ProcessMatcher.matchProcess(newProcess9.getName(), + String.valueOf(newProcess9.getEPerson().getID()), + newProcess9.getID(), parameters, ProcessStatus.SCHEDULED), ProcessMatcher.matchProcess(newProcess8.getName(), String.valueOf(newProcess8.getEPerson().getID()), newProcess8.getID(), parameters, ProcessStatus.SCHEDULED), - ProcessMatcher.matchProcess(newProcess9.getName(), - String.valueOf(newProcess9.getEPerson().getID()), - newProcess9.getID(), parameters, ProcessStatus.SCHEDULED) + ProcessMatcher.matchProcess(newProcess7.getName(), + String.valueOf(newProcess7.getEPerson().getID()), + newProcess7.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess6.getName(), + String.valueOf(newProcess6.getEPerson().getID()), + newProcess6.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess5.getName(), + String.valueOf(newProcess5.getEPerson().getID()), + newProcess5.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess4.getName(), + String.valueOf(newProcess4.getEPerson().getID()), + newProcess4.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(newProcess3.getEPerson().getID()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(newProcess2.getEPerson().getID()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(newProcess1.getEPerson().getID()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess.getName(), + String.valueOf(newProcess.getEPerson().getID()), + newProcess.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(process.getName(), String.valueOf(process.getEPerson().getID()), + process.getID(), parameters, ProcessStatus.SCHEDULED) ))) .andExpect(jsonPath("$.page", is( PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 11)))); @@ -320,9 +324,492 @@ public class ProcessRestRepositoryIT extends AbstractControllerIntegrationTest { .andExpect(status().isNotFound()); + } + + @Test + public void searchProcessTestForbidden() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty")) + .andExpect(status().isForbidden()); + } + + @Test + public void searchProcessTestUnauthorized() throws Exception { + + getClient().perform(get("/api/system/processes/search/byProperty")) + .andExpect(status().isUnauthorized()); + } + + 
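+
+    // Illustrative sketch: the byProperty search exercised below is a plain GET with query parameters, e.g.
+    //   GET /api/system/processes/search/byProperty?scriptName=mock-script&processStatus=FAILED
+    //       &userId=<eperson-uuid>&sort=startTime,desc
+    // At least one property parameter is required (otherwise the endpoint answers 400 Bad Request, see
+    // searchProcessTestNoParametersBadRequest below), and only a single "sort" parameter is accepted.
+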
@Test + public void searchProcessTestByUser() throws Exception { + Process newProcess = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess4 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess5 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess6 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess7 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters).build(); + Process newProcess8 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters).build(); + Process newProcess9 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters).build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", admin.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess(process.getName(), + String.valueOf(admin.getID().toString()), + process.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess7.getName(), + String.valueOf(admin.getID().toString()), + newProcess7.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess8.getName(), + String.valueOf(admin.getID().toString()), + newProcess8.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess9.getName(), + String.valueOf(admin.getID().toString()), + newProcess9.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 4)))); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess(newProcess.getName(), + String.valueOf(eperson.getID().toString()), + newProcess.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(eperson.getID().toString()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess4.getName(), + String.valueOf(eperson.getID().toString()), + newProcess4.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess5.getName(), + String.valueOf(eperson.getID().toString()), + newProcess5.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess6.getName(), + String.valueOf(eperson.getID().toString()), + newProcess6.getID(), parameters, ProcessStatus.SCHEDULED) + + ))) + .andExpect(jsonPath("$.page", is( + 
PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 7)))); + } + + @Test + public void searchProcessTestByProcessStatus() throws Exception { + Process newProcess = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess4 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess5 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess6 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess7 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess8 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters).build(); + Process newProcess9 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("processStatus", "FAILED")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess(newProcess7.getName(), + String.valueOf(admin.getID().toString()), + newProcess7.getID(), parameters, ProcessStatus.FAILED), + ProcessMatcher.matchProcess(newProcess9.getName(), + String.valueOf(admin.getID().toString()), + newProcess9.getID(), parameters, ProcessStatus.FAILED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 2)))); + } + + @Test + public void searchProcessTestByScriptName() throws Exception { + Process newProcess = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess4 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess5 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess6 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess7 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess8 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters).build(); + Process newProcess9 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("scriptName", "another-mock-script")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess(newProcess8.getName(), + 
String.valueOf(admin.getID().toString()), + newProcess8.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 1)))); + } + + @Test + public void searchProcessTestByScriptNameAndUserId() throws Exception { + Process newProcess = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess4 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess5 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess6 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess7 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess8 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters).build(); + Process newProcess9 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("scriptName", "another-mock-script") + .param("userId", admin.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess(newProcess7.getName(), + String.valueOf(admin.getID().toString()), + newProcess7.getID(), parameters, ProcessStatus.FAILED), + ProcessMatcher.matchProcess(newProcess8.getName(), + String.valueOf(admin.getID().toString()), + newProcess8.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 2)))); + } + + @Test + public void searchProcessTestByUserIdAndProcessStatus() throws Exception { + Process newProcess = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess4 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess5 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess6 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess7 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters).build(); + Process newProcess8 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess9 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters) + 
.withProcessStatus(ProcessStatus.FAILED).build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("processStatus", "FAILED") + .param("userId", admin.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess(newProcess9.getName(), + String.valueOf(admin.getID().toString()), + newProcess9.getID(), parameters, ProcessStatus.FAILED), + ProcessMatcher.matchProcess(newProcess8.getName(), + String.valueOf(admin.getID().toString()), + newProcess8.getID(), parameters, ProcessStatus.FAILED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 2)))); + } + + @Test + public void searchProcessTestByUserIdAndProcessStatusAndScriptName() throws Exception { + Process newProcess = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess4 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters).build(); + Process newProcess5 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess6 = ProcessBuilder.createProcess(context, eperson, "another-mock-script", parameters).build(); + Process newProcess7 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters).build(); + Process newProcess8 = ProcessBuilder.createProcess(context, admin, "another-mock-script", parameters) + .withProcessStatus(ProcessStatus.FAILED).build(); + Process newProcess9 = ProcessBuilder.createProcess(context, admin, "mock-script", parameters).build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("processStatus", "FAILED") + .param("userId", eperson.getID().toString()) + .param("scriptName", "mock-script")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", containsInAnyOrder( + ProcessMatcher.matchProcess("mock-script", + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.FAILED), + ProcessMatcher.matchProcess("mock-script", + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.FAILED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 2)))); + } + + @Test + public void searchProcessTestNoParametersBadRequest() throws Exception { + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty")) + .andExpect(status().isBadRequest()); + } + + + @Test + public void searchProcessTestUnparseableProcessStatusParamBadRequest() throws Exception { + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("processStatus", "not-a-valid-status")) + .andExpect(status().isBadRequest()); + } + + @Test + public void searchProcessTestInvalidEPersonUuid() throws Exception 
{ + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", UUID.randomUUID().toString())) + .andExpect(status().isBadRequest()); + } + + @Test + public void searchProcessTestByUserSortedOnStartTimeAsc() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", "19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString()) + .param("sort", "startTime,asc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", contains( + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(eperson.getID().toString()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 3)))); + } + + @Test + public void searchProcessTestByUserSortedOnStartTimeDesc() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", "19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString()) + .param("sort", "startTime,desc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", contains( + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(eperson.getID().toString()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 3)))); + } + + @Test + public void searchProcessTestByUserSortedOnEndTimeAsc() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", 
"19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString()) + .param("sort", "endTime,asc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", contains( + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(eperson.getID().toString()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 3)))); + } + + @Test + public void searchProcessTestByUserSortedOnEndTimeDesc() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", "19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString()) + .param("sort", "endTime,desc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", contains( + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(eperson.getID().toString()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 3)))); + } + + @Test + public void searchProcessTestByUserSortedOnMultipleBadRequest() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", "19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString()) + .param("sort", "endTime,desc") + .param("sort", "startTime,desc")) + .andExpect(status().isBadRequest()); + } + + @Test + public void searchProcessTestByUserSortedOnDefault() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, 
"mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", "19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.processes", contains( + ProcessMatcher.matchProcess(newProcess3.getName(), + String.valueOf(eperson.getID().toString()), + newProcess3.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess2.getName(), + String.valueOf(eperson.getID().toString()), + newProcess2.getID(), parameters, ProcessStatus.SCHEDULED), + ProcessMatcher.matchProcess(newProcess1.getName(), + String.valueOf(eperson.getID().toString()), + newProcess1.getID(), parameters, ProcessStatus.SCHEDULED) + ))) + .andExpect(jsonPath("$.page", is( + PageMatcher.pageEntryWithTotalPagesAndElements(0, 20, 1, 3)))); + } + + @Test + public void searchProcessTestByUserSortedOnNonExistingIsSortedAsDefault() throws Exception { + Process newProcess1 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("10/01/1990", "20/01/1990").build(); + Process newProcess2 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("11/01/1990", "19/01/1990").build(); + Process newProcess3 = ProcessBuilder.createProcess(context, eperson, "mock-script", parameters) + .withStartAndEndTime("12/01/1990", "18/01/1990").build(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/search/byProperty") + .param("userId", eperson.getID().toString()) + .param("sort", "eaz,desc")) + .andExpect(status().isBadRequest()); + } + + @Test + public void getProcessOutput() throws Exception { + try (InputStream is = IOUtils.toInputStream("Test File For Process", CharEncoding.UTF_8)) { + processService.appendLog(process.getID(), process.getName(), "testlog", ProcessLogLevel.INFO); + } + processService.createLogBitstream(context, process); + List fileTypesToCheck = new LinkedList<>(); + fileTypesToCheck.add("inputfile"); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/system/processes/" + process.getID() + "/output")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.name", + is(process.getName() + process.getID() + ".log"))) + .andExpect(jsonPath("$.type", is("bitstream"))) + .andExpect(jsonPath("$.metadata['dc.title'][0].value", + is(process.getName() + process.getID() + ".log"))) + .andExpect(jsonPath("$.metadata['dspace.process.filetype'][0].value", + is("script_output"))); + + } @After + @Override public void destroy() throws Exception { super.destroy(); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RegistrationRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RegistrationRestRepositoryIT.java new file mode 100644 index 0000000000..1c3ae58374 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RegistrationRestRepositoryIT.java @@ -0,0 +1,184 @@ +/** + * The contents of this file are subject to the license and copyright + * 
detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertTrue; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.util.Iterator; +import java.util.List; +import javax.servlet.http.HttpServletResponse; + +import com.fasterxml.jackson.databind.ObjectMapper; +import org.apache.commons.lang3.StringUtils; +import org.dspace.app.rest.matcher.RegistrationMatcher; +import org.dspace.app.rest.model.RegistrationRest; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.eperson.RegistrationData; +import org.dspace.eperson.dao.RegistrationDataDAO; +import org.dspace.services.ConfigurationService; +import org.hamcrest.Matchers; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +public class RegistrationRestRepositoryIT extends AbstractControllerIntegrationTest { + + @Autowired + private RegistrationDataDAO registrationDataDAO; + + @Autowired + private ConfigurationService configurationService; + + @Test + public void findByTokenTestExistingUserTest() throws Exception { + String email = eperson.getEmail(); + createTokenForEmail(email); + RegistrationData registrationData = registrationDataDAO.findByEmail(context, email); + + try { + getClient().perform(get("/api/eperson/registrations/search/findByToken") + .param("token", registrationData.getToken())) + .andExpect(status().isOk()) + .andExpect( + jsonPath("$", Matchers.is(RegistrationMatcher.matchRegistration(email, eperson.getID())))); + + registrationDataDAO.delete(context, registrationData); + + email = "newUser@testnewuser.com"; + createTokenForEmail(email); + registrationData = registrationDataDAO.findByEmail(context, email); + + getClient().perform(get("/api/eperson/registrations/search/findByToken") + .param("token", registrationData.getToken())) + .andExpect(status().isOk()) + .andExpect( + jsonPath("$", Matchers.is(RegistrationMatcher.matchRegistration(email, null)))); + } finally { + registrationDataDAO.delete(context, registrationData); + } + + + } + + @Test + public void findByTokenTestNewUserTest() throws Exception { + String email = "newUser@testnewuser.com"; + createTokenForEmail(email); + RegistrationData registrationData = registrationDataDAO.findByEmail(context, email); + + try { + getClient().perform(get("/api/eperson/registrations/search/findByToken") + .param("token", registrationData.getToken())) + .andExpect(status().isOk()) + .andExpect( + jsonPath("$", Matchers.is(RegistrationMatcher.matchRegistration(email, null)))); + } finally { + registrationDataDAO.delete(context, registrationData); + } + + } + + @Test + public void findByTokenNotExistingTokenTest() throws Exception { + getClient().perform(get("/api/eperson/registration/search/findByToken") + .param("token", "ThisTokenDoesNotExist")) + .andExpect(status().isNotFound()); + } + + private void createTokenForEmail(String email) throws Exception { + List registrationDatas; + ObjectMapper mapper = new ObjectMapper(); + RegistrationRest registrationRest = new RegistrationRest(); + 
registrationRest.setEmail(email); + getClient().perform(post("/api/eperson/registrations") + .content(mapper.writeValueAsBytes(registrationRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + } + + @Test + public void registrationFlowTest() throws Exception { + List registrationDataList = registrationDataDAO.findAll(context, RegistrationData.class); + assertEquals(0, registrationDataList.size()); + + ObjectMapper mapper = new ObjectMapper(); + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(eperson.getEmail()); + + try { + getClient().perform(post("/api/eperson/registrations") + .content(mapper.writeValueAsBytes(registrationRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + registrationDataList = registrationDataDAO.findAll(context, RegistrationData.class); + assertEquals(1, registrationDataList.size()); + assertTrue(StringUtils.equalsIgnoreCase(registrationDataList.get(0).getEmail(), eperson.getEmail())); + + String newEmail = "newEPersonTest@gmail.com"; + registrationRest.setEmail(newEmail); + getClient().perform(post("/api/eperson/registrations") + .content(mapper.writeValueAsBytes(registrationRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + registrationDataList = registrationDataDAO.findAll(context, RegistrationData.class); + assertTrue(registrationDataList.size() == 2); + assertTrue(StringUtils.equalsIgnoreCase(registrationDataList.get(0).getEmail(), newEmail) || + StringUtils.equalsIgnoreCase(registrationDataList.get(1).getEmail(), newEmail)); + configurationService.setProperty("user.registration", false); + + newEmail = "newEPersonTestTwo@gmail.com"; + registrationRest.setEmail(newEmail); + getClient().perform(post("/api/eperson/registrations") + .content(mapper.writeValueAsBytes(registrationRest)) + .contentType(contentType)) + .andExpect(status().is(HttpServletResponse.SC_UNAUTHORIZED)); + + assertEquals(2, registrationDataList.size()); + assertTrue(!StringUtils.equalsIgnoreCase(registrationDataList.get(0).getEmail(), newEmail) && + !StringUtils.equalsIgnoreCase(registrationDataList.get(1).getEmail(), newEmail)); + } finally { + Iterator iterator = registrationDataList.iterator(); + while (iterator.hasNext()) { + RegistrationData registrationData = iterator.next(); + registrationDataDAO.delete(context, registrationData); + } + } + } + + @Test + public void forgotPasswordTest() throws Exception { + configurationService.setProperty("user.registration", false); + + List registrationDataList = registrationDataDAO.findAll(context, RegistrationData.class); + try { + assertEquals(0, registrationDataList.size()); + + ObjectMapper mapper = new ObjectMapper(); + RegistrationRest registrationRest = new RegistrationRest(); + registrationRest.setEmail(eperson.getEmail()); + getClient().perform(post("/api/eperson/registrations") + .content(mapper.writeValueAsBytes(registrationRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + registrationDataList = registrationDataDAO.findAll(context, RegistrationData.class); + assertEquals(1, registrationDataList.size()); + assertTrue(StringUtils.equalsIgnoreCase(registrationDataList.get(0).getEmail(), eperson.getEmail())); + } finally { + Iterator iterator = registrationDataList.iterator(); + while (iterator.hasNext()) { + RegistrationData registrationData = iterator.next(); + registrationDataDAO.delete(context, registrationData); + } + } + } + +} diff --git 
a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipDeleteRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipDeleteRestRepositoryIT.java index 87d9c16e12..9cc9ce51c9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipDeleteRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipDeleteRestRepositoryIT.java @@ -16,11 +16,11 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import java.util.List; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.RelationshipBuilder; import org.dspace.app.rest.test.AbstractEntityIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.RelationshipBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -618,7 +618,7 @@ public class RelationshipDeleteRestRepositoryIT extends AbstractEntityIntegratio assertThat(publicationAuthorList.size(), equalTo(0)); List publicationRelationships = itemService.getMetadata(publicationItem, - "relation", "isAuthorOfPublication", Item.ANY, Item.ANY); + "relation", "isAuthorOfPublication", Item.ANY, Item.ANY); assertThat(publicationRelationships.size(), equalTo(0)); projectItem = itemService.find(context, projectItem.getID()); @@ -628,7 +628,7 @@ public class RelationshipDeleteRestRepositoryIT extends AbstractEntityIntegratio assertThat(projectAuthorList.get(0).getValue(), equalTo("Smith, Donald")); assertNull(projectAuthorList.get(0).getAuthority()); List projectRelationships = itemService.getMetadata(projectItem, - "relation", "isPersonOfProject", Item.ANY, Item.ANY); + "relation", "isPersonOfProject", Item.ANY, Item.ANY); assertThat(projectRelationships.size(), equalTo(0)); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipRestRepositoryIT.java index a9badd03f6..060c504f3d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipRestRepositoryIT.java @@ -33,16 +33,16 @@ import java.util.concurrent.atomic.AtomicReference; import com.fasterxml.jackson.databind.ObjectMapper; import com.google.gson.JsonObject; import org.apache.commons.lang3.StringUtils; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.RelationshipBuilder; import org.dspace.app.rest.matcher.PageMatcher; import org.dspace.app.rest.matcher.RelationshipMatcher; import org.dspace.app.rest.model.RelationshipRest; import org.dspace.app.rest.test.AbstractEntityIntegrationTest; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.RelationshipBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; diff --git 
a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipTypeRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipTypeRestControllerIT.java index fe4bef8b8d..9f631c0650 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipTypeRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RelationshipTypeRestControllerIT.java @@ -13,15 +13,15 @@ import static org.springframework.test.web.servlet.request.MockMvcRequestBuilder import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.RelationshipBuilder; import org.dspace.app.rest.matcher.EntityTypeMatcher; import org.dspace.app.rest.matcher.PageMatcher; import org.dspace.app.rest.matcher.RelationshipMatcher; import org.dspace.app.rest.matcher.RelationshipTypeMatcher; import org.dspace.app.rest.test.AbstractEntityIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.RelationshipBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.EntityType; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ResourcePolicyRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ResourcePolicyRestRepositoryIT.java index 1b207fbeee..d8a6234d36 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ResourcePolicyRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ResourcePolicyRestRepositoryIT.java @@ -29,12 +29,6 @@ import java.util.concurrent.atomic.AtomicReference; import javax.ws.rs.core.MediaType; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.matcher.ResourcePolicyMatcher; import org.dspace.app.rest.model.ResourcePolicyRest; import org.dspace.app.rest.model.patch.AddOperation; @@ -45,6 +39,12 @@ import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.ResourcePolicy; import org.dspace.authorize.service.AuthorizeService; import org.dspace.authorize.service.ResourcePolicyService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RestResourceControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RestResourceControllerIT.java index 164459b228..1ca5179942 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RestResourceControllerIT.java +++ 
b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RestResourceControllerIT.java @@ -12,8 +12,8 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import java.util.UUID; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CommunityBuilder; import org.junit.Test; /** diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RootRestResourceControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RootRestResourceControllerIT.java index 6c1c4a9427..b4cee1672e 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/RootRestResourceControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/RootRestResourceControllerIT.java @@ -49,7 +49,8 @@ public class RootRestResourceControllerIT extends AbstractControllerIntegrationT //We expect the content type to be "application/hal+json;charset=UTF-8" .andExpect(content().contentType(contentType)) //Check that all required root links are present and that they are absolute - .andExpect(jsonPath("$._links.authorities.href", startsWith(BASE_REST_SERVER_URL))) + .andExpect(jsonPath("$._links.vocabularies.href", startsWith(BASE_REST_SERVER_URL))) + .andExpect(jsonPath("$._links.vocabularyEntryDetails.href", startsWith(BASE_REST_SERVER_URL))) .andExpect(jsonPath("$._links.bitstreamformats.href", startsWith(BASE_REST_SERVER_URL))) .andExpect(jsonPath("$._links.bitstreams.href", startsWith(BASE_REST_SERVER_URL))) .andExpect(jsonPath("$._links.browses.href", startsWith(BASE_REST_SERVER_URL))) diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ScriptRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ScriptRestRepositoryIT.java index c1ba31305e..7782b21020 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ScriptRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ScriptRestRepositoryIT.java @@ -11,9 +11,11 @@ import static com.jayway.jsonpath.JsonPath.read; import static org.hamcrest.Matchers.containsInAnyOrder; import static org.hamcrest.Matchers.hasItem; import static org.hamcrest.Matchers.is; +import static org.junit.Assert.assertThat; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.fileUpload; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; @@ -27,11 +29,8 @@ import java.util.stream.Collectors; import com.google.gson.Gson; import org.apache.commons.collections4.CollectionUtils; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ProcessBuilder; import org.dspace.app.rest.converter.DSpaceRunnableParameterConverter; +import org.dspace.app.rest.matcher.BitstreamMatcher; import org.dspace.app.rest.matcher.PageMatcher; import org.dspace.app.rest.matcher.ProcessMatcher; import org.dspace.app.rest.matcher.ScriptMatcher; @@ -39,20 +38,28 @@ import org.dspace.app.rest.model.ParameterValueRest; import 
org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.AuthorizeException; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ProcessBuilder; +import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; import org.dspace.content.ProcessStatus; import org.dspace.content.service.BitstreamService; import org.dspace.scripts.DSpaceCommandLineParameter; +import org.dspace.scripts.Process; import org.dspace.scripts.configuration.ScriptConfiguration; import org.dspace.scripts.service.ProcessService; +import org.hamcrest.CoreMatchers; import org.hamcrest.Matchers; import org.junit.After; import org.junit.Test; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.MediaType; import org.springframework.mock.web.MockMultipartFile; +import org.springframework.test.web.servlet.MvcResult; public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { @@ -75,14 +82,20 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { getClient(token).perform(get("/api/system/scripts")) .andExpect(status().isOk()) .andExpect(jsonPath("$._embedded.scripts", containsInAnyOrder( - ScriptMatcher.matchScript(scriptConfigurations.get(0).getName(), - scriptConfigurations.get(0).getDescription()), - ScriptMatcher.matchScript(scriptConfigurations.get(1).getName(), - scriptConfigurations.get(1).getDescription()), - ScriptMatcher.matchScript(scriptConfigurations.get(2).getName(), - scriptConfigurations.get(2).getDescription()), - ScriptMatcher.matchScript(scriptConfigurations.get(3).getName(), - scriptConfigurations.get(3).getDescription()) + ScriptMatcher.matchScript(scriptConfigurations.get(0).getName(), + scriptConfigurations.get(0).getDescription()), + ScriptMatcher.matchScript(scriptConfigurations.get(1).getName(), + scriptConfigurations.get(1).getDescription()), + ScriptMatcher.matchScript(scriptConfigurations.get(2).getName(), + scriptConfigurations.get(2).getDescription()), + ScriptMatcher.matchScript(scriptConfigurations.get(3).getName(), + scriptConfigurations.get(3).getDescription()), + ScriptMatcher.matchScript(scriptConfigurations.get(4).getName(), + scriptConfigurations.get(4).getDescription()), + ScriptMatcher.matchScript(scriptConfigurations.get(5).getName(), + scriptConfigurations.get(5).getDescription()), + ScriptMatcher.matchScript(scriptConfigurations.get(6).getName(), + scriptConfigurations.get(6).getDescription()) ))); } @@ -107,12 +120,12 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { getClient(token).perform(get("/api/system/scripts").param("size", "1")) .andExpect(status().isOk()) .andExpect(jsonPath("$._embedded.scripts", Matchers.not(Matchers.hasItem( - ScriptMatcher.matchScript(scriptConfigurations.get(0).getName(), - scriptConfigurations.get(0).getDescription()) + ScriptMatcher.matchScript(scriptConfigurations.get(0).getName(), + scriptConfigurations.get(0).getDescription()) )))) .andExpect(jsonPath("$._embedded.scripts", hasItem( - ScriptMatcher.matchScript(scriptConfigurations.get(2).getName(), - scriptConfigurations.get(2).getDescription()) + ScriptMatcher.matchScript(scriptConfigurations.get(2).getName(), + scriptConfigurations.get(2).getDescription()) ))) .andExpect(jsonPath("$.page", 
is(PageMatcher.pageEntry(0, 1)))); @@ -121,12 +134,12 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { getClient(token).perform(get("/api/system/scripts").param("size", "1").param("page", "1")) .andExpect(status().isOk()) .andExpect(jsonPath("$._embedded.scripts", hasItem( - ScriptMatcher.matchScript(scriptConfigurations.get(1).getName(), - scriptConfigurations.get(1).getDescription()) + ScriptMatcher.matchScript(scriptConfigurations.get(1).getName(), + scriptConfigurations.get(1).getDescription()) ))) .andExpect(jsonPath("$._embedded.scripts", Matchers.not(hasItem( - ScriptMatcher.matchScript(scriptConfigurations.get(0).getName(), - scriptConfigurations.get(0).getDescription()) + ScriptMatcher.matchScript(scriptConfigurations.get(0).getName(), + scriptConfigurations.get(0).getDescription()) )))) .andExpect(jsonPath("$.page", is(PageMatcher.pageEntry(1, 1)))); @@ -139,7 +152,8 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { getClient(token).perform(get("/api/system/scripts/mock-script")) .andExpect(status().isOk()) .andExpect(jsonPath("$", ScriptMatcher - .matchMockScript(scriptConfigurations.get(3).getOptions()))); + .matchMockScript( + scriptConfigurations.get(scriptConfigurations.size() - 1).getOptions()))); } @Test @@ -180,14 +194,14 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { try { getClient(token) - .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data")) - .andExpect(status().isAccepted()) - .andExpect(jsonPath("$", is( - ProcessMatcher.matchProcess("mock-script", - String.valueOf(admin.getID()), new LinkedList<>(), - ProcessStatus.FAILED)))) - .andDo(result -> idRef - .set(read(result.getResponse().getContentAsString(), "$.processId"))); + .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data")) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("mock-script", + String.valueOf(admin.getID()), new LinkedList<>(), + ProcessStatus.FAILED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); } finally { ProcessBuilder.deleteProcess(idRef.get()); } @@ -215,7 +229,7 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { List list = parameters.stream() .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter - .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) .collect(Collectors.toList()); String token = getAuthToken(admin.getEmail(), password); @@ -224,16 +238,16 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { try { getClient(token) - .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data") - .param("properties", - new Gson().toJson(list))) - .andExpect(status().isAccepted()) - .andExpect(jsonPath("$", is( - ProcessMatcher.matchProcess("mock-script", - String.valueOf(admin.getID()), parameters, - ProcessStatus.FAILED)))) - .andDo(result -> idRef - .set(read(result.getResponse().getContentAsString(), "$.processId"))); + .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("mock-script", + String.valueOf(admin.getID()), parameters, + 
ProcessStatus.FAILED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); } finally { ProcessBuilder.deleteProcess(idRef.get()); } @@ -244,7 +258,7 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { String token = getAuthToken(admin.getEmail(), password); getClient(token).perform(post("/api/system/scripts/mock-script-invalid/processes") - .contentType("multipart/form-data")) + .contentType("multipart/form-data")) .andExpect(status().isBadRequest()); } @@ -257,7 +271,7 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { List list = parameters.stream() .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter - .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) .collect(Collectors.toList()); String token = getAuthToken(admin.getEmail(), password); @@ -270,22 +284,95 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { try { getClient(token) - .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data") - .param("properties", - new Gson().toJson(list))) - .andExpect(status().isAccepted()) - .andExpect(jsonPath("$", is( - ProcessMatcher.matchProcess("mock-script", - String.valueOf(admin.getID()), - parameters, - acceptableProcessStatuses)))) - .andDo(result -> idRef - .set(read(result.getResponse().getContentAsString(), "$.processId"))); + .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("mock-script", + String.valueOf(admin.getID()), + parameters, + acceptableProcessStatuses)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); } finally { ProcessBuilder.deleteProcess(idRef.get()); } } + @Test + public void postProcessAndVerifyOutput() throws Exception { + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-r", "test")); + parameters.add(new DSpaceCommandLineParameter("-i", null)); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + String token = getAuthToken(admin.getEmail(), password); + List acceptableProcessStatuses = new LinkedList<>(); + acceptableProcessStatuses.addAll(Arrays.asList(ProcessStatus.SCHEDULED, + ProcessStatus.RUNNING, + ProcessStatus.COMPLETED)); + + AtomicReference idRef = new AtomicReference<>(); + + try { + getClient(token) + .perform(post("/api/system/scripts/mock-script/processes").contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("mock-script", + String.valueOf(admin.getID()), + parameters, + acceptableProcessStatuses)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + + + Process process = processService.find(context, idRef.get()); + Bitstream bitstream = processService.getBitstream(context, process, Process.OUTPUT_TYPE); + + + getClient(token).perform(get("/api/system/processes/" + idRef.get() + "/output")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + 
.andExpect(jsonPath("$", BitstreamMatcher + .matchBitstreamEntryWithoutEmbed(bitstream.getID(), bitstream.getSizeBytes()))); + + + MvcResult mvcResult = getClient(token) + .perform(get("/api/core/bitstreams/" + bitstream.getID() + "/content")).andReturn(); + String content = mvcResult.getResponse().getContentAsString(); + + assertThat(content, CoreMatchers + .containsString("INFO mock-script - " + process.getID() + " @ The script has started")); + assertThat(content, + CoreMatchers.containsString( + "INFO mock-script - " + process.getID() + " @ Logging INFO for Mock DSpace Script")); + assertThat(content, + CoreMatchers.containsString( + "ERROR mock-script - " + process.getID() + " @ Logging ERROR for Mock DSpace Script")); + assertThat(content, + CoreMatchers.containsString("WARNING mock-script - " + process + .getID() + " @ Logging WARNING for Mock DSpace Script")); + assertThat(content, CoreMatchers + .containsString("INFO mock-script - " + process.getID() + " @ The script has completed")); + + + + + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + } + + @Test public void postProcessAdminWithWrongContentTypeBadRequestException() throws Exception { @@ -329,7 +416,7 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { List list = parameters.stream() .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter - .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) .collect(Collectors.toList()); String token = getAuthToken(admin.getEmail(), password); @@ -342,17 +429,17 @@ public class ScriptRestRepositoryIT extends AbstractControllerIntegrationTest { try { getClient(token) - .perform(fileUpload("/api/system/scripts/mock-script/processes").file(bitstreamFile) - .param("properties", - new Gson().toJson(list))) - .andExpect(status().isAccepted()) - .andExpect(jsonPath("$", is( - ProcessMatcher.matchProcess("mock-script", - String.valueOf(admin.getID()), - parameters, - acceptableProcessStatuses)))) - .andDo(result -> idRef - .set(read(result.getResponse().getContentAsString(), "$.processId"))); + .perform(fileUpload("/api/system/scripts/mock-script/processes").file(bitstreamFile) + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("mock-script", + String.valueOf(admin.getID()), + parameters, + acceptableProcessStatuses)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); } finally { ProcessBuilder.deleteProcess(idRef.get()); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SearchEventRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SearchEventRestRepositoryIT.java index 497f7b9073..bd40cfdc9d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SearchEventRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SearchEventRestRepositoryIT.java @@ -16,13 +16,13 @@ import java.util.List; import java.util.UUID; import com.fasterxml.jackson.databind.ObjectMapper; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.model.PageRest; import org.dspace.app.rest.model.SearchEventRest; import org.dspace.app.rest.model.SearchResultsRest; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import 
org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ShibbolethRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ShibbolethRestControllerIT.java index e13d091168..368714300b 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ShibbolethRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ShibbolethRestControllerIT.java @@ -12,7 +12,10 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.services.ConfigurationService; +import org.junit.Before; import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; /** * Integration test that cover ShibbolethRestController @@ -21,6 +24,17 @@ import org.junit.Test; */ public class ShibbolethRestControllerIT extends AbstractControllerIntegrationTest { + @Autowired + ConfigurationService configurationService; + + + @Before + public void setup() throws Exception { + super.setUp(); + configurationService.setProperty("rest.cors.allowed-origins", + "${dspace.ui.url}, http://anotherdspacehost:4000"); + } + @Test public void testRedirectToDefaultDspaceUrl() throws Exception { String token = getAuthToken(eperson.getEmail(), password); @@ -31,12 +45,40 @@ public class ShibbolethRestControllerIT extends AbstractControllerIntegrationTes } @Test - public void testRedirectToGivenUrl() throws Exception { + public void testRedirectToGivenTrustedUrl() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); getClient(token).perform(get("/api/authn/shibboleth") - .param("redirectUrl", "http://dspace.org")) + .param("redirectUrl", "http://localhost:8080/server/api/authn/status")) .andExpect(status().is3xxRedirection()) - .andExpect(redirectedUrl("http://dspace.org")); + .andExpect(redirectedUrl("http://localhost:8080/server/api/authn/status")); + } + + @Test + public void testRedirectToAnotherGivenTrustedUrl() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + + getClient(token).perform(get("/api/authn/shibboleth") + .param("redirectUrl", "http://anotherdspacehost:4000/home")) + .andExpect(status().is3xxRedirection()) + .andExpect(redirectedUrl("http://anotherdspacehost:4000/home")); + } + + @Test + public void testRedirectToGivenUntrustedUrl() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + + // Now attempt to redirect to a URL that is NOT trusted (i.e. not the Server or UI). + // Should result in a 400 error. 
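+ // The redirect target appears to be validated against the trusted hosts configured in setup() above (rest.cors.allowed-origins: the UI URL and http://anotherdspacehost:4000); http://dspace.org is not listed, hence the expected 400.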
+ getClient(token).perform(get("/api/authn/shibboleth") + .param("redirectUrl", "http://dspace.org")) + .andExpect(status().isBadRequest()); + } + + @Test + public void testRedirectRequiresAuth() throws Exception { + getClient().perform(get("/api/authn/shibboleth")) + .andExpect(status().isUnauthorized()); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SiteRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SiteRestRepositoryIT.java index f208d0827f..092ea32b3f 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SiteRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SiteRestRepositoryIT.java @@ -14,10 +14,10 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import java.util.UUID; -import org.dspace.app.rest.builder.SiteBuilder; import org.dspace.app.rest.matcher.SiteMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.test.MetadataPatchSuite; +import org.dspace.builder.SiteBuilder; import org.dspace.content.Site; import org.dspace.eperson.EPerson; import org.hamcrest.Matchers; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SitemapRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SitemapRestControllerIT.java new file mode 100644 index 0000000000..fd84aa023b --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SitemapRestControllerIT.java @@ -0,0 +1,167 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static org.dspace.builder.ItemBuilder.createItem; +import static org.junit.Assert.assertTrue; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import javax.servlet.ServletException; + +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.services.ConfigurationService; +import org.junit.After; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.test.web.servlet.MvcResult; + +/** + * Integration test to test the /sitemaps/{name} endpoint, see {@link SitemapRestController} + * + * @author Maria Verdonck (Atmire) on 08/07/2020 + */ +public class SitemapRestControllerIT extends AbstractControllerIntegrationTest { + + @Autowired + ConfigurationService configurationService; + + private final static String SITEMAPS_ENDPOINT = "sitemaps"; + + private Item item1; + private Item item2; + + @Before + @Override + public void setUp() throws Exception { + super.setUp(); + + configurationService.setProperty("sitemap.path", SITEMAPS_ENDPOINT); + + context.turnOffAuthorisationSystem(); + + Community community = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, community).build(); + this.item1 = createItem(context, collection) + 
.withTitle("Test 1") + .withIssueDate("2010-10-17") + .build(); + this.item2 = createItem(context, collection) + .withTitle("Test 2") + .withIssueDate("2015-8-3") + .build(); + + runDSpaceScript("generate-sitemaps"); + + context.restoreAuthSystemState(); + } + + @After + public void destroy() throws Exception { + // delete sitemaps generated by tests in before + runDSpaceScript("generate-sitemaps", "-d"); + + super.destroy(); + } + + @Test + public void testSitemap_notValidSiteMapFile() throws Exception { + //** WHEN ** + //We attempt to retrieve a non valid sitemap file + getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/no-such-file")) + //** THEN ** + .andExpect(status().isNotFound()); + } + + @Test(expected = ServletException.class) + public void testSitemap_fileSystemTraversal_dspaceCfg() throws Exception { + //** WHEN ** + //We attempt to use endpoint for malicious file system traversal + getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/%2e%2e/config/dspace.cfg")); + } + + @Test(expected = ServletException.class) + public void testSitemap_fileSystemTraversal_dspaceCfg2() throws Exception { + //** WHEN ** + //We attempt to use endpoint for malicious file system traversal + getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/%2e%2e%2fconfig%2fdspace.cfg")); + } + + @Test + public void testSitemap_sitemapIndexHtml() throws Exception { + //** WHEN ** + //We retrieve sitemap_index.html + MvcResult result = getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/sitemap_index.html")) + //** THEN ** + .andExpect(status().isOk()) + //We expect the content type to match + .andExpect(content().contentType("text/html")) + .andReturn(); + + String response = result.getResponse().getContentAsString(); + // contains a link to /sitemaps/sitemap0.html + assertTrue(response.contains("/sitemap0.html")); + } + + @Test + public void testSitemap_sitemap0Html() throws Exception { + //** WHEN ** + //We retrieve sitemap0.html + MvcResult result = getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/sitemap0.html")) + //** THEN ** + .andExpect(status().isOk()) + //We expect the content type to match + .andExpect(content().contentType("text/html")) + .andReturn(); + + String response = result.getResponse().getContentAsString(); + // contains a link to items: [dspace.ui.url]/items/ + assertTrue(response.contains(configurationService.getProperty("dspace.ui.url") + "/items/" + item1.getID())); + assertTrue(response.contains(configurationService.getProperty("dspace.ui.url") + "/items/" + item2.getID())); + } + + @Test + public void testSitemap_sitemapIndexXml() throws Exception { + //** WHEN ** + //We retrieve sitemap_index.xml + MvcResult result = getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/sitemap_index.xml")) + //** THEN ** + .andExpect(status().isOk()) + //We expect the content type to match + .andExpect(content().contentType("application/xml")) + .andReturn(); + + String response = result.getResponse().getContentAsString(); + // contains a link to /sitemaps/sitemap0.html + assertTrue(response.contains("/sitemap0.xml")); + } + + @Test + public void testSitemap_sitemap0Xml() throws Exception { + //** WHEN ** + //We retrieve sitemap0.html + MvcResult result = getClient().perform(get("/" + SITEMAPS_ENDPOINT + "/sitemap0.xml")) + //** THEN ** + .andExpect(status().isOk()) + //We expect the content type to match + .andExpect(content().contentType("application/xml")) + .andReturn(); + + String response = result.getResponse().getContentAsString(); + // contains a link to items: [dspace.ui.url]/items/ + 
assertTrue(response.contains(configurationService.getProperty("dspace.ui.url") + "/items/" + item1.getID())); + assertTrue(response.contains(configurationService.getProperty("dspace.ui.url") + "/items/" + item2.getID())); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/StatisticsRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/StatisticsRestRepositoryIT.java new file mode 100644 index 0000000000..08303e57f2 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/StatisticsRestRepositoryIT.java @@ -0,0 +1,1229 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static org.apache.commons.codec.CharEncoding.UTF_8; +import static org.apache.commons.io.IOUtils.toInputStream; +import static org.dspace.app.rest.utils.UsageReportUtils.TOP_CITIES_REPORT_ID; +import static org.dspace.app.rest.utils.UsageReportUtils.TOP_COUNTRIES_REPORT_ID; +import static org.dspace.app.rest.utils.UsageReportUtils.TOTAL_DOWNLOADS_REPORT_ID; +import static org.dspace.app.rest.utils.UsageReportUtils.TOTAL_VISITS_PER_MONTH_REPORT_ID; +import static org.dspace.app.rest.utils.UsageReportUtils.TOTAL_VISITS_REPORT_ID; +import static org.hamcrest.Matchers.empty; +import static org.hamcrest.Matchers.not; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Calendar; +import java.util.List; +import java.util.Locale; +import java.util.UUID; + +import com.fasterxml.jackson.databind.ObjectMapper; +import org.dspace.app.rest.matcher.UsageReportMatcher; +import org.dspace.app.rest.model.UsageReportPointCityRest; +import org.dspace.app.rest.model.UsageReportPointCountryRest; +import org.dspace.app.rest.model.UsageReportPointDateRest; +import org.dspace.app.rest.model.UsageReportPointDsoTotalVisitsRest; +import org.dspace.app.rest.model.UsageReportPointRest; +import org.dspace.app.rest.model.ViewEventRest; +import org.dspace.app.rest.repository.StatisticsRestRepository; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.app.rest.utils.UsageReportUtils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; +import org.dspace.builder.SiteBuilder; +import org.dspace.content.Bitstream; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.Site; +import org.dspace.core.Constants; +import org.dspace.eperson.EPerson; +import org.dspace.services.ConfigurationService; +import org.dspace.statistics.factory.StatisticsServiceFactory; +import org.hamcrest.Matchers; +import org.junit.Before; +import org.junit.BeforeClass; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; 
+import org.springframework.http.HttpStatus; + +/** + * Integration test to test the /api/statistics/usagereports/ endpoints, see {@link UsageReportUtils} and + * {@link StatisticsRestRepository} + * + * @author Maria Verdonck (Atmire) on 10/06/2020 + */ +public class StatisticsRestRepositoryIT extends AbstractControllerIntegrationTest { + + @Autowired + ConfigurationService configurationService; + @Autowired + protected AuthorizeService authorizeService; + + private Community communityNotVisited; + private Community communityVisited; + private Collection collectionNotVisited; + private Collection collectionVisited; + private Item itemNotVisitedWithBitstreams; + private Item itemVisited; + private Bitstream bitstreamNotVisited; + private Bitstream bitstreamVisited; + + private String loggedInToken; + private String adminToken; + + @BeforeClass + public static void clearStatistics() throws Exception { + // To ensure these tests start "fresh", clear out any existing statistics data. + // NOTE: this is committed immediately in removeIndex() + StatisticsServiceFactory.getInstance().getSolrLoggerService().removeIndex("*:*"); + } + + @Before + @Override + public void setUp() throws Exception { + super.setUp(); + + // Explicitly use solr commit in SolrLoggerServiceImpl#postView + configurationService.setProperty("solr-statistics.autoCommit", false); + + context.turnOffAuthorisationSystem(); + + Community community = CommunityBuilder.createCommunity(context).build(); + communityNotVisited = CommunityBuilder.createSubCommunity(context, community).build(); + communityVisited = CommunityBuilder.createSubCommunity(context, community).build(); + collectionNotVisited = CollectionBuilder.createCollection(context, community).build(); + collectionVisited = CollectionBuilder.createCollection(context, community).build(); + itemVisited = ItemBuilder.createItem(context, collectionNotVisited).build(); + itemNotVisitedWithBitstreams = ItemBuilder.createItem(context, collectionNotVisited).build(); + bitstreamNotVisited = BitstreamBuilder.createBitstream(context, + itemNotVisitedWithBitstreams, toInputStream("test", UTF_8)).withName("BitstreamNotVisitedName").build(); + bitstreamVisited = BitstreamBuilder + .createBitstream(context, itemNotVisitedWithBitstreams, toInputStream("test", UTF_8)) + .withName("BitstreamVisitedName").build(); + + loggedInToken = getAuthToken(eperson.getEmail(), password); + adminToken = getAuthToken(admin.getEmail(), password); + + context.restoreAuthSystemState(); + } + + @Test + public void usagereports_withoutId_NotImplementedException() throws Exception { + getClient().perform(get("/api/statistics/usagereports")) + .andExpect(status().is(HttpStatus.METHOD_NOT_ALLOWED.value())); + } + + @Test + public void usagereports_notProperUUIDAndReportId_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/notProperUUIDAndReportId")) + .andExpect(status().is(HttpStatus.BAD_REQUEST.value())); + } + + @Test + public void usagereports_nonValidUUIDpart_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/notAnUUID" + "_" + TOTAL_VISITS_REPORT_ID)) + .andExpect(status().is(HttpStatus.BAD_REQUEST.value())); + } + + @Test + public void usagereports_nonValidReportIDpart_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + + "_NotValidReport")) + .andExpect(status().is(HttpStatus.NOT_FOUND.value())); + } + + @Test + public void 
usagereports_NonExistentUUID_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/" + UUID.randomUUID() + "_" + TOTAL_VISITS_REPORT_ID)) + .andExpect(status().is(HttpStatus.NOT_FOUND.value())); + } + + @Test + public void usagereport_onlyAdminReadRights() throws Exception { + // ** WHEN ** + authorizeService.removeAllPolicies(context, itemNotVisitedWithBitstreams); + // We request a dso's TotalVisits usage stat report as anon but dso has no read policy for anon + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isUnauthorized()); + // We request a dso's TotalVisits usage stat report as admin + getClient(adminToken).perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()); + } + + @Test + public void usagereport_onlyAdminReadRights_invalidToken() throws Exception { + // ** WHEN ** + authorizeService.removeAllPolicies(context, itemNotVisitedWithBitstreams); + // We request a dso's TotalVisits usage stat report with an invalid token + getClient("invalidToken").perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isForbidden()); + } + + @Test + public void usagereport_loggedInUserReadRights() throws Exception { + // ** WHEN ** + context.turnOffAuthorisationSystem(); + authorizeService.removeAllPolicies(context, itemNotVisitedWithBitstreams); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(itemNotVisitedWithBitstreams) + .withAction(Constants.READ) + .withUser(eperson).build(); + + EPerson eperson1 = EPersonBuilder.createEPerson(context) + .withEmail("eperson1@mail.com") + .withPassword(password) + .build(); + context.restoreAuthSystemState(); + String anotherLoggedInUserToken = getAuthToken(eperson1.getEmail(), password); + // We request a dso's TotalVisits usage stat report as anon but dso has no read policy for anon + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isUnauthorized()); + // We request a dso's TotalVisits usage stat report as a logged-in eperson who has READ rights on this dso + getClient(loggedInToken).perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()); + // We request a dso's TotalVisits usage stat report as another logged-in eperson who has no READ rights on + // this dso + getClient(anotherLoggedInUserToken).perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isForbidden()); + } + + @Test + public void totalVisitsReport_Community_Visited() throws Exception { + // ** WHEN ** + // We visit the community + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("community"); + viewEventRest.setTargetId(communityVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPoint = new
UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setType("community"); + expectedPoint.setId(communityVisited.getID().toString()); + + // And request that community's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + communityVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(communityVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Community_NotVisited() throws Exception { + // ** WHEN ** + // Community is never visited + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 0); + expectedPoint.setType("community"); + expectedPoint.setId(communityNotVisited.getID().toString()); + + // And request that community's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + communityNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(communityNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Collection_Visited() throws Exception { + // ** WHEN ** + // We visit the collection twice + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("collection"); + viewEventRest.setTargetId(collectionVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 2); + expectedPoint.setType("collection"); + expectedPoint.setId(collectionVisited.getID().toString()); + + // And request that collection's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + collectionVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(collectionVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Collection_NotVisited() throws Exception { + // ** WHEN ** + // Collection is never visited + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 0); + expectedPoint.setType("collection"); + expectedPoint.setId(collectionNotVisited.getID().toString()); + + // And request that collection's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + collectionNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(collectionNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + 
TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Item_Visited() throws Exception { + // ** WHEN ** + // We visit an Item + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("item"); + viewEventRest.setTargetId(itemVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setType("item"); + expectedPoint.setId(itemVisited.getID().toString()); + + // And request that item's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + itemVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Item_NotVisited() throws Exception { + // ** WHEN ** + // Item is never visited + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 0); + expectedPoint.setType("item"); + expectedPoint.setId(itemNotVisitedWithBitstreams.getID().toString()); + + // And request that item's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Bitstream_Visited() throws Exception { + // ** WHEN ** + // We visit a Bitstream + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("bitstream"); + viewEventRest.setTargetId(bitstreamVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setType("bitstream"); + expectedPoint.setId(bitstreamVisited.getID().toString()); + + // And request that bitstream's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + bitstreamVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(bitstreamVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsReport_Bitstream_NotVisited() throws Exception { + // ** WHEN ** + // Bitstream is never visited + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 0); + expectedPoint.setType("bitstream"); + expectedPoint.setId(bitstreamNotVisited.getID().toString()); +
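+ // Note: a bitstream that has never been visited is still expected to yield a report containing a single point with zero views (rather than an empty list of points), as asserted below.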
+ // And request that bitstream's TotalVisits stat report + getClient().perform( + get("/api/statistics/usagereports/" + bitstreamNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(bitstreamNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, Arrays.asList(expectedPoint)))) + ); + } + + @Test + public void totalVisitsPerMonthReport_Item_Visited() throws Exception { + // ** WHEN ** + // We visit an Item + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("item"); + viewEventRest.setTargetId(itemVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + List expectedPoints = this.getListOfVisitsPerMonthsPoints(1); + + // And request that item's TotalVisitsPerMonth stat report + getClient().perform( + get("/api/statistics/usagereports/" + itemVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, expectedPoints)))); + } + + @Test + public void totalVisitsPerMonthReport_Item_NotVisited() throws Exception { + // ** WHEN ** + // Item is not visited + List expectedPoints = this.getListOfVisitsPerMonthsPoints(0); + + // And request that item's TotalVisitsPerMonth stat report + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + + TOTAL_VISITS_PER_MONTH_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport( + itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, expectedPoints)))); + } + + @Test + public void totalVisitsPerMonthReport_Collection_Visited() throws Exception { + // ** WHEN ** + // We visit a Collection twice + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("collection"); + viewEventRest.setTargetId(collectionVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + List expectedPoints = this.getListOfVisitsPerMonthsPoints(2); + + // And request that collection's TotalVisitsPerMonth stat report + getClient().perform( + get("/api/statistics/usagereports/" + collectionVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(collectionVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, expectedPoints)))); + } + + @Test + public void TotalDownloadsReport_Bitstream() throws Exception { + // ** WHEN ** + // We visit a Bitstream + ViewEventRest viewEventRest = new ViewEventRest(); + 
viewEventRest.setTargetType("bitstream"); + viewEventRest.setTargetId(bitstreamVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setType("bitstream"); + expectedPoint.setId(bitstreamVisited.getID().toString()); + + // And request that bitstreams's TotalDownloads stat report + getClient().perform( + get("/api/statistics/usagereports/" + bitstreamVisited.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(bitstreamVisited.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID, + TOTAL_DOWNLOADS_REPORT_ID, Arrays.asList(expectedPoint))))); + } + + @Test + public void TotalDownloadsReport_Item() throws Exception { + // ** WHEN ** + // We visit an Item's bitstream + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("bitstream"); + viewEventRest.setTargetId(bitstreamVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPoint = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setId("BitstreamVisitedName"); + expectedPoint.setType("bitstream"); + + // And request that item's TotalDownloads stat report + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + + TOTAL_DOWNLOADS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID, + TOTAL_DOWNLOADS_REPORT_ID, Arrays.asList(expectedPoint))))); + } + + @Test + public void TotalDownloadsReport_Item_NotVisited() throws Exception { + // ** WHEN ** + // You don't visit an item's bitstreams + // And request that item's TotalDownloads stat report + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + + TOTAL_DOWNLOADS_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemNotVisitedWithBitstreams.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID, + TOTAL_DOWNLOADS_REPORT_ID, new ArrayList<>())))); + } + + @Test + public void TotalDownloadsReport_NotSupportedDSO_Collection() throws Exception { + getClient() + .perform(get("/api/statistics/usagereports/" + collectionVisited.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID)) + .andExpect(status().is(HttpStatus.BAD_REQUEST.value())); + } + + /** + * Note: Geolite response mocked in {@link org.dspace.statistics.MockSolrLoggerServiceImpl} + */ + @Test + public void topCountriesReport_Collection_Visited() throws Exception { + // ** WHEN ** + // We visit a Collection + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("collection"); + viewEventRest.setTargetId(collectionVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + 
getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointCountryRest expectedPoint = new UsageReportPointCountryRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setId("US"); + expectedPoint.setLabel("United States"); + + // And request that collection's TopCountries report + getClient().perform( + get("/api/statistics/usagereports/" + collectionVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(collectionVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, Arrays.asList(expectedPoint))))); + } + + /** + * Note: Geolite response mocked in {@link org.dspace.statistics.MockSolrLoggerServiceImpl} + */ + @Test + public void topCountriesReport_Community_Visited() throws Exception { + // ** WHEN ** + // We visit a Community twice + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("community"); + viewEventRest.setTargetId(communityVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointCountryRest expectedPoint = new UsageReportPointCountryRest(); + expectedPoint.addValue("views", 2); + expectedPoint.setId("US"); + expectedPoint.setLabel("United States"); + + // And request that community's TopCountries report + getClient().perform( + get("/api/statistics/usagereports/" + communityVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(communityVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, Arrays.asList(expectedPoint))))); + } + + /** + * Note: Geolite response mocked in {@link org.dspace.statistics.MockSolrLoggerServiceImpl} + */ + @Test + public void topCountriesReport_Item_NotVisited() throws Exception { + // ** WHEN ** + // Item is not visited + // And request that item's TopCountries report + getClient().perform( + get("/api/statistics/usagereports/" + itemNotVisitedWithBitstreams.getID() + "_" + TOP_COUNTRIES_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemNotVisitedWithBitstreams.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, new ArrayList<>())))); + } + + /** + * Note: Geolite response mocked in {@link org.dspace.statistics.MockSolrLoggerServiceImpl} + */ + @Test + public void topCitiesReport_Item_Visited() throws Exception { + // ** WHEN ** + // We visit an Item + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("item"); + viewEventRest.setTargetId(itemVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointCityRest expectedPoint = new
UsageReportPointCityRest(); + expectedPoint.addValue("views", 1); + expectedPoint.setId("New York"); + + // And request that item's TopCities report + getClient().perform( + get("/api/statistics/usagereports/" + itemVisited.getID() + "_" + TOP_CITIES_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(itemVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, Arrays.asList(expectedPoint))))); + } + + /** + * Note: Geolite response mocked in {@link org.dspace.statistics.MockSolrLoggerServiceImpl} + */ + @Test + public void topCitiesReport_Community_Visited() throws Exception { + // ** WHEN ** + // We visit a Community thrice + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("community"); + viewEventRest.setTargetId(communityVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + getClient(loggedInToken).perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointCityRest expectedPoint = new UsageReportPointCityRest(); + expectedPoint.addValue("views", 3); + expectedPoint.setId("New York"); + + // And request that community's TopCities report + getClient().perform( + get("/api/statistics/usagereports/" + communityVisited.getID() + "_" + TOP_CITIES_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(communityVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, Arrays.asList(expectedPoint))))); + } + + /** + * Note: Geolite response mocked in {@link org.dspace.statistics.MockSolrLoggerServiceImpl} + */ + @Test + public void topCitiesReport_Collection_NotVisited() throws Exception { + // ** WHEN ** + // Collection is not visited + // And request that collection's TopCities report + getClient().perform( + get("/api/statistics/usagereports/" + collectionNotVisited.getID() + "_" + TOP_CITIES_REPORT_ID)) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + UsageReportMatcher + .matchUsageReport(collectionNotVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, new ArrayList<>())))); + } + + @Test + public void usagereportsSearch_notProperURI_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/search/object?uri=BadUri")) + .andExpect(status().is(HttpStatus.BAD_REQUEST.value())); + } + + @Test + public void usagereportsSearch_noURI_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/search/object")) + .andExpect(status().is(HttpStatus.BAD_REQUEST.value())); + } + + @Test + public void usagereportsSearch_NonExistentUUID_Exception() throws Exception { + getClient().perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + UUID.randomUUID())) + .andExpect(status().is(HttpStatus.NOT_FOUND.value())); + } + + @Test + public void usagereportSearch_onlyAdminReadRights() throws Exception { + // **
WHEN ** + authorizeService.removeAllPolicies(context, itemNotVisitedWithBitstreams); + // We request a dso's TotalVisits usage stat report as anon but dso has no read policy for anon + getClient().perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemNotVisitedWithBitstreams.getID())) + // ** THEN ** + .andExpect(status().isUnauthorized()); + // We request a dso's TotalVisits usage stat report as admin + getClient(adminToken) + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api" + + "/core/items/" + itemNotVisitedWithBitstreams.getID())) + // ** THEN ** + .andExpect(status().isOk()); + } + + @Test + public void usagereportSearch_onlyAdminReadRights_invalidToken() throws Exception { + // ** WHEN ** + authorizeService.removeAllPolicies(context, itemNotVisitedWithBitstreams); + // We request a dso's TotalVisits usage stat report with an invalid token + getClient("invalidToken") + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemNotVisitedWithBitstreams.getID())) + // ** THEN ** + .andExpect(status().isForbidden()); + } + + @Test + public void usagereportSearch_loggedInUserReadRights() throws Exception { + // ** WHEN ** + context.turnOffAuthorisationSystem(); + authorizeService.removeAllPolicies(context, itemNotVisitedWithBitstreams); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(itemNotVisitedWithBitstreams) + .withAction(Constants.READ) + .withUser(eperson).build(); + + EPerson eperson1 = EPersonBuilder.createEPerson(context) + .withEmail("eperson1@mail.com") + .withPassword(password) + .build(); + context.restoreAuthSystemState(); + String anotherLoggedInUserToken = getAuthToken(eperson1.getEmail(), password); + // We request a dso's TotalVisits usage stat report as anon but dso has no read policy for anon + getClient() + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemNotVisitedWithBitstreams.getID())) + // ** THEN ** + .andExpect(status().isUnauthorized()); + // We request a dso's TotalVisits usage stat report as a logged-in eperson who has READ rights on this dso + getClient(loggedInToken) + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemNotVisitedWithBitstreams.getID())) + // ** THEN ** + .andExpect(status().isOk()); + // We request a dso's TotalVisits usage stat report as another logged-in eperson who has no READ rights on + // this dso + getClient(anotherLoggedInUserToken) + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemNotVisitedWithBitstreams.getID())) + // ** THEN ** + .andExpect(status().isForbidden()); + } + + @Test + public void usageReportsSearch_Site() throws Exception { + context.turnOffAuthorisationSystem(); + Site site = SiteBuilder.createSite(context).build(); + Item itemVisited2 = ItemBuilder.createItem(context, collectionNotVisited).build(); + context.restoreAuthSystemState(); + + // ** WHEN ** + // We visit one item once and a second item twice + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("item"); + viewEventRest.setTargetId(itemVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) +
.contentType(contentType)) + .andExpect(status().isCreated()); + + ViewEventRest viewEventRest2 = new ViewEventRest(); + viewEventRest2.setTargetType("item"); + viewEventRest2.setTargetId(itemVisited2.getID()); + + ObjectMapper mapper2 = new ObjectMapper(); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper2.writeValueAsBytes(viewEventRest2)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper2.writeValueAsBytes(viewEventRest2)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + List points = new ArrayList<>(); + UsageReportPointDsoTotalVisitsRest expectedPoint1 = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint1.addValue("views", 1); + expectedPoint1.setType("item"); + expectedPoint1.setId(itemVisited.getID().toString()); + UsageReportPointDsoTotalVisitsRest expectedPoint2 = new UsageReportPointDsoTotalVisitsRest(); + expectedPoint2.addValue("views", 2); + expectedPoint2.setType("item"); + expectedPoint2.setId(itemVisited2.getID().toString()); + points.add(expectedPoint1); + points.add(expectedPoint2); + + // And request the site's global usage report (showing the most popular items) + getClient(adminToken) + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/sites/" + site.getID())) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.usagereports", not(empty()))) + .andExpect(jsonPath("$._embedded.usagereports", Matchers.containsInAnyOrder( + UsageReportMatcher + .matchUsageReport(site.getID() + "_" + TOTAL_VISITS_REPORT_ID, TOTAL_VISITS_REPORT_ID, points)))); + } + + @Test + public void usageReportsSearch_Community_Visited() throws Exception { + // ** WHEN ** + // We visit a community + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("community"); + viewEventRest.setTargetId(communityVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisits = new UsageReportPointDsoTotalVisitsRest(); + expectedPointTotalVisits.addValue("views", 1); + expectedPointTotalVisits.setType("community"); + expectedPointTotalVisits.setId(communityVisited.getID().toString()); + + UsageReportPointCityRest expectedPointCity = new UsageReportPointCityRest(); + expectedPointCity.addValue("views", 1); + expectedPointCity.setId("New York"); + + UsageReportPointCountryRest expectedPointCountry = new UsageReportPointCountryRest(); + expectedPointCountry.addValue("views", 1); + expectedPointCountry.setId("US"); + expectedPointCountry.setLabel("United States"); + + // And request the community usage reports + getClient() + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/communities/" + communityVisited.getID())) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.usagereports", not(empty()))) + .andExpect(jsonPath("$._embedded.usagereports", Matchers.containsInAnyOrder( + UsageReportMatcher + .matchUsageReport(communityVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, TOTAL_VISITS_REPORT_ID, + Arrays.asList(expectedPointTotalVisits)), + UsageReportMatcher.matchUsageReport(communityVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, +
TOTAL_VISITS_PER_MONTH_REPORT_ID, + this.getListOfVisitsPerMonthsPoints(1)), + UsageReportMatcher.matchUsageReport(communityVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, Arrays.asList(expectedPointCity)), + UsageReportMatcher.matchUsageReport(communityVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, Arrays.asList(expectedPointCountry)) + ))); + } + + @Test + public void usageReportsSearch_Collection_NotVisited() throws Exception { + // ** WHEN ** + // Collection is not visited + + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisits = new UsageReportPointDsoTotalVisitsRest(); + expectedPointTotalVisits.addValue("views", 0); + expectedPointTotalVisits.setType("collection"); + expectedPointTotalVisits.setId(collectionNotVisited.getID().toString()); + // And request the collection's usage reports + getClient() + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/collections/" + collectionNotVisited.getID())) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.usagereports", not(empty()))) + .andExpect(jsonPath("$._embedded.usagereports", Matchers.containsInAnyOrder( + UsageReportMatcher + .matchUsageReport(collectionNotVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, + TOTAL_VISITS_REPORT_ID, + Arrays.asList(expectedPointTotalVisits)), + UsageReportMatcher + .matchUsageReport(collectionNotVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, + this.getListOfVisitsPerMonthsPoints(0)), + UsageReportMatcher.matchUsageReport(collectionNotVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, new ArrayList<>()), + UsageReportMatcher.matchUsageReport(collectionNotVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, new ArrayList<>())))); + } + + @Test + public void usageReportsSearch_Item_Visited_FileNotVisited() throws Exception { + // ** WHEN ** + // We visit an item + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("item"); + viewEventRest.setTargetId(itemVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisits = new UsageReportPointDsoTotalVisitsRest(); + expectedPointTotalVisits.addValue("views", 1); + expectedPointTotalVisits.setType("item"); + expectedPointTotalVisits.setId(itemVisited.getID().toString()); + + UsageReportPointCityRest expectedPointCity = new UsageReportPointCityRest(); + expectedPointCity.addValue("views", 1); + expectedPointCity.setId("New York"); + + UsageReportPointCountryRest expectedPointCountry = new UsageReportPointCountryRest(); + expectedPointCountry.addValue("views", 1); + expectedPointCountry.setId("US"); + expectedPointCountry.setLabel("United States"); + + // And request the item's usage reports + getClient() + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemVisited.getID())) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.usagereports", not(empty()))) + .andExpect(jsonPath("$._embedded.usagereports", Matchers.containsInAnyOrder( + UsageReportMatcher + .matchUsageReport(itemVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, TOTAL_VISITS_REPORT_ID, +
Arrays.asList(expectedPointTotalVisits)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, + this.getListOfVisitsPerMonthsPoints(1)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, Arrays.asList(expectedPointCity)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, Arrays.asList(expectedPointCountry)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID, + TOTAL_DOWNLOADS_REPORT_ID, new ArrayList<>())))); + } + + @Test + public void usageReportsSearch_ItemVisited_FilesVisited() throws Exception { + context.turnOffAuthorisationSystem(); + Bitstream bitstream1 = + BitstreamBuilder.createBitstream(context, itemVisited, toInputStream("test", UTF_8)).withName("bitstream1") + .build(); + Bitstream bitstream2 = + BitstreamBuilder.createBitstream(context, itemVisited, toInputStream("test", UTF_8)).withName("bitstream2") + .build(); + context.restoreAuthSystemState(); + + // ** WHEN ** + // We visit an item + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("item"); + viewEventRest.setTargetId(itemVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + // And its two files, second one twice + ViewEventRest viewEventRestBit1 = new ViewEventRest(); + viewEventRestBit1.setTargetType("bitstream"); + viewEventRestBit1.setTargetId(bitstream1.getID()); + ViewEventRest viewEventRestBit2 = new ViewEventRest(); + viewEventRestBit2.setTargetType("bitstream"); + viewEventRestBit2.setTargetId(bitstream2.getID()); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRestBit1)) + .contentType(contentType)) + .andExpect(status().isCreated()); + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRestBit2)) + .contentType(contentType)) + .andExpect(status().isCreated()); + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRestBit2)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisits = new UsageReportPointDsoTotalVisitsRest(); + expectedPointTotalVisits.addValue("views", 1); + expectedPointTotalVisits.setType("item"); + expectedPointTotalVisits.setId(itemVisited.getID().toString()); + + UsageReportPointCityRest expectedPointCity = new UsageReportPointCityRest(); + expectedPointCity.addValue("views", 1); + expectedPointCity.setId("New York"); + + UsageReportPointCountryRest expectedPointCountry = new UsageReportPointCountryRest(); + expectedPointCountry.addValue("views", 1); + expectedPointCountry.setId("US"); + expectedPointCountry.setLabel("United States"); + + List totalDownloadsPoints = new ArrayList<>(); + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisitsBit1 = new UsageReportPointDsoTotalVisitsRest(); + expectedPointTotalVisitsBit1.addValue("views", 1); + expectedPointTotalVisitsBit1.setId("bitstream1"); + expectedPointTotalVisitsBit1.setType("bitstream"); + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisitsBit2 = new UsageReportPointDsoTotalVisitsRest(); + 
expectedPointTotalVisitsBit2.addValue("views", 2); + expectedPointTotalVisitsBit2.setId("bitstream2"); + expectedPointTotalVisitsBit2.setType("bitstream"); + totalDownloadsPoints.add(expectedPointTotalVisitsBit1); + totalDownloadsPoints.add(expectedPointTotalVisitsBit2); + + // And request the item's usage reports + getClient() + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + itemVisited.getID())) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.usagereports", not(empty()))) + .andExpect(jsonPath("$._embedded.usagereports", Matchers.containsInAnyOrder( + UsageReportMatcher + .matchUsageReport(itemVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, TOTAL_VISITS_REPORT_ID, + Arrays.asList(expectedPointTotalVisits)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, + this.getListOfVisitsPerMonthsPoints(1)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID, Arrays.asList(expectedPointCity)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, Arrays.asList(expectedPointCountry)), + UsageReportMatcher.matchUsageReport(itemVisited.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID, + TOTAL_DOWNLOADS_REPORT_ID, totalDownloadsPoints)))); + } + + @Test + public void usageReportsSearch_Bitstream_Visited() throws Exception { + // ** WHEN ** + // We visit a bitstream + ViewEventRest viewEventRest = new ViewEventRest(); + viewEventRest.setTargetType("bitstream"); + viewEventRest.setTargetId(bitstreamVisited.getID()); + + ObjectMapper mapper = new ObjectMapper(); + + getClient().perform(post("/api/statistics/viewevents") + .content(mapper.writeValueAsBytes(viewEventRest)) + .contentType(contentType)) + .andExpect(status().isCreated()); + + UsageReportPointDsoTotalVisitsRest expectedPointTotalVisits = new UsageReportPointDsoTotalVisitsRest(); + expectedPointTotalVisits.addValue("views", 1); + expectedPointTotalVisits.setType("bitstream"); + expectedPointTotalVisits.setLabel("BitstreamVisitedName"); + expectedPointTotalVisits.setId(bitstreamVisited.getID().toString()); + + UsageReportPointCityRest expectedPointCity = new UsageReportPointCityRest(); + expectedPointCity.addValue("views", 1); + expectedPointCity.setId("New York"); + + UsageReportPointCountryRest expectedPointCountry = new UsageReportPointCountryRest(); + expectedPointCountry.addValue("views", 1); + expectedPointCountry.setId("US"); + expectedPointCountry.setLabel("United States"); + + // And request the bitstream's usage reports + getClient() + .perform(get("/api/statistics/usagereports/search/object?uri=http://localhost:8080/server/api/core" + + "/items/" + bitstreamVisited.getID())) + // ** THEN ** + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.usagereports", not(empty()))) + .andExpect(jsonPath("$._embedded.usagereports", Matchers.containsInAnyOrder( + UsageReportMatcher + .matchUsageReport(bitstreamVisited.getID() + "_" + TOTAL_VISITS_REPORT_ID, TOTAL_VISITS_REPORT_ID, + Arrays.asList(expectedPointTotalVisits)), + UsageReportMatcher.matchUsageReport(bitstreamVisited.getID() + "_" + TOTAL_VISITS_PER_MONTH_REPORT_ID, + TOTAL_VISITS_PER_MONTH_REPORT_ID, + this.getListOfVisitsPerMonthsPoints(1)), + UsageReportMatcher.matchUsageReport(bitstreamVisited.getID() + "_" + TOP_CITIES_REPORT_ID, + TOP_CITIES_REPORT_ID,
Arrays.asList(expectedPointCity)), + UsageReportMatcher.matchUsageReport(bitstreamVisited.getID() + "_" + TOP_COUNTRIES_REPORT_ID, + TOP_COUNTRIES_REPORT_ID, Arrays.asList(expectedPointCountry)), + UsageReportMatcher.matchUsageReport(bitstreamVisited.getID() + "_" + TOTAL_DOWNLOADS_REPORT_ID, + TOTAL_DOWNLOADS_REPORT_ID, Arrays.asList(expectedPointTotalVisits))))); + } + + // Create expected points from -6 months to now, with given number of views in current month + private List getListOfVisitsPerMonthsPoints(int viewsLastMonth) { + List expectedPoints = new ArrayList<>(); + int nrOfMonthsBack = 6; + Calendar cal = Calendar.getInstance(); + for (int i = 0; i <= nrOfMonthsBack; i++) { + UsageReportPointDateRest expectedPoint = new UsageReportPointDateRest(); + if (i > 0) { + expectedPoint.addValue("views", 0); + } else { + expectedPoint.addValue("views", viewsLastMonth); + } + String month = cal.getDisplayName(Calendar.MONTH, Calendar.LONG, Locale.getDefault()); + expectedPoint.setId(month + " " + cal.get(Calendar.YEAR)); + + expectedPoints.add(expectedPoint); + cal.add(Calendar.MONTH, -1); + } + return expectedPoints; + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubResourcePermissionsIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubResourcePermissionsIT.java index d4399a8047..c5f2ed9a13 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubResourcePermissionsIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubResourcePermissionsIT.java @@ -15,15 +15,15 @@ import java.io.InputStream; import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.BundleBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.BundleMatcher; import org.dspace.app.rest.matcher.CommunityMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Bundle; import org.dspace.content.Collection; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionDefinitionsControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionDefinitionsControllerIT.java index e52b70b7bd..e642336b1c 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionDefinitionsControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionDefinitionsControllerIT.java @@ -18,10 +18,10 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. 
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.matcher.SubmissionDefinitionsMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Collection; import org.hamcrest.Matchers; import org.junit.Test; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionFormsControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionFormsControllerIT.java index b17ea47c2d..66938e5991 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionFormsControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/SubmissionFormsControllerIT.java @@ -16,10 +16,21 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; +import java.util.Locale; + import org.dspace.app.rest.matcher.SubmissionFormFieldMatcher; +import org.dspace.app.rest.repository.SubmissionFormRestRepository; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.app.util.DCInputsReaderException; +import org.dspace.builder.EPersonBuilder; +import org.dspace.content.authority.DCInputAuthority; +import org.dspace.content.authority.service.ChoiceAuthorityService; +import org.dspace.core.service.PluginService; +import org.dspace.eperson.EPerson; +import org.dspace.services.ConfigurationService; import org.hamcrest.Matchers; import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; /** * Integration test to test the /api/config/submissionforms endpoint @@ -27,6 +38,15 @@ import org.junit.Test; */ public class SubmissionFormsControllerIT extends AbstractControllerIntegrationTest { + @Autowired + private ConfigurationService configurationService; + @Autowired + private SubmissionFormRestRepository submissionFormRestRepository; + @Autowired + private PluginService pluginService; + @Autowired + private ChoiceAuthorityService cas; + @Test public void findAll() throws Exception { //When we call the root endpoint as anonymous user @@ -43,15 +63,15 @@ public class SubmissionFormsControllerIT extends AbstractControllerIntegrationTe .andExpect(status().isOk()) //We expect the content type to be "application/hal+json;charset=UTF-8" .andExpect(content().contentType(contentType)) - //The configuration file for the test env includes 3 forms + //The configuration file for the test env includes 6 forms .andExpect(jsonPath("$.page.size", is(20))) - .andExpect(jsonPath("$.page.totalElements", equalTo(4))) + .andExpect(jsonPath("$.page.totalElements", equalTo(6))) .andExpect(jsonPath("$.page.totalPages", equalTo(1))) .andExpect(jsonPath("$.page.number", is(0))) .andExpect( jsonPath("$._links.self.href", Matchers.startsWith(REST_SERVER_URL + "config/submissionforms"))) - //The array of submissionforms should have a size of 3 - .andExpect(jsonPath("$._embedded.submissionforms", hasSize(equalTo(4)))) + //The array of submissionforms should have a size of 6 + .andExpect(jsonPath("$._embedded.submissionforms", hasSize(equalTo(6)))) ; } 
@@ -62,12 +82,12 @@ public class SubmissionFormsControllerIT extends AbstractControllerIntegrationTe .andExpect(status().isOk()) .andExpect(content().contentType(contentType)) .andExpect(jsonPath("$.page.size", is(20))) - .andExpect(jsonPath("$.page.totalElements", equalTo(4))) + .andExpect(jsonPath("$.page.totalElements", equalTo(6))) .andExpect(jsonPath("$.page.totalPages", equalTo(1))) .andExpect(jsonPath("$.page.number", is(0))) .andExpect(jsonPath("$._links.self.href", Matchers.startsWith(REST_SERVER_URL + "config/submissionforms"))) - .andExpect(jsonPath("$._embedded.submissionforms", hasSize(equalTo(4)))); + .andExpect(jsonPath("$._embedded.submissionforms", hasSize(equalTo(6)))); } @Test @@ -139,6 +159,121 @@ public class SubmissionFormsControllerIT extends AbstractControllerIntegrationTe "col-sm-8","dc.publisher")))); } + @Test + public void findFieldWithAuthorityConfig() throws Exception { + configurationService.setProperty("plugin.named.org.dspace.content.authority.ChoiceAuthority", + new String[] { + "org.dspace.content.authority.SolrAuthority = SolrAuthorAuthority", + "org.dspace.content.authority.SolrAuthority = SolrEditorAuthority", + "org.dspace.content.authority.SolrAuthority = SolrSubjectAuthority" + }); + + configurationService.setProperty("solr.authority.server", + "${solr.server}/authority"); + configurationService.setProperty("choices.plugin.dc.contributor.author", + "SolrAuthorAuthority"); + configurationService.setProperty("choices.presentation.dc.contributor.author", + "suggest"); + configurationService.setProperty("authority.controlled.dc.contributor.author", + "true"); + configurationService.setProperty("authority.author.indexer.field.1", + "dc.contributor.author"); + configurationService.setProperty("choices.plugin.dc.contributor.editor", + "SolrEditorAuthority"); + configurationService.setProperty("choices.presentation.dc.contributor.editor", + "authorLookup"); + configurationService.setProperty("authority.controlled.dc.contributor.editor", + "true"); + configurationService.setProperty("authority.author.indexer.field.2", + "dc.contributor.editor"); + configurationService.setProperty("choices.plugin.dc.subject", + "SolrSubjectAuthority"); + configurationService.setProperty("choices.presentation.dc.subject", + "lookup"); + configurationService.setProperty("authority.controlled.dc.subject", + "true"); + configurationService.setProperty("authority.author.indexer.field.3", + "dc.subject"); + + // These clears have to happen so that the config is actually reloaded in those classes. 
This is needed for + the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/config/submissionforms/sampleauthority")) + //The status has to be 200 OK + .andExpect(status().isOk()) + //We expect the content type to be "application/hal+json;charset=UTF-8" + .andExpect(content().contentType(contentType)) + //Check that the JSON root matches the expected "sampleauthority" input forms + .andExpect(jsonPath("$.id", is("sampleauthority"))) + .andExpect(jsonPath("$.name", is("sampleauthority"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/sampleauthority"))) + // our test configuration includes the dc.contributor.author, dc.contributor.editor and + // dc.subject fields in separate rows, all linked to an authority with different + // presentation modes (suggestion, name-lookup, lookup) + .andExpect(jsonPath("$.rows[0].fields", contains( + SubmissionFormFieldMatcher.matchFormFieldDefinition("onebox", "Author", + null, true, + "Author field that can be associated with an authority providing suggestion", + null, "dc.contributor.author", "SolrAuthorAuthority") + ))) + .andExpect(jsonPath("$.rows[1].fields", contains( + SubmissionFormFieldMatcher.matchFormFieldDefinition("lookup-name", "Editor", + null, false, + "Editor field that can be associated with an authority " + + "providing the special name lookup", + null, "dc.contributor.editor", "SolrEditorAuthority") + ))) + .andExpect(jsonPath("$.rows[2].fields", contains( + SubmissionFormFieldMatcher.matchFormFieldDefinition("lookup", "Subject", + null, true, + "Subject field that can be associated with an authority providing lookup", + null, "dc.subject", "SolrSubjectAuthority") + ))) + ; + // we need to force a reload of the config now to be able to reload also the cache of the other + // authority related services.
As this is needed just by this test method it is more efficient to do it + // here instead of forcing these reloads for every test method by extending the destroy method + configurationService.reloadConfig(); + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + } + + @Test + public void findFieldWithValuePairsConfig() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + + getClient(token).perform(get("/api/config/submissionforms/traditionalpageone")) + //The status has to be 200 OK + .andExpect(status().isOk()) + //We expect the content type to be "application/hal+json;charset=UTF-8" + .andExpect(content().contentType(contentType)) + //Check that the JSON root matches the expected "traditionalpageone" input forms + .andExpect(jsonPath("$.id", is("traditionalpageone"))) + .andExpect(jsonPath("$.name", is("traditionalpageone"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/traditionalpageone"))) + // our test configuration includes the dc.type field with a value pair in the 8th row + .andExpect(jsonPath("$.rows[7].fields", contains( + SubmissionFormFieldMatcher.matchFormFieldDefinition("dropdown", "Type", + null, true, + "Select the type(s) of content of the item. To select more than one value in the " + + "list, you may have to hold down the \"CTRL\" or \"Shift\" key.", + null, "dc.type", "common_types") + ))) + ; + } + + @Test + public void findOpenRelationshipConfig() throws Exception { + String token = getAuthToken(admin.getEmail(), password); @@ -181,8 +316,287 @@ public class SubmissionFormsControllerIT extends AbstractControllerIntegrationTe // check the first two rows .andExpect(jsonPath("$.rows[0].fields", contains( SubmissionFormFieldMatcher.matchFormClosedRelationshipFieldDefinition("Journal", null, - false,"Select the journal related to this volume.", "isVolumeOfJournal", + false,"Select the journal related to this volume.", "isJournalOfVolume", "creativework.publisher:somepublishername", "periodical", false)))) ; } + + @Test + public void languageSupportTest() throws Exception { + context.turnOffAuthorisationSystem(); + String[] supportedLanguage = {"it","uk"}; + configurationService.setProperty("default.locale","it"); + configurationService.setProperty("webui.supported.locales",supportedLanguage); + // These clears have to happen so that the config is actually reloaded in those classes.
This is needed for + // the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + Locale uk = new Locale("uk"); + Locale it = new Locale("it"); + + context.restoreAuthSystemState(); + + String tokenEperson = getAuthToken(eperson.getEmail(), password); + + // user select italian language + getClient(tokenEperson).perform(get("/api/config/submissionforms/languagetest").locale(it)) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Autore", "\u00C8" + " richiesto almeno un autore", true, + "Aggiungi un autore", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Titolo", + "\u00C8" + " necessario inserire un titolo principale per questo item", false, + "Inserisci titolo principale di questo item", "dc.title")))) + .andExpect(jsonPath("$.rows[2].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("dropdown", "Lingua", null, false, + "Selezionare la lingua del contenuto principale dell'item." + + " Se la lingua non compare nell'elenco, selezionare (Altro)." + + " Se il contenuto non ha davvero una lingua" + + " (ad esempio, se è un set di dati o un'immagine) selezionare (N/A)", + null, "dc.language.iso", "common_iso_languages")))); + + // user select ukranian language + getClient(tokenEperson).perform(get("/api/config/submissionforms/languagetest").locale(uk)) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Автор", "Потрібно ввести хочаб одного автора!", + true, "Додати автора", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Заголовок", + "Заговолок файла обов'язковий !", false, + "Ввести основний заголовок файла", "dc.title")))) + .andExpect(jsonPath("$.rows[2].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("dropdown", "Мова", null, false, + "Виберiть мову головного змiсту файлу, як що мови немає у списку, вибрати (Iнша)." 
+ + " Як що вмiст вайлу не є текстовим, наприклад є фотографiєю, тодi вибрати (N/A)", + null, "dc.language.iso", "common_iso_languages")))); + resetLocalesConfiguration(); + } + + @Test + public void preferLanguageTest() throws Exception { + context.turnOffAuthorisationSystem(); + + String[] supportedLanguage = {"it","uk"}; + configurationService.setProperty("default.locale","it"); + configurationService.setProperty("webui.supported.locales",supportedLanguage); + // These clears have to happen so that the config is actually reloaded in those classes. This is needed for + // the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + EPerson epersonIT = EPersonBuilder.createEPerson(context) + .withEmail("epersonIT@example.com") + .withPassword(password) + .withLanguage("it") + .build(); + + EPerson epersonUK = EPersonBuilder.createEPerson(context) + .withEmail("epersonUK@example.com") + .withPassword(password) + .withLanguage("uk") + .build(); + + context.restoreAuthSystemState(); + + String tokenEpersonIT = getAuthToken(epersonIT.getEmail(), password); + String tokenEpersonUK = getAuthToken(epersonUK.getEmail(), password); + + // user with italian prefer language + getClient(tokenEpersonIT).perform(get("/api/config/submissionforms/languagetest")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Autore", "\u00C8" + " richiesto almeno un autore", true, + "Aggiungi un autore", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Titolo", + "\u00C8" + " necessario inserire un titolo principale per questo item", false, + "Inserisci titolo principale di questo item", "dc.title")))) + .andExpect(jsonPath("$.rows[2].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("dropdown", "Lingua", null, false, + "Selezionare la lingua del contenuto principale dell'item." + + " Se la lingua non compare nell'elenco, selezionare (Altro)." 
+ + " Se il contenuto non ha davvero una lingua" + + " (ad esempio, se è un set di dati o un'immagine) selezionare (N/A)", + null, "dc.language.iso", "common_iso_languages")))); + + // user with ukranian prefer language + getClient(tokenEpersonUK).perform(get("/api/config/submissionforms/languagetest")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Автор", "Потрібно ввести хочаб одного автора!", + true, "Додати автора", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Заголовок", + "Заговолок файла обов'язковий !", false, + "Ввести основний заголовок файла", "dc.title")))) + .andExpect(jsonPath("$.rows[2].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("dropdown", "Мова", null, false, + "Виберiть мову головного змiсту файлу, як що мови немає у списку, вибрати (Iнша)." + + " Як що вмiст вайлу не є текстовим, наприклад є фотографiєю, тодi вибрати (N/A)", + null, "dc.language.iso", "common_iso_languages")))); + resetLocalesConfiguration(); + } + + @Test + public void userChoiceAnotherLanguageTest() throws Exception { + context.turnOffAuthorisationSystem(); + + String[] supportedLanguage = {"it","uk"}; + configurationService.setProperty("default.locale","it"); + configurationService.setProperty("webui.supported.locales",supportedLanguage); + // These clears have to happen so that the config is actually reloaded in those classes. 
This is needed for + // the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + Locale it = new Locale("it"); + + EPerson epersonUK = EPersonBuilder.createEPerson(context) + .withEmail("epersonUK@example.com") + .withPassword(password) + .withLanguage("uk") + .build(); + + context.restoreAuthSystemState(); + + String tokenEpersonUK = getAuthToken(epersonUK.getEmail(), password); + + // user prefer ukranian but choice italian language + getClient(tokenEpersonUK).perform(get("/api/config/submissionforms/languagetest").locale(it)) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Autore", "\u00C8" + " richiesto almeno un autore", true, + "Aggiungi un autore", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Titolo", + "\u00C8" + " necessario inserire un titolo principale per questo item", false, + "Inserisci titolo principale di questo item", "dc.title")))) + .andExpect(jsonPath("$.rows[2].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("dropdown", "Lingua", null, false, + "Selezionare la lingua del contenuto principale dell'item." + + " Se la lingua non compare nell'elenco, selezionare (Altro)." + + " Se il contenuto non ha davvero una lingua" + + " (ad esempio, se è un set di dati o un'immagine) selezionare (N/A)", + null, "dc.language.iso", "common_iso_languages")))); + resetLocalesConfiguration(); + } + + @Test + public void defaultLanguageTest() throws Exception { + context.turnOffAuthorisationSystem(); + + String[] supportedLanguage = {"it","uk"}; + configurationService.setProperty("default.locale","it"); + configurationService.setProperty("webui.supported.locales",supportedLanguage); + // These clears have to happen so that the config is actually reloaded in those classes. 
This is needed for + // the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + context.restoreAuthSystemState(); + + String tokenEperson = getAuthToken(eperson.getEmail(), password); + getClient(tokenEperson).perform(get("/api/config/submissionforms/languagetest")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Autore", "\u00C8 richiesto almeno un autore", true, + "Aggiungi un autore", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Titolo", + "\u00C8 necessario inserire un titolo principale per questo item", false, + "Inserisci titolo principale di questo item", "dc.title")))); + resetLocalesConfiguration(); + } + + @Test + public void supportLanguageUsingMultipleLocaleTest() throws Exception { + context.turnOffAuthorisationSystem(); + String[] supportedLanguage = {"it","uk","en"}; + configurationService.setProperty("default.locale","en"); + configurationService.setProperty("webui.supported.locales",supportedLanguage); + // These clears have to happen so that the config is actually reloaded in those classes. This is needed for + // the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + context.restoreAuthSystemState(); + + String tokenEperson = getAuthToken(eperson.getEmail(), password); + getClient(tokenEperson).perform(get("/api/config/submissionforms/languagetest") + .header("Accept-Language", "fr;q=1, it;q=0.9")) + .andExpect(status().isOk()) + .andExpect(content().contentType(contentType)) + .andExpect(jsonPath("$.id", is("languagetest"))) + .andExpect(jsonPath("$.name", is("languagetest"))) + .andExpect(jsonPath("$.type", is("submissionform"))) + .andExpect(jsonPath("$._links.self.href", Matchers + .startsWith(REST_SERVER_URL + "config/submissionforms/languagetest"))) + .andExpect(jsonPath("$.rows[0].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("name", "Autore", "\u00C8 richiesto almeno un autore", true, + "Aggiungi un autore", "dc.contributor.author")))) + .andExpect(jsonPath("$.rows[1].fields", contains(SubmissionFormFieldMatcher + .matchFormFieldDefinition("onebox", "Titolo", + "\u00C8 necessario inserire un titolo principale per questo item", false, + "Inserisci titolo principale di questo item", "dc.title")))); + + resetLocalesConfiguration(); + } + + private void resetLocalesConfiguration() throws DCInputsReaderException { + configurationService.setProperty("default.locale","en"); + configurationService.setProperty("webui.supported.locales",null); + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/TaskRestRepositoriesIT.java 
b/dspace-server-webapp/src/test/java/org/dspace/app/rest/TaskRestRepositoriesIT.java index 8a8c567293..a7813601ec 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/TaskRestRepositoriesIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/TaskRestRepositoriesIT.java @@ -25,12 +25,6 @@ import java.util.List; import java.util.Map; import java.util.concurrent.atomic.AtomicReference; -import org.dspace.app.rest.builder.ClaimedTaskBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.PoolTaskBuilder; -import org.dspace.app.rest.builder.WorkflowItemBuilder; import org.dspace.app.rest.matcher.ClaimedTaskMatcher; import org.dspace.app.rest.matcher.EPersonMatcher; import org.dspace.app.rest.matcher.PoolTaskMatcher; @@ -41,6 +35,12 @@ import org.dspace.app.rest.matcher.WorkspaceItemMatcher; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.ReplaceOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.ClaimedTaskBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.PoolTaskBuilder; +import org.dspace.builder.WorkflowItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -339,7 +339,6 @@ public class TaskRestRepositoriesIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/workflow/pooltasks"))) .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", is(3))); - ; String authReviewer2 = getAuthToken(reviewer2.getEmail(), password); getClient(authReviewer2).perform(get("/api/workflow/pooltasks/search/findByUser") @@ -360,7 +359,6 @@ public class TaskRestRepositoriesIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$._links.self.href", Matchers.containsString("/api/workflow/pooltasks"))) .andExpect(jsonPath("$.page.size", is(20))) .andExpect(jsonPath("$.page.totalElements", is(2))); - ; String authAdmin = getAuthToken(admin.getEmail(), password); getClient(authAdmin).perform(get("/api/workflow/pooltasks/search/findByUser") diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/UUIDLookupRestControllerIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/UUIDLookupRestControllerIT.java index d8cad3117a..8a6debce3e 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/UUIDLookupRestControllerIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/UUIDLookupRestControllerIT.java @@ -16,13 +16,13 @@ import java.util.UUID; import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.SiteBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import 
org.dspace.builder.SiteBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/UriListParsingIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/UriListParsingIT.java index 108859381e..91a1572e5a 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/UriListParsingIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/UriListParsingIT.java @@ -13,12 +13,12 @@ import static org.junit.Assert.assertTrue; import java.util.List; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.utils.ContextUtil; import org.dspace.app.rest.utils.Utils; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.DSpaceObject; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionHistoryRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionHistoryRestRepositoryIT.java index 584b099e9e..60fdb11c97 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionHistoryRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionHistoryRestRepositoryIT.java @@ -17,13 +17,13 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. import java.sql.SQLException; import java.util.Date; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.VersionHistoryMatcher; import org.dspace.app.rest.matcher.VersionMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.authorize.AuthorizeException; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionRestRepositoryIT.java index 1a64454dc6..a6087c58c2 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VersionRestRepositoryIT.java @@ -11,12 +11,12 @@ import static org.springframework.test.web.servlet.request.MockMvcRequestBuilder import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; import org.dspace.app.rest.matcher.ItemMatcher; import org.dspace.app.rest.matcher.VersionMatcher; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import 
org.dspace.content.Community; import org.dspace.content.Item; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ViewEventRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ViewEventRestRepositoryIT.java index a84b2138a1..5683bd30a8 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/ViewEventRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/ViewEventRestRepositoryIT.java @@ -17,13 +17,13 @@ import java.util.UUID; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.codec.CharEncoding; import org.apache.commons.io.IOUtils; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.SiteBuilder; import org.dspace.app.rest.model.ViewEventRest; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.SiteBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/VocabularyEntryDetailsIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VocabularyEntryDetailsIT.java new file mode 100644 index 0000000000..e5eab1aa98 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VocabularyEntryDetailsIT.java @@ -0,0 +1,442 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; +import static org.hamcrest.Matchers.endsWith; +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.util.UUID; + +import org.dspace.app.rest.matcher.VocabularyEntryDetailsMatcher; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.hamcrest.Matcher; +import org.hamcrest.Matchers; +import org.junit.Test; + +/** + * + * + * @author Mykhaylo Boychuk (4science.it) + */ +public class VocabularyEntryDetailsIT extends AbstractControllerIntegrationTest { + @Test + public void discoverableNestedLinkTest() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + getClient(token).perform(get("/api")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._links",Matchers.allOf( + hasJsonPath("$.vocabularyEntryDetails.href", + is("http://localhost/api/submission/vocabularyEntryDetails")), + hasJsonPath("$.vocabularyEntryDetails-search.href", + is("http://localhost/api/submission/vocabularyEntryDetails/search")) + ))); + } + + @Test + public void findAllTest() throws Exception { + String authToken = getAuthToken(admin.getEmail(), password); + getClient(authToken).perform(get("/api/submission/vocabularyEntryDetails")) + .andExpect(status() + .isMethodNotAllowed()); + } + + @Test + 
public void findOneTest() throws Exception { + String idAuthority = "srsc:SCB110"; + String token = getAuthToken(eperson.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/" + idAuthority)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB110", "Religion/Theology", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology"))) + .andExpect(jsonPath("$.selectable", is(true))) + .andExpect(jsonPath("$.otherInformation.id", is("SCB110"))) + .andExpect(jsonPath("$.otherInformation.note", is("Religionsvetenskap/Teologi"))) + .andExpect(jsonPath("$.otherInformation.parent", + is("Research Subject Categories::HUMANITIES and RELIGION"))) + .andExpect(jsonPath("$._links.parent.href", + endsWith("api/submission/vocabularyEntryDetails/srsc:SCB110/parent"))) + .andExpect(jsonPath("$._links.children.href", + endsWith("api/submission/vocabularyEntryDetails/srsc:SCB110/children"))); + } + + @Test + public void findOneUnauthorizedTest() throws Exception { + String idAuthority = "srsc:SCB110"; + getClient().perform(get("/api/submission/vocabularyEntryDetails/" + idAuthority)) + .andExpect(status().isUnauthorized()); + } + + @Test + public void findOneNotFoundTest() throws Exception { + String idAuthority = "srsc:not-existing"; + String token = getAuthToken(eperson.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/" + idAuthority)) + .andExpect(status().isNotFound()); + + // try with a special id missing only the entry-id part + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/srsc:")) + .andExpect(status().isNotFound()); + + // try to retrieve the xml root that is not a entry itself + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/srsc:ResearchSubjectCategories")) + .andExpect(status().isNotFound()); + + } + + @Test + public void srscSearchTopTest() throws Exception { + String tokenAdmin = getAuthToken(admin.getEmail(), password); + String tokenEPerson = getAuthToken(eperson.getEmail(), password); + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.vocabularyEntryDetails", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB11", "HUMANITIES and RELIGION", + "Research Subject Categories::HUMANITIES and RELIGION"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB12", "LAW/JURISPRUDENCE", + "Research Subject Categories::LAW/JURISPRUDENCE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB13", "SOCIAL SCIENCES", + "Research Subject Categories::SOCIAL SCIENCES"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB14", "MATHEMATICS", + "Research Subject Categories::MATHEMATICS"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB15", "NATURAL SCIENCES", + "Research Subject Categories::NATURAL SCIENCES"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB16", "TECHNOLOGY", + "Research Subject Categories::TECHNOLOGY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB17", + "FORESTRY, AGRICULTURAL SCIENCES and LANDSCAPE PLANNING", + "Research Subject Categories::FORESTRY, AGRICULTURAL SCIENCES and LANDSCAPE PLANNING"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB18", "MEDICINE", + "Research Subject Categories::MEDICINE"), + 
VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB19", "ODONTOLOGY", + "Research Subject Categories::ODONTOLOGY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB21", "PHARMACY", + "Research Subject Categories::PHARMACY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB22", "VETERINARY MEDICINE", + "Research Subject Categories::VETERINARY MEDICINE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB23", "INTERDISCIPLINARY RESEARCH AREAS", + "Research Subject Categories::INTERDISCIPLINARY RESEARCH AREAS") + ))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(12))); + + getClient(tokenEPerson).perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.vocabularyEntryDetails", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB11", "HUMANITIES and RELIGION", + "Research Subject Categories::HUMANITIES and RELIGION"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB12", "LAW/JURISPRUDENCE", + "Research Subject Categories::LAW/JURISPRUDENCE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB13", "SOCIAL SCIENCES", + "Research Subject Categories::SOCIAL SCIENCES"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB14", "MATHEMATICS", + "Research Subject Categories::MATHEMATICS"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB15", "NATURAL SCIENCES", + "Research Subject Categories::NATURAL SCIENCES"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB16", "TECHNOLOGY", + "Research Subject Categories::TECHNOLOGY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB17", + "FORESTRY, AGRICULTURAL SCIENCES and LANDSCAPE PLANNING", + "Research Subject Categories::FORESTRY, AGRICULTURAL SCIENCES and LANDSCAPE PLANNING"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB18", "MEDICINE", + "Research Subject Categories::MEDICINE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB19", "ODONTOLOGY", + "Research Subject Categories::ODONTOLOGY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB21", "PHARMACY", + "Research Subject Categories::PHARMACY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB22", "VETERINARY MEDICINE", + "Research Subject Categories::VETERINARY MEDICINE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB23", "INTERDISCIPLINARY RESEARCH AREAS", + "Research Subject Categories::INTERDISCIPLINARY RESEARCH AREAS") + ))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(12))); + } + + @Test + public void srscSearchFirstLevel_MATHEMATICS_Test() throws Exception { + String tokenAdmin = getAuthToken(admin.getEmail(), password); + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB14/children")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.children", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB1401", + "Algebra, geometry and mathematical analysis", + "Research Subject Categories::MATHEMATICS::Algebra, geometry and mathematical analysis"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB1402", "Applied mathematics", + "Research Subject Categories::MATHEMATICS::Applied mathematics"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB1409", "Other mathematics", + "Research Subject Categories::MATHEMATICS::Other mathematics") + ))) + 
.andExpect(jsonPath("$._embedded.children[*].otherInformation.parent", + Matchers.everyItem(is("Research Subject Categories::MATHEMATICS")))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(3))); + } + + @Test + public void srscSearchTopPaginationTest() throws Exception { + String tokenAdmin = getAuthToken(admin.getEmail(), password); + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc") + .param("page", "0") + .param("size", "5")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.vocabularyEntryDetails", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB11", "HUMANITIES and RELIGION", + "Research Subject Categories::HUMANITIES and RELIGION"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB12", "LAW/JURISPRUDENCE", + "Research Subject Categories::LAW/JURISPRUDENCE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB13", "SOCIAL SCIENCES", + "Research Subject Categories::SOCIAL SCIENCES"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB14", "MATHEMATICS", + "Research Subject Categories::MATHEMATICS"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB15", "NATURAL SCIENCES", + "Research Subject Categories::NATURAL SCIENCES") + ))) + .andExpect(jsonPath("$.page.totalElements", is(12))) + .andExpect(jsonPath("$.page.totalPages", is(3))) + .andExpect(jsonPath("$.page.number", is(0))); + + //second page + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc") + .param("page", "1") + .param("size", "5")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.vocabularyEntryDetails", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB16", "TECHNOLOGY", + "Research Subject Categories::TECHNOLOGY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB17", + "FORESTRY, AGRICULTURAL SCIENCES and LANDSCAPE PLANNING", + "Research Subject Categories::FORESTRY, AGRICULTURAL SCIENCES and LANDSCAPE PLANNING"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB18", "MEDICINE", + "Research Subject Categories::MEDICINE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB19", "ODONTOLOGY", + "Research Subject Categories::ODONTOLOGY"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB21", "PHARMACY", + "Research Subject Categories::PHARMACY") + ))) + .andExpect(jsonPath("$.page.totalElements", is(12))) + .andExpect(jsonPath("$.page.totalPages", is(3))) + .andExpect(jsonPath("$.page.number", is(1))); + + // third page + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc") + .param("page", "2") + .param("size", "5")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.vocabularyEntryDetails", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB22", "VETERINARY MEDICINE", + "Research Subject Categories::VETERINARY MEDICINE"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB23", "INTERDISCIPLINARY RESEARCH AREAS", + "Research Subject Categories::INTERDISCIPLINARY RESEARCH AREAS") + ))) + .andExpect(jsonPath("$.page.totalElements", is(12))) + .andExpect(jsonPath("$.page.totalPages", is(3))) + .andExpect(jsonPath("$.page.number", is(2))); + } + + @Test + public void searchTopBadRequestTest() throws Exception { + String tokenAdmin = 
getAuthToken(admin.getEmail(), password); + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", UUID.randomUUID().toString())) + .andExpect(status().isBadRequest()); + } + + @Test + public void searchTopUnauthorizedTest() throws Exception { + getClient().perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc:SCB16")) + .andExpect(status().isUnauthorized()); + } + + @Test + public void srscSearchByParentFirstLevelPaginationTest() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + // first page + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB14/children") + .param("page", "0") + .param("size", "2")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.children", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB1401", + "Algebra, geometry and mathematical analysis", + "Research Subject Categories::MATHEMATICS::Algebra, geometry and mathematical analysis"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB1402", "Applied mathematics", + "Research Subject Categories::MATHEMATICS::Applied mathematics") + ))) + .andExpect(jsonPath("$.page.totalElements", is(3))) + .andExpect(jsonPath("$.page.totalPages", is(2))) + .andExpect(jsonPath("$.page.number", is(0))); + + // second page + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB14/children") + .param("page", "1") + .param("size", "2")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.children", Matchers.contains( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:SCB1409", "Other mathematics", + "Research Subject Categories::MATHEMATICS::Other mathematics") + ))) + .andExpect(jsonPath("$.page.totalElements", is(3))) + .andExpect(jsonPath("$.page.totalPages", is(2))) + .andExpect(jsonPath("$.page.number", is(1))); + } + + @Test + public void srscSearchByParentSecondLevel_Applied_mathematics_Test() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB1402/children")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.children", Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR140202", "Numerical analysis", + "Research Subject Categories::MATHEMATICS::Applied mathematics::Numerical analysis"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR140203", "Mathematical statistics", + "Research Subject Categories::MATHEMATICS::Applied mathematics::Mathematical statistics"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR140204", "Optimization, systems theory", + "Research Subject Categories::MATHEMATICS::Applied mathematics::Optimization, systems theory"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR140205", "Theoretical computer science", + "Research Subject Categories::MATHEMATICS::Applied mathematics::Theoretical computer science") + ))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(4))); + } + + @Test + public void srscSearchByParentEmptyTest() throws Exception { + String tokenAdmin = getAuthToken(admin.getEmail(), password); + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/srsc:VR140202/children")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(0))); + } + + @Test + public void 
srscSearchByParentWrongIdTest() throws Exception { + String tokenAdmin = getAuthToken(admin.getEmail(), password); + getClient(tokenAdmin).perform(get("/api/submission/vocabularyEntryDetails/" + + UUID.randomUUID() + "/children")) + .andExpect(status().isBadRequest()); + } + + @Test + public void srscSearchTopUnauthorizedTest() throws Exception { + getClient().perform(get("/api/submission/vocabularyEntryDetails/search/top") + .param("vocabulary", "srsc")) + .andExpect(status().isUnauthorized()); + } + + @Test + public void findParentByChildTest() throws Exception { + String tokenEperson = getAuthToken(eperson.getEmail(), password); + getClient(tokenEperson).perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB180/parent")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", is( + VocabularyEntryDetailsMatcher.matchAuthorityEntry( + "srsc:SCB18", "MEDICINE","Research Subject Categories::MEDICINE") + ))); + } + + @Test + public void findParentByChildBadRequestTest() throws Exception { + String tokenEperson = getAuthToken(eperson.getEmail(), password); + getClient(tokenEperson).perform(get("/api/submission/vocabularyEntryDetails/" + UUID.randomUUID() + "/parent")) + .andExpect(status().isBadRequest()); + } + + @Test + public void findParentByChildUnauthorizedTest() throws Exception { + getClient().perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB180/parent")) + .andExpect(status().isUnauthorized()); + } + + @Test + public void findParentTopTest() throws Exception { + String tokenEperson = getAuthToken(eperson.getEmail(), password); + getClient(tokenEperson) + .perform(get("/api/submission/vocabularyEntryDetails/srsc:SCB11/parent")) + .andExpect(status().isNoContent()); + } + + @Test + public void srscProjectionTest() throws Exception { + String tokenAdmin = getAuthToken(admin.getEmail(), password); + getClient(tokenAdmin).perform( + get("/api/submission/vocabularyEntryDetails/srsc:SCB110").param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.parent", + VocabularyEntryDetailsMatcher.matchAuthorityEntry( + "srsc:SCB11", "HUMANITIES and RELIGION", + "Research Subject Categories::HUMANITIES and RELIGION"))) + .andExpect(jsonPath("$._embedded.children._embedded.children", matchAllSrscSC110Children())) + .andExpect(jsonPath("$._embedded.children._embedded.children[*].otherInformation.parent", + Matchers.everyItem( + is("Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology")))); + + getClient(tokenAdmin).perform( + get("/api/submission/vocabularyEntryDetails/srsc:SCB110").param("embed", "children")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.children._embedded.children", matchAllSrscSC110Children())) + .andExpect(jsonPath("$._embedded.children._embedded.children[*].otherInformation.parent", + Matchers.everyItem( + is("Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology")))); + + getClient(tokenAdmin).perform( + get("/api/submission/vocabularyEntryDetails/srsc:SCB110").param("embed", "parent")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.parent", + VocabularyEntryDetailsMatcher.matchAuthorityEntry( + "srsc:SCB11", "HUMANITIES and RELIGION", + "Research Subject Categories::HUMANITIES and RELIGION"))); + } + + private Matcher> matchAllSrscSC110Children() { + return Matchers.containsInAnyOrder( + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110102", + "History of religion", + "Research Subject Categories::HUMANITIES and 
RELIGION::Religion/Theology::History of religion"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110103", + "Church studies", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Church studies"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110104", + "Missionary studies", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Missionary studies"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110105", + "Systematic theology", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Systematic theology"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110106", + "Islamology", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Islamology"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110107", + "Faith and reason", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Faith and reason"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110108", + "Sociology of religion", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Sociology of religion"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110109", + "Psychology of religion", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Psychology of religion"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110110", + "Philosophy of religion", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Philosophy of religion"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110111", + "New Testament exegesis", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::New Testament exegesis"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110112", + "Old Testament exegesis", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Old Testament exegesis"), + VocabularyEntryDetailsMatcher.matchAuthorityEntry("srsc:VR110113", + "Dogmatics with symbolics", + "Research Subject Categories::HUMANITIES and RELIGION::Religion/Theology::Dogmatics with symbolics") + ); + } + +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/VocabularyRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VocabularyRestRepositoryIT.java new file mode 100644 index 0000000000..738a334b82 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/VocabularyRestRepositoryIT.java @@ -0,0 +1,394 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest; + +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.util.Date; +import java.util.Locale; +import java.util.UUID; + +import org.dspace.app.rest.matcher.VocabularyMatcher; +import org.dspace.app.rest.repository.SubmissionFormRestRepository; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.authority.PersonAuthorityValue; +import org.dspace.authority.factory.AuthorityServiceFactory; +import org.dspace.builder.CollectionBuilder; 
+import org.dspace.builder.CommunityBuilder; +import org.dspace.content.Collection; +import org.dspace.content.authority.DCInputAuthority; +import org.dspace.content.authority.service.ChoiceAuthorityService; +import org.dspace.core.service.PluginService; +import org.dspace.services.ConfigurationService; +import org.hamcrest.Matchers; +import org.junit.After; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * This class handles all Authority related IT. It alters some config to run the tests, but it gets cleared again + * after every test + */ +public class VocabularyRestRepositoryIT extends AbstractControllerIntegrationTest { + + @Autowired + ConfigurationService configurationService; + + @Autowired + private SubmissionFormRestRepository submissionFormRestRepository; + + @Autowired + private PluginService pluginService; + + @Autowired + private ChoiceAuthorityService cas; + + @Before + public void setup() throws Exception { + super.setUp(); + configurationService.setProperty("plugin.named.org.dspace.content.authority.ChoiceAuthority", + "org.dspace.content.authority.SolrAuthority = SolrAuthorAuthority"); + + configurationService.setProperty("solr.authority.server", + "${solr.server}/authority"); + configurationService.setProperty("choices.plugin.dc.contributor.author", + "SolrAuthorAuthority"); + configurationService.setProperty("choices.presentation.dc.contributor.author", + "authorLookup"); + configurationService.setProperty("authority.controlled.dc.contributor.author", + "true"); + + configurationService.setProperty("authority.author.indexer.field.1", + "dc.contributor.author"); + + // These clears have to happen so that the config is actually reloaded in those classes. This is needed for + // the properties that we're altering above and this is only used within the tests + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + context.turnOffAuthorisationSystem(); + parentCommunity = CommunityBuilder.createCommunity(context).withName("A parent community for all our test") + .build(); + context.restoreAuthSystemState(); + PersonAuthorityValue person1 = new PersonAuthorityValue(); + person1.setId(String.valueOf(UUID.randomUUID())); + person1.setLastName("Shirasaka"); + person1.setFirstName("Seiko"); + person1.setValue("Shirasaka, Seiko"); + person1.setField("dc_contributor_author"); + person1.setLastModified(new Date()); + person1.setCreationDate(new Date()); + AuthorityServiceFactory.getInstance().getAuthorityIndexingService().indexContent(person1); + + PersonAuthorityValue person2 = new PersonAuthorityValue(); + person2.setId(String.valueOf(UUID.randomUUID())); + person2.setLastName("Miller"); + person2.setFirstName("Tyler E"); + person2.setValue("Miller, Tyler E"); + person2.setField("dc_contributor_author"); + person2.setLastModified(new Date()); + person2.setCreationDate(new Date()); + AuthorityServiceFactory.getInstance().getAuthorityIndexingService().indexContent(person2); + + AuthorityServiceFactory.getInstance().getAuthorityIndexingService().commit(); + } + + @Override + @After + // We need to cleanup the authorities cache once than the configuration has been restored + public void destroy() throws Exception { + super.destroy(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + } + + @Test + public void findAllTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + 
getClient(token).perform(get("/api/submission/vocabularies")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.vocabularies", Matchers.containsInAnyOrder( + VocabularyMatcher.matchProperties("srsc", "srsc", false, true), + VocabularyMatcher.matchProperties("common_types", "common_types", true, false), + VocabularyMatcher.matchProperties("common_iso_languages", "common_iso_languages", true , false), + VocabularyMatcher.matchProperties("SolrAuthorAuthority", "SolrAuthorAuthority", false , false) + ))) + .andExpect(jsonPath("$._links.self.href", + Matchers.containsString("api/submission/vocabularies"))) + .andExpect(jsonPath("$.page.totalElements", is(4))); + } + + @Test + public void findOneSRSC_Test() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/srsc")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", is( + VocabularyMatcher.matchProperties("srsc", "srsc", false, true) + ))); + } + + @Test + public void findOneCommonTypesTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/common_types")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", is( + VocabularyMatcher.matchProperties("common_types", "common_types", true, false) + ))); + } + + @Test + public void correctSrscQueryTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform( + get("/api/submission/vocabularies/srsc/entries") + .param("filter", "Research") + .param("size", "2")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.containsInAnyOrder( + VocabularyMatcher.matchVocabularyEntry("Research Subject Categories", + "Research Subject Categories", "vocabularyEntry"), + VocabularyMatcher.matchVocabularyEntry("Family research", + "Research Subject Categories::SOCIAL SCIENCES::Social sciences::Social work::Family research", + "vocabularyEntry")))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(26))) + .andExpect(jsonPath("$.page.totalPages", Matchers.is(13))) + .andExpect(jsonPath("$.page.size", Matchers.is(2))); + } + + @Test + public void notScrollableVocabularyRequiredQueryTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/srsc/entries")) + .andExpect(status().isUnprocessableEntity()); + } + + @Test + public void noResultsSrscQueryTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform( + get("/api/submission/vocabularies/srsc/entries") + .param("metadata", "dc.subject") + .param("filter", "Research2") + .param("size", "1000")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(0))); + } + + @Test + public void vocabularyEntriesCommonTypesWithPaginationTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token) + .perform(get("/api/submission/vocabularies/common_types/entries").param("size", "2")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.containsInAnyOrder( + VocabularyMatcher.matchVocabularyEntry("Animation", "Animation", "vocabularyEntry"), + VocabularyMatcher.matchVocabularyEntry("Article", "Article", "vocabularyEntry") + ))) + .andExpect(jsonPath("$._embedded.entries[*].authority").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", 
Matchers.is(22))) + .andExpect(jsonPath("$.page.totalPages", Matchers.is(11))) + .andExpect(jsonPath("$.page.size", Matchers.is(2))); + + //second page + getClient(token).perform(get("/api/submission/vocabularies/common_types/entries") + .param("size", "2") + .param("page", "1")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.containsInAnyOrder( + VocabularyMatcher.matchVocabularyEntry("Book", "Book", "vocabularyEntry"), + VocabularyMatcher.matchVocabularyEntry("Book chapter", "Book chapter", "vocabularyEntry") + ))) + .andExpect(jsonPath("$._embedded.entries[*].authority").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(22))) + .andExpect(jsonPath("$.page.totalPages", Matchers.is(11))) + .andExpect(jsonPath("$.page.size", Matchers.is(2))); + } + + @Test + public void vocabularyEntriesCommon_typesWithQueryTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/common_types/entries") + .param("filter", "Book") + .param("size", "2")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.containsInAnyOrder( + VocabularyMatcher.matchVocabularyEntry("Book", "Book", "vocabularyEntry"), + VocabularyMatcher.matchVocabularyEntry("Book chapter", "Book chapter", "vocabularyEntry") + ))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(2))) + .andExpect(jsonPath("$.page.totalPages", Matchers.is(1))) + .andExpect(jsonPath("$.page.size", Matchers.is(2))); + } + + @Test + public void vocabularyEntriesDCInputAuthorityLocalesTest() throws Exception { + String[] supportedLanguage = {"it","uk"}; + configurationService.setProperty("default.locale","it"); + configurationService.setProperty("webui.supported.locales",supportedLanguage); + // These clears have to happen so that the config is actually reloaded in those classes. 
This is needed for + // the properties that we're altering above and this is only used within the tests + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + + Locale uk = new Locale("uk"); + Locale it = new Locale("it"); + + String token = getAuthToken(admin.getEmail(), password); + getClient(token) + .perform(get("/api/submission/vocabularies/common_iso_languages/entries") + .param("size", "2") + .locale(it)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.containsInAnyOrder( + VocabularyMatcher.matchVocabularyEntry("N/A", "", "vocabularyEntry"), + VocabularyMatcher.matchVocabularyEntry("Inglese (USA)", "en_US", "vocabularyEntry") + ))) + .andExpect(jsonPath("$._embedded.entries[*].authority").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(12))) + .andExpect(jsonPath("$.page.totalPages", Matchers.is(6))) + .andExpect(jsonPath("$.page.size", Matchers.is(2))); + + getClient(token) + .perform(get("/api/submission/vocabularies/common_iso_languages/entries") + .param("size", "2") + .locale(uk)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.containsInAnyOrder( + VocabularyMatcher.matchVocabularyEntry("N/A", "", "vocabularyEntry"), + VocabularyMatcher.matchVocabularyEntry("Американська (USA)", "en_US", "vocabularyEntry") + ))) + .andExpect(jsonPath("$._embedded.entries[*].authority").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(12))) + .andExpect(jsonPath("$.page.totalPages", Matchers.is(6))) + .andExpect(jsonPath("$.page.size", Matchers.is(2))); + + configurationService.setProperty("default.locale","en"); + configurationService.setProperty("webui.supported.locales",null); + submissionFormRestRepository.reload(); + DCInputAuthority.reset(); + pluginService.clearNamedPluginClasses(); + cas.clearCache(); + } + + + @Test + public void correctSolrQueryTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform( + get("/api/submission/vocabularies/SolrAuthorAuthority/entries") + .param("filter", "Shirasaka") + .param("size", "1000")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.contains( + VocabularyMatcher.matchVocabularyEntry("Shirasaka, Seiko", "Shirasaka, Seiko", "vocabularyEntry") + ))) + .andExpect(jsonPath("$._embedded.entries[0].authority").isNotEmpty()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))); + } + + @Test + public void noResultsSolrQueryTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform( + get("/api/submission/vocabularies/SolrAuthorAuthority/entries") + .param("filter", "Smith") + .param("size", "1000")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(0))); + } + + @Test + public void findByMetadataAndCollectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + Collection collection = CollectionBuilder.createCollection(context, parentCommunity) + .withName("Test collection") + .build(); + context.restoreAuthSystemState(); + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/search/byMetadataAndCollection") + .param("metadata", "dc.type") + .param("collection", collection.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", is( + 
VocabularyMatcher.matchProperties("common_types", "common_types", true, false) + ))); + } + + @Test + public void findByMetadataAndCollectionUnprocessableEntityTest() throws Exception { + context.turnOffAuthorisationSystem(); + Collection collection = CollectionBuilder.createCollection(context, parentCommunity) + .withName("Test collection") + .build(); + context.restoreAuthSystemState(); + + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/search/byMetadataAndCollection") + .param("metadata", "dc.not.exist") + .param("collection", collection.getID().toString())) + .andExpect(status().isUnprocessableEntity()); + + getClient(token).perform(get("/api/submission/vocabularies/search/byMetadataAndCollection") + .param("metadata", "dc.type") + .param("collection", UUID.randomUUID().toString())) + .andExpect(status().isUnprocessableEntity()); + } + + @Test + public void findByMetadataAndCollectionBadRequestTest() throws Exception { + context.turnOffAuthorisationSystem(); + + Collection collection = CollectionBuilder.createCollection(context, parentCommunity) + .withName("Test collection") + .build(); + context.restoreAuthSystemState(); + + String token = getAuthToken(admin.getEmail(), password); + //missing metadata + getClient(token).perform(get("/api/submission/vocabularies/search/byMetadataAndCollection") + .param("collection", collection.getID().toString())) + .andExpect(status().isBadRequest()); + + //missing collection + getClient(token).perform(get("/api/submission/vocabularies/search/byMetadataAndCollection") + .param("metadata", "dc.type")) + .andExpect(status().isBadRequest()); + } + + @Test + public void linkedEntitiesWithExactParamTest() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/common_types/entries") + .param("filter", "Animation") + .param("exact", "true")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.entries", Matchers.contains( + VocabularyMatcher.matchVocabularyEntry("Animation", "Animation", "vocabularyEntry") + ))) + .andExpect(jsonPath("$.page.totalElements", Matchers.is(1))); + } + + @Test + public void linkedEntitiesWithFilterAndEntryIdTest() throws Exception { + String token = getAuthToken(eperson.getEmail(), password); + getClient(token).perform(get("/api/submission/vocabularies/srsc/entries") + .param("filter", "Research") + .param("entryID", "VR131402")) + .andExpect(status().isBadRequest()); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowDefinitionRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowDefinitionRestRepositoryIT.java index 4a877825b9..3f7ae74000 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowDefinitionRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowDefinitionRestRepositoryIT.java @@ -19,12 +19,12 @@ import java.util.List; import java.util.UUID; import org.apache.commons.lang3.StringUtils; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.matcher.WorkflowDefinitionMatcher; import org.dspace.app.rest.model.WorkflowDefinitionRest; import org.dspace.app.rest.repository.WorkflowDefinitionRestRepository; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import 
org.dspace.builder.CommunityBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.xmlworkflow.factory.XmlWorkflowFactory; @@ -41,7 +41,8 @@ import org.junit.Test; */ public class WorkflowDefinitionRestRepositoryIT extends AbstractControllerIntegrationTest { - private XmlWorkflowFactory xmlWorkflowFactory = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory(); + private final XmlWorkflowFactory xmlWorkflowFactory + = XmlWorkflowServiceFactory.getInstance().getWorkflowFactory(); private static final String WORKFLOW_DEFINITIONS_ENDPOINT = "/api/" + WorkflowDefinitionRest.CATEGORY + "/" + WorkflowDefinitionRest.NAME_PLURAL; @@ -240,7 +241,7 @@ public class WorkflowDefinitionRestRepositoryIT extends AbstractControllerIntegr getClient(token).perform(get(WORKFLOW_DEFINITIONS_ENDPOINT + "/search/findByCollection?uuid=" + nonValidUUID)) //We expect a 400 Illegal Argument Exception (Bad Request) cannot convert UUID .andExpect(status().isBadRequest()) - .andExpect(status().reason(containsString("Failed to convert " + nonValidUUID))); + .andExpect(status().reason(containsString("A required parameter is invalid"))); } @Test @@ -348,7 +349,7 @@ public class WorkflowDefinitionRestRepositoryIT extends AbstractControllerIntegr if (StringUtils.isNotBlank(firstNonDefaultWorkflowName)) { List<String> mappedCollections - = xmlWorkflowFactory.getCollectionHandlesMappedToWorklow(context, firstNonDefaultWorkflowName); + = xmlWorkflowFactory.getCollectionHandlesMappedToWorkflow(context, firstNonDefaultWorkflowName); //When we call this facets endpoint if (mappedCollections.size() > 0) { //returns array of collection jsons that are mapped to given workflow diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowItemRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowItemRestRepositoryIT.java index 13837f6461..7a0fe01d4a 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowItemRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkflowItemRestRepositoryIT.java @@ -10,6 +10,7 @@ package org.dspace.app.rest; import static com.jayway.jsonpath.JsonPath.read; import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; import static org.hamcrest.Matchers.is; +import static org.hamcrest.Matchers.nullValue; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.delete; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.patch; @@ -28,14 +29,6 @@ import javax.ws.rs.core.MediaType; import org.apache.commons.io.IOUtils; import org.apache.commons.lang3.CharEncoding; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.ClaimedTaskBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.WorkflowItemBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.ItemMatcher; import org.dspace.app.rest.matcher.WorkflowItemMatcher; @@ -46,6 +39,14 @@ import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.RemoveOperation; import 
org.dspace.app.rest.model.patch.ReplaceOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.ClaimedTaskBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.WorkflowItemBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; @@ -1817,4 +1818,44 @@ public class WorkflowItemRestRepositoryIT extends AbstractControllerIntegrationT is("http://localhost/api/workflow/pooltask/search")) ))); } + + @Test + public void findOneFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1") + .withWorkflowGroup(1, admin).build(); + + //2. a workflow item + XmlWorkflowItem witem = WorkflowItemBuilder.createWorkflowItem(context, col1) + .withTitle("Workflow Item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + context.restoreAuthSystemState(); + + String adminToken = getAuthToken(admin.getEmail(), password); + String epersonToken = getAuthToken(eperson.getEmail(), password); + + getClient(epersonToken).perform(get("/api/workflow/workflowitems/" + witem.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.collection._embedded.adminGroup").doesNotExist()); + + getClient(adminToken).perform(get("/api/workflow/workflowitems/" + witem.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.collection._embedded.adminGroup", nullValue())); + + } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkspaceItemRestRepositoryIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkspaceItemRestRepositoryIT.java index 40820c8cd6..2bd4636e32 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkspaceItemRestRepositoryIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/WorkspaceItemRestRepositoryIT.java @@ -13,6 +13,7 @@ import static org.dspace.app.rest.matcher.MetadataMatcher.matchMetadata; import static org.hamcrest.Matchers.allOf; import static org.hamcrest.Matchers.hasSize; import static org.hamcrest.Matchers.is; +import static org.hamcrest.Matchers.nullValue; import static org.springframework.data.rest.webmvc.RestMediaTypes.TEXT_URI_LIST_VALUE; import static org.springframework.http.MediaType.parseMediaType; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.delete; @@ -35,16 +36,6 @@ import javax.ws.rs.core.MediaType; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.io.IOUtils; import org.apache.commons.lang3.CharEncoding; -import org.dspace.app.rest.builder.BitstreamBuilder; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; 
-import org.dspace.app.rest.builder.EPersonBuilder; -import org.dspace.app.rest.builder.EntityTypeBuilder; -import org.dspace.app.rest.builder.GroupBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.RelationshipBuilder; -import org.dspace.app.rest.builder.RelationshipTypeBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.matcher.CollectionMatcher; import org.dspace.app.rest.matcher.ItemMatcher; import org.dspace.app.rest.matcher.MetadataMatcher; @@ -54,6 +45,13 @@ import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.RemoveOperation; import org.dspace.app.rest.model.patch.ReplaceOperation; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Bitstream; import org.dspace.content.Collection; import org.dspace.content.Community; @@ -877,7 +875,7 @@ public class WorkspaceItemRestRepositoryIT extends AbstractControllerIntegration * * @throws Exception */ - public void createMultipleWorkspaceItemFromFileTest() throws Exception { + public void createSingleWorkspaceItemFromBibtexFileWithOneEntryTest() throws Exception { context.turnOffAuthorisationSystem(); //** GIVEN ** @@ -898,57 +896,826 @@ public class WorkspaceItemRestRepositoryIT extends AbstractControllerIntegration .build(); InputStream bibtex = getClass().getResourceAsStream("bibtex-test.bib"); - final MockMultipartFile bibtexFile = new MockMultipartFile("file", "bibtex-test.bib", "application/x-bibtex", - bibtex); + final MockMultipartFile bibtexFile = new MockMultipartFile("file", "/local/path/bibtex-test.bib", + "application/x-bibtex", bibtex); context.restoreAuthSystemState(); + AtomicReference> idRef = new AtomicReference<>(); String authToken = getAuthToken(eperson.getEmail(), password); - // bulk create workspaceitems in the default collection (col1) - getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + try { + // create a workspaceitem from a single bibliographic entry file explicitly in the default collection (col1) + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") .file(bibtexFile)) - // bulk create should return 200, 201 (created) is better for single resource + // create should return 200, 201 (created) is better for single resource .andExpect(status().isOk()) .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", is("My Article"))) .andExpect( jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) - .andExpect(jsonPath("$._embedded.workspaceitems[1].sections.traditionalpageone['dc.title'][0].value", - is("My Article 2"))) - .andExpect( - jsonPath("$._embedded.workspaceitems[1]._embedded.collection.id", is(col1.getID().toString()))) - .andExpect(jsonPath("$._embedded.workspaceitems[2].sections.traditionalpageone['dc.title'][0].value", - is("My Article 3"))) - .andExpect( - jsonPath("$._embedded.workspaceitems[2]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/bibtex-test.bib"))) + 
.andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("bibtex-test.bib"))) .andExpect( jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) - ; + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } - // bulk create workspaceitems explicitly in the col2 - getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + // create a workspaceitem from a single bibliographic entry file explicitly in the col2 + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") .file(bibtexFile) - .param("collection", col2.getID().toString())) + .param("owningCollection", col2.getID().toString())) .andExpect(status().isOk()) .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", is("My Article"))) .andExpect( jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col2.getID().toString()))) - .andExpect(jsonPath("$._embedded.workspaceitems[1].sections.traditionalpageone['dc.title'][0].value", - is("My Article 2"))) - .andExpect( - jsonPath("$._embedded.workspaceitems[1]._embedded.collection.id", is(col2.getID().toString()))) - .andExpect(jsonPath("$._embedded.workspaceitems[2].sections.traditionalpageone['dc.title'][0].value", - is("My Article 3"))) - .andExpect( - jsonPath("$._embedded.workspaceitems[2]._embedded.collection.id", is(col2.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/bibtex-test.bib"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload" + + ".files[0].metadata['dc.title'][0].value", + is("bibtex-test.bib"))) .andExpect( jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) - ; - + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } bibtex.close(); } + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a csv file + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromCSVWithOneEntryTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream csv = getClass().getResourceAsStream("csv-test.csv"); + final MockMultipartFile csvFile = new MockMultipartFile("file", "/local/path/csv-test.csv", + "text/csv", csv); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + // create workspaceitems in the default collection (col1) + AtomicReference> idRef = new AtomicReference<>(); + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(csvFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Article"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Nobody"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value", + is("2006"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.identifier.issn'][0].value", + is("Mock ISSN"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.type'][0].value", + is("Mock subtype"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/csv-test.csv"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("csv-test.csv"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + + // create workspaceitems explicitly in the col2 + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(csvFile) + .param("owningCollection", col2.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.title'][0].value", + is("My Article"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Nobody"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value", + is("2006"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.identifier.issn'][0].value", + is("Mock ISSN"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.type'][0].value", + is("Mock subtype"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", 
is(col2.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/csv-test.csv"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload" + + ".files[0].metadata['dc.title'][0].value", + is("csv-test.csv"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + csv.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a csv file + * with some missing data + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromCSVWithOneEntryAndMissingDataTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream csv = getClass().getResourceAsStream("csv-missing-field-test.csv"); + final MockMultipartFile csvFile = new MockMultipartFile("file", "/local/path/csv-missing-field-test.csv", + "text/csv", csv); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference> idRef = new AtomicReference<>(); + // create workspaceitems in the default collection (col1) + + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(csvFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Article"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Nobody"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][1].value", + is("Try escape, in item"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value").isEmpty()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.identifier.issn'][0].value", + is("Mock ISSN"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.type'][0].value" + ).doesNotExist()) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/csv-missing-field-test.csv"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("csv-missing-field-test.csv"))) + .andExpect( + 
jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + csv.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a tsv file + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromTSVWithOneEntryTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream tsv = getClass().getResourceAsStream("tsv-test.tsv"); + final MockMultipartFile tsvFile = new MockMultipartFile("file", "/local/path/tsv-test.tsv", + "text/tsv", tsv); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference> idRef = new AtomicReference<>(); + + // create workspaceitems in the default collection (col1) + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(tsvFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Article"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Nobody"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value", + is("2006"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.identifier.issn'][0].value", + is("Mock ISSN"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.type'][0].value", + is("Mock subtype"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/tsv-test.tsv"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("tsv-test.tsv"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + tsv.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a ris file + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromRISWithOneEntryTest() throws Exception { + 
context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream ris = getClass().getResourceAsStream("ris-test.ris"); + final MockMultipartFile tsvFile = new MockMultipartFile("file", "/local/path/ris-test.ris", + "text/ris", ris); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference<List<Integer>> idRef = new AtomicReference<>(); + + // create workspaceitems in the default collection (col1) + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(tsvFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("Challenge–Response Identification"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][1].value", + is("Challenge–Response Identification second title"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Just, Mike"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value", + is("2005"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.identifier.issn'][0].value", + is("978-0-387-23483-0"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.type'][0].value", + is("Mock subtype"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/ris-test.ris"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("ris-test.ris"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + ris.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint an endnote file + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromEndnoteWithOneEntryTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream endnote = getClass().getResourceAsStream("endnote-test.enw"); + final MockMultipartFile endnoteFile = new MockMultipartFile("file", "/local/path/endnote-test.enw", + "text/endnote", endnote); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference> idRef = new AtomicReference<>(); + // create workspaceitems in the default collection (col1) + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(endnoteFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Title"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Author 1"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][1].value", + is("Author 2"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value", + is("2005"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpagetwo" + + "['dc.description.abstract'][0].value", + is("This is my abstract"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/endnote-test.enw"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("endnote-test.enw"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + endnote.close(); + } + + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a csv file + * with some missing data and inner tab in field (those have to be read as list) + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromTSVWithOneEntryAndMissingDataTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream tsv = getClass().getResourceAsStream("tsv-missing-field-test.tsv"); + final MockMultipartFile csvFile = new MockMultipartFile("file", "/local/path/tsv-missing-field-test.tsv", + "text/tsv", tsv); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference> idRef = new AtomicReference<>(); + + // create workspaceitems in the default collection (col1) + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(csvFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Article"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Nobody"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][1].value", + is("Try escape \t\t\tin \t\titem"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.date.issued'][0].value").isEmpty()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.identifier.issn'][0].value", + is("Mock ISSN"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.type'][0].value" + ).doesNotExist()) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/tsv-missing-field-test.tsv"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("tsv-missing-field-test.tsv"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[*]._embedded.upload").doesNotExist()) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + tsv.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a + * bibtex and pubmed files + * + * @throws Exception + */ + public void createSingleWorkspaceItemFromMultipleFilesWithOneEntryTest() throws Exception { + context.turnOffAuthorisationSystem(); + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream bibtex = getClass().getResourceAsStream("bibtex-test.bib"); + final MockMultipartFile bibtexFile = new MockMultipartFile("file", "/local/path/bibtex-test.bib", + "application/x-bibtex", bibtex); + InputStream xmlIS = getClass().getResourceAsStream("pubmed-test.xml"); + final MockMultipartFile pubmedFile = new MockMultipartFile("file", "/local/path/pubmed-test.xml", + "application/xml", xmlIS); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference> idRef = new AtomicReference<>(); + + // create a workspaceitem from a single bibliographic entry file explicitly in the default collection (col1) + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(bibtexFile).file(pubmedFile)) + // create should return 200, 201 (created) is better for single resource + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Article"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col1.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/bibtex-test.bib"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("bibtex-test.bib"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][1].value") + .doesNotExist()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[1]" + + ".metadata['dc.source'][0].value", + is("/local/path/pubmed-test.xml"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[1]" + + ".metadata['dc.title'][0].value", + is("pubmed-test.xml"))) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + + // create a workspaceitem from a single bibliographic entry file explicitly in the col2 + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(bibtexFile).file(pubmedFile) + .param("owningCollection", col2.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("My Article"))) + .andExpect( + jsonPath("$._embedded.workspaceitems[0]._embedded.collection.id", is(col2.getID().toString()))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/bibtex-test.bib"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload" + + ".files[0].metadata['dc.title'][0].value", + is("bibtex-test.bib"))) + 
.andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][1].value") + .doesNotExist()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[1]" + + ".metadata['dc.source'][0].value", + is("/local/path/pubmed-test.xml"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[1]" + + ".metadata['dc.title'][0].value", + is("pubmed-test.xml"))) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + bibtex.close(); + xmlIS.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a bibtex file + * that contains more than one entry. + * + * @throws Exception + */ + public void createSingleWorkspaceItemsFromSingleFileWithMultipleEntriesTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + + InputStream bibtex = getClass().getResourceAsStream("bibtex-test-3-entries.bib"); + final MockMultipartFile bibtexFile = new MockMultipartFile("file", "bibtex-test-3-entries.bib", + "application/x-bibtex", + bibtex); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + + // attempt to create workspaceitems from a bibliographic file with multiple entries in the default collection (col1) + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(bibtexFile)) + // create should return a 422 because we don't allow/support bibliographic files + // that have multiple metadata records + .andExpect(status().is(422)); + bibtex.close(); + } + + @Test + /** + * Test the creation of workspaceitems POSTing to the resource collection endpoint a pubmed XML + * file. + * + * @throws Exception + */ + public void createPubmedWorkspaceItemFromFileTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. 
+ parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 1") + .withSubmitterGroup(eperson) + .build(); + Collection col2 = CollectionBuilder.createCollection(context, child1) + .withName("Collection 2") + .withSubmitterGroup(eperson) + .build(); + InputStream xmlIS = getClass().getResourceAsStream("pubmed-test.xml"); + final MockMultipartFile pubmedFile = new MockMultipartFile("file", "/local/path/pubmed-test.xml", + "application/xml", xmlIS); + + context.restoreAuthSystemState(); + + String authToken = getAuthToken(eperson.getEmail(), password); + AtomicReference> idRef = new AtomicReference<>(); + + // create a workspaceitem from a single bibliographic entry file explicitly in the default collection (col1) + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(pubmedFile)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("Multistep microreactions with proteins using electrocapture technology."))) + .andExpect( + jsonPath( + "$._embedded.workspaceitems[0].sections.traditionalpageone['dc.identifier.other'][0].value", + is("15117179"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Astorga-Wells, Juan"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.source'][0].value", + is("/local/path/pubmed-test.xml"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0]" + + ".metadata['dc.title'][0].value", + is("pubmed-test.xml"))) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + + + // create a workspaceitem from a single bibliographic entry file explicitly in the col2 + try { + getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") + .file(pubmedFile) + .param("owningCollection", col2.getID().toString())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone['dc.title'][0].value", + is("Multistep microreactions with proteins using electrocapture technology."))) + .andExpect( + jsonPath( + "$._embedded.workspaceitems[0].sections.traditionalpageone['dc.identifier.other'][0].value", + is("15117179"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.traditionalpageone" + + "['dc.contributor.author'][0].value", + is("Astorga-Wells, Juan"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0].metadata['dc.source'][0].value", + is("/local/path/pubmed-test.xml"))) + .andExpect(jsonPath("$._embedded.workspaceitems[0].sections.upload.files[0].metadata['dc.title'][0].value", + is("pubmed-test.xml"))) + .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), + "$._embedded.workspaceitems[*].id"))); + } finally { + if (idRef != null && idRef.get() != null) { + for (int i : idRef.get()) { + WorkspaceItemBuilder.deleteWorkspaceItem(i); + } + } + } + xmlIS.close(); + } + @Test /** * Test the creation of a 
workspaceitem POSTing to the resource collection endpoint a PDF file. As a single item @@ -979,10 +1746,10 @@ public class WorkspaceItemRestRepositoryIT extends AbstractControllerIntegration context.restoreAuthSystemState(); - // bulk create a workspaceitem + // create a workspaceitem getClient(authToken).perform(fileUpload("/api/submission/workspaceitems") .file(pdfFile)) - // bulk create should return 200, 201 (created) is better for single resource + // create should return 200, 201 (created) is better for single resource .andExpect(status().isOk()) //FIXME it will be nice to setup a mock grobid server for end to end testing // no metadata for now @@ -1058,7 +1825,7 @@ public class WorkspaceItemRestRepositoryIT extends AbstractControllerIntegration // create an empty workspaceitem explicitly in the col1, check validation on creation getClient(authToken).perform(post("/api/submission/workspaceitems") - .param("collection", col1.getID().toString()) + .param("owningCollection", col1.getID().toString()) .contentType(org.springframework.http.MediaType.APPLICATION_JSON)) .andExpect(status().isCreated()) // title and dateissued are required in the first panel @@ -3810,6 +4577,46 @@ public class WorkspaceItemRestRepositoryIT extends AbstractControllerIntegration "Test title", "2019-04-25", "ExtraEntry")))); } + @Test + public void findOneFullProjectionTest() throws Exception { + context.turnOffAuthorisationSystem(); + + //** GIVEN ** + //1. A community-collection structure with one parent community with sub-community and two collections. + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + //2. 
a workspace item + WorkspaceItem witem = WorkspaceItemBuilder.createWorkspaceItem(context, col1) + .withTitle("Workspace Item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + context.restoreAuthSystemState(); + + String adminToken = getAuthToken(admin.getEmail(), password); + String epersonToken = getAuthToken(eperson.getEmail(), password); + + getClient(adminToken).perform(get("/api/submission/workspaceitems/" + witem.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.collection._embedded.adminGroup", nullValue())); + + + getClient(epersonToken).perform(get("/api/submission/workspaceitems/" + witem.getID()) + .param("projection", "full")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.collection._embedded.adminGroup").doesNotExist()); + + } + @Test public void deleteWorkspaceItemWithMinRelationshipsTest() throws Exception { context.turnOffAuthorisationSystem(); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/AdministratorFeatureIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/AdministratorFeatureIT.java index e1e89cda02..6124e2cac7 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/AdministratorFeatureIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/AdministratorFeatureIT.java @@ -11,22 +11,29 @@ import static org.springframework.test.web.servlet.request.MockMvcRequestBuilder import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; +import java.sql.SQLException; + import org.dspace.app.rest.authorization.impl.AdministratorOfFeature; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.EPersonBuilder; import org.dspace.app.rest.converter.CollectionConverter; import org.dspace.app.rest.converter.CommunityConverter; +import org.dspace.app.rest.converter.ItemConverter; import org.dspace.app.rest.converter.SiteConverter; import org.dspace.app.rest.matcher.AuthorizationMatcher; import org.dspace.app.rest.model.CollectionRest; import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.ItemRest; import org.dspace.app.rest.model.SiteRest; import org.dspace.app.rest.projection.DefaultProjection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.authorize.AuthorizeException; import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.ItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; +import org.dspace.content.Item; import org.dspace.content.Site; import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.CommunityService; @@ -55,6 +62,8 @@ public class AdministratorFeatureIT extends AbstractControllerIntegrationTest { @Autowired CommunityService communityService; @Autowired + private ItemConverter itemConverter; + @Autowired private CommunityConverter communityConverter; @Autowired private CollectionConverter collectionConverter; @@ -63,6 +72,22 @@ public class AdministratorFeatureIT extends AbstractControllerIntegrationTest { 
private SiteService siteService; + private EPerson adminComA; + private EPerson adminComB; + private EPerson adminColA; + private EPerson adminColB; + private EPerson adminItemA; + private EPerson adminItemB; + + private Community communityA; + private Community subCommunityOfA; + private Community communityB; + private Collection collectionA; + private Collection collectionB; + private Item itemInCollectionA; + private Item itemInCollectionB; + + /** * this hold a reference to the test feature {@link AdministratorOfFeature} */ @@ -74,201 +99,345 @@ public class AdministratorFeatureIT extends AbstractControllerIntegrationTest { super.setUp(); siteService = ContentServiceFactory.getInstance().getSiteService(); administratorFeature = authorizationFeatureService.find(AdministratorOfFeature.NAME); + initAdminsAndObjects(); + } + + private void initAdminsAndObjects() throws SQLException, AuthorizeException { + context.turnOffAuthorisationSystem(); + + + adminComA = EPersonBuilder.createEPerson(context) + .withEmail("adminComA@example.com") + .withPassword(password) + .build(); + + adminComB = EPersonBuilder.createEPerson(context) + .withEmail("adminComB@example.com") + .withPassword(password) + .build(); + + adminColA = EPersonBuilder.createEPerson(context) + .withEmail("adminColA@example.com") + .withPassword(password) + .build(); + + adminColB = EPersonBuilder.createEPerson(context) + .withEmail("adminColB@example.com") + .withPassword(password) + .build(); + + adminItemA = EPersonBuilder.createEPerson(context) + .withEmail("adminItemA@example.com") + .withPassword(password) + .build(); + + adminItemB = EPersonBuilder.createEPerson(context) + .withEmail("adminItemB@example.com") + .withPassword(password) + .build(); + + communityA = CommunityBuilder.createCommunity(context) + .withName("Community A") + .withAdminGroup(adminComA) + .build(); + + subCommunityOfA = CommunityBuilder.createSubCommunity(context, communityA) + .withName("Sub Community of CommunityA") + .build(); + + communityB = CommunityBuilder.createCommunity(context) + .withName("Community B") + .withAdminGroup(adminComB) + .build(); + + collectionA = CollectionBuilder.createCollection(context, subCommunityOfA) + .withName("Collection A") + .withAdminGroup(adminColA) + .build(); + + collectionB = CollectionBuilder.createCollection(context, communityB) + .withName("Collection B") + .withAdminGroup(adminColB) + .build(); + + itemInCollectionA = ItemBuilder.createItem(context, collectionA) + .withTitle("Item in Collection A") + .withAdminUser(adminItemA) + .build(); + + itemInCollectionB = ItemBuilder.createItem(context, collectionB) + .withTitle("Item in Collection B") + .withAdminUser(adminItemB) + .build(); + + context.restoreAuthSystemState(); } @Test public void communityWithAdministratorFeatureTest() throws Exception { - context.turnOffAuthorisationSystem(); - EPerson adminComA = EPersonBuilder.createEPerson(context) - .withEmail("adminComA@example.com") - .withPassword(password) - .build(); - - EPerson adminComB = EPersonBuilder.createEPerson(context) - .withEmail("adminComB@example.com") - .withPassword(password) - .build(); - - Community communityA = CommunityBuilder.createCommunity(context) - .withName("Community A") - .withAdminGroup(adminComA) - .build(); - - Community subCommunityOfA = CommunityBuilder.createSubCommunity(context, communityA) - .withName("Sub Community of CommunityA") - .build(); - - Collection collectionOfSubComm = CollectionBuilder.createCollection(context, subCommunityOfA) - .withName("Collection of 
subCommunity") - .build(); - - Community communityB = CommunityBuilder.createCommunity(context) - .withName("Community B") - .withAdminGroup(adminComB) - .build(); - - context.restoreAuthSystemState(); - CommunityRest communityRestA = communityConverter.convert(communityA, DefaultProjection.DEFAULT); - CommunityRest SubCommunityOfArest = communityConverter.convert(subCommunityOfA, DefaultProjection.DEFAULT); - CollectionRest collectionRestOfSubComm = collectionConverter.convert(collectionOfSubComm, - DefaultProjection.DEFAULT); + CommunityRest communityRestB = communityConverter.convert(communityB, DefaultProjection.DEFAULT); + CommunityRest SubCommunityOfARest = communityConverter.convert(subCommunityOfA, DefaultProjection.DEFAULT); // tokens String tokenAdminComA = getAuthToken(adminComA.getEmail(), password); String tokenAdminComB = getAuthToken(adminComB.getEmail(), password); + String tokenAdmin = getAuthToken(admin.getEmail(), password); // define authorizations that we know must exists - Authorization authAdminCommunityA = new Authorization(adminComA, administratorFeature, communityRestA); - Authorization authAdminSubCommunityOfA = new Authorization(adminComA, administratorFeature,SubCommunityOfArest); - Authorization authAdminAColl = new Authorization(adminComA, administratorFeature, collectionRestOfSubComm); + Authorization authAdminSiteComA = new Authorization(admin, administratorFeature, communityRestA); + Authorization authAdminComAComA = new Authorization(adminComA, administratorFeature, communityRestA); + Authorization authAdminComASubComA = new Authorization(adminComA, administratorFeature, SubCommunityOfARest); + Authorization authAdminComBComB = new Authorization(adminComB, administratorFeature, communityRestB); + // define authorizations that we know not exists - Authorization authAdminBColl = new Authorization(adminComB, administratorFeature, collectionRestOfSubComm); - Authorization authAdminBCommunityA = new Authorization(adminComB, administratorFeature, communityRestA); + Authorization authAdminComBComA = new Authorization(adminComB, administratorFeature, communityRestA); + Authorization authAdminComBSubComA = new Authorization(adminComB, administratorFeature, SubCommunityOfARest); + Authorization authAdminColAComA = new Authorization(adminColA, administratorFeature, communityRestA); + Authorization authAdminItemAComA = new Authorization(adminItemA, administratorFeature, communityRestA); + Authorization authEPersonComA = new Authorization(eperson, administratorFeature, communityRestA); + Authorization authAnonymousComA = new Authorization(null, administratorFeature, communityRestA); - getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + authAdminCommunityA.getID())) + + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminSiteComA.getID())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCommunityA)))); + .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminSiteComA)))); - getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + authAdminSubCommunityOfA.getID())) + getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + authAdminComAComA.getID())) .andExpect(status().isOk()) .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher - .matchAuthorization(authAdminSubCommunityOfA)))); + .matchAuthorization(authAdminComAComA)))); - getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + 
authAdminAColl.getID())) + getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + authAdminComASubComA.getID())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminAColl)))); + .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminComASubComA)))); - getClient(tokenAdminComB).perform(get("/api/authz/authorizations/" + authAdminBCommunityA.getID())) + getClient(tokenAdminComB).perform(get("/api/authz/authorizations/" + authAdminComBComB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminComBComB)))); + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminComBComA.getID())) .andExpect(status().isNotFound()); - getClient(tokenAdminComB).perform(get("/api/authz/authorizations/" + authAdminBColl.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminComBSubComA.getID())) .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminColAComA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminItemAComA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authEPersonComA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAnonymousComA.getID())) + .andExpect(status().isNotFound()); + + } @Test public void collectionWithAdministratorFeatureTest() throws Exception { - context.turnOffAuthorisationSystem(); - - EPerson adminColA = EPersonBuilder.createEPerson(context) - .withEmail("adminColA@example.com") - .withPassword(password) - .build(); - - EPerson adminColB = EPersonBuilder.createEPerson(context) - .withEmail("adminColB@example.com") - .withPassword(password) - .build(); - - Community parentCommunity = CommunityBuilder.createCommunity(context) - .withName("Parent Community") - .build(); - - Collection collectionA = CollectionBuilder.createCollection(context, parentCommunity) - .withName("Collection A") - .withAdminGroup(adminColA) - .build(); - - Collection collectionB = CollectionBuilder.createCollection(context, parentCommunity) - .withName("Collection B") - .withAdminGroup(adminColB) - .build(); - - context.restoreAuthSystemState(); - CollectionRest collectionRestA = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); CollectionRest collectionRestB = collectionConverter.convert(collectionB, DefaultProjection.DEFAULT); String tokenAdminColA = getAuthToken(adminColA.getEmail(), password); String tokenAdminColB = getAuthToken(adminColB.getEmail(), password); + String tokenAdminComA = getAuthToken(adminComA.getEmail(), password); + String tokenAdminComB = getAuthToken(adminComB.getEmail(), password); + String tokenAdmin = getAuthToken(admin.getEmail(), password); // define authorizations that we know must exists - Authorization authAdminCollectionA = new Authorization(adminColA, administratorFeature, collectionRestA); - Authorization authAdminCollectionB = new Authorization(adminColB, administratorFeature, collectionRestB); + + Authorization authAdminSiteColA = new Authorization(admin, administratorFeature, collectionRestA); + Authorization authAdminComAColA = new Authorization(adminComA, administratorFeature, collectionRestA); + Authorization authAdminColAColA = new Authorization(adminColA, 
administratorFeature, collectionRestA); + + Authorization authAdminSiteColB = new Authorization(admin, administratorFeature, collectionRestB); + Authorization authAdminComBColB = new Authorization(adminComB, administratorFeature, collectionRestB); + Authorization authAdminColBColB = new Authorization(adminColB, administratorFeature, collectionRestB); // define authorization that we know not exists - Authorization authAdminBcollectionA = new Authorization(adminColB, administratorFeature, collectionRestA); + Authorization authAdminColBColA = new Authorization(adminColB, administratorFeature, collectionRestA); + Authorization authAdminComBColA = new Authorization(adminComB, administratorFeature, collectionRestA); + Authorization authAdminItemAColA = new Authorization(adminItemA, administratorFeature, collectionRestA); + Authorization authEPersonColA = new Authorization(eperson, administratorFeature, collectionRestA); + Authorization authAnonymousColA = new Authorization(null, administratorFeature, collectionRestA); - getClient(tokenAdminColA).perform(get("/api/authz/authorizations/" + authAdminCollectionA.getID())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCollectionA)))); - getClient(tokenAdminColB).perform(get("/api/authz/authorizations/" + authAdminCollectionB.getID())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCollectionB)))); - getClient(tokenAdminColB).perform(get("/api/authz/authorizations/" + authAdminBcollectionA.getID())) - .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminSiteColA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminSiteColA)))); + + getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + authAdminComAColA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminComAColA)))); + + getClient(tokenAdminColA).perform(get("/api/authz/authorizations/" + authAdminColAColA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminColAColA)))); + + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminSiteColB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminSiteColB)))); + + getClient(tokenAdminComB).perform(get("/api/authz/authorizations/" + authAdminComBColB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminComBColB)))); + + getClient(tokenAdminColB).perform(get("/api/authz/authorizations/" + authAdminColBColB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminColBColB)))); + + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminColBColA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminComBColA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminItemAColA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authEPersonColA.getID())) + 
.andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAnonymousColA.getID())) + .andExpect(status().isNotFound()); } @Test public void siteWithAdministratorFeatureTest() throws Exception { - context.turnOffAuthorisationSystem(); - - Community parentCommunity = CommunityBuilder.createCommunity(context) - .withName("Test Parent Community") - .build(); - - Collection collection = CollectionBuilder.createCollection(context, parentCommunity) - .withName("Test Collection") - .build(); - - context.restoreAuthSystemState(); Site site = siteService.findSite(context); SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); - CommunityRest communityRest = communityConverter.convert(parentCommunity, DefaultProjection.DEFAULT); - CollectionRest collectionRest = collectionConverter.convert(collection, DefaultProjection.DEFAULT); // tokens String tokenAdmin = getAuthToken(admin.getEmail(), password); - String tokenEperson = getAuthToken(eperson.getEmail(), password); - // define authorizations of Admin that we know must exists Authorization authAdminSite = new Authorization(admin, administratorFeature, siteRest); - Authorization authAdminCommunity = new Authorization(admin, administratorFeature, communityRest); - Authorization authAdminCollection = new Authorization(admin, administratorFeature, collectionRest); // define authorizations of EPerson that we know not exists + Authorization authAdminComASite = new Authorization(adminComA, administratorFeature, siteRest); + Authorization authAdminColASite = new Authorization(adminColA, administratorFeature, siteRest); + Authorization authAdminItemASite = new Authorization(adminItemA, administratorFeature, siteRest); Authorization authEPersonSite = new Authorization(eperson, administratorFeature, siteRest); - Authorization authEpersonCommunity = new Authorization(eperson, administratorFeature, communityRest); - Authorization authEpersonCollection = new Authorization(eperson, administratorFeature, collectionRest); - - // define authorizations of Anonymous that we know not exists Authorization authAnonymousSite = new Authorization(null, administratorFeature, siteRest); - Authorization authAnonymousCommunity = new Authorization(null, administratorFeature, communityRest); - Authorization authAnonymousCollection = new Authorization(null, administratorFeature, collectionRest); getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminSite.getID())) .andExpect(status().isOk()) .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminSite)))); - getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminCommunity.getID())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCommunity)))); - - getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminCollection.getID())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCollection)))); - - getClient(tokenEperson).perform(get("/api/authz/authorizations/" + authEPersonSite.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authEPersonSite.getID())) .andExpect(status().isNotFound()); - getClient(tokenEperson).perform(get("/api/authz/authorizations/" + authEpersonCommunity.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminComASite.getID())) .andExpect(status().isNotFound()); - 
getClient(tokenEperson).perform(get("/api/authz/authorizations/" + authEpersonCollection.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminColASite.getID())) .andExpect(status().isNotFound()); - getClient().perform(get("/api/authz/authorizations/" + authAnonymousSite.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminItemASite.getID())) .andExpect(status().isNotFound()); - getClient().perform(get("/api/authz/authorizations/" + authAnonymousCommunity.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authEPersonSite.getID())) .andExpect(status().isNotFound()); - getClient().perform(get("/api/authz/authorizations/" + authAnonymousCollection.getID())) + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAnonymousSite.getID())) .andExpect(status().isNotFound()); } + + @Test + public void itemWithAdministratorFeatureTest() throws Exception { + + ItemRest itemRestA = itemConverter.convert(itemInCollectionA, DefaultProjection.DEFAULT); + ItemRest itemRestB = itemConverter.convert(itemInCollectionB, DefaultProjection.DEFAULT); + + String tokenAdminItemA = getAuthToken(adminItemA.getEmail(), password); + String tokenAdminItemB = getAuthToken(adminItemB.getEmail(), password); + String tokenAdminColA = getAuthToken(adminColA.getEmail(), password); + String tokenAdminColB = getAuthToken(adminColB.getEmail(), password); + String tokenAdminComA = getAuthToken(adminComA.getEmail(), password); + String tokenAdminComB = getAuthToken(adminComB.getEmail(), password); + String tokenAdmin = getAuthToken(admin.getEmail(), password); + + // define authorizations that we know must exists + + Authorization authAdminSiteItemA = new Authorization(admin, administratorFeature, itemRestA); + Authorization authAdminComAItemA = new Authorization(adminComA, administratorFeature, itemRestA); + Authorization authAdminColAItemA = new Authorization(adminColA, administratorFeature, itemRestA); + Authorization authAdminItemAItemA = new Authorization(adminItemA, administratorFeature, itemRestA); + + Authorization authAdminSiteItemB = new Authorization(admin, administratorFeature, itemRestB); + Authorization authAdminComBItemB = new Authorization(adminComB, administratorFeature, itemRestB); + Authorization authAdminColBItemB = new Authorization(adminColB, administratorFeature, itemRestB); + Authorization authAdminItemBItemB = new Authorization(adminItemB, administratorFeature, itemRestB); + + + // define authorization that we know not exists + Authorization authAdminComBItemA = new Authorization(adminComB, administratorFeature, itemRestA); + Authorization authAdminColBItemA = new Authorization(adminColB, administratorFeature, itemRestA); + Authorization authAdminItemBItemA = new Authorization(adminItemB, administratorFeature, itemRestA); + Authorization authEPersonItemA = new Authorization(eperson, administratorFeature, itemRestA); + Authorization authAnonymousItemA = new Authorization(null, administratorFeature, itemRestA); + + + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminSiteItemA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminSiteItemA)))); + + getClient(tokenAdminComA).perform(get("/api/authz/authorizations/" + authAdminComAItemA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminComAItemA)))); + + 
getClient(tokenAdminColA).perform(get("/api/authz/authorizations/" + authAdminColAItemA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminColAItemA)))); + + getClient(tokenAdminItemA).perform(get("/api/authz/authorizations/" + authAdminItemAItemA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminItemAItemA)))); + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminSiteItemB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminSiteItemB)))); + + getClient(tokenAdminComB).perform(get("/api/authz/authorizations/" + authAdminComBItemB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminComBItemB)))); + + getClient(tokenAdminColB).perform(get("/api/authz/authorizations/" + authAdminColBItemB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminColBItemB)))); + + getClient(tokenAdminItemB).perform(get("/api/authz/authorizations/" + authAdminItemBItemB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$", Matchers.is( + AuthorizationMatcher.matchAuthorization(authAdminItemBItemB)))); + + + + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminComBItemA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminColBItemA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAdminItemBItemA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authEPersonItemA.getID())) + .andExpect(status().isNotFound()); + getClient(tokenAdmin).perform(get("/api/authz/authorizations/" + authAnonymousItemA.getID())) + .andExpect(status().isNotFound()); + } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CCLicenseFeatureRestIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CCLicenseFeatureRestIT.java index 97225d7df0..38fc9a06fd 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CCLicenseFeatureRestIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CCLicenseFeatureRestIT.java @@ -7,15 +7,13 @@ */ package org.dspace.app.rest.authorization; +import static org.hamcrest.Matchers.contains; +import static org.hamcrest.Matchers.is; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import org.dspace.app.rest.authorization.impl.CCLicenseFeature; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.ResourcePolicyBuilder; import org.dspace.app.rest.converter.ItemConverter; import org.dspace.app.rest.matcher.AuthorizationMatcher; import org.dspace.app.rest.model.ItemRest; @@ -23,6 +21,10 @@ import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import 
org.dspace.app.rest.utils.Utils; import org.dspace.authorize.ResourcePolicy; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -81,13 +83,14 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", admin.getID().toString()) .param("feature", ccLicenseFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))) + ); } @Test @@ -110,13 +113,14 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))) + ); // verify that the property core.authorization.collection-admin.item-admin.cc-license = false is respected // the community admins should be still authorized @@ -127,13 +131,14 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))) + ); // now verify that the property core.authorization.community-admin.item-admin.cc-license = false is respected // and also community admins are blocked @@ -143,11 +148,11 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { getClient(comAdminToken).perform(get("/api/authz/authorizations/" + authAdminCCLicense.getID())) .andExpect(status().isNotFound()); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) - 
.andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -170,24 +175,25 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); - getClient(colAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(colAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))) + ); // verify that the property core.authorization.collection-admin.item-admin.cc-license = false is respected configurationService.setProperty("core.authorization.item-admin.cc-license", false); configurationService.setProperty("core.authorization.collection-admin.item-admin.cc-license", false); getClient(colAdminToken).perform(get("/api/authz/authorizations/" + authAdminCCLicense.getID())) .andExpect(status().isNotFound()); - getClient(colAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(colAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -211,23 +217,24 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); - getClient(itemAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(itemAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminCCLicense)))) + ); // verify that the property core.authorization.item-admin.cc-license = false is respected configurationService.setProperty("core.authorization.item-admin.cc-license", false); getClient(itemAdminToken).perform(get("/api/authz/authorizations/" + authAdminCCLicense.getID())) .andExpect(status().isNotFound()); - getClient(itemAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(itemAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -248,19 +255,19 @@ public class CCLicenseFeatureRestIT extends AbstractControllerIntegrationTest { getClient(epersonToken).perform(get("/api/authz/authorizations/" + authEpersonCCLicense.getID())) .andExpect(status().isNotFound()); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + 
getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", ccLicenseFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); // check the authorization for the anonymous user getClient().perform(get("/api/authz/authorizations/" + authAnonymousCCLicense.getID())) .andExpect(status().isNotFound()); - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("feature", ccLicenseFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CanCreateCollectionsIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CanCreateCollectionsIT.java new file mode 100644 index 0000000000..97aa7c5c2f --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CanCreateCollectionsIT.java @@ -0,0 +1,272 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization; + +import static org.hamcrest.Matchers.greaterThan; +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import org.dspace.app.rest.authorization.impl.CreateCollectionFeature; +import org.dspace.app.rest.converter.CollectionConverter; +import org.dspace.app.rest.converter.CommunityConverter; +import org.dspace.app.rest.converter.SiteConverter; +import org.dspace.app.rest.model.CollectionRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.rest.projection.DefaultProjection; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Site; +import org.dspace.content.service.SiteService; +import org.dspace.core.Constants; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * This test deals with testing the feature {@link CreateCollectionFeature} + */ +public class CanCreateCollectionsIT extends AbstractControllerIntegrationTest { + + @Autowired + private SiteService siteService; + + @Autowired + private AuthorizationFeatureService authorizationFeatureService; + + @Autowired + private CommunityConverter communityConverter; + + @Autowired + private SiteConverter siteConverter; + + @Autowired + private CollectionConverter collectionConverter; + + @Autowired + private AuthorizeService authorizeService; + + @Autowired + private Utils utils; + + private Community communityA; + private Community communityB; + private Community communityC; + private Collection collectionA; + + private AuthorizationFeature 
canCreateCollectionsFeature; + + @Override + @Before + public void setUp() throws Exception { + super.setUp(); + + context.turnOffAuthorisationSystem(); + communityA = CommunityBuilder.createCommunity(context).withName("Community A").build(); + collectionA = CollectionBuilder.createCollection(context, communityA).withName("Collection A").build(); + communityB = CommunityBuilder.createCommunity(context).withName("Community B").build(); + communityC = CommunityBuilder.createSubCommunity(context, communityB).withName("Community C").build(); + context.restoreAuthSystemState(); + + canCreateCollectionsFeature = authorizationFeatureService.find(CreateCollectionFeature.NAME); + } + + @Test + public void canCreateCollectionsOnCommunityAsAdminTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityA, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").exists()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + + @Test + public void canCreateCollectionsOnSiteAsAdminTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + Site site = siteService.findSite(context); + SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); + String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnCollectionAsAdminTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + CollectionRest collectionRest = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); + String collectionUri = utils.linkToSingleResource(collectionRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", collectionUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnCommunityAsAnonymousTest() throws Exception { + CommunityRest comRest = communityConverter.convert(communityA, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnSiteAsAnonymousTest() throws Exception { + Site site = siteService.findSite(context); + SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); + String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); + + 
getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnCollectionAsAnonymousTest() throws Exception { + CollectionRest collectionRest = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); + String collectionUri = utils.linkToSingleResource(collectionRest, "self").getHref(); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", collectionUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnCommunityAsEPersonWithoutRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityA, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnSiteAsEPersonWithoutRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + Site site = siteService.findSite(context); + SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); + String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnCollectionAsEPersonWithoutRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CollectionRest collectionRest = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); + String collectionUri = utils.linkToSingleResource(collectionRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", collectionUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCollectionsOnCommunityAsEPersonWithRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityB, DefaultProjection.DEFAULT); + String 
comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").exists()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + + @Test + public void canCreateCollectionsOnSubCommunityAsEPersonWithRightsOnParentCommunityTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityC, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").exists()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + + @Test + public void canCreateCollectionsOnSubCommunityAsEPersonWithoutRightsOnParentCommunityTest() throws Exception { + authorizeService.addPolicy(context, communityA, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityC, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCollectionsFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CanCreateCommunitiesIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CanCreateCommunitiesIT.java new file mode 100644 index 0000000000..84756b3116 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/CanCreateCommunitiesIT.java @@ -0,0 +1,238 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization; + +import static org.hamcrest.Matchers.greaterThan; +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import org.dspace.app.rest.authorization.impl.CreateCommunityFeature; +import org.dspace.app.rest.converter.CollectionConverter; +import org.dspace.app.rest.converter.CommunityConverter; +import org.dspace.app.rest.converter.SiteConverter; +import org.dspace.app.rest.model.CollectionRest; +import org.dspace.app.rest.model.CommunityRest; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.rest.projection.DefaultProjection; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.app.rest.utils.Utils; +import org.dspace.authorize.service.AuthorizeService; 
+import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Site; +import org.dspace.content.service.SiteService; +import org.dspace.core.Constants; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * This test deals with testing the feature {@link CreateCommunityFeature} + */ +public class CanCreateCommunitiesIT extends AbstractControllerIntegrationTest { + + @Autowired + private SiteService siteService; + + @Autowired + private AuthorizationFeatureService authorizationFeatureService; + + @Autowired + private CommunityConverter communityConverter; + + @Autowired + private SiteConverter siteConverter; + + @Autowired + private CollectionConverter collectionConverter; + + @Autowired + private AuthorizeService authorizeService; + + @Autowired + private Utils utils; + + private Community communityA; + private Community communityB; + private Collection collectionA; + + private AuthorizationFeature canCreateCommunitiesFeature; + + @Override + @Before + public void setUp() throws Exception { + super.setUp(); + + context.turnOffAuthorisationSystem(); + communityA = CommunityBuilder.createCommunity(context).withName("Community A").build(); + communityB = CommunityBuilder.createSubCommunity(context, communityA).withName("Community B").build(); + collectionA = CollectionBuilder.createCollection(context, communityB).withName("Collection A").build(); + context.restoreAuthSystemState(); + + canCreateCommunitiesFeature = authorizationFeatureService.find(CreateCommunityFeature.NAME); + } + + @Test + public void canCreateCommunitiesOnCommunityAsAdminTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityA, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").exists()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + + @Test + public void canCreateCommunitiesOnSiteAsAdminTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + Site site = siteService.findSite(context); + SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); + String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").exists()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + + @Test + public void canCreateCommunitiesOnCollectionAsAdminTest() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + CollectionRest collectionRest = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); + String collectionUri = utils.linkToSingleResource(collectionRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", collectionUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", 
"feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnCommunityAsAnonymousTest() throws Exception { + CommunityRest comRest = communityConverter.convert(communityA, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnSiteAsAnonymousTest() throws Exception { + Site site = siteService.findSite(context); + SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); + String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnCollectionAsAnonymousTest() throws Exception { + CollectionRest collectionRest = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); + String collectionUri = utils.linkToSingleResource(collectionRest, "self").getHref(); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", collectionUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnCommunityAsEPersonWithoutRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityA, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnSiteAsEPersonWithoutRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + Site site = siteService.findSite(context); + SiteRest siteRest = siteConverter.convert(site, DefaultProjection.DEFAULT); + String siteUri = utils.linkToSingleResource(siteRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnCollectionAsEPersonWithoutRightsTest() 
throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CollectionRest collectionRest = collectionConverter.convert(collectionA, DefaultProjection.DEFAULT); + String collectionUri = utils.linkToSingleResource(collectionRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", collectionUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").doesNotExist()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + } + + @Test + public void canCreateCommunitiesOnCommunityAsEPersonWithRightsTest() throws Exception { + authorizeService.addPolicy(context, communityB, Constants.ADMIN, eperson); + String token = getAuthToken(eperson.getEmail(), password); + CommunityRest comRest = communityConverter.convert(communityB, DefaultProjection.DEFAULT); + String comUri = utils.linkToSingleResource(comRest, "self").getHref(); + + getClient(token).perform(get("/api/authz/authorizations/search/object") + .param("uri", comUri) + .param("feature", canCreateCommunitiesFeature.getName()) + .param("embed", "feature")) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded").exists()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EPersonRegistrationFeatureIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EPersonRegistrationFeatureIT.java new file mode 100644 index 0000000000..6372324630 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EPersonRegistrationFeatureIT.java @@ -0,0 +1,118 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization; + +import static org.hamcrest.Matchers.greaterThan; +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import org.dspace.app.rest.authorization.impl.EPersonRegistrationFeature; +import org.dspace.app.rest.converter.ConverterService; +import org.dspace.app.rest.converter.SiteConverter; +import org.dspace.app.rest.model.SiteRest; +import org.dspace.app.rest.projection.Projection; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.app.rest.utils.Utils; +import org.dspace.content.Site; +import org.dspace.content.service.SiteService; +import org.dspace.services.ConfigurationService; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +public class EPersonRegistrationFeatureIT extends AbstractControllerIntegrationTest { + + @Autowired + private AuthorizationFeatureService authorizationFeatureService; + + @Autowired + private ConverterService converterService; + + @Autowired + private ConfigurationService configurationService; + + @Autowired + private SiteService siteService; + + @Autowired + private SiteConverter siteConverter; + + @Autowired + private Utils utils; + + 
private AuthorizationFeature epersonRegistrationFeature; + + public static final String[] SHIB_ONLY = {"org.dspace.authenticate.ShibAuthentication"}; + + @Override + @Before + public void setUp() throws Exception { + super.setUp(); + epersonRegistrationFeature = authorizationFeatureService.find(EPersonRegistrationFeature.NAME); + } + + @Test + public void userRegistrationEnabledSuccessTest() throws Exception { + + Site site = siteService.findSite(context); + SiteRest SiteRest = siteConverter.convert(site, Projection.DEFAULT); + String siteUri = utils.linkToSingleResource(SiteRest, "self").getHref(); + + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", epersonRegistrationFeature.getName())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + } + + @Test + public void userRegistrationDisabledUnAuthorizedTest() throws Exception { + + Site site = siteService.findSite(context); + SiteRest SiteRest = siteConverter.convert(site, Projection.DEFAULT); + String siteUri = utils.linkToSingleResource(SiteRest, "self").getHref(); + + configurationService.setProperty("user.registration", false); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", epersonRegistrationFeature.getName())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + } + + + @Test + public void userRegistrationEnabledShibTest() throws Exception { + + Site site = siteService.findSite(context); + SiteRest SiteRest = siteConverter.convert(site, Projection.DEFAULT); + String siteUri = utils.linkToSingleResource(SiteRest, "self").getHref(); + + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", epersonRegistrationFeature.getName())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", greaterThan(0))); + + //Enable Shibboleth and password login + configurationService.setProperty("plugin.sequence.org.dspace.authenticate.AuthenticationMethod", SHIB_ONLY); + + getClient().perform(get("/api/authz/authorizations/search/object") + .param("uri", siteUri) + .param("feature", epersonRegistrationFeature.getName())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.page.totalElements", is(0))); + + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EnrollAdministratorIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EnrollAdministratorIT.java index 2be3ed9466..c6ea031f7d 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EnrollAdministratorIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/EnrollAdministratorIT.java @@ -15,12 +15,12 @@ import static org.springframework.test.web.servlet.result.MockMvcResultMatchers. 
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import org.dspace.app.rest.authorization.impl.AdministratorOfFeature; -import org.dspace.app.rest.builder.EPersonBuilder; import org.dspace.app.rest.converter.SiteConverter; import org.dspace.app.rest.matcher.AuthorizationMatcher; import org.dspace.app.rest.model.SiteRest; import org.dspace.app.rest.projection.DefaultProjection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.EPersonBuilder; import org.dspace.content.Site; import org.dspace.content.factory.ContentServiceFactory; import org.dspace.content.service.SiteService; diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/GenericAuthorizationFeatureIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/GenericAuthorizationFeatureIT.java new file mode 100644 index 0000000000..1d3b5b0516 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/GenericAuthorizationFeatureIT.java @@ -0,0 +1,1688 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.authorization; + +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.io.InputStream; + +import org.apache.commons.codec.CharEncoding; +import org.apache.commons.io.IOUtils; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.authorize.ResourcePolicy; +import org.dspace.builder.BitstreamBuilder; +import org.dspace.builder.BundleBuilder; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.GroupBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ResourcePolicyBuilder; +import org.dspace.content.Bitstream; +import org.dspace.content.Bundle; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.core.Constants; +import org.dspace.eperson.EPerson; +import org.dspace.eperson.Group; +import org.dspace.services.ConfigurationService; +import org.junit.Before; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Test for the following authorization features: + * canManagePolicies + * canEditMetadata + * canMove + * canMakePrivate + * canMakeDiscoverable + * canDelete + * canReorderBitstreams + * canCreateBitstream + * canCreateBundle + */ +public class GenericAuthorizationFeatureIT extends AbstractControllerIntegrationTest { + + @Autowired + ConfigurationService configurationService; + + private Community communityA; + private Community communityAA; + private Community communityB; + private Community communityBB; + private Collection collectionX; + private Collection collectionY; + private Item item1; + private Item item2; + private Bundle bundle1; + private Bundle bundle2; + private Bitstream bitstream1; + private Bitstream bitstream2; + + private Group item1AdminGroup; + + private EPerson communityAAdmin; + private EPerson 
collectionXAdmin; + private EPerson item1Admin; + private EPerson communityAWriter; + private EPerson collectionXWriter; + private EPerson item1Writer; + + @Override + @Before + public void setUp() throws Exception { + super.setUp(); + + context.turnOffAuthorisationSystem(); + + communityAAdmin = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityAAdmin@my.edu") + .withPassword(password) + .build(); + collectionXAdmin = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("collectionXAdmin@my.edu") + .withPassword(password) + .build(); + item1Admin = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("item1Admin@my.edu") + .withPassword(password) + .build(); + communityAWriter = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityAWriter@my.edu") + .withPassword(password) + .build(); + collectionXWriter = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("collectionXWriter@my.edu") + .withPassword(password) + .build(); + item1Writer = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("item1Writer@my.edu") + .withPassword(password) + .build(); + + communityA = CommunityBuilder.createCommunity(context) + .withName("communityA") + .withAdminGroup(communityAAdmin) + .build(); + communityAA = CommunityBuilder.createCommunity(context) + .withName("communityAA") + .addParentCommunity(context, communityA) + .build(); + collectionX = CollectionBuilder.createCollection(context, communityAA) + .withName("collectionX") + .withAdminGroup(collectionXAdmin) + .build(); + item1 = ItemBuilder.createItem(context, collectionX) + .withTitle("item1") + .withIssueDate("2020-07-08") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("item1Entry") + .build(); + bundle1 = BundleBuilder.createBundle(context, item1) + .withName("bundle1") + .build(); + try (InputStream is = IOUtils.toInputStream("randomContent", CharEncoding.UTF_8)) { + bitstream1 = BitstreamBuilder.createBitstream(context, bundle1, is) + .withName("bitstream1") + .withMimeType("text/plain") + .build(); + } + + item1AdminGroup = GroupBuilder.createGroup(context) + .withName("item1AdminGroup") + .addMember(item1Admin) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.ADMIN) + .withGroup(item1AdminGroup) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityA) + .withAction(Constants.WRITE) + .withUser(communityAWriter) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(collectionX) + .withAction(Constants.WRITE) + .withUser(collectionXWriter) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.WRITE) + .withUser(item1Writer) + .build(); + + communityB = CommunityBuilder.createCommunity(context) + .withName("communityB") + .build(); + communityBB = CommunityBuilder.createCommunity(context) + .withName("communityBB") + .addParentCommunity(context, communityB) + .build(); + collectionY = CollectionBuilder.createCollection(context, communityBB) + .withName("collectionY") + .build(); + item2 = ItemBuilder.createItem(context, collectionY) + .withTitle("item2") + .withIssueDate("2020-07-08") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("item2Entry") + .build(); 
+ bundle2 = BundleBuilder.createBundle(context, item2) + .withName("bundle2") + .build(); + try (InputStream is = IOUtils.toInputStream("randomContent", CharEncoding.UTF_8)) { + bitstream2 = BitstreamBuilder.createBitstream(context, bundle2, is) + .withName("bitstream2") + .withMimeType("text/plain") + .build(); + } + + context.restoreAuthSystemState(); + + configurationService.setProperty( + "org.dspace.app.rest.authorization.AlwaysThrowExceptionFeature.turnoff", "true"); + } + + private void testAdminsHavePermissionsAllDso(String feature) throws Exception { + String adminToken = getAuthToken(admin.getEmail(), password); + String communityAAdminToken = getAuthToken(communityAAdmin.getEmail(), password); + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + String siteId = ContentServiceFactory.getInstance().getSiteService().findSite(context).getID().toString(); + + // Verify the general admin has this feature on the site + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/sites/" + siteId)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin doesn’t have this feature on the site + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/sites/" + siteId)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on community A + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on community A + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on community AA + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin doesn’t have this feature on community A + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A admin doesn’t have this feature on community B + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityB.getID())) + .andExpect(status().isOk()) + 
.andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on collection X + getClient(adminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on collection X + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on collection X + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on collection X + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X admin doesn’t have this feature on collection Y + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionY.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on item 1 + getClient(adminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on item 1 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on item 1 + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on item 1 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this 
feature on item 2 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on the bundle in item 1 + getClient(adminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on the bundle in item 1 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on the bundle in item 1 + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on the bundle in item 1 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on the bundle in item 2 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on the bitstream in item 1 + getClient(adminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on the bitstream in item 1 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on the bitstream in item 1 + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on the bitstream in item 1 + getClient(item1AdminToken).perform( + 
get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on the bitstream in item 2 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + private void testAdminsHavePermissionsItem(String feature) throws Exception { + String adminToken = getAuthToken(admin.getEmail(), password); + String communityAAdminToken = getAuthToken(communityAAdmin.getEmail(), password); + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + + // Verify the general admin has this feature on item 1 + getClient(adminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on item 1 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on item 1 + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on item 1 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin doesn’t have this feature on item 2 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + } + + private void testWriteUsersHavePermissionsAllDso(String feature, boolean hasDSOAccess) throws Exception { + String communityAWriterToken = getAuthToken(communityAWriter.getEmail(), password); + String collectionXWriterToken = getAuthToken(collectionXWriter.getEmail(), password); + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + + // Verify community A write has this feature on community A if the boolean parameter is true + // (or doesn’t have access otherwise) + if (hasDSOAccess) { + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + 
"http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } else { + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + // Verify community A write doesn’t have this feature on community AA + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A write doesn’t have this feature on collection X + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A write doesn’t have this feature on item 1 + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A write doesn’t have this feature on the bundle in item 1 + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A write doesn’t have this feature on the bitstream in item 1 + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on community A + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on community AA + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write has this feature on collection X if the boolean parameter is true + // (or doesn’t have access otherwise) + if (hasDSOAccess) { + getClient(collectionXWriterToken).perform( + 
get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } else { + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + // Verify collection X write doesn’t have this feature on item 1 + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on the bundle in item 1 + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on the bitstream in item 1 + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on community A + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on community AA + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on collection X + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write has this feature on item 1 if the boolean parameter is true + // (or doesn’t have access otherwise) + if (hasDSOAccess) { + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } else { + getClient(item1WriterToken).perform( + 
get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + // Verify item 1 write doesn’t have this feature on the bundle in item 1 + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on the bitstream in item 1 + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A write doesn’t have this feature on community B + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on collection Y + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionY.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on item 2 + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + private void testWriteUsersHavePermissionsItem(String feature, boolean hasDSOAccess) throws Exception { + String communityAWriterToken = getAuthToken(communityAWriter.getEmail(), password); + String collectionXWriterToken = getAuthToken(collectionXWriter.getEmail(), password); + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + + // Verify community A write doesn’t have this feature on item 1 + getClient(communityAWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on item 1 + getClient(collectionXWriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write has this feature on item 1 if the boolean parameter is true + // (or doesn’t have access otherwise) + if (hasDSOAccess) { + 
getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } else { + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + // Verify item 1 write doesn’t have this feature on item 2 + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + } + + @Test + public void testCanManagePoliciesAdmin() throws Exception { + testAdminsHavePermissionsAllDso("canManagePolicies"); + } + + @Test + public void testCanManagePoliciesWriter() throws Exception { + testWriteUsersHavePermissionsAllDso("canManagePolicies", false); + } + + + @Test + public void testCanEditMetadataAdmin() throws Exception { + testAdminsHavePermissionsAllDso("canEditMetadata"); + } + + @Test + public void testCanEditMetadataWriter() throws Exception { + testWriteUsersHavePermissionsAllDso("canEditMetadata", true); + } + + @Test + public void testCanMoveAdmin() throws Exception { + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + String adminToken = getAuthToken(admin.getEmail(), password); + String communityAAdminToken = getAuthToken(communityAAdmin.getEmail(), password); + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + final String feature = "canMove"; + + // Verify the general admin has this feature on item 1 + getClient(adminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on item 1 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on item 1 + getClient(collectionXAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on item 1 + getClient(item1AdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A admin doesn’t have
this feature on item 2 + getClient(communityAAdminToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + + // grant item 1 admin REMOVE permissions on the item’s owning collection + // verify item 1 admin has this feature on item 1 + context.turnOffAuthorisationSystem(); + ResourcePolicy removePermission = ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(collectionX) + .withAction(Constants.REMOVE) + .withUser(item1Writer) + .build(); + context.restoreAuthSystemState(); + + // verify item 1 write has this feature on item 1 + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='canMove')]") + .exists()); + } + + @Test + public void testCanMoveWriter() throws Exception { + testWriteUsersHavePermissionsItem("canMove", false); + + // grant item 1 write REMOVE permissions on the item’s owning collection + context.turnOffAuthorisationSystem(); + ResourcePolicy removePermission = ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(collectionX) + .withAction(Constants.REMOVE) + .withUser(item1Writer) + .build(); + context.restoreAuthSystemState(); + + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + // verify item 1 write has this feature on item 1 + getClient(item1WriterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='canMove')]") + .exists()); + } + + @Test + public void testCanMakePrivateAdmin() throws Exception { + testAdminsHavePermissionsItem("canMakePrivate"); + } + + @Test + public void testCanMakePrivateWriter() throws Exception { + testWriteUsersHavePermissionsItem("canMakePrivate", true); + } + + @Test + public void testCanMakeDiscoverableAdmin() throws Exception { + testAdminsHavePermissionsItem("canMakeDiscoverable"); + } + + @Test + public void testCanMakeDiscoverableWriter() throws Exception { + testWriteUsersHavePermissionsItem("canMakeDiscoverable", true); + } + + @Test + public void testCanDeleteAdmin() throws Exception { + String adminToken = getAuthToken(admin.getEmail(), password); + String communityAAdminToken = getAuthToken(communityAAdmin.getEmail(), password); + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + String siteId = ContentServiceFactory.getInstance().getSiteService().findSite(context).getID().toString(); + final String feature = "canDelete"; + + // Verify the general admin doesn’t have this feature on the site + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/sites/" + siteId)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on community A + 
getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on community A + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on community AA + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Create a community AA admin and verify the community AA admin doesn’t have this feature on community AA + context.turnOffAuthorisationSystem(); + EPerson communityAAAdmin = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityAAAdmin@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityAA) + .withAction(Constants.ADMIN) + .withUser(communityAAAdmin) + .build(); + context.restoreAuthSystemState(); + String communityAAAdminToken = getAuthToken(communityAAAdmin.getEmail(), password); + getClient(communityAAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X admin doesn’t have this feature on community A + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify community A admin doesn’t have this feature on community B + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityB.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on collection X + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on collection X + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + 
+ // Verify collection X admin doesn’t have this feature on collection X + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 admin doesn’t have this feature on collection X + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X admin doesn’t have this feature on collection Y + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionY.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on item 1 + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on item 1 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on item 1 + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on item 1 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 admin doesn’t have this feature on item 2 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on the bundle in item 1 + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on the bundle in item 1 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + 
+ "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on the bundle in item 1 + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on the bundle in item 1 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on the bundle in item 2 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify the general admin has this feature on the bitstream in item 1 + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on the bitstream in item 1 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on the bitstream in item 1 + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on the bitstream in item 1 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin doesn’t have this feature on the bitstream in item 2 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + @Test + public void testCanDeleteAdminParent() throws Exception { + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + final String feature = "canDelete"; + + 
// Create a community AA admin, grant REMOVE permissions on community A to this user + context.turnOffAuthorisationSystem(); + EPerson communityAAAdmin = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityAAAdmin@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityA) + .withAction(Constants.REMOVE) + .withUser(communityAAAdmin) + .build(); + context.restoreAuthSystemState(); + String communityAAAdminToken = getAuthToken(communityAAAdmin.getEmail(), password); + //verify the community AA admin has this feature on community AA + getClient(communityAAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Grant REMOVE permissions on community AA for collection X admin + context.turnOffAuthorisationSystem(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityAA) + .withAction(Constants.REMOVE) + .withUser(collectionXAdmin) + .build(); + context.restoreAuthSystemState(); + // verify collection X admin has this feature on collection X + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Grant REMOVE permissions on collection X for item 1 admin + context.turnOffAuthorisationSystem(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(collectionX) + .withAction(Constants.REMOVE) + .withUser(item1Admin) + .build(); + context.restoreAuthSystemState(); + // verify item 1 admin has this feature on item 1 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } + + @Test + public void testCanDeleteWriter() throws Exception { + testWriteUsersHavePermissionsAllDso("canDelete", false); + } + + @Test + public void testCanDeleteMinimalPermissions() throws Exception { + final String feature = "canDelete"; + + // Create a new user, grant DELETE permissions on community A to this user + context.turnOffAuthorisationSystem(); + EPerson communityADeleter = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityADeleter@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityA) + .withAction(Constants.DELETE) + .withUser(communityADeleter) + .build(); + context.restoreAuthSystemState(); + String communityADeleterToken = getAuthToken(communityADeleter.getEmail(), password); + // Verify the user has this feature on community A + getClient(communityADeleterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + // Verify this user
doesn’t have this feature on community AA + getClient(communityADeleterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + + // Create a new user, grant REMOVE permissions on community A to this user + context.turnOffAuthorisationSystem(); + EPerson communityARemover = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityARemover@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityA) + .withAction(Constants.REMOVE) + .withUser(communityARemover) + .build(); + context.restoreAuthSystemState(); + String communityARemoverToken = getAuthToken(communityARemover.getEmail(), password); + // Verify the user has this feature on community AA + getClient(communityARemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + // Verify this user doesn’t have this feature on community A + getClient(communityARemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + // Verify this user doesn’t have this feature on collection X + getClient(communityARemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant REMOVE permissions on community AA to this user + context.turnOffAuthorisationSystem(); + EPerson communityAARemover = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityAARemover@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(communityAA) + .withAction(Constants.REMOVE) + .withUser(communityAARemover) + .build(); + context.restoreAuthSystemState(); + String communityAARemoverToken = getAuthToken(communityAARemover.getEmail(), password); + // Verify the user has this feature on collection X + getClient(communityAARemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + // Verify this user doesn’t have this feature on community AA + getClient(communityAARemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/communities/" + communityAA.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + // Verify this user doesn’t have this feature on item 1 + 
getClient(communityAARemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant REMOVE permissions on collection X to this user + context.turnOffAuthorisationSystem(); + EPerson collectionXRemover = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("communityXRemover@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(collectionX) + .withAction(Constants.REMOVE) + .withUser(collectionXRemover) + .build(); + context.restoreAuthSystemState(); + String collectionXRemoverToken = getAuthToken(collectionXRemover.getEmail(), password); + // Verify the user doesn’t have this feature on item 1 + getClient(collectionXRemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant DELETE permissions on item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson item1Deleter = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("item1Deleter@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.DELETE) + .withUser(item1Deleter) + .build(); + context.restoreAuthSystemState(); + String item1DeleterToken = getAuthToken(item1Deleter.getEmail(), password); + // Verify the user doesn’t have this feature on item 1 + getClient(item1DeleterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant REMOVE permissions on collection X and DELETE permissions on item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson collectionXRemoverItem1Deleter = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("collectionXDeleter@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(collectionX) + .withAction(Constants.REMOVE) + .withUser(collectionXRemoverItem1Deleter) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.DELETE) + .withUser(collectionXRemoverItem1Deleter) + .build(); + context.restoreAuthSystemState(); + String collectionXRemoverItem1DeleterToken = getAuthToken(collectionXRemoverItem1Deleter.getEmail(), password); + // Verify the user has this feature on item 1 + getClient(collectionXRemoverItem1DeleterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + // Verify this user doesn’t have this feature on collection X + getClient(collectionXRemoverItem1DeleterToken).perform( + 
get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/collections/" + collectionX.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + // Verify this user doesn’t have this feature on the bundle in item 1 + getClient(collectionXRemoverItem1DeleterToken).perform( + get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant REMOVE permissions on item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson item1Remover = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("item1Remover@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.REMOVE) + .withUser(item1Remover) + .build(); + context.restoreAuthSystemState(); + String item1RemoverToken = getAuthToken(item1Remover.getEmail(), password); + // Verify the user has this feature on the bundle in item 1 + getClient(item1RemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + // Verify this user doesn’t have this feature on item 1 + getClient(item1RemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + // Verify this user doesn’t have this feature on the bitstream in item 1 + getClient(item1RemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant REMOVE permissions on the bundle in item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson bundle1Remover = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("bundle1Remover@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(bundle1) + .withAction(Constants.REMOVE) + .withUser(bundle1Remover) + .build(); + context.restoreAuthSystemState(); + String bundle1RemoverToken = getAuthToken(bundle1Remover.getEmail(), password); + // Verify the user doesn’t have this feature on the bitstream in item 1 + getClient(bundle1RemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant REMOVE permissions on the bundle in item 1 + // and REMOVE permissions on item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson bundle1item1Remover = 
EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("bundle1item1Remover@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(bundle1) + .withAction(Constants.REMOVE) + .withUser(bundle1item1Remover) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.REMOVE) + .withUser(bundle1item1Remover) + .build(); + context.restoreAuthSystemState(); + String bundle1item1RemoverToken = getAuthToken(bundle1item1Remover.getEmail(), password); + // Verify the user has this feature on the bitstream in item 1 + getClient(bundle1item1RemoverToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bitstreams/" + bitstream1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + @Test + public void testCanReorderBitstreamsAdmin() throws Exception { + String adminToken = getAuthToken(admin.getEmail(), password); + String communityAAdminToken = getAuthToken(communityAAdmin.getEmail(), password); + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + final String feature = "canReorderBitstreams"; + + // Verify the general admin has this feature on the bundle in item 1 + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on the bundle in item 1 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on the bundle in item 1 + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on the bundle in item 1 + getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin doesn’t have this feature on the bundle in item 2 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + @Test + public void testCanReorderBitstreamsWriter() throws Exception { + String communityAWriterToken = getAuthToken(communityAWriter.getEmail(), password); + String collectionXWriterToken = 
getAuthToken(collectionXWriter.getEmail(), password); + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + final String feature = "canReorderBitstreams"; + + // Verify community A write doesn’t have this feature on the bundle in item 1 + getClient(communityAWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + // Verify collection X write doesn’t have this feature on the bundle in item 1 + getClient(collectionXWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + // Verify item 1 write doesn’t have this feature on the bundle in item 1 + getClient(item1WriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant WRITE permissions on the bundle in item 1 to this user + // Verify the user has this feature on the bundle in item 1 + getClient(communityAWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + @Test + public void testCanCreateBitstreamAdmin() throws Exception { + String adminToken = getAuthToken(admin.getEmail(), password); + String communityAAdminToken = getAuthToken(communityAAdmin.getEmail(), password); + String collectionXAdminToken = getAuthToken(collectionXAdmin.getEmail(), password); + String item1AdminToken = getAuthToken(item1Admin.getEmail(), password); + final String feature = "canCreateBitstream"; + + // Verify the general admin has this feature on the bundle in item 1 + getClient(adminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin has this feature on the bundle in item 1 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify collection X admin has this feature on the bundle in item 1 + getClient(collectionXAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify item 1 admin has this feature on the bundle in item 1 + 
getClient(item1AdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + + // Verify community A admin doesn’t have this feature on the bundle in item 2 + getClient(communityAAdminToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle2.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + } + + @Test + public void testCanCreateBitstreamWriter() throws Exception { + String communityAWriterToken = getAuthToken(communityAWriter.getEmail(), password); + String collectionXWriterToken = getAuthToken(collectionXWriter.getEmail(), password); + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + final String feature = "canCreateBitstream"; + + // Verify community A write doesn’t have this feature on the bundle in item 1 + getClient(communityAWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on the bundle in item 1 + getClient(collectionXWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on the bundle in item 1 + getClient(item1WriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant WRITE permissions on the bundle in item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson bundle1Writer = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("bundle1Writer@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(bundle1) + .withAction(Constants.WRITE) + .withUser(bundle1Writer) + .build(); + context.restoreAuthSystemState(); + String bundle1WriterToken = getAuthToken(bundle1Writer.getEmail(), password); + // Verify the user doesn’t have this feature on the bundle in item 1 + getClient(bundle1WriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant ADD permissions on the bundle in item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson bundle1Adder = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("bundle1Adder@my.edu") + .withPassword(password) + .build(); + 
ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(bundle1) + .withAction(Constants.ADD) + .withUser(bundle1Adder) + .build(); + context.restoreAuthSystemState(); + String bundle1AdderToken = getAuthToken(bundle1Adder.getEmail(), password); + // Verify the user doesn’t have this feature on the bundle in item 1 + getClient(bundle1AdderToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant ADD and WRITE permissions on the bundle in item 1 + // and ADD and WRITE permission on the item to this user + context.turnOffAuthorisationSystem(); + EPerson bundle1WriterAdder = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("bundle1WriterAdder@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(bundle1) + .withAction(Constants.ADD) + .withUser(bundle1WriterAdder) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(bundle1) + .withAction(Constants.WRITE) + .withUser(bundle1WriterAdder) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.ADD) + .withUser(bundle1WriterAdder) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.WRITE) + .withUser(bundle1WriterAdder) + .build(); + context.restoreAuthSystemState(); + String bundle1WriterAdderToken = getAuthToken(bundle1WriterAdder.getEmail(), password); + // Verify the user has this feature on the bundle in item 1 + getClient(bundle1WriterAdderToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/bundles/" + bundle1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } + + @Test + public void testCanCreateBundleAdmin() throws Exception { + testAdminsHavePermissionsItem("canCreateBundle"); + } + + @Test + public void testCanCreateBundleWriter() throws Exception { + String communityAWriterToken = getAuthToken(communityAWriter.getEmail(), password); + String collectionXWriterToken = getAuthToken(collectionXWriter.getEmail(), password); + String item1WriterToken = getAuthToken(item1Writer.getEmail(), password); + final String feature = "canCreateBundle"; + + // Verify community A write doesn’t have this feature on item 1 + getClient(communityAWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify collection X write doesn’t have this feature on item 1 + getClient(collectionXWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Verify item 1 write doesn’t have this feature on item 1 + 
getClient(item1WriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").doesNotExist()); + + // Create a new user, grant ADD and WRITE permissions on item 1 to this user + context.turnOffAuthorisationSystem(); + EPerson item1AdderWriter = EPersonBuilder.createEPerson(context) + .withNameInMetadata("Jhon", "Brown") + .withEmail("item1AdderWriter@my.edu") + .withPassword(password) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.ADD) + .withUser(item1AdderWriter) + .build(); + ResourcePolicyBuilder.createResourcePolicy(context) + .withDspaceObject(item1) + .withAction(Constants.WRITE) + .withUser(item1AdderWriter) + .build(); + context.restoreAuthSystemState(); + String item1AdderWriterToken = getAuthToken(item1AdderWriter.getEmail(), password); + // Verify the user has this feature on item 1 + getClient(item1AdderWriterToken).perform(get("/api/authz/authorizations/search/object?embed=feature&uri=" + + "http://localhost/api/core/items/" + item1.getID())) + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations[?(@._embedded.feature.id=='" + + feature + "')]").exists()); + } +} \ No newline at end of file diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/LoginOnBehalfOfFeatureRestIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/LoginOnBehalfOfFeatureRestIT.java index 207592c68a..890f8bf239 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/LoginOnBehalfOfFeatureRestIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/LoginOnBehalfOfFeatureRestIT.java @@ -7,12 +7,12 @@ */ package org.dspace.app.rest.authorization; +import static org.hamcrest.Matchers.is; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import org.dspace.app.rest.authorization.impl.LoginOnBehalfOfFeature; -import org.dspace.app.rest.builder.CommunityBuilder; import org.dspace.app.rest.converter.CommunityConverter; import org.dspace.app.rest.converter.SiteConverter; import org.dspace.app.rest.matcher.AuthorizationMatcher; @@ -21,6 +21,7 @@ import org.dspace.app.rest.model.SiteRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.utils.Utils; +import org.dspace.builder.CommunityBuilder; import org.dspace.content.Site; import org.dspace.content.service.SiteService; import org.dspace.services.ConfigurationService; @@ -101,8 +102,7 @@ public class LoginOnBehalfOfFeatureRestIT extends AbstractControllerIntegrationT .param("eperson", String.valueOf(admin.getID())) .param("feature", loginOnBehalfOf.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$._embedded.authorizations", Matchers.not(Matchers.hasItem( - AuthorizationMatcher.matchAuthorization(loginOnBehalfOfAuthorization))))); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -122,8 +122,7 @@ public class LoginOnBehalfOfFeatureRestIT extends AbstractControllerIntegrationT .param("eperson", String.valueOf(eperson.getID())) 
.param("feature", loginOnBehalfOf.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$._embedded.authorizations", Matchers.not( - Matchers.hasItem(AuthorizationMatcher.matchAuthorization(loginOnBehalfOfAuthorization))))); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -143,8 +142,7 @@ public class LoginOnBehalfOfFeatureRestIT extends AbstractControllerIntegrationT .param("eperson", String.valueOf(eperson.getID())) .param("feature", loginOnBehalfOf.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$._embedded.authorizations", Matchers.not( - Matchers.hasItem(AuthorizationMatcher.matchAuthorization(loginOnBehalfOfAuthorization))))); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -164,7 +162,6 @@ public class LoginOnBehalfOfFeatureRestIT extends AbstractControllerIntegrationT .param("eperson", String.valueOf(admin.getID())) .param("feature", loginOnBehalfOf.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$._embedded.authorizations", Matchers.not( - Matchers.hasItem(AuthorizationMatcher.matchAuthorization(loginOnBehalfOfAuthorization))))); + .andExpect(jsonPath("$.page.totalElements", is(0))); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/ReinstateFeatureRestIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/ReinstateFeatureRestIT.java index 96cd2243fc..8c630a6795 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/ReinstateFeatureRestIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/ReinstateFeatureRestIT.java @@ -7,22 +7,24 @@ */ package org.dspace.app.rest.authorization; +import static org.hamcrest.Matchers.contains; +import static org.hamcrest.Matchers.is; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import org.dspace.app.rest.authorization.impl.ReinstateFeature; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.WorkflowItemBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.converter.ItemConverter; import org.dspace.app.rest.matcher.AuthorizationMatcher; import org.dspace.app.rest.model.ItemRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.utils.Utils; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.WorkflowItemBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -82,13 +84,14 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", admin.getID().toString()) .param("feature", reinstateFeature.getName())) .andExpect(status().isOk()) - 
.andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); } @Test @@ -111,13 +114,14 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); // verify that the property core.authorization.collection-admin.item.reinstatiate = false is respected // the community admins should be still authorized @@ -127,13 +131,14 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); // now verify that the property core.authorization.community-admin.item.reinstatiate = false is respected // and also community admins are blocked @@ -143,11 +148,11 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { getClient(comAdminToken).perform(get("/api/authz/authorizations/" + authAdminWithdraw.getID())) .andExpect(status().isNotFound()); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -170,23 +175,24 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(colAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(colAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); // verify that the property 
core.authorization.collection-admin.item.reinstatiate = false is respected configurationService.setProperty("core.authorization.collection-admin.item.reinstatiate", false); getClient(colAdminToken).perform(get("/api/authz/authorizations/" + authAdminWithdraw.getID())) .andExpect(status().isNotFound()); - getClient(colAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(colAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -207,20 +213,20 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { getClient(epersonToken).perform(get("/api/authz/authorizations/" + authEpersonWithdraw.getID())) .andExpect(status().isNotFound()); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); // check the authorization for the anonymous user getClient().perform(get("/api/authz/authorizations/" + authAnonymousWithdraw.getID())) .andExpect(status().isNotFound()); - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -251,28 +257,28 @@ public class ReinstateFeatureRestIT extends AbstractControllerIntegrationTest { getClient(adminToken).perform(get("/api/authz/authorizations/" + authWithdrawnItem.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", archivedItemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); getClient(adminToken).perform(get("/api/authz/authorizations/" + authWsItem.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wsItemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); getClient(adminToken).perform(get("/api/authz/authorizations/" + authWFItem.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wfItemUri) .param("eperson", eperson.getID().toString()) .param("feature", reinstateFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/WithdrawFeatureRestIT.java 
b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/WithdrawFeatureRestIT.java index dac47a3a88..e0d00fa63a 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/WithdrawFeatureRestIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/authorization/WithdrawFeatureRestIT.java @@ -7,22 +7,24 @@ */ package org.dspace.app.rest.authorization; +import static org.hamcrest.Matchers.contains; +import static org.hamcrest.Matchers.is; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; import org.dspace.app.rest.authorization.impl.WithdrawFeature; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; -import org.dspace.app.rest.builder.WorkflowItemBuilder; -import org.dspace.app.rest.builder.WorkspaceItemBuilder; import org.dspace.app.rest.converter.ItemConverter; import org.dspace.app.rest.matcher.AuthorizationMatcher; import org.dspace.app.rest.model.ItemRest; import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractControllerIntegrationTest; import org.dspace.app.rest.utils.Utils; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.WorkflowItemBuilder; +import org.dspace.builder.WorkspaceItemBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; @@ -82,13 +84,14 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", admin.getID().toString()) .param("feature", withdrawFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); } @Test @@ -111,13 +114,14 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); // verify that the property core.authorization.collection-admin.item.withdraw = false is respected // the community admins should be still authorized @@ -127,13 +131,14 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", 
Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(status().isOk()) + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); // now verify that the property core.authorization.community-admin.item.withdraw = false is respected // and also community admins are blocked @@ -143,11 +148,11 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { getClient(comAdminToken).perform(get("/api/authz/authorizations/" + authAdminWithdraw.getID())) .andExpect(status().isNotFound()); - getClient(comAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(comAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -170,23 +175,24 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { .andExpect(jsonPath("$", Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); - getClient(colAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(colAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) .andExpect(status().isOk()) - .andExpect(jsonPath("$", - Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))); + .andExpect(jsonPath("$._embedded.authorizations", contains( + Matchers.is(AuthorizationMatcher.matchAuthorization(authAdminWithdraw)))) + ); // verify that the property core.authorization.collection-admin.item.withdraw = false is respected configurationService.setProperty("core.authorization.collection-admin.item.withdraw", false); getClient(colAdminToken).perform(get("/api/authz/authorizations/" + authAdminWithdraw.getID())) .andExpect(status().isNotFound()); - getClient(colAdminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(colAdminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -207,20 +213,20 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { getClient(epersonToken).perform(get("/api/authz/authorizations/" + authEpersonWithdraw.getID())) .andExpect(status().isNotFound()); - getClient(epersonToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(epersonToken).perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); // check the authorization 
for the anonymous user getClient().perform(get("/api/authz/authorizations/" + authAnonymousWithdraw.getID())) .andExpect(status().isNotFound()); - getClient().perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient().perform(get("/api/authz/authorizations/search/object") .param("uri", itemUri) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } @Test @@ -252,28 +258,28 @@ public class WithdrawFeatureRestIT extends AbstractControllerIntegrationTest { getClient(adminToken).perform(get("/api/authz/authorizations/" + authWithdrawnItem.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", withdrawnItemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); getClient(adminToken).perform(get("/api/authz/authorizations/" + authWsItem.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wsItemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); getClient(adminToken).perform(get("/api/authz/authorizations/" + authWFItem.getID())) .andExpect(status().isNotFound()); - getClient(adminToken).perform(get("/api/authz/authorizations/search/objectAndFeature") + getClient(adminToken).perform(get("/api/authz/authorizations/search/object") .param("uri", wfItemUri) .param("eperson", eperson.getID().toString()) .param("feature", withdrawFeature.getName())) - .andExpect(status().isNoContent()); + .andExpect(jsonPath("$.page.totalElements", is(0))); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvExportIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvExportIT.java new file mode 100644 index 0000000000..6186b4b242 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvExportIT.java @@ -0,0 +1,147 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.csv; + +import static com.jayway.jsonpath.JsonPath.read; +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.fileUpload; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.util.LinkedList; +import java.util.List; +import java.util.concurrent.atomic.AtomicReference; +import java.util.stream.Collectors; + +import com.google.gson.Gson; +import org.dspace.app.rest.converter.DSpaceRunnableParameterConverter; +import org.dspace.app.rest.matcher.ProcessMatcher; +import org.dspace.app.rest.model.ParameterValueRest; +import org.dspace.app.rest.projection.Projection; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import 
org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ProcessBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.ProcessStatus; +import org.dspace.scripts.DSpaceCommandLineParameter; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +public class CsvExportIT extends AbstractControllerIntegrationTest { + + + @Autowired + private DSpaceRunnableParameterConverter dSpaceRunnableParameterConverter; + + @Test + public void metadataExportTestWithoutFileParameterSucceeds() throws Exception { + + context.turnOffAuthorisationSystem(); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + Collection col2 = CollectionBuilder.createCollection(context, child1).withName("Collection 2").build(); + Collection col3 = CollectionBuilder.createCollection(context, child1).withName("OrgUnits").build(); + + Item article = ItemBuilder.createItem(context, col1) + .withTitle("Article") + .withIssueDate("2017-10-17") + .withRelationshipType("Publication") + .build(); + + AtomicReference idRef = new AtomicReference<>(); + + LinkedList parameters = new LinkedList<>(); + parameters.add(new DSpaceCommandLineParameter("-i", col1.getHandle())); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + try { + String token = getAuthToken(admin.getEmail(), password); + + getClient(token) + .perform(fileUpload("/api/system/scripts/metadata-export/processes") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("metadata-export", + String.valueOf(admin.getID()), parameters, + ProcessStatus.COMPLETED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + String t = ""; + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + } + + @Test + public void metadataExportTestWithFileParameterFails() throws Exception { + + context.turnOffAuthorisationSystem(); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + Collection col2 = CollectionBuilder.createCollection(context, child1).withName("Collection 2").build(); + Collection col3 = CollectionBuilder.createCollection(context, child1).withName("OrgUnits").build(); + + Item article = ItemBuilder.createItem(context, col1) + .withTitle("Article") + .withIssueDate("2017-10-17") + .withRelationshipType("Publication") + .build(); + + AtomicReference idRef = new AtomicReference<>(); + + LinkedList parameters = new LinkedList<>(); + parameters.add(new DSpaceCommandLineParameter("-f", "test.csv")); + parameters.add(new DSpaceCommandLineParameter("-i", col1.getHandle())); + + List list = parameters.stream() + 
.map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + try { + String token = getAuthToken(admin.getEmail(), password); + + getClient(token) + .perform(fileUpload("/api/system/scripts/metadata-export/processes") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("metadata-export", + String.valueOf(admin.getID()), parameters, + ProcessStatus.FAILED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + String t = ""; + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvImportIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvImportIT.java index 5955edbdda..7e3e6f6026 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvImportIT.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/csv/CsvImportIT.java @@ -7,39 +7,55 @@ */ package org.dspace.app.rest.csv; +import static com.jayway.jsonpath.JsonPath.read; import static org.hamcrest.Matchers.containsString; import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.is; +import static org.junit.Assert.assertFalse; import static org.junit.Assert.assertThat; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.fileUpload; import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; -import java.io.BufferedWriter; -import java.io.File; -import java.io.FileOutputStream; -import java.io.OutputStreamWriter; +import java.io.ByteArrayInputStream; +import java.io.InputStream; +import java.nio.charset.StandardCharsets; import java.sql.SQLException; import java.util.ArrayList; +import java.util.Arrays; import java.util.Iterator; +import java.util.LinkedList; import java.util.List; +import java.util.concurrent.atomic.AtomicReference; +import java.util.stream.Collectors; -import org.dspace.app.rest.builder.CollectionBuilder; -import org.dspace.app.rest.builder.CommunityBuilder; -import org.dspace.app.rest.builder.ItemBuilder; +import com.google.gson.Gson; +import org.dspace.app.rest.converter.DSpaceRunnableParameterConverter; +import org.dspace.app.rest.matcher.ProcessMatcher; import org.dspace.app.rest.matcher.RelationshipMatcher; +import org.dspace.app.rest.model.ParameterValueRest; +import org.dspace.app.rest.projection.Projection; import org.dspace.app.rest.test.AbstractEntityIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ProcessBuilder; import org.dspace.content.Collection; import org.dspace.content.Community; import org.dspace.content.Item; +import org.dspace.content.ProcessStatus; import org.dspace.content.Relationship; import org.dspace.content.service.EntityTypeService; import org.dspace.content.service.ItemService; import org.dspace.content.service.RelationshipService; import org.dspace.content.service.RelationshipTypeService; +import org.dspace.scripts.DSpaceCommandLineParameter; import org.hamcrest.Matchers; import org.junit.Test; import 
org.springframework.beans.factory.annotation.Autowired; +import org.springframework.http.MediaType; +import org.springframework.mock.web.MockMultipartFile; public class CsvImportIT extends AbstractEntityIntegrationTest { @@ -55,6 +71,9 @@ public class CsvImportIT extends AbstractEntityIntegrationTest { @Autowired private ItemService itemService; + @Autowired + private DSpaceRunnableParameterConverter dSpaceRunnableParameterConverter; + @Test public void createRelationshipsWithCsvImportTest() throws Exception { context.turnOffAuthorisationSystem(); @@ -119,6 +138,7 @@ public class CsvImportIT extends AbstractEntityIntegrationTest { assertArticleRelationships(article, itemB, itemC, itemF); + } private void assertItemERelationships(Item itemB, Item itemE, Item itemF) throws SQLException { @@ -132,8 +152,8 @@ public class CsvImportIT extends AbstractEntityIntegrationTest { List relationshipsForArticle = relationshipService .findByItemAndRelationshipType(context, article, relationshipTypeService .findbyTypesAndTypeName(context, entityTypeService.findByEntityType(context, "Publication"), - entityTypeService.findByEntityType(context, "Person"), "isAuthorOfPublication", - "isPublicationOfAuthor")); + entityTypeService.findByEntityType(context, "Person"), "isAuthorOfPublication", + "isPublicationOfAuthor")); assertThat(relationshipsForArticle.size(), is(3)); List expectedRelationshipsItemsForArticle = new ArrayList<>(); expectedRelationshipsItemsForArticle.add(itemC); @@ -149,7 +169,7 @@ public class CsvImportIT extends AbstractEntityIntegrationTest { } } assertThat(true, Matchers.is(actualRelationshipsItemsForArticle - .containsAll(expectedRelationshipsItemsForArticle))); + .containsAll(expectedRelationshipsItemsForArticle))); } private void updateArticleItemToAddAnotherRelationship(Collection col1, Item article, Item itemB, Item itemC, @@ -222,22 +242,104 @@ public class CsvImportIT extends AbstractEntityIntegrationTest { } private void performImportScript(String[] csv) throws Exception { - String filename = "test.csv"; - BufferedWriter out = new BufferedWriter( - new OutputStreamWriter( - new FileOutputStream(filename), "UTF-8")); - for (String csvLine : csv) { - out.write(csvLine + "\n"); - } - out.flush(); - out.close(); - out = null; + InputStream inputStream = new ByteArrayInputStream(String.join(System.lineSeparator(), + Arrays.asList(csv)) + .getBytes(StandardCharsets.UTF_8)); - runDSpaceScript("metadata-import", "-f", filename, "-e", "admin@email.com", "-s"); + MockMultipartFile bitstreamFile = new MockMultipartFile("file", + "test.csv", MediaType.TEXT_PLAIN_VALUE, + inputStream); - File file = new File(filename); - if (file.exists()) { - file.delete(); + AtomicReference idRef = new AtomicReference<>(); + + LinkedList parameters = new LinkedList<>(); + parameters.add(new DSpaceCommandLineParameter("-f", "test.csv")); + parameters.add(new DSpaceCommandLineParameter("-s", "")); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + try { + String token = getAuthToken(admin.getEmail(), password); + + getClient(token) + .perform(fileUpload("/api/system/scripts/metadata-import/processes").file(bitstreamFile) + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + String t = ""; + } finally { + 
ProcessBuilder.deleteProcess(idRef.get()); } } + + @Test + public void csvImportWithSpecifiedEPersonParameterTestShouldFailProcess() throws Exception { + context.turnOffAuthorisationSystem(); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + Collection col2 = CollectionBuilder.createCollection(context, child1).withName("Collection 2").build(); + Collection col3 = CollectionBuilder.createCollection(context, child1).withName("OrgUnits").build(); + + Item article = ItemBuilder.createItem(context, col1) + .withTitle("Article") + .withIssueDate("2017-10-17") + .withRelationshipType("Publication") + .build(); + + String csvLineString = "+," + col1.getHandle() + ",TestItemB,Person," + article + .getID().toString(); + String[] csv = {"id,collection,dc.title,relationship.type,relation.isPublicationOfAuthor", csvLineString}; + + InputStream inputStream = new ByteArrayInputStream(String.join(System.lineSeparator(), + Arrays.asList(csv)) + .getBytes(StandardCharsets.UTF_8)); + + MockMultipartFile bitstreamFile = new MockMultipartFile("file", + "test.csv", MediaType.TEXT_PLAIN_VALUE, + inputStream); + + AtomicReference idRef = new AtomicReference<>(); + + LinkedList parameters = new LinkedList<>(); + parameters.add(new DSpaceCommandLineParameter("-f", "test.csv")); + parameters.add(new DSpaceCommandLineParameter("-s", "")); + parameters.add(new DSpaceCommandLineParameter("-e", "dspace@dspace.com")); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + try { + String token = getAuthToken(admin.getEmail(), password); + + getClient(token) + .perform(fileUpload("/api/system/scripts/metadata-import/processes").file(bitstreamFile) + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("metadata-import", + String.valueOf(admin.getID()), parameters, + ProcessStatus.FAILED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + + Iterator itemIteratorItem = itemService.findByMetadataField(context, "dc", "title", null, "TestItemB"); + assertFalse(itemIteratorItem.hasNext()); + } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/eperson/DeleteEPersonSubmitterIT.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/eperson/DeleteEPersonSubmitterIT.java new file mode 100644 index 0000000000..9b5acf19fd --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/eperson/DeleteEPersonSubmitterIT.java @@ -0,0 +1,385 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.eperson; + +import static com.jayway.jsonpath.JsonPath.read; +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertNull; +import static org.junit.Assert.fail; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.delete; 
+import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.patch; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.sql.SQLException; +import java.util.ArrayList; +import java.util.List; +import java.util.UUID; +import java.util.concurrent.atomic.AtomicReference; +import javax.ws.rs.core.MediaType; + +import org.apache.logging.log4j.Logger; +import org.dspace.app.requestitem.RequestItemAuthor; +import org.dspace.app.requestitem.RequestItemAuthorExtractor; +import org.dspace.app.rest.model.patch.Operation; +import org.dspace.app.rest.model.patch.ReplaceOperation; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.EPersonBuilder; +import org.dspace.builder.WorkspaceItemBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.WorkspaceItem; +import org.dspace.content.factory.ContentServiceFactory; +import org.dspace.content.service.InstallItemService; +import org.dspace.content.service.ItemService; +import org.dspace.content.service.WorkspaceItemService; +import org.dspace.eperson.EPerson; +import org.dspace.eperson.factory.EPersonServiceFactory; +import org.dspace.eperson.service.EPersonService; +import org.dspace.services.factory.DSpaceServicesFactory; +import org.dspace.versioning.Version; +import org.dspace.versioning.factory.VersionServiceFactory; +import org.dspace.versioning.service.VersioningService; +import org.dspace.xmlworkflow.factory.XmlWorkflowServiceFactory; +import org.dspace.xmlworkflow.service.XmlWorkflowService; +import org.dspace.xmlworkflow.storedcomponents.XmlWorkflowItem; +import org.hamcrest.Matchers; +import org.junit.Before; +import org.junit.Ignore; +import org.junit.Test; + +/** + * Class to test interaction between EPerson deletion and tasks present in the workflow + */ +public class DeleteEPersonSubmitterIT extends AbstractControllerIntegrationTest { + + protected EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService(); + protected ItemService itemService = ContentServiceFactory.getInstance().getItemService(); + protected InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService(); + protected WorkspaceItemService workspaceItemService = ContentServiceFactory.getInstance() + .getWorkspaceItemService(); + protected XmlWorkflowService xmlWorkflowService = XmlWorkflowServiceFactory.getInstance().getXmlWorkflowService(); + protected VersioningService versioningService = VersionServiceFactory.getInstance().getVersionService(); + + protected RequestItemAuthorExtractor requestItemAuthorExtractor = + DSpaceServicesFactory.getInstance() + .getServiceManager() + .getServiceByName("org.dspace.app.requestitem.RequestItemAuthorExtractor", + RequestItemAuthorExtractor.class); + + + private EPerson submitter; + private EPerson submitterForVersion1; + private EPerson submitterForVersion2; + private EPerson workflowUser; + + private static final Logger log = org.apache.logging.log4j.LogManager.getLogger(DeleteEPersonSubmitterIT.class); + + /** + * This method will be run before every test as per @Before. 
It will + * initialize resources required for the tests. + * + * Other methods can be annotated with @Before here or in subclasses but no + * execution order is guaranteed + */ + @Before + @Override + public void setUp() throws Exception { + super.setUp(); + + context.turnOffAuthorisationSystem(); + + submitter = EPersonBuilder.createEPerson(context).withEmail("submitter@example.org").build(); + workflowUser = EPersonBuilder.createEPerson(context).withEmail("workflowUser@example.org").build(); + submitterForVersion1 = EPersonBuilder.createEPerson(context).withEmail("submitterForVersion1@example.org") + .build(); + submitterForVersion2 = EPersonBuilder.createEPerson(context).withEmail("submitterForVersion2@example.org") + .build(); + + context.restoreAuthSystemState(); + + } + + + /** + * This test verifies that when the submitter Eperson is deleted, the delete succeeds and the item will have + * 'null' as submitter + * + * @throws Exception + */ + @Test + public void testArchivedItemSubmitterDelete() throws Exception { + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(submitter) + .withTitle("Test Item") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Item installItem = installItemService.installItem(context, wsi); + + assertDeletionOfEperson(submitter); + + assertNull(retrieveItemSubmitter(installItem.getID())); + + + Item item = itemService.find(context, installItem.getID()); + RequestItemAuthor requestItemAuthor = requestItemAuthorExtractor.getRequestItemAuthor(context, item); + + assertEquals("Help Desk", requestItemAuthor.getFullName()); + assertEquals("dspace-help@myu.edu", requestItemAuthor.getEmail()); + } + + /** + * This test verifies that when the submitter Eperson is deleted, the delete succeeds and the item will have + * 'null' as submitter + * + * @throws Exception + */ + @Test + public void testWIthdrawnItemSubmitterDelete() throws Exception { + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(submitter) + .withTitle("Test Item") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Item item = installItemService.installItem(context, wsi); + + List opsToWithDraw = new ArrayList(); + ReplaceOperation replaceOperationToWithDraw = new ReplaceOperation("/withdrawn", true); + opsToWithDraw.add(replaceOperationToWithDraw); + String patchBodyToWithdraw = getPatchContent(opsToWithDraw); + + // withdraw item + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(patch("/api/core/items/" + item.getID()) + .content(patchBodyToWithdraw) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.uuid", Matchers.is(item.getID().toString()))) + .andExpect(jsonPath("$.withdrawn", Matchers.is(true))) + .andExpect(jsonPath("$.inArchive", Matchers.is(false))); + + + assertDeletionOfEperson(submitter); + + assertNull(retrieveItemSubmitter(item.getID())); + + List opsToReinstate = new ArrayList(); + ReplaceOperation replaceOperationToReinstate = new 
ReplaceOperation("/withdrawn", false); + opsToReinstate.add(replaceOperationToReinstate); + String patchBodyToReinstate = getPatchContent(opsToReinstate); + + // reinstate item + getClient(token).perform(patch("/api/core/items/" + item.getID()) + .content(patchBodyToReinstate) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.uuid", Matchers.is(item.getID().toString()))) + .andExpect(jsonPath("$.withdrawn", Matchers.is(false))) + .andExpect(jsonPath("$.inArchive", Matchers.is(true))); + + assertNull(retrieveItemSubmitter(item.getID())); + + + // withdraw item again + getClient(token).perform(patch("/api/core/items/" + item.getID()) + .content(patchBodyToWithdraw) + .contentType(MediaType.APPLICATION_JSON_PATCH_JSON)) + .andExpect(status().isOk()) + .andExpect(jsonPath("$.uuid", Matchers.is(item.getID().toString()))) + .andExpect(jsonPath("$.withdrawn", Matchers.is(true))) + .andExpect(jsonPath("$.inArchive", Matchers.is(false))); + + assertNull(retrieveItemSubmitter(item.getID())); + + } + + @Test + public void testVersionItemSubmitterDelete() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(submitter) + .withTitle("Test Item") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + Item item = installItemService.installItem(context, wsi); + + context.setCurrentUser(submitter); + //TODO: Replace this with a REST call when possible + Version version1 = versioningService.createNewVersion(context, item); + Integer version1ID = version1.getID(); + WorkspaceItem version1WorkspaceItem = workspaceItemService.findByItem(context, version1.getItem()); + installItemService.installItem(context, version1WorkspaceItem); + + assertDeletionOfEperson(submitter); + assertNull(retrieveItemSubmitter(item.getID())); + + Item version1Item = retrieveVersionItem(version1ID); + assertNull(retrieveItemSubmitter(version1Item.getID())); + + + context.setCurrentUser(submitterForVersion1); + + Version version2 = versioningService.createNewVersion(context, item); + Integer version2ID = version2.getID(); + WorkspaceItem version2WorkspaceItem = workspaceItemService.findByItem(context, version2.getItem()); + installItemService.installItem(context, version2WorkspaceItem); + Item version2Item = retrieveVersionItem(version2ID); + assertEquals(submitterForVersion1.getID(), retrieveItemSubmitter(version2Item.getID()).getID()); + + context.setCurrentUser(submitterForVersion2); + Version version3 = versioningService.createNewVersion(context, version2Item); + Integer version3ID = version3.getID(); + assertDeletionOfEperson(submitterForVersion2); + + getClient(token).perform(get("/api/versioning/versions/" + version3ID + "/item")) + .andExpect(status().isNoContent()); + + + // Clean up versions + cleanupVersion(version1ID); + cleanupVersion(version2ID); + cleanupVersion(version3ID); + + } + + + @Test + public void testWorkspaceItemSubmitterDelete() throws Exception { + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .build(); + + WorkspaceItem wsi = 
WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(submitter) + .withTitle("Test Item") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/submission/workspaceitems/" + wsi.getID())) + .andExpect(status().isOk()); + + assertDeletionOfEperson(submitter); + + getClient(token).perform(get("/api/submission/workspaceitems/" + wsi.getID())) + .andExpect(status().isNotFound()); + } + + @Ignore + @Test + /** + * TODO: This test currently fails with a status 500 + */ + public void testWorkflowItemSubmitterDelete() throws Exception { + context.turnOffAuthorisationSystem(); + + Community parent = CommunityBuilder.createCommunity(context).build(); + Collection collection = CollectionBuilder.createCollection(context, parent) + .withWorkflowGroup(1, workflowUser) + .build(); + + WorkspaceItem wsi = WorkspaceItemBuilder.createWorkspaceItem(context, collection) + .withSubmitter(submitter) + .withTitle("Test Item") + .withIssueDate("2019-03-06") + .withSubject("ExtraEntry") + .build(); + + XmlWorkflowItem workflowItem = xmlWorkflowService.startWithoutNotify(context, wsi); + + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/workflow/workflowitems/" + workflowItem.getID())) + .andExpect(status().isOk()); + + assertDeletionOfEperson(submitter); + + getClient(token).perform(get("/api/workflow/workflowitems/" + workflowItem.getID())) + .andExpect(status().isOk()); + + } + + + private void assertDeletionOfEperson(EPerson ePerson) throws SQLException { + try { + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(delete("/api/eperson/epersons/" + ePerson.getID())) + .andExpect(status().isNoContent()); + + } catch (Exception ex) { + log.error("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + ": ", ex); + fail("Caught an Exception while deleting an EPerson. " + ex.getClass().getName() + + ": " + ex.getMessage()); + } + EPerson ePersonCheck = ePersonService.find(context, ePerson.getID()); + assertNull(ePersonCheck); + } + + /** + * TODO: This method currently retrieves the submitter through the itemService. 
A method should be added to retrieve + * TODO: this through the REST API + */ + private EPerson retrieveItemSubmitter(UUID itemID) throws Exception { + + Item item = itemService.find(context, itemID); + return item.getSubmitter(); + + } + + private Item retrieveVersionItem(int id) throws Exception { + AtomicReference idRef = new AtomicReference<>(); + + String token = getAuthToken(admin.getEmail(), password); + getClient(token).perform(get("/api/versioning/versions/" + id + "/item")) + .andExpect(status().isOk()) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.uuid"))); + + return itemService.find(context, UUID.fromString(idRef.get())); + } + + private void cleanupVersion(int id) throws SQLException { + Version version = versioningService.getVersion(context, id); + versioningService.removeVersion(context, version); + + } + + +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/jackson/IgnoreJacksonWriteOnlyAccess.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/jackson/IgnoreJacksonWriteOnlyAccess.java new file mode 100644 index 0000000000..df68ce8ab3 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/jackson/IgnoreJacksonWriteOnlyAccess.java @@ -0,0 +1,35 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.jackson; + +import com.fasterxml.jackson.annotation.JsonProperty; +import com.fasterxml.jackson.databind.introspect.Annotated; +import com.fasterxml.jackson.databind.introspect.JacksonAnnotationIntrospector; + +/** + * This is a custom JacksonAnnotationIntrospector which allows us to ignore `@JsonProperty(access = Access + * .WRITE_ONLY)` annotations in our tests. + * Normally, this annotation allows the property to be written to (during deserialization), + * but does NOT allow it to be read (during serialization). + * In some tests, we need to ignore this annotation so that the test can use/verify the property + * during both serialization & deserialization. 
+ * + * In order to use this class in a test, assign it to the current mapper like this: + * mapper.setAnnotationIntrospector(new IgnoreJacksonWriteOnlyAccess()); + */ +public class IgnoreJacksonWriteOnlyAccess extends JacksonAnnotationIntrospector { + + @Override + public JsonProperty.Access findPropertyAccess(Annotated m) { + JsonProperty.Access access = super.findPropertyAccess(m); + if (access == JsonProperty.Access.WRITE_ONLY) { + return JsonProperty.Access.AUTO; + } + return access; + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/BundleMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/BundleMatcher.java index 4fb5606293..812daabf58 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/BundleMatcher.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/BundleMatcher.java @@ -48,7 +48,8 @@ public class BundleMatcher { public static Matcher matchFullEmbeds() { return matchEmbeds( "bitstreams[]", - "primaryBitstream" + "primaryBitstream", + "item" ); } @@ -57,6 +58,7 @@ public class BundleMatcher { */ public static Matcher matchLinks(UUID uuid) { return HalMatcher.matchLinks(REST_SERVER_URL + "core/bundles/" + uuid, + "item", "bitstreams", "primaryBitstream", "self" diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/CommunityMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/CommunityMatcher.java index 8b7e5669de..59f99794d8 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/CommunityMatcher.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/CommunityMatcher.java @@ -58,12 +58,23 @@ public class CommunityMatcher { ); } + public static Matcher matchCommunityEntryNonAdminEmbeds(String name, UUID uuid, String handle) { + return allOf( + matchProperties(name, uuid, handle), + hasJsonPath("$._embedded.collections", Matchers.not(Matchers.empty())), + hasJsonPath("$._embedded.logo", Matchers.not(Matchers.empty())), + matchLinks(uuid), + matchNonAdminEmbeds() + ); + } + public static Matcher matchCommunityEntryFullProjection(String name, UUID uuid, String handle) { return allOf( matchProperties(name, uuid, handle), hasJsonPath("$._embedded.collections", Matchers.not(Matchers.empty())), hasJsonPath("$._embedded.logo", Matchers.not(Matchers.empty())), - matchLinks(uuid) + matchLinks(uuid), + matchFullEmbeds() ); } @@ -82,7 +93,7 @@ public class CommunityMatcher { /** * Gets a matcher for all expected embeds when the full projection is requested. */ - public static Matcher matchFullEmbeds() { + public static Matcher matchNonAdminEmbeds() { return matchEmbeds( "collections[]", "logo", @@ -91,6 +102,19 @@ ); } + /** + * Gets a matcher for all expected embeds when the full projection is requested. + */ + public static Matcher matchFullEmbeds() { + return matchEmbeds( + "collections[]", + "logo", + "parentCommunity", + "subcommunities[]", + "adminGroup" + ); + } + /** + * Gets a matcher for all expected links.
*/ @@ -117,7 +141,7 @@ public class CommunityMatcher { ); } - public static String getFullEmbedsParameters() { + public static String getNonAdminEmbeds() { return "collections,logo,parentCommunity,subcommunities"; } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/RegistrationMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/RegistrationMatcher.java new file mode 100644 index 0000000000..a154091a2e --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/RegistrationMatcher.java @@ -0,0 +1,29 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.matcher; + +import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; +import static org.hamcrest.Matchers.allOf; +import static org.hamcrest.Matchers.is; + +import java.util.UUID; + +import org.hamcrest.Matcher; + +public class RegistrationMatcher { + + private RegistrationMatcher(){} + + public static Matcher matchRegistration(String email, UUID epersonUuid) { + return allOf( + hasJsonPath("$.email", is(email)), + hasJsonPath("$.user", is(epersonUuid == null ? null : String.valueOf(epersonUuid))) + ); + + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/SubmissionFormFieldMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/SubmissionFormFieldMatcher.java index 67f2494cf3..773a751b9f 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/SubmissionFormFieldMatcher.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/SubmissionFormFieldMatcher.java @@ -28,8 +28,8 @@ public class SubmissionFormFieldMatcher { /** * Shortcut for the - * {@link SubmissionFormFieldMatcher#matchFormFieldDefinition(String, String, String, boolean, String, String, String)} - * with a null style + * {@link SubmissionFormFieldMatcher#matchFormFieldDefinition(String, String, String, boolean, String, String, String, String)} + * with a null style and vocabulary name * * @param type * the expected input type @@ -53,7 +53,9 @@ public class SubmissionFormFieldMatcher { } /** - * Check the json representation of a submission form + * Shortcut for the + * {@link SubmissionFormFieldMatcher#matchFormFieldDefinition(String, String, String, boolean, String, String, String, String)} + * with a null controlled vocabulary * * @param type * the expected input type @@ -74,13 +76,45 @@ public class SubmissionFormFieldMatcher { * @return a Matcher for all the condition above */ public static Matcher matchFormFieldDefinition(String type, String label, String mandatoryMessage, - boolean repeatable, - String hints, String style, String metadata) { + boolean repeatable, + String hints, String style, String metadata) { + return matchFormFieldDefinition(type, label, mandatoryMessage, repeatable, hints, style, metadata, null); + } + + /** + * Check the json representation of a submission form + * + * @param type + * the expected input type + * @param label + * the expected label + * @param mandatoryMessage + * the expected mandatoryMessage, can be null. If not empty the field is expected to be flagged as + * mandatory + * @param repeatable + * the expected repeatable flag + * @param hints + * the expected hints message + * @param style + * the expected style for the field, can be null. 
If null the corresponding json path is expected to be + missing + * @param metadata + * the expected metadata + * @param controlledVocabulary + * the expected controlled vocabulary, can be null. If null the corresponding json path is expected to be + * missing + * @return a Matcher for all the conditions above + */ + public static Matcher matchFormFieldDefinition(String type, String label, String mandatoryMessage, + boolean repeatable, String hints, String style, + String metadata, String controlledVocabulary) { return allOf( // check each field definition hasJsonPath("$.input.type", is(type)), hasJsonPath("$.label", containsString(label)), hasJsonPath("$.selectableMetadata[0].metadata", is(metadata)), + controlledVocabulary != null ? hasJsonPath("$.selectableMetadata[0].controlledVocabulary", + is(controlledVocabulary)) : hasNoJsonPath("$.selectableMetadata[0].controlledVocabulary"), mandatoryMessage != null ? hasJsonPath("$.mandatoryMessage", containsString(mandatoryMessage)) : hasNoJsonPath("$.mandatoryMessage"), hasJsonPath("$.mandatory", is(mandatoryMessage != null)), diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/UsageReportMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/UsageReportMatcher.java new file mode 100644 index 0000000000..ba82d69b40 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/UsageReportMatcher.java @@ -0,0 +1,61 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.matcher; + +import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; +import static org.hamcrest.Matchers.allOf; +import static org.hamcrest.Matchers.is; + +import java.util.List; +import java.util.stream.Collectors; + +import org.dspace.app.rest.model.UsageReportPointRest; +import org.hamcrest.Matcher; +import org.hamcrest.Matchers; + +/** + * Matcher to match {@link UsageReportRest} + * + * @author Maria Verdonck (Atmire) on 10/06/2020 + */ +public class UsageReportMatcher { + + private UsageReportMatcher() { + } + + /** + * Matcher for the usage report on just id and report-type + * + * @param id Id to match against the id of the UsageReport json + * @param reportType Report type to match against the report-type of the UsageReport json + * @return The matcher + */ + private static Matcher matchUsageReport(String id, String reportType) { + return allOf( + hasJsonPath("$.id", is(id)), + hasJsonPath("$.report-type", is(reportType))); + } + + /** + * Matcher for the usage report including the {@link UsageReportPointRest} points + * + * @param id Id to match against the id of the UsageReport json + * @param reportType Report type to match against the report-type of the UsageReport json + * @param points List of points to match against the json of UsageReport's list of points + * @return The matcher + */ + public static Matcher matchUsageReport(String id, String reportType, + List points) { + return allOf( + matchUsageReport(id, reportType), + hasJsonPath("$.points", Matchers.containsInAnyOrder( + points.stream().map(point -> UsageReportPointMatcher + .matchUsageReportPoint(point.getId(), point.getType(), point.getValues().get("views"))) + .collect(Collectors.toList())))); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/UsageReportPointMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/UsageReportPointMatcher.java
new file mode 100644 index 0000000000..e7f22066e5 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/UsageReportPointMatcher.java @@ -0,0 +1,42 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.matcher; + +import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; +import static org.hamcrest.Matchers.allOf; +import static org.hamcrest.Matchers.is; + +import org.dspace.app.rest.model.UsageReportPointRest; +import org.hamcrest.Matcher; + +/** + * Matcher to match {@link UsageReportPointRest} + * + * @author Maria Verdonck (Atmire) on 10/06/2020 + */ +public class UsageReportPointMatcher { + + private UsageReportPointMatcher() { + } + + /** + * Matcher for the usage report points (see {@link UsageReportPointRest}) + * + * @param id Id to match against the id of the UsageReportPoint json + * @param type Type to match against the type of the UsageReportPoint json + * @param views Number of views, stored in the values map of the UsageReportPoint under the key "views" + * @return The matcher + */ + public static Matcher matchUsageReportPoint(String id, String type, int views) { + return allOf( + hasJsonPath("$.id", is(id)), + hasJsonPath("$.type", is(type)), + hasJsonPath("$.values.views", is(views)) + ); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/AuthorityEntryMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/VocabularyEntryDetailsMatcher.java similarity index 68% rename from dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/AuthorityEntryMatcher.java rename to dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/VocabularyEntryDetailsMatcher.java index 5758d3ee65..8eb2cba3c4 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/AuthorityEntryMatcher.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/VocabularyEntryDetailsMatcher.java @@ -8,7 +8,6 @@ package org.dspace.app.rest.matcher; import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; -import static org.dspace.app.rest.matcher.HalMatcher.matchEmbeds; import static org.hamcrest.Matchers.allOf; import static org.hamcrest.Matchers.containsString; import static org.hamcrest.Matchers.is; @@ -18,20 +17,20 @@ import org.hamcrest.Matcher; /** * This matcher has been created so that we can use a predefined Matcher class to verify Authority Entries */ -public class AuthorityEntryMatcher { +public class VocabularyEntryDetailsMatcher { - private AuthorityEntryMatcher() { + private VocabularyEntryDetailsMatcher() { } public static Matcher matchAuthorityEntry(String id, String display, String value) { return allOf( matchProperties(id, display, value), - matchLinks()); + matchLinks(id)); } - public static Matcher matchLinks() { + public static Matcher matchLinks(String id) { return allOf( - hasJsonPath("$._links.self.href", containsString("api/integration/authority/"))); + hasJsonPath("$._links.self.href", containsString("api/submission/vocabularyEntryDetails/" + id))); } private static Matcher matchProperties(String id, String display, String value) { @@ -39,16 +38,7 @@ public class AuthorityEntryMatcher { hasJsonPath("$.id", is(id)), hasJsonPath("$.display", is(display)), hasJsonPath("$.value", is(value)), - hasJsonPath("$.type", is("authority")) - ); - } - - /** - * Gets a
matcher for all expected embeds when the full projection is requested. - */ - public static Matcher matchFullEmbeds() { - return matchEmbeds( - "authorityEntries" + hasJsonPath("$.type", is("vocabularyEntryDetail")) ); } } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/VocabularyMatcher.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/VocabularyMatcher.java new file mode 100644 index 0000000000..6e23560911 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/matcher/VocabularyMatcher.java @@ -0,0 +1,44 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.matcher; + +import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath; +import static org.hamcrest.Matchers.allOf; +import static org.hamcrest.Matchers.is; + +import org.hamcrest.Matcher; + +/** + * + * + * @author mykhaylo + * + */ +public class VocabularyMatcher { + + private VocabularyMatcher() {} + + public static Matcher matchProperties(String id, String name, + boolean scrollable, boolean hierarchical) { + return allOf( + hasJsonPath("$.id", is(id)), + hasJsonPath("$.name", is(name)), + hasJsonPath("$.scrollable", is(scrollable)), + hasJsonPath("$.hierarchical", is(hierarchical)), + hasJsonPath("$.type", is("vocabulary")) + ); + } + + public static Matcher matchVocabularyEntry(String display, String value, String type) { + return allOf( + hasJsonPath("$.display", is(display)), + hasJsonPath("$.value", is(value)), + hasJsonPath("$.type", is(type)) + ); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPluginTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPluginTest.java index 21fcd60b38..9ff264a1c7 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPluginTest.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/EPersonRestPermissionEvaluatorPluginTest.java @@ -19,27 +19,40 @@ import java.util.List; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.model.patch.Patch; import org.dspace.app.rest.model.patch.ReplaceOperation; +import org.dspace.services.RequestService; import org.junit.Before; import org.junit.Test; +import org.junit.runner.RunWith; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.junit.MockitoJUnitRunner; import org.springframework.security.core.Authentication; +import org.springframework.test.util.ReflectionTestUtils; /** * This class verifies that {@link EPersonRestPermissionEvaluatorPlugin} properly * evaluates Patch requests. 
*/ +@RunWith(MockitoJUnitRunner.class) public class EPersonRestPermissionEvaluatorPluginTest { + @InjectMocks private EPersonRestPermissionEvaluatorPlugin ePersonRestPermissionEvaluatorPlugin; private Authentication authentication; + @Mock + private RequestService requestService; + @Before public void setUp() throws Exception { ePersonRestPermissionEvaluatorPlugin = spy(EPersonRestPermissionEvaluatorPlugin.class); authentication = mock(Authentication.class); DSpaceRestPermission restPermission = DSpaceRestPermission.convert("WRITE"); when(ePersonRestPermissionEvaluatorPlugin - .hasDSpacePermission(authentication, null, null, restPermission)).thenReturn(true); + .hasDSpacePermission(authentication, null, null, restPermission)).thenReturn(true); + ReflectionTestUtils.setField(ePersonRestPermissionEvaluatorPlugin, "requestService", requestService); + when(requestService.getCurrentRequest()).thenReturn(null); } @Test @@ -52,7 +65,7 @@ public class EPersonRestPermissionEvaluatorPluginTest { ops.add(canLoginOperation); Patch patch = new Patch(ops); assertFalse(ePersonRestPermissionEvaluatorPlugin - .hasPatchPermission(authentication, null, null, patch)); + .hasPatchPermission(authentication, null, null, patch)); } @@ -64,7 +77,7 @@ public class EPersonRestPermissionEvaluatorPluginTest { ops.add(passwordOperation); Patch patch = new Patch(ops); assertTrue(ePersonRestPermissionEvaluatorPlugin - .hasPatchPermission(authentication, null, null, patch)); + .hasPatchPermission(authentication, null, null, patch)); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/JWTTokenHandlerTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/JWTTokenHandlerTest.java index aeda2d0e18..3b0eb84793 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/JWTTokenHandlerTest.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/JWTTokenHandlerTest.java @@ -24,6 +24,7 @@ import org.dspace.core.Context; import org.dspace.eperson.EPerson; import org.dspace.eperson.service.EPersonService; import org.dspace.service.ClientInfoService; +import org.dspace.services.ConfigurationService; import org.junit.After; import org.junit.Before; import org.junit.Test; @@ -46,28 +47,31 @@ public class JWTTokenHandlerTest { @InjectMocks @Spy - JWTTokenHandler jwtTokenHandler; + private LoginJWTTokenHandler loginJWTTokenHandler; @Mock - private Context context; + protected ConfigurationService configurationService; @Mock - private EPerson ePerson; + protected Context context; @Mock - private HttpServletRequest httpServletRequest; + protected EPerson ePerson; @Mock - private EPersonService ePersonService; + protected HttpServletRequest httpServletRequest; @Mock - private EPersonClaimProvider ePersonClaimProvider; + protected EPersonService ePersonService; @Mock - private ClientInfoService clientInfoService; + protected EPersonClaimProvider ePersonClaimProvider; + + @Mock + protected ClientInfoService clientInfoService; @Spy - private List jwtClaimProviders = new ArrayList<>(); + protected List jwtClaimProviders = new ArrayList<>(); @Before public void setUp() throws Exception { @@ -87,7 +91,7 @@ public class JWTTokenHandlerTest { @Test public void testJWTNoEncryption() throws Exception { Date previous = new Date(System.currentTimeMillis() - 10000000000L); - String token = jwtTokenHandler + String token = loginJWTTokenHandler .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); SignedJWT signedJWT = 
SignedJWT.parse(token); String personId = (String) signedJWT.getJWTClaimsSet().getClaim(EPersonClaimProvider.EPERSON_ID); @@ -96,11 +100,11 @@ public class JWTTokenHandlerTest { @Test(expected = ParseException.class) public void testJWTEncrypted() throws Exception { - when(jwtTokenHandler.isEncryptionEnabled()).thenReturn(true); + when(loginJWTTokenHandler.isEncryptionEnabled()).thenReturn(true); Date previous = new Date(System.currentTimeMillis() - 10000000000L); StringKeyGenerator keyGenerator = KeyGenerators.string(); - when(jwtTokenHandler.getEncryptionKey()).thenReturn(keyGenerator.generateKey().getBytes()); - String token = jwtTokenHandler + when(configurationService.getProperty("jwt.login.encryption.secret")).thenReturn(keyGenerator.generateKey()); + String token = loginJWTTokenHandler .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); SignedJWT signedJWT = SignedJWT.parse(token); } @@ -108,12 +112,12 @@ public class JWTTokenHandlerTest { //temporary set a negative expiration time so the token is invalid immediately @Test public void testExpiredToken() throws Exception { - when(jwtTokenHandler.getExpirationPeriod()).thenReturn(-99999999L); + when(configurationService.getLongProperty("jwt.login.token.expiration", 1800000)).thenReturn(-99999999L); when(ePersonClaimProvider.getEPerson(any(Context.class), any(JWTClaimsSet.class))).thenReturn(ePerson); Date previous = new Date(new Date().getTime() - 10000000000L); - String token = jwtTokenHandler + String token = loginJWTTokenHandler .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); - EPerson parsed = jwtTokenHandler.parseEPersonFromToken(token, httpServletRequest, context); + EPerson parsed = loginJWTTokenHandler.parseEPersonFromToken(token, httpServletRequest, context); assertEquals(null, parsed); } @@ -121,17 +125,17 @@ public class JWTTokenHandlerTest { //Try if we can change the expiration date @Test public void testTokenTampering() throws Exception { - when(jwtTokenHandler.getExpirationPeriod()).thenReturn(-99999999L); + when(loginJWTTokenHandler.getExpirationPeriod()).thenReturn(-99999999L); when(ePersonClaimProvider.getEPerson(any(Context.class), any(JWTClaimsSet.class))).thenReturn(ePerson); Date previous = new Date(new Date().getTime() - 10000000000L); - String token = jwtTokenHandler + String token = loginJWTTokenHandler .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); JWTClaimsSet jwtClaimsSet = new JWTClaimsSet.Builder().claim("eid", "epersonID").expirationTime( new Date(System.currentTimeMillis() + 99999999)).build(); String tamperedPayload = new String(Base64.getUrlEncoder().encode(jwtClaimsSet.toString().getBytes())); String[] splitToken = token.split("\\."); String tamperedToken = splitToken[0] + "." + tamperedPayload + "." 
+ splitToken[2]; - EPerson parsed = jwtTokenHandler.parseEPersonFromToken(tamperedToken, httpServletRequest, context); + EPerson parsed = loginJWTTokenHandler.parseEPersonFromToken(tamperedToken, httpServletRequest, context); assertEquals(null, parsed); } @@ -139,12 +143,12 @@ public class JWTTokenHandlerTest { public void testInvalidatedToken() throws Exception { Date previous = new Date(System.currentTimeMillis() - 10000000000L); // create a new token - String token = jwtTokenHandler + String token = loginJWTTokenHandler .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); // immediately invalidate it - jwtTokenHandler.invalidateToken(token, new MockHttpServletRequest(), context); + loginJWTTokenHandler.invalidateToken(token, new MockHttpServletRequest(), context); // Check if it is still valid by trying to parse the EPerson from it (should return null) - EPerson parsed = jwtTokenHandler.parseEPersonFromToken(token, httpServletRequest, context); + EPerson parsed = loginJWTTokenHandler.parseEPersonFromToken(token, httpServletRequest, context); assertEquals(null, parsed); } diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/ShortLivedJWTTokenHandlerTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/ShortLivedJWTTokenHandlerTest.java new file mode 100644 index 0000000000..795694b202 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/security/jwt/ShortLivedJWTTokenHandlerTest.java @@ -0,0 +1,120 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.app.rest.security.jwt; + +import static org.junit.Assert.assertEquals; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.when; + +import java.text.ParseException; +import java.util.ArrayList; +import java.util.Base64; +import java.util.Date; +import javax.servlet.http.HttpServletRequest; + +import com.nimbusds.jwt.JWTClaimsSet; +import com.nimbusds.jwt.SignedJWT; +import org.dspace.core.Context; +import org.dspace.eperson.EPerson; +import org.junit.Before; +import org.junit.Test; +import org.junit.runner.RunWith; +import org.mockito.InjectMocks; +import org.mockito.Mockito; +import org.mockito.Spy; +import org.mockito.junit.MockitoJUnitRunner; +import org.springframework.mock.web.MockHttpServletRequest; +import org.springframework.security.crypto.keygen.KeyGenerators; +import org.springframework.security.crypto.keygen.StringKeyGenerator; + +/** + * Test suite for the short lived authentication token + */ +@RunWith(MockitoJUnitRunner.class) +public class ShortLivedJWTTokenHandlerTest extends JWTTokenHandlerTest { + @InjectMocks + @Spy + private ShortLivedJWTTokenHandler shortLivedJWTTokenHandler; + + @Before + @Override + public void setUp() throws Exception { + when(ePerson.getSessionSalt()).thenReturn("01234567890123456789012345678901"); + when(context.getCurrentUser()).thenReturn(ePerson); + when(clientInfoService.getClientIp(any())).thenReturn("123.123.123.123"); + when(ePersonClaimProvider.getKey()).thenReturn("eid"); + when(ePersonClaimProvider.getValue(any(), Mockito.any(HttpServletRequest.class))).thenReturn("epersonID"); + jwtClaimProviders.add(ePersonClaimProvider); + } + + @Test + public void testJWTNoEncryption() throws Exception { + Date previous = new Date(System.currentTimeMillis() - 
10000000000L); + String token = shortLivedJWTTokenHandler + .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); + SignedJWT signedJWT = SignedJWT.parse(token); + String personId = (String) signedJWT.getJWTClaimsSet().getClaim(EPersonClaimProvider.EPERSON_ID); + assertEquals("epersonID", personId); + } + + @Test(expected = ParseException.class) + public void testJWTEncrypted() throws Exception { + when(shortLivedJWTTokenHandler.isEncryptionEnabled()).thenReturn(true); + Date previous = new Date(System.currentTimeMillis() - 10000000000L); + StringKeyGenerator keyGenerator = KeyGenerators.string(); + when(configurationService.getProperty("jwt.shortLived.encryption.secret")) + .thenReturn(keyGenerator.generateKey()); + String token = shortLivedJWTTokenHandler + .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); + SignedJWT signedJWT = SignedJWT.parse(token); + } + + //temporary set a negative expiration time so the token is invalid immediately + @Test + public void testExpiredToken() throws Exception { + when(configurationService.getLongProperty("jwt.shortLived.token.expiration", 1800000)) + .thenReturn(-99999999L); + when(ePersonClaimProvider.getEPerson(any(Context.class), any(JWTClaimsSet.class))).thenReturn(ePerson); + Date previous = new Date(new Date().getTime() - 10000000000L); + String token = shortLivedJWTTokenHandler + .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); + EPerson parsed = shortLivedJWTTokenHandler.parseEPersonFromToken(token, httpServletRequest, context); + assertEquals(null, parsed); + + } + + //Try if we can change the expiration date + @Test + public void testTokenTampering() throws Exception { + when(shortLivedJWTTokenHandler.getExpirationPeriod()).thenReturn(-99999999L); + when(ePersonClaimProvider.getEPerson(any(Context.class), any(JWTClaimsSet.class))).thenReturn(ePerson); + Date previous = new Date(new Date().getTime() - 10000000000L); + String token = shortLivedJWTTokenHandler + .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); + JWTClaimsSet jwtClaimsSet = new JWTClaimsSet.Builder().claim("eid", "epersonID").expirationTime( + new Date(System.currentTimeMillis() + 99999999)).build(); + String tamperedPayload = new String(Base64.getUrlEncoder().encode(jwtClaimsSet.toString().getBytes())); + String[] splitToken = token.split("\\."); + String tamperedToken = splitToken[0] + "." + tamperedPayload + "." 
+ splitToken[2]; + EPerson parsed = shortLivedJWTTokenHandler.parseEPersonFromToken(tamperedToken, httpServletRequest, context); + assertEquals(null, parsed); + } + + @Test + public void testInvalidatedToken() throws Exception { + Date previous = new Date(System.currentTimeMillis() - 10000000000L); + // create a new token + String token = shortLivedJWTTokenHandler + .createTokenForEPerson(context, new MockHttpServletRequest(), previous, new ArrayList<>()); + // immediately invalidate it + shortLivedJWTTokenHandler.invalidateToken(token, new MockHttpServletRequest(), context); + // Check if it is still valid by trying to parse the EPerson from it (should return null) + EPerson parsed = shortLivedJWTTokenHandler.parseEPersonFromToken(token, httpServletRequest, context); + assertEquals(null, parsed); + } +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractControllerIntegrationTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractControllerIntegrationTest.java index de9003b2fe..98a7101a9b 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractControllerIntegrationTest.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractControllerIntegrationTest.java @@ -19,6 +19,7 @@ import javax.servlet.Filter; import com.fasterxml.jackson.core.JsonProcessingException; import com.fasterxml.jackson.databind.ObjectMapper; import org.apache.commons.lang3.StringUtils; +import org.dspace.AbstractIntegrationTestWithDatabase; import org.dspace.app.rest.Application; import org.dspace.app.rest.model.patch.Operation; import org.dspace.app.rest.utils.DSpaceConfigurationInitializer; @@ -134,12 +135,29 @@ public class AbstractControllerIntegrationTest extends AbstractIntegrationTestWi .andReturn().getResponse(); } + public MockHttpServletResponse getAuthResponseWithXForwardedForHeader(String user, String password, + String xForwardedFor) throws Exception { + return getClient().perform(post("/api/authn/login") + .param("user", user) + .param("password", password) + .header("X-Forwarded-For", xForwardedFor)) + .andReturn().getResponse(); + } + + public String getAuthToken(String user, String password) throws Exception { return StringUtils.substringAfter( getAuthResponse(user, password).getHeader(AUTHORIZATION_HEADER), AUTHORIZATION_TYPE); } + public String getAuthTokenWithXForwardedForHeader(String user, String password, String xForwardedFor) + throws Exception { + return StringUtils.substringAfter( + getAuthResponseWithXForwardedForHeader(user, password, xForwardedFor).getHeader(AUTHORIZATION_HEADER), + AUTHORIZATION_TYPE); + } + public String getPatchContent(List ops) { ObjectMapper objectMapper = new ObjectMapper(); try { diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractEntityIntegrationTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractEntityIntegrationTest.java index dec2461779..0f771df0b9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractEntityIntegrationTest.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractEntityIntegrationTest.java @@ -7,8 +7,8 @@ */ package org.dspace.app.rest.test; -import org.dspace.app.rest.builder.EntityTypeBuilder; -import org.dspace.app.rest.builder.RelationshipTypeBuilder; +import org.dspace.builder.EntityTypeBuilder; +import org.dspace.builder.RelationshipTypeBuilder; import org.dspace.content.EntityType; import org.dspace.content.service.EntityTypeService; 
import org.junit.Before; @@ -37,6 +37,7 @@ public class AbstractEntityIntegrationTest extends AbstractControllerIntegration * in relationship-types.xml */ @Before + @Override public void setUp() throws Exception { super.setUp(); diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractWebClientIntegrationTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractWebClientIntegrationTest.java index eef602f47c..9083887581 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractWebClientIntegrationTest.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/test/AbstractWebClientIntegrationTest.java @@ -8,6 +8,7 @@ package org.dspace.app.rest.test; import org.apache.commons.lang3.StringUtils; +import org.dspace.AbstractIntegrationTestWithDatabase; import org.dspace.app.rest.Application; import org.dspace.app.rest.utils.DSpaceConfigurationInitializer; import org.dspace.app.rest.utils.DSpaceKernelInitializer; @@ -102,6 +103,7 @@ public class AbstractWebClientIntegrationTest extends AbstractIntegrationTestWit * @param path path to perform GET against * @param username Username (may be null to perform an unauthenticated POST) * @param password Password + * @param requestEntity unknown -- not used. * @return ResponseEntity with a String body */ public ResponseEntity postResponseAsString(String path, String username, String password, diff --git a/dspace-server-webapp/src/test/java/org/dspace/app/rest/utils/DiscoverQueryBuilderTest.java b/dspace-server-webapp/src/test/java/org/dspace/app/rest/utils/DiscoverQueryBuilderTest.java index 195cc31027..5a1e7cd1a9 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/app/rest/utils/DiscoverQueryBuilderTest.java +++ b/dspace-server-webapp/src/test/java/org/dspace/app/rest/utils/DiscoverQueryBuilderTest.java @@ -7,6 +7,7 @@ */ package org.dspace.app.rest.utils; +import static java.util.Collections.emptyList; import static org.dspace.discovery.configuration.DiscoveryConfigurationParameters.SORT.COUNT; import static org.dspace.discovery.configuration.DiscoveryConfigurationParameters.SORT.VALUE; import static org.dspace.discovery.configuration.DiscoveryConfigurationParameters.TYPE_HIERARCHICAL; @@ -14,10 +15,11 @@ import static org.dspace.discovery.configuration.DiscoveryConfigurationParameter import static org.hamcrest.Matchers.allOf; import static org.hamcrest.Matchers.contains; import static org.hamcrest.Matchers.containsInAnyOrder; +import static org.hamcrest.Matchers.empty; +import static org.hamcrest.Matchers.emptyOrNullString; import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.hasSize; import static org.hamcrest.Matchers.is; -import static org.hamcrest.Matchers.isEmptyOrNullString; import static org.junit.Assert.assertThat; import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.anyInt; @@ -181,7 +183,7 @@ public class DiscoverQueryBuilderTest { assertThat(discoverQuery.getFilterQueries(), containsInAnyOrder("archived:true", "subject:\"Java\"")); assertThat(discoverQuery.getQuery(), is(query)); - assertThat(discoverQuery.getDSpaceObjectFilter(), is(IndexableItem.TYPE)); + assertThat(discoverQuery.getDSpaceObjectFilters(), contains(IndexableItem.TYPE)); assertThat(discoverQuery.getSortField(), is("dc.title_sort")); assertThat(discoverQuery.getSortOrder(), is(DiscoverQuery.SORT_ORDER.asc)); assertThat(discoverQuery.getMaxResults(), is(10)); @@ -203,11 +205,11 @@ public class DiscoverQueryBuilderTest { @Test 
public void testBuildQueryDefaults() throws Exception { DiscoverQuery discoverQuery = - queryBuilder.buildQuery(context, null, discoveryConfiguration, null, null, null, null); + queryBuilder.buildQuery(context, null, discoveryConfiguration, null, null, emptyList(), null); assertThat(discoverQuery.getFilterQueries(), containsInAnyOrder("archived:true")); - assertThat(discoverQuery.getQuery(), isEmptyOrNullString()); - assertThat(discoverQuery.getDSpaceObjectFilter(), isEmptyOrNullString()); + assertThat(discoverQuery.getQuery(), is(emptyOrNullString())); + assertThat(discoverQuery.getDSpaceObjectFilters(), is(empty())); //Note this should actually be "dc.date.accessioned_dt" but remember that our searchService is just a stupid // mock assertThat(discoverQuery.getSortField(), is("dc.date.accessioned_sort")); @@ -233,11 +235,11 @@ public class DiscoverQueryBuilderTest { page = PageRequest.of(2, 10, Sort.Direction.ASC, "SCORE"); DiscoverQuery discoverQuery = - queryBuilder.buildQuery(context, null, discoveryConfiguration, null, null, null, page); + queryBuilder.buildQuery(context, null, discoveryConfiguration, null, null, emptyList(), page); assertThat(discoverQuery.getFilterQueries(), containsInAnyOrder("archived:true")); - assertThat(discoverQuery.getQuery(), isEmptyOrNullString()); - assertThat(discoverQuery.getDSpaceObjectFilter(), is(isEmptyOrNullString())); + assertThat(discoverQuery.getQuery(), is(emptyOrNullString())); + assertThat(discoverQuery.getDSpaceObjectFilters(), is(empty())); //Note this should actually be "dc.date.accessioned_dt" but remember that our searchService is just a stupid // mock assertThat(discoverQuery.getSortField(), is("score_sort")); @@ -297,8 +299,8 @@ public class DiscoverQueryBuilderTest { assertThat(discoverQuery.getFilterQueries(), containsInAnyOrder("archived:true", "subject:\"Java\"")); assertThat(discoverQuery.getQuery(), is(query)); - assertThat(discoverQuery.getDSpaceObjectFilter(), is(IndexableItem.TYPE)); - assertThat(discoverQuery.getSortField(), isEmptyOrNullString()); + assertThat(discoverQuery.getDSpaceObjectFilters(), contains(IndexableItem.TYPE)); + assertThat(discoverQuery.getSortField(), is(emptyOrNullString())); assertThat(discoverQuery.getMaxResults(), is(0)); assertThat(discoverQuery.getStart(), is(0)); assertThat(discoverQuery.getFacetMinCount(), is(1)); diff --git a/dspace-server-webapp/src/test/java/org/dspace/curate/CurationScriptIT.java b/dspace-server-webapp/src/test/java/org/dspace/curate/CurationScriptIT.java new file mode 100644 index 0000000000..66c0319857 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/curate/CurationScriptIT.java @@ -0,0 +1,385 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.curate; + +import static com.jayway.jsonpath.JsonPath.read; +import static org.hamcrest.Matchers.is; +import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath; +import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status; + +import java.io.File; +import java.util.LinkedList; +import java.util.List; +import java.util.concurrent.atomic.AtomicReference; +import java.util.stream.Collectors; + +import com.google.gson.Gson; +import 
org.dspace.app.rest.converter.DSpaceRunnableParameterConverter; +import org.dspace.app.rest.matcher.ProcessMatcher; +import org.dspace.app.rest.model.ParameterValueRest; +import org.dspace.app.rest.model.ProcessRest; +import org.dspace.app.rest.model.ScriptRest; +import org.dspace.app.rest.projection.Projection; +import org.dspace.app.rest.test.AbstractControllerIntegrationTest; +import org.dspace.builder.CollectionBuilder; +import org.dspace.builder.CommunityBuilder; +import org.dspace.builder.ItemBuilder; +import org.dspace.builder.ProcessBuilder; +import org.dspace.content.Collection; +import org.dspace.content.Community; +import org.dspace.content.Item; +import org.dspace.content.ProcessStatus; +import org.dspace.scripts.DSpaceCommandLineParameter; +import org.junit.Test; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * IT for {@link Curation} + * + * @author Maria Verdonck (Atmire) on 24/06/2020 + */ +public class CurationScriptIT extends AbstractControllerIntegrationTest { + + @Autowired + private DSpaceRunnableParameterConverter dSpaceRunnableParameterConverter; + + private final static String SCRIPTS_ENDPOINT = "/api/" + ScriptRest.CATEGORY + "/" + ScriptRest.PLURAL_NAME; + private final static String CURATE_SCRIPT_ENDPOINT = SCRIPTS_ENDPOINT + "/curate/" + ProcessRest.PLURAL_NAME; + + @Test + public void curateScript_invalidTaskOption() throws Exception { + context.turnOffAuthorisationSystem(); + + String token = getAuthToken(admin.getEmail(), password); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-i", publicItem1.getHandle())); + parameters.add(new DSpaceCommandLineParameter("-t", "invalidTaskOption")); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + context.restoreAuthSystemState(); + + // Request with -t + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + // Illegal Argument Exception + .andExpect(status().isBadRequest()); + } + + @Test + public void curateScript_MissingHandle() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-t", CurationClientOptions.getTaskOptions().get(0))); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + // Request with missing required -i + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + // Illegal Argument Exception + .andExpect(status().isBadRequest()); + } + + @Test + public void curateScript_invalidHandle() 
throws Exception { + String token = getAuthToken(admin.getEmail(), password); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-i", "invalidhandle")); + parameters.add(new DSpaceCommandLineParameter("-t", CurationClientOptions.getTaskOptions().get(0))); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + // Request with invalid -i handle + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + // Illegal Argument Exception + .andExpect(status().isBadRequest()); + } + + @Test + public void curateScript_MissingTaskOrTaskFile() throws Exception { + context.turnOffAuthorisationSystem(); + + String token = getAuthToken(admin.getEmail(), password); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-i", publicItem1.getHandle())); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + context.restoreAuthSystemState(); + + // Request without -t or -T (and no -q ) + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + // Illegal Argument Exception + .andExpect(status().isBadRequest()); + } + + @Test + public void curateScript_InvalidScope() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-i", "all")); + parameters.add(new DSpaceCommandLineParameter("-s", "invalidScope")); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + // Request with invalid -s ; must be object, curation or open + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + // Illegal Argument Exception + .andExpect(status().isBadRequest()); + } + + @Test + public void curateScript_InvalidTaskFile() throws Exception { + String token = getAuthToken(admin.getEmail(), password); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-i", "all")); + parameters.add(new DSpaceCommandLineParameter("-T", "invalidTaskFile")); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + // Request with invalid -T task file + getClient(token) + 
.perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + // Illegal Argument Exception + .andExpect(status().isBadRequest()); + } + + @Test + public void curateScript_validRequest_Task() throws Exception { + context.turnOffAuthorisationSystem(); + + String token = getAuthToken(admin.getEmail(), password); + AtomicReference idRef = new AtomicReference<>(); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-i", publicItem1.getHandle())); + parameters.add(new DSpaceCommandLineParameter("-t", CurationClientOptions.getTaskOptions().get(0))); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + context.restoreAuthSystemState(); + + try { + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("curate", + String.valueOf(admin.getID()), parameters, + ProcessStatus.COMPLETED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + } + + @Test + public void curateScript_validRequest_TaskFile() throws Exception { + context.turnOffAuthorisationSystem(); + + String token = getAuthToken(admin.getEmail(), password); + AtomicReference idRef = new AtomicReference<>(); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + File taskFile = new File(testProps.get("test.curateTaskFile").toString()); + + LinkedList parameters = new LinkedList<>(); + parameters.add(new DSpaceCommandLineParameter("-i", publicItem1.getHandle())); + parameters.add(new DSpaceCommandLineParameter("-T", taskFile.getAbsolutePath())); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + + context.restoreAuthSystemState(); + + try { + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + .andExpect(status().isAccepted()) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("curate", + 
String.valueOf(admin.getID()), parameters, + ProcessStatus.COMPLETED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + } + + @Test + public void curateScript_EPersonInParametersFails() throws Exception { + context.turnOffAuthorisationSystem(); + + String token = getAuthToken(admin.getEmail(), password); + + parentCommunity = CommunityBuilder.createCommunity(context) + .withName("Parent Community") + .build(); + Community child1 = CommunityBuilder.createSubCommunity(context, parentCommunity) + .withName("Sub Community") + .build(); + Collection col1 = CollectionBuilder.createCollection(context, child1).withName("Collection 1").build(); + + Item publicItem1 = ItemBuilder.createItem(context, col1) + .withTitle("Public item 1") + .withIssueDate("2017-10-17") + .withAuthor("Smith, Donald").withAuthor("Doe, John") + .withSubject("ExtraEntry") + .build(); + + LinkedList parameters = new LinkedList<>(); + + parameters.add(new DSpaceCommandLineParameter("-e", eperson.getEmail())); + parameters.add(new DSpaceCommandLineParameter("-i", publicItem1.getHandle())); + parameters.add(new DSpaceCommandLineParameter("-t", CurationClientOptions.getTaskOptions().get(0))); + + List list = parameters.stream() + .map(dSpaceCommandLineParameter -> dSpaceRunnableParameterConverter + .convert(dSpaceCommandLineParameter, Projection.DEFAULT)) + .collect(Collectors.toList()); + AtomicReference idRef = new AtomicReference<>(); + + context.restoreAuthSystemState(); + try { + + getClient(token) + .perform(post(CURATE_SCRIPT_ENDPOINT).contentType("multipart/form-data") + .param("properties", + new Gson().toJson(list))) + .andExpect(jsonPath("$", is( + ProcessMatcher.matchProcess("curate", + String.valueOf(admin.getID()), parameters, + ProcessStatus.FAILED)))) + .andDo(result -> idRef + .set(read(result.getResponse().getContentAsString(), "$.processId"))); + } finally { + ProcessBuilder.deleteProcess(idRef.get()); + } + } + + + +} diff --git a/dspace-server-webapp/src/test/java/org/dspace/scripts/impl/MockDSpaceRunnableScript.java b/dspace-server-webapp/src/test/java/org/dspace/scripts/impl/MockDSpaceRunnableScript.java index 960927e90a..0a4242e469 100644 --- a/dspace-server-webapp/src/test/java/org/dspace/scripts/impl/MockDSpaceRunnableScript.java +++ b/dspace-server-webapp/src/test/java/org/dspace/scripts/impl/MockDSpaceRunnableScript.java @@ -15,6 +15,10 @@ import org.dspace.utils.DSpace; public class MockDSpaceRunnableScript extends DSpaceRunnable { @Override public void internalRun() throws Exception { + handler.logInfo("Logging INFO for Mock DSpace Script"); + handler.logError("Logging ERROR for Mock DSpace Script"); + handler.logWarning("Logging WARNING for Mock DSpace Script"); + handler.logDebug("Logging DEBUG for Mock DSpace Script"); } @Override diff --git a/dspace-server-webapp/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java b/dspace-server-webapp/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java deleted file mode 100644 index 245bb8f86f..0000000000 --- a/dspace-server-webapp/src/test/java/org/dspace/statistics/MockSolrLoggerServiceImpl.java +++ /dev/null @@ -1,101 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.statistics; - -import static 
org.mockito.ArgumentMatchers.any; -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; - -import java.net.InetAddress; -import java.util.ArrayList; -import java.util.Collections; -import java.util.HashMap; -import java.util.List; - -import com.maxmind.geoip2.DatabaseReader; -import com.maxmind.geoip2.model.CityResponse; -import com.maxmind.geoip2.record.City; -import com.maxmind.geoip2.record.Continent; -import com.maxmind.geoip2.record.Country; -import com.maxmind.geoip2.record.Location; -import com.maxmind.geoip2.record.MaxMind; -import com.maxmind.geoip2.record.Postal; -import com.maxmind.geoip2.record.RepresentedCountry; -import com.maxmind.geoip2.record.Traits; -import org.apache.logging.log4j.LogManager; -import org.apache.logging.log4j.Logger; -import org.dspace.solr.MockSolrServer; -import org.springframework.beans.factory.DisposableBean; -import org.springframework.beans.factory.InitializingBean; -import org.springframework.stereotype.Service; - -/** - * Mock service that uses an embedded SOLR server for the statistics core. - * - *

    - * NOTE: this class overrides one of the same name - * defined in dspace-api and declared as a bean there. - * See {@code test/data/dspaceFolder/config/spring/api/solr-services.xml}. Some kind of classpath - * magic makes this work. - */ -@Service -public class MockSolrLoggerServiceImpl - extends SolrLoggerServiceImpl - implements InitializingBean, DisposableBean { - - private static final Logger log = LogManager.getLogger(); - - private MockSolrServer mockSolrServer; - - public MockSolrLoggerServiceImpl() { - } - - @Override - public void afterPropertiesSet() throws Exception { - // Initialize our service with a Mock Solr statistics core - mockSolrServer = new MockSolrServer("statistics"); - solr = mockSolrServer.getSolrServer(); - - // Mock GeoIP's DatabaseReader - DatabaseReader reader = mock(DatabaseReader.class); - // Ensure that any tests requesting a city() get a mock/fake CityResponse - when(reader.city(any(InetAddress.class))).thenReturn(mockCityResponse()); - // Save this mock DatabaseReader to be used by SolrLoggerService - locationService = reader; - } - - /** - * A mock/fake GeoIP CityResponse, which will be used for *all* test statistical requests - * @return faked CityResponse - */ - private CityResponse mockCityResponse() { - List cityNames = new ArrayList(Collections.singleton("New York")); - City city = new City(cityNames, 1, 1, new HashMap()); - - List countryNames = new ArrayList(Collections.singleton("United States")); - Country country = new Country(countryNames, 1, 1, "US", new HashMap()); - - Location location = new Location(1, 1, 40.760498D, -73.9933D, 501, 1, "EST"); - - Postal postal = new Postal("10036", 1); - - return new CityResponse(city, new Continent(), country, location, new MaxMind(), postal, - country, new RepresentedCountry(), new ArrayList<>(0), - new Traits()); - } - - /** Remove all records. 
*/ - public void reset() { - mockSolrServer.reset(); - } - - @Override - public void destroy() throws Exception { - mockSolrServer.destroy(); - } -} diff --git a/dspace-server-webapp/src/test/java/org/dspace/statistics/export/service/MockOpenUrlServiceImpl.java b/dspace-server-webapp/src/test/java/org/dspace/statistics/export/service/MockOpenUrlServiceImpl.java new file mode 100644 index 0000000000..14ac9d36d5 --- /dev/null +++ b/dspace-server-webapp/src/test/java/org/dspace/statistics/export/service/MockOpenUrlServiceImpl.java @@ -0,0 +1,41 @@ +/** + * The contents of this file are subject to the license and copyright + * detailed in the LICENSE and NOTICE files at the root of the source + * tree and available online at + * + * http://www.dspace.org/license/ + */ +package org.dspace.statistics.export.service; + +import java.io.IOException; +import java.net.HttpURLConnection; +import java.util.ArrayList; + +import org.apache.commons.lang.StringUtils; +import org.springframework.beans.factory.annotation.Autowired; + +/** + * Mock OpenUrlService that will ensure that IRUS tracker does need to be contacted in order to test the functionality + */ +public class MockOpenUrlServiceImpl extends OpenUrlServiceImpl { + + @Autowired + ArrayList testProcessedUrls; + + /** + * Returns a response code to simulate contact to the external url + * When the url contains "fail", a fail code 500 will be returned + * Otherwise the success code 200 will be returned + * @param urlStr + * @return 200 or 500 depending on whether the "fail" keyword is present in the url + * @throws IOException + */ + protected int getResponseCodeFromUrl(final String urlStr) throws IOException { + if (StringUtils.contains(urlStr, "fail")) { + return HttpURLConnection.HTTP_INTERNAL_ERROR; + } else { + testProcessedUrls.add(urlStr); + return HttpURLConnection.HTTP_OK; + } + } +} diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test-3-entries.bib b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test-3-entries.bib new file mode 100644 index 0000000000..4d197ff90f --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test-3-entries.bib @@ -0,0 +1,14 @@ +@misc{ Nobody01, + author = "Nobody Jr", + title = "My Article", + year = "2006" } + +@misc{ Nobody02, + author = "Nobody Jr", + title = "My Article 2", + year = "2006" } + +@misc{ Nobody03, + author = "Nobody Jr", + title = "My Article 3", + year = "2018" } diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test.bib b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test.bib index 4d197ff90f..d6e0d992a4 100644 --- a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test.bib +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/bibtex-test.bib @@ -1,14 +1,4 @@ @misc{ Nobody01, author = "Nobody Jr", title = "My Article", - year = "2006" } - -@misc{ Nobody02, - author = "Nobody Jr", - title = "My Article 2", - year = "2006" } - -@misc{ Nobody03, - author = "Nobody Jr", - title = "My Article 3", - year = "2018" } + year = "2006" } \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/csv-missing-field-test.csv b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/csv-missing-field-test.csv new file mode 100644 index 0000000000..7f3f5cb750 --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/csv-missing-field-test.csv @@ -0,0 +1,2 @@ 
+Title,Author,Year,Journal,Abstract,ISSN,Type +My Article,"Nobody, \"Try escape, in item\"",,My Journal,"This is my abstract, i use comma to check escape works fine",Mock ISSN \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/csv-test.csv b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/csv-test.csv new file mode 100644 index 0000000000..d5bc35a77b --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/csv-test.csv @@ -0,0 +1,2 @@ +Title,Author,Year,Journal,Abstract,ISSN,Type +My Article,Nobody,2006,My Journal,"This is my abstract, i use comma to check escape works fine",Mock ISSN,Mock subtype \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/endnote-test.enw b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/endnote-test.enw new file mode 100644 index 0000000000..25cc749d92 --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/endnote-test.enw @@ -0,0 +1,10 @@ +FN +VR +SO My Journal +PY 2005 +AB This is my abstract +AU Author 1 +AU Author 2 +TI My Title +ER +EF \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/pubmed-test.xml b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/pubmed-test.xml new file mode 100644 index 0000000000..3fdceb3880 --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/pubmed-test.xml @@ -0,0 +1,151 @@ + + + + + + 15117179 + + 2005 + 02 + 15 + + + 2006 + 11 + 15 + +

    + + 0003-2700 + + 76 + 9 + + 2004 + May + 01 + + + Analytical chemistry + Anal. Chem. + + Multistep microreactions with proteins using electrocapture technology. + + 2425-9 + + + A method to perform multistep reactions by means of electroimmobilization of a target molecule in a microflow stream is presented. A target protein is captured by the opposing effects between the hydrodynamic and electric forces, after which another medium is injected into the system. The second medium carries enzymes or other reagents, which are brought into contact with the target protein and react. The immobilization is reversed by disconnecting the electric field, upon which products are collected at the outlet of the device for analysis. On-line reduction, alkylation, and trypsin digestion of proteins is demonstrated and was monitored by MALDI mass spectrometry. + + + + Astorga-Wells + Juan + J + + Department of Medical Biochemistry and Biophysics, Karolinska Institutet, SE-171 77 Stockholm, Sweden. + + + + Bergman + Tomas + T + + + Jörnvall + Hans + H + + + eng + + Journal Article + Research Support, Non-U.S. Gov't + +
    + + United States + Anal Chem + 0370536 + 0003-2700 + + + + 0 + Proteins + + + EC 3.4.21.4 + Trypsin + + + IM + + + Animals + + + Cattle + + + Electrochemistry + + + Horses + + + Microfluidics + instrumentation + methods + + + Peptide Mapping + methods + + + Proteins + analysis + chemistry + + + Spectrometry, Mass, Matrix-Assisted Laser Desorption-Ionization + methods + + + Trypsin + chemistry + + + + + + + 2004 + 5 + 1 + 5 + 0 + + + 2005 + 2 + 16 + 9 + 0 + + + 2004 + 5 + 1 + 5 + 0 + + + ppublish + + 15117179 + 10.1021/ac0354342 + + + + + \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/ris-test.ris b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/ris-test.ris new file mode 100644 index 0000000000..e056e6ace2 --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/ris-test.ris @@ -0,0 +1,20 @@ +TY - CHAP +AU - Just, Mike +ED - van Tilborg, Henk C. A. +PY - 2005 +DA - 2005// +TI - Challenge–Response Identification +T1 - Challenge–Response Identification second title +BT - Encyclopedia of Cryptography and Security +SP - 73 +EP - 74 +PB - Springer US +CY - Boston, MA +SN - 978-0-387-23483-0 +SO - My Journal +UR - https://doi.org/10.1007/0-387-23483-7_56 +DO - 10.1007/0-387-23483-7_56 +ID - Just2005 +PT - Mock subtype +AB - This is the abstract +ER - \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/tsv-missing-field-test.tsv b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/tsv-missing-field-test.tsv new file mode 100644 index 0000000000..86659b9a38 --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/tsv-missing-field-test.tsv @@ -0,0 +1,2 @@ +Title Author Year Journal Abstract ISSN Type +My Article "Nobody, \"Try escape in item\"" My Journal "This is my abstract, i use tab to check escape works fine" Mock ISSN \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/org/dspace/app/rest/tsv-test.tsv b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/tsv-test.tsv new file mode 100644 index 0000000000..506a725c96 --- /dev/null +++ b/dspace-server-webapp/src/test/resources/org/dspace/app/rest/tsv-test.tsv @@ -0,0 +1,2 @@ +Title Author Year Journal Abstract ISSN Type +My Article Nobody 2006 My Journal "This is my abstract i'm using use tab to check escape works fine" Mock ISSN Mock subtype \ No newline at end of file diff --git a/dspace-server-webapp/src/test/resources/test-config.properties b/dspace-server-webapp/src/test/resources/test-config.properties index 273d93c968..3af96b20fc 100644 --- a/dspace-server-webapp/src/test/resources/test-config.properties +++ b/dspace-server-webapp/src/test/resources/test-config.properties @@ -11,3 +11,6 @@ test.folder.assetstore = ./target/testing/dspace/assetstore #Path for a test file to create bitstreams test.bitstream = ./target/testing/dspace/assetstore/ConstitutionofIreland.pdf + +#Path for a test Taskfile for the curate script +test.curateTaskFile = ./target/testing/dspace/assetstore/curate.txt diff --git a/dspace-services/pom.xml b/dspace-services/pom.xml index 1670c8454f..b9128654b4 100644 --- a/dspace-services/pom.xml +++ b/dspace-services/pom.xml @@ -9,7 +9,7 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT @@ -18,21 +18,20 @@ - + - test-environment + unit-test-environment false - maven.test.skip + skipUnitTests false - - + maven-surefire-plugin diff --git 
a/dspace-services/src/main/java/org/dspace/kernel/ServiceManager.java b/dspace-services/src/main/java/org/dspace/kernel/ServiceManager.java index 7254bee989..e4cca677c7 100644 --- a/dspace-services/src/main/java/org/dspace/kernel/ServiceManager.java +++ b/dspace-services/src/main/java/org/dspace/kernel/ServiceManager.java @@ -21,7 +21,8 @@ import org.springframework.context.ConfigurableApplicationContext; public interface ServiceManager { /** - * Get the application context + * Get the application context. + * @return the Spring application context. */ public ConfigurableApplicationContext getApplicationContext(); @@ -46,18 +47,14 @@ public interface ServiceManager { * service manager objects. If using Spring this allows access to the * underlying ApplicationContext object like so:
    * {@code getServiceByName(ApplicationContext.class.getName(), ApplicationContext.class);} - * If using Guice then the same applies like so:
    - * {@code getServiceByName(Injector.class.getName(), Injector.class);} - * It is also possible to register a module and cause Guice to fill - * in any injected core services (see register method). *

    * - * @param Class type + * @param Class type. * @param name (optional) the unique name for this service. * If null then the bean will be returned if there is only one * service of this type. - * @param type the type for the requested service (this will typically be the interface class but can be concrete - * as well) + * @param type the type for the requested service (this will typically be + * the interface class but can be concrete as well). * @return the service singleton OR null if none is found */ public T getServiceByName(String name, Class type); @@ -90,12 +87,6 @@ public interface ServiceManager { * down the context (webapp, etc.) that registered the service so * that the full lifecycle completes correctly. *
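The getServiceByName() contract documented above is easiest to see with a concrete lookup. A minimal sketch, assuming the usual org.dspace.utils.DSpace helper for obtaining the ServiceManager; the class name is hypothetical and the two lookups simply mirror the behaviour the Javadoc describes (null name for a unique service of a type, and the Spring context fetched by its class name):

    import org.dspace.kernel.ServiceManager;
    import org.dspace.services.EventService;
    import org.dspace.utils.DSpace;
    import org.springframework.context.ApplicationContext;

    public class ServiceLookupSketch {
        public static void lookupExamples() {
            ServiceManager serviceManager = new DSpace().getServiceManager();

            // Passing null for the name returns the single registered service of that type.
            EventService events = serviceManager.getServiceByName(null, EventService.class);

            // Per the Javadoc above, the underlying Spring context can be fetched the same way.
            ApplicationContext ctx = serviceManager.getServiceByName(
                    ApplicationContext.class.getName(), ApplicationContext.class);
        }
    }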

    - *

    - * NOTE: if using Guice it is possible to register a Guice - * Module as a service, which will not actually register it but will - * cause anything in the Module to have existing core services injected - * into it. You can use anything as the name in this case. - *

    * * @param name the name of the service (must be unique) * @param service the object to register as a singleton service @@ -103,6 +94,14 @@ public interface ServiceManager { */ public void registerService(String name, Object service); + /** + * Add a singleton service at runtime, but do not inject dependencies. + * Typically used with a service instance that has already had all + * dependencies injected explicitly, for example in test code. + * + * @param name the name of the service (must be unique). + * @param service the instance to register as a singleton service. + */ public void registerServiceNoAutowire(String name, Object service); /** @@ -112,7 +111,7 @@ public interface ServiceManager { * except that it allows the core service manager to startup your * service for you instead of you providing a service to the core. * In general, it is better if you use your own service manager - * (like Spring or Guice) to manage your services and simply + * (like Spring) to manage your services and simply * inherit the core service beans from the DSpace core service * manager using the special capabilities of * {@link #getServiceByName(String, Class)}. diff --git a/dspace-services/src/main/java/org/dspace/kernel/mixins/InitializedService.java b/dspace-services/src/main/java/org/dspace/kernel/mixins/InitializedService.java deleted file mode 100644 index 780c879582..0000000000 --- a/dspace-services/src/main/java/org/dspace/kernel/mixins/InitializedService.java +++ /dev/null @@ -1,26 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.kernel.mixins; - -/** - * Allow the service or provider to be initialized when it is started - * by the service manager. After all injections are complete the init - * method will be called. Any initialization that a service needs to do - * should happen here. - * - * @author Aaron Zeckoski (azeckoski @ gmail.com) - */ -public interface InitializedService { - - /** - * Executed after the service is created and all dependencies and - * configurations injected. - */ - public void init(); - -} diff --git a/dspace-services/src/main/java/org/dspace/kernel/mixins/ShutdownService.java b/dspace-services/src/main/java/org/dspace/kernel/mixins/ShutdownService.java deleted file mode 100644 index 9a1bb02ded..0000000000 --- a/dspace-services/src/main/java/org/dspace/kernel/mixins/ShutdownService.java +++ /dev/null @@ -1,26 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.kernel.mixins; - - -/** - * Allow the service to be notified when the service manager is shutting - * it down. This will typically be called when the kernel is stopped or - * destroyed. Any cleanup that a service needs to do when it is - * shut down should happen here. - * - * @author Aaron Zeckoski (azeckoski @ gmail.com) - */ -public interface ShutdownService { - - /** - * Called as the service manager is stopping or shutting down. 
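A short illustration of the registerServiceNoAutowire() method documented above, roughly as test code might use it. This is only a sketch: MockGeoService is a hypothetical stand-in for a hand-wired test double, and the cleanup call assumes the matching unregisterService() on the same interface.

    import org.dspace.kernel.ServiceManager;
    import org.dspace.utils.DSpace;

    public class NoAutowireRegistrationSketch {

        /** Hypothetical test double whose dependencies are set by hand. */
        public static class MockGeoService {
        }

        public static void registerForTest() {
            ServiceManager serviceManager = new DSpace().getServiceManager();
            MockGeoService mock = new MockGeoService();

            // Register the instance as-is: no autowiring, no dependency injection.
            serviceManager.registerServiceNoAutowire(MockGeoService.class.getName(), mock);

            // ... exercise code that looks the service up by name ...

            serviceManager.unregisterService(MockGeoService.class.getName());
        }
    }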
- */ - public void shutdown(); - -} diff --git a/dspace-services/src/main/java/org/dspace/servicemanager/DSpaceServiceManager.java b/dspace-services/src/main/java/org/dspace/servicemanager/DSpaceServiceManager.java index d3f2b03ff2..c17b31f68d 100644 --- a/dspace-services/src/main/java/org/dspace/servicemanager/DSpaceServiceManager.java +++ b/dspace-services/src/main/java/org/dspace/servicemanager/DSpaceServiceManager.java @@ -8,6 +8,7 @@ package org.dspace.servicemanager; import java.lang.reflect.InvocationTargetException; +import java.lang.reflect.Method; import java.util.ArrayList; import java.util.Arrays; import java.util.Collection; @@ -17,23 +18,19 @@ import java.util.LinkedHashMap; import java.util.LinkedList; import java.util.List; import java.util.Map; +import javax.annotation.PreDestroy; import org.apache.commons.lang3.ArrayUtils; import org.dspace.kernel.Activator; import org.dspace.kernel.config.SpringLoader; import org.dspace.kernel.mixins.ConfigChangeListener; -import org.dspace.kernel.mixins.InitializedService; import org.dspace.kernel.mixins.ServiceChangeListener; import org.dspace.kernel.mixins.ServiceManagerReadyAware; -import org.dspace.kernel.mixins.ShutdownService; import org.dspace.servicemanager.config.DSpaceConfigurationService; import org.dspace.servicemanager.spring.DSpaceBeanFactoryPostProcessor; -import org.dspace.services.ConfigurationService; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.BeanWrapper; import org.springframework.beans.BeansException; -import org.springframework.beans.PropertyAccessorFactory; import org.springframework.beans.factory.ListableBeanFactory; import org.springframework.beans.factory.NoSuchBeanDefinitionException; import org.springframework.beans.factory.config.AutowireCapableBeanFactory; @@ -374,7 +371,18 @@ public final class DSpaceServiceManager implements ServiceManagerSystem { applicationContext.getBeanFactory().destroyBean(name, beanInstance); } catch (NoSuchBeanDefinitionException e) { // this happens if the bean was registered manually (annoyingly) - DSpaceServiceManager.shutdownService(beanInstance); + for (final Method method : beanInstance.getClass().getMethods()) { + if (method.isAnnotationPresent(PreDestroy.class)) { + try { + method.invoke(beanInstance); + } catch (IllegalAccessException + | IllegalArgumentException + | InvocationTargetException ex) { + log.warn("Failed to call declared @PreDestroy method of {} service", + name, ex); + } + } + } } } catch (BeansException e) { // nothing to do here, could not find the bean @@ -578,79 +586,6 @@ public final class DSpaceServiceManager implements ServiceManagerSystem { // STATICS - /** - * Configures a given service (i.e. bean) based on any DSpace configuration - * settings which refer to it by name. . - *

    - * NOTE: Any configurations related to a specific service MUST be prefixed - * with the given service's name (e.g. [serviceName].setting = value) - *

    - * This method logs an error if it encounters configs which refer to a - * service by name, but is an invalid setting for that service. - * - * @param serviceName the name of the service - * @param service the service object (which will be configured) - * @param config the running configuration service - */ - public static void configureService(String serviceName, Object service, ConfigurationService config) { - - // Check if the configuration has any properties whose prefix - // corresponds to this service's name - List configKeys = config.getPropertyKeys(serviceName); - if (configKeys != null && !configKeys.isEmpty()) { - BeanWrapper beanWrapper = PropertyAccessorFactory.forBeanPropertyAccess(service); - for (String key : configKeys) { - // Remove serviceName prefix from key. This is the name of the actual bean's parameter - // This removes the first x chars, where x is length of serviceName + 1 char - // Format of Key: [serviceName].[param] - String param = key.substring(serviceName.length() + 1); - - try { - // Attempt to set this configuration on the given service's bean - beanWrapper.setPropertyValue(param, config.getProperty(key)); - log.info("Set param (" + param + ") on service bean (" + serviceName + ") to: " + config - .getProperty(key)); - } catch (RuntimeException e) { - // If an error occurs, just log it - log.error("Unable to set param (" + param + ") on service bean (" + serviceName + ") to: " + config - .getProperty(key), e); - } - } - } - } - - /** - * Initializes a service if it asks to be initialized or does nothing. - * - * @param service any bean - * @throws IllegalStateException if the service init fails - */ - public static void initService(Object service) { - if (service instanceof InitializedService) { - try { - ((InitializedService) service).init(); - } catch (Exception e) { - throw new IllegalStateException( - "Failure attempting to initialize service (" + service + "): " + e.getMessage(), e); - } - } - } - - /** - * Shuts down a service if it asks to be shutdown or does nothing. - * - * @param service any bean - */ - public static void shutdownService(Object service) { - if (service instanceof ShutdownService) { - try { - ((ShutdownService) service).shutdown(); - } catch (Exception e) { - log.error("Failure shutting down service: {}", service, e); - } - } - } - /** * Build the complete list of Spring configuration paths, including * hard-wired paths. 
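The changes above drop the custom InitializedService/ShutdownService mixins and their static helpers; lifecycle hooks now go through standard JSR-250 annotations, which DSpaceServiceManager also honours reflectively (the @PreDestroy scan shown earlier) for beans registered outside Spring. A minimal sketch of a service written against the new convention; the class name is hypothetical:

    import javax.annotation.PostConstruct;
    import javax.annotation.PreDestroy;

    public class ExampleLifecycleService {

        // Replaces InitializedService.init(): runs once all dependencies are injected.
        @PostConstruct
        public void init() {
            // open resources, warm caches, etc.
        }

        // Replaces ShutdownService.shutdown(): runs when the context is destroyed,
        // or via the reflective @PreDestroy scan for manually registered beans.
        @PreDestroy
        public void shutdown() {
            // release resources
        }
    }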
@@ -702,5 +637,4 @@ public final class DSpaceServiceManager implements ServiceManagerSystem { } return pathList.toArray(new String[pathList.size()]); } - } diff --git a/dspace-services/src/main/java/org/dspace/servicemanager/spring/DSpaceBeanPostProcessor.java b/dspace-services/src/main/java/org/dspace/servicemanager/spring/DSpaceBeanPostProcessor.java deleted file mode 100644 index 3b349b0080..0000000000 --- a/dspace-services/src/main/java/org/dspace/servicemanager/spring/DSpaceBeanPostProcessor.java +++ /dev/null @@ -1,72 +0,0 @@ -/** - * The contents of this file are subject to the license and copyright - * detailed in the LICENSE and NOTICE files at the root of the source - * tree and available online at - * - * http://www.dspace.org/license/ - */ -package org.dspace.servicemanager.spring; - -import org.dspace.servicemanager.DSpaceServiceManager; -import org.dspace.servicemanager.config.DSpaceConfigurationService; -import org.springframework.beans.BeansException; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.beans.factory.config.BeanPostProcessor; -import org.springframework.beans.factory.config.DestructionAwareBeanPostProcessor; - -/** - * This processes beans as they are loaded into the system by spring. - * Allows us to handle the init method and also push config options. - * - * @author Aaron Zeckoski (azeckoski @ gmail.com) - */ -public final class DSpaceBeanPostProcessor implements BeanPostProcessor, DestructionAwareBeanPostProcessor { - - private DSpaceConfigurationService configurationService; - - @Autowired - public DSpaceBeanPostProcessor(DSpaceConfigurationService configurationService) { - if (configurationService == null) { - throw new IllegalArgumentException("configuration service cannot be null"); - } - this.configurationService = configurationService; - } - - /* (non-Javadoc) - * @see org.springframework.beans.factory.config.BeanPostProcessor#postProcessBeforeInitialization(java.lang - * .Object, java.lang.String) - */ - @Override - public Object postProcessBeforeInitialization(Object bean, String beanName) - throws BeansException { - // Before initializing the service, first configure it based on any related settings in the configurationService - // NOTE: configs related to this bean MUST be prefixed with the bean's name (e.g. 
[beanName].setting = value) - DSpaceServiceManager.configureService(beanName, bean, configurationService); - return bean; - } - - /* (non-Javadoc) - * @see org.springframework.beans.factory.config.BeanPostProcessor#postProcessAfterInitialization(java.lang - * .Object, java.lang.String) - */ - @Override - public Object postProcessAfterInitialization(Object bean, String beanName) - throws BeansException { - DSpaceServiceManager.initService(bean); - return bean; - } - - /* (non-Javadoc) - * @see org.springframework.beans.factory.config.DestructionAwareBeanPostProcessor#postProcessBeforeDestruction - * (java.lang.Object, java.lang.String) - */ - @Override - public void postProcessBeforeDestruction(Object bean, String beanName) throws BeansException { - DSpaceServiceManager.shutdownService(bean); - } - - // @Override - public boolean requiresDestruction(Object arg0) { - return false; - } -} diff --git a/dspace-services/src/main/java/org/dspace/services/caching/CachingServiceImpl.java b/dspace-services/src/main/java/org/dspace/services/caching/CachingServiceImpl.java index 63cdf32181..0a85755e1a 100644 --- a/dspace-services/src/main/java/org/dspace/services/caching/CachingServiceImpl.java +++ b/dspace-services/src/main/java/org/dspace/services/caching/CachingServiceImpl.java @@ -16,14 +16,14 @@ import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.concurrent.ConcurrentHashMap; +import javax.annotation.PostConstruct; +import javax.annotation.PreDestroy; import net.sf.ehcache.Ehcache; import net.sf.ehcache.Statistics; import org.dspace.kernel.ServiceManager; import org.dspace.kernel.mixins.ConfigChangeListener; -import org.dspace.kernel.mixins.InitializedService; import org.dspace.kernel.mixins.ServiceChangeListener; -import org.dspace.kernel.mixins.ShutdownService; import org.dspace.providers.CacheProvider; import org.dspace.services.CachingService; import org.dspace.services.ConfigurationService; @@ -38,7 +38,6 @@ import org.dspace.utils.servicemanager.ProviderHolder; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.beans.factory.annotation.Required; /** * Implementation of the core caching service, which is available for @@ -47,16 +46,16 @@ import org.springframework.beans.factory.annotation.Required; * @author Aaron Zeckoski (azeckoski @ gmail.com) */ public final class CachingServiceImpl - implements CachingService, InitializedService, ShutdownService, ConfigChangeListener, ServiceChangeListener { + implements CachingService, ConfigChangeListener, ServiceChangeListener { - private static Logger log = LoggerFactory.getLogger(CachingServiceImpl.class); + private static final Logger log = LoggerFactory.getLogger(CachingServiceImpl.class); /** * This is the event key for a full cache reset. */ protected static final String EVENT_RESET = "caching.reset"; /** - * The default config location. + * The default configuration location. */ protected static final String DEFAULT_CONFIG = "org/dspace/services/caching/ehcache-config.xml"; @@ -64,15 +63,14 @@ public final class CachingServiceImpl * All the non-thread caches that we know about. * Mostly used for tracking purposes. */ - private Map cacheRecord = new ConcurrentHashMap(); + private final Map cacheRecord = new ConcurrentHashMap<>(); /** * All the request caches. This is bound to the thread. * The initial value of this TL is set automatically when it is * created. 
*/ - private Map> requestCachesMap = new ConcurrentHashMap>(); + private final Map> requestCachesMap = new ConcurrentHashMap<>(); /** * @return the current request map which is bound to the current thread @@ -84,7 +82,7 @@ public final class CachingServiceImpl Map requestCaches = requestCachesMap.get(requestService.getCurrentRequestId()); if (requestCaches == null) { - requestCaches = new HashMap(); + requestCaches = new HashMap<>(); requestCachesMap.put(requestService.getCurrentRequestId(), requestCaches); } @@ -94,6 +92,7 @@ public final class CachingServiceImpl /** * Unbinds all request caches. Destroys the caches completely. */ + @Override public void unbindRequestCaches() { if (requestService != null) { requestCachesMap.remove(requestService.getCurrentRequestId()); @@ -102,8 +101,7 @@ public final class CachingServiceImpl private ConfigurationService configurationService; - @Autowired - @Required + @Autowired(required = true) public void setConfigurationService(ConfigurationService configurationService) { this.configurationService = configurationService; } @@ -117,8 +115,7 @@ public final class CachingServiceImpl private ServiceManager serviceManager; - @Autowired - @Required + @Autowired(required = true) public void setServiceManager(ServiceManager serviceManager) { this.serviceManager = serviceManager; } @@ -128,8 +125,7 @@ public final class CachingServiceImpl */ protected net.sf.ehcache.CacheManager cacheManager; - @Autowired - @Required + @Autowired(required = true) public void setCacheManager(net.sf.ehcache.CacheManager cacheManager) { this.cacheManager = cacheManager; } @@ -145,7 +141,7 @@ public final class CachingServiceImpl private int timeToIdleSecs = 600; /** - * Reloads the config settings from the configuration service. + * Reloads the configuration settings from the configuration service. */ protected void reloadConfig() { // Reload caching configurations, but have sane default values if unspecified in configs @@ -160,7 +156,7 @@ public final class CachingServiceImpl * WARNING: Do not change the order of these!
    * If you do, you have to fix the {@link #reloadConfig()} method -AZ */ - private String[] knownConfigNames = { + private final String[] knownConfigNames = { "caching.use.clustering", // bool - whether to use clustering "caching.default.use.disk.store", // whether to use the disk store "caching.default.max.elements", // the maximum number of elements in memory, before they are evicted @@ -172,13 +168,15 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.kernel.mixins.ConfigChangeListener#notifyForConfigNames() */ + @Override public String[] notifyForConfigNames() { - return knownConfigNames == null ? null : knownConfigNames.clone(); + return knownConfigNames.clone(); } /* (non-Javadoc) * @see org.dspace.kernel.mixins.ConfigChangeListener#configurationChanged(java.util.List, java.util.Map) */ + @Override public void configurationChanged(List changedSettingNames, Map changedSettings) { reloadConfig(); } @@ -187,7 +185,7 @@ public final class CachingServiceImpl * This will make it easier to handle a provider which might go away * because the classloader is gone. */ - private ProviderHolder provider = new ProviderHolder(); + private final ProviderHolder provider = new ProviderHolder<>(); public CacheProvider getCacheProvider() { return provider.getProvider(); @@ -211,6 +209,7 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.kernel.mixins.ServiceChangeListener#notifyForTypes() */ + @Override public Class[] notifyForTypes() { return new Class[] {CacheProvider.class}; } @@ -219,6 +218,7 @@ public final class CachingServiceImpl * @see org.dspace.kernel.mixins.ServiceChangeListener#serviceRegistered(java.lang.String, java.lang.Object, java * .util.List) */ + @Override public void serviceRegistered(String serviceName, Object service, List> implementedTypes) { provider.setProvider((CacheProvider) service); } @@ -226,14 +226,12 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.kernel.mixins.ServiceChangeListener#serviceUnregistered(java.lang.String, java.lang.Object) */ + @Override public void serviceUnregistered(String serviceName, Object service) { provider.setProvider(null); } - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.InitializedService#init() - */ - @Override + @PostConstruct public void init() { log.info("init()"); // get settings @@ -256,17 +254,13 @@ public final class CachingServiceImpl log.info("Caching service initialized:\n" + getStatus(null)); } - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.ShutdownService#shutdown() - */ + @PreDestroy public void shutdown() { log.info("destroy()"); // for some reason this causes lots of errors so not using it for now -AZ //ehCacheManagementService.dispose(); try { - if (cacheRecord != null) { - cacheRecord.clear(); - } + cacheRecord.clear(); } catch (RuntimeException e) { // whatever } @@ -290,6 +284,7 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.services.CachingService#destroyCache(java.lang.String) */ + @Override public void destroyCache(String cacheName) { if (cacheName == null || "".equals(cacheName)) { throw new IllegalArgumentException("cacheName cannot be null or empty string"); @@ -319,6 +314,7 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.services.CachingService#getCache(java.lang.String, org.dspace.services.model.CacheConfig) */ + @Override public Cache getCache(String cacheName, CacheConfig cacheConfig) { Cache cache = null; @@ -359,8 +355,9 @@ public final class CachingServiceImpl /* 
(non-Javadoc) * @see org.dspace.services.CachingService#getCaches() */ + @Override public List getCaches() { - List caches = new ArrayList(this.cacheRecord.values()); + List caches = new ArrayList<>(this.cacheRecord.values()); if (getCacheProvider() != null) { try { caches.addAll(getCacheProvider().getCaches()); @@ -377,6 +374,7 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.services.CachingService#getStatus(java.lang.String) */ + @Override public String getStatus(String cacheName) { final StringBuilder sb = new StringBuilder(); @@ -433,6 +431,7 @@ public final class CachingServiceImpl /* (non-Javadoc) * @see org.dspace.services.CachingService#resetCaches() */ + @Override public void resetCaches() { log.debug("resetCaches()"); @@ -468,7 +467,7 @@ public final class CachingServiceImpl if (sorted) { Arrays.sort(cacheNames); } - final List caches = new ArrayList(cacheNames.length); + final List caches = new ArrayList<>(cacheNames.length); for (String cacheName : cacheNames) { caches.add(cacheManager.getEhcache(cacheName)); } @@ -612,6 +611,7 @@ public final class CachingServiceImpl public static final class NameComparator implements Comparator, Serializable { public static final long serialVersionUID = 1l; + @Override public int compare(Cache o1, Cache o2) { return o1.getName().compareTo(o2.getName()); } @@ -619,22 +619,25 @@ public final class CachingServiceImpl private class CachingServiceRequestInterceptor implements RequestInterceptor { + @Override public void onStart(String requestId) { if (requestId != null) { Map requestCaches = requestCachesMap.get(requestId); if (requestCaches == null) { - requestCaches = new HashMap(); + requestCaches = new HashMap<>(); requestCachesMap.put(requestId, requestCaches); } } } + @Override public void onEnd(String requestId, boolean succeeded, Exception failure) { if (requestId != null) { requestCachesMap.remove(requestId); } } + @Override public int getOrder() { return 1; } diff --git a/dspace-services/src/main/java/org/dspace/services/email/EmailServiceImpl.java b/dspace-services/src/main/java/org/dspace/services/email/EmailServiceImpl.java index f20458f51a..2a822c7c6e 100644 --- a/dspace-services/src/main/java/org/dspace/services/email/EmailServiceImpl.java +++ b/dspace-services/src/main/java/org/dspace/services/email/EmailServiceImpl.java @@ -8,6 +8,7 @@ package org.dspace.services.email; import java.util.Properties; +import javax.annotation.PostConstruct; import javax.mail.Authenticator; import javax.mail.PasswordAuthentication; import javax.mail.Session; @@ -16,14 +17,13 @@ import javax.naming.NameNotFoundException; import javax.naming.NamingException; import javax.naming.NoInitialContextException; -import org.dspace.kernel.mixins.InitializedService; +import org.apache.commons.lang3.StringUtils; import org.dspace.services.ConfigurationService; import org.dspace.services.EmailService; import org.dspace.services.factory.DSpaceServicesFactory; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.beans.factory.annotation.Required; /** * Provides mail sending services through JavaMail. 
If a {@link javax.mail.Session} @@ -34,7 +34,7 @@ import org.springframework.beans.factory.annotation.Required; */ public class EmailServiceImpl extends Authenticator - implements EmailService, InitializedService { + implements EmailService { private static final Logger logger = LoggerFactory.getLogger(EmailServiceImpl.class); private Session session = null; @@ -46,8 +46,7 @@ public class EmailServiceImpl * * @param cfg the configurationService object */ - @Autowired - @Required + @Autowired(required = true) public void setCfg(ConfigurationService cfg) { this.cfg = cfg; } @@ -62,7 +61,7 @@ public class EmailServiceImpl return session; } - @Override + @PostConstruct public void init() { // See if there is already a Session in our environment String sessionName = cfg.getProperty("mail.session.name"); @@ -106,7 +105,7 @@ public class EmailServiceImpl props.put(key, value); } } - if (null == cfg.getProperty("mail.server.username")) { + if (StringUtils.isBlank(cfg.getProperty("mail.server.username"))) { session = Session.getInstance(props); } else { props.put("mail.smtp.auth", "true"); @@ -125,4 +124,12 @@ public class EmailServiceImpl cfg.getProperty("mail.server.username"), cfg.getProperty("mail.server.password")); } + + /** + * Force a new initialization of the session, useful for testing purpose + */ + public void reset() { + session = null; + init(); + } } diff --git a/dspace-services/src/main/java/org/dspace/services/events/SystemEventService.java b/dspace-services/src/main/java/org/dspace/services/events/SystemEventService.java index de67e504a5..39a1f41f6a 100644 --- a/dspace-services/src/main/java/org/dspace/services/events/SystemEventService.java +++ b/dspace-services/src/main/java/org/dspace/services/events/SystemEventService.java @@ -12,9 +12,9 @@ import java.util.List; import java.util.Map; import java.util.Random; import java.util.concurrent.ConcurrentHashMap; +import javax.annotation.PreDestroy; import org.apache.commons.lang3.ArrayUtils; -import org.dspace.kernel.mixins.ShutdownService; import org.dspace.services.CachingService; import org.dspace.services.EventService; import org.dspace.services.RequestService; @@ -36,7 +36,7 @@ import org.springframework.beans.factory.annotation.Autowired; * * @author Aaron Zeckoski (azeckoski@gmail.com) - azeckoski - 4:02:31 PM Nov 19, 2008 */ -public final class SystemEventService implements EventService, ShutdownService { +public final class SystemEventService implements EventService { private final Logger log = LoggerFactory.getLogger(SystemEventService.class); @@ -45,7 +45,7 @@ public final class SystemEventService implements EventService, ShutdownService { /** * Map for holding onto the listeners which is ClassLoader safe. 
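The reset() method added to EmailServiceImpl above exists so tests can rebuild the cached javax.mail.Session after changing mail configuration. A rough sketch of that pattern, assuming a single ConfigurationService/EmailService bean is registered; the class name and SMTP values are hypothetical test settings:

    import org.dspace.kernel.ServiceManager;
    import org.dspace.services.ConfigurationService;
    import org.dspace.services.EmailService;
    import org.dspace.services.email.EmailServiceImpl;
    import org.dspace.utils.DSpace;

    public class MailSessionResetSketch {
        public static void pointMailAtTestServer() {
            ServiceManager serviceManager = new DSpace().getServiceManager();
            ConfigurationService cfg =
                    serviceManager.getServiceByName(null, ConfigurationService.class);
            EmailServiceImpl email = (EmailServiceImpl)
                    serviceManager.getServiceByName(null, EmailService.class);

            // Point the mailer at a local test SMTP server (standard dspace.cfg keys).
            cfg.setProperty("mail.server", "localhost");
            cfg.setProperty("mail.server.port", "2525");

            // Drop the cached Session; the next getSession() rebuilds it from the new settings.
            email.reset();
        }
    }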
*/ - private Map listenersMap = new ConcurrentHashMap(); + private final Map listenersMap = new ConcurrentHashMap<>(); private final RequestService requestService; private final CachingService cachingService; @@ -64,9 +64,7 @@ public final class SystemEventService implements EventService, ShutdownService { this.requestService.registerRequestInterceptor(this.requestInterceptor); } - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.ShutdownService#shutdown() - */ + @PreDestroy public void shutdown() { this.requestInterceptor = null; // clear the interceptor this.listenersMap.clear(); @@ -76,6 +74,7 @@ public final class SystemEventService implements EventService, ShutdownService { /* (non-Javadoc) * @see org.dspace.services.EventService#fireEvent(org.dspace.services.model.Event) */ + @Override public void fireEvent(Event event) { validateEvent(event); // check scopes for this event @@ -97,6 +96,7 @@ public final class SystemEventService implements EventService, ShutdownService { /* (non-Javadoc) * @see org.dspace.services.EventService#queueEvent(org.dspace.services.model.Event) */ + @Override public void queueEvent(Event event) { validateEvent(event); @@ -118,6 +118,7 @@ public final class SystemEventService implements EventService, ShutdownService { /* (non-Javadoc) * @see org.dspace.services.EventService#registerEventListener(org.dspace.services.model.EventListener) */ + @Override public void registerEventListener(EventListener listener) { if (listener == null) { throw new IllegalArgumentException("Cannot register a listener that is null"); @@ -293,7 +294,7 @@ public final class SystemEventService implements EventService, ShutdownService { return allowName && allowResource; } - private Random random = new Random(); + private final Random random = new Random(); /** * Generate an event ID used to identify and track this event uniquely. @@ -316,6 +317,7 @@ public final class SystemEventService implements EventService, ShutdownService { * @see org.dspace.services.model.RequestInterceptor#onStart(java.lang.String, org.dspace.services.model * .Session) */ + @Override public void onStart(String requestId) { // nothing to really do here unless we decide we should purge out any existing events? 
-AZ } @@ -324,6 +326,7 @@ public final class SystemEventService implements EventService, ShutdownService { * @see org.dspace.services.model.RequestInterceptor#onEnd(java.lang.String, org.dspace.services.model * .Session, boolean, java.lang.Exception) */ + @Override public void onEnd(String requestId, boolean succeeded, Exception failure) { if (succeeded) { int fired = fireQueuedEvents(); @@ -338,6 +341,7 @@ public final class SystemEventService implements EventService, ShutdownService { /* (non-Javadoc) * @see org.dspace.kernel.mixins.OrderedService#getOrder() */ + @Override public int getOrder() { return 20; // this should fire pretty late } diff --git a/dspace-services/src/main/java/org/dspace/services/sessions/StatelessRequestServiceImpl.java b/dspace-services/src/main/java/org/dspace/services/sessions/StatelessRequestServiceImpl.java index 07798a0225..cbba714c81 100644 --- a/dspace-services/src/main/java/org/dspace/services/sessions/StatelessRequestServiceImpl.java +++ b/dspace-services/src/main/java/org/dspace/services/sessions/StatelessRequestServiceImpl.java @@ -15,12 +15,12 @@ import java.util.Map; import java.util.Objects; import java.util.UUID; import java.util.concurrent.ConcurrentHashMap; +import javax.annotation.PostConstruct; +import javax.annotation.PreDestroy; import javax.servlet.ServletRequest; import javax.servlet.ServletResponse; import org.apache.commons.lang3.StringUtils; -import org.dspace.kernel.mixins.InitializedService; -import org.dspace.kernel.mixins.ShutdownService; import org.dspace.services.ConfigurationService; import org.dspace.services.RequestService; import org.dspace.services.model.Request; @@ -32,7 +32,6 @@ import org.dspace.utils.servicemanager.OrderedServiceComparator; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.beans.factory.annotation.Required; /** @@ -45,14 +44,13 @@ import org.springframework.beans.factory.annotation.Required; * @author Aaron Zeckoski (azeckoski @ gmail.com) * @author Tom Desair (tom dot desair at atmire dot com) */ -public final class StatelessRequestServiceImpl implements RequestService, InitializedService, ShutdownService { +public final class StatelessRequestServiceImpl implements RequestService { - private static Logger log = LoggerFactory.getLogger(StatelessRequestServiceImpl.class); + private static final Logger log = LoggerFactory.getLogger(StatelessRequestServiceImpl.class); private ConfigurationService configurationService; - @Autowired - @Required + @Autowired(required = true) public void setConfigurationService(ConfigurationService configurationService) { this.configurationService = configurationService; } @@ -60,18 +58,14 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /** * map for holding onto the request interceptors which is classloader safe. 
*/ - private Map interceptorsMap = new HashMap(); + private final Map interceptorsMap = new HashMap<>(); - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.InitializedService#init() - */ + @PostConstruct public void init() { log.info("init"); } - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.ShutdownService#shutdown() - */ + @PreDestroy public void shutdown() { log.info("shutdown"); clear(); @@ -90,6 +84,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /* (non-Javadoc) * @see org.dspace.services.RequestService#startRequest() */ + @Override public String startRequest() { return startRequest(new InternalRequestImpl()); } @@ -97,6 +92,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /* (non-Javadoc) * @see org.dspace.services.RequestService#startRequest() */ + @Override public String startRequest(ServletRequest request, ServletResponse response) { return startRequest(new HttpRequestImpl(request, response)); } @@ -128,6 +124,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /* (non-Javadoc) * @see org.dspace.services.RequestService#endRequest(java.lang.Exception) */ + @Override public String endRequest(Exception failure) { String requestId = null; try { @@ -175,7 +172,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia * @return the current list of interceptors in the correct order */ private List getInterceptors(boolean reverse) { - ArrayList l = new ArrayList(this.interceptorsMap.values()); + ArrayList l = new ArrayList<>(this.interceptorsMap.values()); OrderedServiceComparator comparator = new OrderedServiceComparator(); Collections.sort(l, comparator); if (reverse) { @@ -187,6 +184,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /* (non-Javadoc) * @see org.dspace.services.RequestService#registerRequestListener(org.dspace.services.model.RequestInterceptor) */ + @Override public void registerRequestInterceptor(RequestInterceptor interceptor) { if (interceptor == null) { throw new IllegalArgumentException("Cannot register an interceptor that is null"); @@ -198,11 +196,12 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia this.interceptorsMap.put(key, interceptor); } - /** + /* * (non-Javadoc) * * @see org.dspace.services.RequestService#getCurrentUserId() */ + @Override public String getCurrentUserId() { Request currentRequest = getCurrentRequest(); if (currentRequest == null) { @@ -212,11 +211,12 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia } } - /** + /* * (non-Javadoc) * * @see org.dspace.services.RequestService#setCurrentUserId() */ + @Override public void setCurrentUserId(UUID epersonId) { Request currentRequest = getCurrentRequest(); if (currentRequest != null) { @@ -227,6 +227,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /* (non-Javadoc) * @see org.dspace.services.RequestService#getCurrentRequestId() */ + @Override public String getCurrentRequestId() { Request req = requests.getCurrent(); if (req != null) { @@ -239,6 +240,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia /* (non-Javadoc) * @see org.dspace.services.RequestService#getCurrentRequest() */ + @Override public Request getCurrentRequest() { return requests.getCurrent(); } @@ -247,7 +249,7 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia * Class to hold the 
current request. Uses Map keyed on current thread id. */ private class RequestHolder { - Map requestMap = new ConcurrentHashMap(); + Map requestMap = new ConcurrentHashMap<>(); Request getCurrent() { return requestMap.get(Thread.currentThread().getId()); @@ -298,5 +300,5 @@ public final class StatelessRequestServiceImpl implements RequestService, Initia } } - private RequestHolder requests = new RequestHolder(); + private final RequestHolder requests = new RequestHolder(); } diff --git a/dspace-services/src/main/resources/spring/spring-dspace-applicationContext.xml b/dspace-services/src/main/resources/spring/spring-dspace-applicationContext.xml index 957b84b2a5..2075eb1516 100644 --- a/dspace-services/src/main/resources/spring/spring-dspace-applicationContext.xml +++ b/dspace-services/src/main/resources/spring/spring-dspace-applicationContext.xml @@ -19,7 +19,4 @@ - - - \ No newline at end of file diff --git a/dspace-services/src/test/java/org/dspace/servicemanager/DSpaceServiceManagerTest.java b/dspace-services/src/test/java/org/dspace/servicemanager/DSpaceServiceManagerTest.java index a49d115410..03c74d7592 100644 --- a/dspace-services/src/test/java/org/dspace/servicemanager/DSpaceServiceManagerTest.java +++ b/dspace-services/src/test/java/org/dspace/servicemanager/DSpaceServiceManagerTest.java @@ -16,9 +16,9 @@ import static org.junit.Assert.fail; import java.util.HashMap; import java.util.List; import java.util.Map; +import javax.annotation.PostConstruct; +import javax.annotation.PreDestroy; -import org.dspace.kernel.mixins.InitializedService; -import org.dspace.kernel.mixins.ShutdownService; import org.dspace.servicemanager.config.DSpaceConfigurationService; import org.dspace.servicemanager.example.ConcreteExample; import org.dspace.servicemanager.fakeservices.FakeService1; @@ -28,7 +28,7 @@ import org.junit.Before; import org.junit.Test; /** - * testing the main dspace service manager + * Testing the main DSpace service manager. * * @author Aaron Zeckoski (azeckoski @ gmail.com) */ @@ -42,10 +42,6 @@ public class DSpaceServiceManagerTest { public void init() { configurationService = new DSpaceConfigurationService(); - // Set some sample configurations relating to services/beans - configurationService.loadConfig(SampleAnnotationBean.class.getName() + ".sampleValue", "beckyz"); - configurationService.loadConfig("fakeBean.fakeParam", "beckyz"); - dsm = new DSpaceServiceManager(configurationService, SPRING_TEST_CONFIG_FILE); } @@ -175,16 +171,6 @@ public class DSpaceServiceManagerTest { ConcreteExample concrete = dsm.getServiceByName(ConcreteExample.class.getName(), ConcreteExample.class); assertNotNull(concrete); assertEquals("azeckoski", concrete.getName()); - concrete = null; - - // initialize a SampleAnnotationBean - SampleAnnotationBean sab = dsm - .getServiceByName(SampleAnnotationBean.class.getName(), SampleAnnotationBean.class); - assertNotNull(sab); - // Based on the configuration for "sampleValue" in the init() method above, - // a value should be pre-set! 
- assertEquals("beckyz", sab.getSampleValue()); - sab = null; SpringAnnotationBean spr = dsm.getServiceByName( SpringAnnotationBean.class.getName(), SpringAnnotationBean.class); @@ -192,7 +178,6 @@ public class DSpaceServiceManagerTest { assertEquals("azeckoski", spr.getConcreteName()); assertEquals("aaronz", spr.getExampleName()); assertEquals(null, spr.getSampleValue()); - spr = null; } /** @@ -271,25 +256,6 @@ public class DSpaceServiceManagerTest { // TODO need to do a better test here } - @Test - public void testInitAndShutdown() { - dsm.startup(); - - SampleAnnotationBean sab = dsm - .getServiceByName(SampleAnnotationBean.class.getName(), SampleAnnotationBean.class); - assertNotNull(sab); - assertEquals(1, sab.initCounter); - sab = null; - - TestService ts = new TestService(); - assertEquals(0, ts.value); - dsm.registerService(TestService.class.getName(), ts); - assertEquals(1, ts.value); - dsm.unregisterService(TestService.class.getName()); - assertEquals(2, ts.value); - ts = null; - } - @Test public void testRegisterProviderLifecycle() { dsm.startup(); @@ -321,16 +287,16 @@ public class DSpaceServiceManagerTest { properties = null; } - public static class TestService implements InitializedService, ShutdownService { + public static class TestService { public int value = 0; - @Override + @PostConstruct public void init() { value++; } - @Override + @PreDestroy public void shutdown() { value++; } diff --git a/dspace-services/src/test/java/org/dspace/servicemanager/SampleAnnotationBean.java b/dspace-services/src/test/java/org/dspace/servicemanager/SampleAnnotationBean.java index 3f0d47590e..e3d08100a4 100644 --- a/dspace-services/src/test/java/org/dspace/servicemanager/SampleAnnotationBean.java +++ b/dspace-services/src/test/java/org/dspace/servicemanager/SampleAnnotationBean.java @@ -7,44 +7,45 @@ */ package org.dspace.servicemanager; -import org.dspace.kernel.mixins.InitializedService; -import org.dspace.kernel.mixins.ShutdownService; +import javax.annotation.PostConstruct; +import javax.annotation.PreDestroy; + import org.dspace.servicemanager.example.ConcreteExample; import org.dspace.servicemanager.example.ServiceExample; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.beans.factory.annotation.Required; import org.springframework.stereotype.Service; /** - * This bean is a simple example of a bean which is annotated as a spring bean and should be found when the AC starts up + * This bean is a simple example of a bean which is annotated as a Spring Bean + * and should be found when the AC starts up. 
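Alongside the lifecycle changes, the diff consistently replaces the deprecated Spring @Required annotation on setters with @Autowired(required = true), as in CachingServiceImpl, EmailServiceImpl, StatelessRequestServiceImpl and the test beans here. A minimal sketch of the resulting setter-injection idiom; the bean name is hypothetical:

    import org.dspace.services.ConfigurationService;
    import org.springframework.beans.factory.annotation.Autowired;

    public class ExampleConfiguredBean {

        private ConfigurationService configurationService;

        // required = true keeps the old @Required guarantee: startup fails
        // if no ConfigurationService bean can be injected here.
        @Autowired(required = true)
        public void setConfigurationService(ConfigurationService configurationService) {
            this.configurationService = configurationService;
        }
    }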
* * @author Aaron Zeckoski (azeckoski @ gmail.com) */ @Service -public class SampleAnnotationBean implements InitializedService, ShutdownService { +public class SampleAnnotationBean { public int initCounter = 0; + @PostConstruct public void init() { initCounter++; } + @PreDestroy public void shutdown() { initCounter++; } private ServiceExample serviceExample; - @Autowired - @Required + @Autowired(required = true) public void setServiceExample(ServiceExample serviceExample) { this.serviceExample = serviceExample; } private ConcreteExample concreteExample; - @Autowired - @Required + @Autowired(required = true) public void setConcreteExample(ConcreteExample concreteExample) { this.concreteExample = concreteExample; } diff --git a/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService1.java b/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService1.java index ed88e7e7f7..bc00c79e99 100644 --- a/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService1.java +++ b/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService1.java @@ -10,23 +10,23 @@ package org.dspace.servicemanager.fakeservices; import java.io.Serializable; import java.util.List; import java.util.Map; +import javax.annotation.PostConstruct; +import javax.annotation.PreDestroy; import org.dspace.kernel.mixins.ConfigChangeListener; -import org.dspace.kernel.mixins.InitializedService; import org.dspace.kernel.mixins.ServiceChangeListener; -import org.dspace.kernel.mixins.ShutdownService; import org.dspace.services.ConfigurationService; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.beans.factory.annotation.Required; /** - * This is just testing a fake service and running it through some paces to see if the lifecycles work + * This is just testing a fake service and running it through some paces to see + * if the lifecycles work. 
* * @author Aaron Zeckoski (azeckoski @ gmail.com) */ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener, - InitializedService, ShutdownService, Serializable { + Serializable { private static final long serialVersionUID = 1L; public int triggers = 0; @@ -56,8 +56,7 @@ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener private ConfigurationService configurationService; - @Autowired - @Required + @Autowired(required = true) public void setConfigurationService(ConfigurationService configurationService) { this.configurationService = configurationService; } @@ -69,6 +68,7 @@ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener /* (non-Javadoc) * @see org.dspace.kernel.mixins.ConfigChangeListener#configurationChanged(java.util.List, java.util.Map) */ + @Override public void configurationChanged(List changedSettingNames, Map changedSettings) { something = "config:" + changedSettings.get("azeckoski.FakeService1.something"); @@ -79,6 +79,7 @@ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener * @see org.dspace.kernel.mixins.ServiceChangeListener#serviceRegistered(java.lang.String, java.lang.Object, java * .util.List) */ + @Override public void serviceRegistered(String serviceName, Object service, List> implementedTypes) { something = "registered:" + serviceName; @@ -88,22 +89,19 @@ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener /* (non-Javadoc) * @see org.dspace.kernel.mixins.ServiceChangeListener#serviceUnregistered(java.lang.String, java.lang.Object) */ + @Override public void serviceUnregistered(String serviceName, Object service) { something = "unregistered:" + serviceName; triggers++; } - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.InitializedService#init() - */ + @PostConstruct public void init() { something = "init"; triggers = 1; // RESET to 1 } - /* (non-Javadoc) - * @see org.dspace.kernel.mixins.ShutdownService#shutdown() - */ + @PreDestroy public void shutdown() { something = "shutdown"; triggers++; @@ -112,6 +110,7 @@ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener /* (non-Javadoc) * @see org.dspace.kernel.mixins.ConfigChangeListener#notifyForConfigNames() */ + @Override public String[] notifyForConfigNames() { return null; // ALL } @@ -119,6 +118,7 @@ public class FakeService1 implements ConfigChangeListener, ServiceChangeListener /* (non-Javadoc) * @see org.dspace.kernel.mixins.ServiceChangeListener#notifyForTypes() */ + @Override public Class[] notifyForTypes() { return null; // ALL } diff --git a/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService2.java b/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService2.java index 32b99db315..5a9024b9a7 100644 --- a/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService2.java +++ b/dspace-services/src/test/java/org/dspace/servicemanager/fakeservices/FakeService2.java @@ -8,16 +8,14 @@ package org.dspace.servicemanager.fakeservices; import java.io.Serializable; - -import org.dspace.kernel.mixins.InitializedService; - +import javax.annotation.PostConstruct; /** * Simple fake service 2 * * @author Aaron Zeckoski (azeckoski @ gmail.com) */ -public class FakeService2 implements InitializedService, Comparable, Serializable { +public class FakeService2 implements Comparable, Serializable { private static final long serialVersionUID = 1L; public String data = "data"; @@ -30,13 
+28,12 @@ public class FakeService2 implements InitializedService, Comparable org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. diff --git a/dspace-sword/src/main/java/org/dspace/sword/DepositManager.java b/dspace-sword/src/main/java/org/dspace/sword/DepositManager.java index 7535302139..4491f876cc 100644 --- a/dspace-sword/src/main/java/org/dspace/sword/DepositManager.java +++ b/dspace-sword/src/main/java/org/dspace/sword/DepositManager.java @@ -245,6 +245,8 @@ public class DepositManager { String filenameBase = "sword-" + deposit.getUsername() + "-" + (new Date()).getTime(); + // No dots or slashes allowed in filename + filenameBase = filenameBase.replaceAll("\\.", "").replaceAll("/", ""). replaceAll("\\\\", ""); File packageFile = new File(path, filenameBase); File headersFile = new File(path, filenameBase + "-headers"); diff --git a/dspace-sword/src/main/java/org/purl/sword/server/ServiceDocumentServlet.java b/dspace-sword/src/main/java/org/purl/sword/server/ServiceDocumentServlet.java index 28c309b2ca..494cbc9db4 100644 --- a/dspace-sword/src/main/java/org/purl/sword/server/ServiceDocumentServlet.java +++ b/dspace-sword/src/main/java/org/purl/sword/server/ServiceDocumentServlet.java @@ -164,7 +164,8 @@ public class ServiceDocumentServlet extends HttpServlet { } catch (SWORDException se) { log.error("Internal error", se); // Throw a HTTP 500 - response.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, se.getMessage()); + response.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, + "Internal error (check logs for more information)"); } } diff --git a/dspace-swordv2/pom.xml b/dspace-swordv2/pom.xml index 934eae3682..dee2a0d4a1 100644 --- a/dspace-swordv2/pom.xml +++ b/dspace-swordv2/pom.xml @@ -13,7 +13,7 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. diff --git a/dspace/bin/start-handle-server b/dspace/bin/start-handle-server index 0df1c3a7f4..b9dd53fbef 100755 --- a/dspace/bin/start-handle-server +++ b/dspace/bin/start-handle-server @@ -37,7 +37,7 @@ if [ "$JAVA_OPTS" = "" ]; then fi # Remove lock file, in case the old Handle server did not shut down properly -rm -f $handledir/txns/lock +rm -f $HANDLEDIR/txns/lock # Start the Handle server, with a special log4j properties file. # We cannot simply write to the same logs, since log4j diff --git a/dspace/config/controlled-vocabularies/controlledvocabulary.xsd b/dspace/config/controlled-vocabularies/controlledvocabulary.xsd index 30fbb7f8ad..7a5defefbd 100644 --- a/dspace/config/controlled-vocabularies/controlledvocabulary.xsd +++ b/dspace/config/controlled-vocabularies/controlledvocabulary.xsd @@ -58,6 +58,7 @@ or refer to the Web site http://dspace-dev.dsi.uminho.pt. + diff --git a/dspace/config/crosswalks/google-metadata.properties b/dspace/config/crosswalks/google-metadata.properties index 51cb1b4167..157ee9c0b1 100644 --- a/dspace/config/crosswalks/google-metadata.properties +++ b/dspace/config/crosswalks/google-metadata.properties @@ -6,9 +6,9 @@ # Field Identifiers -# Pairs of field/value matches indended to uniquely identify an +# Pairs of field/value matches indended to uniquely identify an # item of a particular type for unique metadata field assignment, -# e.g. a dissertation item that contains values for the +# e.g. a dissertation item that contains values for the # dissertation-specific metadata elements. 
google.identifiers.dissertation = dc.type:Thesis @@ -22,13 +22,13 @@ google.identifiers.technical_report = dc.type:Technical Report # synonymous with "option" field-set. # - Single fields allowed -# Comma-delimited lists: +# Comma-delimited lists: # - Used to list metadata fields for aggregate value fields. # - Will be treated like pipes if used for single-value fields. # Wildcard characters will be expanded into all fields present for # items and are intended for use where a field aggregates values, -# e.g. citation_authors. +# e.g. citation_authors. # # If used in a first-match path, there is no guarantee of search order. @@ -55,7 +55,7 @@ google.citation_lastpage = google.citation_doi = google.citation_issn = dc.identifier.issn google.citation_isbn = dc.identifier.isbn -google.citation_conference = +google.citation_conference = # Type-specific fields retrieved when one of the above identifiers # is matched for the item. @@ -67,13 +67,13 @@ google.citation_dissertation_institution = dc.publisher # a list of ISO 3166-1 alpha-3 codes per # http://en.wikipedia.org/wiki/ISO_3166-1_alpha-3, not # a metadata field. -google.citation_patent_country = +google.citation_patent_country = google.citation_patent_number = google.citation_technical_report_number = google.citation_technical_report_institution = dc.publisher -#priority whitelist for citation_pdf_url, shortnames are defined in dspace/config/registries/bitstream-formats.xml +#priority "allow list" for citation_pdf_url, shortnames are defined in dspace/config/registries/bitstream-formats.xml #priority order is defined here, where the first type is the most important google.citation.prioritized_types = Adobe PDF, Postscript, Microsoft Word XML, Microsoft Word, RTF, EPUB diff --git a/dspace/config/dspace.cfg b/dspace/config/dspace.cfg index 1bbd895131..96b8ddf536 100644 --- a/dspace/config/dspace.cfg +++ b/dspace/config/dspace.cfg @@ -121,6 +121,7 @@ feedback.recipient = dspace-help@myu.edu # General site administration (Webmaster) e-mail # System notifications/reports and other sysadmin emails are sent to this address mail.admin = dspace-help@myu.edu +mail.admin.name = DSpace Administrator # Recipient for server errors and alerts (defaults to mail.admin) alert.recipient = ${mail.admin} @@ -672,7 +673,7 @@ event.dispatcher.noindex.consumers = eperson # consumer to maintain the discovery index event.consumer.discovery.class = org.dspace.discovery.IndexEventConsumer -event.consumer.discovery.filters = Community|Collection|Item|Bundle+Add|Create|Modify|Modify_Metadata|Delete|Remove +event.consumer.discovery.filters = Community|Collection|Item|Bundle|Site+Add|Create|Modify|Modify_Metadata|Delete|Remove # consumer related to EPerson changes event.consumer.eperson.class = org.dspace.eperson.EPersonConsumer @@ -1429,6 +1430,10 @@ webui.content_disposition_threshold = 8388608 # the directory where the generated sitemaps are stored sitemap.dir = ${dspace.dir}/sitemaps +# Customize the path of sitemaps in the server webapp +# Defaults to "sitemaps", which means they are available at ${dspace.server.url}/sitemaps/ +# sitemap.path = sitemaps + # # Comma-separated list of search engine URLs to 'ping' when a new Sitemap has # been created. Include everything except the Sitemap URL itself (which will @@ -1442,6 +1447,14 @@ sitemap.engineurls = http://www.google.com/webmasters/sitemaps/ping?sitemap= # # No known Sitemap 'ping' URL for MSN/Live search +# Define cron for how frequently the sitemap should refresh. 
+# Defaults to running daily at 1:15am +# Cron syntax is defined at https://www.quartz-scheduler.org/api/2.3.0/org/quartz/CronTrigger.html +# Remove (comment out) this config to disable the sitemap scheduler. +# Sitemap scheduler can also be disabled by setting to "-" (single dash) in local.cfg. +# Keep in mind, changing the schedule requires rebooting your servlet container, e.g. Tomcat. +sitemap.cron = 0 15 1 * * ? + ##### SHERPA/Romeo Integration Settings #### # the SHERPA/RoMEO endpoint sherpa.romeo.url = http://www.sherpa.ac.uk/romeo/api29.php @@ -1474,7 +1487,7 @@ orcid.url = https://orcid.org/ ## eg: nsi, srsc. ## Each DSpaceControlledVocabulary plugin comes with three configuration options: # vocabulary.plugin._plugin_.hierarchy.store = # default: true -# vocabulary.plugin._plugin_.hierarchy.suggest = # default: true +# vocabulary.plugin._plugin_.hierarchy.suggest = # default: false # vocabulary.plugin._plugin_.delimiter = "" # default: "::" ## ## An example using "srsc" can be found later in this section @@ -1953,6 +1966,7 @@ xmlui.search.metadata_export = admin request.item.type = all # Helpdesk E-mail mail.helpdesk = ${mail.admin} +mail.helpdesk.name = Help Desk # Should all Request Copy emails go to the helpdesk instead of the item submitter? request.item.helpdesk.override = false @@ -2012,3 +2026,4 @@ include = ${module_dir}/translator.cfg include = ${module_dir}/usage-statistics.cfg include = ${module_dir}/versioning.cfg include = ${module_dir}/workflow.cfg +include = ${module_dir}/irus-statistics.cfg diff --git a/dspace/config/emails/request_item.to_admin b/dspace/config/emails/request_item.to_admin new file mode 100644 index 0000000000..244fa5647b --- /dev/null +++ b/dspace/config/emails/request_item.to_admin @@ -0,0 +1,13 @@ +Subject: Request copy of document + +Dear Administrator, + +A user of {7}, named {0} and using the email {1}, requested a copy of the file(s) associated with the document: "{4}" ({3}). + +This request came along with the following message: + +"{5}" + +To answer, click {6}. + +PLEASE REDIRECT THIS MESSAGE TO THE AUTHOR(S). diff --git a/dspace/config/hibernate.cfg.xml b/dspace/config/hibernate.cfg.xml index 8b8d43a340..39f5a11378 100644 --- a/dspace/config/hibernate.cfg.xml +++ b/dspace/config/hibernate.cfg.xml @@ -82,5 +82,7 @@ + + diff --git a/dspace/config/launcher.xml b/dspace/config/launcher.xml index f06225e5cc..3e77704c3d 100644 --- a/dspace/config/launcher.xml +++ b/dspace/config/launcher.xml @@ -54,13 +54,6 @@ org.dspace.administer.CreateAdministrator - - curate - Perform curation tasks on DSpace objects - - org.dspace.curate.CurationCli - - database Perform database tasks like test database connection, migrate/repair database, remove database @@ -194,6 +187,13 @@ org.dspace.administer.RegistryLoader + + retry-tracker + Retry all failed commits to the OpenURLTracker + + org.dspace.statistics.export.RetryFailedOpenUrlTracker + + solr-export-statistics Export usage statistics data from Solr for back-up purposes diff --git a/dspace/config/log4j-handle-plugin.properties b/dspace/config/log4j-handle-plugin.properties index 72381a698c..44d39fb1bd 100644 --- a/dspace/config/log4j-handle-plugin.properties +++ b/dspace/config/log4j-handle-plugin.properties @@ -20,12 +20,12 @@ log.dir=${dspace.dir}/log log4j.rootCategory=INFO, A1 # A1 is set to be a DailyRollingFileAppender. 
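The new `sitemap.cron` property added to dspace.cfg above puts sitemap generation on a Quartz schedule. A hedged example of overriding it from `local.cfg` (values are illustrative; per the comments above, a schedule change only takes effect after the servlet container is restarted):

```
# local.cfg — illustrative overrides only
# Regenerate sitemaps at 2:00am instead of the 1:15am default
sitemap.cron = 0 0 2 * * ?

# ...or disable the scheduler entirely with a single dash
# sitemap.cron = -
```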
-log4j.appender.A1=org.apache.logging.log4j.DailyRollingFileAppender +log4j.appender.A1=org.apache.log4j.DailyRollingFileAppender log4j.appender.A1.File=${log.dir}/handle-plugin.log log4j.appender.A1.DatePattern='.'yyyy-MM-dd # A1 uses PatternLayout. -log4j.appender.A1.layout=org.apache.logging.log4j.PatternLayout +log4j.appender.A1.layout=org.apache.log4j.PatternLayout log4j.appender.A1.layout.ConversionPattern=%d %-5p %c @ %m%n diff --git a/dspace/config/modules/authentication.cfg b/dspace/config/modules/authentication.cfg index d5d010af6d..f22e2eaf19 100644 --- a/dspace/config/modules/authentication.cfg +++ b/dspace/config/modules/authentication.cfg @@ -57,26 +57,57 @@ plugin.sequence.org.dspace.authenticate.AuthenticationMethod = org.dspace.authen # Server key part that is a part of the key used to sign the authentication tokens. # If this property is not set or empty, DSpace will generate a random key on startup. # IF YOU ARE RUNNING DSPACE IN A CLUSTER, you need to set a value for this property here or as an environment variable -# jwt.token.secret = +# jwt.login.token.secret = # This property enables/disables encryption of the payload in a stateless token. Enabling this makes the data encrypted # and unreadable by the receiver, but makes the token larger in size. false by default -jwt.encryption.enabled = false +jwt.login.encryption.enabled = false # Encryption key to use when JWT token encryption is enabled (JWE). Note that encrypting tokens might required additional # configuration in the REST clients -# jwt.encryption.secret = +# jwt.login.encryption.secret = # This enables compression of the payload of a jwt, enabling this will make the jwt token a little smaller at the cost # of some performance, this setting WILL ONLY BE used when encrypting the jwt. -jwt.compression.enabled = true +jwt.login.compression.enabled = true -# Expiration time of a token in minutes -jwt.token.expiration = 30 +# Expiration time of a token in milliseconds +jwt.login.token.expiration = 1800000 # Restrict tokens to a specific ip-address to prevent theft/session hijacking. This is achieved by making the ip-address # a part of the JWT siging key. If this property is set to false then the ip-address won't be used as part of # the signing key of a jwt token and tokens can be shared over multiple ip-addresses. # For security reasons, this defaults to true -jwt.token.include.ip = true +jwt.login.token.include.ip = true + +#---------------------------------------------------------------# +#---Stateless JWT Authentication for downloads of bitstreams----# +#----------------------among other things-----------------------# +#---------------------------------------------------------------# + +# Server key part that is a part of the key used to sign the authentication tokens. +# If this property is not set or empty, DSpace will generate a random key on startup. +# IF YOU ARE RUNNING DSPACE IN A CLUSTER, you need to set a value for this property here or as an environment variable +# jwt.shortLived.token.secret = + +# This property enables/disables encryption of the payload in a stateless token. Enabling this makes the data encrypted +# and unreadable by the receiver, but makes the token larger in size. false by default +jwt.shortLived.encryption.enabled = false + +# Encryption key to use when JWT token encryption is enabled (JWE). 
Note that encrypting tokens might required additional +# configuration in the REST clients +# jwt.shortLived.encryption.secret = + +# This enables compression of the payload of a jwt, enabling this will make the jwt token a little smaller at the cost +# of some performance, this setting WILL ONLY BE used when encrypting the jwt. +jwt.shortLived.compression.enabled = true + +# Expiration time of a token in milliseconds +jwt.shortLived.token.expiration = 2000 + +# Restrict tokens to a specific ip-address to prevent theft/session hijacking. This is achieved by making the ip-address +# a part of the JWT siging key. If this property is set to false then the ip-address won't be used as part of +# the signing key of a jwt token and tokens can be shared over multiple ip-addresses. +# For security reasons, this defaults to true +jwt.shortLived.token.include.ip = true diff --git a/dspace/config/modules/curate.cfg b/dspace/config/modules/curate.cfg index cf1a25410c..df6d4f855a 100644 --- a/dspace/config/modules/curate.cfg +++ b/dspace/config/modules/curate.cfg @@ -11,8 +11,8 @@ plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.NoOpCurationTask = noop plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.ProfileFormats = profileformats plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.RequiredMetadata = requiredmetadata -plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.ClamScan = vscan -plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.MicrosoftTranslator = translate +#plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.ClamScan = vscan +#plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.MicrosoftTranslator = translate plugin.named.org.dspace.curate.CurationTask = org.dspace.ctask.general.MetadataValueLinkChecker = checklinks # add new tasks here (or in additional config files) @@ -25,30 +25,6 @@ curate.taskqueue.dir = ${dspace.dir}/ctqueues # (optional) directory location of scripted (non-java) tasks # curate.script.dir = ${dspace.dir}/ctscripts -# Friendly names for curation tasks to appear in admin UI -# Also acts as a filter - i.e. tasks not enumerated here can still -# be invoked on cmd line, etc - just not in UI -curate.ui.tasknames = profileformats = Profile Bitstream Formats -curate.ui.tasknames = requiredmetadata = Check for Required Metadata -curate.ui.tasknames = checklinks = Check Links in Metadata - -# Tasks may be organized into named groups which display together in UI drop-downs -# curate.ui.taskgroups = \ -# general = General Purpose Tasks, - -# Group membership is defined using comma-separated lists of task names, one property per group -# curate.ui.taskgroup.general = profileformats, requiredmetadata, checklinks - -# Name of queue used when tasks queued in Admin UI -curate.ui.queuename = admin_ui - -# Localized names for curation status codes in Admin UI -curate.ui.statusmessages = \ - -3 = Unknown Task, \ - -2 = No Status Set, \ - -1 = Error, \ - 0 = Success, \ - 1 = Fail, \ - 2 = Skip, \ - other = Invalid Status +# Ensure list of Curation Tasks (defined above) is available via the REST API /api/config/properties endpoint +rest.properties.exposed = plugin.named.org.dspace.curate.CurationTask diff --git a/dspace/config/modules/irus-statistics.cfg b/dspace/config/modules/irus-statistics.cfg new file mode 100644 index 0000000000..3982149ed7 --- /dev/null +++ b/dspace/config/modules/irus-statistics.cfg @@ -0,0 +1,35 @@ +# Enable the IRUS tracker. 
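The authentication.cfg changes above split the stateless JWT settings into a `jwt.login.*` group (session tokens) and a new `jwt.shortLived.*` group (e.g. bitstream downloads). For a clustered deployment the comments above require fixed signing secrets on every node; in `local.cfg` that could look roughly like this (placeholder values only, generate your own random strings):

```
# local.cfg — illustrative placeholders, not real keys
jwt.login.token.secret = replace-with-a-long-random-string
jwt.shortLived.token.secret = replace-with-a-different-long-random-string

# Token lifetimes are now expressed in milliseconds
jwt.login.token.expiration = 1800000
jwt.shortLived.token.expiration = 2000
```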
By default or when omitted, the tracker will be disabled +irus.statistics.tracker.enabled = false + +# OPTIONAL metadata field used for filtering. +# If items with specific values for the "dc.type" field should be excluded, "dc.type" should be placed here. +# This should comply to the syntax schema.element.qualified or schema.element if the qualifier is null. +# irus.statistics.tracker.type-field = dc.type +# If "tracker.type-field" is set, the list of values must be defined in "tracker.type-value". +# This lists a comma separated list of values that will be excluded for the given field. +# irus.statistics.tracker.type-value = Article, Postprint + +# This lists a comma separated list of entities that will be included +# When no list is provided, the default value "Publication" will be used +# irus.statistics.tracker.entity-types = Publication + +# Set the tracker environment to "test" or "production". Defaults to "test" if empty. +# The URL used by the test environment can be configured in property tracker.testurl +# The URL used by the production environment can be configured in property tracker.produrl +irus.statistics.tracker.environment = test +# The url used to test the submission of tracking info to. +irus.statistics.tracker.testurl = https://irus.jisc.ac.uk/counter/test/ +# The base url for submitting the tracking info to. +irus.statistics.tracker.produrl = https://irus.jisc.ac.uk/counter/ +# Identifies data as OpenURL 1.0 +irus.statistics.tracker.urlversion = Z39.88-2004 + +# Add the agentregex configuration below uncommented to local.cfg to include the bot agents list by +# Project COUNTER when filtering bots in DSpace. The agents file is downloaded by the Apache ant +# stage of the build process. + +# Location of the COUNTER agents file +# irus.statistics.spider.agentregex.regexfile = ${dspace.dir}/config/spiders/agents/COUNTER_Robots_list.txt + +# External URL to COUNTER the agents file +# irus.statistics.spider.agentregex.url = https://raw.githubusercontent.com/atmire/COUNTER-Robots/master/generated/COUNTER_Robots_list.txt \ No newline at end of file diff --git a/dspace/config/modules/solr-statistics.cfg b/dspace/config/modules/solr-statistics.cfg index d73baea7cb..5b67cd4799 100644 --- a/dspace/config/modules/solr-statistics.cfg +++ b/dspace/config/modules/solr-statistics.cfg @@ -27,6 +27,10 @@ solr-statistics.configset = statistics # if record is a bot. true by default. #solr-statistics.query.filter.isBot = true +# Whether or not explicit solr.commit can be done in SolrLoggerServiceImpl#postView, or to be left to the autocommit. +# Defaults to true (i.e. via autoCommit, no explicit commits); set to false in statistics tests (e.g. 
StatisticsRestRepositoryIT) +solr-statistics.autoCommit = true + # URLs to download IP addresses of search engine spiders from solr-statistics.spiderips.urls = http://iplists.com/google.txt, \ http://iplists.com/inktomi.txt, \ diff --git a/dspace/config/modules/usage-statistics.cfg b/dspace/config/modules/usage-statistics.cfg index 4703dff0b3..2c4428b213 100644 --- a/dspace/config/modules/usage-statistics.cfg +++ b/dspace/config/modules/usage-statistics.cfg @@ -40,3 +40,7 @@ usage-statistics.authorization.admin.workflow=true # Enable/disable if a matching for a bot should be case sensitive # Setting this value to true will increase cpu usage, but bots will be found more accurately #usage-statistics.bots.case-insensitive = false + +# Set to true if the statistics core is sharded into a core per year, defaults to false +# If you are sharding your statistics index each year by running "dspace stats-util -s", you should set this to "true" +usage-statistics.shardedByYear = false diff --git a/dspace/config/registries/dspace-types.xml b/dspace/config/registries/dspace-types.xml index 56985373f0..f88def2453 100644 --- a/dspace/config/registries/dspace-types.xml +++ b/dspace/config/registries/dspace-types.xml @@ -16,5 +16,18 @@ + + dspace + agreements + end-user + Stores whether the End User Agreement has been accepted by an EPerson. Valid values; true, false + + + + dspace + agreements + cookies + Stores the cookie preferences of an EPerson, as selected in last session. Value will be an array of cookieName/boolean pairs, specifying which cookies are allowed or not allowed. + diff --git a/dspace/config/spring/api/arxiv-integration.xml b/dspace/config/spring/api/arxiv-integration.xml new file mode 100644 index 0000000000..e963e73a20 --- /dev/null +++ b/dspace/config/spring/api/arxiv-integration.xml @@ -0,0 +1,119 @@ + + + + + + + Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + what metadatafield is generated. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dspace/config/spring/api/bibtex-integration.xml b/dspace/config/spring/api/bibtex-integration.xml new file mode 100644 index 0000000000..eeabace1c7 --- /dev/null +++ b/dspace/config/spring/api/bibtex-integration.xml @@ -0,0 +1,52 @@ + + + + + + + Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + what metadatafield is generated. 
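The new `usage-statistics.shardedByYear` switch introduced above only matters for sites that shard their Solr statistics core per year. A short sketch of how it pairs with the existing sharding command (assuming the standard `[dspace]/bin/dspace` launcher):

```
# Shard the statistics core by year (typically run once per year)
[dspace]/bin/dspace stats-util -s

# Then tell DSpace the index is sharded, e.g. in local.cfg:
# usage-statistics.shardedByYear = true
```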
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/dspace/config/spring/api/bte.xml b/dspace/config/spring/api/bte.xml index b081ec5444..59bdc862e1 100644 --- a/dspace/config/spring/api/bte.xml +++ b/dspace/config/spring/api/bte.xml @@ -14,9 +14,7 @@ - - @@ -79,7 +77,6 @@ jeissn pisbn eisbn - arxivCategory keywords mesh language @@ -106,13 +103,9 @@ - - - - @@ -129,40 +122,11 @@ - - - - - - - - - - - arxivCategory - - - - - - - - - - - - - - publicationStatus - - - - @@ -357,75 +321,6 @@ value="http://ebooks.serrelib.gr/serrelib-oai/request" /> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - @@ -544,7 +439,6 @@ - @@ -553,7 +447,6 @@ - diff --git a/dspace/config/spring/api/characterseparated-integration.xml b/dspace/config/spring/api/characterseparated-integration.xml new file mode 100644 index 0000000000..1ee62173f1 --- /dev/null +++ b/dspace/config/spring/api/characterseparated-integration.xml @@ -0,0 +1,80 @@ + + + + + + + Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + what metadatafield is generated. + + + + + + + + + + + + Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + what metadatafield is generated. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/dspace/config/spring/api/core-dao-services.xml b/dspace/config/spring/api/core-dao-services.xml index b1a1f48475..5c68e4fad9 100644 --- a/dspace/config/spring/api/core-dao-services.xml +++ b/dspace/config/spring/api/core-dao-services.xml @@ -62,6 +62,7 @@ + diff --git a/dspace/config/spring/api/core-factory-services.xml b/dspace/config/spring/api/core-factory-services.xml index 85a9daea13..3fc907d226 100644 --- a/dspace/config/spring/api/core-factory-services.xml +++ b/dspace/config/spring/api/core-factory-services.xml @@ -49,4 +49,6 @@ + + diff --git a/dspace/config/spring/api/core-services.xml b/dspace/config/spring/api/core-services.xml index 74bdbf85da..7c45609f65 100644 --- a/dspace/config/spring/api/core-services.xml +++ b/dspace/config/spring/api/core-services.xml @@ -126,6 +126,7 @@ + diff --git a/dspace/config/spring/api/discovery.xml b/dspace/config/spring/api/discovery.xml index 803d1cad81..76e5a07239 100644 --- a/dspace/config/spring/api/discovery.xml +++ b/dspace/config/spring/api/discovery.xml @@ -66,13 +66,13 @@ - - - - + + + + - + @@ -1079,9 +1079,9 @@ - - + @@ -1144,9 +1144,9 @@ - - + @@ -1205,9 +1205,9 @@ - - + @@ -1265,9 +1265,9 @@ - - + @@ -1325,9 +1325,9 @@ - - + diff --git a/dspace/config/spring/api/dublicore-metadata-mapper.xml b/dspace/config/spring/api/dublicore-metadata-mapper.xml new file mode 100644 index 0000000000..6461f129a5 --- /dev/null +++ b/dspace/config/spring/api/dublicore-metadata-mapper.xml @@ -0,0 +1,59 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/dspace/config/spring/api/endnote-integration.xml b/dspace/config/spring/api/endnote-integration.xml new file mode 100644 index 0000000000..15ff3ca6f7 --- /dev/null +++ b/dspace/config/spring/api/endnote-integration.xml @@ 
-0,0 +1,52 @@ + + + + + + + Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + what metadatafield is generated. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/dspace/config/spring/api/external-services.xml b/dspace/config/spring/api/external-services.xml index 520c21a963..b56870b24b 100644 --- a/dspace/config/spring/api/external-services.xml +++ b/dspace/config/spring/api/external-services.xml @@ -30,5 +30,18 @@ + + + + + + + + + + + + + diff --git a/dspace/config/spring/api/openurltracker.xml b/dspace/config/spring/api/openurltracker.xml new file mode 100644 index 0000000000..01dab53904 --- /dev/null +++ b/dspace/config/spring/api/openurltracker.xml @@ -0,0 +1,10 @@ + + + + + + + + \ No newline at end of file diff --git a/dspace/config/spring/api/ris-integration.xml b/dspace/config/spring/api/ris-integration.xml new file mode 100644 index 0000000000..3a7f0feade --- /dev/null +++ b/dspace/config/spring/api/ris-integration.xml @@ -0,0 +1,77 @@ + + + + + + + Defines which metadatum is mapped on which metadatum. Note that while the key must be unique it + only matters here for postprocessing of the value. The mapped MetadatumContributor has full control over + what metadatafield is generated. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/dspace/config/spring/api/scripts.xml b/dspace/config/spring/api/scripts.xml index 0713baad19..8dd374bd79 100644 --- a/dspace/config/spring/api/scripts.xml +++ b/dspace/config/spring/api/scripts.xml @@ -3,7 +3,6 @@ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd"> - @@ -14,8 +13,18 @@ - + - + + + + + + + + + + + diff --git a/dspace/config/spring/api/step-processing-listener.xml b/dspace/config/spring/api/step-processing-listener.xml index eb016c5133..986b850875 100644 --- a/dspace/config/spring/api/step-processing-listener.xml +++ b/dspace/config/spring/api/step-processing-listener.xml @@ -13,9 +13,7 @@ - - diff --git a/dspace/config/spring/rest/event-service-listeners.xml b/dspace/config/spring/rest/event-service-listeners.xml index 6d62e60f17..712497fe12 100644 --- a/dspace/config/spring/rest/event-service-listeners.xml +++ b/dspace/config/spring/rest/event-service-listeners.xml @@ -21,4 +21,9 @@ + + + + + \ No newline at end of file diff --git a/dspace/config/spring/rest/scripts.xml b/dspace/config/spring/rest/scripts.xml index 04cacb4930..bec5469f25 100644 --- a/dspace/config/spring/rest/scripts.xml +++ b/dspace/config/spring/rest/scripts.xml @@ -12,4 +12,14 @@ + + + + + + + + + + \ No newline at end of file diff --git a/dspace/config/submission-forms.xml b/dspace/config/submission-forms.xml index 1a6ddcf049..9729fb74c5 100644 --- a/dspace/config/submission-forms.xml +++ b/dspace/config/submission-forms.xml @@ -489,8 +489,8 @@

    - isVolumeOfJournal - periodical + isJournalOfVolume + journal creativework.publisher:somepublishername Select the journal related to this volume. @@ -614,7 +614,7 @@ isAuthorOfPublication - personOrOrganization + personOrOrgunit true true @@ -1750,4 +1750,4 @@ - \ No newline at end of file + diff --git a/dspace/modules/additions/pom.xml b/dspace/modules/additions/pom.xml index 0c05de84a4..4eaee1f8e2 100644 --- a/dspace/modules/additions/pom.xml +++ b/dspace/modules/additions/pom.xml @@ -17,7 +17,7 @@ org.dspace modules - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. @@ -26,6 +26,40 @@ ${basedir}/../../.. + + + + + org.codehaus.gmaven + groovy-maven-plugin + + + setproperty + initialize + + execute + + + + project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); + log.info("Initializing Maven property 'agnostic.build.dir' to: {}", project.properties['agnostic.build.dir']); + + + + + + + + oracle-support @@ -42,20 +76,20 @@ - + + - test-environment + unit-test-environment false - maven.test.skip + skipUnitTests false - @@ -75,50 +109,12 @@ - setupTestEnvironment + setupUnitTestEnvironment generate-test-resources unpack - - setupIntegrationTestEnvironment - pre-integration-test - - unpack - - - - - - - - org.codehaus.gmaven - groovy-maven-plugin - - - setproperty - generate-test-resources - - - execute - - - - project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); - println("Initializing Maven property 'agnostic.build.dir' to: " + project.properties['agnostic.build.dir']); - - - @@ -136,6 +132,60 @@ + + + + + + org.dspace + dspace-api + test-jar + test + + + + + + + integration-test-environment + + false + + skipIntegrationTests + false + + + + + + + maven-dependency-plugin + + ${project.build.directory}/testing + + + org.dspace + dspace-parent + ${project.version} + zip + testEnvironment + + + + + + setupIntegrationTestEnvironment + pre-integration-test + + unpack + + + + @@ -158,12 +208,12 @@ org.dspace dspace-api - 7.0-beta3-SNAPSHOT test-jar test + + + org.apache.lucene + lucene-analyzers-icu + test + + + org.apache.lucene + lucene-analyzers-smartcn + test + + + org.apache.lucene + lucene-analyzers-stempel + test + + + org.apache.xmlbeans + xmlbeans + 2.6.0 + junit diff --git a/dspace/modules/pom.xml b/dspace/modules/pom.xml index 84c685a981..4d9e654b7f 100644 --- a/dspace/modules/pom.xml +++ b/dspace/modules/pom.xml @@ -11,7 +11,7 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT ../../pom.xml @@ -44,6 +44,10 @@ dspace-rest false + + + release + rest diff --git a/dspace/modules/rest/pom.xml b/dspace/modules/rest/pom.xml index a17ff70f80..e801ea7e27 100644 --- a/dspace/modules/rest/pom.xml +++ b/dspace/modules/rest/pom.xml @@ -13,7 +13,7 @@ org.dspace modules - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. diff --git a/dspace/modules/server/pom.xml b/dspace/modules/server/pom.xml index 06d466f522..75467c5d67 100644 --- a/dspace/modules/server/pom.xml +++ b/dspace/modules/server/pom.xml @@ -13,7 +13,7 @@ just adding new jar in the classloader modules org.dspace - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT .. 
@@ -73,24 +73,52 @@ just adding new jar in the classloader + + + org.codehaus.gmaven + groovy-maven-plugin + + + setproperty + initialize + + execute + + + + project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); + log.info("Initializing Maven property 'agnostic.build.dir' to: {}", project.properties['agnostic.build.dir']); + + + + + - + - test-environment + unit-test-environment false - maven.test.skip + skipUnitTests false - @@ -110,49 +138,12 @@ just adding new jar in the classloader - setupTestEnvironment + setupUnitTestEnvironment generate-test-resources unpack - - setupIntegrationTestEnvironment - pre-integration-test - - unpack - - - - - - - - org.codehaus.gmaven - groovy-maven-plugin - - - setproperty - initialize - - execute - - - - project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/'); - log.info("Initializing Maven property 'agnostic.build.dir' to: {}", project.properties['agnostic.build.dir']); - - - @@ -171,6 +162,60 @@ just adding new jar in the classloader + + + + + + org.dspace + dspace-server-webapp + test-jar + test + + + + + + + integration-test-environment + + false + + skipIntegrationTests + false + + + + + + + maven-dependency-plugin + + ${project.build.directory}/testing + + + org.dspace + dspace-parent + ${project.version} + zip + testEnvironment + + + + + + setupIntegrationTestEnvironment + pre-integration-test + + unpack + + + + @@ -194,12 +239,12 @@ just adding new jar in the classloader org.dspace dspace-server-webapp - 7.0-beta3-SNAPSHOT test-jar test + oracle-support @@ -235,6 +280,18 @@ just adding new jar in the classloader + + org.dspace + dspace-api + test-jar + test + + + org.dspace + dspace-server-webapp + test-jar + test + org.springframework.boot spring-boot-starter-test @@ -266,6 +323,14 @@ just adding new jar in the classloader solr-cell test + + org.bouncycastle + bcpkix-jdk15on + + + org.bouncycastle + bcprov-jdk15on + org.eclipse.jetty jetty-continuation diff --git a/dspace/pom.xml b/dspace/pom.xml index 6c693eacec..5a332c03c7 100644 --- a/dspace/pom.xml +++ b/dspace/pom.xml @@ -16,8 +16,8 @@ org.dspace dspace-parent - 7.0-beta3-SNAPSHOT - .. 
+ 7.0-beta5-SNAPSHOT + ../pom.xml @@ -148,28 +148,28 @@ - + - test-coverage-report + coverage-report false - - maven.test.skip - false - - + org.jacoco jacoco-maven-plugin aggregate-test-report - post-integration-test + verify report-aggregate @@ -180,8 +180,6 @@ **/jacoco-ut.exec **/jacoco-it.exec - - ${project.reporting.outputDirectory}/jacoco-aggregated @@ -229,50 +227,6 @@ - - - - coveralls - - false - - - - - org.eluder.coveralls - coveralls-maven-plugin - - - report-test-coverage - verify - - report - - - false - - ${project.reporting.outputDirectory}/jacoco-aggregated/jacoco.xml - - - ${project.parent.basedir}/dspace-api/src/main/java - ${project.parent.basedir}/dspace-api/target/generated-sources/annotations - ${project.parent.basedir}/dspace-oai/src/main/java - ${project.parent.basedir}/dspace-rdf/src/main/java - ${project.parent.basedir}/dspace-rest/src/main/java - ${project.parent.basedir}/dspace-services/src/main/java - ${project.parent.basedir}/dspace-server-webapp/src/main/java - ${project.parent.basedir}/dspace-sword/src/main/java - ${project.parent.basedir}/dspace-swordv2/src/main/java - - - - - - - - diff --git a/dspace/src/main/config/build.xml b/dspace/src/main/config/build.xml index 4a48dce374..990e767c63 100644 --- a/dspace/src/main/config/build.xml +++ b/dspace/src/main/config/build.xml @@ -114,6 +114,7 @@ Common usage: + @@ -177,6 +178,7 @@ Common usage: + @@ -827,6 +829,8 @@ Common usage: + + ==================================================================== The DSpace code has been installed. @@ -856,4 +860,27 @@ Common usage: + + + Downloading: ${irus.statistics.spider.agentregex.url} + + + + + + + + + + + + + + + + + + + + diff --git a/dspace/src/main/docker-compose/README.md b/dspace/src/main/docker-compose/README.md index 9c92f627b6..372a03a6c5 100644 --- a/dspace/src/main/docker-compose/README.md +++ b/dspace/src/main/docker-compose/README.md @@ -84,3 +84,28 @@ Download an assetstore from a tar file on the internet. ``` docker-compose -p d7 -f docker-compose-cli.yml -f dspace/src/main/docker-compose/cli.assetstore.yml run dspace-cli ``` + +## Modify DSpace Configuration in Docker +While your Docker containers are running, you may directly modify the `local.cfg` in this directory which will change the DSpace configuration for the running Docker container. (Keep in mind, this works because our `docker-compose.yml` mounts this `[src]/dspace/src/main/docker-compose/local.cfg` from the host into the running Docker instance.) + +Many DSpace configuration settings will reload automatically (after a few seconds). However, configurations which are cached by DSpace (or by Spring Boot) may require you to quickly reboot the Docker containers by running `docker-compose -p d7 down` followed by `docker-compose -p d7 up -d`. + +## Running DSpace CLI scripts in Docker +While the Docker containers are running, you can use the DSpace CLI image to run any DSpace commandline script (i.e. any command that normally can be run by `[dspace]/bin/dspace`). The general format is: + +``` +docker-compose -p d7 -f docker-compose-cli.yml run --rm dspace-cli [command] [parameters] +``` + +So, for example, to reindex all content in Discovery, normally you'd run `./dspace index-discovery -b` from commandline. 
Using our DSpace CLI image, that command becomes: + +``` +docker-compose -p d7 -f docker-compose-cli.yml run --rm dspace-cli index-discovery -b +``` + +Similarly, you can see the value of any DSpace configuration (in local.cfg or dspace.cfg) by running: + +``` +# Output the value of `dspace.ui.url` from running Docker instance +docker-compose -p d7 -f docker-compose-cli.yml run --rm dspace-cli dsprop -p dspace.ui.url +``` diff --git a/dspace/src/main/docker-compose/cli.ingest.yml b/dspace/src/main/docker-compose/cli.ingest.yml index 16cebe4f3b..d22a235d4f 100644 --- a/dspace/src/main/docker-compose/cli.ingest.yml +++ b/dspace/src/main/docker-compose/cli.ingest.yml @@ -11,7 +11,7 @@ version: "3.7" services: dspace-cli: environment: - - AIPZIP=https://github.com/DSpace-Labs/AIP-Files/raw/master/dogAndReport.zip + - AIPZIP=https://github.com/DSpace-Labs/AIP-Files/raw/main/dogAndReport.zip - ADMIN_EMAIL=test@test.edu - AIPDIR=/tmp/aip-dir entrypoint: diff --git a/dspace/src/main/docker-compose/environment.dev.ts b/dspace/src/main/docker-compose/environment.dev.ts index 573c8ebb67..0e603ef11d 100644 --- a/dspace/src/main/docker-compose/environment.dev.ts +++ b/dspace/src/main/docker-compose/environment.dev.ts @@ -13,6 +13,6 @@ export const environment = { host: 'localhost', port: 8080, // NOTE: Space is capitalized because 'namespace' is a reserved string in TypeScript - nameSpace: '/server/api' + nameSpace: '/server' } }; diff --git a/dspace/src/main/docker/README.md b/dspace/src/main/docker/README.md index a885d16ab4..dd5ec26bd5 100644 --- a/dspace/src/main/docker/README.md +++ b/dspace/src/main/docker/README.md @@ -27,7 +27,7 @@ This image deploys two DSpace webapps: docker build -t dspace/dspace:dspace-7_x-test -f Dockerfile.test . ``` -This image is built *automatically* after each commit is made to the `master` branch. +This image is built *automatically* after each commit is made to the `main` branch. A corresponding image exists for DSpace 4-6. @@ -46,7 +46,7 @@ This image deploys two DSpace webapps: docker build -t dspace/dspace:dspace-7_x -f Dockerfile . ``` -This image is built *automatically* after each commit is made to the `master` branch. +This image is built *automatically* after each commit is made to the `main` branch. A corresponding image exists for DSpace 4-6. @@ -62,7 +62,7 @@ This Dockerfile builds a DSpace 7 CLI image, which can be used to run commandlin docker build -t dspace/dspace-cli:dspace-7_x -f Dockerfile.cli . ``` -This image is built *automatically* after each commit is made to the master branch. +This image is built *automatically* after each commit is made to the `main` branch. A corresponding image exists for DSpace 6. diff --git a/pom.xml b/pom.xml index aa8a8b15dd..8bd508886f 100644 --- a/pom.xml +++ b/pom.xml @@ -4,7 +4,7 @@ org.dspace dspace-parent pom - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT DSpace Parent Project DSpace open source software is a turnkey institutional repository application. 
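The docker-compose README additions above demonstrate the general `dspace-cli` pattern with `index-discovery` and `dsprop`. The same form should work for any other launcher command; for example (assuming the `create-administrator` command name registered in launcher.xml):

```
# Interactively create an administrator account using the running Docker stack
docker-compose -p d7 -f docker-compose-cli.yml run --rm dspace-cli create-administrator
```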
@@ -35,8 +35,8 @@ 1.3.2 2.3.1 2.3.1 - - 9.4.8.v20171121 + + 9.4.15.v20190215 2.11.2 2.0.15 3.17 @@ -61,6 +61,17 @@ UTF-8 ${project.build.sourceEncoding} + + + + true + true + ${basedir} @@ -160,7 +171,7 @@ - + org.apache.maven.plugins maven-surefire-plugin @@ -177,13 +188,15 @@ true false + + ${skipUnitTests} - + maven-failsafe-plugin 2.22.2 @@ -198,10 +211,11 @@ true false + + ${skipIntegrationTests} - integration-test integration-test verify @@ -249,14 +263,21 @@ - org.codehaus.mojo - findbugs-maven-plugin - 3.0.5 + com.github.spotbugs + spotbugs-maven-plugin + 4.0.4 Max Low true + + + com.github.spotbugs + spotbugs + 4.1.2 + + compile @@ -289,7 +310,7 @@ org.apache.maven.plugins maven-dependency-plugin - 3.1.1 + 3.1.2 org.apache.maven.plugins @@ -312,7 +333,7 @@ org.apache.maven.plugins maven-javadoc-plugin - 3.1.1 + 3.2.0 false @@ -343,21 +364,6 @@ jacoco-maven-plugin 0.8.5 - - - org.eluder.coveralls - coveralls-maven-plugin - 4.3.0 - - - - javax.xml.bind - jaxb-api - ${jaxb-api.version} - - - @@ -370,8 +376,9 @@ maven-release-plugin 2.5.3 - - release + + -Drelease deploy dspace-@{project.version} @@ -475,38 +482,6 @@ - - - skiptests - - - - !maven.test.skip - - - - true - - - - - - skipits - - - - !skipITs - - - - true - - - + - generate-test-env + test-environment - false - - maven.test.skip - false - src/main/assembly/testEnvironment.xml + + + !release + @@ -567,12 +539,13 @@ + - measure-test-coverage + measure-unit-test-coverage false - maven.test.skip + skipUnitTests false @@ -602,7 +575,29 @@ surefireJacoco + + + + + + + + measure-integration-test-coverage + + false + + skipIntegrationTests + false + + + + + + + org.jacoco + jacoco-maven-plugin + + + release + dspace-rest @@ -848,14 +847,14 @@ org.dspace dspace-rest - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT jar classes org.dspace dspace-rest - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT war @@ -913,25 +912,19 @@ The 'release' profile is used by the 'maven-release-plugin' (see above) to actually perform a DSpace software release to Maven central. This profile contains settings which are ONLY enabled when performing - a DSpace release. See alse https://wiki.duraspace.org/display/DSPACE/Release+Procedure + a DSpace release. 
See also https://wiki.duraspace.org/display/DSPACE/Release+Procedure + NOTE: You MUST trigger this profile by running "-Drelease" + (as that flag also triggers other modules to be enabled/disabled as necessary for release) --> release false + + + release + - - - dspace-api - dspace-oai - dspace-rdf - dspace-rest - dspace-services - dspace-sword - dspace-swordv2 - dspace-server-webapp - + org.apache.maven.plugins maven-javadoc-plugin @@ -970,7 +963,7 @@ attach-javadocs - aggregate-jar + jar @@ -1008,50 +1001,64 @@ org.dspace dspace-api - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT + + + org.dspace + dspace-api + test-jar + 7.0-beta5-SNAPSHOT + test org.dspace.modules additions - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT org.dspace dspace-sword - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT org.dspace dspace-swordv2 - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT org.dspace dspace-oai - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT org.dspace dspace-services - 7.0-beta3-SNAPSHOT - - - org.dspace - dspace-rdf - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT org.dspace dspace-server-webapp - 7.0-beta3-SNAPSHOT + test-jar + 7.0-beta5-SNAPSHOT + test + + + org.dspace + dspace-rdf + 7.0-beta5-SNAPSHOT + + + org.dspace + dspace-server-webapp + 7.0-beta5-SNAPSHOT jar classes org.dspace dspace-server-webapp - 7.0-beta3-SNAPSHOT + 7.0-beta5-SNAPSHOT war @@ -1209,6 +1216,11 @@ solr-cell ${solr.client.version} + + org.apache.lucene + lucene-core + ${solr.client.version} + @@ -1217,6 +1229,18 @@ ${solr.client.version} test + + org.apache.lucene + lucene-analyzers-smartcn + ${solr.client.version} + test + + + org.apache.lucene + lucene-analyzers-stempel + ${solr.client.version} + test + org.apache.ant @@ -1236,9 +1260,15 @@ - org.dspace + net.handle handle - 9.1.0.v20190416 + 9.3.0 + + + + net.cnri + cnri-servlet-container + 3.0.0 @@ -1261,12 +1291,12 @@ commons-beanutils commons-beanutils - 1.9.3 + 1.9.4 commons-cli commons-cli - 1.3.1 + 1.4 commons-codec @@ -1590,7 +1620,7 @@ com.google.code.gson gson - 2.6.1 + 2.8.6 compile @@ -1629,6 +1659,7 @@ google-oauth-client 1.23.0 + com.google.code.findbugs @@ -1642,6 +1673,7 @@ 3.0.1u2 provided + com.fasterxml @@ -1716,7 +1748,7 @@ DuraSpace BSD License - https://raw.github.com/DSpace/DSpace/master/LICENSE + https://raw.github.com/DSpace/DSpace/main/LICENSE repo A BSD 3-Clause license for the DSpace codebase. @@ -1829,8 +1861,13 @@ - + + + maven-central + https://repo.maven.apache.org/maven2 + + maven-snapshots https://oss.sonatype.org/content/repositories/snapshots @@ -1842,6 +1879,11 @@ true + + + handle.net + https://handle.net/maven +
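The parent pom above replaces the old `maven.test.skip` / `skiptests` switches with separate `skipUnitTests` and `skipIntegrationTests` properties and renames the test profiles accordingly. Assuming those properties keep the defaults of `true` shown earlier in the pom (i.e. tests are skipped by default), enabling them from the command line would look roughly like:

```
# Build and run unit tests only
mvn clean install -DskipUnitTests=false

# Build and run both unit and integration tests
mvn clean install -DskipUnitTests=false -DskipIntegrationTests=false
```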