Merge branch 'main' into CST-7756-SubscriptionFeature

Author: corrado lombardi
Date: 2022-12-30 11:29:43 +01:00
58 changed files with 928 additions and 164 deletions


@@ -1,7 +1,7 @@
 ## References
 _Add references/links to any related issues or PRs. These may include:_
-* Fixes #[issue-number]
-* Related to [REST Contract](https://github.com/DSpace/Rest7Contract)
+* Fixes #`issue-number` (if this fixes an issue ticket)
+* Related to DSpace/RestContract#`pr-number` (if a corresponding REST Contract PR exists)
 ## Description
 Short summary of changes (1-2 sentences).
@@ -22,5 +22,7 @@ _This checklist provides a reminder of what we are going to look for when review
 - [ ] My PR passes Checkstyle validation based on the [Code Style Guide](https://wiki.lyrasis.org/display/DSPACE/Code+Style+Guide).
 - [ ] My PR includes Javadoc for _all new (or modified) public methods and classes_. It also includes Javadoc for large or complex private methods.
 - [ ] My PR passes all tests and includes new/updated Unit or Integration Tests based on the [Code Testing Guide](https://wiki.lyrasis.org/display/DSPACE/Code+Testing+Guide).
-- [ ] If my PR includes new, third-party dependencies (in any `pom.xml`), I've made sure their licenses align with the [DSpace BSD License](https://github.com/DSpace/DSpace/blob/main/LICENSE) based on the [Licensing of Contributions](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines#CodeContributionGuidelines-LicensingofContributions) documentation.
+- [ ] If my PR includes new libraries/dependencies (in any `pom.xml`), I've made sure their licenses align with the [DSpace BSD License](https://github.com/DSpace/DSpace/blob/main/LICENSE) based on the [Licensing of Contributions](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines#CodeContributionGuidelines-LicensingofContributions) documentation.
-- [ ] If my PR modifies the REST API, I've opened a separate [REST Contract](https://github.com/DSpace/RestContract/blob/main/README.md) PR related to this change.
+- [ ] If my PR modifies REST API endpoints, I've opened a separate [REST Contract](https://github.com/DSpace/RestContract/blob/main/README.md) PR related to this change.
+- [ ] If my PR includes new configurations, I've provided basic technical documentation in the PR itself.
+- [ ] If my PR fixes an issue ticket, I've [linked them together](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue).

.github/workflows/codescan.yml (new file, 59 lines)

@@ -0,0 +1,59 @@
# DSpace CodeQL code scanning configuration for GitHub
# https://docs.github.com/en/code-security/code-scanning
#
# NOTE: Code scanning must be run separate from our default build.yml
# because CodeQL requires a fresh build with all tests *disabled*.
name: "Code Scanning"
# Run this code scan for all pushes / PRs to main branch. Also run once a week.
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
    # Don't run if PR is only updating static documentation
    paths-ignore:
      - '**/*.md'
      - '**/*.txt'
  schedule:
    - cron: "37 0 * * 1"

jobs:
  analyze:
    name: Analyze Code
    runs-on: ubuntu-latest
    # Limit permissions of this GitHub action. Can only write to security-events
    permissions:
      actions: read
      contents: read
      security-events: write

    steps:
      # https://github.com/actions/checkout
      - name: Checkout repository
        uses: actions/checkout@v3

      # https://github.com/actions/setup-java
      - name: Install JDK
        uses: actions/setup-java@v3
        with:
          java-version: 11
          distribution: 'temurin'

      # Initializes the CodeQL tools for scanning.
      # https://github.com/github/codeql-action
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v2
        with:
          # Codescan Javascript as well since a few JS files exist in REST API's interface
          languages: java, javascript

      # Autobuild attempts to build any compiled languages
      # NOTE: Based on testing, this autobuild process works well for DSpace. A custom
      # DSpace build w/caching (like in build.yml) was about the same speed as autobuild.
      - name: Autobuild
        uses: github/codeql-action/autobuild@v2

      # Perform GitHub Code Scanning.
      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v2

CONTRIBUTING.md (new file, 45 lines)

@@ -0,0 +1,45 @@
# How to Contribute
DSpace is a community built and supported project. We do not have a centralized development or support team, but have a dedicated group of volunteers who help us improve the software, documentation, resources, etc.
* [Contribute new code via a Pull Request](#contribute-new-code-via-a-pull-request)
* [Contribute documentation](#contribute-documentation)
* [Help others on mailing lists or Slack](#help-others-on-mailing-lists-or-slack)
* [Join a working or interest group](#join-a-working-or-interest-group)
## Contribute new code via a Pull Request
We accept [GitHub Pull Requests (PRs)](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request-from-a-fork) at any time from anyone.
Contributors to each release are recognized in our [Release Notes](https://wiki.lyrasis.org/display/DSDOC7x/Release+Notes).
Code Contribution Checklist
- [ ] PRs _should_ be smaller in size (ideally less than 1,000 lines of code, not including comments & tests)
- [ ] PRs **must** pass Checkstyle validation based on our [Code Style Guide](https://wiki.lyrasis.org/display/DSPACE/Code+Style+Guide).
- [ ] PRs **must** include Javadoc for _all new/modified public methods and classes_. Larger private methods should also have Javadoc
- [ ] PRs **must** pass all automated tests and include new/updated Unit or Integration tests based on our [Code Testing Guide](https://wiki.lyrasis.org/display/DSPACE/Code+Testing+Guide).
- [ ] If a PR includes new libraries/dependencies (in any `pom.xml`), then their software licenses **must** align with the [DSpace BSD License](https://github.com/DSpace/DSpace/blob/main/LICENSE) based on the [Licensing of Contributions](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines#CodeContributionGuidelines-LicensingofContributions) documentation.
- [ ] Basic technical documentation _should_ be provided for any new features or changes to the REST API. REST API changes should be documented in our [Rest Contract](https://github.com/DSpace/RestContract).
- [ ] If a PR fixes an issue ticket, please [link them together](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue).
Additional details on the code contribution process can be found in our [Code Contribution Guidelines](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines)
## Contribute documentation
DSpace Documentation is a collaborative effort in a shared Wiki. The latest documentation is at https://wiki.lyrasis.org/display/DSDOC7x
If you find areas of the DSpace Documentation which you wish to improve, please request a Wiki account by emailing wikihelp@lyrasis.org.
Once you have an account setup, contact @tdonohue (via [Slack](https://wiki.lyrasis.org/display/DSPACE/Slack) or email) for access to edit our Documentation.
## Help others on mailing lists or Slack
DSpace has our own [Slack](https://wiki.lyrasis.org/display/DSPACE/Slack) community and [Mailing Lists](https://wiki.lyrasis.org/display/DSPACE/Mailing+Lists) where discussions take place and questions are answered.
Anyone is welcome to join and help others. We just ask you to follow our [Code of Conduct](https://www.lyrasis.org/about/Pages/Code-of-Conduct.aspx) (adopted via LYRASIS).
## Join a working or interest group
Most of the work in building/improving DSpace comes via [Working Groups](https://wiki.lyrasis.org/display/DSPACE/DSpace+Working+Groups) or [Interest Groups](https://wiki.lyrasis.org/display/DSPACE/DSpace+Interest+Groups).
All working/interest groups are open to anyone to join and participate. A few key groups to be aware of include:
* [DSpace 7 Working Group](https://wiki.lyrasis.org/display/DSPACE/DSpace+7+Working+Group) - This is the main (mostly volunteer) development team. We meet weekly to review our current development [project board](https://github.com/orgs/DSpace/projects), assigning tickets and/or PRs.
* [DSpace Community Advisory Team (DCAT)](https://wiki.lyrasis.org/display/cmtygp/DSpace+Community+Advisory+Team) - This is an interest group for repository managers/administrators. We meet monthly to discuss DSpace, share tips & provide feedback back to developers.


@@ -48,18 +48,7 @@ See [Running DSpace 7 with Docker Compose](dspace/src/main/docker-compose/README
 ## Contributing
-DSpace is a community built and supported project. We do not have a centralized development or support team,
-but have a dedicated group of volunteers who help us improve the software, documentation, resources, etc.
-We welcome contributions of any type. Here's a few basic guides that provide suggestions for contributing to DSpace:
-* [How to Contribute to DSpace](https://wiki.lyrasis.org/display/DSPACE/How+to+Contribute+to+DSpace): How to contribute in general (via code, documentation, bug reports, expertise, etc)
-* [Code Contribution Guidelines](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines): How to give back code or contribute features, bug fixes, etc.
-* [DSpace Community Advisory Team (DCAT)](https://wiki.lyrasis.org/display/cmtygp/DSpace+Community+Advisory+Team): If you are not a developer, we also have an interest group specifically for repository managers. The DCAT group meets virtually, once a month, and sends open invitations to join their meetings via the [DCAT mailing list](https://groups.google.com/d/forum/DSpaceCommunityAdvisoryTeam).
-We also encourage GitHub Pull Requests (PRs) at any time. Please see our [Development with Git](https://wiki.lyrasis.org/display/DSPACE/Development+with+Git) guide for more info.
-In addition, a listing of all known contributors to DSpace software can be
-found online at: https://wiki.lyrasis.org/display/DSPACE/DSpaceContributors
+See [Contributing documentation](CONTRIBUTING.md)
 ## Getting Help


@@ -92,9 +92,7 @@ For more information on CheckStyle configurations below, see: http://checkstyle.
     <!-- Requirements for Javadocs for methods -->
     <module name="JavadocMethod">
         <!-- All public methods MUST HAVE Javadocs -->
-        <!-- <property name="scope" value="public"/> -->
-        <!-- TODO: Above rule has been disabled because of large amount of missing public method Javadocs -->
-        <property name="scope" value="nothing"/>
+        <property name="scope" value="public"/>
         <!-- Allow params, throws and return tags to be optional -->
         <property name="allowMissingParamTags" value="true"/>
         <property name="allowMissingReturnTag" value="true"/>


@@ -598,18 +598,19 @@ public class MetadataImport extends DSpaceRunnable<MetadataImportScriptConfigura
                 changes.add(whatHasChanged);
             }
-            if (change) {
-                //only clear cache if changes have been made.
-                c.uncacheEntity(wsItem);
-                c.uncacheEntity(wfItem);
-                c.uncacheEntity(item);
+            if (change && (rowCount % configurationService.getIntProperty("bulkedit.change.commit.count", 100) == 0)) {
+                c.commit();
+                handler.logInfo(LogHelper.getHeader(c, "metadata_import_commit", "lineNumber=" + rowCount));
             }
             populateRefAndRowMap(line, item == null ? null : item.getID());
             // keep track of current rows processed
             rowCount++;
         }
+        if (change) {
+            c.commit();
+        }
-        c.setMode(originalMode);
+        c.setMode(Context.Mode.READ_ONLY);
         // Return the changes
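The new loop commits in batches (every `bulkedit.change.commit.count` rows, defaulting to 100) instead of once at the end. A self-contained sketch of that commit-batching shape, with a hypothetical `FakeContext` standing in for the DSpace `Context` so commit points can be observed:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCommitSketch {
    /** Records commit points so the batching behavior can be observed (illustrative only). */
    static class FakeContext {
        final List<Integer> commitPoints = new ArrayList<>();
        void commit(int rowCount) {
            commitPoints.add(rowCount);
        }
    }

    /**
     * Processes rows, committing every batchSize changed rows and once more
     * at the end if any change is pending -- mirroring the shape of the
     * MetadataImport change above.
     */
    static List<Integer> process(int totalRows, int batchSize) {
        FakeContext c = new FakeContext();
        boolean change = false;
        int rowCount = 1;
        for (int i = 0; i < totalRows; i++) {
            change = true; // pretend every row changes something
            if (change && (rowCount % batchSize == 0)) {
                c.commit(rowCount);
            }
            rowCount++;
        }
        if (change) {
            c.commit(rowCount - 1); // final commit for the tail of the batch
        }
        return c.commitPoints;
    }

    public static void main(String[] args) {
        // 250 rows, batch of 100: intermediate commits at rows 100 and 200, final at 250
        System.out.println(process(250, 100));
    }
}
```

Periodic commits keep the Hibernate session small on large CSVs, at the cost that a failure mid-import leaves earlier batches already committed.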


@@ -67,6 +67,7 @@ public class ItemImport extends DSpaceRunnable<ItemImportScriptConfiguration> {
     protected String eperson = null;
     protected String[] collections = null;
     protected boolean isTest = false;
+    protected boolean isExcludeContent = false;
     protected boolean isResume = false;
     protected boolean useWorkflow = false;
     protected boolean useWorkflowSendEmail = false;
@@ -119,6 +120,8 @@ public class ItemImport extends DSpaceRunnable<ItemImportScriptConfiguration> {
             handler.logInfo("**Test Run** - not actually importing items.");
         }
+        isExcludeContent = commandLine.hasOption('x');
         if (commandLine.hasOption('p')) {
             template = true;
         }
@@ -204,6 +207,7 @@ public class ItemImport extends DSpaceRunnable<ItemImportScriptConfiguration> {
                 .getItemImportService();
         try {
             itemImportService.setTest(isTest);
+            itemImportService.setExcludeContent(isExcludeContent);
             itemImportService.setResume(isResume);
             itemImportService.setUseWorkflow(useWorkflow);
             itemImportService.setUseWorkflowSendEmail(useWorkflowSendEmail);


@@ -13,7 +13,7 @@ import org.dspace.scripts.configuration.ScriptConfiguration;
 /**
  * The {@link ScriptConfiguration} for the {@link ItemImportCLI} script
  *
  * @author Francesco Pio Scognamiglio (francescopio.scognamiglio at 4science.com)
  */
 public class ItemImportCLIScriptConfiguration extends ItemImportScriptConfiguration<ItemImportCLI> {
@@ -55,6 +55,9 @@ public class ItemImportCLIScriptConfiguration extends ItemImportScriptConfigurat
         options.addOption(Option.builder("v").longOpt("validate")
                 .desc("test run - do not actually import items")
                 .hasArg(false).required(false).build());
+        options.addOption(Option.builder("x").longOpt("exclude-bitstreams")
+                .desc("do not load or expect content bitstreams")
+                .hasArg(false).required(false).build());
         options.addOption(Option.builder("p").longOpt("template")
                 .desc("apply template")
                 .hasArg(false).required(false).build());


@@ -19,7 +19,7 @@ import org.springframework.beans.factory.annotation.Autowired;
 /**
  * The {@link ScriptConfiguration} for the {@link ItemImport} script
  *
  * @author Francesco Pio Scognamiglio (francescopio.scognamiglio at 4science.com)
  */
 public class ItemImportScriptConfiguration<T extends ItemImport> extends ScriptConfiguration<T> {
@@ -81,6 +81,9 @@ public class ItemImportScriptConfiguration<T extends ItemImport> extends ScriptC
         options.addOption(Option.builder("v").longOpt("validate")
                 .desc("test run - do not actually import items")
                 .hasArg(false).required(false).build());
+        options.addOption(Option.builder("x").longOpt("exclude-bitstreams")
+                .desc("do not load or expect content bitstreams")
+                .hasArg(false).required(false).build());
         options.addOption(Option.builder("p").longOpt("template")
                 .desc("apply template")
                 .hasArg(false).required(false).build());


@@ -62,6 +62,7 @@ import org.apache.commons.io.FileUtils;
 import org.apache.commons.lang3.RandomStringUtils;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.commons.lang3.exception.ExceptionUtils;
+import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
 import org.dspace.app.itemimport.service.ItemImportService;
 import org.dspace.app.util.LocalSchemaFilenameFilter;
@@ -135,7 +136,7 @@ import org.xml.sax.SAXException;
  * allow the registration of files (bitstreams) into DSpace.
  */
 public class ItemImportServiceImpl implements ItemImportService, InitializingBean {
-    private final Logger log = org.apache.logging.log4j.LogManager.getLogger(ItemImportServiceImpl.class);
+    private final Logger log = LogManager.getLogger();
     private DSpaceRunnableHandler handler;
@@ -181,6 +182,7 @@ public class ItemImportServiceImpl implements ItemImportService, InitializingBea
     protected String tempWorkDir;
     protected boolean isTest = false;
+    protected boolean isExcludeContent = false;
     protected boolean isResume = false;
     protected boolean useWorkflow = false;
     protected boolean useWorkflowSendEmail = false;
@@ -1403,6 +1405,10 @@ public class ItemImportServiceImpl implements ItemImportService, InitializingBea
     protected void processContentFileEntry(Context c, Item i, String path,
             String fileName, String bundleName, boolean primary) throws SQLException,
             IOException, AuthorizeException {
+        if (isExcludeContent) {
+            return;
+        }
         String fullpath = path + File.separatorChar + fileName;
         // get an input stream
@@ -2342,6 +2348,11 @@ public class ItemImportServiceImpl implements ItemImportService, InitializingBea
         this.isTest = isTest;
     }
+    @Override
+    public void setExcludeContent(boolean isExcludeContent) {
+        this.isExcludeContent = isExcludeContent;
+    }
     @Override
     public void setResume(boolean isResume) {
         this.isResume = isResume;


@@ -211,6 +211,13 @@ public interface ItemImportService {
      */
     public void setTest(boolean isTest);
+    /**
+     * Set exclude-content flag.
+     *
+     * @param isExcludeContent true or false
+     */
+    public void setExcludeContent(boolean isExcludeContent);
     /**
      * Set resume flag
      *


@@ -14,6 +14,9 @@ import java.io.InputStream;
 import java.util.regex.Pattern;
 import java.util.regex.PatternSyntaxException;
+import org.apache.pdfbox.pdmodel.PDDocument;
+import org.apache.pdfbox.pdmodel.PDPage;
+import org.apache.pdfbox.pdmodel.common.PDRectangle;
 import org.dspace.content.Bitstream;
 import org.dspace.content.Bundle;
 import org.dspace.content.Item;
@@ -132,6 +135,26 @@ public abstract class ImageMagickThumbnailFilter extends MediaFilter {
             op.density(Integer.valueOf(density));
         }
+        // Check the PDF's MediaBox and CropBox to see if they are the same.
+        // If not, then tell ImageMagick to use the CropBox when generating
+        // the thumbnail because the CropBox is generally used to define the
+        // area displayed when a user opens the PDF on a screen, whereas the
+        // MediaBox is used for print. Not all PDFs set these correctly, so
+        // we can use ImageMagick's default behavior unless we see an explicit
+        // CropBox. Note: we don't need to do anything special to detect if
+        // the CropBox is missing or empty because pdfbox will set it to the
+        // same size as the MediaBox if it doesn't exist. Also note that we
+        // only need to check the first page, since that's what we use for
+        // generating the thumbnail (PDDocument uses a zero-based index).
+        PDPage pdfPage = PDDocument.load(f).getPage(0);
+        PDRectangle pdfPageMediaBox = pdfPage.getMediaBox();
+        PDRectangle pdfPageCropBox = pdfPage.getCropBox();
+        // This option must come *before* we open the input file.
+        if (pdfPageCropBox != pdfPageMediaBox) {
+            op.define("pdf:use-cropbox=true");
+        }
         String s = "[" + page + "]";
         op.addImage(f.getAbsolutePath() + s);
         if (configurationService.getBooleanProperty(PRE + ".flatten", true)) {
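The new check only passes `pdf:use-cropbox=true` to ImageMagick when the page's CropBox differs from its MediaBox. The diff compares the two `PDRectangle` objects by reference (`!=`); a standalone sketch of the same decision comparing box dimensions by value is shown below, arguably the more defensive form. The `Box` record here is illustrative only, not PDFBox's `PDRectangle` API:

```java
public class CropBoxSketch {
    /** Minimal stand-in for a PDF page rectangle (illustrative, not PDRectangle). */
    record Box(float width, float height) {}

    /**
     * Mirrors the logic above: only ask ImageMagick to honor the CropBox
     * when it differs from the MediaBox, i.e. when an explicit CropBox
     * narrows (or otherwise changes) the displayed area.
     */
    static boolean shouldUseCropBox(Box mediaBox, Box cropBox) {
        return !mediaBox.equals(cropBox);
    }

    public static void main(String[] args) {
        Box media = new Box(612f, 792f); // US Letter sized MediaBox
        Box crop = new Box(595f, 842f);  // A4 sized CropBox
        System.out.println(shouldUseCropBox(media, media)); // same box: false
        System.out.println(shouldUseCropBox(media, crop));  // differs: true
    }
}
```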


@@ -10,6 +10,7 @@ package org.dspace.app.util;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.Map;
+import java.util.Optional;
 import java.util.regex.Pattern;
 import java.util.regex.PatternSyntaxException;
 import javax.annotation.Nullable;
@@ -131,10 +132,15 @@ public class DCInput {
     private boolean closedVocabulary = false;
     /**
-     * the regex to comply with, null if nothing
+     * the regex in ECMAScript standard format, usable also by rests.
      */
     private String regex = null;
+    /**
+     * the computed pattern, null if nothing
+     */
+    private Pattern pattern = null;
     /**
      * allowed document types
      */
@@ -178,7 +184,7 @@ public class DCInput {
         //check if the input have a language tag
         language = Boolean.valueOf(fieldMap.get("language"));
-        valueLanguageList = new ArrayList();
+        valueLanguageList = new ArrayList<>();
         if (language) {
             String languageNameTmp = fieldMap.get("value-pairs-name");
             if (StringUtils.isBlank(languageNameTmp)) {
@@ -191,7 +197,7 @@ public class DCInput {
         repeatable = "true".equalsIgnoreCase(repStr)
                 || "yes".equalsIgnoreCase(repStr);
         String nameVariantsString = fieldMap.get("name-variants");
-        nameVariants = (StringUtils.isNotBlank(nameVariantsString)) ?
+        nameVariants = StringUtils.isNotBlank(nameVariantsString) ?
                 nameVariantsString.equalsIgnoreCase("true") : false;
         label = fieldMap.get("label");
         inputType = fieldMap.get("input-type");
@@ -203,17 +209,17 @@ public class DCInput {
         }
         hint = fieldMap.get("hint");
         warning = fieldMap.get("required");
-        required = (warning != null && warning.length() > 0);
+        required = warning != null && warning.length() > 0;
         visibility = fieldMap.get("visibility");
         readOnly = fieldMap.get("readonly");
         vocabulary = fieldMap.get("vocabulary");
-        regex = fieldMap.get("regex");
+        this.initRegex(fieldMap.get("regex"));
         String closedVocabularyStr = fieldMap.get("closedVocabulary");
         closedVocabulary = "true".equalsIgnoreCase(closedVocabularyStr)
                 || "yes".equalsIgnoreCase(closedVocabularyStr);
         // parsing of the <type-bind> element (using the colon as split separator)
-        typeBind = new ArrayList<>();
+        typeBind = new ArrayList<String>();
         String typeBindDef = fieldMap.get("type-bind");
         if (typeBindDef != null && typeBindDef.trim().length() > 0) {
             String[] types = typeBindDef.split(",");
@@ -238,6 +244,22 @@ public class DCInput {
     }
+    protected void initRegex(String regex) {
+        this.regex = null;
+        this.pattern = null;
+        if (regex != null) {
+            try {
+                Optional.ofNullable(RegexPatternUtils.computePattern(regex))
+                    .ifPresent(pattern -> {
+                        this.pattern = pattern;
+                        this.regex = regex;
+                    });
+            } catch (PatternSyntaxException e) {
+                log.warn("The regex field of input {} with value {} is invalid!", this.label, regex);
+            }
+        }
+    }
     /**
      * Is this DCInput for display in the given scope? The scope should be
      * either "workflow" or "submit", as per the input forms definition. If the
@@ -248,7 +270,7 @@ public class DCInput {
      * @return whether the input should be displayed or not
      */
     public boolean isVisible(String scope) {
-        return (visibility == null || visibility.equals(scope));
+        return visibility == null || visibility.equals(scope);
     }
@@ -381,7 +403,7 @@ public class DCInput {
     /**
      * Get the style for this form field
      *
      * @return the style
      */
     public String getStyle() {
@@ -512,8 +534,12 @@ public class DCInput {
         return visibility;
     }
+    public Pattern getPattern() {
+        return this.pattern;
+    }
     public String getRegex() {
-        return regex;
+        return this.regex;
     }
     public String getFieldName() {
@@ -546,8 +572,7 @@ public class DCInput {
     public boolean validate(String value) {
         if (StringUtils.isNotBlank(value)) {
             try {
-                if (StringUtils.isNotBlank(regex)) {
-                    Pattern pattern = Pattern.compile(regex);
+                if (this.pattern != null) {
                     if (!pattern.matcher(value).matches()) {
                         return false;
                     }
@@ -557,7 +582,6 @@ public class DCInput {
             }
         }
         return true;
     }
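The `validate()` change above swaps a per-call `Pattern.compile` for a single `Pattern` field compiled once when the input is configured. A minimal standalone sketch of that design, under a hypothetical `PrecompiledValidatorSketch` class (not the DCInput API):

```java
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

public class PrecompiledValidatorSketch {
    private final Pattern pattern; // compiled once, reused for every value

    PrecompiledValidatorSketch(String regex) {
        Pattern p = null;
        if (regex != null && !regex.isBlank()) {
            try {
                p = Pattern.compile(regex);
            } catch (PatternSyntaxException e) {
                // invalid regex: accept everything, as DCInput does after logging a warning
            }
        }
        this.pattern = p;
    }

    boolean validate(String value) {
        // no pattern configured (or invalid) means no constraint
        return pattern == null || pattern.matcher(value).matches();
    }

    public static void main(String[] args) {
        PrecompiledValidatorSketch issn = new PrecompiledValidatorSketch("\\d{4}-\\d{3}[\\dxX]");
        System.out.println(issn.validate("2049-3630"));   // true
        System.out.println(issn.validate("not-an-issn")); // false
    }
}
```

Compiling once also surfaces a bad regex at configuration time instead of on every submitted value.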


@@ -0,0 +1,73 @@
/**
* The contents of this file are subject to the license and copyright
* detailed in the LICENSE and NOTICE files at the root of the source
* tree and available online at
*
* http://www.dspace.org/license/
*/
package org.dspace.app.util;
import static java.util.regex.Pattern.CASE_INSENSITIVE;
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;
import org.apache.commons.lang3.StringUtils;
/**
 * Utility class useful for checking regexes and patterns.
 *
 * @author Vincenzo Mecca (vins01-4science - vincenzo.mecca at 4science.com)
 *
 */
public class RegexPatternUtils {

    // checks input having the format /{pattern}/{flags}
    // allowed flags are: g,i,m,s,u,y
    public static final String REGEX_INPUT_VALIDATOR = "(/?)(.+)\\1([gimsuy]*)";

    // flags usable inside regex definition using format (?i|m|s|u|y)
    public static final String REGEX_FLAGS = "(?%s)";

    public static final Pattern PATTERN_REGEX_INPUT_VALIDATOR =
        Pattern.compile(REGEX_INPUT_VALIDATOR, CASE_INSENSITIVE);

    /**
     * Computes a pattern starting from a regex definition with flags that
     * uses the standard format: <code>/{regex}/{flags}</code> (ECMAScript format).
     * This method can transform an ECMAScript regex into a java {@code Pattern} object
     * which can be used to validate strings.
     * <br/>
     * If regex is null, empty or blank a null {@code Pattern} will be retrieved.
     * If it's a valid regex, then a non-null {@code Pattern} will be retrieved,
     * an exception will be thrown otherwise.
     *
     * @param regex with format <code>/{regex}/{flags}</code>
     * @return {@code Pattern} regex pattern instance
     * @throws PatternSyntaxException
     */
    public static final Pattern computePattern(String regex) throws PatternSyntaxException {
        if (StringUtils.isBlank(regex)) {
            return null;
        }
        Matcher inputMatcher = PATTERN_REGEX_INPUT_VALIDATOR.matcher(regex);
        String regexPattern = regex;
        String regexFlags = "";
        if (inputMatcher.matches()) {
            regexPattern =
                Optional.of(inputMatcher.group(2))
                    .filter(StringUtils::isNotBlank)
                    .orElse(regex);
            regexFlags =
                Optional.ofNullable(inputMatcher.group(3))
                    .filter(StringUtils::isNotBlank)
                    .map(flags -> String.format(REGEX_FLAGS, flags))
                    .orElse("")
                    .replaceAll("g", "");
        }
        return Pattern.compile(regexFlags + regexPattern);
    }

    private RegexPatternUtils() {}

}
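The ECMAScript-to-Java conversion done by `computePattern` can be exercised in isolation. This standalone sketch re-implements its core using only `java.util.regex`: strip the surrounding `/.../` delimiters, inline the remaining flags as a `(?...)` prefix, and drop the `g` (global) flag, which has no Java equivalent:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EcmaRegexSketch {
    // Same shape as REGEX_INPUT_VALIDATOR above: /{pattern}/{flags}
    static final Pattern INPUT =
        Pattern.compile("(/?)(.+)\\1([gimsuy]*)", Pattern.CASE_INSENSITIVE);

    static Pattern compute(String regex) {
        Matcher m = INPUT.matcher(regex);
        String body = regex;
        String flags = "";
        if (m.matches()) {
            body = m.group(2);
            // "g" (global) has no Java equivalent; drop it, inline the rest
            String f = m.group(3).replace("g", "");
            if (!f.isEmpty()) {
                flags = "(?" + f + ")";
            }
        }
        return Pattern.compile(flags + body);
    }

    public static void main(String[] args) {
        // "/dspace/gi" becomes "(?i)dspace": case-insensitive, global flag dropped
        System.out.println(compute("/dspace/gi").matcher("DSpace").matches()); // true
        // A bare pattern without slashes passes through unchanged
        System.out.println(compute("[a-z]+").matcher("abc").matches());       // true
    }
}
```

The same conversion lets a single regex string in `submission-forms.xml` drive both client-side (ECMAScript) and server-side (Java) validation.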


@@ -31,10 +31,12 @@ import org.dspace.content.DSpaceObject;
 import org.dspace.content.Item;
 import org.dspace.content.factory.ContentServiceFactory;
 import org.dspace.content.service.BitstreamService;
+import org.dspace.content.service.CollectionService;
 import org.dspace.content.service.WorkspaceItemService;
 import org.dspace.core.Constants;
 import org.dspace.core.Context;
 import org.dspace.discovery.DiscoverQuery;
+import org.dspace.discovery.DiscoverQuery.SORT_ORDER;
 import org.dspace.discovery.DiscoverResult;
 import org.dspace.discovery.IndexableObject;
 import org.dspace.discovery.SearchService;
@@ -830,7 +832,7 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         query = formatCustomQuery(query);
         DiscoverResult discoverResult = getDiscoverResult(context, query + "search.resourcetype:" +
                                                               IndexableCommunity.TYPE,
-                                                          offset, limit);
+                                                          offset, limit, null, null);
         for (IndexableObject solrCollections : discoverResult.getIndexableObjects()) {
             Community community = ((IndexableCommunity) solrCollections).getIndexedObject();
             communities.add(community);
@@ -852,7 +854,7 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         query = formatCustomQuery(query);
         DiscoverResult discoverResult = getDiscoverResult(context, query + "search.resourcetype:" +
                                                               IndexableCommunity.TYPE,
-                                                          null, null);
+                                                          null, null, null, null);
         return discoverResult.getTotalSearchResults();
     }
@@ -877,7 +879,7 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         query = formatCustomQuery(query);
         DiscoverResult discoverResult = getDiscoverResult(context, query + "search.resourcetype:" +
                                                               IndexableCollection.TYPE,
-                                                          offset, limit);
+                                                          offset, limit, CollectionService.SOLR_SORT_FIELD, SORT_ORDER.asc);
         for (IndexableObject solrCollections : discoverResult.getIndexableObjects()) {
             Collection collection = ((IndexableCollection) solrCollections).getIndexedObject();
             collections.add(collection);
@@ -899,7 +901,7 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         query = formatCustomQuery(query);
         DiscoverResult discoverResult = getDiscoverResult(context, query + "search.resourcetype:" +
                                                               IndexableCollection.TYPE,
-                                                          null, null);
+                                                          null, null, null, null);
        return discoverResult.getTotalSearchResults();
     }
@@ -919,7 +921,7 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         }
         try {
-            DiscoverResult discoverResult = getDiscoverResult(context, query, null, null);
+            DiscoverResult discoverResult = getDiscoverResult(context, query, null, null, null, null);
             if (discoverResult.getTotalSearchResults() > 0) {
                 return true;
             }
@@ -931,7 +933,8 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         return false;
     }
-    private DiscoverResult getDiscoverResult(Context context, String query, Integer offset, Integer limit)
+    private DiscoverResult getDiscoverResult(Context context, String query, Integer offset, Integer limit,
+                                             String sortField, SORT_ORDER sortOrder)
         throws SearchServiceException, SQLException {
         String groupQuery = getGroupToQuery(groupService.allMemberGroups(context, context.getCurrentUser()));
@@ -947,7 +950,9 @@ public class AuthorizeServiceImpl implements AuthorizeService {
         if (limit != null) {
             discoverQuery.setMaxResults(limit);
         }
+        if (sortField != null && sortOrder != null) {
+            discoverQuery.setSortField(sortField, sortOrder);
+        }
         return searchService.search(context, discoverQuery);
     }

View File

@@ -17,6 +17,7 @@ import java.util.UUID;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.logging.log4j.Logger;
+import org.apache.solr.client.solrj.util.ClientUtils;
 import org.dspace.authorize.factory.AuthorizeServiceFactory;
 import org.dspace.authorize.service.AuthorizeService;
 import org.dspace.content.Item;
@@ -206,7 +207,8 @@ public class SolrBrowseDAO implements BrowseDAO {
             query.addFilterQueries("{!field f=" + facetField + "_partial}" + value);
         }
         if (StringUtils.isNotBlank(startsWith) && orderField != null) {
-            query.addFilterQueries("bi_" + orderField + "_sort:" + startsWith + "*");
+            query.addFilterQueries(
+                "bi_" + orderField + "_sort:" + ClientUtils.escapeQueryChars(startsWith) + "*");
         }
         // filter on item to be sure to don't include any other object
         // indexed in the Discovery Search core
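The change above escapes the user-supplied `startsWith` value before embedding it in a Solr filter query, so characters like `+`, `(`, or `:` cannot break or alter the query. A simplified sketch of what solrj's `ClientUtils.escapeQueryChars` does (this re-implementation and the field name `bi_sort_1_sort` are illustrative only; real code should use the solrj helper):

```java
public class SolrEscapeDemo {
    // Backslash-escape every character that has meaning in the Lucene query syntax,
    // roughly mirroring org.apache.solr.client.solrj.util.ClientUtils.escapeQueryChars.
    static String escapeQueryChars(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            if ("\\+-!():^[]\"{}~*?|&;/".indexOf(c) >= 0 || Character.isWhitespace(c)) {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Without escaping, a startsWith value such as "C++" would break the filter query
        String startsWith = "C++";
        String fq = "bi_sort_1_sort:" + escapeQueryChars(startsWith) + "*";
        System.out.println(fq); // bi_sort_1_sort:C\+\+*
    }
}
```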

View File

@@ -43,6 +43,7 @@ import org.dspace.core.I18nUtil;
 import org.dspace.core.LogHelper;
 import org.dspace.core.service.LicenseService;
 import org.dspace.discovery.DiscoverQuery;
+import org.dspace.discovery.DiscoverQuery.SORT_ORDER;
 import org.dspace.discovery.DiscoverResult;
 import org.dspace.discovery.IndexableObject;
 import org.dspace.discovery.SearchService;
@@ -946,6 +947,7 @@ public class CollectionServiceImpl extends DSpaceObjectServiceImpl<Collection> i
         discoverQuery.setDSpaceObjectFilter(IndexableCollection.TYPE);
         discoverQuery.setStart(offset);
         discoverQuery.setMaxResults(limit);
+        discoverQuery.setSortField(SOLR_SORT_FIELD, SORT_ORDER.asc);
         DiscoverResult resp = retrieveCollectionsWithSubmit(context, discoverQuery, null, community, q);
         for (IndexableObject solrCollections : resp.getIndexableObjects()) {
             Collection c = ((IndexableCollection) solrCollections).getIndexedObject();
@@ -1025,6 +1027,7 @@ public class CollectionServiceImpl extends DSpaceObjectServiceImpl<Collection> i
         discoverQuery.setDSpaceObjectFilter(IndexableCollection.TYPE);
         discoverQuery.setStart(offset);
         discoverQuery.setMaxResults(limit);
+        discoverQuery.setSortField(SOLR_SORT_FIELD, SORT_ORDER.asc);
         DiscoverResult resp = retrieveCollectionsWithSubmit(context, discoverQuery,
                                                             entityType, community, q);
         for (IndexableObject solrCollections : resp.getIndexableObjects()) {

View File

@@ -33,6 +33,11 @@ import org.dspace.eperson.Group;
 public interface CollectionService
     extends DSpaceObjectService<Collection>, DSpaceObjectLegacySupportService<Collection> {
+    /*
+     * Field used to sort community and collection lists at solr
+     */
+    public static final String SOLR_SORT_FIELD = "dc.title_sort";
+
     /**
      * Create a new collection with a new ID.
      * Once created the collection is added to the given community
@@ -46,7 +51,6 @@ public interface CollectionService
     public Collection create(Context context, Community community) throws SQLException,
         AuthorizeException;
-
     /**
      * Create a new collection with the supplied handle and with a new ID.
      * Once created the collection is added to the given community

View File

@@ -335,7 +335,7 @@ public class OpenAIRERestConnector {
     /**
      * tokenUsage true to enable the usage of an access token
      *
-     * @param tokenUsage
+     * @param tokenEnabled true/false
      */
     @Autowired(required = false)
     public void setTokenEnabled(boolean tokenEnabled) {

View File

@@ -57,7 +57,7 @@ public class LiveImportDataProvider extends AbstractExternalDataProvider {
     /**
      * This method set the MetadataSource for the ExternalDataProvider
-     * @param metadataSource {@link org.dspace.importer.external.service.components.MetadataSource} implementation used to process the input data
+     * @param querySource Source {@link org.dspace.importer.external.service.components.QuerySource} implementation used to process the input data
      */
     public void setMetadataSource(QuerySource querySource) {
         this.querySource = querySource;

View File

@@ -9,6 +9,7 @@ package org.dspace.handle;
 import java.sql.SQLException;
 import java.util.ArrayList;
+import java.util.Iterator;
 import java.util.List;
 import java.util.regex.Matcher;
 import java.util.regex.Pattern;
@@ -211,17 +212,17 @@ public class HandleServiceImpl implements HandleService {
     @Override
     public void unbindHandle(Context context, DSpaceObject dso)
         throws SQLException {
-        List<Handle> handles = getInternalHandles(context, dso);
-        if (CollectionUtils.isNotEmpty(handles)) {
-            for (Handle handle : handles) {
+        Iterator<Handle> handles = dso.getHandles().iterator();
+        if (handles.hasNext()) {
+            while (handles.hasNext()) {
+                final Handle handle = handles.next();
+                handles.remove();
                 //Only set the "resouce_id" column to null when unbinding a handle.
                 // We want to keep around the "resource_type_id" value, so that we
                 // can verify during a restore whether the same *type* of resource
                 // is reusing this handle!
                 handle.setDSpaceObject(null);
-                //Also remove the handle from the DSO list to keep a consistent model
-                dso.getHandles().remove(handle);
                 handleDAO.save(context, handle);
@@ -256,7 +257,7 @@ public class HandleServiceImpl implements HandleService {
     @Override
     public String findHandle(Context context, DSpaceObject dso)
         throws SQLException {
-        List<Handle> handles = getInternalHandles(context, dso);
+        List<Handle> handles = dso.getHandles();
         if (CollectionUtils.isEmpty(handles)) {
             return null;
         } else {
@@ -328,20 +329,6 @@ public class HandleServiceImpl implements HandleService {
     ////////////////////////////////////////
     // Internal methods
     ////////////////////////////////////////
-    /**
-     * Return the handle for an Object, or null if the Object has no handle.
-     *
-     * @param context DSpace context
-     * @param dso DSpaceObject for which we require our handles
-     * @return The handle for object, or null if the object has no handle.
-     * @throws SQLException If a database error occurs
-     */
-    protected List<Handle> getInternalHandles(Context context, DSpaceObject dso)
-        throws SQLException {
-        return handleDAO.getHandlesByDSpaceObject(context, dso);
-    }
     /**
      * Find the database row corresponding to handle.
      *

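The `unbindHandle` rewrite above switches from a for-each loop to an explicit `Iterator`, so each `Handle` can be detached from the DSO's live list while it is being traversed. A minimal sketch of why that matters (class and method names here are illustrative, using plain strings in place of `Handle` entities):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class IteratorRemoveDemo {
    // Detach every element via the Iterator; mutating the list directly while
    // looping over it (e.g. list.remove(x) inside a for-each) risks a
    // ConcurrentModificationException.
    static List<String> drain(List<String> handles) {
        List<String> removed = new ArrayList<>();
        Iterator<String> it = handles.iterator();
        while (it.hasNext()) {
            String handle = it.next();
            it.remove();          // safe structural removal during iteration
            removed.add(handle);  // per-element cleanup happens after removal
        }
        return removed;
    }

    public static void main(String[] args) {
        List<String> handles = new ArrayList<>(List.of("123456789/1", "123456789/2"));
        System.out.println(drain(handles));    // [123456789/1, 123456789/2]
        System.out.println(handles.isEmpty()); // true
    }
}
```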
View File

@@ -23,8 +23,7 @@ import org.dspace.iiif.util.IIIFSharedUtils;
 /**
- * Queries the configured IIIF server for image dimensions. Used for
- * formats that cannot be easily read using ImageIO (jpeg 2000).
+ * Queries the configured IIIF image server via the Image API.
  *
  * @author Michael Spalti mspalti@willamette.edu
  */

View File

@@ -105,10 +105,10 @@ public class BibtexImportMetadataSourceServiceImpl extends AbstractPlainMetadata
     /**
-     * Retrieve the MetadataFieldMapping containing the mapping between RecordType
+     * Set the MetadataFieldMapping containing the mapping between RecordType
      * (in this case PlainMetadataSourceDto.class) and Metadata
      *
-     * @return The configured MetadataFieldMapping
+     * @param metadataFieldMap The configured MetadataFieldMapping
      */
     @Override
     @SuppressWarnings("unchecked")

View File

@@ -21,7 +21,7 @@ public interface LiveImportClient {
      *
      * @param timeout       The connect timeout in milliseconds
      * @param URL           URL
-     * @param requestParams This map contains the parameters to be included in the request.
+     * @param params        This map contains the parameters to be included in the request.
      *                      Each parameter will be added to the url?(key=value)
      * @return The response in String type converted from InputStream
      */

View File

@@ -156,7 +156,7 @@ public class EpoIdMetadataContributor implements MetadataContributor<Element> {
      * Depending on the retrieved node (using the query), different types of values will be added to the MetadatumDTO
      * list
      *
-     * @param t A class to retrieve metadata from.
+     * @param element A class to retrieve metadata from.
      * @return a collection of import records. Only the identifier of the found records may be put in the record.
      */
     @Override

View File

@@ -118,7 +118,7 @@ public class SimpleJsonPathMetadataContributor implements MetadataContributor<St
      * Retrieve the metadata associated with the given object.
      * The toString() of the resulting object will be used.
      *
-     * @param t A class to retrieve metadata from.
+     * @param fullJson A class to retrieve metadata from.
      * @return a collection of import records. Only the identifier of the found records may be put in the record.
      */
     @Override

View File

@@ -126,10 +126,10 @@ public class RisImportMetadataSourceServiceImpl extends AbstractPlainMetadataSou
     }
     /**
-     * Retrieve the MetadataFieldMapping containing the mapping between RecordType
+     * Set the MetadataFieldMapping containing the mapping between RecordType
      * (in this case PlainMetadataSourceDto.class) and Metadata
      *
-     * @return The configured MetadataFieldMapping
+     * @param metadataFieldMap The configured MetadataFieldMapping
      */
     @Override
     @SuppressWarnings("unchecked")

View File

@@ -42,7 +42,7 @@ public abstract class AbstractPlainMetadataSource
     /**
      * Set the file extensions supported by this metadata service
      *
-     * @param supportedExtensionsthe file extensions (xml,txt,...) supported by this service
+     * @param supportedExtensions the file extensions (xml,txt,...) supported by this service
      */
     public void setSupportedExtensions(List<String> supportedExtensions) {
         this.supportedExtensions = supportedExtensions;
@@ -57,7 +57,7 @@ public abstract class AbstractPlainMetadataSource
      * Return a list of ImportRecord constructed from input file. This list is based on
      * the results retrieved from the file (InputStream) parsed through abstract method readData
      *
-     * @param InputStream The inputStream of the file
+     * @param is The inputStream of the file
      * @return A list of {@link ImportRecord}
      * @throws FileSourceException if, for any reason, the file is not parsable
      */
@@ -76,7 +76,7 @@ public abstract class AbstractPlainMetadataSource
      * the result retrieved from the file (InputStream) parsed through abstract method
      * "readData" implementation
      *
-     * @param InputStream The inputStream of the file
+     * @param is The inputStream of the file
      * @return An {@link ImportRecord} matching the file content
      * @throws FileSourceException if, for any reason, the file is not parsable
      * @throws FileMultipleOccurencesException if the file contains more than one entry

View File

@@ -30,7 +30,7 @@ public interface FileSource extends MetadataSource {
     /**
      * Return a list of ImportRecord constructed from input file.
      *
-     * @param InputStream The inputStream of the file
+     * @param inputStream The inputStream of the file
      * @return A list of {@link ImportRecord}
      * @throws FileSourceException if, for any reason, the file is not parsable
      */
@@ -40,7 +40,7 @@ public interface FileSource extends MetadataSource {
     /**
      * Return an ImportRecord constructed from input file.
      *
-     * @param InputStream The inputStream of the file
+     * @param inputStream The inputStream of the file
      * @return An {@link ImportRecord} matching the file content
      * @throws FileSourceException if, for any reason, the file is not parsable
      * @throws FileMultipleOccurencesException if the file contains more than one entry

View File

@@ -225,15 +225,15 @@ public class Process implements ReloadableEntity<Integer> {
     }
     /**
-     * This method sets the special groups associated with the Process.
+     * This method will return the special groups associated with the Process.
      */
     public List<Group> getGroups() {
         return groups;
     }
     /**
-     * This method will return special groups associated with the Process.
-     * @return The special groups of this process.
+     * This method sets the special groups associated with the Process.
+     * @param groups The special groups of this process.
      */
     public void setGroups(List<Group> groups) {
         this.groups = groups;

View File

@@ -197,7 +197,7 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea
     @Override
     public void postView(DSpaceObject dspaceObject, HttpServletRequest request,
                          EPerson currentUser) {
-        if (solr == null || locationService == null) {
+        if (solr == null) {
             return;
         }
         initSolrYearCores();
@@ -238,7 +238,7 @@ public class SolrLoggerServiceImpl implements SolrLoggerService, InitializingBea
     @Override
     public void postView(DSpaceObject dspaceObject,
                          String ip, String userAgent, String xforwardedfor, EPerson currentUser) {
-        if (solr == null || locationService == null) {
+        if (solr == null) {
             return;
         }
         initSolrYearCores();

View File

@@ -0,0 +1,9 @@
--
-- The contents of this file are subject to the license and copyright
-- detailed in the LICENSE and NOTICE files at the root of the source
-- tree and available online at
--
-- http://www.dspace.org/license/
--
CREATE INDEX resourcepolicy_action_idx ON resourcepolicy(action_id);

View File

@@ -0,0 +1,9 @@
--
-- The contents of this file are subject to the license and copyright
-- detailed in the LICENSE and NOTICE files at the root of the source
-- tree and available online at
--
-- http://www.dspace.org/license/
--
CREATE INDEX resourcepolicy_action_idx ON resourcepolicy(action_id);

View File

@@ -0,0 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd"
default-lazy-init="true">
<bean id="iiifCanvasDimensionServiceFactory" class="org.dspace.iiif.canvasdimension.factory.IIIFCanvasDimensionServiceFactoryImpl"/>
<bean class="org.dspace.iiif.canvasdimension.IIIFCanvasDimensionServiceImpl" scope="prototype"/>
<bean class="org.dspace.iiif.MockIIIFApiQueryServiceImpl" id="org.dspace.iiif.IIIFApiQueryServiceImpl"
autowire-candidate="true"/>
</beans>

View File

@@ -0,0 +1,214 @@
/**
* The contents of this file are subject to the license and copyright
* detailed in the LICENSE and NOTICE files at the root of the source
* tree and available online at
*
* http://www.dspace.org/license/
*/
package org.dspace.app.util;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertNull;
import static org.junit.Assert.assertThrows;
import static org.junit.Assert.assertTrue;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;
import org.dspace.AbstractUnitTest;
import org.junit.Test;
/**
* Tests for RegexPatternUtils
*
* @author Vincenzo Mecca (vins01-4science - vincenzo.mecca at 4science.com)
*
*/
public class RegexPatternUtilsTest extends AbstractUnitTest {
@Test
public void testValidRegexWithFlag() {
final String insensitiveWord = "/[a-z]+/i";
Pattern computePattern = Pattern.compile(insensitiveWord);
assertNotNull(computePattern);
Matcher matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("DSpace");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Community");
assertFalse(matcher.matches());
matcher = computePattern.matcher("/wrongpattern/i");
assertTrue(matcher.matches());
matcher = computePattern.matcher("001");
assertFalse(matcher.matches());
matcher = computePattern.matcher("?/'`}{][<>.,");
assertFalse(matcher.matches());
computePattern = RegexPatternUtils.computePattern(insensitiveWord);
assertNotNull(computePattern);
matcher = computePattern.matcher("Hello");
assertTrue(matcher.matches());
matcher = computePattern.matcher("DSpace");
assertTrue(matcher.matches());
matcher = computePattern.matcher("Community");
assertTrue(matcher.matches());
matcher = computePattern.matcher("/wrong-pattern/i");
assertFalse(matcher.matches());
matcher = computePattern.matcher("001");
assertFalse(matcher.matches());
matcher = computePattern.matcher("?/'`}{][<>.,");
assertFalse(matcher.matches());
}
@Test
public void testRegexWithoutFlag() {
final String sensitiveWord = "[a-z]+";
Pattern computePattern = RegexPatternUtils.computePattern(sensitiveWord);
assertNotNull(computePattern);
Matcher matcher = computePattern.matcher("hello");
assertTrue(matcher.matches());
matcher = computePattern.matcher("dspace");
assertTrue(matcher.matches());
matcher = computePattern.matcher("community");
assertTrue(matcher.matches());
matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("DSpace");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Community");
assertFalse(matcher.matches());
matcher = computePattern.matcher("/wrongpattern/i");
assertFalse(matcher.matches());
matcher = computePattern.matcher("001");
assertFalse(matcher.matches());
matcher = computePattern.matcher("?/'`}{][<>.,");
assertFalse(matcher.matches());
final String sensitiveWordWithDelimiter = "/[a-z]+/";
computePattern = RegexPatternUtils.computePattern(sensitiveWordWithDelimiter);
assertNotNull(computePattern);
matcher = computePattern.matcher("hello");
assertTrue(matcher.matches());
matcher = computePattern.matcher("dspace");
assertTrue(matcher.matches());
matcher = computePattern.matcher("community");
assertTrue(matcher.matches());
matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("DSpace");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Community");
assertFalse(matcher.matches());
matcher = computePattern.matcher("/wrongpattern/i");
assertFalse(matcher.matches());
matcher = computePattern.matcher("001");
assertFalse(matcher.matches());
matcher = computePattern.matcher("?/'`}{][<>.,");
assertFalse(matcher.matches());
}
@Test
public void testWithFuzzyRegex() {
String fuzzyRegex = "/[a-z]+";
Pattern computePattern = RegexPatternUtils.computePattern(fuzzyRegex);
assertNotNull(computePattern);
Matcher matcher = computePattern.matcher("/hello");
assertTrue(matcher.matches());
matcher = computePattern.matcher("hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
fuzzyRegex = "[a-z]+/";
computePattern = RegexPatternUtils.computePattern(fuzzyRegex);
matcher = computePattern.matcher("hello/");
assertTrue(matcher.matches());
matcher = computePattern.matcher("/hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
// equals to pattern \\[a-z]+\\ -> searching for a word delimited by '\'
fuzzyRegex = "\\\\[a-z]+\\\\";
computePattern = RegexPatternUtils.computePattern(fuzzyRegex);
// equals to '\hello\'
matcher = computePattern.matcher("\\hello\\");
assertTrue(matcher.matches());
matcher = computePattern.matcher("/hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
// equals to pattern /[a-z]+/ -> searching for a string delimited by '/'
fuzzyRegex = "\\/[a-z]+\\/";
computePattern = RegexPatternUtils.computePattern(fuzzyRegex);
matcher = computePattern.matcher("/hello/");
assertTrue(matcher.matches());
matcher = computePattern.matcher("/hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("hello");
assertFalse(matcher.matches());
matcher = computePattern.matcher("Hello");
assertFalse(matcher.matches());
}
@Test
public void testInvalidRegex() {
String invalidSensitive = "[a-z+";
assertThrows(PatternSyntaxException.class, () -> RegexPatternUtils.computePattern(invalidSensitive));
String invalidRange = "a{1-";
assertThrows(PatternSyntaxException.class, () -> RegexPatternUtils.computePattern(invalidRange));
String invalidGroupPattern = "(abc";
assertThrows(PatternSyntaxException.class, () -> RegexPatternUtils.computePattern(invalidGroupPattern));
String emptyPattern = "";
Pattern computePattern = RegexPatternUtils.computePattern(emptyPattern);
assertNull(computePattern);
String blankPattern = " ";
computePattern = RegexPatternUtils.computePattern(blankPattern);
assertNull(computePattern);
String nullPattern = null;
computePattern = RegexPatternUtils.computePattern(nullPattern);
assertNull(computePattern);
}
@Test
public void testMultiFlagRegex() {
String multilineSensitive = "/[a-z]+/gi";
Pattern computePattern = RegexPatternUtils.computePattern(multilineSensitive);
assertNotNull(computePattern);
Matcher matcher = computePattern.matcher("hello");
assertTrue(matcher.matches());
matcher = computePattern.matcher("Hello");
assertTrue(matcher.matches());
multilineSensitive = "/[a-z]+/gim";
computePattern = RegexPatternUtils.computePattern(multilineSensitive);
assertNotNull(computePattern);
matcher = computePattern.matcher("Hello" + System.lineSeparator() + "Everyone");
assertTrue(matcher.find());
assertEquals("Hello", matcher.group());
assertTrue(matcher.find());
assertEquals("Everyone", matcher.group());
matcher = computePattern.matcher("hello");
assertTrue(matcher.matches());
matcher = computePattern.matcher("HELLO");
assertTrue(matcher.matches());
}
}

View File

@@ -397,12 +397,17 @@ public class ItemBuilder extends AbstractDSpaceObjectBuilder<Item> {
         try (Context c = new Context()) {
             c.setDispatcher("noindex");
             c.turnOffAuthorisationSystem();
+            // If the workspaceItem used to create this item still exists, delete it
+            workspaceItem = c.reloadEntity(workspaceItem);
+            if (workspaceItem != null) {
+                workspaceItemService.deleteAll(c, workspaceItem);
+            }
             // Ensure object and any related objects are reloaded before checking to see what needs cleanup
             item = c.reloadEntity(item);
             if (item != null) {
                 delete(c, item);
+                c.complete();
             }
-            c.complete();
         }
     }

View File

@@ -17,6 +17,7 @@ import org.dspace.builder.CommunityBuilder;
 import org.dspace.builder.ItemBuilder;
 import org.dspace.content.Collection;
 import org.dspace.content.Item;
+import org.dspace.core.factory.CoreServiceFactory;
 import org.dspace.curate.Curator;
 import org.dspace.identifier.VersionedHandleIdentifierProviderWithCanonicalHandles;
 import org.dspace.services.ConfigurationService;
@@ -38,10 +39,11 @@ public class CreateMissingIdentifiersIT
     @Test
     public void testPerform()
         throws IOException {
-        ConfigurationService configurationService
-            = DSpaceServicesFactory.getInstance().getConfigurationService();
-        configurationService.setProperty(P_TASK_DEF, null);
-        configurationService.addPropertyValue(P_TASK_DEF,
+        // Must remove any cached named plugins before creating a new one
+        CoreServiceFactory.getInstance().getPluginService().clearNamedPluginClasses();
+        ConfigurationService configurationService = kernelImpl.getConfigurationService();
+        // Define a new task dynamically
+        configurationService.setProperty(P_TASK_DEF,
             CreateMissingIdentifiers.class.getCanonicalName() + " = " + TASK_NAME);
         Curator curator = new Curator();
@@ -49,11 +51,11 @@ public class CreateMissingIdentifiersIT
         context.setCurrentUser(admin);
         parentCommunity = CommunityBuilder.createCommunity(context)
                                           .build();
         Collection collection = CollectionBuilder.createCollection(context, parentCommunity)
                                                  .build();
         Item item = ItemBuilder.createItem(context, collection)
                                .build();
         /*
          * Curate with regular test configuration -- should succeed.

View File

@@ -0,0 +1,20 @@
+/**
+ * The contents of this file are subject to the license and copyright
+ * detailed in the LICENSE and NOTICE files at the root of the source
+ * tree and available online at
+ *
+ * http://www.dspace.org/license/
+ */
+package org.dspace.iiif;
+
+import org.dspace.content.Bitstream;
+
+/**
+ * Mock for the IIIFApiQueryService.
+ * @author Michael Spalti (mspalti at willamette.edu)
+ */
+public class MockIIIFApiQueryServiceImpl extends IIIFApiQueryServiceImpl {
+    public int[] getImageDimensions(Bitstream bitstream) {
+        return new int[]{64, 64};
+    }
+}

View File

@@ -30,7 +30,6 @@ import java.util.Date;
 import java.util.List;
 
 import org.dspace.AbstractIntegrationTestWithDatabase;
-import org.dspace.authorize.AuthorizeException;
 import org.dspace.builder.CollectionBuilder;
 import org.dspace.builder.CommunityBuilder;
 import org.dspace.builder.EntityTypeBuilder;
@@ -70,7 +69,9 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
     private Collection profileCollection;
 
     @Before
-    public void setup() {
+    @Override
+    public void setUp() throws Exception {
+        super.setUp();
         context.turnOffAuthorisationSystem();
@@ -84,12 +85,15 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
     }
 
     @After
-    public void after() throws SQLException, AuthorizeException {
+    @Override
+    public void destroy() throws Exception {
         List<OrcidQueue> records = orcidQueueService.findAll(context);
         for (OrcidQueue record : records) {
             orcidQueueService.delete(context, record);
         }
         context.setDispatcher(null);
+        super.destroy();
     }
 
     @Test
@@ -139,6 +143,8 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
     @Test
     public void testOrcidQueueRecordCreationForProfile() throws Exception {
+        // Set a fake handle prefix for this test which we will use to assign handles below
+        configurationService.setProperty("handle.prefix", "fake-handle");
         context.turnOffAuthorisationSystem();
 
         Item profile = ItemBuilder.createItem(context, profileCollection)
@@ -146,7 +152,7 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
             .withOrcidIdentifier("0000-1111-2222-3333")
             .withOrcidAccessToken("ab4d18a0-8d9a-40f1-b601-a417255c8d20", eperson)
             .withSubject("test")
-            .withHandle("123456789/200")
+            .withHandle("fake-handle/190")
             .withOrcidSynchronizationProfilePreference(BIOGRAPHICAL)
            .withOrcidSynchronizationProfilePreference(IDENTIFIERS)
             .build();
@@ -159,8 +165,8 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
         assertThat(queueRecords, hasItem(matches(profile, profile, "KEYWORDS", null,
             "dc.subject::test", "test", INSERT)));
         assertThat(queueRecords, hasItem(matches(profile, "RESEARCHER_URLS", null,
-            "dc.identifier.uri::http://localhost:4000/handle/123456789/200",
-            "http://localhost:4000/handle/123456789/200", INSERT)));
+            "dc.identifier.uri::http://localhost:4000/handle/fake-handle/190",
+            "http://localhost:4000/handle/fake-handle/190", INSERT)));
 
         addMetadata(profile, "person", "name", "variant", "User Test", null);
         context.commit();
@@ -170,8 +176,8 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
         assertThat(queueRecords, hasItem(
             matches(profile, profile, "KEYWORDS", null, "dc.subject::test", "test", INSERT)));
         assertThat(queueRecords, hasItem(matches(profile, "RESEARCHER_URLS", null,
-            "dc.identifier.uri::http://localhost:4000/handle/123456789/200",
-            "http://localhost:4000/handle/123456789/200", INSERT)));
+            "dc.identifier.uri::http://localhost:4000/handle/fake-handle/190",
+            "http://localhost:4000/handle/fake-handle/190", INSERT)));
         assertThat(queueRecords, hasItem(matches(profile, profile, "OTHER_NAMES",
             null, "person.name.variant::User Test", "User Test", INSERT)));
     }
@@ -640,7 +646,8 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
     @Test
     public void testOrcidQueueRecalculationOnProfilePreferenceUpdate() throws Exception {
+        // Set a fake handle prefix for this test which we will use to assign handles below
+        configurationService.setProperty("handle.prefix", "fake-handle");
         context.turnOffAuthorisationSystem();
 
         Item profile = ItemBuilder.createItem(context, profileCollection)
@@ -648,7 +655,7 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
             .withOrcidIdentifier("0000-0000-0012-2345")
             .withOrcidAccessToken("ab4d18a0-8d9a-40f1-b601-a417255c8d20", eperson)
             .withSubject("Math")
-            .withHandle("123456789/200")
+            .withHandle("fake-handle/200")
             .withOrcidSynchronizationProfilePreference(BIOGRAPHICAL)
             .build();
@@ -669,8 +676,8 @@ public class OrcidQueueConsumerIT extends AbstractIntegrationTestWithDatabase {
         assertThat(records, hasItem(matches(profile, "KEYWORDS", null, "dc.subject::Math", "Math", INSERT)));
         assertThat(records, hasItem(matches(profile, "EXTERNAL_IDS", null, "person.identifier.rid::ID", "ID", INSERT)));
         assertThat(records, hasItem(matches(profile, "RESEARCHER_URLS", null,
-            "dc.identifier.uri::http://localhost:4000/handle/123456789/200",
-            "http://localhost:4000/handle/123456789/200", INSERT)));
+            "dc.identifier.uri::http://localhost:4000/handle/fake-handle/200",
+            "http://localhost:4000/handle/fake-handle/200", INSERT)));
 
         removeMetadata(profile, "dspace", "orcid", "sync-profile");

View File

@@ -78,29 +78,45 @@ public class CanvasService extends AbstractResourceService {
     }
 
     /**
-     * Checks for bitstream iiif.image.width metadata in the first
-     * bitstream in first IIIF bundle. If bitstream metadata is not
-     * found, use the IIIF image service to update the default canvas
-     * dimensions for this request. Called once for each manifest.
+     * Checks for "iiif.image.width" metadata in IIIF bundles. When bitstream
+     * metadata is not found for the first image in the bundle this method updates the
+     * default canvas dimensions for the request based on the actual image dimensions,
+     * using the IIIF image service. Called once for each manifest.
      * @param bundles IIIF bundles for this item
      */
-    protected void guessCanvasDimensions(List<Bundle> bundles) {
-        Bitstream firstBistream = bundles.get(0).getBitstreams().get(0);
-        if (!utils.hasWidthMetadata(firstBistream)) {
-            int[] imageDims = utils.getImageDimensions(firstBistream);
-            if (imageDims != null && imageDims.length == 2) {
-                // update the fallback dimensions
-                defaultCanvasWidthFallback = imageDims[0];
-                defaultCanvasHeightFallback = imageDims[1];
+    protected void guessCanvasDimensions(Context context, List<Bundle> bundles) {
+        // prevent redundant updates.
+        boolean dimensionUpdated = false;
+
+        for (Bundle bundle : bundles) {
+            if (!dimensionUpdated) {
+                for (Bitstream bitstream : bundle.getBitstreams()) {
+                    if (utils.isIIIFBitstream(context, bitstream)) {
+                        // check for width dimension
+                        if (!utils.hasWidthMetadata(bitstream)) {
+                            // get the dimensions of the image.
+                            int[] imageDims = utils.getImageDimensions(bitstream);
+                            if (imageDims != null && imageDims.length == 2) {
+                                // update the fallback dimensions
+                                defaultCanvasWidthFallback = imageDims[0];
+                                defaultCanvasHeightFallback = imageDims[1];
+                            }
+                            setDefaultCanvasDimensions();
+                            // stop processing the bundles
+                            dimensionUpdated = true;
+                        }
+                        // check only the first image
+                        break;
+                    }
+                }
             }
-            setDefaultCanvasDimensions();
         }
     }
 
     /**
-     * Used to set the height and width dimensions for all images when iiif.image.default-width and
-     * iiif.image.default-height are set to -1 in DSpace configuration.
-     * The values are updated only if the bitstream does not have its own iiif.image.width metadata.
+     * Sets the height and width dimensions for all images when "iiif.image.default-width"
+     * and "iiif.image.default-height" are set to -1 in DSpace configuration. The values
+     * are updated only when the bitstream does not have its own image dimension metadata.
      * @param bitstream
      */
     private void setCanvasDimensions(Bitstream bitstream) {
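
A note on the control flow in the rewritten `guessCanvasDimensions` above: each bundle contributes at most its first IIIF-eligible bitstream, and the first such bitstream that lacks width metadata supplies the fallback dimensions and ends the scan. Below is a minimal, dependency-free sketch of that scan; the `Bits` record is a hypothetical stand-in for a DSpace `Bitstream`, not a real DSpace type.

```java
import java.util.List;
import java.util.Optional;

public class CanvasScanSketch {

    // Hypothetical stand-in for a DSpace Bitstream
    record Bits(boolean iiif, boolean hasWidth, int[] dims) {}

    static Optional<int[]> findFallbackDims(List<List<Bits>> bundles) {
        for (List<Bits> bundle : bundles) {
            for (Bits b : bundle) {
                if (b.iiif()) {
                    if (!b.hasWidth()) {
                        // this image supplies the fallback dimensions; stop the scan
                        return Optional.of(b.dims());
                    }
                    // check only the first IIIF image of each bundle
                    break;
                }
            }
        }
        // width metadata found everywhere: nothing to update
        return Optional.empty();
    }

    public static void main(String[] args) {
        // The first bundle's IIIF image already carries width metadata,
        // so the second bundle supplies the fallback dimensions.
        List<List<Bits>> bundles = List.of(
            List.of(new Bits(false, false, null), new Bits(true, true, new int[]{2200, 3000})),
            List.of(new Bits(true, false, new int[]{64, 64})));
        findFallbackDims(bundles).ifPresent(d -> System.out.println(d[0] + "x" + d[1]));
    }
}
```

Running `main` prints `64x64`, matching the mocked dimensions the tests below assert on.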

View File

@@ -156,7 +156,7 @@ public class ManifestService extends AbstractResourceService {
         List<Bundle> bundles = utils.getIIIFBundles(item);
         // Set the default canvas dimensions.
         if (guessCanvasDimension) {
-            canvasService.guessCanvasDimensions(bundles);
+            canvasService.guessCanvasDimensions(context, bundles);
         }
         for (Bundle bnd : bundles) {
             String bundleToCPrefix = null;

View File

@@ -137,7 +137,7 @@ public class IIIFUtils {
      * @param b the DSpace bitstream to check
      * @return true if the bitstream can be used as IIIF resource
      */
-    private boolean isIIIFBitstream(Context context, Bitstream b) {
+    public boolean isIIIFBitstream(Context context, Bitstream b) {
         return checkImageMimeType(getBitstreamMimeType(b, context)) && b.getMetadata().stream()
             .filter(m -> m.getMetadataField().toString('.').contentEquals(METADATA_IIIF_ENABLED))
             .noneMatch(m -> m.getValue().equalsIgnoreCase("false") || m.getValue().equalsIgnoreCase("no"));
@@ -228,7 +228,7 @@ public class IIIFUtils {
      * @param mimetype
      * @return true if an image
      */
-    public boolean checkImageMimeType(String mimetype) {
+    private boolean checkImageMimeType(String mimetype) {
         if (mimetype != null && mimetype.contains("image/")) {
             return true;
         }

View File

@@ -45,6 +45,8 @@ import org.dspace.discovery.SearchUtils;
 import org.dspace.discovery.configuration.DiscoveryConfiguration;
 import org.dspace.discovery.configuration.DiscoveryConfigurationService;
 import org.dspace.discovery.configuration.DiscoverySearchFilter;
+import org.dspace.discovery.configuration.DiscoverySortConfiguration;
+import org.dspace.discovery.configuration.DiscoverySortFieldConfiguration;
 import org.dspace.discovery.indexobject.IndexableItem;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.stereotype.Controller;
@@ -141,16 +143,36 @@ public class OpenSearchController {
         queryArgs.setStart(start);
         queryArgs.setMaxResults(count);
         queryArgs.setDSpaceObjectFilter(IndexableItem.TYPE);
         if (sort != null) {
-            //this is the default sort so we want to switch this to date accessioned
-            if (sortDirection != null && sortDirection.equals("DESC")) {
-                queryArgs.setSortField(sort + "_sort", SORT_ORDER.desc);
-            } else {
-                queryArgs.setSortField(sort + "_sort", SORT_ORDER.asc);
+            DiscoveryConfiguration discoveryConfiguration =
+                searchConfigurationService.getDiscoveryConfiguration("");
+            if (discoveryConfiguration != null) {
+                DiscoverySortConfiguration searchSortConfiguration = discoveryConfiguration
+                    .getSearchSortConfiguration();
+                if (searchSortConfiguration != null) {
+                    DiscoverySortFieldConfiguration sortFieldConfiguration = searchSortConfiguration
+                        .getSortFieldConfiguration(sort);
+                    if (sortFieldConfiguration != null) {
+                        String sortField = searchService
+                            .toSortFieldIndex(sortFieldConfiguration.getMetadataField(),
+                                sortFieldConfiguration.getType());
+                        if (sortDirection != null && sortDirection.equals("DESC")) {
+                            queryArgs.setSortField(sortField, SORT_ORDER.desc);
+                        } else {
+                            queryArgs.setSortField(sortField, SORT_ORDER.asc);
+                        }
+                    } else {
+                        throw new IllegalArgumentException(sort + " is not a valid sort field");
+                    }
+                }
             }
         } else {
+            // this is the default sort so we want to switch this to date accessioned
             queryArgs.setSortField("dc.date.accessioned_dt", SORT_ORDER.desc);
         }
         if (dsoObject != null) {
             container = scopeResolver.resolveScope(context, dsoObject);
             DiscoveryConfiguration discoveryConfiguration = searchConfigurationService
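
The new sort handling above resolves the requested sort name through the Discovery sort configuration and rejects unknown names with an `IllegalArgumentException` (surfaced as HTTP 400 in the tests further down). Below is a minimal, dependency-free sketch of that look-up-or-reject pattern; the map contents and the `_dt` index suffix here are illustrative assumptions, not DSpace's actual sort configuration.

```java
import java.util.Map;

public class SortFieldResolver {

    // Hypothetical mapping from exposed sort names to search index fields
    private static final Map<String, String> SORT_FIELDS = Map.of(
        "dc.date.issued", "dc.date.issued_dt",
        "dc.date.accessioned", "dc.date.accessioned_dt");

    static String resolve(String sort) {
        String field = SORT_FIELDS.get(sort);
        if (field == null) {
            // unknown sort names are rejected rather than silently ignored
            throw new IllegalArgumentException(sort + " is not a valid sort field");
        }
        return field;
    }

    public static void main(String[] args) {
        System.out.println(resolve("dc.date.issued"));
        try {
            resolve("dc.invalid.field");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Failing fast here is what lets the controller return a 400 for a bad `sort` parameter instead of querying Solr with a nonexistent field.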

View File

@@ -111,7 +111,7 @@ public class VersionRestRepository extends DSpaceRestRepository<VersionRest, Int
         }
         EPerson submitter = item.getSubmitter();
-        boolean isAdmin = authorizeService.isAdmin(context);
+        boolean isAdmin = authorizeService.isAdmin(context, item);
         boolean canCreateVersion = configurationService.getBooleanProperty("versioning.submitterCanCreateNewVersion");
         if (!isAdmin && !(canCreateVersion && Objects.equals(submitter, context.getCurrentUser()))) {

View File

@@ -53,7 +53,7 @@ public interface RestAuthenticationService {
      * Checks the current request for a valid authentication token. If found, extracts that token and obtains the
      * currently logged in EPerson.
      * @param request current request
-     * @param request current response
+     * @param response current response
      * @param context current DSpace Context
      * @return EPerson of the logged in user (if auth token found), or null if no auth token is found
      */

View File

@@ -0,0 +1,9 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+       xmlns:util="http://www.springframework.org/schema/util"
+       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd"
+       default-lazy-init="true">
+
+    <bean class="org.dspace.iiif.MockIIIFApiQueryServiceImpl" id="org.dspace.iiif.IIIFApiQueryServiceImpl" autowire-candidate="true"/>
+</beans>

View File

@@ -184,6 +184,32 @@ public class OpenSearchControllerIT extends AbstractControllerIntegrationTest {
                    ;
     }
 
+    @Test
+    public void validSortTest() throws Exception {
+        //When we call the root endpoint
+        getClient().perform(get("/opensearch/search")
+                                .param("query", "")
+                                .param("sort", "dc.date.issued"))
+                   //The status has to be 200 OK
+                   .andExpect(status().isOk())
+                   //We expect the content type to be "application/atom+xml;charset=UTF-8"
+                   .andExpect(content().contentType("application/atom+xml;charset=UTF-8"))
+                   .andExpect(xpath("feed/totalResults").string("0"))
+        ;
+    }
+
+    @Test
+    public void invalidSortTest() throws Exception {
+        //When we call the root endpoint
+        getClient().perform(get("/opensearch/search")
+                                .param("query", "")
+                                .param("sort", "dc.invalid.field"))
+                   //We get an exception for such a sort field
+                   //The status has to be 400 ERROR
+                   .andExpect(status().is(400))
+        ;
+    }
+
     @Test
     public void serviceDocumentTest() throws Exception {
         //When we call the root endpoint

View File

@@ -1111,7 +1111,7 @@ public class BrowsesResourceControllerIT extends AbstractControllerIntegrationTe
                    //We expect the totalElements to be the 1 item present in the collection
                    .andExpect(jsonPath("$.page.totalElements", is(1)))
-                   //As this is is a small collection, we expect to go-to page 0
+                   //As this is a small collection, we expect to go-to page 0
                    .andExpect(jsonPath("$.page.number", is(0)))
                    .andExpect(jsonPath("$._links.self.href", containsString("startsWith=Blade")))
@@ -1121,6 +1121,33 @@ public class BrowsesResourceControllerIT extends AbstractControllerIntegrationTe
                                                            "Blade Runner",
                                                            "1982-06-25")
                    )));
+
+        //Test filtering with spaces:
+        //** WHEN **
+        //An anonymous user browses the items in the Browse by Title endpoint
+        //with startsWith set to Blade Runner and scope set to Col 1
+        getClient().perform(get("/api/discover/browses/title/items?startsWith=Blade Runner")
+                                .param("scope", col1.getID().toString())
+                                .param("size", "2"))
+                   //** THEN **
+                   //The status has to be 200 OK
+                   .andExpect(status().isOk())
+                   //We expect the content type to be "application/hal+json;charset=UTF-8"
+                   .andExpect(content().contentType(contentType))
+                   //We expect the totalElements to be the 1 item present in the collection
+                   .andExpect(jsonPath("$.page.totalElements", is(1)))
+                   //As this is a small collection, we expect to go-to page 0
+                   .andExpect(jsonPath("$.page.number", is(0)))
+                   .andExpect(jsonPath("$._links.self.href", containsString("startsWith=Blade Runner")))
+                   //Verify that the index jumps to the "Blade Runner" item.
+                   .andExpect(jsonPath("$._embedded.items",
+                       contains(ItemMatcher.matchItemWithTitleAndDateIssued(item2,
+                           "Blade Runner",
+                           "1982-06-25")
+                   )));
     }
 
     @Test

View File

@@ -153,23 +153,19 @@ public class ExternalSourcesRestControllerIT extends AbstractControllerIntegrati
         getClient().perform(get("/api/integration/externalsources/search/findByEntityType")
                    .param("entityType", "Publication"))
                    .andExpect(status().isOk())
-                   // Expect *at least* 3 Publication sources
-                   .andExpect(jsonPath("$._embedded.externalsources", Matchers.hasItems(
-                       ExternalSourceMatcher.matchExternalSource(publicationProviders.get(0)),
-                       ExternalSourceMatcher.matchExternalSource(publicationProviders.get(1))
-                   )))
+                   // Expect that Publication sources match (check a max of 20 as that is default page size)
+                   .andExpect(jsonPath("$._embedded.externalsources",
+                       ExternalSourceMatcher.matchAllExternalSources(publicationProviders, 20)
+                   ))
                    .andExpect(jsonPath("$.page.totalElements", Matchers.is(publicationProviders.size())));
 
         getClient().perform(get("/api/integration/externalsources/search/findByEntityType")
                    .param("entityType", "Journal"))
                    .andExpect(status().isOk())
-                   // Expect *at least* 5 Journal sources
-                   .andExpect(jsonPath("$._embedded.externalsources", Matchers.hasItems(
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(0)),
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(1)),
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(2)),
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(3))
-                   )))
+                   // Check that Journal sources match (check a max of 20 as that is default page size)
+                   .andExpect(jsonPath("$._embedded.externalsources",
+                       ExternalSourceMatcher.matchAllExternalSources(journalProviders, 20)
+                   ))
                    .andExpect(jsonPath("$.page.totalElements", Matchers.is(journalProviders.size())));
     }
@@ -193,10 +189,9 @@ public class ExternalSourcesRestControllerIT extends AbstractControllerIntegrati
                    .param("entityType", "Journal")
                    .param("size", String.valueOf(pageSize)))
                    .andExpect(status().isOk())
-                   .andExpect(jsonPath("$._embedded.externalsources", Matchers.contains(
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(0)),
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(1))
-                   )))
+                   .andExpect(jsonPath("$._embedded.externalsources",
+                       ExternalSourceMatcher.matchAllExternalSources(journalProviders, pageSize)
+                   ))
                    .andExpect(jsonPath("$.page.totalPages", Matchers.is(numberOfPages)))
                    .andExpect(jsonPath("$.page.totalElements", Matchers.is(numJournalProviders)));
@@ -205,10 +200,12 @@ public class ExternalSourcesRestControllerIT extends AbstractControllerIntegrati
                    .param("page", "1")
                    .param("size", String.valueOf(pageSize)))
                    .andExpect(status().isOk())
-                   .andExpect(jsonPath("$._embedded.externalsources", Matchers.contains(
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(2)),
-                       ExternalSourceMatcher.matchExternalSource(journalProviders.get(3))
-                   )))
+                   // Check that second page has journal sources starting at index 2.
+                   .andExpect(jsonPath("$._embedded.externalsources",
+                       ExternalSourceMatcher.matchAllExternalSources(
+                           journalProviders.subList(2, journalProviders.size()),
+                           pageSize)
+                   ))
                    .andExpect(jsonPath("$.page.totalPages", Matchers.is(numberOfPages)))
                    .andExpect(jsonPath("$.page.totalElements", Matchers.is(numJournalProviders)));
     }

View File

@@ -850,6 +850,55 @@ public class VersionRestRepositoryIT extends AbstractControllerIntegrationTest {
                             .andExpect(status().isUnauthorized());
     }
 
+    @Test
+    public void createNewVersionItemByCollectionAdminTest() throws Exception {
+        context.turnOffAuthorisationSystem();
+        Community rootCommunity = CommunityBuilder.createCommunity(context)
+                                                  .withName("Parent Community")
+                                                  .build();
+
+        EPerson colAdmin = EPersonBuilder.createEPerson(context)
+                                         .withCanLogin(true)
+                                         .withEmail("coladmin@email.com")
+                                         .withPassword(password)
+                                         .withNameInMetadata("Collection", "Admin")
+                                         .build();
+
+        Collection col = CollectionBuilder
+            .createCollection(context, rootCommunity)
+            .withName("Collection 1")
+            .withAdminGroup(colAdmin)
+            .build();
+
+        Item item = ItemBuilder.createItem(context, col)
+                               .withTitle("Public test item")
+                               .withIssueDate("2022-12-19")
+                               .withAuthor("Doe, John")
+                               .withSubject("ExtraEntry")
+                               .build();
+
+        item.setSubmitter(eperson);
+        context.restoreAuthSystemState();
+
+        AtomicReference<Integer> idRef = new AtomicReference<Integer>();
+        String token = getAuthToken(colAdmin.getEmail(), password);
+        try {
+            getClient(token).perform(post("/api/versioning/versions")
+                            .param("summary", "test summary!")
+                            .contentType(MediaType.parseMediaType(RestMediaTypes.TEXT_URI_LIST_VALUE))
+                            .content("/api/core/items/" + item.getID()))
+                            .andExpect(status().isCreated())
+                            .andExpect(jsonPath("$", Matchers.allOf(
+                                hasJsonPath("$.version", is(2)),
+                                hasJsonPath("$.summary", is("test summary!")),
+                                hasJsonPath("$.type", is("version"))
+                            )))
+                            .andDo(result -> idRef.set(read(result.getResponse().getContentAsString(), "$.id")));
+        } finally {
+            VersionBuilder.delete(idRef.get());
+        }
+    }
+
     @Test
     public void patchReplaceSummaryTest() throws Exception {
         context.turnOffAuthorisationSystem();

View File

@@ -22,12 +22,14 @@ import org.apache.commons.codec.CharEncoding;
 import org.apache.commons.io.IOUtils;
 import org.dspace.app.rest.test.AbstractControllerIntegrationTest;
 import org.dspace.builder.BitstreamBuilder;
+import org.dspace.builder.BundleBuilder;
 import org.dspace.builder.CollectionBuilder;
 import org.dspace.builder.CommunityBuilder;
 import org.dspace.builder.EPersonBuilder;
 import org.dspace.builder.GroupBuilder;
 import org.dspace.builder.ItemBuilder;
 import org.dspace.content.Bitstream;
+import org.dspace.content.Bundle;
 import org.dspace.content.Collection;
 import org.dspace.content.Item;
 import org.dspace.content.service.ItemService;
@@ -85,7 +87,7 @@ public class IIIFControllerIT extends AbstractControllerIntegrationTest {
     }
 
     @Test
-    public void findOneIIIFSearchableEntityTypeWithGlobalConfigIT() throws Exception {
+    public void findOneIIIFSearchableItemWithDefaultDimensionsIT() throws Exception {
         context.turnOffAuthorisationSystem();
         parentCommunity = CommunityBuilder.createCommunity(context)
                                           .withName("Parent Community")
@@ -138,7 +140,8 @@ public class IIIFControllerIT extends AbstractControllerIntegrationTest {
                 .andExpect(jsonPath("$.sequences[0].canvases[0].@id",
                         Matchers.containsString("/iiif/" + publicItem1.getID() + "/canvas/c0")))
                 .andExpect(jsonPath("$.sequences[0].canvases[0].label", is("Page 1")))
-                .andExpect(jsonPath("$.sequences[0].canvases[0].width", is(2200)))
+                .andExpect(jsonPath("$.sequences[0].canvases[0].width", is(64)))
+                .andExpect(jsonPath("$.sequences[0].canvases[0].height", is(64)))
                 .andExpect(jsonPath("$.sequences[0].canvases[0].images[0].resource.service.@id",
                         Matchers.endsWith(bitstream1.getID().toString())))
                 .andExpect(jsonPath("$.sequences[0].canvases[0].metadata[0].label", is("File name")))
@@ -1290,4 +1293,45 @@ public class IIIFControllerIT extends AbstractControllerIntegrationTest {
                 .andExpect(jsonPath("$.metadata[0].value", is("Public item (revised)")));
     }
 
+    @Test
+    public void setDefaultCanvasDimensionCustomBundle() throws Exception {
+        context.turnOffAuthorisationSystem();
+        parentCommunity = CommunityBuilder.createCommunity(context)
+                                          .withName("Parent Community")
+                                          .build();
+        Collection col1 = CollectionBuilder.createCollection(context, parentCommunity).withName("Collection 1")
+                                           .build();
+        Item publicItem1 = ItemBuilder.createItem(context, col1)
+                                      .withTitle("Public item 1")
+                                      .withIssueDate("2017-10-17")
+                                      .withAuthor("Smith, Donald").withAuthor("Doe, John")
+                                      .enableIIIF()
+                                      .build();
+
+        Bundle targetBundle = BundleBuilder.createBundle(context, publicItem1)
+                                           .withName(IIIFBundle)
+                                           .build();
+
+        String bitstreamContent = "ThisIsSomeDummyText";
+        try (InputStream is = IOUtils.toInputStream(bitstreamContent, CharEncoding.UTF_8)) {
+            Bitstream bitstream1 = BitstreamBuilder
+                .createBitstream(context, targetBundle, is)
+                .withName("Bitstream1.jpg")
+                .withMimeType("image/jpeg")
+                .build();
+        }
+        context.restoreAuthSystemState();
+
+        // canvas dimensions using bitstream in the custom bundle (no bitstreams in ORIGINAL)
+        getClient().perform(get("/iiif/" + publicItem1.getID() + "/manifest"))
+                   .andExpect(status().isOk())
+                   .andExpect(jsonPath("$.sequences[0].canvases[0].width", is(64)))
+                   .andExpect(jsonPath("$.sequences[0].canvases[0].height", is(64)));
+    }
 }

View File

@@ -28,8 +28,8 @@ public class ClaimedTaskMatcher {
     /**
      * Check if the returned json expose all the required links and properties
      *
-     * @param ptask
-     *            the pool task
+     * @param cTask
+     *            the claimed task
      * @param step
      *            the step name
      * @return

View File

@@ -10,8 +10,12 @@ package org.dspace.app.rest.matcher;
 import static com.jayway.jsonpath.matchers.JsonPathMatchers.hasJsonPath;
 import static org.dspace.app.rest.test.AbstractControllerIntegrationTest.REST_SERVER_URL;
 import static org.hamcrest.Matchers.allOf;
+import static org.hamcrest.Matchers.contains;
 import static org.hamcrest.Matchers.is;
 
+import java.util.ArrayList;
+import java.util.List;
+
 import org.dspace.external.provider.ExternalDataProvider;
 import org.hamcrest.Matcher;
@@ -20,6 +24,28 @@ public class ExternalSourceMatcher {
 
     private ExternalSourceMatcher() {
     }
 
+    /**
+     * Matcher which checks if all external source providers are listed (in exact order), up to the maximum number.
+     *
+     * @param providers List of providers to check against
+     * @param max       maximum number to check
+     * @return Matcher for this list of providers
+     */
+    public static Matcher<Iterable<? extends List<Matcher>>> matchAllExternalSources(
+        List<ExternalDataProvider> providers, int max) {
+        List<Matcher> matchers = new ArrayList<>();
+        int count = 0;
+        for (ExternalDataProvider provider : providers) {
+            count++;
+            if (count > max) {
+                break;
+            }
+            matchers.add(matchExternalSource(provider));
+        }
+        // Make sure all providers exist in this exact order
+        return contains(matchers.toArray(new Matcher[0]));
+    }
+
     public static Matcher<? super Object> matchExternalSource(ExternalDataProvider provider) {
         return matchExternalSource(provider.getSourceIdentifier(), provider.getSourceIdentifier(), false);
     }
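The ordered matching that `matchAllExternalSources()` delegates to Hamcrest's `contains()` can be sketched in plain Java. The class and method names below are hypothetical illustrations, not part of the DSpace codebase: the expected identifiers are truncated to at most `max` entries, and the actual list must then equal them exactly, in exactly that order.

```java
import java.util.List;

public class OrderedMatchSketch {
    // Hypothetical stand-in for matchAllExternalSources(): limit the expected
    // identifiers to `max` entries, then require the actual list to contain
    // exactly those entries in exactly that order, as Hamcrest contains() does.
    public static boolean matchesInOrder(List<String> expected, List<String> actual, int max) {
        List<String> limited = expected.subList(0, Math.min(max, expected.size()));
        return actual.equals(limited);
    }
}
```

Note that, like `contains()`, this rejects lists with the same elements in a different order, which is what makes the matcher useful for asserting provider ordering.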


@@ -28,7 +28,7 @@ public class PoolTaskMatcher {
 /**
  * Check if the returned json expose all the required links and properties
  *
- * @param ptask
+ * @param pTask
  *            the pool task
  * @param step
  *            the step name


@@ -0,0 +1,20 @@
+/**
+ * The contents of this file are subject to the license and copyright
+ * detailed in the LICENSE and NOTICE files at the root of the source
+ * tree and available online at
+ *
+ * http://www.dspace.org/license/
+ */
+package org.dspace.iiif;
+
+import org.dspace.content.Bitstream;
+
+/**
+ * Mock for the IIIFApiQueryService.
+ * @author Michael Spalti (mspalti at willamette.edu)
+ */
+public class MockIIIFApiQueryServiceImpl extends IIIFApiQueryServiceImpl {
+    public int[] getImageDimensions(Bitstream bitstream) {
+        return new int[]{64, 64};
+    }
+}


@@ -111,7 +111,7 @@ public interface RequestService {
 /**
  * Set the ID of the current authenticated user
  *
- * @return the id of the user associated with the current thread OR null if there is no user
+ * @param epersonId the id of the user associated with the current thread OR null if there is no user
  */
 public void setCurrentUserId(UUID epersonId);


@@ -32,3 +32,11 @@
 # By default, only 'dspace.agreements.end-user' can be deleted in bulk, as doing so allows
 # an administrator to force all users to re-review the End User Agreement on their next login.
 bulkedit.allow-bulk-deletion = dspace.agreements.end-user
+
+### metadata import script ###
+# Set the number of records after which changes should be committed while the script runs.
+# After too many consecutive records, everything starts to slow down because
+# too many objects are loaded into memory. Committing them to the database
+# clears them from memory, so performance does not degrade as much.
+# By default this is set to 100.
+bulkedit.change.commit.count = 100
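The batching behavior behind `bulkedit.change.commit.count` can be sketched as follows. This is a simplified illustration, not DSpace code: the class and counter names are hypothetical, and a plain `List` stands in for the session cache of uncommitted changes. Changes accumulate until the threshold is reached, then are committed and cleared so memory stays bounded.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCommitSketch {
    // Mirrors the bulkedit.change.commit.count default.
    static final int COMMIT_COUNT = 100;

    // Returns how many commits a run over `totalRecords` records performs.
    // `pending` stands in for the uncommitted, in-memory changes.
    public static int commitInBatches(int totalRecords) {
        int commits = 0;
        List<Integer> pending = new ArrayList<>();
        for (int record = 0; record < totalRecords; record++) {
            pending.add(record);          // stage one metadata change
            if (pending.size() >= COMMIT_COUNT) {
                pending.clear();          // commit and free the cached objects
                commits++;
            }
        }
        if (!pending.isEmpty()) {
            pending.clear();              // commit the final partial batch
            commits++;
        }
        return commits;
    }
}
```

With the default of 100, importing 250 records would commit three times: twice at the threshold and once for the trailing partial batch.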


@@ -24,7 +24,7 @@
 <spring-security.version>5.6.5</spring-security.version> <!-- sync with version used by spring-boot-->
 <hibernate.version>5.6.5.Final</hibernate.version>
 <hibernate-validator.version>6.0.23.Final</hibernate-validator.version>
-<postgresql.driver.version>42.4.1</postgresql.driver.version>
+<postgresql.driver.version>42.4.3</postgresql.driver.version>
 <solr.client.version>8.11.1</solr.client.version>
 <ehcache.version>3.4.0</ehcache.version>