Compare commits

..

873 Commits

Author SHA1 Message Date
Tim Donohue
66647c7bf9 [maven-release-plugin] prepare release dspace-6.4 2022-07-28 12:15:43 -05:00
Tim Donohue
0b1edec149 Merge pull request #8430 from tdonohue/minor_docker_fixes
Minor Docker fixes for 6.x. Ensure correct permissions and ensure wget installed
2022-07-27 12:19:03 -05:00
Tim Donohue
08c3f1d3b1 Minor Docker fixes. Ensure correct permissions and ensure wget installed 2022-07-27 12:07:41 -05:00
Kim Shepherd
503a6af57f [DS-4383] Escape Request a Copy HTML (JSPUI) 2022-07-26 16:36:05 +12:00
Tim Donohue
15ec90b7c3 Merge pull request #8411 from tdonohue/prep_for_6.4
Final Preparations/Updates for 6.4 bug-fix-only release
2022-07-18 13:54:43 -05:00
Tim Donohue
dc6a00f17e Update old DuraSpace links to point at LYRASIS 2022-07-18 13:24:31 -05:00
Tim Donohue
c92d20acaa Update LICENSE and NOTICE based on latest main (resync with main) 2022-07-18 13:24:00 -05:00
Tim Donohue
0467c26f40 Update LICENSES_THIRD_PARTY for 6.4 2022-07-18 13:23:33 -05:00
Hrafn Malmquist
070bdaea80 Merge pull request #8409 from J4bbi/6_x/update_readme_issues
Updated README with correct issue tracker info
2022-07-16 01:09:21 +01:00
Hrafn Malmquist
0b749cff01 Updated README with correct issue tracker info 2022-07-16 00:46:48 +01:00
Hrafn Malmquist
d058e369a6 Merge pull request #8144 from kshepherd/8143_dspace6_reload4j
DSpace 6.x - replace log4j with reload4j, slf4j-log4j12 with slf4j-reload4j
2022-07-16 00:33:59 +01:00
Kim Shepherd
09fec5c69c #8143 - Bump reload4j version to 1.2.20 2022-06-21 21:03:39 +12:00
Kim Shepherd
4d49af2619 #8143 - Fix variable, bump version 2022-06-21 20:54:15 +12:00
Alan Orth
7f80e42e94 Merge pull request #8331 from alanorth/mirage2-nodejs-14
Use Node.js 14 to build Mirage 2
2022-06-08 19:45:27 +03:00
Alan Orth
df44dcdc4c DS-3881: fix newlines in i18n:text strings 2022-06-08 19:37:35 +03:00
Alan Orth
2b8d987fe6 Merge pull request #2371 from atmire/DS-3881_Fixed-discovery-view-more-facet-displaying-incorrect-totals_dspace-6
[DS-3881] Fixed discovery "view more" facet displaying incorrect totals
2022-06-08 19:36:47 +03:00
Alan Orth
38136784cc Use Node.js 14 to build Mirage 2
Node.js 12 is no longer supported upstream, and Mirage 2
builds fine with Node.js 14.

See: https://nodejs.org/en/about/releases/
2022-06-07 09:06:33 +03:00
Hrafn Malmquist
0900862c27 Merge pull request #7993 from alanorth/7946-rest-api-null-qualifier
dspace-api: check for null AND empty qualifier in findByElement()
2022-06-07 00:06:15 +01:00
kshepherd
e287d751f0 Merge pull request #8330 from leonardoguerrero/mirage2-discover
Mirage2 discover
2022-06-07 08:45:08 +12:00
Alan Orth
74030ca39c Merge pull request #8292 from alanorth/mirage2-npm-node-sass
Migrate Mirage 2 build from bower and Ruby sass to npm and node-sass
2022-06-06 20:12:48 +03:00
Leonardo Guerrero
990e334d21 fix restoreOriginalFilters function 2022-06-02 14:31:41 -05:00
Leonardo Guerrero
fde39e3ffd update jQuery methods 2022-06-02 14:31:20 -05:00
Alan Orth
7bc99ec88a dspace-xmlui-mirage2: update readme.md
Add some notes about migrating dependencies from bower.json to the
new npm package.json.

Remove references to `compass watch` since that doesn't work anymore.
2022-05-31 20:35:16 +03:00
Alan Orth
a32641974e Remove unneeded .bowerrc 2022-05-31 20:35:16 +03:00
Alan Orth
093f0f234d dspace-xmlui-mirage2: update grunt deps in package.json
This bumps the version of several grunt tools to their latest, with
the exception of grunt-contrib-handlebars because our handlebars is
old.
2022-05-31 20:35:16 +03:00
Alan Orth
8dda729d75 dspace-xmlui-mirage2: update readme.md
Remove all references to Ruby and update install instructions to
mention npm instead of bower. Also clean up some formatting and
plaintext http links as well as some minor style edits.
2022-05-31 20:35:16 +03:00
Alan Orth
21701d9026 .github/workflows/build.yml: remove pre-requisites task
We will get npm from the Node.js 12.x distribution, and GitHub CI
already has grunt installed, so there's actually nothing else we
need to install in order to build Mirage 2.
2022-05-31 20:35:16 +03:00
Alan Orth
8d5aecd15f dspace/modules/xmlui-mirage2/pom.xml: remove bower comment 2022-05-31 20:35:16 +03:00
Alan Orth
f1ac15c8ce Use latest Node.js 12.x
For CI we can simply use "12" which will take the latest 12.x LTS
release rather than pinning 12.22.4. For the normal maven build
we need to specify an actual version number because that is how
the frontend-maven-plugin works.

Also update comment about keeping these versions in sync in pom.xml
and .github/workflows/build.yml, instead of travis.yml.
2022-05-31 20:35:16 +03:00
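The pinning described in the commit above is done through frontend-maven-plugin, which requires an exact Node.js version string. A hedged sketch of what such a pom.xml fragment looks like (version number and plugin version illustrative, not the actual values in the DSpace pom):

```xml
<plugin>
  <groupId>com.github.eirslett</groupId>
  <artifactId>frontend-maven-plugin</artifactId>
  <version>1.12.1</version> <!-- illustrative -->
  <executions>
    <execution>
      <id>install node and npm</id>
      <goals>
        <goal>install-node-and-npm</goal>
      </goals>
      <configuration>
        <!-- frontend-maven-plugin cannot resolve "latest 12.x";
             an exact version must be given here -->
        <nodeVersion>v12.22.4</nodeVersion>
      </configuration>
    </execution>
  </executions>
</plugin>
```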
Alan Orth
c311e91679 dspace-xmlui-mirage2: remove bower.json
We no longer use bower so this file is not needed.
2022-05-31 20:35:16 +03:00
Alan Orth
43d0d442d3 dspace-xmlui-mirage2: update jQuery to 3.4.1
We bumped this in bower.json earlier in 6.4-SNAPSHOT development.

See: https://github.com/DSpace/DSpace/pull/2918
2022-05-31 20:35:16 +03:00
Alan Orth
34491fcf37 .github/workflows/build.yml: Use setup-node
Use setup-node action to install Node.js instead of a bash command
with nvm. For some reason the old nvm method keeps giving me Node
16...

See: https://github.com/actions/setup-node
2022-05-31 20:35:16 +03:00
Alan Orth
33e9f7d831 .github: update Mirage2 build
We no longer use Ruby-based SASS processors.
2022-05-31 20:35:16 +03:00
Alan Orth
3f50638d1b dspace-xmlui-mirage2/pom.xml: exclude modernizr.min.js
Exclude our static modernizr.min.js from license check.
2022-05-31 20:35:16 +03:00
Alan Orth
808c7e62a3 xmlui: remove sass.version var from pom.xml 2022-05-31 20:35:16 +03:00
Alan Orth
66a7c5d171 dspace-xmlui-mirage2: Fix glyphicon encoding
This is due to using node-sass instead of Ruby compass. The header
trail shows a garbled glyph instead of /.

See: https://github.com/twbs/bootstrap-sass/issues/919
2022-05-31 20:35:16 +03:00
Alan Orth
bc24638ad6 xmlui: Update slow Mirage2 build
Make sure we run the correct grunt tasks when running the "slow"
Mirage2 build with `-Dmirage2.deps.included=true`.
2022-05-31 20:35:16 +03:00
Alan Orth
255b527d87 xmlui: Remove remaining Ruby and compass stuff
We no longer need the Ruby-based sass and compass to build Mirage2.
2022-05-31 20:35:16 +03:00
Alan Orth
48c41a912c dspace/modules/xmlui-mirage2: remove rubygems
The rubygems.torquebox.org domain is no longer active so it is not
possible to retrieve these gems during build (and we're not using
them anyways, since we switched to node-sass).
2022-05-31 20:35:16 +03:00
Alan Orth
2007643dbc dspace-xmlui-mirage2: fix path to JavaScript assets
The modules installed via npm are organized slightly differently
than they were when installed via bower.
2022-05-31 20:35:16 +03:00
Alan Orth
5bd9f9a814 dspace-xmlui-mirage2: fix path to IE 9 scripts
These are now installed by npm and we place them in the scripts
directory with grunt.
2022-05-31 20:35:16 +03:00
Alan Orth
7143f3d05a dspace-xmlui-mirage2: use a static copy of modernizr.min.js
Long story short: this was previously installed by bower, but bower
is deprecated and we are trying to move dependencies to npm. The
version here is 2.8.3, which is six years old and does not exist
on npm's registry, and can't be installed from git because it is
missing package.json. Since the file is not changing whatsoever
I think it's fine if we just keep a static copy in the tree and
bundle it in with the theme.js using grunt, which has the added
benefit of doing one less network request.
2022-05-31 20:35:16 +03:00
Alan Orth
45b7ee6702 dspace-xmlui-mirage2: build Mirage 2 without Ruby
Update Mirage 2 so it can build without the Ruby-based sass and
compass:

- Port bower.json dependencies to package.json
- Polyfill compass's sass mixins in _main.scss from compass-mixins
- Adjust the Gruntfile.js to use node-sass* instead of Ruby sass and
  add task for copying Bootstrap web fonts
- Allow building on newer Node.js (LTS 12 for now)

Note: Node.js 12.x does not seem to support node-sass 8.x so we use
7.x here.
2022-05-31 20:35:16 +03:00
Alan Orth
d7c89f2b4b dspace/modules/xmlui-mirage2/pom.xml: update npm tasks
Update frontend-maven-plugin and use the default npm provided by
the distribution rather than specifying a version explicitly.

Also, add a minor tweak to the copy-resources goal so that our
overrides in dspace/modules/xmlui-mirage2 don't get overwritten
by the default Mirage2 source.
2022-05-31 20:35:16 +03:00
Kim Shepherd
afcc6c3389 [DS-4449] Sanitise stacktrace output and default to NOT output stacktraces 2022-05-28 11:51:46 +12:00
Kim Shepherd
ebb83a7523 [DS-4453] Fix XSS handling in JSPUI discovery spellcheck 2022-05-28 11:51:40 +12:00
Kim Shepherd
35030a23e4 [DS-4453] Fix XSS handling in JSPUI discovery autocomplete 2022-05-28 11:51:34 +12:00
Kim Shepherd
7569c6374a [DS-4132] Fix JSPUI resumable upload path traversal 2022-05-28 11:51:28 +12:00
Alan Orth
985b60ecb3 Merge pull request #2593 from TexasDigitalLibrary/dspace-6_x-1980-include-noindex-tag-for-private-items
DS-1980: Adds meta tag to prevent robots from indexing the item if it is private
2022-05-05 10:21:44 +03:00
Alan Orth
75b9517e62 Merge pull request #2543 from atmire/DS-4271-Solr-query-term-wrapping
DS-4271: Replaced brackets with double quotes in SolrServiceImpl.
2022-05-04 21:50:16 +03:00
Tim Donohue
693c4a9811 Merge pull request #8171 from hzafar/8055_dspace6_item_moves_2
DSpace 6: Adds check for whether source and target collection are the same when moving an item.
2022-02-15 09:45:29 -06:00
Huma Zafar
ee4f219183 Adds check for whether source and target collection are the same, for DSpace 6. (#8055) 2022-02-14 14:07:25 -05:00
Tim Donohue
706f14d8a4 Merge pull request #8154 from the-library-code/DS-8152-dspace-6_x
Remove unnecessary second Context in RDFConsumer in DSpace 6
2022-02-08 11:49:37 -06:00
Pascal-Nicolas Becker
f0a6b51f34 Remove unnecessary second Context in RDFConsumer
fixes #8152
2022-02-08 16:19:33 +01:00
Kim Shepherd
d695a48d94 #8143 - Use version variables for reload4j, slf4j-reload4j 2022-02-01 10:51:40 +13:00
Kim Shepherd
43e629da54 #8143 - Fix typo introduced to license plugin reference 2022-01-29 16:50:32 +13:00
Kim Shepherd
4eb522c1fe #8143 - Replace log4j with reload4j, slf4j-log4j12 with slf4j-reload4j 2022-01-29 16:20:12 +13:00
Kim Shepherd
7af52a0883 [DS-4131] Fix zip import handling to avoid path traversal exploit 2022-01-14 13:25:59 +13:00
Kim Shepherd
f7758457b7 [DS-4133] Improve URL handling in Controlled Vocab JSPUI servlet 2022-01-14 13:12:00 +13:00
Tim Donohue
e196e748ad Merge pull request #7995 from mwoodiupui/7988-6x
Avoid exporting metadata of mapped Item more than once.
2021-11-11 16:16:51 -06:00
Mark H. Wood
7626a57461 Merge branch 'dspace-6_x' into 7988-6x 2021-10-13 16:52:23 -04:00
Mark H. Wood
89918cdeb6 Avoid exporting mapped Item more than once. #7988 2021-10-13 16:38:14 -04:00
Alan Orth
8363467343 dspace-api: check for null AND empty qualifier in findByElement()
The legacy DSpace 6 REST API passes an empty string to findByElement()
for elements with no qualifier when requesting the following path:

    /registries/schema/{schema}/metadata-fields/{element}

... yet MetadataFieldDAOImpl's findByElement only checks for a null
qualifier, which is of course not the same thing. There may be other
places in the code that set qualifier to null, but for this case we
need to check whether it is both not null AND not empty.

Closes: https://github.com/DSpace/DSpace/issues/7946
2021-10-13 10:35:31 +03:00
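The commit body above describes the fix in MetadataFieldDAOImpl: an empty-string qualifier must be treated the same as a null one. A minimal standalone sketch of that guard (class and method names are hypothetical, not the actual DSpace API):

```java
// Hypothetical sketch of the null-AND-empty qualifier check described
// in the commit above; the real fix lives in MetadataFieldDAOImpl.
public class QualifierCheck {
    static boolean hasQualifier(String qualifier) {
        // A null qualifier and an empty-string qualifier must both be
        // treated as "no qualifier" when building the lookup query.
        return qualifier != null && !qualifier.isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(hasQualifier(null));  // false
        System.out.println(hasQualifier(""));    // false
        System.out.println(hasQualifier("uri")); // true
    }
}
```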
Tim Donohue
b53f351e03 Merge pull request #2339 from atmire/DS-4157
DS-4157: Save and Exit on workflow edit metadata
2021-09-17 16:50:49 -05:00
Tim Donohue
a610d25114 Merge pull request #3350 from alanorth/DS-4589-nodejs-version
DS-4589: Update Node.js to version 12 LTS
2021-09-14 08:53:05 -05:00
Alan Orth
83623027fe dspace-xmlui-mirage2/readme.md: Update nvm notes 2021-09-10 13:38:24 +03:00
Alan Orth
9a0b89cd5d dspace-xmlui-mirage2/readme.md: Recommend Node.js v12.x 2021-09-10 13:38:24 +03:00
Alan Orth
687f2cd3ef DS-4589: Update Node.js to version 12 in tests
This updates the Node.js version used in GitHub CI to match the one
used by Mirage 2's pom.xml. Also, don't install grunt here because
it seems to already exist and fails during GitHub CI runs:

> added 1 package from 1 contributor in 2.397s
> /usr/local/bin/grunt -> /usr/local/lib/node_modules/grunt/bin/grunt
> + grunt@1.4.1
> updated 2 packages in 4.709s
> npm ERR! code EEXIST
> npm ERR! syscall symlink
> npm ERR! path ../lib/node_modules/grunt-cli/bin/grunt
> npm ERR! dest /usr/local/bin/grunt
> npm ERR! errno -17
> npm ERR! EEXIST: file already exists, symlink
> '../lib/node_modules/grunt-cli/bin/grunt' -> '/usr/local/bin/grunt'
> npm ERR! File exists: /usr/local/bin/grunt

If we skip installing grunt here it still exists (from somewhere)
and the build succeeds. Perhaps it's a GitHub CI cache issue?
2021-09-10 13:38:24 +03:00
Alan Orth
4419515b85 DS-4589: Update Node.js to version 12 LTS
We currently build Mirage 2 with Node.js version 8, which stopped
receiving support in 2019-12. We should use a currently supported
Node.js LTS version. Mirage 2 builds successfully with version 10
and 12, but version 10 also stopped receiving support as of early
2021, which leaves only version 12 as an option.

See: https://nodejs.org/en/about/releases
See: https://en.wikipedia.org/wiki/Node.js#Releases
2021-09-10 13:38:24 +03:00
Tim Donohue
efb71f377a Merge pull request #2124 from KingKrimmson/DS-3873_PDFBoxThumbnail_fix_6_x
DS-3873 Limit the usage of PDFBoxThumbnail to PDFs
2021-09-09 16:06:26 -05:00
Tim Donohue
c11f727541 Merge pull request #3080 from UoEMainLibrary/DS-4562
[DS-4562] - Change maxwait default to 10 seconds
2021-09-09 16:03:54 -05:00
Tim Donohue
12cb440235 Merge pull request #7940 from UoEMainLibrary/orcidnpe
Protecting against possibly missing values supplied by Orcid and logg…
2021-09-09 15:59:54 -05:00
Hrafn Malmquist
a64c6836f2 Protecting against possibly missing values supplied by Orcid and logging the error. 2021-09-09 21:25:12 +01:00
Hrafn Malmquist
574e25496a Merge pull request #2451 from AlexanderS/fix-generator-permissions
DS-304, DS-1922: METS generator should check for READ permissions
2021-09-09 20:02:33 +01:00
Tim Donohue
43108352dd Merge pull request #3250 from UoEMainLibrary/DS-4580
DS-4580 Update confidence when editing record
2021-09-07 16:49:13 -05:00
Tim Donohue
b9d8ad97ef Merge pull request #2649 from 4Science/dspace-6_x-fix_itemMapper
Use findByCollection method to retrieve only archived item in ItemMapServlet (6.x)
2021-09-03 16:53:53 -05:00
Tim Donohue
654689246c Merge pull request #2493 from atmire/DS-4328-6x
DS-4328
2021-09-03 16:53:01 -05:00
Tim Donohue
af0171e78e Merge pull request #2550 from saiful-semantic/patch-4362
Patch for DS-4362
2021-09-03 16:49:07 -05:00
Tim Donohue
c0e0db7536 Merge pull request #2662 from TexasDigitalLibrary/dspace-6_x-3725
DS-3725: Adds 'id' property to 'default' and 'site' DiscoveryConfiguration
2021-09-03 15:59:20 -05:00
Tim Donohue
1db04e8728 Merge pull request #2161 from KingKrimmson/DS-3976_Reduce_ItemCounter_Init_6_x
DS-3976 Reduce itemCounter init
2021-09-03 15:11:58 -05:00
Tim Donohue
7308c56246 Merge pull request #2490 from Georgetown-University-Libraries/ds4208
[DS-4208] Fix hidden field creation issue when browsing by date
2021-09-03 14:12:21 -05:00
Tim Donohue
bfa16d9a5f Merge pull request #2558 from TexasDigitalLibrary/DS-3891
DS-3891: fixes minor security issue with HTML opened in a separate tab
2021-09-03 14:03:15 -05:00
nwoodward
9b91d1c478 added rel="noopener" to outgoing links 2021-09-03 12:52:59 -05:00
Tim Donohue
5af28f46b4 Merge branch 'dspace-6_x' into ds4208 2021-09-03 12:24:01 -05:00
Tim Donohue
29a8d1f119 Merge pull request #1801 from AlexanderS/fix/default-license-on-ingest
DS-3643: Use correct license on ingest
2021-09-03 09:23:15 -05:00
Tim Donohue
728c789649 Merge pull request #2439 from the-library-code/DS-4260-dspace6
DS-4260: Remove non-existing command from oai harvester's help
2021-09-01 15:08:04 -05:00
kshepherd
37ff44ec50 Merge pull request #3162 from UoEMainLibrary/DS-4574
[DS-4574] v. 6 - Upgrade DBCP2 dependency
2021-08-24 10:27:43 +12:00
Hrafn Malmquist
f34d81f9ef Merge pull request #2753 from bram-atmire/DS-4493
DS-4493 prevent empty string assignment for language
2021-08-23 10:09:34 +01:00
kshepherd
787b536c76 Merge pull request #2374 from J4bbi/DS-4190
[DS-4190] Word-break CSS class breaks words without hyphens in Firefox…
2021-08-12 14:10:05 +12:00
kshepherd
7e11f9f3a8 Merge pull request #2389 from J4bbi/DS-4201
[DS-4201] Do not pass starts_with if browsing an item index.
2021-08-12 14:09:44 +12:00
Hrafn Malmquist
da3154b045 Merge pull request #2742 from TexasDigitalLibrary/dspace-6_x-update_pdfbox
bump up pdfbox version on 6.x to match main branch
Haven't tested myself.
There's one approval and this is a minor upgrade of deps.
2021-07-24 22:35:27 +01:00
Hrafn Malmquist
9fc9a42157 Merge pull request #3333 from alanorth/DS-4587
DS-4587: update spider user agent file for more accurate Solr usage statistics
2021-07-24 20:47:10 +01:00
Hrafn Malmquist
f25a665375 Merge pull request #3336 from alanorth/DS-4588
DS-4588: fix case-insensitive spiderdetector test
2021-07-24 20:36:32 +01:00
nwoodward
4bbe97477b bump up pdfbox version again to match main 2021-07-24 09:51:58 -05:00
nwoodward
a1c7fc12fa bump up pdfbox version again to match main 2021-07-24 09:39:39 -05:00
nicholas
55eda19767 bump up pdfbox version to match master branch 2021-07-24 09:28:03 -05:00
kshepherd
9a935de69c Merge pull request #3228 from UoEMainLibrary/DS-4579
[DS-4579] Re-enable HTTP Ranges support
2021-07-24 12:22:59 +12:00
Hrafn Malmquist
073c320dec Merge pull request #2465 from KingKrimmson/DS-4293__Known_Supported_label_fix
DS-4293 Fix Known/Supported labels in UploadStep/UploadWithEmbargoStep
2021-07-23 23:45:50 +01:00
Alan Orth
00530a9753 DS-4588: fix case-insensitive spiderdetector test
We should not use the string "FirefOx" to test the validity of our
spider user agent patterns because it is not a valid user agent in
the first place. The Mozilla Firefox browser uses patterns such as
this:

Mozilla/5.0 (X11; Linux x86_64; rv:91.0) Gecko/20100101 Firefox/91.0

The test currently matches the following regular expression:

^firefox$

In this case, if a client *did* connect with a single-word string
such as "Firefox" or "firefox" it would definitely be a non-human
user and we should not log a statistics hit.

If the goal of the test is to check *valid* human user agents
against case-insensitive regular expression matching, then we should
use the following:

mozilla/5.0 (x11; linux x86_64; rv:91.0) gecko/20100101 firefox/91.0
2021-07-22 21:41:47 +03:00
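The argument in the commit above is easy to verify: the pattern `^firefox$`, matched case-insensitively, accepts only the bare word, never a real browser user-agent string. An illustrative sketch (class name hypothetical, not the actual SpiderDetector test code):

```java
import java.util.regex.Pattern;

// Illustrative only: shows why "^firefox$" with case-insensitive
// matching accepts the bare word "FirefOx" but rejects a real
// Mozilla Firefox user-agent string.
public class SpiderMatch {
    static final Pattern BARE_FIREFOX =
            Pattern.compile("^firefox$", Pattern.CASE_INSENSITIVE);

    static boolean matches(String userAgent) {
        return BARE_FIREFOX.matcher(userAgent).matches();
    }

    public static void main(String[] args) {
        String realUa = "Mozilla/5.0 (X11; Linux x86_64; rv:91.0) "
                + "Gecko/20100101 Firefox/91.0";
        System.out.println(matches("FirefOx")); // true  (bare word)
        System.out.println(matches(realUa));    // false (real user agent)
    }
}
```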
Alan Orth
87bb92a8e7 DS-4587: Update the spider user agent file
This updates the included spider user agent file to the latest from
the COUNTER-Robots project. DSpace's own copy is over five years old
and is missing a bunch of new patterns, which greatly decreases the
accuracy of the Solr usage statistics.

See: https://jira.lyrasis.org/browse/DS-4587
See: https://github.com/atmire/COUNTER-Robots/releases/tag/2021-07-05
2021-07-22 19:09:33 +03:00
Hrafn Malmquist
17109243ff Merge pull request #2276 from toniprieto/DS-4034-strip-diacritics
DS-4034 - The "first few letters" search doesn't work with diacritics
2021-07-21 17:03:47 +01:00
Hrafn Malmquist
b16f1a5622 Merge pull request #1800 from atmire/DS-2852_6.x
DS-2852 - Discovery label fix for authority display value
2021-07-19 23:20:36 +01:00
Hrafn Malmquist
e0e7a2a391 Merge pull request #2317 from bram-atmire/DS-4135-Scholar-Escaping
DS-4135 prevent escaping for citation_ tags
2021-07-19 01:48:40 +01:00
Hrafn Malmquist
2683f11719 Merge pull request #2918 from kshepherd/DS-4508_javascript_dependencies_6x
[DS-4508] Update JQuery, JQueryUI and other JS dependencies
2021-07-18 23:20:17 +01:00
Hrafn Malmquist
ee25bbbe3f Merge pull request #3275 from IordanisKostelidis/dspace-6_x
fixes restlet's blocked repository - dspace 6.x
2021-07-17 13:19:49 +01:00
Iordanis Kostelidis
2f45f00723 fixes DSpace#3247 2021-07-17 12:18:46 +03:00
Hrafn Malmquist
1b810186e4 DS-4574. Updated dependencies commons-dbcp2 to 2.1.1 -> 2.8.0 and commons-pool2 2.4.2 -> 2.10.0. 2021-07-16 21:48:26 +01:00
Hrafn Malmquist
6694befb26 DS-4580 Update confidence when editing record 2021-04-25 22:59:45 +01:00
Hrafn Malmquist
0669edb102 Re-enable Ranges requests 2021-04-09 00:53:36 +01:00
kshepherd
b574fb141a Merge pull request #3087 from atmire/DS-4563_jspui-curate-commit
DS-4563 Signal that the transaction should be committed
2020-12-14 14:21:09 +13:00
Chris Wilper
0e184c7bd0 DS-4563 Signal that the transaction should be committed 2020-12-10 12:16:33 -05:00
j4bbi
306dc779a2 Change maxwait default to 10 seconds 2020-12-08 09:53:42 +00:00
Tim Donohue
a72ff50de1 Update build.yml to not limit by branch
Cherry pick of #3072
2020-11-30 10:22:08 -06:00
Tim Donohue
eafac164c8 Merge pull request #3064 from tdonohue/github_ci_6x
Add CI to GitHub via Actions (based on Travis CI config) for 6.x
2020-11-25 15:05:23 -06:00
Tim Donohue
5f2672d508 Remove .travis.yml & move over comments to build.yml 2020-11-25 11:18:40 -06:00
Tim Donohue
e6a770b253 Only enable for dspace-6_x branch & PRs to that branch 2020-11-25 10:48:45 -06:00
Tim Donohue
6e3e3652a1 Set GEM_HOME and GEM_PATH for Mirage 2 build 2020-11-25 10:44:36 -06:00
Tim Donohue
6afe234beb Print gem env for debugging 2020-11-25 10:44:36 -06:00
Tim Donohue
c5b77bc202 Install proper version of Ruby for Mirage 2 build. Minor cleanup / spacing 2020-11-25 10:44:35 -06:00
Tim Donohue
2687f145fa Remove sudo from gems 2020-11-25 10:44:35 -06:00
Tim Donohue
55cbc8b9f6 Use sudo to avoid permissions errors 2020-11-24 16:20:06 -06:00
Tim Donohue
41771c95fd Fix nvm call. Needs global shell 2020-11-24 16:20:06 -06:00
Tim Donohue
d324e94586 Initial GitHub CI based on Travis CI configs 2020-11-24 16:20:06 -06:00
kshepherd
2825646570 Merge pull request #2986 from DSpace/revert-2985-DS-4552
[DS-4552] Revert PR#2985 "DS-4552 Update crosswalk according to new guidelines"
2020-09-30 12:15:21 +13:00
kshepherd
1f306cf3d7 Revert "DS-4552 Update crosswalk according to new guidelines" 2020-09-29 13:30:50 +13:00
kshepherd
a995c2970b Merge pull request #2985 from UoEMainLibrary/DS-4552
DS-4552 Update crosswalk according to new guidelines
2020-09-29 11:55:01 +13:00
j4bbi
e3bf3adf1d DS-4552 Update crosswalk according to new guidelines 2020-09-28 13:40:22 +01:00
kshepherd
95a25f3342 Merge pull request #2977 from UoEMainLibrary/DS-2569
[DS-2569] Fail gracefully if CC-API is down
2020-09-28 13:07:54 +13:00
kshepherd
626f46885b Merge pull request #2357 from J4bbi/DS-4162
DS-4162. Bower upgrade due to security vulnerability
2020-09-28 13:07:25 +13:00
j4bbi
aa85b0f25f Fail gracefully if CC-API is down 2020-09-25 01:57:54 +00:00
Pascal-Nicolas Becker
5c5e415276 Merge pull request #2964 from tuub/DS-4551
DS-4551 Close FileInputStreams in JSPUI's submission file upload
2020-09-15 12:06:09 +02:00
Martin Walk
1f6b7909f0 DS-4551 Close FileInputStreams 2020-09-11 10:39:00 +02:00
Tim Donohue
e3c2fe2f8e Merge pull request #2832 from bme-omikk/dspace-6_x
DS-4534 fix for Anonymous as a sub-group.
2020-08-21 16:21:28 -05:00
Tim Donohue
6d5aad35f0 Merge pull request #2114 from toniprieto/start-handle-server-deleting-lock-file
DS-3946: start-handle-server script fails to remove the lock file
2020-08-19 14:05:48 -05:00
Kim Shepherd
5bb386fddd [DS-4508] Add jquery UI 1.12.1 CSS to Mirage 1 theme 2020-08-06 15:50:20 +12:00
Kim Shepherd
a2eb56a375 [DS-4508] Fix bugs in person-lookup.js (related to 3.4.1?) 2020-08-06 15:44:26 +12:00
Kim Shepherd
e0e88a3f01 [DS-4508] Add jquery UI 1.12.1 minified js to Mirage 1 theme 2020-08-06 15:43:53 +12:00
Kim Shepherd
c28a9afeb5 [DS-4508] Add jquery UI CSS to JSPUI 2020-08-06 15:15:07 +12:00
Kim Shepherd
c33c0838a9 [DS-4508] Add local jquery-3.4.1 and update Mirage 1 themes 2020-08-06 15:04:26 +12:00
Kim Shepherd
bf44929263 [DS-4508] Update jquery and jquery-ui versions in Mirage2 2020-08-06 14:58:10 +12:00
Kim Shepherd
423af4ca98 [DS-4508] Applying js update patch for JSPUI (Andrea Pascarelli) 2020-08-06 13:58:18 +12:00
kshepherd
4acd6998de Merge pull request #2859 from 4Science/DS-4149-onlycoremod_6-x
[DS-4149] support for additional "metadata" insertion on "item.compile" solr document field from external plugin (6.x)
2020-07-30 13:54:57 +12:00
kshepherd
e59d85cfe7 Merge pull request #2113 from toniprieto/D6-startswith-bug
DS-3945: XMLUI: filtering by "starts with" field twice on browsing pages doesn't work
2020-07-27 16:28:07 +12:00
Francesco Pio Scognamiglio
8030c0a3d8 [DS-4149] improvements on plugins 2020-07-15 21:12:54 +02:00
Luigi Andrea Pascarelli
f7d46fd2eb [DS-4462] fix CREATEDATE format using the correct specifiers 2020-07-15 21:12:54 +02:00
Anis
e7a7c7b87f Update the test as well 2020-07-15 21:12:54 +02:00
Anis
eadaabfa97 Explicitly initialize TransformerFactory with the Saxon transformer in OAI 2020-07-15 21:12:54 +02:00
Anis
77f5dfe62f upgrade mets xslt version to 2.0 2020-07-15 21:12:54 +02:00
Luigi Andrea Pascarelli
d98db52d22 [DS-4149] refactoring code; added java doc; 2020-07-15 21:12:54 +02:00
Luigi Andrea Pascarelli
df5475b40d [DS-4149] improvements to support extra 'metadata' from external plugin 2020-07-15 18:31:29 +02:00
Istvan Vig
3fb647ffa5 DS-4534 fix for Anonymous as a sub-group. 2020-07-08 14:30:58 +02:00
kshepherd
acf16ebe6d Merge pull request #2693 from atmire/DS-4440_GDPR_anonymize-statistics-feature_dspace-6_x
DS-4440 GDPR - Anonymize statistics feature
2020-06-07 16:16:42 +12:00
Toni Prieto
b813f6e6f4 Merge branch 'dspace-6_x' into DS-4034-strip-diacritics 2020-06-02 16:02:04 +02:00
kshepherd
65fef9cffc Merge pull request #2710 from 4Science/DS-2715_6-x
[DS-2715] porting ORCID Integration for JSPUI (6.x)
2020-05-30 13:56:13 +12:00
kshepherd
7672e40638 Merge pull request #2755 from the-library-code/DS-3940_SHERPA_v2_API
[DS-3940] New SHERPA - fix for incorrect metadata (publisher policy) URI
2020-05-20 19:21:38 +12:00
Samuel
47cf110279 DS-4440 GDPR - Anonymize statistics feature - feedback: default configuration 2020-05-07 11:41:11 +02:00
Bram Luyten
2ec197e06e DS-4493 improving empty string check 2020-05-06 17:21:49 +02:00
Kim Shepherd
a41db4154e [DS-3940] Bugfix for incorrect metadata (publisher policy) URI reference 2020-05-05 10:49:46 +12:00
Bram Luyten
ae7a22d25f DS-4493 prevent empty string assignment for language 2020-04-30 08:52:09 +02:00
kshepherd
24728e15d4 Merge pull request #2739 from the-library-code/DS-3940_SHERPA_v2_API
[DS-3940] SHERPA API v2 refactor
2020-04-30 10:57:34 +12:00
Kim Shepherd
8a744b43be [DS-3940] try/catch for parsing before finally closing stream, d7 code style 2020-04-30 10:19:53 +12:00
Kim Shepherd
23c26a44f3 [DS-3940] Explicitly close SHERPAService response content InputStream 2020-04-28 15:56:12 +12:00
Kim Shepherd
8af613a851 [DS-3940] Ensure publisher URL is set correctly, refactor parse pub name 2020-04-18 14:43:05 +12:00
Kim Shepherd
7ec7cbe59d [DS-3940] Split large parseJSON method into functional parts
including slightly restructuring Journal and Permitted Version
to work better with the way things are scoped in the new structure
2020-04-18 14:33:29 +12:00
Kim Shepherd
385fa6c768 [DS-3940] Remove ROMeO colour from data models, as per review 2020-04-18 13:02:46 +12:00
Kim Shepherd
6a28df95b3 [DS-3940] Close input stream reader, throw IOException from parse methods 2020-04-18 13:00:21 +12:00
Kim Shepherd
005e00f7bc [DS-3940] Remove formatted text from response object as per review 2020-04-18 12:53:31 +12:00
Kim Shepherd
b4edbc2a1b [DS-3940] Remove unused imports 2020-04-10 12:48:05 +12:00
Kim Shepherd
476f0fd04d [DS-3940] Changes to handle multiple ISSN queries / errors in JSPUI 2020-04-10 12:34:53 +12:00
Samuel
8a19dc3cb7 DS-4440 GDPR - Anonymize statistics feature - feedback: default configuration & spelling tris 2020-04-09 14:59:56 +02:00
Samuel
b0088af73f DS-4440 GDPR - Anonymize statistics feature - feedback: default configuration & spelling bis 2020-04-09 14:59:08 +02:00
Samuel
245ef2f95c DS-4440 GDPR - Anonymize statistics feature - feedback: default configuration & spelling 2020-04-09 14:52:51 +02:00
Kim Shepherd
a9c13d586c [DS-3940] Revert "getISSNs" to return LinkedHashSet to avoid duplicates 2020-04-08 12:32:35 +12:00
Kim Shepherd
2a7b24963d [DS-3940] Apply missing license header for one bean class 2020-04-07 16:10:58 +12:00
Kim Shepherd
1dc76cb380 [DS-3940] SHERPA API v2 refactor 2020-04-07 15:34:35 +12:00
kshepherd
7e738a0bcc Merge pull request #2021 from AndrewZWood/DS-3888_HTMLMimeType
DS-3888 Respect primary bitstreams with text html mime types in item …
2020-04-02 20:21:57 +13:00
kshepherd
b0470eb59d Merge pull request #2606 from atmire/w2p-67450_Discovery-clean-index-fix_6
[DS-4393] Discovery clean index fix DSpace 6
2020-03-29 12:02:03 +13:00
kshepherd
69561735cf Merge pull request #2358 from the-library-code/DS-3444
DS-3444 JSPUI must keep shibboleth attributes on session renewal
2020-03-29 11:57:18 +13:00
Tim Donohue
f81b8ce949 Update to LICENSE and NOTICE per LYRASIS merger 2020-03-27 12:18:04 -05:00
Luigi Andrea Pascarelli
51fa6bc17a [DS-2715] manage no option selected with the first value on the result list 2020-03-12 11:30:30 +01:00
Samuel
d9b3e465e5 DS-4440 GDPR - Anonymize statistics feature - feedback 2020-03-10 18:57:01 +01:00
Luigi Andrea Pascarelli
71f2c0c372 [DS-2715] Orcid integration for JSPUI 2020-03-10 14:32:00 +01:00
Luigi Andrea Pascarelli
861fbd1944 [DS-4439] fix loses ORCID info stored in solr record 2020-03-10 14:29:47 +01:00
Philip Vissenaekens
14683491cd DS-4328: Clear the "today" dc.date.issues value when selecting published in the initial questions step 2020-03-09 18:02:12 +01:00
Luigi Andrea Pascarelli
e036dd17a1 [DS-4149] upgrade XOAI to version 3.3.0 2020-03-06 17:45:43 +01:00
Saiful Amin
52c84c31aa Added braces in if statement 2020-03-06 11:12:49 +05:30
kshepherd
b8b030518b Merge pull request #2556 from TexasDigitalLibrary/DS-4029
DS-4029: adds missing keys in Messages.properties for search filters
2020-03-05 23:56:50 +13:00
kshepherd
afefb9e7e1 Merge pull request #2396 from TexasDigitalLibrary/dspace-6_x-4211-metadata-export-empty-community
DS-4211: Set Item iterator to emptyIterator() so that it's never null
2020-03-05 23:46:08 +13:00
kshepherd
cf1d7c4835 Merge pull request #2350 from AlexanderS/ds4169
DS-4169: Treat empty language as null in edit item
2020-03-05 23:43:46 +13:00
kshepherd
cceee9e9b4 Merge pull request #2463 from KingKrimmson/DS-4291__Fix_0-9_in_Browse
DS-4291 Properly show results for 0-9 link in browse
2020-03-05 23:41:05 +13:00
kshepherd
bb9d9eb3cc Merge pull request #2661 from 4Science/dspace-6_x-fix-bulkedit.ignore-on-export
Fix issue with bulkedit.ignore-on-export parameter on DSpaceCSV
2020-03-05 09:53:59 +13:00
kshepherd
18f6c776ca Merge pull request #2681 from mwoodiupui/DS-4087-6x
[DS-4087] 'dspace structure-builder' errors are too hard to interpret -- 6_x
2020-03-04 12:26:58 +13:00
kshepherd
de5b5bf6bc Merge pull request #2699 from TexasDigitalLibrary/dspace-6_x-DS-4073
DS-4073 fix. FindByValue should pass in value, not qualifier (dspace-6_x port)
2020-03-04 10:23:31 +13:00
nicholas
0a36954f54 pass value instead of qualifier to method 2020-03-02 16:27:02 -06:00
Chris Herron
3753e9e60b DS-4291 Properly show results for 0-9 link in browse 2020-02-27 12:05:03 -05:00
Alexander Sulfrian
cdb8bea193 DS-4169: Treat empty language as null in edit item
The metadata editor in the admin edit item view always submits an empty string
if there is no language (the HTML input fields cannot submit null values). To
preserve null values for the language attribute, all empty strings in the
language input fields are now treated as null.
2020-02-26 17:41:24 +01:00
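The normalization described in the commit above can be sketched in a few lines (names hypothetical, not the actual DSpace code): empty strings coming back from HTML form fields are mapped to null before the metadata value is stored.

```java
// Hypothetical sketch of the empty-to-null normalization described
// above; HTML inputs cannot submit null, so "" stands in for it.
public class LangNormalize {
    static String normalizeLanguage(String lang) {
        // Map both null and whitespace-only strings to null so the
        // stored language attribute keeps its original null value.
        return (lang == null || lang.trim().isEmpty()) ? null : lang;
    }

    public static void main(String[] args) {
        System.out.println(normalizeLanguage(""));      // null
        System.out.println(normalizeLanguage("en_US")); // en_US
    }
}
```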
Samuel
301c4932ca DS-4440 GDPR - Anonymize statistics feature 2020-02-26 11:38:42 +01:00
kshepherd
ae1b42af31 Merge pull request #2652 from mwoodiupui/DS-4409-6_x
[DS-4409] Remove GeoIP download Ant target, reconfigure for external provision
2020-02-20 10:42:15 +13:00
Kim Shepherd
0a6766e003 [DS-4409] Update SolrLoggerServiceImpl error messages 2020-02-20 10:13:24 +13:00
kshepherd
9f6cb99908 Merge pull request #2683 from TexasDigitalLibrary/dspace-6_x-3895b
DS-3895b: restores getSize() in Bitstream
2020-02-20 09:17:51 +13:00
nicholas
b5582b4f67 restores getSize() method that points to getSizeBytes(); marked as deprecated 2020-02-19 10:22:05 -06:00
Mark H. Wood
f3cd04971c [DS-4087] Small improvements to message texts. 2020-02-19 09:54:28 -05:00
Mark H. Wood
ecd452393b [DS-4087] First pass at better error reporting. 2020-02-19 09:46:30 -05:00
kshepherd
31e8f3e18b Merge pull request #2567 from AndreaJenisSaroni/DS-4377-6_x
DS-4377 Fix Sherpa RoMEO layout
2020-02-19 23:52:53 +13:00
kshepherd
340520f86c Merge pull request #2112 from mwoodiupui/DS-3453-6x
[DS-3453] OAI-PMH verb Identify doesn't work on Oracle
2020-02-19 23:32:08 +13:00
kshepherd
bbc5ca73a6 Merge pull request #2501 from atmire/DS-3849
[DS-3849] REST API items resource returns items in non-deterministic order (further 6.x changes)
2020-02-19 23:25:00 +13:00
kshepherd
3ebb746914 Merge pull request #1901 from jonas-atmire/DS-3791-Missing-date-values-when-faceting
DS-3791 : Missing date values while faceting
2020-02-19 00:20:09 +13:00
nicholas
26b99b7155 adds 'id' property to default and site DiscoveryConfigurations 2020-02-07 13:39:38 -06:00
Giuseppe Digilio
c7ba52f51a [CST-2448] Fix issue with bulkedit.ignore-on-export parameter on DSpaceCSV 2020-02-07 12:03:30 +01:00
Mark H. Wood
48d9bcfe60 [DS-4409] Don't use live GeoLite database for testing -- there might not be one. 2020-01-23 14:06:47 -05:00
Mark H. Wood
9ca34d74bf [DS-4409] Leave GeoLite database location unconfigured by default. 2020-01-23 13:40:21 -05:00
Mark H. Wood
9b930960fd [DS-4409] Remove GeoIP download Ant target, reconfigure for external provision. 2020-01-23 13:16:54 -05:00
Giuseppe Digilio
02d3e196a7 [CST-2385] Use findByCollection method to retrieve only archived item in ItemMapServlet 2020-01-22 15:50:44 +01:00
Kristof De Langhe
ca58c95cdf 67450: while to foreach 2019-11-27 11:44:48 +01:00
Kristof De Langhe
9a6ef42b9f 67450: Discovery clean index fix 2019-11-27 11:37:54 +01:00
nicholas
5144c7bb21 adds meta tag to prevent robots from indexing the item if it is private 2019-11-20 16:32:32 -06:00
Tim Donohue
2975cdbe7b Ensure bin scripts ALWAYS have correct line endings 2019-11-15 12:51:21 -06:00
Andrea Jenis Saroni
5db02cb754 DS-4377 Fix Sherpa RoMEO layout 2019-11-05 10:36:02 +01:00
Luigi Andrea Pascarelli
6a7086258d Merge pull request #2516 from atmire/DS-4342-6x
[DS-4342] improve the performance of the collections/collection_id/items REST endpoint
2019-10-28 18:44:46 +01:00
nicholas
9a9dcf1798 adds missing keys for search filters 2019-10-18 12:01:23 -05:00
Saiful Amin
9506cc4c5b DS-4362: Fix for sites without sub-domain
Fixes broken feedback link on sites without a sub-domain
2019-10-13 19:18:05 +05:30
Tim Donohue
c1aca4e17b Merge pull request #2542 from Georgetown-University-Libraries/ds4356r6
docker build instructions
2019-10-10 11:48:49 -05:00
jonas-atmire
896cebb7ca DS-4271: Replaced brackets with double quotes in SolrServiceImpl. 2019-10-10 11:44:49 +02:00
Terry Brady
9848d2acf2 docker build instructions 2019-10-07 11:27:10 -07:00
Tim Donohue
6ce4013b4e Merge pull request #2540 from Georgetown-University-Libraries/ds4355r6
[DS-4355] 6x: Re-scope .dockerignore excludes
2019-10-04 22:35:02 +02:00
Terry Brady
c357471ee9 Clean up .dockerignore 2019-10-04 13:21:28 -07:00
Tim Donohue
3aa3128dc6 Merge pull request #2534 from terrywbrady/ds4349r6
[DS-4349] 6x: Add Postgres Dockerfiles to code base
2019-10-04 21:06:27 +02:00
Terry Brady
21b691babe Merge branch 'dspace-6_x' into ds4349r6 2019-10-04 10:59:13 -07:00
Tim Donohue
cea23868ac Merge pull request #2523 from terrywbrady/ds4346
[DS-4346] 6x: Add default docker-compose file to code base
2019-10-04 19:06:51 +02:00
Tim Donohue
058f66df0c Merge pull request #2537 from paulo-graca/patch-3
DSpace6: Fixing exception error when using UUID on Harvesting
2019-10-04 16:09:41 +02:00
Paulo Graça
b1611c50da Replacing Collection ID references with UUIDs 2019-10-04 11:59:45 +01:00
Terry Brady
d031667339 Clarify pull/build in README 2019-10-01 15:32:39 -07:00
Terry Brady
67eaebff73 document root dir compose files 2019-10-01 10:38:30 -07:00
Terry Brady
fb1f3b9f29 Docker Compose README 2019-09-30 22:32:09 -07:00
Terry Brady
d93c3dcc9a add docker directory readme 2019-09-30 22:19:19 -07:00
Terry Brady
e4542b757a add license header to scripts 2019-09-30 13:41:53 -07:00
Terry Brady
14ec1db720 add license to dockerfile 2019-09-30 13:25:41 -07:00
Terry Brady
35121a5619 add postgres dockerfiles 2019-09-30 13:20:52 -07:00
Terry Brady
6a830c95d3 update docker-ignore 2019-09-30 13:16:54 -07:00
Terry Brady
2119a450e6 add header, trigger travis rebuild 2019-09-25 15:19:55 -07:00
Terry Brady
0065775ffb Merge pull request #2519 from kshepherd/DS-4144_update_node_npm_mirage2_6x
[DS-4144] Update node and npm versions for mirage 2 (6.x)
2019-09-23 13:47:10 -07:00
Terry Brady
72b874ec58 Update license header 2019-09-23 13:05:43 -07:00
Terry Brady
3a659cd083 Add header, add default image name 2019-09-23 11:21:32 -07:00
Terry Brady
ce16d258cd Add ingest compose files 2019-09-23 10:49:04 -07:00
Terry Brady
3e1bc50575 docker-compose in code base 2019-09-23 10:34:03 -07:00
kshepherd
033df16942 Merge pull request #2513 from 4Science/dspace-6_x
DS-4340 Duplicate Headers when bitstream has a comma in the title
2019-09-22 11:16:23 +12:00
Kim Shepherd
2cc105040a [DS-4144] Update node and npm versions for mirage 2 dependencies 2019-09-17 14:30:40 +12:00
Philip Vissenaekens
c2e6719fa7 DS-4342 2019-09-12 11:54:37 +02:00
Andrea Bollini
84a7a15561 DS-4340 Duplicate Headers when bitstream has a comma in the title (Chrome) - JSPUI Only 2019-09-09 09:40:14 +02:00
kshepherd
40a09f0311 Merge pull request #2362 from Georgetown-University-Libraries/ds4167r6
[DS-4167] 6x Port: Migrate update-sequences.sql to `database` command
2019-09-07 11:51:22 +12:00
kshepherd
f1f2b1a7fe Merge pull request #2476 from terrywbrady/dspace-cli-image
[DS-4321] 6x: docker image dspace/dspace-cli:dspace-6_x
2019-09-07 11:49:47 +12:00
Terry Brady
6af0b1126e update ant archive location 2019-09-06 14:58:33 -07:00
Terry Brady
34b26e2e8b Merge pull request #2510 from J4bbi/ds4336v6
[DS-4336] Point Ant to archived, stable URL
2019-09-06 12:27:03 -07:00
j4bbi
b34428937e Point Ant to archived, stable URL 2019-09-06 19:41:19 +01:00
Tim Donohue
cb7cb4058e Merge pull request #2505 from Georgetown-University-Libraries/ds4336r6
[DS-4336] Update ant version in Docker Build (6x)
2019-09-06 18:04:22 +02:00
Terry Brady
f56090f478 DS-4336 2019-09-05 11:03:18 -07:00
Terry Brady
2e1cc77eb2 update ant version 2019-09-05 10:39:50 -07:00
Philip Vissenaekens
2a2ea0cb5d DS-3849 2019-09-04 15:28:45 +02:00
Terry Brady
16af1f18b0 oai needed for build 2019-09-03 08:56:27 -07:00
Terry Brady
b8baf662d4 exclude webapps 2019-09-02 21:25:48 -07:00
Terry Brady
0c1307b3ca scope maven build 2019-09-02 18:45:43 -07:00
Terry Brady
83369676a8 correct comment 2019-09-02 18:09:21 -07:00
Philip Vissenaekens
9959279ccc DS-4328 2019-08-23 14:12:03 +02:00
Terry Brady
c5ac497c99 fix hidden field issue 2019-08-20 13:10:14 -07:00
Terry Brady
bfb7b747e5 Update comments 2019-08-06 13:30:07 -07:00
Terry Brady
21ab38d714 Update comments 2019-08-06 13:30:07 -07:00
Terry Brady
d7ebc8cb15 undo local.cfg changes 2019-08-06 13:30:06 -07:00
Terry Brady
848e6ade8d simplify CMD 2019-08-06 13:30:06 -07:00
Terry Brady
7aba0a37d8 dspace-cli 2019-08-06 13:30:06 -07:00
Tim Donohue
c23e4bd9ae Merge pull request #2481 from mwoodiupui/2477-6_x
Travis CI: Continue to use Ubuntu Trusty 14.04
2019-08-06 22:23:14 +02:00
Mark H. Wood
1fef8120c0 Tell Travis to boot Trusty Tahr, not Xenial Xerus, so we get Java 8 not 11. 2019-08-06 15:53:40 -04:00
Mark H. Wood
f8da63585b Merge pull request #2292 from Georgetown-University-Libraries/ds4110
DS-4110: fix issue in legacy id cleanup of stats records
2019-08-06 14:00:42 -04:00
Bram Luyten
ee23dcc9e4 Merge pull request #2460 from atmire/DS-3849
[DS-3849] REST API items resource returns items in non-deterministic order
2019-07-25 13:42:24 +02:00
Kim Shepherd
8d1aa33f7b [DS-3849] Default ID 'order by' clause for other 'get items' queries 2019-07-25 17:28:37 +12:00
Bram Luyten
4671202246 Merge pull request #1690 from samuelcambien/dspace-6_DS-3545
DSpace 6 DS-3545 Mirage2: custom sitemap.xmap is ignored
2019-07-05 14:41:03 +02:00
Chris Herron
e37ef608f8 DS-4293 Fix Known/Supported labels in UploadStep/UploadWithEmbargoStep 2019-07-03 12:09:37 -04:00
Philip Vissenaekens
7b888fa558 DS-3849 2019-07-01 11:01:58 +02:00
Tim Donohue
2fdef2cbc5 Merge pull request #2452 from mwoodiupui/DS-4032-6_x
[DS-4032] RequestItem.isAccept_request() can NPE if setAccept_request() was never called.
2019-06-24 19:57:25 +02:00
Mark H. Wood
1b02313303 [DS-4032] Replace 'Boolean' with 'boolean' to avoid NPE. 2019-06-20 12:30:07 -04:00
Alexander Sulfrian
621a0569ed DSpaceOREGenerator: Check permissions
The DSpaceOREGenerator needs to check whether the item is readable by the current
user before generating an XML document.
2019-06-20 15:32:34 +02:00
Alexander Sulfrian
e94467ead6 DSpaceOREGenerator: Remove unused code
This code was copied from the DSpaceMETSGenerator and is not used anymore.
2019-06-20 15:32:31 +02:00
Alexander Sulfrian
c226821988 DSpaceMETSGenerator: Check permissions
Ensure that the metadata of objects is not accessible via the METS file if
the current user doesn't have sufficient permissions.
2019-06-20 15:32:21 +02:00
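The idea behind the two permission-check commits above can be sketched with a deny-by-default guard (hypothetical interfaces, not DSpace's authorization API): a dissemination generator checks READ access before emitting any metadata.

```java
import java.util.Set;

// Hedged sketch (hypothetical types, not DSpace's AuthorizeService):
// dissemination generators should verify READ permission before exposing
// an object's metadata, and deny by default.
public class MetsPermissionGuard {
    static boolean mayDisseminate(Set<String> readableItemIds, String itemId) {
        return readableItemIds.contains(itemId);
    }

    public static void main(String[] args) {
        Set<String> readable = Set.of("item-1", "item-2");
        System.out.println(mayDisseminate(readable, "item-1")); // true
        System.out.println(mayDisseminate(readable, "item-9")); // false
    }
}
```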
Pascal-Nicolas Becker
b22fdc994c DS-4260: Remove non-existing command from oai harvester's help 2019-05-25 11:55:56 +02:00
nicholas
379715f233 removed null check; set result to empty iterator instead 2019-04-08 11:16:45 -05:00
nicholas
e472a2ffd4 added check for null items iterator before concatenation 2019-04-05 10:29:09 -05:00
Mark H. Wood
022fc445d3 Merge pull request #2184 from AndrewZWood/DS-3658
DS-3658 Configure ReindexerThread disable reindex
2019-03-28 12:10:06 -04:00
j4bbi
3063503533 DS-4201
Do not pass starts_with if browsing an item index
2019-03-28 11:54:25 +00:00
Tim Donohue
7253095b62 Add info about how/when auto-reindexing takes place 2019-03-27 15:45:42 -05:00
Bram Luyten
d21b3674f8 preventing escaping for citation_tags for Mirage 1 2019-03-19 10:17:48 +01:00
j4bbi
3cc6bea03e DS-419. Removed comment 2019-03-18 09:30:53 +00:00
j4bbi
55bf1e468b DS-4190. Word-break CSS class breaks words without hyphens in Firefox and Opera mini 2019-03-15 11:09:40 +00:00
Philip Vissenaekens
67d89abd22 DS-4157: Save and Exit on workflow edit metadata 2019-03-12 14:14:47 +01:00
Kristof De Langhe
c949deee8c DS-3881: Show no total results on search-filter 2019-03-12 08:58:05 +01:00
Terry Brady
0104645525 typo 2019-03-08 16:07:09 -08:00
Terry Brady
22ceb0dba0 init migration of PR 2348 2019-03-08 16:03:55 -08:00
Pascal-Nicolas Becker
1352fe5412 [DS-3444] JSPUI must keep shibboleth attributes on session renewal 2019-03-06 16:08:50 +01:00
j4bbi
6be62c02bd DS-4162
Upgrades bower dependency for Mirage2 from 1.7.9 to 1.8.8 because of
a serious security vulnerability, see:
https://snyk.io/blog/severe-security-vulnerability-in-bowers-zip-archive-extraction/
2019-03-05 14:15:12 +00:00
kshepherd
2ffa8edd0c Merge pull request #2306 from r3r57/pr-feature-add-configurable-limit-to-rpp
[DS-4120] Add configurable limit to results per page
2019-02-18 13:04:16 +13:00
Mark H. Wood
c9df9ca66d Merge pull request #2342 from ssolim/ds-4136b---Skip-empty-list-and-dont-add-it-to-solrserver
[DS-4163] XOAI.java skip empty list and don't add it to Solr
2019-02-14 13:22:07 -05:00
Terry Brady
aeb134d601 Merge pull request #2307 from terrywbrady/ds4126
[DS-4126] Optimize docker builds
2019-02-06 10:28:15 -08:00
PTrottier
841bda1c5e Update Dockerfile.dependencies comment
Co-Authored-By: terrywbrady <terrywbrady@gmail.com>
2019-02-06 10:04:16 -08:00
ssolim
8c171f54b0 XOAI.java skip empty list and don't add it to Solr 2019-02-06 09:42:23 +01:00
Philip Vissenaekens
43b50eee97 DS-4157: Save and Exit on workflow edit metadata 2019-02-01 11:10:49 +01:00
Terry Brady
79dec9e8b2 Insert test resources at end of build for caching 2019-01-15 21:06:03 -08:00
Terry Brady
abebea83df Call init_installation, remove ant from final image 2019-01-15 16:27:32 -08:00
kshepherd
a907e02185 Merge pull request #2320 from Georgetown-University-Libraries/ds4136a
[DS-4136] Improve OAI import performance for a large install
2019-01-16 13:05:12 +13:00
Terry Brady
37004bbcf4 make config val 2019-01-15 14:41:34 -08:00
Terry Brady
a6afbd8c78 Sync with recent branch changes 2019-01-14 09:03:53 -08:00
Terry Brady
88e06a5723 Merge branch 'dspace-6_x' into ds4126 2019-01-14 09:02:16 -08:00
kshepherd
e0bc382fc3 Merge pull request #2294 from DSpace/DS-4104-Avoid-crosswalk-of-incorrect-publication-dates-for-google-scholar
DS-4104: Avoid crosswalk of incorrect publication dates for google scholar
2019-01-13 14:38:47 +13:00
Terry Brady
3a1fc9d86a Merge pull request #2322 from J4bbi/DS-4142
DS-4142 updated maven docker build to specify jdk8
2019-01-11 09:19:16 -08:00
Hrafn Malmquist
258780697c put same jdk8 image in doc 2019-01-11 15:59:35 +00:00
Hrafn Malmquist
b43b50888e updated maven docker build to specify jdk8 2019-01-11 11:29:07 +00:00
Terry Brady
3f81daf3d8 optimize xoai processing 2019-01-08 14:52:20 -08:00
Cornelius Matějka
f975ac1d39 Set default maxRpp to 100 2019-01-08 15:26:34 +01:00
Cornelius Matějka
f2db7b8d3b Correct logical flaw 2019-01-06 17:04:47 +01:00
Bram Luyten
27068065e8 DS-4135 prevent escaping for citation_ tags 2019-01-05 15:41:52 +01:00
Tim Donohue
e9a87567c8 DS-4129: Remove unnecessary HarvestConsumer 2019-01-04 18:03:09 +00:00
kshepherd
3087605a5e Merge pull request #2315 from Georgetown-University-Libraries/ds4128
[DS-4115] Update JRuby and SASS version for maven based Mirage2 build
2019-01-04 10:10:51 +13:00
Terry Brady
34be119d79 Note ruby ver 2019-01-03 12:48:22 -08:00
Tim Donohue
99d181d5eb Add description for new jruby.version property 2019-01-03 12:00:35 -06:00
Tim Donohue
645db7409e Comment should be for "before_install" section 2019-01-03 11:58:56 -06:00
Terry Brady
912b27e6e6 update jruby and sass versions 2019-01-03 09:51:31 -08:00
kshepherd
295bf95889 Merge pull request #2309 from terrywbrady/ds4115
[DS-4115] Update Sass Version to Fix Travis Build
2019-01-03 10:45:42 +13:00
Terry Brady
e631fd14ef update sass version 2018-12-21 16:40:16 -08:00
Terry Brady
a13f18f093 Optimize docker builds 2018-12-20 21:16:05 -08:00
Cornelius Matějka
0d0bb2f266 Add configurable limit to results per page 2018-12-19 17:46:35 +01:00
Bram Luyten
5844ee4954 DS-4104 Avoid crosswalk of incorrect publication dates for google scholar 2018-12-13 08:22:34 +01:00
Terry Brady
2fb3751c9a add sort 2018-12-12 12:21:44 -08:00
Terry Brady
e6004e57f0 add comment on pending PR's 2018-12-12 10:56:08 -08:00
Terry Brady
3752247d6a fix id sort issue 2018-12-12 10:19:25 -08:00
Terry Brady
9c16711dbe Merge pull request #2178 from AlexanderS/ds3914
DS-3914: Fix community defiliation
2018-12-06 17:21:48 -08:00
santit96
150e835581 DS-4019 Added a null check when the group of a policy is obtained in the getMostRecentModificationDate() and willChangeStatus() methods of XOAI.java (#2213) 2018-12-03 22:01:16 +01:00
Toni Prieto
58165aa246 Add StripDiacritics to some OrderFormat classes and normalize the user-input text submitted 2018-12-03 10:12:25 +01:00
Terry Brady
4bfa0e27d5 Merge pull request #2261 from the-library-code/DS-4078
DS-4078
2018-11-15 06:55:53 -08:00
Pascal-Nicolas Becker
8504f8d5a1 DS-4078: Bitstreams should keep their formats when being versioned. 2018-11-12 17:42:25 +01:00
Tim Donohue
36f9be298c Merge pull request #2260 from Georgetown-University-Libraries/ds4075-6x
[DS-4075] (6x) correct command line usage for solr-upgrade-statistics-6x
2018-11-08 13:53:36 -06:00
Terry Brady
323feec90a correct usage 2018-11-07 15:25:02 -08:00
Tim Donohue
9fc7f692d5 Merge pull request #2256 from Georgetown-University-Libraries/ds3602d6
[DS-3602] 6x Port: Incremental Update of Legacy Id fields in Solr Statistics
2018-11-07 16:13:31 -06:00
Terry Brady
184f2b2153 port 1810 to 6x 2018-11-02 08:32:56 -07:00
Tim Donohue
f15cb33ab4 Fix DS-4066 by update all IDs to string type in schema 2018-10-29 21:32:57 +00:00
Tim Donohue
d5e6ed90d2 Merge pull request #2242 from mwoodiupui/DS-4031-6x
[DS-4031] Updated link to DRIVER guidelines -- 6_x.
2018-10-18 11:43:20 -05:00
Mark H. Wood
b9dbf11e3c [DS-4031] Updated link to DRIVER guidelines. 2018-10-18 09:35:22 -04:00
Terry Brady
ab9e97c15a Merge pull request #2218 from terrywbrady/ds4012_6x
[DS-4012] port of 4012 to 6x
2018-10-17 11:38:57 -07:00
Terry Brady
c996cb263c pr comments 2018-10-10 08:32:59 -07:00
Terry Brady
05750614d8 port of 4012 to 6x 2018-09-27 22:47:18 -07:00
Terry Brady
90e0f52d0a Merge pull request #2201 from AlexanderS/DS-3664
[DS-3664] ImageMagick: Only execute "identify" on first page
2018-09-14 08:52:51 -07:00
Alexander Sulfrian
33ba419f35 ImageMagick: Only execute "identify" on first page
The Info object used to get the color format runs "identify" on the supplied
input file. If the file has many pages, this process might require some time.
"identify" supports the same syntax for the input file like the other
ImageMagick tools and we can simply restrict the pages by changing the input
file name.

This fixes DS-3664.
2018-09-12 16:55:49 +02:00
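The "changing the input file name" trick the commit body describes relies on ImageMagick's page-selector syntax: appending `[0]` to a file name makes tools like `identify` read only the first page. A minimal sketch (the helper name is hypothetical):

```java
// Hedged sketch: ImageMagick tools accept a page selector appended to the
// input file name, so "identify" can be limited to page 0 of a large PDF
// instead of processing every page.
public class FirstPageSelector {
    static String firstPageOf(String inputFile) {
        return inputFile + "[0]"; // e.g. "thesis.pdf[0]" reads only the first page
    }

    public static void main(String[] args) {
        System.out.println(firstPageOf("thesis.pdf")); // thesis.pdf[0]
    }
}
```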
Andrew Wood
05959ef315 DS-3658 Document new property reflect default behavior in value 2018-08-28 15:18:59 -04:00
Andrew Wood
1d2f10592a DS-3658 Configure ReindexerThread disable reindex 2018-08-28 13:38:50 -04:00
Tim Donohue
9c18b9bcad Merge pull request #2071 from tuub/DS-3919
DS-3919: Rename identifier-service-test.xml to identifier-service.xml
2018-08-21 15:43:32 -05:00
Tim Donohue
e44cd707ab Merge pull request #2072 from AlexanderS/fix-typo-version
Fix typo "verion"
2018-08-21 15:41:36 -05:00
Alexander Sulfrian
b86a7b8d66 CommunityFiliator: Some cleanup
This removes the loops for checking if a community is contained in a list of
communities. Community.equals() does the same check, so we simply can use
contains().
2018-08-21 18:16:20 +02:00
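The cleanup above can be illustrated with a toy model (hypothetical classes, not DSpace's): once `equals()` compares object identity, a hand-written membership loop reduces to a single `List.contains()` call.

```java
import java.util.List;
import java.util.Objects;

// Hedged sketch of the cleanup: when equals() already compares identity/IDs,
// a manual membership loop can be replaced with List.contains().
public class ContainsCleanup {
    static final class Community {
        final String id;
        Community(String id) { this.id = id; }
        @Override public boolean equals(Object o) {
            return o instanceof Community && ((Community) o).id.equals(id);
        }
        @Override public int hashCode() { return Objects.hash(id); }
    }

    static boolean isChild(List<Community> children, Community candidate) {
        return children.contains(candidate); // delegates to Community.equals()
    }

    public static void main(String[] args) {
        List<Community> children = List.of(new Community("a"), new Community("b"));
        System.out.println(isChild(children, new Community("a"))); // true
        System.out.println(isChild(children, new Community("c"))); // false
    }
}
```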
Alexander Sulfrian
19cc971987 DS-3914: Fix community defiliation
This fixes an issue in the defiliate method of the community filiator. The
child and parent relations should be managed using the provided methods of
Community.

This changes the visibility of Community.removeSubCommunity() to public, but
Community.removeParentCommunity() was already public.
2018-08-21 18:16:16 +02:00
Chris Herron
f9e2f0b6a1 Reduce itemCounter init 2018-08-08 09:50:28 -04:00
Tim Donohue
b537b68345 Merge pull request #2134 from terrywbradyC9/dockerDspace6x
[DS-3967] 6x - Migrate Dockerfile to DSpace/DSpace
2018-08-07 14:43:07 -05:00
Terry Brady
f1180aedbd Include Mirage2 in the build 2018-08-02 13:03:11 -07:00
Terry Brady
31b1b57881 Update local.cfg 2018-08-01 19:02:22 -07:00
Terry Brady
7fd9ba218d fix second ant path 2018-08-01 15:51:23 -07:00
Terry Brady
356e275fa3 ant version update 2018-08-01 15:40:42 -07:00
Terry Brady
2059e1b5bf fix path name 2018-08-01 15:38:18 -07:00
Terry Brady
dc4999e996 move Dockerfile, update host 2018-08-01 15:28:33 -07:00
Terry Brady
481bb99d8b add usage 2018-07-25 17:57:24 -07:00
Terry Brady
0f50e50478 move dockerfile into code repo 2018-07-25 16:32:05 -07:00
Mark H. Wood
8a07d5b2c8 Merge pull request #2042 from mwoodiupui/DS-3895-6x
[DS-3895] Bitstream size can't be referenced in HQL queries
2018-07-25 11:52:36 -04:00
Chris Herron
5f3494f518 DS-3873 Limit the usage of PDFBoxThumbnail to PDFs 2018-07-17 14:07:41 -04:00
Toni Prieto
7bf193eb86 Remove starts_with hidden field 2018-07-10 00:25:30 +02:00
Tim Donohue
d07af1ca73 Merge pull request #2106 from KingKrimmson/DS-3939-6_x
DS-3939 OAI-Harvester, skip item and continue if handle is missing
2018-07-09 09:57:10 -05:00
Mark H. Wood
0b76b791db [DS-3453] Manual port of master PR #1683 to 6_x 2018-07-06 12:45:28 -04:00
Chris Herron
57750064ea DS-3939 OAI-Harvester, skip item and continue if handle is missing 2018-07-02 12:46:38 -04:00
Kim Shepherd
4c5b0a78ec [maven-release-plugin] prepare for next development iteration 2018-06-26 17:58:32 +12:00
Kim Shepherd
813800ce17 [maven-release-plugin] prepare release dspace-6.3 2018-06-26 17:58:16 +12:00
Tim Donohue
a91cde6cbc Merge pull request #2099 from kshepherd/DS-3936_bower_registry_needs_updating
DS-3936 bower registry needs updating (blocker)
2018-06-25 13:48:44 -05:00
Toni Prieto
cfe75e6577 Update start-handle-server to delete lock file 2018-06-25 11:41:34 +02:00
Kim Shepherd
1cd1ccc2ab fix typo in json bowerrc 2018-06-24 23:08:16 +00:00
kshepherd
b5f43835b4 Merge pull request #2097 from kshepherd/dspace-6_x
Regenerated LICENSES_THIRD_PARTY file with a few manual tidy-ups
2018-06-25 11:00:21 +12:00
Kim Shepherd
558b36c11d update bower registry to https://registry.bower.io as per official instructions 2018-06-24 22:53:55 +00:00
Kim Shepherd
5791a5cec0 manually switch pgsql jdbc driver from BSD-2-Clause to PostgreSQL License to match past license descriptions / be more descriptive (though it is bsd-2-clause under the hood, according to github) 2018-06-24 22:31:19 +00:00
Kim Shepherd
c0072f19e4 Updated third party licenses in preparation for 6.3 release 2018-06-24 22:23:31 +00:00
kshepherd
6433ce04c4 Merge pull request #2096 from kshepherd/dspace-6_x
Update copyright year range in LICENSE file in preparation for 6.3 release
2018-06-25 10:06:26 +12:00
Kim Shepherd
9353b0a42c Update copyright year range in LICENSE file in preparation for 6.3 release 2018-06-24 22:01:52 +00:00
Tim Donohue
9f80c48e15 Merge pull request #2095 from ItsNotYou/dspace-6_x
DS-3933 Updated Pubmed endpoints from http:// to https://.
2018-06-21 10:29:36 -05:00
Hendrik Geßner
4fbf330de9 DS-3933 Updated Pubmed endpoints from http:// to https://. 2018-06-21 15:22:00 +02:00
kshepherd
adfaae33fd Merge pull request #2082 from mwoodiupui/DS-3795-6x
[DS-3795] Manage versions of some buggy transitive dependencies.
2018-06-21 12:37:48 +12:00
kshepherd
b259dbecc6 Merge pull request #1999 from atmire/DS-3870-S3-bitstore-leaves-connections-to-AWS-open
DS-3870 S3 bitstore leaves connections to AWS open
2018-06-21 12:27:10 +12:00
kshepherd
da0c6e6bda Merge pull request #2080 from AndrewZWood/DS-3930
DS-3930 Remove extra space
2018-06-21 08:18:16 +12:00
kshepherd
5aaebc3b73 Merge pull request #2079 from tdonohue/reapply-DS-3710
DS-3710 Fix ehcache config conflict (Reapply to 6.x)
2018-06-21 08:08:04 +12:00
Mark H. Wood
ade12e54d9 [DS-3795] Manage versions of some buggy transitive dependencies. 2018-06-15 17:28:39 -04:00
Chris Herron
d9f4aa6da6 DS-3710 Fix ehcache config conflict 2018-06-13 16:04:17 +00:00
kshepherd
ff6252d660 Merge pull request #1910 from wgresshoff/DS-3310
Fix authentication problem in SwordV2 implementation (DS-3310).
2018-06-13 11:28:12 +12:00
Andrew Wood
dd61128b14 DS-3930 Remove extra space 2018-06-08 14:52:45 -04:00
kshepherd
c01823119d Merge pull request #1973 from antzsch/dspace-6_x
DS-3856 - foreignkey-constraint community2community_child_comm_id_fkey
2018-06-01 11:13:48 +12:00
Alexander Sulfrian
dea0d818ff Fix typo "verion"
Simple typo fix for the two places where the "s" is missing.
2018-05-31 17:14:31 +02:00
Mark H. Wood
02d348064f Merge pull request #2069 from tdonohue/DS-3498-disable-full-text-snippets
DS-3498 quick fix. Disable full text snippets in search results & add warning
2018-05-30 16:03:00 -04:00
Pascal-Nicolas Becker
632b656ef2 DS-3919: rename identifier-service-test.xml to identifier-service.xml 2018-05-23 10:52:10 +02:00
Tim Donohue
06268855ab DS-3498 quick fix. Disable full text snippets in search results & provide warning. 2018-05-22 21:07:26 +00:00
kshepherd
a9d83925cd Merge pull request #2068 from kshepherd/DS-3840
Fixes for DS-3840
2018-05-22 08:40:10 +12:00
kshepherd
7829eca276 Merge pull request #2067 from kshepherd/DS-3866
Fixes for DS-3866
2018-05-22 08:39:33 +12:00
Kim Shepherd
df244882ef Fixes for DS-3480 2018-05-21 11:18:26 +12:00
Kim Shepherd
22baf1e698 fixes for DS-3866 2018-05-21 11:08:23 +12:00
kshepherd
7b2050e1de Merge pull request #1730 from atmire/DS-2675-Browse-by-jump-to-bugfixes-DSpace6
DS-2675: Bugfixing: Jump to value general errors with order (for 6.x)
2018-05-20 14:27:55 +12:00
kshepherd
815568d926 Merge pull request #1770 from edusperoni/controlled-vocabulary-search6
[DS-3616] Fix nested vocabulary search (dspace-6_x)
2018-05-20 14:06:42 +12:00
kshepherd
e7e249709c Merge pull request #1779 from AlexanderS/remove-submission-policies
DS-3522: Ensure Submission Policies are removed in XMLWorkflow
2018-05-20 12:02:36 +12:00
kshepherd
76d3a7e4a1 Merge pull request #1864 from ssolim/DS-3629---Listing-of-all-Groups-misses-pagination
DS-3629 Listing of all groups misses pagination - XMLUI
2018-05-18 21:24:53 +12:00
kshepherd
9193e9a412 Merge pull request #1835 from AlexanderS/authority-indexer
DS-3681: Refactoring of DSpaceAuthorityIndexer
2018-05-17 11:48:42 +12:00
kshepherd
d65babd069 Merge pull request #2026 from KingKrimmson/DS-3894-6_x
[DS-3894] Fix "No enum constant" exception for facet "View More" links
2018-05-17 11:28:48 +12:00
Tim Donohue
4f453430ec Merge pull request #1755 from AlexanderS/jspui-fix-authority-lookup
DS-3404: JSPUI: Fix authority lookup
2018-05-16 11:01:31 -05:00
kshepherd
513533e3e2 Merge pull request #2039 from tdonohue/DS-3447-ORCID-v2-integration-slim
DS-3447 : ORCID v2 integration (using DSpace/orcid-jaxb-api)
2018-05-14 09:50:08 +12:00
kshepherd
5d6282ebfb Merge pull request #2045 from kshepherd/ds-3377_solr_queries_too_long
[DS-3377] Solr queries too long  (change search GET requests to POST)
2018-05-11 12:34:37 +12:00
Tim Donohue
72d9712555 Update POM with newly released version of orcid-jaxb-api 2018-05-10 20:15:36 +00:00
kshepherd
68b6314e8d Merge pull request #1891 from tuub/DS-3769
DS-3769 Set the right hibernate property of …
2018-05-10 09:08:58 +12:00
Pascal-Nicolas Becker
8e0b2c0bbb Merge pull request #2038 from tuub/DS-3507
DS-3507: Fixes the issues with the colon and Solr special characters …
2018-05-09 23:04:03 +02:00
kshepherd
d032784e59 Merge pull request #1890 from tuub/DS-3768
DS-3768 Fixes the harvest solr parse error by
2018-05-10 09:03:41 +12:00
kshepherd
74ab77f1e9 Merge pull request #1838 from eDissPlus/dspace-6_x
DS-3693: Add plugin to index filenames and file descriptions for files in ORIGINAL bundle
2018-05-09 14:39:14 +12:00
Kim Shepherd
9b981d963c [DS-3377] Fix some newline/spacing issues, add log warning for non-string solr parameters encountered 2018-05-09 11:24:41 +12:00
Kim Shepherd
3d2b5bc03a [DS-3377] Ensure post parameters are added to http request (fixes bug found during review) 2018-05-06 11:41:29 +12:00
Terry Brady
b51043ed84 Merge pull request #2049 from Georgetown-University-Libraries/ds3903
At +2. Merging... [DS-3903] Resolve Jackson Dependency Issues in 6x REST API
2018-05-04 16:09:46 -04:00
Terry W Brady
79ca53954d upgrade jackson and comment 2018-05-04 11:52:10 -07:00
Terry W Brady
cfb758d8ee update version for snyk issues 2018-05-04 11:35:56 -07:00
Terry W Brady
f5306597f4 jackson version issues 2018-05-04 11:01:49 -07:00
Tim Donohue
b8744f3465 Bug fixes to querying for ORCIDs by name 2018-05-04 16:28:59 +00:00
kshepherd
e1b86828b4 Merge pull request #2000 from minurmin/DS-2862
DS-2862 Fix legacy embargo checks on DSpace 6
2018-05-03 18:44:55 +12:00
kshepherd
34a6625e6b Merge pull request #2012 from KingKrimmson/DS-3877-6_x
DS-3877 Trim bitstream name during filter-media comparison
2018-05-03 17:49:23 +12:00
Kim Shepherd
237e06a32e [DS-3377] Explicitly set SolrRequest.METHOD.POST for all regular calls to SolrQuery.query. Removed some debug logging. 2018-05-03 11:26:12 +12:00
Kim Shepherd
2d37722357 [DS-3377] Replace GET with POST in simple 'find dspace object' as well 2018-05-03 10:54:42 +12:00
Kim Shepherd
87300f4108 [DS-3377] Replace GET with POST in simple 'find dspace object' as well 2018-05-03 10:52:29 +12:00
Kim Shepherd
08bf1d8ece [DS-3377] Fixed missing paren in debug logging 2018-05-03 10:43:41 +12:00
Kim Shepherd
c21b2b9899 [DS-3377] Change http client method in SolrServiceImpl to POST instead of GET 2018-05-03 10:40:31 +12:00
Mark H. Wood
580ad8dff3 [DS-3895] Someone snuck a new method into BitstreamService while I wasn't looking. Patched new use of getSize(). 2018-05-02 16:06:43 -04:00
Mark H. Wood
1c911e22d8 [DS-3895] Rename Bitstream.getSize() to .getSizeBytes() 2018-05-02 15:53:48 -04:00
Tim Donohue
8c1580bbe5 Fix default ORCID API URL. Move to correct place in dspace.cfg 2018-05-02 19:30:33 +00:00
Tim Donohue
2904680ab0 Merge pull request #1856 from helix84/DS-3705-reference-fix-pagination-6_x
DS-3705 Recent Submissions in Reference theme completely covered up by navigation (6.x)
2018-05-02 10:43:57 -05:00
kshepherd
a5638327fc Merge pull request #2043 from kshepherd/ds-3694_re-add_mirage2_war_exclusion_dspace-6
[DS-3694] Re-add the assembly exclusion for the Mirage 2 war
2018-05-02 14:33:50 +12:00
Kim Shepherd
17ea05f0d0 [DS-3694] Re-add the assembly exclusion for the Mirage 2 war (accidentally removed during port from master PR) 2018-05-02 11:17:29 +12:00
Tim Donohue
bdd252acc5 Remove unnecessary org.orcid.* classes (accidentally included in a previous commit) 2018-04-30 15:45:17 +00:00
Tim Donohue
51ade53ef3 Update/Refactor to use DSpace/orcid-jaxb-api (currently SNAPSHOT version) 2018-04-27 21:28:57 +00:00
Jonas Van Goolen
0cd69ef817 DS-3447 Removal of incorrectly copied <p> tag + re-add accidentally removed license header 2018-04-27 20:31:39 +00:00
Jonas Van Goolen
e768986e37 DS-3447 Removal of more unused ORCID v1 classes + commenting in new classes 2018-04-27 20:31:08 +00:00
Jonas Van Goolen
a3368dc779 DS-3447 Additional licences + Default orcid.api.url 2018-04-27 20:30:43 +00:00
Jonas Van Goolen
8f13958c2a DS-3447 Orcid v2 implementation + removal of obsolete v1 implementation 2018-04-27 20:30:28 +00:00
Marsa Haoua
a47844ef89 DS-3507: Fixes the issues with the colon and Solr special characters at the end of a searching string by escaping them. 2018-04-26 14:33:30 +02:00
kshepherd
4e10c27e84 Merge pull request #2033 from kshepherd/DS-3300_coverpage_size_dspace6x_port
DS-3300: Add option to select citation page format (LETTER or A4) - dspace-6_x port
2018-04-26 09:21:53 +12:00
kshepherd
5f39765960 Merge pull request #1913 from tuub/DS-3556
[DS-3556] Rollback of Xalan from 2.7.2 to 2.7.0 to fix DS-3556 and DS…
2018-04-26 07:49:59 +12:00
kshepherd
a8e2ff8c07 Merge pull request #1877 from christian-scheible/DS-3734
DS-3734 Fixed missing trigger of item last modified date when adding a bitstream
2018-04-26 07:39:37 +12:00
Ilja Sidoroff
9d0d58ab3f Add option to select citation page format (LETTER or A4) 2018-04-23 23:16:58 +00:00
Tim Donohue
d8e80e20c8 Merge pull request #2017 from kshepherd/ds-3788_dspace-6_x_port
drop indexes, update, recreate
2018-04-19 08:44:55 -05:00
Terry Brady
944e030bd4 Merge pull request #1996 from AndrewBennet/DS-3511
DS-3511: Fix HTTP 500 errors on REST API bitstream updates
2018-04-18 20:09:36 -04:00
kshepherd
ec307a1c78 Merge pull request #2003 from mwoodiupui/DS-3832-v6
[DS-3832] GeoIP-API(com.maxmind.geoip:geoip-api) needs to be replaced by GeoIP2 ( com.maxmind.geoip2:geoip2 )
2018-04-19 09:33:00 +12:00
Chris Herron
ed24a9917e Fix "No enum constant" exception for facet "View More" links 2018-04-18 11:14:46 -04:00
marsaoua
746401bfe5 DS-3702 & DS-3703: Rebuild the old behavior of bitstream versioning in DSpace 6. Bitstreams point to the same file on disk and reuse the internal id of the predecessor item's bitstreams as long as the new bitstreams do not differ from them. Bitstream metadata is also duplicated only for the new version of the item. (#1883) 2018-04-17 14:57:53 -05:00
Mark H. Wood
0835822359 [DS-3832] Don't spew stack traces for simple exceptions. 2018-04-17 10:20:05 -04:00
Miika Nurminen
c549d3abca DS-2862 Remove the call to legacy lifter.liftEmbargo, show legacy embargo check warnings only if the policy has no date
Even if the legacy embargo is used, lifting should be based on policies with dates, so the liftEmbargo call is not needed after all. If the policy has a date, it is assumed to determine the embargo, even if the corresponding metadata field is set differently.
2018-04-16 09:33:53 +03:00
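The policy-date logic described in this commit can be sketched as a plain date comparison: an item stays embargoed while the policy's start date lies in the future, and "lifting" is nothing more than that date passing. The class and method names below are illustrative, not DSpace's actual embargo API:

```java
import java.time.LocalDate;

public class EmbargoCheck {
    /** Returns true if a resource policy with the given start date still embargoes access today. */
    public static boolean isEmbargoed(LocalDate policyStartDate, LocalDate today) {
        if (policyStartDate == null) {
            return false; // no date on the policy: nothing to lift
        }
        return today.isBefore(policyStartDate);
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.of(2018, 4, 16);
        System.out.println(isEmbargoed(LocalDate.of(2019, 1, 1), today)); // true: still embargoed
        System.out.println(isEmbargoed(LocalDate.of(2017, 1, 1), today)); // false: date has passed
    }
}
```

Because the date itself determines access, no separate lifter call is required once policies carry dates.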
kshepherd
26406ec73e Merge pull request #1941 from AlexanderS/fix-xslt-ingestion-crosswalk
[DS-3822] Don't guess XML structure during ingest
2018-04-15 17:51:12 +12:00
kshepherd
ea5d27c65f Merge pull request #1867 from philip-muench/oai-embargo-fix
DS-3707, DS-3715: Fixes to item level embargo/privacy in OAI-PMH
2018-04-15 17:34:56 +12:00
kshepherd
fef9550684 Merge pull request #1914 from atmire/DS-3800
DS-3800 The metadata.hide configuration property does not take into account the boolean value assigned to it.
2018-04-15 12:34:05 +12:00
kshepherd
9edb231be1 Merge pull request #1805 from atmire/DS-3560_6.x
DS-3560 MathJax CDN provider change
2018-04-15 12:07:23 +12:00
Andrew Wood
49041c250e DS-3888 Respect primary bitstreams with text html mime types in item view 2018-04-12 15:06:46 -04:00
Per Broman
8aaa4695b1 [DS-3770] always uncache item after performed curation task for better performance (#1892) 2018-04-11 22:27:26 -05:00
Jacob Brown
70db7006ed drop indexes, update, recreate 2018-04-12 10:18:56 +12:00
kshepherd
da2369229d Merge pull request #1885 from AlexanderS/DS-3756_Submission-Back
DS-3756: Fix "back" on last page of submission
2018-04-12 09:22:50 +12:00
kshepherd
fadb48eb54 Merge pull request #1949 from jonas-atmire/DS-3830-Cache-issue-for-version-creation-link
DS-3830 Item retrieval fallbacks for versioning navigation
2018-04-12 08:56:10 +12:00
kshepherd
483bbd9dc2 Merge pull request #2016 from tdonohue/DS-3883-fixes
DS-3883: Speed up Item summary lists by being smarter about when we load Thumbnails/Bitstreams
2018-04-12 07:28:08 +12:00
Tim Donohue
cf2021aee1 DS-3883: If only including thumbnails, only load the main item thumbnail. 2018-04-10 13:44:24 +00:00
Terry Brady
5ebe6f4b4d Merge pull request #1972 from ssolim/DS-3858-DataCiteConnector-leads-to-org.apache.http.NoHttpResponseException
DS-3858 add HttpRequestRetryHandler to DataCiteCon -- At +2, merging
2018-04-09 14:56:30 -04:00
Tim Donohue
1a62edaefb DS-3883: Don't loop through original bitstreams if only displaying thumbnails 2018-04-09 17:39:01 +00:00
ssolim
ae1920cb43 DS-3858: HttpClientBuilder replaces deprecated
DS-3858: Use HttpClientBuilder as the other methods were deprecated. Commit from pnbecker
2018-04-09 09:42:50 +02:00
kshepherd
8b1cf7d6a4 Merge pull request #2014 from Georgetown-University-Libraries/ds3694
[DS-3694] Clean up EHCache configuration mess (Port to 6x)
2018-04-09 16:27:36 +12:00
Terry W Brady
362b81d2ac update method signature 2018-04-06 16:29:36 -07:00
Terry W Brady
81bcaa47c4 port of PR1940 2018-04-06 16:20:00 -07:00
Chris Herron
b70b170657 DS-3877 Trim bitstream name during filter-media comparison (prevent duplicate bitstream generation) 2018-04-04 09:42:04 -04:00
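The fix in DS-3877 compares bitstream names after trimming whitespace, so a derived bitstream whose stored name carries a stray space is recognized and not regenerated on every filter-media run. A minimal sketch of that comparison, with hypothetical names:

```java
public class MediaFilterNameCheck {
    /** Treat names as equal if they match after trimming surrounding whitespace (hypothetical helper). */
    public static boolean alreadyFiltered(String existingName, String candidateName) {
        if (existingName == null || candidateName == null) {
            return false;
        }
        return existingName.trim().equals(candidateName.trim());
    }

    public static void main(String[] args) {
        // A trailing space previously made this comparison fail and caused a duplicate to be generated.
        System.out.println(alreadyFiltered("thesis.pdf.txt ", "thesis.pdf.txt")); // true
    }
}
```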
Mark H. Wood
825b97f6f5 Merge pull request #2006 from kshepherd/pr1804_dspace6_port
Change Content-Type in OAI-Response (DSpace 6 port)
I'm going to go ahead and merge, since it's a port of a patch already merged on another branch.
2018-03-30 14:29:18 -04:00
Mark H. Wood
9aabe46c33 [DS-3832] Clean up more references to v1 database. 2018-03-29 13:27:56 -04:00
Mark H. Wood
9757989336 [DS-3832] Fetch and use GeoLite v2 City database. 2018-03-29 12:46:20 -04:00
Terry Brady
c4aab55e7c Merge pull request #1977 from TAMULib/DS3775-hibernate-session-bug
[DS-3775] Hibernate session bug when submitting item with Versioning enabled
2018-03-28 15:51:30 -07:00
Saiful Amin
b0a1fb7384 Change Content-Type in OAI-Response
As per OAI 2.0 spec (3.1.2.1) the response content type *must* be text/xml.
http://www.openarchives.org/OAI/openarchivesprotocol.html#HTTPResponseFormat

Our OAI client is rejecting the response from DSpace OAI server.
2018-03-28 22:15:41 +00:00
philip-muench
7587d9bd05 Removing debug output 2018-03-28 23:30:05 +02:00
kshepherd
ad4680a26c Merge pull request #1998 from dineshmendhe1/DS-3875
DS-3875: Fix the attribute name of <identifier> tag to 'identifierType'.
2018-03-29 10:15:12 +13:00
kshepherd
999b2f23be Merge pull request #1982 from philip-muench/dspace-6_x
DS-3865: HTML5 Upload Pause Button does not work
2018-03-29 10:09:26 +13:00
Mark H. Wood
f906915879 [DS-3832] Resolve dependency convergence problems. 2018-03-28 13:13:23 -04:00
Mark H. Wood
8be78e6430 [DS-3832] Fix ElasticSearch too. 2018-03-28 12:40:01 -04:00
Mark H. Wood
faf5bd1f94 [DS-3832] Recast test support classes. 2018-03-28 11:56:31 -04:00
Mark H. Wood
4ea7575e4a [DS-3832] Upgrade to GeoIP2. 2018-03-28 11:52:38 -04:00
Kim Shepherd
7bf2f36300 DS-3707, DS-3715: Final name update: removed verbose method name / test number info as per Slack discussion (we won't try to force uniqueness on JUnit assertions) 2018-03-28 10:27:50 +13:00
Kim Shepherd
f354979777 DS-3707, DS-3715: Some update to name text, appending function and test name just to ensure uniqueness (this won't look so ugly if we follow same convention for shorter method names). Line breaks added beneath while blocks. 2018-03-28 10:19:23 +13:00
Kim Shepherd
7b1010cc36 DS-3707, DS-3715: Make test assertion names more descriptive. Added an extra test case to ensure non-discoverable items are not returned by the 'discoverable' version of the find 2018-03-28 10:08:31 +13:00
Kim Shepherd
24ba583921 DS-3707, DS-3715: Add unit tests for existing findInArchiveOrWithdrawnDiscoverableModifiedSince ItemService method and new findInArchiveOrWithdrawnNonDiscoverableModifiedSince ItemService method. Tests ensure that items are returned whether withdrawn or archived, that passing the 'since' timestamp works as expected, and that we get expected results when the item is discoverable and non-discoverable 2018-03-26 10:36:17 +13:00
Miika Nurminen
1f76a54384 DS-2862 Activate legacy embargo lifter, fix conditions on embargo check warnings 2018-03-24 00:58:54 +02:00
benbosman
a3774944ad DS-3870 S3 bitstore leaves connections to AWS open 2018-03-22 16:23:41 +01:00
Dinesh Mendhe
d885ec0d4e DS-3875: Fix the attribute name of <identifier> tag to 'identifierType'.
https://jira.duraspace.org/browse/DS-3875
2018-03-21 17:58:48 -04:00
Tim Donohue
b5aba21902 Merge pull request #1969 from tdonohue/lastest_postgres_jdbx_6x
DS-3854: Update to latest PostgreSQL JDBC driver
2018-03-21 13:48:42 -05:00
Miika Nurminen
17fb6cab87 DS-3511: Add context.complete() to updateBitstreamData and deleteBitstreamPolicy to prevent HTTP 500 response on REST updates 2018-03-21 13:49:16 +00:00
Terry Brady
0506d2ffe3 Merge pull request #1934 from Georgetown-University-Libraries/ds3811r6
[DS-3811] Integrate Shibboleth into DSpace REST Report Tools
2018-03-19 15:17:36 -07:00
Terry W Brady
df16cde989 define shib path at top of file 2018-03-19 14:39:41 -07:00
jsavell
5b75b0a3b9 reload dso when getting handle from HandleUtil 2018-03-19 10:30:08 -05:00
jsavell
ff2305ee51 Revert "reload dso before adding to validity"
This reverts commit b1d56059cd.
2018-03-19 09:57:17 -05:00
Terry Brady
79d94d92f9 Merge branch 'dspace-6_x' into ds3811r6 2018-03-15 14:38:07 -07:00
kshepherd
78d4fb14c1 Merge pull request #1854 from Georgetown-University-Libraries/ds3704
[DS-3704] Expand DSpace REST Reports to include bitstream fields in item listing
2018-03-16 10:16:51 +13:00
kshepherd
d6f35fbda1 Merge pull request #1862 from Georgetown-University-Libraries/ds3713
[DS-3714] REST Collection Report - Need a paginated findByCollection call that can return withdrawn items
2018-03-16 09:51:19 +13:00
kshepherd
077a6e99d6 Merge pull request #1863 from Georgetown-University-Libraries/ds3714
[DS-3713] REST Query/Collection Report - Bug Filtering for Bitstream Permissions
2018-03-15 22:18:23 +13:00
kshepherd
daee0646da Merge pull request #1845 from jrihak/shibboleth-authentication-fix
[DS-3662] DSpace 'logging in' without password or with non-existent e-mail using Shib and Password authentication
2018-03-15 11:48:47 +13:00
kshepherd
6787983574 Merge pull request #1860 from KingKrimmson/dspace-6_x
DS-3710 Fix ehcache config conflict
2018-03-15 11:44:23 +13:00
Philip Vissenaekens
7592f48064 DS-3800 2018-03-07 17:05:03 +01:00
philip-muench
a61c21f216 Update choose-file.jsp
Make pause button work again
2018-03-07 15:18:40 +01:00
jsavell
b1d56059cd reload dso before adding to validity 2018-03-02 16:33:56 -06:00
Stefan
1c17e8e475 DS-3856 - foreignkey-constraint community2community_child_comm_id_fkey
Move the deletion of the parent-child relationship to the rawDelete() method.
2018-03-01 10:15:57 +01:00
ssolim
7ed10610de use StandardHttpRequestRetryHandler instead of override 2018-02-28 16:05:36 +01:00
ssolim
7ce43d4027 DS-3858 add HttpRequestRetryHandler to DataCiteCon 2018-02-28 11:30:50 +01:00
Stefan Fritzsche
17907bf442 DS-3856 - foreignkey-constraint community2community_child_comm_id_fkey 2018-02-28 11:00:54 +01:00
Tim Donohue
1b21e0baef Update to latest PostgreSQL JDBC driver 2018-02-26 20:31:07 +00:00
Tim Donohue
7bf537e3ab Merge pull request #1966 from mwoodiupui/DS-3852-6_x
[DS-3852] OAI indexer message not helpful in locating problems
2018-02-26 09:32:49 -06:00
Mark H. Wood
78ed97c78d [DS-3852] Give more information about the item just indexed, to help identify it in case of problems.
07b050c
2018-02-24 17:39:44 -05:00
ihausmann
fbb0b73b61 DS-3404: JSPUI: Fix authority lookup
My colleague Eike made a mistake fixing the bug. His solution only works for repeatable fields.
 
https://jira.duraspace.org/browse/DS-3404
https://github.com/DSpace/DSpace/pull/1755
2018-02-23 12:36:27 +01:00
Tom Desair (Atmire)
a33e886cf0 Merge pull request #1848 from tuub/DS-3700
DS-3700: MediaFilterServiceImpl forgot to close an input stream.
2018-02-21 16:09:36 +01:00
kshepherd
48b1ac8e18 Merge pull request #1951 from Georgetown-University-Libraries/ds3835
[DS-3835] Add js work around to preserve current scope
2018-02-20 18:23:45 +13:00
kshepherd
fb08f721c5 Merge pull request #1960 from hardyoyo/DS-3839-revised
[DS-3839] moved the autoorient IM op to the top of the operations lis…
2018-02-20 16:04:18 +13:00
Hardy Pottinger
d1edfb5f85 [DS-3839] moved the autoorient IM op to the top of the operations list, where it belongs 2018-02-19 11:17:48 -06:00
Hardy Pottinger
2c5dcf13ef [DS-3839] added op.autoOrient to ImageMagickThumbnailFilter (#1956) 2018-02-16 07:59:46 -06:00
Pascal-Nicolas Becker
2754a7bad8 Merge pull request #1946 from philip-muench/dspace-6_x
DS-3827 LazyInitializationException in formats.jsp
2018-02-15 11:48:54 +01:00
Terry W Brady
6275a59f24 add Mirage fix 2018-02-08 16:30:03 -08:00
Terry W Brady
409075e447 Add js work around to preserve current scope 2018-02-08 14:56:03 -08:00
Mark H. Wood
4cdfd34a08 Merge pull request #1925 from mwoodiupui/DS-3434
[DS-3434] DSpace fails to start when a database connection pool is supplied through JNDI
2018-02-07 10:25:48 -05:00
Jonas Van Goolen
5684f24944 DS-3830 Item retrieval fallbacks for versioning navigation 2018-02-06 08:43:29 +01:00
Mark H. Wood
2b55ec5c63 Merge pull request #1917 from alanorth/6_x-remove-jndi-dspacecfg
DS-3803 Remove db.jndi setting from dspace.cfg
Should be ported to master as well.
2018-02-05 10:53:20 -05:00
philip-muench
5ed105974b Update formats.jsp
This JSP produces an internal error due to a LazyInitializationException. The exception occurs because the DSpace context is aborted before all database queries have completed. JIRA ticket: DS-3827
2018-02-05 13:07:38 +01:00
Tim Donohue
81af7f47a7 DS-3795: Update Apache POI library to latest version 2018-02-02 16:09:37 +00:00
Alexander Sulfrian
de1e26b3ee [DS-3822] Don't guess XML structure during ingest
The XML document used during ingestion can contain multiple XML nodes directly
inside the XML root. The crosswalk should not modify the source document, but
only hand it over to the XSLT stylesheet.
2018-01-30 19:09:45 +01:00
Tim Donohue
d90e1667b6 Merge pull request #1937 from tdonohue/bump-commons-fileupload-version
[DS-3795] increased version for commons-fileupload (backport to 6.x)
2018-01-29 10:14:26 -06:00
Tim Donohue
d9c7ac61e9 Merge pull request #1938 from tdonohue/DS-3795-bump-jackson-version
[DS-3795] bumped google-http-client-jackson2 to 1.23.0
2018-01-29 10:14:05 -06:00
Hardy Pottinger
5d8e34c0c3 [DS-3795] whoops, let's pick an actual version number for google-api-services-analytics, heh 2018-01-26 22:34:21 +00:00
Hardy Pottinger
47a5898fd9 [DS-3795] bumped other Google API dependencies to 1.23.0, as per suggestion of tdonohue 2018-01-26 22:34:13 +00:00
Hardy Pottinger
b16b116f54 [DS-3795] bumped google-http-client-jackson2 to 1.23.0 2018-01-26 22:34:04 +00:00
Hardy Pottinger
9654ea87c9 increased version for commons-fileupload 2018-01-26 22:31:46 +00:00
Terry W Brady
e2975e26ed Eliminate bypass authentication checks 2018-01-25 11:31:10 -08:00
Terry W Brady
061298640b Enable shibb authentication in rest tools 2018-01-25 11:30:44 -08:00
Mark H. Wood
e2a771d10d [DS-3434] Look up a bean implementing DataSource instead of accepting any old Object. This should be common to all implementations. 2018-01-23 11:10:51 -05:00
Mark H. Wood
d37670776b [DS-3434] Make some documentation more visible. 2018-01-19 14:54:24 -05:00
Mark H. Wood
49e9e3817e [DS-3434] Look up generic object instead of a specific DataSource subclass. 2018-01-19 14:06:58 -05:00
Mark H. Wood
e640106468 Merge pull request #1820 from mwoodiupui/DS-3667
[DS-3667] Document fundamental persistence support classes.
2018-01-18 15:32:34 -05:00
Alan Orth
3bb04dac4c DS-3803 Remove db.jndi setting from dspace.cfg
As of DSpace 6.x this setting is no longer used and is not customizable
by the user. Now DSpace always looks for a pool named "jdbc/dspace" in
JNDI and falls back to creating a pool with the db.* settings located
in dspace.cfg.

See: https://wiki.duraspace.org/display/DSDOC6x/Configuration+Reference
See: dspace/config/spring/api/core-hibernate.xml
See: https://jira.duraspace.org/browse/DS-3434
2018-01-16 19:24:12 +02:00
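The lookup-then-fallback behaviour this commit message describes can be sketched with the JDK's own JNDI API: try the fixed name first, and build a pool from the db.* settings only if no container-provided DataSource exists. The JNDI name below follows the commit message; the return values are placeholder strings standing in for real DataSource objects:

```java
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class PoolLookup {
    /**
     * Always look for a container-provided pool under the fixed JNDI name first;
     * if none is bound (NamingException), fall back to the db.* configuration.
     */
    public static String findPool() {
        try {
            InitialContext ctx = new InitialContext();
            Object ds = ctx.lookup("java:comp/env/jdbc/dspace");
            return "JNDI pool: " + ds;
        } catch (NamingException e) {
            // No JNDI provider or nothing bound: create a pool from db.* settings in dspace.cfg.
            return "pool built from db.* settings";
        }
    }

    public static void main(String[] args) {
        // Outside a servlet container there is no JNDI environment, so the fallback branch runs.
        System.out.println(findPool());
    }
}
```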
Philip Vissenaekens
1ea55a4fe6 DS-3800 2018-01-10 10:18:18 +01:00
Per Broman
31a613cb1a [DS-3556] Rollback of Xalan from 2.7.2 to 2.7.0 to fix DS-3556 and DS-3733 2018-01-06 09:37:25 +01:00
gressho
bda2f8709c Fix authentication problem in SwordV2 implementation (DS-3310). 2018-01-04 10:59:35 +01:00
Jonas Van Goolen
d9ad114879 DS-3791 Make sure the "yearDifference" takes into account that a gap of 10 years contains 11 years 2017-12-20 15:13:45 +01:00
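The off-by-one this commit fixes is the classic inclusive-range count: the span 2000–2010 has a difference of 10 years but covers 11 calendar years. A one-line illustration (names are illustrative, not DSpace's API):

```java
public class YearRange {
    /** An inclusive span from startYear to endYear contains (end - start + 1) years. */
    public static int yearCount(int startYear, int endYear) {
        return endYear - startYear + 1;
    }

    public static void main(String[] args) {
        System.out.println(yearCount(2000, 2010)); // 11: a "gap of 10 years" spans 11 calendar years
    }
}
```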
Hardy Pottinger
ff69c1fa8c [DS-3087] removed inlineMath setting from MathJax config in Mirage and Mirage2 page-structure.xsl files, the defaults are sensible and preferable (#1896) 2017-12-12 20:18:25 -06:00
Hardy Pottinger
8f85b764f4 [DS-3757] increase default clamav socket timeout to 6 minutes (#1886) 2017-11-27 10:24:07 -06:00
marsa
83ec310d0e DS-3769 Set the right hibernate property of org.dspace.eperson.Subscription: ePerson instead of eperson.id 2017-11-27 17:12:18 +01:00
marsa
38d951062c DS-3768 Fixes the harvest solr parse error by setting the requirements and syntaxes of the field location and lastModified properly. 2017-11-27 16:42:28 +01:00
samuel
71c68f2f54 DS-3545 mirage2: custom sitemap.xmap is ignored 2017-11-22 17:26:10 +01:00
Lotte Hofstede
14eef1b409 DS-3560: update deprecated MathJax url for 6.x 2017-11-21 11:03:58 +01:00
Mark H. Wood
10dc184824 Merge pull request #1831 from AlexanderS/fix/multiple-use-vocabulary
DS-3682: Fix reusing of the same vocabulary dialog
2017-11-20 10:23:55 -05:00
Mark H. Wood
f8244980f0 Merge pull request #1839 from Generalelektrix/dspace-6_x
[DS-3332] Handle resolver is hardcoded in org.dspace.handle.UpdateHandlePrefix
2017-11-20 10:22:02 -05:00
Martin Walk
1ba1a17c52 Empty commit to trigger Travis CI 2017-11-17 14:39:54 +01:00
Martin Walk
9ce4653ffd Add search filters
- add search filter for original_bundle_filenames and original_bundle_descriptions to discovery.xml
- add messages
2017-11-17 11:37:31 +01:00
marsaoua
1b90001420 DS-3729: Set the Bitstream deletion flag in the database in case of an item deletion (#1874) 2017-11-16 13:42:32 -06:00
Alexander Sulfrian
cf45326276 DS-3756: Fix "back" on last page of submission
The back button on the last page (the maximum reached) of the current
submission should ignore required fields. Because maxStepAndPage is a Java
object, equals has to be used to compare instances.
2017-11-15 16:59:23 +01:00
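The bug boils down to comparing two value objects with `==`, which checks reference identity: two distinct instances representing the same step and page compare unequal unless `equals` is used. A minimal stand-in for the StepAndPage value object (fields and names are hypothetical, not DSpace's actual class):

```java
import java.util.Objects;

public class StepAndPage {
    private final int step;
    private final int page;

    public StepAndPage(int step, int page) {
        this.step = step;
        this.page = page;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof StepAndPage)) return false;
        StepAndPage other = (StepAndPage) o;
        return step == other.step && page == other.page;
    }

    @Override
    public int hashCode() {
        return Objects.hash(step, page);
    }

    public static void main(String[] args) {
        StepAndPage current = new StepAndPage(3, 1);
        StepAndPage max = new StepAndPage(3, 1);
        System.out.println(current == max);      // false: distinct instances
        System.out.println(current.equals(max)); // true: same logical position
    }
}
```

With `==`, the "is this the last page?" check silently fails even when the positions match.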
Mark H. Wood
4304d8a872 Merge pull request #1870 from AlexanderS/missing-readonly
DS-3724: Missing readonly in input-forms.dtd

This should be ported to DSpace 7, but note that the submission configuration is being refactored.
2017-11-01 12:11:35 -04:00
Christian Scheible
5debb078d0 DS-3734 Fixed missing trigger of item last modified date when adding a bitstream. 2017-10-27 14:06:32 +02:00
Alexander Sulfrian
2bf07661bf DS-3724: Missing readonly in input-forms.dtd 2017-10-23 15:24:19 +02:00
Terry W Brady
63e6823b62 clarify methods that return all vs archived items 2017-10-16 08:51:35 -07:00
Philip Muench
6f892e70e8 DS-3707, DS-3715: Fixes to item level embargo/privacy in OAI-PMH 2017-10-16 15:28:47 +02:00
ssolim
e5cead0063 change decodefromurl default value to normal behaviour
DS-3629
2017-10-06 09:00:26 +02:00
ssolim
dd5a277f7b Fix logical error in searchResultCount
GroupServiceImpl's searchResultCount always returned 1 when no search term was submitted, but it was intended to return the count of all groups
2017-10-06 08:58:32 +02:00
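The intended behaviour is that an empty search term means "all groups", so the count must be the total number of groups rather than a constant 1. A hedged sketch of the corrected logic, using a plain list in place of the real group DAO:

```java
import java.util.Arrays;
import java.util.List;

public class GroupSearch {
    /**
     * Corrected count as described above: an empty query returns the total
     * group count; a non-empty query counts matching groups (hypothetical helper).
     */
    public static int searchResultCount(List<String> allGroups, String query) {
        if (query == null || query.trim().isEmpty()) {
            return allGroups.size();
        }
        String needle = query.toLowerCase();
        return (int) allGroups.stream()
                .filter(g -> g.toLowerCase().contains(needle))
                .count();
    }

    public static void main(String[] args) {
        List<String> groups = Arrays.asList("Administrator", "Anonymous", "Reviewers");
        System.out.println(searchResultCount(groups, ""));    // 3, not 1
        System.out.println(searchResultCount(groups, "adm")); // 1
    }
}
```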
Terry W Brady
e8b0a1d86b Call findAll/countAll to include withdrawn items 2017-10-05 16:22:55 -07:00
Terry W Brady
da5c795804 Add countAllItems methods 2017-10-05 16:22:11 -07:00
Terry W Brady
7f9e2d7bb0 convert bundle enum to string 2017-10-05 13:53:13 -07:00
Terry W Brady
aa0ced3d10 add pagination methods to find all 2017-10-05 13:50:48 -07:00
Chris Herron
9e6768241b DS-3710 Fix ehcache config conflict 2017-10-05 11:07:44 -04:00
Ivan Masár
c30018b089 DS-3705 Recent Submissions in Reference theme completely covered up by navigation 2017-10-04 10:50:45 +02:00
Terry W Brady
5c334351fa clean up css 2017-10-03 15:09:10 -07:00
Pascal-Nicolas Becker
23f2573460 Merge pull request #1851 from tuub/DS-3627
DS-3627: Cleanup utility leaves files in assetstore
2017-09-29 17:46:22 +02:00
Pascal-Nicolas Becker
179141dc4a DS-3700: MediaFilterServiceImpl forgot to close an input stream. 2017-09-29 16:47:58 +02:00
Terry W Brady
fd1afab6fc Merge bitstream fields into report 2017-09-28 17:22:13 -07:00
Pascal-Nicolas Becker
e59611b5c7 DS-3627: Cleanup utility leaves files in assetstore 2017-09-28 08:11:53 +02:00
Jakub Řihák
968487b9d2 [DS-3662] DSpace 'logging in' without password or with non-existent e-mail using Shib and Password authentication
Added an extra check for an empty attribute value.
If the value is empty, it should not be returned; return 'null' instead.
This prevents passing an empty value to other methods, stops the authentication process,
and prevents the creation of an 'empty' DSpace EPerson (and its subsequent
authentication) if autoregister == true.
2017-09-21 17:40:13 +02:00
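The check this commit adds amounts to treating an empty attribute value the same as a missing one, so the authentication chain stops instead of auto-registering an "empty" EPerson. A minimal sketch; the method name is illustrative, not the actual Shibboleth authenticator API:

```java
public class ShibAttribute {
    /**
     * Return the attribute value only if it is non-empty; an empty value is
     * normalized to null so callers treat it as absent (hypothetical helper).
     */
    public static String findAttribute(String rawValue) {
        if (rawValue == null || rawValue.isEmpty()) {
            return null;
        }
        return rawValue;
    }

    public static void main(String[] args) {
        System.out.println(findAttribute(""));              // null: empty is treated as missing
        System.out.println(findAttribute("user@test.edu")); // user@test.edu
    }
}
```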
Martin Walk
a3ea6d5df8 Add expected license header 2017-09-14 09:21:47 +02:00
Generalelektrix
5948e33517 DS-3332
Centralized most references to http://hdl.handle.net/ and to handle.canonical.prefix in HandleService. Created a new method, getCanonicalPrefix(), in HandleService and adjusted getCanonicalForm(). As far as I can tell, the remaining references to http://hdl.handle.net/ should stay as they are, since they are used as default values or only appear in documentation sections.
2017-09-12 15:12:12 -04:00
Martin Walk
aa69b2220a Add plugin to index filenames and file descriptions for files in ORIGINAL bundle 2017-09-12 13:01:11 +02:00
Generalelektrix
d7a0e0f560 Merge remote-tracking branch 'upstream/dspace-6_x' into dspace-6_x 2017-09-08 11:33:06 -04:00
Mark H. Wood
fc3ea83049 [maven-release-plugin] prepare for next development iteration 2017-09-07 16:14:13 -04:00
Mark H. Wood
e5cb62997a [maven-release-plugin] prepare release dspace-6.2 2017-09-07 16:14:03 -04:00
Mark H. Wood
ff7c7e3d6d Regenerate third-party license list 2017-09-07 15:16:09 -04:00
Mark H. Wood
068047cb11 Update LICENSE copyright claim with current year. 2017-09-07 15:08:17 -04:00
Terry Brady
0e7c7c0886 Merge pull request #1836 from Generalelektrix/dspace-6_x
DS-3687
2017-09-06 12:14:19 -07:00
Generalelektrix
a814d177e5 Merge remote-tracking branch 'upstream/dspace-6_x' into dspace-6_x 2017-09-06 10:45:52 -04:00
Generalelektrix
6626901564 DS-3687
Make the key generic for the legacy note value since it is not only used in JSPUI.
2017-09-06 10:28:14 -04:00
Generalelektrix
6b8f072d3e DS-3687 Hard-coded note not compatible with multi-lingual sites for legacy stats
Changed the hard-coded string to a reference to a new field in the language bundle.
2017-09-06 10:18:50 -04:00
Alexander Sulfrian
41d8668331 AuthorityIndexer: Uncache Item after getting authority values
The AuthorityIndexClient walks over all Items in the repository and generates
all missing AuthorityValues for all of them. After getting the missing values
we need to uncache the Items to free the memory.
2017-08-31 18:52:29 +02:00
Alexander Sulfrian
0082e5b7da AuthorityIndexer: Only use non-null authority values 2017-08-31 18:52:29 +02:00
Alexander Sulfrian
3cb60236d0 AuthorityIndexer: Add possibility to cache new values
Now the DSpaceAuthorityIndexer can cache authority values for specific metadata
content (without an authority key) again.
2017-08-31 18:49:49 +02:00
Alexander Sulfrian
64022a92fb AuthorityIndexer: Remove state
The DSpaceAuthorityIndexer now returns all AuthorityValues for the specified
Items at once. It iterates over all values of all metadata fields (that are
configured for authority) of the item and gets or generates the authority
values for all of them. So it can be called from various threads without the
possibility for concurrent data access.

This currently removes the cache for newly generated authority values.
2017-08-31 18:48:11 +02:00
Alexander Sulfrian
c3214d6f77 AuthorityIndexer: Only handle one item at a time
The DSpaceAuthorityIndexer should only handle one item at a time. If the caller
needs multiple items to be indexed, the Indexer should be called multiple
times.
2017-08-31 18:48:11 +02:00
Alexander Sulfrian
889499105f AuthorityIndexer: Remove unused parameter 2017-08-31 18:48:07 +02:00
Tim Donohue
fa587c52ed Merge pull request #1830 from tuub/DS-3680
DS-3680: Database changes of consumers aren't persisted anymore
2017-08-31 06:41:19 +10:00
Alexander Sulfrian
6a14047db6 vocabulary-support: Fix reusing of the same dialog
If the same vocabulary is used multiple times on the same input page, the
dialog is only fetched the first time from the server and reused afterwards.
The problem is, that the target input field is contained in the dialog. When
reusing the dialog the target field should be updated.
2017-08-25 14:48:46 +02:00
Pascal-Nicolas Becker
d753c09b22 DS-3680: Remove problematic uncaching. Also see DS-3681 as follow-up. 2017-08-24 18:25:41 +02:00
Pascal-Nicolas Becker
fbb45ba758 DS-3680: clarify that we need to dispatch events before committing 2017-08-24 18:25:20 +02:00
Pascal-Nicolas Becker
014456e1ed Revert "Events must be dispatched after commit() to ensure they can retrieve latest data from DB"
This reverts commit 646936a3d8.
2017-08-24 18:22:58 +02:00
Terry Brady
258b4f00e9 [DS-3602] Ensure Consistent Use of Legacy Id in Usage Queries (#1782)
* ensure that owning Item,Coll,Comm use legacy consistently

* scopeId query

* refine queries

* alter id query

* Commenting the behavior of the id / legacyId search

* Address duplicate display for DSOs with both legacy and UUID stats
2017-08-17 23:48:25 +10:00
Tim Donohue
3798a12778 Merge pull request #1824 from tdonohue/DS-3656_and_DS-3648
DS-3656 and DS-3648 : Fix several Hibernate caching / saving issues
2017-08-17 06:53:28 +10:00
Hardy Pottinger
bc82adef5e [DS-3674] copied over input-forms.xml to the test config folder 2017-08-15 14:43:41 -05:00
Mark H. Wood
28bbf4b930 [DS-3667] Take up PR comments and document another class. 2017-08-11 17:02:14 -04:00
Tim Donohue
d4d61eed68 Replace dispatchEvents() call with an actual commit() to ensure changes are saved 2017-08-10 21:27:34 +00:00
Tim Donohue
646936a3d8 Events must be dispatched after commit() to ensure they can retrieve latest data from DB 2017-08-10 21:27:00 +00:00
Tim Donohue
9dd6bb0f08 DS-3648: Don't uncache submitter and related groups. Also DS-3656: Flush changes before evict() 2017-08-10 21:25:38 +00:00
Terry Brady
0e2ed31deb Merge pull request #1821 from Georgetown-University-Libraries/ds3661r6x
[DS-3661] Port to 6x: ImageMagick PDF Processing Degraded with Color Space Changes
2017-08-09 13:18:55 -07:00
Terry W Brady
1492dfef92 Normalize space 2017-08-09 13:02:04 -07:00
Terry W Brady
8b6c1acab1 Port PR1817, Only request image info if color space 2017-08-09 13:01:17 -07:00
Alan Orth
e88924b7da DS-3517 Allow improved handling of CMYK PDFs
Allow ImageMagick to generate thumbnails with more accurate colors
for PDFs using the CMYK color system. This adds two options to the
dspace.cfg where the user can optionally specify paths to CMYK and
RGB color profiles if they are available on their system (they are
provided by Ghostscript 9.x).

Uses im4java's Info class to determine the color system being used
by the PDF.

See: http://im4java.sourceforge.net/docs/dev-guide.html
2017-08-09 19:45:28 +00:00
Terry Brady
42608e028e Merge pull request #1816 from AlexanderS/fix-discovery-reindex
DS-3660: Fix discovery reindex on metadata change
2017-08-09 12:08:31 -07:00
Mark H. Wood
801f39daeb [DS-3667] Document fundamental persistence support classes. 2017-08-08 23:46:52 -04:00
Alexander Sulfrian
7e68165ded DS-3660: Fix discovery reindex on metadata change
Stored objects may get evicted from the session cache and get into detached
state. Lazy loaded fields are inaccessible and throw an exception on access.

Before using objects they have to be reloaded (retrieved from the
database and associated with the session again).
2017-08-03 16:25:39 +02:00
Tim Donohue
cfecf10e81 Merge pull request #1815 from tdonohue/DS-3659
DS-3659: Database migrate fails to create the initial groups
2017-08-03 23:51:47 +10:00
Alexander Sulfrian
5d656ea922 XMLUI: Remove doubled translation key (#1818)
The key "xmlui.ChoiceLookupTransformer.lookup" is already in line 2368 of the
same file.
2017-08-03 15:23:49 +02:00
Tim Donohue
62e2ac81fb Merge pull request #1814 from AlexanderS/fix/i18n-key-typo
XMLUI/SwordClient: Fix typo in i18n key
2017-08-02 07:33:05 +10:00
Tim Donohue
e9ace604a7 DS-3659: Ensure readonly connections can never rollback 2017-08-01 18:00:28 +00:00
Alexander Sulfrian
7f91528c1a XMLUI/SwordClient: Fix typo in i18n key 2017-07-25 15:21:10 +02:00
Tim Donohue
4881e9da20 [maven-release-plugin] prepare for next development iteration 2017-07-13 12:15:12 -05:00
Tim Donohue
eb4d56201a [maven-release-plugin] prepare release dspace-6.1 2017-07-13 12:15:02 -05:00
Tim Donohue
df9fb114ba Merge pull request #1807 from tdonohue/travis-fixes
Pin versions of SASS and Compass that Travis CI uses
2017-07-14 02:58:35 +10:00
Tim Donohue
f3556278aa Pin versions of SASS and Compass that Travis uses 2017-07-13 16:28:35 +00:00
Tim Donohue
f6af76c6d8 Revert 6.1 release 2017-07-13 14:15:21 +00:00
Tim Donohue
151a5f8fe2 [maven-release-plugin] prepare for next development iteration 2017-07-12 20:55:13 +00:00
Tim Donohue
57044f6698 [maven-release-plugin] prepare release dspace-6.1 2017-07-12 20:55:07 +00:00
Tim Donohue
4954f96f1d Merge pull request #1785 from atmire/DS-3127-DSpace-6_Whitelist-allowable-formats-Google-Scholar-citation_pdf_url
DS-3127 Whitelist allowable formats google scholar citation pdf url
2017-07-12 06:40:45 +10:00
Tim Donohue
972f76e771 Merge pull request #1790 from tomdesair/DS-3632_Correct-update-handle-prefix-script
DS-3632: Correct update-handle-prefix script
2017-07-12 06:27:08 +10:00
Tim Donohue
e30b0cdec6 DS-3431 : Fix broken tests by removing nullifying of global eperson 2017-07-11 16:13:25 +00:00
Pascal-Nicolas Becker
a0f226b763 [DS-3431] Harden DSpace's BasicWorkflowService 2017-07-11 16:10:08 +00:00
Alexander Sulfrian
ec4ae3319d DS-3643: Use correct license on ingest
If a package contains no license on ingest the packager will add the default
license of the collection. If the collection has no specific license set, the
global default license should be used.

Collection.getLicenseCollection() does not fallback to the default license and
so an empty license file was added.
2017-07-10 17:33:33 +02:00
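The fallback this commit describes is a two-level lookup: use the collection-specific license when one is set, otherwise fall back to the site-wide default rather than attaching an empty license file. A hedged sketch with plain strings standing in for the real license objects:

```java
public class LicenseResolver {
    /**
     * Prefer the collection-specific license; if the collection has none,
     * fall back to the site default instead of an empty string (hypothetical helper).
     */
    public static String effectiveLicense(String collectionLicense, String defaultLicense) {
        if (collectionLicense == null || collectionLicense.trim().isEmpty()) {
            return defaultLicense;
        }
        return collectionLicense;
    }

    public static void main(String[] args) {
        System.out.println(effectiveLicense(null, "Default site license"));    // Default site license
        System.out.println(effectiveLicense("CC-BY", "Default site license")); // CC-BY
    }
}
```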
Tim Donohue
bcf3110db9 Merge pull request #1723 from atmire/DS-2359
DS-2359 Error when depositing large files via browser (over 2Gb)
2017-07-08 05:56:33 +10:00
Tom Desair
c34b277c8d DS-3628: Check READ resource policies for items returned by the REST find-by-metadata-field endpoint 2017-07-07 19:47:26 +00:00
Pascal-Nicolas Becker
6263444f79 DS-3619: AuthorizeService.getAuthorizedGroups(...) should check dates 2017-07-07 19:30:00 +00:00
Tim Donohue
9caff2caab Merge pull request #1799 from tdonohue/DS-3397-6x
[DS-3397] Fix error when getting bitstream policies in REST API (6.x version)
2017-07-07 02:47:42 +10:00
Tim Donohue
6151f4f594 Merge pull request #1798 from atmire/DS-3563-DSpace-6_Missing-index-metadatavalue-resource-type-id
DS-3563: Fix Oracle Flyway migration error
2017-07-06 01:34:38 +10:00
Tim Donohue
f953848a6d [DS-3397] Add null checks to EPerson and Group 2017-07-05 15:27:43 +00:00
Tom Desair
ccc1b1b784 DS-3563: Fix Oracle Flyway migration error 2017-07-05 14:02:29 +02:00
Tom Desair
1bb6369ad6 DS-3127: Update test assert descriptions of GoogleBitstreamComparatorTest 2017-07-04 16:07:57 +02:00
Tom Desair
e31daa0230 DS-3632: Prevent the use of the locate function as this seems to give inconsistent results 2017-06-30 17:13:31 +02:00
Tom Desair
762197b452 DS-3632: Changed the update-handle-prefix script so that it does not change the handle suffix 2017-06-30 16:58:15 +02:00
kshepherd
ecd0230943 Merge pull request #1780 from atmire/DS-3595-6x
DS-3595
2017-06-30 05:41:42 +10:00
Philip Vissenaekens
c9cad9083e Merge branch 'dspace-6_x' into DS-3595-6x 2017-06-29 15:38:20 +02:00
Tom Desair
b462e0ac6d Merge branch 'dspace-6_x' into DS-3127-DSpace-6_Whitelist-allowable-formats-Google-Scholar-citation_pdf_url 2017-06-29 09:55:42 +02:00
Terry Brady
65d638771f Merge pull request #1747 from AlexanderS/localization-input-forms-xmlui
DS-3598: Allow localization of input-forms.xml with XMLUI
2017-06-28 17:15:40 -07:00
Terry Brady
224df82087 Merge pull request #1752 from AlexanderS/fix/DS-3601-npe-feedback-page
DS-3601: Fix NPE when accessing feedback page without "Referer" header
2017-06-28 16:31:44 -07:00
Terry Brady
a6b3ce0d46 Merge pull request #1784 from rivaldi8/DS-3245-csv-linebreaks_ds6
DS-3245: CSV linebreaks not supported by Bulkedit -  DSpace 6
2017-06-28 15:47:39 -07:00
Terry Brady
2944279618 Merge pull request #1727 from tomdesair/DS-3579_Context-mode-and-cache-management-CLI-commands
DS-3579 Context mode and cache management for CLI commands
2017-06-28 14:49:11 -07:00
Tom Desair
fe115125d1 DS-3127: Prevent database updates when directly manipulating the bitstream list of a bundle 2017-06-28 17:46:58 +02:00
Tom Desair
6e9dec2c85 DS-3579: Make sure context.complete() can be called when in read-only 2017-06-28 16:15:30 +02:00
Terry Brady
fd298ae462 Merge pull request #1772 from tomdesair/DS-3571_Log-Hibernate-validation-errors
DS-3571 Log hibernate validation errors
2017-06-27 15:22:44 -07:00
Lotte Hofstede
0683b5c5b1 40648: fix for filters 2017-06-27 11:30:50 +02:00
Mark H. Wood
470c9b8f50 Merge pull request #1788 from mwoodiupui/DS-3568
[DS-3568] UTF-8 characters are now supported in configuration files
2017-06-26 13:34:04 -04:00
Terry Brady
33d3df72d6 Merge pull request #1732 from samuelcambien/DS-3584
DS-3584 when editing an eperson, trying to change its email address is ignored if another user already has that email address.
2017-06-23 16:56:27 -07:00
Christian Scheible
43cc3bd874 DS-3568. UTF-8 characters are now supported in configuration files 2017-06-22 16:35:30 -04:00
Tom Desair
3dc4909935 Fix IT tests 2017-06-22 17:07:55 +02:00
Tom Desair
71791c720f DS-3127: Process review feedback and fix tests 2017-06-22 15:01:45 +02:00
Àlex Magaz Graça
70a5124373 DS-3245: CSV linebreaks not supported by Bulkedit
When a multiline field contained empty lines, the importer stopped
reading the file. This reverts a change in 53d387fed to stop when the
end of the file has been reached instead.

Fixes https://jira.duraspace.org/browse/DS-3245
2017-06-22 13:57:06 +02:00
Philip Vissenaekens
7879ecdf14 DS-3595 2017-06-21 17:18:30 +02:00
Mark H. Wood
1db3261b54 Merge pull request #1696 from tomdesair/DS-2748_Improve-cocoon-page-not-found-page
DS-2748: Do not throw an exception in the PageNotFoundTransformer
2017-06-21 10:18:53 -04:00
Alexander Sulfrian
593e6a5b37 DS-3522: Ensure Submission Policies are removed in XMLWorkflow 2017-06-21 15:21:13 +02:00
Tom Desair
3732cafc4e Merge branch 'dspace-6_x' into DS-3579_Context-mode-and-cache-management-CLI-commands 2017-06-19 17:36:55 +02:00
Tom Desair
6f52d9700a Merge branch 'dspace-6_x' into DS-3579_Context-mode-and-cache-management-CLI-commands 2017-06-19 17:18:22 +02:00
Tom Desair
769d3b590f DS-3579: Fix bug in metadata-import script 2017-06-19 14:59:00 +02:00
Tom Desair
7d04016436 Merge branch 'DS-3579_Context-mode-and-cache-management-CLI-commands' of https://github.com/tomdesair/DSpace into DS-3579_Context-mode-and-cache-management-CLI-commands 2017-06-19 14:38:28 +02:00
edusperoni
0084ae3833 DS-2291 Autocomplete not working on Mirage2 (#1741)
* fixing autocomplete problem listed on DS-2291. Also fixes the spinner that was being referenced in the wrong path.

* fix common lookup button (now consistent with the author lookup button)
2017-06-14 11:36:45 -05:00
Lotte Hofstede
6bcaead1a3 Merged with last 6_x version and resolved conflicts 2017-06-14 11:46:48 +02:00
Lotte Hofstede
e9f9c41221 Merge branch 'dspace-6_x' into DS-2852_6.x
Conflicts:
	dspace-xmlui-mirage2/src/main/webapp/templates/discovery_simple_filters.hbs
2017-06-14 10:19:31 +02:00
Pascal-Nicolas Becker
fc1b22e59c Merge pull request #1767 from tomdesair/PR-1715
DS-3572: Check authorization for a specified user instead of currentUser
2017-06-13 16:08:33 +02:00
Tom Desair
9af33bc244 DS-3571: Make sure that any Hibernate schema validation error is logged instead of just a NullPointerException 2017-06-13 11:17:20 +02:00
Tom Desair
bd2d81d556 DS-3572: Renamed epersonInGroup to isEPersonInGroup 2017-06-12 15:17:59 +02:00
Tom Desair
f6eb13cf53 DS-3572: Restored behaviour of GroupService.isMember and moved new behaviour to GroupService.isParentOf 2017-06-12 15:05:59 +02:00
Tom Desair
b4a24fff7b DS-3572: Fix bug where normal group membership is ignored if special groups are present + added tests 2017-06-10 14:32:45 +02:00
Tom Desair
8bb7eb0fe5 Improve tests + make GroupService.isMember method more performant for special groups 2017-06-10 00:34:24 +02:00
Eduardo Speroni
eb9ce230ad fixed nested vocabulary search 2017-06-09 19:18:15 -03:00
Tom Desair
f48178ed41 Fix DSpace AIP IT tests: Set correct membership for admin 2017-06-09 20:09:15 +02:00
Tim Donohue
1b70e64f77 Merge pull request #1751 from tomdesair/DS-3406_Sort-Communities-and-Collections-Hibernate-Sort-Annotation
DS-3406: Sort communities and collections iteration 2
2017-06-09 09:35:00 -07:00
Tom Desair
b56bb4de3e Attempt to fix contstraint violation 2017-06-09 17:51:27 +02:00
Tom Desair
139f01fffd Restore GroupServiceImpl.isMember logic + fix tests 2017-06-09 17:30:06 +02:00
frederic
257d75ca0c DS-3406 unit tests for getCollection/getCommunity for different dspace objects 2017-06-09 10:05:36 +02:00
frederic
5422a63f08 DS-3579 removed FETCH keyword and fixed typo in help message of harvest 2017-06-09 09:46:28 +02:00
Pascal-Nicolas Becker
853e6baff1 Merge pull request #1761 from tdonohue/DS-3604
DS-3604: Fix Bitstream reordering in JSPUI
2017-06-06 23:08:06 +02:00
Tim Donohue
205d8b9f92 Refactor BundleServiceImpl.setOrder() to be more failsafe. Update Tests to prove out (previously these new tests failed) 2017-06-06 14:07:16 +00:00
Pascal-Nicolas Becker
bb1e13a3b2 DS-3572: Adding simple unit test for DS-3572. 2017-06-06 15:54:13 +02:00
Pascal-Nicolas Becker
d2311663d3 DS-3572: Check authorization for a specified user instead of currentUser 2017-06-06 15:54:12 +02:00
kshepherd
7d1836bddc Merge pull request #1762 from Georgetown-University-Libraries/ds3563-6x
[DS-3563] Port PR to 6x
2017-06-06 12:36:46 +12:00
Tom Desair
36002b5829 DS-3563: Conditional create index for Oracle 2017-06-02 13:19:02 -07:00
Tom Desair
6392e195b9 DS-3563 Added missing index on metadatavalue.resource_type_id 2017-06-02 13:18:43 -07:00
Tim Donohue
d37d3a04ac Create a valid unit test for BundleServiceImpl.setOrder() method 2017-06-02 20:14:29 +00:00
Tim Donohue
ef3afe19eb DS-3604: Sync JSPUI bitstream reorder code with XMLUI code 2017-06-02 19:50:14 +00:00
Pascal-Nicolas Becker
81e171ec24 Merge pull request #1760 from tuub/DS-3582
DS-3582: Reintroduce calls to context.abort() at the end of some JSPs to free db resources.
2017-06-02 12:54:29 +02:00
Pascal-Nicolas Becker
4086e73e0b DS-3582: Any jsp that call UIUtil.obtainContext must free DB resources
Any jsp that call UIUtil.obationContext must either call context.abort
or context.commit to free the database connection to avoid exhausting
the database connection pool.
2017-06-01 17:37:30 +02:00
Tim Donohue
5f827ecbe8 Merge pull request #1759 from AlexanderS/rest-submissions-to-workflow
DS-3281: Start workflow for REST submissions
2017-05-31 13:52:42 -07:00
Alexander Sulfrian
30c4ca0fea DS-3281: Start workflow for REST submissions
If an item is submitted through the REST API (via POST on
/{collection_id}/items) the item should not be published immediately,
but should be approved via the defined workflow.
2017-05-31 18:27:44 +02:00
Terry Brady
094f775b6a Merge pull request #1746 from Georgetown-University-Libraries/ds3594
[DS-3594] Refine unit tests to run against postgres
2017-05-31 08:59:14 -07:00
Alexander Sulfrian
85588871ca JSPUI: Fix DSpaceChoiceLookup in edit-item-form.jsp
The collection id changed from a simple integer to a UUID and JavaScript
needs quotes around it.
2017-05-31 10:49:59 +02:00
nteike
1ad95bdb12 JSPUI: Fix DSpaceChoiceLookup in edit-metadata.jsp 2017-05-31 10:49:37 +02:00
Terry Brady
593cc085d2 Add comment for null check during sort 2017-05-23 10:23:16 -07:00
Tom Desair
f4cdfb4e65 Revert imports 2017-05-22 17:35:03 +02:00
Tom Desair
b4d8436672 DS-3406: Remove unnecessary commit 2017-05-22 17:17:03 +02:00
Tom Desair
271b6913ab Fix integration tests. Remove Hibernate Sort annotations as a collection name can change and this breaks the Set semantics 2017-05-22 15:06:44 +02:00
Alexander Sulfrian
137384c13f DS-3601: Fix NPE when accessing feedback page without "Referer" header 2017-05-22 12:24:31 +02:00
Tom Desair
72f8f9461b Fix bug so that comparator can be used for sets 2017-05-22 10:52:15 +02:00
Tom Desair
78effeac61 Fixing tests 2017-05-22 09:39:13 +02:00
Yana De Pauw
62c804f1e8 DS-3406: Ordering sub communities and collections 2017-05-22 09:39:12 +02:00
Tim Donohue
40b05ec773 Fix minor compilation error in cherry-pick of PR#1662 2017-05-18 21:03:35 +00:00
Miika Nurminen
a0e91cacd9 [DS-3463] Fix IP authentication for anonymous users
Added group membership check based on context even if no eperson is found. Affects file downloads in (at least) xmlui.
2017-05-18 20:12:34 +00:00
Alexander Sulfrian
90ca4deb35 Fix code style 2017-05-18 11:20:15 +02:00
Alexander Sulfrian
83002c3177 DS-3598: Allow localization of input-forms.xml with XMLUI
This allows separate input-forms.xml for the different locales with
XMLUI. The feature was already present in JSPUI.
2017-05-17 16:05:14 +02:00
Terry Brady
ebf256caa1 Avoid NPE 2017-05-15 14:37:59 -07:00
Terry Brady
1d655e97c9 Make destroy more forgiving of test failures 2017-05-15 14:31:41 -07:00
Terry Brady
d85a2d9153 Avoid handle collision in persistent db 2017-05-15 14:19:39 -07:00
Terry Brady
6f8a8b7f25 change parameter setting for db portability 2017-05-15 13:47:20 -07:00
Generalelektrix
3ea041d4dc DS-3164 Item statistic displays UUID of bitstreams instead of name (#1744)
simple change to return bit.getName() as opposed to return value
2017-05-10 17:16:50 -04:00
Tom Desair (Atmire)
6333fb6706 Ds 3552 read only context and hibernate improvements (#1694)
* Refactor READ ONLY mode in Context and adjust hibernate settings accordingly

* Set Context in READ-ONLY mode when retrieving community lists

* Fix Hibernate EHCache configuration + fix some Hibernate warnings

* Cache authorized actions and group membership when Context is in READ-ONLY mode

* Set default Context mode

* Let ConfigurableBrowse use a READ-ONLY context

* Add 2nd level cache support for Site and EPerson DSpaceObjects

* Added 2nd level caching for Community and Collection

* Fix tests and license checks

* Cache collection and community queries

* Small refactorings + backwards compatibility

* Set Context to READ-ONLY for JSPUI submissions and 'select collection' step

* OAI improvements part 1

* OAI indexing improvements part 1

* OAI indexing improvements part 2

* DS-3552: Only uncache resource policies in AuthorizeService when in read-only

* DS-3552: Additional comment on caching handles

* DS-3552: Fix cache leakage in SolrServiceResourceRestrictionPlugin

* DS-3552: Clear the read-only cache when switching Context modes

* DS-3552: Correct Group 2nd level cache size

* DS-3552: Always clear the cache, except when going from READ_ONLY to READ_ONLY
2017-05-04 14:12:06 -04:00
Hardy Pottinger
f62c32efe6 Merge pull request #1739 from edusperoni/handlebars-4
DS-3387 Upgrade handlebars to v4.
2017-05-04 12:28:15 -04:00
Hardy Pottinger
068be33265 Merge pull request #1707 from Frederic-Atmire/DS-3558
DS 3558 Case-insensitive bot matching option
2017-05-04 10:08:59 -04:00
Lotte Hofstede
afe2bf6f93 40648: DS-2852 - Facet tag fix for authority values for DSpace 6 2017-05-04 12:42:56 +02:00
Eduardo Speroni
3c25e04c08 upgrade grunt-contrib-handlebars to 1.0.0 2017-05-03 21:11:58 -03:00
Pascal-Nicolas Becker
a44b109f7a Merge pull request #1684 from tomdesair/DS-3406_Sort-Communities-and-Collections-with-comparator
DS-3406: Sort communities and collections in-memory using a comparator
2017-05-03 14:37:24 +02:00
frederic
a24b0078c2 Made service for SpringDetector and made SpringDetector delegate to it 2017-05-03 11:15:35 +02:00
Tom Desair
e358cb84d1 DS-3406: Resolve review feedback 2017-05-02 17:59:25 +02:00
frederic
0f51d5ad6a ported DS-3558 from dspace 5 to dspace6 2017-05-02 10:52:59 +02:00
frederic
454b0c9d6a Few tests to test case-(in)sensitive matching 2017-04-28 09:57:22 +02:00
frederic
6e1a5d1df9 made the necessary changes to easily test this class 2017-04-28 09:56:43 +02:00
frederic
b61c821e66 case-insensitive option commented out by default 2017-04-28 09:56:16 +02:00
frederic
fd76b587be wrote tests for botmatching 2017-04-27 14:24:07 +02:00
Eduardo Speroni
f12006fe21 Upgrade handlebars to v4.
Fixed advanced filters to work with handlebars v4. (https://github.com/wycats/handlebars.js/issues/1028)
2017-04-26 16:55:49 -03:00
Tim Donohue
3116c53d5e Merge pull request #1737 from cjuergen/DS-3585-6_x
Fix for DS3585
2017-04-26 11:09:14 -07:00
cjuergen
e2ffbaa3b8 Fix for DS3585 2017-04-26 15:49:28 +02:00
samuel
856e5ad388 DS-3584 when editing an eperson, trying to change its email address is ignored if another user already has that email address 2017-04-26 11:36:08 +02:00
Jonas Van Goolen
2c8e36fcb9 DSpace 6_X/xmlui counterpart of JIRA ticket
https://jira.duraspace.org/browse/DS-2675

Contains the following:

Fixed "starts_with" implementation to only retrieve items that ACTUALLY start with the string.
Ascending/Descending ordering now properly reset the offset (instead of remaining somewhere halfway)
2017-04-24 13:24:44 +02:00
Tom Desair
d2577fa16c DS-3579: Fix tests 2017-04-21 11:45:55 +02:00
Tom Desair
d5f9d9b0db DS-3579: Improve cache usage rdfizer, sub-daily, doi organiser 2017-04-21 11:45:55 +02:00
Tom Desair
e4b26d64ce DS-3579: Improve cache usage harvest 2017-04-21 11:45:55 +02:00
Tom Desair
2dde39abe7 DS-3579: Improve cache usage bitstore-migrate, cleanup, curate, embargo-lifter 2017-04-21 11:45:55 +02:00
Tom Desair
a715ae4d15 DS-3579: Improve cache usage export, import, itemupdate, metadata-export, packager 2017-04-21 11:45:55 +02:00
Tom Desair
e63b3f4c13 DS-3579: Improve cache usage export, import, itemupdate, metadata-export, packager 2017-04-21 11:45:54 +02:00
Tom Desair
acedcacdb3 DS-3579: Improve cache usage update-handle-prefix 2017-04-21 11:45:54 +02:00
Tom Desair
37219a986d DS-3579: checker, checker-emailer, filter-media, generate-sitemaps, index-authority 2017-04-21 11:45:54 +02:00
Tom Desair
a3fc30ad94 DS-3579: Fix tests 2017-04-20 21:55:28 +02:00
Terry Brady
e2862b3058 Merge pull request #1714 from tuub/DS-3575
DS-3575: Rename misguiding find method in ResourcePolicyService
2017-04-20 11:47:20 -07:00
Mark H. Wood
8442e6f395 Merge pull request #1717 from mwoodiupui/DS-3564
[DS-3564] Limit maximum idle database connections by default
2017-04-20 12:39:11 -04:00
Tom Desair
7e1a0a1a0c DS-3552: Fix cache leakage in SolrServiceResourceRestrictionPlugin 2017-04-20 17:40:24 +02:00
Tom Desair
a5d414c0b2 DS-3552: Additional comment on caching handles 2017-04-20 17:36:10 +02:00
Tom Desair
cabb4fab66 DS-3579: Improve cache usage rdfizer, sub-daily, doi organiser 2017-04-20 17:33:07 +02:00
Tom Desair
5c19bb52e0 DS-3579: Improve cache usage harvest 2017-04-20 17:32:26 +02:00
Tom Desair
1e62dfdbbc DS-3579: Improve cache usage bitstore-migrate, cleanup, curate, embargo-lifter 2017-04-20 17:31:49 +02:00
Tom Desair
867ab6c9b9 DS-3579: Improve cache usage export, import, itemupdate, metadata-export, packager 2017-04-20 17:30:37 +02:00
Tom Desair
392dd2653a DS-3579: Improve cache usage export, import, itemupdate, metadata-export, packager 2017-04-20 17:30:07 +02:00
Tom Desair
6f3546f844 DS-3579: Improve cache usage update-handle-prefix 2017-04-20 17:28:28 +02:00
Tim Donohue
9a0d293abf Merge pull request #1720 from Georgetown-University-Libraries/ds3516-6x
[DS-3516] 6x Port ImageMagick PDF Thumbnail class should only process PDFs
2017-04-20 06:56:08 -07:00
Philip Vissenaekens
782a963916 DS-2359 2017-04-20 13:10:39 +02:00
Tom Desair
0235ba391f DS-3579: checker, checker-emailer, filter-media, generate-sitemaps, index-authority 2017-04-20 10:41:51 +02:00
Alan Orth
eae5a96179 port PR1709 to 6x 2017-04-19 14:44:28 -07:00
Mark H. Wood
1ef1170159 [DS-3564] Limit maximum idle database connections by default 2017-04-19 14:56:44 -04:00
Tim Donohue
4f7410232a Merge pull request #1682 from tuub/DS-3535
[DS-3535] Reduced error logging by interrupted download
2017-04-19 09:45:05 -07:00
Tim Donohue
6c29cd61b6 Merge pull request #1699 from enrique/patch-1
DS-3554: Check for empty title in Submissions
2017-04-19 09:32:06 -07:00
Tim Donohue
f6a651d4df Merge pull request #1703 from samuelcambien/DS-3553
DS-3553: when creating a new version, do context complete before redirecting to the submission page
2017-04-19 09:27:14 -07:00
Tim Donohue
c57b443611 Merge pull request #1713 from atmire/DS-3573-Filtername-in-XMLUI-Discovery-filter-labels-dspace6
DS-3573: Filtername in XMLUI Discovery filter labels
2017-04-19 09:19:54 -07:00
Pascal-Nicolas Becker
a5bdff0803 DS-3575: Rename misguiding find method in ResourcePolicyService 2017-04-18 18:12:32 +02:00
samuel
e3f72b280d DS-3553: when creating a new version, do context complete before redirecting to the submission page 2017-04-18 11:01:47 +02:00
Yana De Pauw
63ed4cc1e0 DS-3573: Filtername in XMLUI Discovery filter labels 2017-04-14 15:26:08 +02:00
Tom Desair
f0a5e7d380 DS-3552: Only uncache resource policies in AuthorizeService when in read-only 2017-04-14 09:26:08 +02:00
Tom Desair
1e64850af2 OAI indexing improvements part 2 2017-04-14 00:40:19 +02:00
Tom Desair
d9db5a66ca OAI indexing improvements part 1 2017-04-14 00:21:03 +02:00
Tom Desair
5f77bd441a OAI improvements part 1 2017-04-13 17:44:21 +02:00
frederic
4b87935cbb DS-3558 removed duplicate code and changed default option 2017-04-13 16:27:19 +02:00
Tim Donohue
3db74c7ba3 Merge pull request #1671 from mwoodiupui/DS-3505
[DS-3505] Bad redirection from logout action
2017-04-12 13:37:17 -07:00
frederic
f000b280c1 DS-3558 added comments on code 2017-04-12 15:04:57 +02:00
frederic
cad79dc6c9 DS-3558 made case insensitive botsearch configurable and optimized case insensitive pattern matching 2017-04-12 14:29:58 +02:00
Enrique Martínez Zúñiga
794600b96e Fix for DS-3554
Use StringUtils.isNotBlank instead of only check for title.lenght
2017-04-05 09:31:20 -05:00
Tom Desair
044ba1acd3 DS-2748: Do not throw an exception in the PageNotFoundTransformer but do return a 404 error code 2017-04-05 15:45:32 +02:00
Tom Desair
f54fe5c12e Set Context to READ-ONLY for JSPUI submissions and 'select collection' step 2017-04-05 15:23:16 +02:00
Tom Desair
1e917ed845 Small refactorings + backwards compatibility 2017-04-05 11:02:58 +02:00
Tom Desair
7719848d47 Cache collection and community queries 2017-04-05 09:59:31 +02:00
Tom Desair
f0e9e04a3a Fix tests and license checks 2017-04-04 13:44:38 +02:00
Tom Desair
5f194334ff Added 2nd level caching for Community and Collection 2017-04-04 13:16:13 +02:00
Tom Desair
7371a7c71d Add 2nd level cache support for Site and EPerson DSpaceObjects 2017-04-03 16:21:14 +02:00
Tom Desair
3963c95f6e Let ConfigurableBrowse use a READ-ONLY context 2017-04-03 15:59:13 +02:00
Tom Desair
75497f5107 Set default Context mode 2017-04-03 15:54:18 +02:00
Tom Desair
852c4d3b62 Cache authorized actions and group membership when Context is in READ-ONLY mode 2017-04-03 15:26:29 +02:00
Tom Desair
d108464a3a Fix Hibernate EHCache configuration + fix some Hibernate warnings 2017-04-03 15:26:29 +02:00
Tom Desair
dbfc8ce9a7 Set Context in READ-ONLY mode when retrieving community lists 2017-04-03 15:26:28 +02:00
Tom Desair
eee4923518 Refactor READ ONLY mode in Context and adjust hibernate settings accordingly 2017-04-03 15:26:28 +02:00
Toni Prieto
9ef505498b [DS-2947] DIM crosswalks repeats authority & confidence values in the metadata values 2017-03-24 16:16:31 +00:00
Tom Desair
3540fe5ec6 DS-3406: Sort communities and collections in-memory using a comparator 2017-03-23 15:27:02 +01:00
Tim Donohue
57f2a10da1 Merge pull request #1663 from mwoodiupui/DS-1140
[DS-1140] Update MSWord Media Filter to use Apache POI (like PPT Filter) and also support .docx
2017-03-22 10:31:35 -05:00
Per Broman
1e33e27a84 [DS-3535] Reduced error logging by interrupted download 2017-03-21 10:29:06 +01:00
Pascal-Nicolas Becker
a54bf11b8c Merge pull request #1673 from tuub/DS-3523
[DS-3523] Bugfix for search with embargoed thumbnails
2017-03-09 12:38:58 +01:00
Per Broman
0601e9f061 [DS-3523] Bugfix for search with embargoed thumbnails 2017-03-09 12:07:52 +01:00
Mark H. Wood
b578abd054 [DS-3505] On logout redirect to dspace.url, not context path. 2017-03-08 15:51:01 -05:00
Terry Brady
bc8629b145 [DS-3348] Drop date check in EmbargoService (#1542)
* Drop date check in EmbargoService

* Revise comment per review
2017-03-08 18:29:12 +00:00
Peter Dietz
26859b1133 DS-3366 Fix handleresolver by removing out.close (#1560) 2017-03-08 18:25:38 +00:00
Andrea Schweer
97785d778f [DS-3336] Properly sort collections in move item drop-down 2017-03-08 18:08:30 +00:00
Terry Brady
f1c3a9d919 fix typo in comment 2017-03-08 17:44:30 +00:00
Terry Brady
6442c979aa First attempt to resort submitters 2017-03-08 17:44:12 +00:00
Tim Donohue
a36f5b1f48 Merge pull request #1670 from tuub/DS-3521
[DS-3521] Bugfix browsing embargoed thumbnail
2017-03-08 09:51:56 -06:00
Per Broman
36a87c2107 [DS-3521] Bugfix browsing embargoed thumbnail 2017-03-07 12:09:28 +01:00
Mark H. Wood
43d7cd564c [DS-1140] Add configuration data 2017-03-02 15:49:34 -05:00
Mark H. Wood
9d8738c934 [DS-1140] Add unit test. 2017-03-02 14:50:14 -05:00
Mark H. Wood
c09edc5a15 [DS-1140] No need to treat old and new Word formats differently 2017-03-02 14:49:24 -05:00
Tim Donohue
2d95c7a2a1 Merge pull request #1652 from Georgetown-University-Libraries/ds3282-6x
[DS-3282] 6x Fix js error for filters with dashes
2017-03-01 14:59:47 -06:00
Terry Brady
d2c43b8aa5 Merge pull request #1654 from Georgetown-University-Libraries/ds2789-6_x
[DS-2789] 6x Display a "restricted image" for a thumbnail if the bitstream is restricted
2017-03-01 12:53:44 -08:00
Terry Brady
5d9dd4d4e3 Merge pull request #1660 from Georgetown-University-Libraries/ds3283-6x2
[DS-3283] 6x Mirage2: Edit Collection Source - No Field Label for Set Id
2017-03-01 12:42:38 -08:00
Mark H. Wood
24c1f5367c [DS-1140] New POI-based MS Word extractor and some comment cleanup 2017-02-28 17:12:23 -05:00
Hardy Pottinger
fbaf950388 [DS-3475] adding more guidance to example local.cfg as per suggestion of Tim Donohue 2017-02-28 16:10:08 -06:00
Hardy Pottinger
ddedfa2a14 [DS-3475] added back assetstore.dir configuration to dspace.cfg 2017-02-28 16:07:58 -06:00
Terry W Brady
2b96f9472b Add default lock icon for Mirage theme 2017-02-27 14:10:02 -08:00
Terry W Brady
1af23f2d8b reapply pr from master 2017-02-27 14:10:02 -08:00
Terry W Brady
a868a4bc9b Re-applying changes 2017-02-27 13:45:53 -08:00
Tim Donohue
2734dca1cd Merge pull request #1659 from tdonohue/fix_travis_timeouts
Fix Travis CI Maven download timeouts
2017-02-27 15:36:07 -06:00
Tim Donohue
8c70f9bc8c Workaround for travis-ci/travis-ci#4629 2017-02-27 21:21:08 +00:00
Tom Desair
8d56e828a2 DS-3367: Fix authorization error when non-admin users claim a configurable workflow task 2017-02-23 16:28:37 -05:00
Mark H. Wood
0e8c95a196 Merge pull request #1651 from mwoodiupui/DS-3378
[DS-3378] Patch to restore lost indices, from Adan Roman
2017-02-23 16:06:08 -05:00
Terry Brady
cf190c78e8 Fix js error for filters with dashes 2017-02-23 09:40:10 -08:00
Mark H. Wood
2d1c59ac49 [DS-3378] Patch to restore lost indices, from Adan Roman 2017-02-22 17:24:46 -05:00
Tom Desair
3a03e7a9d3 DS-2952: Added missing license 2017-02-22 20:26:42 +00:00
Tom Desair
757264c1f6 DS-2952: Only prepend new line if we have an actual input stream 2017-02-22 20:26:33 +00:00
Tom Desair
dfe6d79da4 DS-2952: Small improvements to FullTextContentStreams and added a unit test for it 2017-02-22 20:26:23 +00:00
Tom Desair
708fe215b0 DS-2952: Use a SequenceInputStream to add the content of multiple full text bitstreams to SOLR 2017-02-22 20:26:09 +00:00
Hardy Pottinger
a51ad3c6eb Merge pull request #1614 from jonas-atmire/DS-3448-MultiSelect-in-Submission
DS-3448 Multi-select in submission for workflow and workspace items
2017-02-22 12:13:12 -06:00
Hardy Pottinger
c5aebee9cc Merge pull request #1649 from hardyoyo/DS-3501-fix-XML-validation-by-excluding-failing-node-packages
[DS-3501] adjust XML validation
2017-02-22 11:17:38 -06:00
Hardy Pottinger
8a06522fa9 [DS-3501] adjust XML validation to skip contents of any folder that includes the text node/node_modules 2017-02-22 16:41:35 +00:00
samuel
267518ebaf DS 3425 outputstream gets closed in JSONDiscoverySearcher 2017-02-21 21:34:29 +00:00
samuel
2685cd793e DS-3415 - administrative.js doEditCommunity wrong parameter name 2017-02-21 21:03:19 +00:00
Tim Donohue
36c7fa9c1a Merge pull request #1588 from atmire/DS-3419-6_x
DS-3419
2017-02-21 14:55:56 -06:00
Bram Luyten
54c5c2932b DS-2840 sidebar facet logging from INFO to DEBUG
Changes INFO level sidebar facet transformer log entries to DEBUG
2017-02-18 14:20:08 +01:00
Luigi Andrea Pascarelli
7225f2597a DS-3356 add turnoff authz system 2017-02-15 22:10:18 +00:00
Mark H. Wood
59632413c2 [DS-3469] virus scan during submission attempts to read uploaded bitstream as anonymous user, which fails (#1632)
* [DS-3469] Add the current session context to the curation task run.

* [DS-3469] Log how I/O failed, not just that it did.

* [DS-3469] Keep reference to Bundle from which we just removed the Bitstream instead of expecting the List of Bundle to be unaltered.

* [DS-3469] Finish switching from e.getMessage() to e

* [DS-3469] Note the side effect of calling curate() with a Context.
2017-02-08 10:32:29 -06:00
Tim Donohue
7650af1e69 Merge pull request #1639 from rradillen/DS-3473
DS-3473: add guard code in case no dot is present in bitstream name
2017-02-08 10:24:28 -06:00
Tim Donohue
e4659832a0 Merge pull request #1641 from cjuergen/DS-3479-6_x
Fix for DS-3479 preventing the import of empty metadata
2017-02-08 10:15:26 -06:00
Tim Donohue
ab982e4f0b Merge pull request #1613 from tomdesair/DS-3436-Sharding-corrupts-multivalued-fields
DS-3436 Sharding SOLR cores corrupts multivalued fields
2017-02-08 09:47:22 -06:00
Terry Brady
8d76aa2010 [DS-3456] 6x Fix Command Line Parameters for statistics import/export tools (#1624)
* Clarify command line args

* support flexible import/export of stats

* Fix DS-3464 solr-reindex-statistics for shard

* Preserve multi val fields on import/export

* Time zone consistency in shard name creation

* Migrate PR feedback from 5x to 6x

* whitespace
2017-02-08 09:43:03 -06:00
Tim Donohue
9eb7c6734c Merge pull request #1633 from Georgetown-University-Libraries/ds3457b
[DS-3457] Address tomcat hang when multiple solr shards exist in DSpace 6
2017-02-08 09:30:42 -06:00
cjuergen
99c1af8688 Fix for DS-3479 preventing the import of empty metadata 2017-02-06 15:11:14 +01:00
Roeland Dillen
866bfe8fd8 add guard code in case no dot is present in bitsream name 2017-02-05 13:45:40 +01:00
Terry Brady
12de02c7f3 Merge pull request #1637 from kshepherd/DS-3477
[DS-3477] fix altmetrics config lookups in item-view.xsl (6.x)
2017-02-02 09:56:12 -08:00
Kim Shepherd
0c0b280d05 [DS-3477] fix altmetrics config lookups in item-view.xsl 2017-02-02 18:04:36 +13:00
Hardy Pottinger
bf1979fd41 [DS-3475] adding more guidance to example local.cfg as per suggestion of Tim Donohue 2017-02-01 15:49:19 -06:00
Hardy Pottinger
e32b93bae3 [DS-3475] added back assetstore.dir configuration to dspace.cfg 2017-02-01 15:48:51 -06:00
kshepherd
f86fff9063 Merge pull request #1611 from tomdesair/DS-3446-DSpace-6x_Non-admin-submitter-cannot-remove-bitstream
DS-3446: On bitstream delete, remove policies only after the bitstream has been updated
2017-02-02 09:42:33 +13:00
Terry W Brady
f7cadf8774 Initialize solr shards at first stats post
Make it more likely that the shards are awake on first use
2017-01-31 15:02:55 -08:00
Terry W Brady
4f7520d532 Additional comments 2017-01-30 17:05:04 -08:00
Terry W Brady
9904fdb412 DS-3457 and DS-3458 fixes 2017-01-30 12:11:06 -08:00
Terry Brady
e0e223e2bf [DS-3468] 6x Ignore bin directory built by Eclipse (#1627)
* Exclude top level /bin directory built by Eclipse
2017-01-26 16:28:25 +01:00
Hardy Pottinger
45762e993d Merge pull request #1617 from jonas-atmire/DS-3445-ChecksumChecker-no-enum-constant-error
DS-3445 Only add "ResultCode" if not default
2017-01-19 10:15:11 -06:00
Andrew Bennet
ce72010805 [DS-3460] Fix incorrect REST documentation 2017-01-17 21:32:40 +01:00
Bram Luyten
faa12bfd33 Merge pull request #1610 from tomdesair/DS-3108-DSpace-6x_Support-non-email-based-authentication-in-REST-API
DS-3108 DSpace 6x: Support non-email based authentication in REST API
2017-01-14 11:44:35 +01:00
Jonas Van Goolen
2805386f9d DS-3445 Only add "ResultCode" if not default 2017-01-13 10:41:30 +01:00
Jonas Van Goolen
a62eddeb59 DS-3448 Removal of unnecessary duplicate javascript file 2017-01-13 09:43:43 +01:00
Jonas Van Goolen
c873e554d3 DS-3448 Multi-select in submission for workflow and workspace items -> License headers in new files 2017-01-12 13:52:21 +01:00
Jonas Van Goolen
01dee698c2 DS-3448 Multi-select in submission for workflow and workspace items 2017-01-11 15:33:25 +01:00
Tom Desair
eb5dc58384 DS-3436: Tell SOLR to split values of multi-valued fields when sharding cores 2017-01-11 12:55:10 +01:00
Tim Donohue
958631c81c Merge pull request #1600 from samuelcambien/dspace-6_x-DS-3435
DS-3435 possible nullpointerexception at AccessStepUtil$populateEmbar…
2017-01-10 09:04:35 -06:00
Tom Desair
89ded55942 DS-3108 DSpace 6 only: Revert rename REST API login paramter email to user 2017-01-10 14:04:01 +01:00
Tom Desair
9855022228 Revert "DS-3108: Rename REST API login paramter email to user"
This reverts commit d2c4233d9e.
2017-01-10 13:57:29 +01:00
Tom Desair
bfc68d3354 DS-3446: Remove policies only after the bitstream has been updated (otherwise the current user has not WRITE rights) 2017-01-09 22:53:52 +01:00
Tom Desair
38848e16d3 DS-3108: Update REST API authentication documentation
Conflicts:
	dspace-rest/src/main/java/org/dspace/rest/RestIndex.java
2017-01-09 17:33:58 +01:00
Tom Desair
0244a425ae DS-3108: Remove deprication since there is no alternative 2017-01-09 17:32:55 +01:00
Tom Desair
c3c5287880 DS-3108: Remove unused imports 2017-01-09 17:32:49 +01:00
Tom Desair
3321cba560 DS-3108: Remove unnecessary /login-shibboleth endpoint
Conflicts:
	dspace-rest/src/main/java/org/dspace/rest/RestIndex.java
2017-01-09 17:32:45 +01:00
Tom Desair
684e87ed20 DS-3108: Return FORBIDDEN error code when authentication on the REST API failed
Conflicts:
	dspace-rest/src/main/java/org/dspace/rest/RestIndex.java
2017-01-09 17:31:24 +01:00
Tom Desair
d2c4233d9e DS-3108: Rename REST API login paramter email to user 2017-01-09 17:30:38 +01:00
Tom Desair
ae9862395a DS-3108: Support authenticaton mechanisms where the e-mail attribute is not an e-mail address 2017-01-09 17:30:26 +01:00
Tim Donohue
6256c673b9 Merge pull request #1607 from bram-atmire/DS-3289
DS-3289 Removing double slashes in image paths
2017-01-09 09:17:23 -06:00
Bram Luyten
2b0448fe64 DS-3289 Removing double slashes in image paths 2017-01-07 18:22:03 +01:00
cjuergen
1e4ae0b5e3 Cherry pick DS-3440 solution d95902b 2017-01-06 19:09:44 +01:00
Bram Luyten
1f36899abe Merge pull request #1605 from 4Science/DS-3441-6x
DS-3441 READ permssion on the Collection object not respected by the JSPUI (6_x)
2017-01-06 18:18:50 +01:00
Andrea Bollini
a6aa9816d2 DS-3441 READ permssion on the Collection object not respected by the JSPUI 2017-01-06 13:56:47 +01:00
Bram Luyten
242d1357c7 Merge pull request #1601 from tomdesair/DS-3381_Workspace-item-not-saved-when-using-versioning
DS-3381 workspace item not saved when using versioning
2017-01-05 16:43:50 +01:00
Tom Desair
4b927562b6 DS-3381: Do an explicit commit so that the workspace item is written to the database before the redirect to the submission form (see versioning.js doCreateNewVersion) 2017-01-04 23:05:20 +01:00
samuel
7b6ea8e807 DS-3435 possible nullpointerexception at AccessStepUtil$populateEmbargoDetail
Conflicts:
	dspace-xmlui/src/main/java/org/dspace/app/xmlui/aspect/submission/submit/AccessStepUtil.java
2017-01-03 12:40:56 +01:00
Philip Vissenaekens
a3c6aa2ced DS-3419 2016-12-09 13:14:55 +01:00
Ivan Masár
50eed239f5 DS-3363 CSV import error says "row", means "column" 2016-11-14 18:28:11 +01:00
Ivan Masár
3065389435 typo: xforwarderfor -> xforwardedfor 2016-11-01 16:18:45 +01:00
3514 changed files with 365544 additions and 121450 deletions

10
.dockerignore Normal file

@@ -0,0 +1,10 @@
.git/
.idea/
.settings/
*/target/
dspace/modules/*/target/
Dockerfile.*
dspace/src/main/docker/dspace-postgres-pgcrypto
dspace/src/main/docker/dspace-postgres-pgcrypto-curl
dspace/src/main/docker/README.md
dspace/src/main/docker-compose/
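The patterns above keep IDE metadata and Maven build output out of the Docker build context. A minimal sketch (assuming bash and a throwaway directory tree; the paths are illustrative, not from the repo) showing which directories the `*/target/` and `dspace/modules/*/target/` globs cover:

```shell
#!/usr/bin/env bash
# Sketch: illustrate the scope of two .dockerignore patterns by creating
# a small fake tree and expanding the same globs with the shell.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/dspace-api/target" "$tmp/dspace/modules/additions/target"
cd "$tmp"
# */target/ matches build output one level down (e.g. dspace-api/target);
# dspace/modules/*/target/ matches overlay-module build output.
for d in */target dspace/modules/*/target; do
  echo "excluded: $d"
done
```

Anything matching these globs is simply never sent to the Docker daemon, which also speeds up `docker build`.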

63
.github/workflows/build.yml vendored Normal file

@@ -0,0 +1,63 @@
# DSpace 6.x Continuous Integration/Build via GitHub Actions
# Concepts borrowed from
# https://docs.github.com/en/free-pro-team@latest/actions/guides/building-and-testing-java-with-maven
name: Build
# Run this Build for all pushes / PRs to current branch
on: [push, pull_request]
jobs:
  tests:
    runs-on: ubuntu-18.04
    env:
      # Give Maven 1GB of memory to work with
      # Suppress all Maven "downloading" messages in Travis logs (see https://stackoverflow.com/a/35653426)
      # This also slightly speeds builds, as there is less logging
      MAVEN_OPTS: "-Xmx1024M -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn"
    # These are the actual CI steps to perform per job
    steps:
      # https://github.com/actions/checkout
      - name: Checkout codebase
        uses: actions/checkout@v1
      # https://github.com/actions/setup-java
      - name: Install JDK 8
        uses: actions/setup-java@v1
        with:
          java-version: 8
      # GitHub CI already has Node.js installed, but this gives us the correct
      # version. GitHub CI also already has grunt installed so we don't need to
      # install any other pre-requisites for Mirage 2.
      - name: Install Node.js v14 (for Mirage 2)
        uses: actions/setup-node@v3
        with:
          node-version: 14
      # https://github.com/actions/cache
      - name: Cache Maven dependencies
        uses: actions/cache@v2
        with:
          # Cache entire ~/.m2/repository
          path: ~/.m2/repository
          # Cache key is hash of all pom.xml files. Therefore any changes to POMs will invalidate cache
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: ${{ runner.os }}-maven-
      # [Build & Unit Test] Check source code licenses and run source code Unit Tests
      # license:check          => Validate all source code license headers
      # -Dmaven.test.skip=false => Enable DSpace Unit Tests
      # -DskipITs=false        => Enable DSpace Integration Tests
      # -P !assembly           => Skip normal assembly (as it can be memory intensive)
      # -B                     => Maven batch/non-interactive mode (recommended for CI)
      # -V                     => Display Maven version info before build
      # -Dsurefire.rerunFailingTestsCount=2 => retry flaky tests, and keep track of/report on the number of retries
      - name: Run Maven Build & Test
        run: mvn clean install license:check -Dmaven.test.skip=false -DskipITs=false -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2
      # [Assemble DSpace] Ensure assembly process works (from [src]/dspace/), including Mirage 2
      # -Dmirage2.on=true             => Build Mirage2
      # -Dmirage2.deps.included=false => Don't include Mirage2 build dependencies (we installed them above)
      # -P !assembly                  => SKIP the actual building of [src]/dspace/dspace-installer (as it can be memory intensive)
      - name: Assemble DSpace & Build Mirage 2
        run: cd dspace && mvn package -Dmirage2.on=true -Dmirage2.deps.included=false -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2

5
.gitignore vendored

@@ -9,7 +9,6 @@ tags
/bin/
.project
.classpath
.checkstyle
## Ignore project files created by IntelliJ IDEA
*.iml
@@ -26,6 +25,7 @@ dist/
nbdist/
nbactions.xml
nb-configuration.xml
META-INF/
## Ignore all *.properties file in root folder, EXCEPT build.properties (the default)
## KEPT FOR BACKWARDS COMPATIBILITY WITH 5.x (build.properties is now replaced with local.cfg)
@@ -39,6 +39,3 @@ nb-configuration.xml
##Mac noise
.DS_Store
##Ignore JRebel project configuration
rebel.xml


@@ -1,44 +0,0 @@
language: java
sudo: false
env:
# Give Maven 1GB of memory to work with
- MAVEN_OPTS=-Xmx1024M
jdk:
# DS-3384 Oracle JDK 8 has DocLint enabled by default.
# Let's use this to catch any newly introduced DocLint issues.
- oraclejdk8
## Should we run into any problems with oraclejdk8 on Travis, we may try the following workaround.
## https://docs.travis-ci.com/user/languages/java#Testing-Against-Multiple-JDKs
## https://github.com/travis-ci/travis-ci/issues/3259#issuecomment-130860338
#addons:
# apt:
# packages:
# - oracle-java8-installer
# Install prerequisites for building Mirage2 more rapidly
before_install:
# Remove outdated settings.xml from Travis builds. Workaround for https://github.com/travis-ci/travis-ci/issues/4629
- rm ~/.m2/settings.xml
# Skip install stage, as we'll do it below
install: "echo 'Skipping install stage, dependencies will be downloaded during build and test stages.'"
# Two stage Build and Test
# 1. Install & Unit Test APIs
# 2. Assemble DSpace
script:
# 1. [Install & Unit Test] Check source code licenses and run source code Unit Tests
# license:check => Validate all source code license headers
# -Dmaven.test.skip=false => Enable DSpace Unit Tests
# -DskipITs=false => Enable DSpace Integration Tests
# -P !assembly => Skip normal assembly (as it can be memory intensive)
# -B => Maven batch/non-interactive mode (recommended for CI)
# -V => Display Maven version info before build
# -Dsurefire.rerunFailingTestsCount=2 => try again for flaky tests, and keep track of/report on number of retries
- "mvn clean install license:check -Dmaven.test.skip=false -DskipITs=false -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2"
# 2. [Assemble DSpace] Ensure overlay & assembly process works (from [src]/dspace/)
# -P !assembly => SKIP the actual building of [src]/dspace/dspace-installer (as it can be memory intensive)
- "cd dspace && mvn package -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2"

59
Dockerfile.cli.jdk8 Normal file

@@ -0,0 +1,59 @@
# This image will be published as dspace/dspace-cli
# See https://dspace-labs.github.io/DSpace-Docker-Images/ for usage details
#
# This version is JDK8 compatible
# - openjdk:8-jdk
# - ANT 1.10.7
# - maven:3-jdk-8
# - note:
# - default tag for branch: dspace/dspace-cli: dspace/dspace-cli:dspace-6_x
# Step 1 - Run Maven Build
FROM dspace/dspace-dependencies:dspace-6_x as build
ARG TARGET_DIR=dspace-installer
WORKDIR /app
# The dspace-install directory will be written to /install
RUN mkdir /install \
&& chown -Rv dspace: /install \
&& chown -Rv dspace: /app
USER dspace
# Copy the DSpace source code into the workdir (excluding .dockerignore contents)
ADD --chown=dspace . /app/
COPY dspace/src/main/docker/local.cfg /app/local.cfg
# Build DSpace. Copy the dspace-install directory to /install. Clean up the build to keep the docker image small
RUN mvn package -P'!dspace-solr,!dspace-jspui,!dspace-xmlui,!dspace-rest,!dspace-xmlui-mirage2,!dspace-rdf,!dspace-sword,!dspace-swordv2' && \
mv /app/dspace/target/${TARGET_DIR}/* /install && \
mvn clean
# Step 2 - Run Ant Deploy
FROM openjdk:8-jdk as ant_build
ARG TARGET_DIR=dspace-installer
COPY --from=build /install /dspace-src
WORKDIR /dspace-src
# Create the initial install deployment using ANT
ENV ANT_VERSION 1.10.7
ENV ANT_HOME /tmp/ant-$ANT_VERSION
ENV PATH $ANT_HOME/bin:$PATH
# Need wget to install ant
RUN apt-get update \
&& apt-get install -y --no-install-recommends wget \
&& apt-get purge -y --auto-remove \
&& rm -rf /var/lib/apt/lists/*
# Download and install 'ant'
RUN mkdir $ANT_HOME && \
wget -qO- "https://archive.apache.org/dist/ant/binaries/apache-ant-$ANT_VERSION-bin.tar.gz" | tar -zx --strip-components=1 -C $ANT_HOME
# Run necessary 'ant' deploy scripts
RUN ant init_installation update_configs update_code
# Step 3 - Run jdk
# Create a new jdk image that does not retain the build directory contents
FROM openjdk:8-jdk
ENV DSPACE_INSTALL=/dspace
COPY --from=ant_build /dspace $DSPACE_INSTALL
ENV JAVA_OPTS=-Xmx1000m
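The Ant installation step in the Dockerfile above streams the downloaded archive into `tar -zx --strip-components=1 -C $ANT_HOME`, which drops the archive's top-level `apache-ant-<version>` directory so `bin/ant` lands directly under `ANT_HOME`. A minimal sketch of that extraction pattern, using a locally created stand-in tarball instead of downloading Ant (all paths and the version directory name here are illustrative):

```shell
# Demonstrate the tar --strip-components=1 pattern used to install Ant:
# the archive's single top-level directory is dropped on extraction.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Build a stand-in archive shaped like apache-ant-<version>-bin.tar.gz
mkdir -p apache-ant-1.10.7/bin
echo '#!/bin/sh' > apache-ant-1.10.7/bin/ant
tar -czf ant-bin.tar.gz apache-ant-1.10.7

# Extract into ANT_HOME, stripping the versioned top-level directory
ANT_HOME="$workdir/ant-home"
mkdir "$ANT_HOME"
cat ant-bin.tar.gz | tar -zx --strip-components=1 -C "$ANT_HOME"

# bin/ant now sits directly under ANT_HOME, ready to be put on PATH
ls "$ANT_HOME/bin/ant"
```

Without `--strip-components=1`, the binary would end up at `$ANT_HOME/apache-ant-1.10.7/bin/ant` and the `PATH=$ANT_HOME/bin:$PATH` line in the Dockerfile would not find it.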

27
Dockerfile.dependencies Normal file

@@ -0,0 +1,27 @@
# This image will be published as dspace/dspace-dependencies
# The purpose of this image is to make the build for dspace/dspace run faster
# Step 1 - Run Maven Build
FROM maven:3-jdk-8 as build
ARG TARGET_DIR=dspace-installer
WORKDIR /app
# The Mirage2 build cannot run as root. Setting the user to dspace.
RUN useradd dspace \
&& mkdir /home/dspace \
&& chown -Rv dspace: /home/dspace \
&& chown -Rv dspace: /app
USER dspace
# Copy the DSpace source code into the workdir (excluding .dockerignore contents)
ADD --chown=dspace . /app/
COPY dspace/src/main/docker/local.cfg /app/local.cfg
# Trigger the installation of all maven dependencies including the Mirage2 dependencies
# Clean up the built artifacts in the same step to keep the docker image small
RUN mvn package -Dmirage2.on=true && mvn clean
# Clear the contents of the /app directory so no artifacts are left when dspace:dspace is built
USER root
RUN rm -rf /app/*

69
Dockerfile.jdk8 Normal file

@@ -0,0 +1,69 @@
# This image will be published as dspace/dspace
# See https://dspace-labs.github.io/DSpace-Docker-Images/ for usage details
#
# This version is JDK8 compatible
# - tomcat:8-jre8
# - ANT 1.10.7
# - maven:3-jdk-8
# - note:
# - default tag for branch: dspace/dspace: dspace/dspace:dspace-6_x-jdk8
# Step 1 - Run Maven Build
FROM dspace/dspace-dependencies:dspace-6_x as build
ARG TARGET_DIR=dspace-installer
WORKDIR /app
# The dspace-install directory will be written to /install
RUN mkdir /install \
&& chown -Rv dspace: /install \
&& chown -Rv dspace: /app
USER dspace
# Copy the DSpace source code into the workdir (excluding .dockerignore contents)
ADD --chown=dspace . /app/
COPY dspace/src/main/docker/local.cfg /app/local.cfg
# Build DSpace. Copy the dspace-install directory to /install. Clean up the build to keep the docker image small
RUN mvn package -Dmirage2.on=true && \
mv /app/dspace/target/${TARGET_DIR}/* /install && \
mvn clean
# Step 2 - Run Ant Deploy
FROM tomcat:8-jre8 as ant_build
ARG TARGET_DIR=dspace-installer
COPY --from=build /install /dspace-src
WORKDIR /dspace-src
# Create the initial install deployment using ANT
ENV ANT_VERSION 1.10.7
ENV ANT_HOME /tmp/ant-$ANT_VERSION
ENV PATH $ANT_HOME/bin:$PATH
# Need wget to install ant
RUN apt-get update \
&& apt-get install -y --no-install-recommends wget \
&& apt-get purge -y --auto-remove \
&& rm -rf /var/lib/apt/lists/*
# Download and install 'ant'
RUN mkdir $ANT_HOME && \
wget -qO- "https://archive.apache.org/dist/ant/binaries/apache-ant-$ANT_VERSION-bin.tar.gz" | tar -zx --strip-components=1 -C $ANT_HOME
# Run necessary 'ant' deploy scripts
RUN ant init_installation update_configs update_code update_webapps update_solr_indexes
# Step 3 - Run tomcat
# Create a new tomcat image that does not retain the build directory contents
FROM tomcat:8-jre8
ENV DSPACE_INSTALL=/dspace
COPY --from=ant_build /dspace $DSPACE_INSTALL
EXPOSE 8080 8009
ENV JAVA_OPTS=-Xmx2000m
RUN ln -s $DSPACE_INSTALL/webapps/solr /usr/local/tomcat/webapps/solr && \
ln -s $DSPACE_INSTALL/webapps/xmlui /usr/local/tomcat/webapps/xmlui && \
ln -s $DSPACE_INSTALL/webapps/jspui /usr/local/tomcat/webapps/jspui && \
ln -s $DSPACE_INSTALL/webapps/rest /usr/local/tomcat/webapps/rest && \
ln -s $DSPACE_INSTALL/webapps/oai /usr/local/tomcat/webapps/oai && \
ln -s $DSPACE_INSTALL/webapps/rdf /usr/local/tomcat/webapps/rdf && \
ln -s $DSPACE_INSTALL/webapps/sword /usr/local/tomcat/webapps/sword && \
ln -s $DSPACE_INSTALL/webapps/swordv2 /usr/local/tomcat/webapps/swordv2
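The final image stage above deploys each DSpace webapp by symlinking it from the install directory into Tomcat's `webapps` directory, so Tomcat serves the app while the files stay under `$DSPACE_INSTALL`. A small sketch of the same `ln -s` pattern (the temp paths stand in for `$DSPACE_INSTALL` and `/usr/local/tomcat`; only one webapp is linked for brevity):

```shell
set -e
root=$(mktemp -d)
DSPACE_INSTALL="$root/dspace"
TOMCAT_WEBAPPS="$root/tomcat/webapps"
mkdir -p "$DSPACE_INSTALL/webapps/xmlui" "$TOMCAT_WEBAPPS"

# Link the built webapp into Tomcat's deployment directory; Tomcat
# would serve it at /xmlui while the files remain in the DSpace install.
ln -s "$DSPACE_INSTALL/webapps/xmlui" "$TOMCAT_WEBAPPS/xmlui"

readlink "$TOMCAT_WEBAPPS/xmlui"
```

Linking rather than copying keeps the image smaller and means a single `$DSPACE_INSTALL` tree backs every deployed webapp.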

75
Dockerfile.jdk8-test Normal file

@@ -0,0 +1,75 @@
# This image will be published as dspace/dspace
# See https://dspace-labs.github.io/DSpace-Docker-Images/ for usage details
#
# This version is JDK8 compatible
# - tomcat:8-jre8
# - ANT 1.10.7
# - maven:3-jdk-8
# - note: expose /solr to any host; provide /rest over http
# - default tag for branch: dspace/dspace: dspace/dspace:dspace-6_x-jdk8
# Step 1 - Run Maven Build
FROM dspace/dspace-dependencies:dspace-6_x as build
ARG TARGET_DIR=dspace-installer
WORKDIR /app
# The dspace-install directory will be written to /install
RUN mkdir /install \
&& chown -Rv dspace: /install \
&& chown -Rv dspace: /app
USER dspace
# Copy the DSpace source code into the workdir (excluding .dockerignore contents)
ADD --chown=dspace . /app/
COPY dspace/src/main/docker/local.cfg /app/local.cfg
# Build DSpace. Copy the dspace-install directory to /install. Clean up the build to keep the docker image small
RUN mvn package -Dmirage2.on=true && \
mv /app/dspace/target/${TARGET_DIR}/* /install && \
mvn clean
# Step 2 - Run Ant Deploy
FROM tomcat:8-jre8 as ant_build
ARG TARGET_DIR=dspace-installer
COPY --from=build /install /dspace-src
WORKDIR /dspace-src
# Create the initial install deployment using ANT
ENV ANT_VERSION 1.10.7
ENV ANT_HOME /tmp/ant-$ANT_VERSION
ENV PATH $ANT_HOME/bin:$PATH
# Need wget to install ant
RUN apt-get update \
&& apt-get install -y --no-install-recommends wget \
&& apt-get purge -y --auto-remove \
&& rm -rf /var/lib/apt/lists/*
# Download and install 'ant'
RUN mkdir $ANT_HOME && \
wget -qO- "https://archive.apache.org/dist/ant/binaries/apache-ant-$ANT_VERSION-bin.tar.gz" | tar -zx --strip-components=1 -C $ANT_HOME
# Run necessary 'ant' deploy scripts
RUN ant init_installation update_configs update_code update_webapps update_solr_indexes
# Step 3 - Run tomcat
# Create a new tomcat image that does not retain the build directory contents
FROM tomcat:8-jre8
ENV DSPACE_INSTALL=/dspace
COPY --from=ant_build /dspace $DSPACE_INSTALL
EXPOSE 8080 8009
ENV JAVA_OPTS=-Xmx2000m
RUN ln -s $DSPACE_INSTALL/webapps/solr /usr/local/tomcat/webapps/solr && \
ln -s $DSPACE_INSTALL/webapps/xmlui /usr/local/tomcat/webapps/xmlui && \
ln -s $DSPACE_INSTALL/webapps/jspui /usr/local/tomcat/webapps/jspui && \
ln -s $DSPACE_INSTALL/webapps/rest /usr/local/tomcat/webapps/rest && \
ln -s $DSPACE_INSTALL/webapps/oai /usr/local/tomcat/webapps/oai && \
ln -s $DSPACE_INSTALL/webapps/rdf /usr/local/tomcat/webapps/rdf && \
ln -s $DSPACE_INSTALL/webapps/sword /usr/local/tomcat/webapps/sword && \
ln -s $DSPACE_INSTALL/webapps/swordv2 /usr/local/tomcat/webapps/swordv2
COPY dspace/src/main/docker/test/solr_web.xml $DSPACE_INSTALL/webapps/solr/WEB-INF/web.xml
COPY dspace/src/main/docker/test/rest_web.xml $DSPACE_INSTALL/webapps/rest/WEB-INF/web.xml
RUN sed -i -e "s|\${dspace.dir}|$DSPACE_INSTALL|" $DSPACE_INSTALL/webapps/solr/WEB-INF/web.xml && \
sed -i -e "s|\${dspace.dir}|$DSPACE_INSTALL|" $DSPACE_INSTALL/webapps/rest/WEB-INF/web.xml
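The last two `RUN` lines rewrite the `${dspace.dir}` placeholders in the copied test `web.xml` files to the real install path. The substitution can be sketched on a throwaway file (the file contents below are an illustrative fragment, not the actual `solr_web.xml`):

```shell
set -e
DSPACE_INSTALL=/dspace
f=$(mktemp)

# A web.xml-style fragment still carrying the build-time placeholder
printf '<param-value>${dspace.dir}/solr</param-value>\n' > "$f"

# Same sed call as the Dockerfile: '|' as the s/// delimiter avoids
# escaping slashes in the path; \$ stops the shell expanding ${dspace.dir},
# so sed sees the literal placeholder text.
sed -i -e "s|\${dspace.dir}|$DSPACE_INSTALL|" "$f"

cat "$f"   # → <param-value>/dspace/solr</param-value>
```

The `$` and `{` are literal in sed's basic regular expressions here (`$` is only an anchor at the end of a pattern), so no further escaping is needed on the sed side.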

21
LICENSE

@@ -1,7 +1,6 @@
DSpace source code license:
BSD 3-Clause License
Copyright (c) 2002-2016, DuraSpace. All rights reserved.
Copyright (c) 2002-2022, LYRASIS. All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
@@ -14,13 +13,12 @@ notice, this list of conditions and the following disclaimer.
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
- Neither the name DuraSpace nor the name of the DSpace Foundation
nor the names of its contributors may be used to endorse or promote
products derived from this software without specific prior written
permission.
- Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDERS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
@@ -31,10 +29,3 @@ ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
DAMAGE.
DSpace uses third-party libraries which may be distributed under
different licenses to the above. Information about these licenses
is detailed in the LICENSES_THIRD_PARTY file at the root of the source
tree. You must agree to the terms of these licenses, in addition to
the above DSpace source code license, in order to use this software.


@@ -15,37 +15,35 @@ PLEASE NOTE: Some dependencies may be listed under multiple licenses if they
are dual-licensed. This is especially true of anything listed as
"GNU General Public Library" below, as DSpace actually does NOT allow for any
dependencies that are solely released under GPL terms. For more info see:
https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines
---------------------------------------------------
Apache Software License, Version 2.0:
* Ant-Contrib Tasks (ant-contrib:ant-contrib:1.0b3 - http://ant-contrib.sourceforge.net)
* Code Generation Library (cglib:cglib:2.2.2 - http://cglib.sourceforge.net/)
* reload4j (ch.qos.reload4j:reload4j:1.2.20 - https://reload4j.qos.ch)
* AWS SDK for Java - Core (com.amazonaws:aws-java-sdk-core:1.10.50 - https://aws.amazon.com/sdkforjava)
* AWS Java SDK for AWS KMS (com.amazonaws:aws-java-sdk-kms:1.10.50 - https://aws.amazon.com/sdkforjava)
* AWS Java SDK for Amazon S3 (com.amazonaws:aws-java-sdk-s3:1.10.50 - https://aws.amazon.com/sdkforjava)
* HPPC Collections (com.carrotsearch:hppc:0.5.2 - http://labs.carrotsearch.com/hppc.html/hppc)
* metadata-extractor (com.drewnoakes:metadata-extractor:2.6.2 - http://code.google.com/p/metadata-extractor/)
* Jackson-annotations (com.fasterxml.jackson.core:jackson-annotations:2.5.4 - http://github.com/FasterXML/jackson)
* Jackson-annotations (com.fasterxml.jackson.core:jackson-annotations:2.7.0 - http://github.com/FasterXML/jackson)
* Jackson-core (com.fasterxml.jackson.core:jackson-core:2.5.4 - https://github.com/FasterXML/jackson)
* Jackson-core (com.fasterxml.jackson.core:jackson-core:2.7.0 - https://github.com/FasterXML/jackson-core)
* jackson-databind (com.fasterxml.jackson.core:jackson-databind:2.5.4 - http://github.com/FasterXML/jackson)
* jackson-databind (com.fasterxml.jackson.core:jackson-databind:2.7.0 - http://github.com/FasterXML/jackson)
* Jackson-JAXRS-base (com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:2.5.4 - http://wiki.fasterxml.com/JacksonHome/jackson-jaxrs-base)
* Jackson-JAXRS-JSON (com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:2.5.4 - http://wiki.fasterxml.com/JacksonHome/jackson-jaxrs-json-provider)
* Jackson-module-JAXB-annotations (com.fasterxml.jackson.module:jackson-module-jaxb-annotations:2.5.4 - http://wiki.fasterxml.com/JacksonJAXBAnnotations)
* Google APIs Client Library for Java (com.google.api-client:google-api-client:1.21.0 - https://github.com/google/google-api-java-client/google-api-client)
* Google Analytics API v3-rev123-1.21.0 (com.google.apis:google-api-services-analytics:v3-rev123-1.21.0 - http://nexus.sonatype.org/oss-repository-hosting.html/google-api-services-analytics)
* Jackson-annotations (com.fasterxml.jackson.core:jackson-annotations:2.8.11 - http://github.com/FasterXML/jackson)
* Jackson-core (com.fasterxml.jackson.core:jackson-core:2.8.11 - https://github.com/FasterXML/jackson-core)
* jackson-databind (com.fasterxml.jackson.core:jackson-databind:2.8.11.1 - http://github.com/FasterXML/jackson)
* Jackson-JAXRS-base (com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:2.8.11 - http://github.com/FasterXML/jackson-jaxrs-providers/jackson-jaxrs-base)
* Jackson-JAXRS-JSON (com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:2.8.11 - http://github.com/FasterXML/jackson-jaxrs-providers/jackson-jaxrs-json-provider)
* Jackson module: JAXB-annotations (com.fasterxml.jackson.module:jackson-module-jaxb-annotations:2.8.11 - http://github.com/FasterXML/jackson-module-jaxb-annotations)
* Google APIs Client Library for Java (com.google.api-client:google-api-client:1.23.0 - https://github.com/google/google-api-java-client/google-api-client)
* Google Analytics API v3-rev145-1.23.0 (com.google.apis:google-api-services-analytics:v3-rev145-1.23.0 - http://nexus.sonatype.org/oss-repository-hosting.html/google-api-services-analytics)
* FindBugs-jsr305 (com.google.code.findbugs:jsr305:3.0.1 - http://findbugs.sourceforge.net/)
* Gson (com.google.code.gson:gson:2.6.1 - https://github.com/google/gson/gson)
* Guava: Google Core Libraries for Java (com.google.guava:guava:14.0.1 - http://code.google.com/p/guava-libraries/guava)
* Guava: Google Core Libraries for Java (com.google.guava:guava:19.0 - https://github.com/google/guava/guava)
* Guava: Google Core Libraries for Java (JDK5 Backport) (com.google.guava:guava-jdk5:17.0 - http://code.google.com/p/guava-libraries/guava-jdk5)
* Google HTTP Client Library for Java (com.google.http-client:google-http-client:1.21.0 - https://github.com/google/google-http-java-client/google-http-client)
* Jackson 2 extensions to the Google HTTP Client Library for Java. (com.google.http-client:google-http-client-jackson2:1.21.0 - https://github.com/google/google-http-java-client/google-http-client-jackson2)
* Google OAuth Client Library for Java (com.google.oauth-client:google-oauth-client:1.21.0 - https://github.com/google/google-oauth-java-client/google-oauth-client)
* Google HTTP Client Library for Java (com.google.http-client:google-http-client:1.23.0 - https://github.com/google/google-http-java-client/google-http-client)
* Jackson 2 extensions to the Google HTTP Client Library for Java. (com.google.http-client:google-http-client-jackson2:1.23.0 - https://github.com/google/google-http-java-client/google-http-client-jackson2)
* Google OAuth Client Library for Java (com.google.oauth-client:google-oauth-client:1.23.0 - https://github.com/google/google-oauth-java-client/google-oauth-client)
* ConcurrentLinkedHashMap (com.googlecode.concurrentlinkedhashmap:concurrentlinkedhashmap-lru:1.2 - http://code.google.com/p/concurrentlinkedhashmap)
* ISO Parser (com.googlecode.mp4parser:isoparser:1.0-RC-1 - http://code.google.com/p/mp4parser/)
* builder-commons (com.lyncode:builder-commons:1.0.2 - http://nexus.sonatype.org/oss-repository-hosting.html/builder-commons)
@@ -53,14 +51,17 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Jtwig Core Functions (com.lyncode:jtwig-functions:2.0.1 - http://www.lyncode.com/jtwig-functions)
* Jtwig Spring (com.lyncode:jtwig-spring:2.0.1 - http://www.lyncode.com/jtwig-spring)
* Test Support (com.lyncode:test-support:1.0.3 - http://nexus.sonatype.org/oss-repository-hosting.html/test-support)
* MaxMind DB Reader (com.maxmind.db:maxmind-db:1.2.2 - http://dev.maxmind.com/)
* MaxMind GeoIP2 API (com.maxmind.geoip2:geoip2:2.11.0 - http://dev.maxmind.com/geoip/geoip2/web-services)
* Spatial4J (com.spatial4j:spatial4j:0.4.1 - https://github.com/spatial4j/spatial4j)
* fastinfoset (com.sun.xml.fastinfoset:FastInfoset:1.2.15 - http://fi.java.net)
* Apache Commons BeanUtils (commons-beanutils:commons-beanutils:1.9.2 - http://commons.apache.org/proper/commons-beanutils/)
* Apache Commons CLI (commons-cli:commons-cli:1.3.1 - http://commons.apache.org/proper/commons-cli/)
* Apache Commons Codec (commons-codec:commons-codec:1.10 - http://commons.apache.org/proper/commons-codec/)
* Apache Commons Collections (commons-collections:commons-collections:3.2.2 - http://commons.apache.org/collections/)
* Apache Commons Configuration (commons-configuration:commons-configuration:1.10 - http://commons.apache.org/configuration/)
* Commons Digester (commons-digester:commons-digester:1.8.1 - http://commons.apache.org/digester/)
* Apache Commons FileUpload (commons-fileupload:commons-fileupload:1.3.1 - http://commons.apache.org/proper/commons-fileupload/)
* Apache Commons FileUpload (commons-fileupload:commons-fileupload:1.3.3 - http://commons.apache.org/proper/commons-fileupload/)
* HttpClient (commons-httpclient:commons-httpclient:3.1 - http://jakarta.apache.org/httpcomponents/httpclient-3.x/)
* Commons IO (commons-io:commons-io:2.4 - http://commons.apache.org/io/)
* commons-jexl (commons-jexl:commons-jexl:1.0 - no url defined)
@@ -69,7 +70,6 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Apache Commons Logging (commons-logging:commons-logging:1.2 - http://commons.apache.org/proper/commons-logging/)
* Apache Commons Validator (commons-validator:commons-validator:1.5.0 - http://commons.apache.org/proper/commons-validator/)
* Boilerpipe -- Boilerplate Removal and Fulltext Extraction from HTML pages (de.l3s.boilerpipe:boilerpipe:1.1.0 - http://code.google.com/p/boilerpipe/)
* The Netty Project (io.netty:netty:3.7.0.Final - http://netty.io/)
* jakarta-regexp (jakarta-regexp:jakarta-regexp:1.4 - no url defined)
* javax.inject (javax.inject:javax.inject:1 - http://code.google.com/p/atinject/)
* Bean Validation API (javax.validation:validation-api:1.1.0.Final - http://beanvalidation.org)
@@ -84,8 +84,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Abdera Core (org.apache.abdera:abdera-core:1.1.3 - http://abdera.apache.org/abdera-core)
* I18N Libraries (org.apache.abdera:abdera-i18n:1.1.3 - http://abdera.apache.org)
* Abdera Parser (org.apache.abdera:abdera-parser:1.1.3 - http://abdera.apache.org/abdera-parser)
* org.apache.tools.ant (org.apache.ant:ant:1.7.0 - http://ant.apache.org/ant/)
* ant-launcher (org.apache.ant:ant-launcher:1.7.0 - http://ant.apache.org/ant-launcher/)
* Apache Ant Core (org.apache.ant:ant:1.9.1 - http://ant.apache.org/)
* Apache Ant Launcher (org.apache.ant:ant-launcher:1.9.1 - http://ant.apache.org/)
* Avalon Framework API (org.apache.avalon.framework:avalon-framework-api:4.3.1 - http://www.apache.org/excalibur/avalon-framework/avalon-framework-api/)
* Avalon Framework Implementation (org.apache.avalon.framework:avalon-framework-impl:4.3.1 - http://www.apache.org/excalibur/avalon-framework/avalon-framework-impl/)
* Cocoon Configuration API (org.apache.cocoon:cocoon-configuration-api:1.0.2 - http://cocoon.apache.org/subprojects/configuration/1.0/configuration-api/1.0/)
@@ -111,11 +111,12 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Cocoon XML Implementation (org.apache.cocoon:cocoon-xml-impl:1.0.0 - http://cocoon.apache.org/2.2/core-modules/xml-impl/1.0/)
* Cocoon XML Resolver (org.apache.cocoon:cocoon-xml-resolver:1.0.0 - http://cocoon.apache.org/2.2/core-modules/xml-resolver/1.0/)
* Cocoon XML Utilities (org.apache.cocoon:cocoon-xml-util:1.0.0 - http://cocoon.apache.org/2.2/core-modules/xml-util/1.0/)
* Apache Commons Collections (org.apache.commons:commons-collections4:4.1 - http://commons.apache.org/proper/commons-collections/)
* Apache Commons Compress (org.apache.commons:commons-compress:1.7 - http://commons.apache.org/proper/commons-compress/)
* Apache Commons CSV (org.apache.commons:commons-csv:1.0 - http://commons.apache.org/proper/commons-csv/)
* Apache Commons DBCP (org.apache.commons:commons-dbcp2:2.1.1 - http://commons.apache.org/dbcp/)
* Apache Commons DBCP (org.apache.commons:commons-dbcp2:2.8.0 - https://commons.apache.org/dbcp/)
* Apache Commons Lang (org.apache.commons:commons-lang3:3.3.2 - http://commons.apache.org/proper/commons-lang/)
* Apache Commons Pool (org.apache.commons:commons-pool2:2.4.2 - http://commons.apache.org/proper/commons-pool/)
* Apache Commons Pool (org.apache.commons:commons-pool2:2.10.0 - https://commons.apache.org/proper/commons-pool/)
* Excalibur Pool API (org.apache.excalibur.components:excalibur-pool-api:2.2.1 - http://www.apache.org/excalibur/excalibur-components-modules/excalibur-pool-modules/excalibur-pool-api/)
* Excalibur Sourceresolve (org.apache.excalibur.components:excalibur-sourceresolve:2.2.3 - http://www.apache.org/excalibur/excalibur-sourceresolve/)
* Excalibur Store (org.apache.excalibur.components:excalibur-store:2.2.1 - http://www.apache.org/excalibur/excalibur-components-modules/excalibur-store/)
@@ -171,14 +172,14 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Lucene Spatial (org.apache.lucene:lucene-spatial:4.10.4 - http://lucene.apache.org/lucene-parent/lucene-spatial)
* Lucene Suggest (org.apache.lucene:lucene-suggest:4.10.2 - http://lucene.apache.org/lucene-parent/lucene-suggest)
* Lucene Suggest (org.apache.lucene:lucene-suggest:4.10.4 - http://lucene.apache.org/lucene-parent/lucene-suggest)
* Apache FontBox (org.apache.pdfbox:fontbox:2.0.2 - http://pdfbox.apache.org/)
* Apache FontBox (org.apache.pdfbox:fontbox:2.0.24 - http://pdfbox.apache.org/)
* Apache JempBox (org.apache.pdfbox:jempbox:1.8.4 - http://www.apache.org/pdfbox-parent/jempbox/)
* Apache PDFBox (org.apache.pdfbox:pdfbox:2.0.2 - http://www.apache.org/pdfbox-parent/pdfbox/)
* Apache POI (org.apache.poi:poi:3.13 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-ooxml:3.13 - http://poi.apache.org/)
* Apache PDFBox (org.apache.pdfbox:pdfbox:2.0.24 - https://www.apache.org/pdfbox-parent/pdfbox/)
* Apache POI (org.apache.poi:poi:3.17 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-ooxml:3.17 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-ooxml-schemas:3.10.1 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-ooxml-schemas:3.13 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-scratchpad:3.13 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-ooxml-schemas:3.17 - http://poi.apache.org/)
* Apache POI (org.apache.poi:poi-scratchpad:3.17 - http://poi.apache.org/)
* Apache Solr Search Server (org.apache.solr:solr:4.10.4 - http://lucene.apache.org/solr-parent/solr)
* Apache Solr Analysis Extras (org.apache.solr:solr-analysis-extras:4.10.4 - http://lucene.apache.org/solr-parent/solr-analysis-extras)
* Apache Solr Content Extraction Library (org.apache.solr:solr-cell:4.10.4 - http://lucene.apache.org/solr-parent/solr-cell)
@@ -191,6 +192,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Axiom API (org.apache.ws.commons.axiom:axiom-api:1.2.14 - http://ws.apache.org/axiom/)
* Axiom Impl (org.apache.ws.commons.axiom:axiom-impl:1.2.14 - http://ws.apache.org/axiom/)
* XmlBeans (org.apache.xmlbeans:xmlbeans:2.6.0 - http://xmlbeans.apache.org)
* Apache Yetus - Audience Annotations (org.apache.yetus:audience-annotations:0.5.0 - https://yetus.apache.org/audience-annotations)
* zookeeper (org.apache.zookeeper:zookeeper:3.4.11 - no url defined)
* zookeeper (org.apache.zookeeper:zookeeper:3.4.6 - no url defined)
* Evo Inflector (org.atteo:evo-inflector:1.2.1 - http://atteo.org/static/evo-inflector)
* TagSoup (org.ccil.cowan.tagsoup:tagsoup:1.2.1 - http://home.ccil.org/~cowan/XML/tagsoup/)
@@ -199,6 +202,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Woodstox (org.codehaus.woodstox:woodstox-core-asl:4.1.4 - http://woodstox.codehaus.org)
* Woodstox (org.codehaus.woodstox:wstx-asl:3.2.0 - http://woodstox.codehaus.org)
* Woodstox (org.codehaus.woodstox:wstx-asl:3.2.7 - http://woodstox.codehaus.org)
* databene ContiPerf (org.databene:contiperf:2.3.4 - http://databene.org/contiperf)
* elasticsearch (org.elasticsearch:elasticsearch:1.4.0 - http://nexus.sonatype.org/oss-repository-hosting.html/elasticsearch)
* flyway-core (org.flywaydb:flyway-core:4.0.3 - https://flywaydb.org/flyway-core)
* Ogg and Vorbis for Java, Core (org.gagravarr:vorbis-java-core:0.1 - https://github.com/Gagravarr/VorbisJava)
* Apache Tika plugin for Ogg, Vorbis and FLAC (org.gagravarr:vorbis-java-tika:0.1 - https://github.com/Gagravarr/VorbisJava)
@@ -211,8 +216,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Objenesis (org.objenesis:objenesis:2.1 - http://objenesis.org)
* parboiled-core (org.parboiled:parboiled-core:1.1.6 - http://parboiled.org)
* parboiled-java (org.parboiled:parboiled-java:1.1.6 - http://parboiled.org)
* org.restlet (org.restlet.jee:org.restlet:2.1.1 - no url defined)
* org.restlet.ext.servlet (org.restlet.jee:org.restlet.ext.servlet:2.1.1 - no url defined)
* Restlet Core - API and Engine (org.restlet.jee:org.restlet:2.1.1 - http://www.restlet.org/org.restlet)
* Restlet Extension - Servlet (org.restlet.jee:org.restlet.ext.servlet:2.1.1 - http://www.restlet.org/org.restlet.ext.servlet)
* rome-modules (org.rometools:rome-modules:1.0 - http://www.rometools.org)
* Spring AOP (org.springframework:spring-aop:3.2.16.RELEASE - https://github.com/SpringSource/spring-framework)
* Spring AOP (org.springframework:spring-aop:3.2.5.RELEASE - https://github.com/SpringSource/spring-framework)
@@ -249,8 +254,7 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* oai4j (se.kb:oai4j:0.6b1 - http://oai4j-client.sourceforge.net/)
* StAX API (stax:stax-api:1.0.1 - http://stax.codehaus.org/)
* standard (taglibs:standard:1.1.2 - no url defined)
* Xalan Java Serializer (xalan:serializer:2.7.2 - http://xml.apache.org/xalan-j/)
* Xalan Java (xalan:xalan:2.7.2 - http://xml.apache.org/xalan-j/)
* xalan (xalan:xalan:2.7.0 - no url defined)
* Xerces2-j (xerces:xercesImpl:2.11.0 - https://xerces.apache.org/xerces2-j/)
* xmlParserAPIs (xerces:xmlParserAPIs:2.6.2 - no url defined)
* XML Commons External Components XML APIs (xml-apis:xml-apis:1.4.01 - http://xml.apache.org/commons/components/external/)
@@ -263,6 +267,7 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* XMP Library for Java (com.adobe.xmp:xmpcore:5.1.2 - http://www.adobe.com/devnet/xmp.html)
* coverity-escapers (com.coverity.security:coverity-escapers:1.1.1 - http://coverity.com/security)
* JSONLD Java :: Core (com.github.jsonld-java:jsonld-java:0.5.1 - http://github.com/jsonld-java/jsonld-java/jsonld-java/)
* curvesapi (com.github.virtuald:curvesapi:1.04 - https://github.com/virtuald/curvesapi)
* Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.5.0 - http://code.google.com/p/protobuf)
* Jena IRI (com.hp.hpl.jena:iri:0.8 - http://jena.sf.net/iri)
* Jena (com.hp.hpl.jena:jena:2.6.4 - http://www.openjena.org/)
@@ -272,47 +277,53 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Biblio Transformation Engine :: Core (gr.ekt.bte:bte-core:0.9.3.5 - http://github.com/EKT/Biblio-Transformation-Engine/bte-core)
* Biblio Transformation Engine :: Input/Output (gr.ekt.bte:bte-io:0.9.3.5 - http://github.com/EKT/Biblio-Transformation-Engine/bte-io)
* jaxen (jaxen:jaxen:1.1.6 - http://jaxen.codehaus.org/)
* JLine (jline:jline:0.9.94 - http://jline.sourceforge.net)
* ANTLR 3 Runtime (org.antlr:antlr-runtime:3.5 - http://www.antlr.org)
* Morfologik FSA (org.carrot2:morfologik-fsa:1.7.1 - http://morfologik.blogspot.com/morfologik-fsa/)
* Morfologik Stemming Dictionary for Polish (org.carrot2:morfologik-polish:1.7.1 - http://morfologik.blogspot.com/morfologik-polish/)
* Morfologik Stemming APIs (org.carrot2:morfologik-stemming:1.7.1 - http://morfologik.blogspot.com/morfologik-stemming/)
* Stax2 API (org.codehaus.woodstox:stax2-api:3.1.1 - http://woodstox.codehaus.org/StAX2)
* DSpace Kernel :: API and Implementation (org.dspace:dspace-api:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-api)
* DSpace I18N :: Language Packs (org.dspace:dspace-api-lang:6.0.3 - https://github.com/dspace/dspace-api-lang)
* DSpace JSP-UI (org.dspace:dspace-jspui:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-jspui)
* DSpace OAI-PMH (org.dspace:dspace-oai:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-oai)
* DSpace RDF (org.dspace:dspace-rdf:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-rdf)
* DSpace REST :: API and Implementation (org.dspace:dspace-rest:6.0-rc4-SNAPSHOT - http://demo.dspace.org)
* DSpace Services Framework :: API and Implementation (org.dspace:dspace-services:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-services)
* Apache Solr Webapp (org.dspace:dspace-solr:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-solr)
* DSpace SWORD (org.dspace:dspace-sword:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-sword)
* DSpace SWORD v2 (org.dspace:dspace-swordv2:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-swordv2)
* DSpace XML-UI (Manakin) (org.dspace:dspace-xmlui:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-xmlui)
* DSpace XML-UI (Manakin) I18N :: Language Packs (org.dspace:dspace-xmlui-lang:6.0.3 - https://github.com/dspace/dspace-xmlui-lang)
* dom4j (org.dom4j:dom4j:2.1.0 - http://dom4j.github.io/)
* DSpace Kernel :: API and Implementation (org.dspace:dspace-api:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-api)
* DSpace I18N :: Language Packs (org.dspace:dspace-api-lang:6.0.6 - https://github.com/dspace/dspace-api-lang)
* DSpace JSP-UI (org.dspace:dspace-jspui:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-jspui)
* DSpace OAI-PMH (org.dspace:dspace-oai:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-oai)
* DSpace RDF (org.dspace:dspace-rdf:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-rdf)
* DSpace REST :: API and Implementation (org.dspace:dspace-rest:6.4-SNAPSHOT - http://demo.dspace.org)
* DSpace Services Framework :: API and Implementation (org.dspace:dspace-services:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-services)
* Apache Solr Webapp (org.dspace:dspace-solr:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-solr)
* DSpace SWORD (org.dspace:dspace-sword:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-sword)
* DSpace SWORD v2 (org.dspace:dspace-swordv2:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-swordv2)
* DSpace XML-UI (Manakin) (org.dspace:dspace-xmlui:6.4-SNAPSHOT - https://github.com/dspace/DSpace/dspace-xmlui)
* DSpace XML-UI (Manakin) I18N :: Language Packs (org.dspace:dspace-xmlui-lang:6.0.7 - https://github.com/dspace/dspace-xmlui-lang)
* handle (org.dspace:handle:6.2 - no url defined)
* jargon (org.dspace:jargon:1.4.25 - no url defined)
* mets (org.dspace:mets:1.5.2 - no url defined)
* oclc-harvester2 (org.dspace:oclc-harvester2:0.1.12 - no url defined)
* XOAI : OAI-PMH Java Toolkit (org.dspace:xoai:3.2.10 - http://nexus.sonatype.org/oss-repository-hosting.html/xoai)
* XOAI : OAI-PMH Java Toolkit (org.dspace:xoai:3.3.0 - https://github.com/dspace/xoai)
* Repackaged Cocoon Servlet Service Implementation (org.dspace.dependencies.cocoon:dspace-cocoon-servlet-service-impl:1.0.3 - http://projects.dspace.org/dspace-pom/dspace-cocoon-servlet-service-impl)
* DSpace Kernel :: Additions and Local Customizations (org.dspace.modules:additions:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/modules/additions)
* DSpace Kernel :: Additions and Local Customizations (org.dspace.modules:additions:6.4-SNAPSHOT - https://github.com/dspace/DSpace/modules/additions)
* Hamcrest All (org.hamcrest:hamcrest-all:1.3 - https://github.com/hamcrest/JavaHamcrest/hamcrest-all)
* Hamcrest Core (org.hamcrest:hamcrest-all:1.3 - https://github.com/hamcrest/JavaHamcrest/hamcrest-all)
* Hamcrest Core (org.hamcrest:hamcrest-core:1.3 - https://github.com/hamcrest/JavaHamcrest/hamcrest-core)
* JBibTeX (org.jbibtex:jbibtex:1.0.10 - http://www.jbibtex.org)
* ASM Core (org.ow2.asm:asm:4.1 - http://asm.objectweb.org/asm/)
* ASM Analysis (org.ow2.asm:asm-analysis:4.1 - http://asm.objectweb.org/asm-analysis/)
* ASM Commons (org.ow2.asm:asm-commons:4.1 - http://asm.objectweb.org/asm-commons/)
* ASM Tree (org.ow2.asm:asm-tree:4.1 - http://asm.objectweb.org/asm-tree/)
* ASM Util (org.ow2.asm:asm-util:4.1 - http://asm.objectweb.org/asm-util/)
* PostgreSQL JDBC Driver - JDBC 4.2 (org.postgresql:postgresql:42.2.1 - https://github.com/pgjdbc/pgjdbc)
* XMLUnit for Java (xmlunit:xmlunit:1.1 - http://xmlunit.sourceforge.net/)
* XMLUnit for Java (xmlunit:xmlunit:1.3 - http://xmlunit.sourceforge.net/)
BSD-Style License:
* JAXB2 Basics - Runtime (org.jvnet.jaxb2_commons:jaxb2-basics-runtime:0.9.5 - https://github.com/highsource/jaxb2-basics/jaxb2-basics-runtime)
Common Development and Distribution License (CDDL):
* JAXB Reference Implementation (com.sun.xml.bind:jaxb-impl:2.2.5 - http://jaxb.java.net/)
* istack common utility code runtime (com.sun.istack:istack-commons-runtime:3.0.7 - http://java.net/istack-commons/istack-commons-runtime/)
* JHighlight (com.uwyn:jhighlight:1.0 - https://jhighlight.dev.java.net/)
* JavaBeans(TM) Activation Framework (javax.activation:activation:1.1.1 - http://java.sun.com/javase/technologies/desktop/javabeans/jaf/index.jsp)
* JavaBeans Activation Framework API jar (javax.activation:javax.activation-api:1.2.0 - http://java.net/all/javax.activation-api/)
* javax.annotation API (javax.annotation:javax.annotation-api:1.2 - http://jcp.org/en/jsr/detail?id=250)
* JavaMail API (compat) (javax.mail:mail:1.4.7 - http://kenai.com/projects/javamail/mail)
* Java Servlet API (javax.servlet:javax.servlet-api:3.1.0 - http://servlet-spec.java.net)
@@ -320,6 +331,7 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* jstl (javax.servlet:jstl:1.2 - no url defined)
* servlet-api (javax.servlet:servlet-api:2.5 - no url defined)
* javax.ws.rs-api (javax.ws.rs:javax.ws.rs-api:2.0.1 - http://jax-rs-spec.java.net)
* jaxb-api (javax.xml.bind:jaxb-api:2.3.1 - https://github.com/javaee/jaxb-spec/jaxb-api)
* Class Model for Hk2 (org.glassfish.hk2:class-model:2.4.0-b31 - https://hk2.java.net/class-model)
* HK2 config types (org.glassfish.hk2:config-types:2.4.0-b31 - https://hk2.java.net/hk2-configuration/hk2-configuration-persistence/hk2-xml-dom/config-types)
* HK2 module of HK2 itself (org.glassfish.hk2:hk2:2.4.0-b31 - https://hk2.java.net/hk2)
@@ -335,6 +347,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* ASM library repackaged as OSGi bundle (org.glassfish.hk2.external:asm-all-repackaged:2.4.0-b31 - https://hk2.java.net/external/asm-all-repackaged)
* javax.validation:1.1.0.Final as OSGi bundle (org.glassfish.hk2.external:bean-validator:2.4.0-b31 - https://hk2.java.net/external/bean-validator)
* javax.inject:1 as OSGi bundle (org.glassfish.hk2.external:javax.inject:2.4.0-b31 - https://hk2.java.net/external/javax.inject)
* JAXB Runtime (org.glassfish.jaxb:jaxb-runtime:2.3.1 - http://jaxb.java.net/jaxb-runtime-parent/jaxb-runtime)
* TXW2 Runtime (org.glassfish.jaxb:txw2:2.3.1 - http://jaxb.java.net/jaxb-txw-parent/txw2)
* jersey-repackaged-guava (org.glassfish.jersey.bundles.repackaged:jersey-guava:2.22.1 - https://jersey.java.net/project/project/jersey-guava/)
* jersey-container-servlet (org.glassfish.jersey.containers:jersey-container-servlet:2.22.1 - https://jersey.java.net/project/jersey-container-servlet/)
* jersey-container-servlet-core (org.glassfish.jersey.containers:jersey-container-servlet-core:2.22.1 - https://jersey.java.net/project/jersey-container-servlet-core/)
@@ -347,6 +361,9 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* jersey-media-json-jackson (org.glassfish.jersey.media:jersey-media-json-jackson:2.22.1 - https://jersey.java.net/project/jersey-media-json-jackson/)
* Java Transaction API (org.jboss.spec.javax.transaction:jboss-transaction-api_1.1_spec:1.0.1.Final - http://www.jboss.org/jboss-transaction-api_1.1_spec)
* Type arithmetic library for Java5 (org.jvnet:tiger-types:1.4 - http://java.net/tiger-types/)
* Extended StAX API (org.jvnet.staxex:stax-ex:1.8 - http://stax-ex.java.net/)
* Restlet Core - API and Engine (org.restlet.jee:org.restlet:2.1.1 - http://www.restlet.org/org.restlet)
* Restlet Extension - Servlet (org.restlet.jee:org.restlet.ext.servlet:2.1.1 - http://www.restlet.org/org.restlet.ext.servlet)
Eclipse Public License:
@@ -356,6 +373,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Jetty Server (org.mortbay.jetty:jetty:6.1.26 - http://www.eclipse.org/jetty/jetty-parent/project/modules/jetty)
* Jetty Servlet Tester (org.mortbay.jetty:jetty-servlet-tester:6.1.26 - http://www.eclipse.org/jetty/jetty-parent/project/jetty-servlet-tester)
* Jetty Utilities (org.mortbay.jetty:jetty-util:6.1.26 - http://www.eclipse.org/jetty/jetty-parent/project/jetty-util)
* Restlet Core - API and Engine (org.restlet.jee:org.restlet:2.1.1 - http://www.restlet.org/org.restlet)
* Restlet Extension - Servlet (org.restlet.jee:org.restlet.ext.servlet:2.1.1 - http://www.restlet.org/org.restlet.ext.servlet)
GNU General Public License, Version 2 with the Classpath Exception:
@@ -364,7 +383,6 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
GNU Lesser General Public License (LGPL):
* FindBugs-Annotations (com.google.code.findbugs:annotations:3.0.1u2 - http://findbugs.sourceforge.net/)
* MaxMind GeoIP Legacy API (com.maxmind.geoip:geoip-api:1.3.0 - https://github.com/maxmind/geoip-api-java)
* JHighlight (com.uwyn:jhighlight:1.0 - https://jhighlight.dev.java.net/)
* DSpace TM-Extractors Dependency (org.dspace.dependencies:dspace-tm-extractors:1.0.1 - http://projects.dspace.org/dspace-pom/dspace-tm-extractors)
* A Hibernate O/RM Module (org.hibernate:hibernate-core:4.2.21.Final - http://hibernate.org)
@@ -374,6 +392,8 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Javassist (org.javassist:javassist:3.18.1-GA - http://www.javassist.org/)
* JBoss Logging 3 (org.jboss.logging:jboss-logging:3.1.0.GA - http://www.jboss.org)
* org.jdesktop - Swing Worker (org.jdesktop:swing-worker:1.1 - no url defined)
* Restlet Core - API and Engine (org.restlet.jee:org.restlet:2.1.1 - http://www.restlet.org/org.restlet)
* Restlet Extension - Servlet (org.restlet.jee:org.restlet.ext.servlet:2.1.1 - http://www.restlet.org/org.restlet.ext.servlet)
* xom (xom:xom:1.1 - http://www.xom.nu)
* XOM (xom:xom:1.2.5 - http://xom.nu)
@@ -389,18 +409,20 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* Bouncy Castle CMS and S/MIME API (org.bouncycastle:bcmail-jdk15:1.46 - http://www.bouncycastle.org/java.html)
* Bouncy Castle Provider (org.bouncycastle:bcprov-jdk15:1.46 - http://www.bouncycastle.org/java.html)
* ORCID Java API generated via JAXB (org.dspace:orcid-jaxb-api:2.1.0 - https://github.com/DSpace/orcid-jaxb-api)
* Main (org.jmockit:jmockit:1.21 - http://www.jmockit.org)
* OpenCloud (org.mcavallo:opencloud:0.3 - http://opencloud.mcavallo.org/)
* Mockito (org.mockito:mockito-core:1.10.19 - http://www.mockito.org)
* JCL 1.1.1 implemented over SLF4J (org.slf4j:jcl-over-slf4j:1.7.14 - http://www.slf4j.org)
* JUL to SLF4J bridge (org.slf4j:jul-to-slf4j:1.7.14 - http://www.slf4j.org)
* SLF4J API Module (org.slf4j:slf4j-api:1.7.14 - http://www.slf4j.org)
* SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.14 - http://www.slf4j.org)
* SLF4J Reload4j Binding (org.slf4j:slf4j-reload4j:1.7.36 - http://reload4j.qos.ch)
Mozilla Public License:
* juniversalchardet (com.googlecode.juniversalchardet:juniversalchardet:1.0.3 - http://juniversalchardet.googlecode.com/)
* h2 (com.h2database:h2:1.4.187 - no url defined)
* Saxon-HE (net.sf.saxon:Saxon-HE:9.8.0-14 - http://www.saxonica.com/)
* Javassist (org.javassist:javassist:3.18.1-GA - http://www.javassist.org/)
* Rhino (rhino:js:1.6R7 - http://www.mozilla.org/rhino/)
@@ -415,9 +437,9 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
* JDOM (org.jdom:jdom:1.1.3 - http://www.jdom.org)
The PostgreSQL License:
The JSON License:
* PostgreSQL JDBC Driver - JDBC 4.2 (org.postgresql:postgresql:9.4.1211 - https://github.com/pgjdbc/pgjdbc)
* JSON in Java (org.json:json:20180130 - https://github.com/douglascrockford/JSON-java)
license.txt:

NOTICE

@@ -1,11 +1,24 @@
Licenses of Third-Party Libraries
=================================
Licensing Notice
DSpace uses third-party libraries which may be distributed under
different licenses than specified in our LICENSE file. Information
about these licenses is detailed in the LICENSES_THIRD_PARTY file at
the root of the source tree. You must agree to the terms of these
licenses, in addition to the DSpace source code license, in order to
use this software.
Fedora Commons joined with the DSpace Foundation and began operating under
Licensing Notices
=================
[July 2019] DuraSpace joined with LYRASIS (another 501(c)3 organization) in July 2019.
LYRASIS holds the copyrights of DuraSpace.
[July 2009] Fedora Commons joined with the DSpace Foundation and began operating under
the new name DuraSpace in July 2009. DuraSpace holds the copyrights of
the DSpace Foundation, Inc.
The DSpace Foundation, Inc. is a 501(c)3 corporation established in July 2007
[July 2007] The DSpace Foundation, Inc. is a 501(c)3 corporation established in July 2007
with a mission to promote and advance the dspace platform enabling management,
access and preservation of digital works. The Foundation was able to transfer
the legal copyright from Hewlett-Packard Company (HP) and Massachusetts


@@ -1,31 +1,17 @@
# DSpace
## NOTE: The rest-tutorial branch has been created to support the [DSpace 7 REST documentation](https://dspace-labs.github.io/DSpace7RestTutorial/walkthrough/intro)
- This branch provides stable, referencable line numbers in code
[![Build Status](https://github.com/DSpace/DSpace/workflows/Build/badge.svg?branch=dspace-6_x)](https://github.com/DSpace/DSpace/actions?query=workflow%3ABuild)
[![Build Status](https://travis-ci.org/DSpace/DSpace.png?branch=master)](https://travis-ci.org/DSpace/DSpace)
[DSpace Documentation](https://wiki.duraspace.org/display/DSDOC/) |
[DSpace Documentation](https://wiki.lyrasis.org/display/DSDOC/) |
[DSpace Releases](https://github.com/DSpace/DSpace/releases) |
[DSpace Wiki](https://wiki.duraspace.org/display/DSPACE/Home) |
[Support](https://wiki.duraspace.org/display/DSPACE/Support)
[DSpace Wiki](https://wiki.lyrasis.org/display/DSPACE/Home) |
[Support](https://wiki.lyrasis.org/display/DSPACE/Support)
DSpace open source software is a turnkey repository application used by more than
2,000 organizations and institutions worldwide to provide durable access to digital resources.
1000+ organizations and institutions worldwide to provide durable access to digital resources.
For more information, visit http://www.dspace.org/
***
:warning: **Work on DSpace 7 has begun on our `master` branch.** This means that there is temporarily NO user interface on this `master` branch. DSpace 7 will feature a new, unified [Angular](https://angular.io/) user interface, along with an enhanced, rebuilt REST API. The latest status of this work can be found on the [DSpace 7 UI Working Group](https://wiki.duraspace.org/display/DSPACE/DSpace+7+UI+Working+Group) page. Additionally, the codebases can be found in the following places:
* DSpace 7 REST API work is occurring on the [`master` branch](https://github.com/DSpace/DSpace/tree/master/dspace-spring-rest) of this repository.
* The REST Contract is being documented at https://github.com/DSpace/Rest7Contract
* DSpace 7 Angular UI work is occurring at https://github.com/DSpace/dspace-angular
**If you would like to get involved in our DSpace 7 development effort, we welcome new contributors.** Just join one of our meetings or get in touch via Slack. See the [DSpace 7 UI Working Group](https://wiki.duraspace.org/display/DSPACE/DSpace+7+UI+Working+Group) wiki page for more info.
**If you are looking for the ongoing maintenance work for DSpace 6 (or prior releases)**, you can find that work on the corresponding maintenance branch (e.g. [`dspace-6_x`](https://github.com/DSpace/DSpace/tree/dspace-6_x)) in this repository.
***
## Downloads
The latest release of DSpace can be downloaded from the [DSpace website](http://www.dspace.org/latest-release/) or from [GitHub](https://github.com/DSpace/DSpace/releases).
@@ -34,29 +20,32 @@ Past releases are all available via GitHub at https://github.com/DSpace/DSpace/r
## Documentation / Installation
Documentation for each release may be viewed online or downloaded via our [Documentation Wiki](https://wiki.duraspace.org/display/DSDOC/).
Documentation for each release may be viewed online or downloaded via our [Documentation Wiki](https://wiki.lyrasis.org/display/DSDOC/).
The latest DSpace Installation instructions are available at:
https://wiki.duraspace.org/display/DSDOC6x/Installing+DSpace
https://wiki.lyrasis.org/display/DSDOC6x/Installing+DSpace
Please be aware that, as a Java web application, DSpace requires a database (PostgreSQL or Oracle)
and a servlet container (usually Tomcat) in order to function.
More information about these and all other prerequisites can be found in the Installation instructions above.
## Running DSpace 6 in Docker
See [Running DSpace 6 with Docker Compose](dspace/src/main/docker-compose/README.md)
## Contributing
DSpace is a community built and supported project. We do not have a centralized development or support team,
but have a dedicated group of volunteers who help us improve the software, documentation, resources, etc.
We welcome contributions of any type. Here's a few basic guides that provide suggestions for contributing to DSpace:
* [How to Contribute to DSpace](https://wiki.duraspace.org/display/DSPACE/How+to+Contribute+to+DSpace): How to contribute in general (via code, documentation, bug reports, expertise, etc)
* [Code Contribution Guidelines](https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines): How to give back code or contribute features, bug fixes, etc.
* [DSpace Community Advisory Team (DCAT)](https://wiki.duraspace.org/display/cmtygp/DSpace+Community+Advisory+Team): If you are not a developer, we also have an interest group specifically for repository managers. The DCAT group meets virtually, once a month, and sends open invitations to join their meetings via the [DCAT mailing list](https://groups.google.com/d/forum/DSpaceCommunityAdvisoryTeam).
* [How to Contribute to DSpace](https://wiki.lyrasis.org/display/DSPACE/How+to+Contribute+to+DSpace): How to contribute in general (via code, documentation, bug reports, expertise, etc)
* [Code Contribution Guidelines](https://wiki.lyrasis.org/display/DSPACE/Code+Contribution+Guidelines): How to give back code or contribute features, bug fixes, etc.
* [DSpace Community Advisory Team (DCAT)](https://wiki.lyrasis.org/display/cmtygp/DSpace+Community+Advisory+Team): If you are not a developer, we also have an interest group specifically for repository managers. The DCAT group meets virtually, once a month, and sends open invitations to join their meetings via the [DCAT mailing list](https://groups.google.com/d/forum/DSpaceCommunityAdvisoryTeam).
We also encourage GitHub Pull Requests (PRs) at any time. Please see our [Development with Git](https://wiki.duraspace.org/display/DSPACE/Development+with+Git) guide for more info.
We also encourage GitHub Pull Requests (PRs) at any time. Please see our [Development with Git](https://wiki.lyrasis.org/display/DSPACE/Development+with+Git) guide for more info.
In addition, a listing of all known contributors to DSpace software can be
found online at: https://wiki.duraspace.org/display/DSPACE/DSpaceContributors
found online at: https://wiki.lyrasis.org/display/DSPACE/DSpaceContributors
## Getting Help
@@ -64,12 +53,10 @@ DSpace provides public mailing lists where you can post questions or raise topic
We welcome everyone to participate in these lists:
* [dspace-community@googlegroups.com](https://groups.google.com/d/forum/dspace-community) : General discussion about DSpace platform, announcements, sharing of best practices
* [dspace-tech@googlegroups.com](https://groups.google.com/d/forum/dspace-tech) : Technical support mailing list. See also our guide for [How to troubleshoot an error](https://wiki.duraspace.org/display/DSPACE/Troubleshoot+an+error).
* [dspace-tech@googlegroups.com](https://groups.google.com/d/forum/dspace-tech) : Technical support mailing list. See also our guide for [How to troubleshoot an error](https://wiki.lyrasis.org/display/DSPACE/Troubleshoot+an+error).
* [dspace-devel@googlegroups.com](https://groups.google.com/d/forum/dspace-devel) : Developers / Development mailing list
Great Q&A is also available under the [DSpace tag on Stackoverflow](http://stackoverflow.com/questions/tagged/dspace)
Additional support options are listed at https://wiki.duraspace.org/display/DSPACE/Support
Additional support options are listed at https://wiki.lyrasis.org/display/DSPACE/Support
DSpace also has an active service provider network. If you'd rather hire a service provider to
install, upgrade, customize or host DSpace, then we recommend getting in touch with one of our
@@ -77,7 +64,8 @@ install, upgrade, customize or host DSpace, then we recommend getting in touch w
## Issue Tracker
The DSpace Issue Tracker can be found at: https://jira.duraspace.org/projects/DS/summary
DSpace uses Github to track issues:
https://github.com/DSpace/DSpace/issues
## License


@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.2//EN"
"http://checkstyle.sourceforge.net/dtds/suppressions_1_2.dtd">
<suppressions>
<!-- Temporarily suppress indentation checks for all Tests -->
<!-- TODO: We should have these turned on. But, currently there's a known bug with indentation checks
on JMockIt Expectations blocks and similar. See https://github.com/checkstyle/checkstyle/issues/3739 -->
<suppress checks="Indentation" files="src[/\\]test[/\\]java"/>
</suppressions>


@@ -1,144 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE module PUBLIC
"-//Puppy Crawl//DTD Check Configuration 1.3//EN"
"http://checkstyle.sourceforge.net/dtds/configuration_1_3.dtd">
<!--
DSpace CodeStyle Requirements
1. 4-space indents for Java, and 2-space indents for XML. NO TABS ALLOWED.
2. K&R style braces required. Braces required on all blocks.
3. Do not use wildcard imports (e.g. import java.util.*). Duplicated or unused imports also not allowed.
4. Javadocs should exist for all public classes and methods. (Methods rule is unenforced at this time.) Keep it short and to the point
5. Maximum line length is 120 characters (except for long URLs, packages or imports)
6. No trailing spaces allowed (except in comments)
7. Tokens should be surrounded by whitespace (see http://checkstyle.sourceforge.net/config_whitespace.html#WhitespaceAround)
8. Each source file must include our license header (validated separately by license-maven-plugin, see pom.xml)
For more information on CheckStyle configurations below, see: http://checkstyle.sourceforge.net/checks.html
-->
<module name="Checker">
<!-- Configure checker to use UTF-8 encoding -->
<property name="charset" value="UTF-8"/>
<!-- Configure checker to run on files with these extensions -->
<property name="fileExtensions" value="java, properties, cfg, xml"/>
<!-- Suppression configurations in checkstyle-suppressions.xml in same directory -->
<module name="SuppressionFilter">
<property name="file" value="${checkstyle.suppressions.file}" default="checkstyle-suppressions.xml"/>
</module>
<!-- No tab characters ('\t') allowed in the source code -->
<module name="FileTabCharacter">
<property name="eachLine" value="true"/>
<property name="fileExtensions" value="java, properties, cfg, css, js, xml"/>
</module>
<!-- No Trailing Whitespace, except on lines that only have an asterisk (e.g. Javadoc comments) -->
<module name="RegexpSingleline">
<property name="format" value="(?&lt;!\*)\s+$|\*\s\s+$"/>
<property name="message" value="Line has trailing whitespace"/>
<property name="fileExtensions" value="java, properties, cfg, css, js, xml"/>
</module>
<!-- Allow individual lines of code to be excluded from these rules, if they are annotated
with @SuppressWarnings. See also SuppressWarningsHolder below -->
<module name="SuppressWarningsFilter" />
<!-- Check individual Java source files for specific rules -->
<module name="TreeWalker">
<!-- Maximum line length is 120 characters -->
<module name="LineLength">
<property name="max" value="120"/>
<!-- Only exceptions for packages, imports, URLs, and JavaDoc {@link} tags -->
<property name="ignorePattern" value="^package.*|^import.*|http://|https://|@link"/>
</module>
<!-- Highlight any TODO or FIXME comments in info messages -->
<module name="TodoComment">
<property name="severity" value="info"/>
<property name="format" value="(TODO)|(FIXME)"/>
</module>
<!-- Do not report errors on any lines annotated with @SuppressWarnings -->
<module name="SuppressWarningsHolder"/>
<!-- ##### Import statement requirements ##### -->
<!-- Star imports (e.g. import java.util.*) are NOT ALLOWED -->
<module name="AvoidStarImport"/>
<!-- Redundant import statements are NOT ALLOWED -->
<module name="RedundantImport"/>
<!-- Unused import statements are NOT ALLOWED -->
<module name="UnusedImports"/>
<!-- Ensure imports appear alphabetically and grouped -->
<module name="CustomImportOrder">
<property name="sortImportsInGroupAlphabetically" value="true"/>
<property name="separateLineBetweenGroups" value="true"/>
<property name="customImportOrderRules" value="STATIC###STANDARD_JAVA_PACKAGE###THIRD_PARTY_PACKAGE"/>
</module>
<!-- ##### Javadocs requirements ##### -->
<!-- Requirements for Javadocs for classes/interfaces -->
<module name="JavadocType">
<!-- All public classes/interfaces MUST HAVE Javadocs -->
<property name="scope" value="public"/>
<!-- Add an exception for anonymous inner classes -->
<property name="excludeScope" value="anoninner"/>
<!-- Ignore errors related to unknown tags -->
<property name="allowUnknownTags" value="true"/>
<!-- Allow params tags to be optional -->
<property name="allowMissingParamTags" value="false"/>
</module>
<!-- Requirements for Javadocs for methods -->
<module name="JavadocMethod">
<!-- All public methods MUST HAVE Javadocs -->
<!-- <property name="scope" value="public"/> -->
<!-- TODO: Above rule has been disabled because of large amount of missing public method Javadocs -->
<property name="scope" value="nothing"/>
<!-- Allow RuntimeExceptions to be undeclared -->
<property name="allowUndeclaredRTE" value="true"/>
<!-- Allow params, throws and return tags to be optional -->
<property name="allowMissingParamTags" value="true"/>
<property name="allowMissingThrowsTags" value="true"/>
<property name="allowMissingReturnTag" value="true"/>
</module>
<!-- ##### Requirements for K&R Style braces ##### -->
<!-- Code blocks MUST HAVE braces, even single line statements (if, while, etc) -->
<module name="NeedBraces"/>
<!-- Left braces should be at the end of current line (default value)-->
<module name="LeftCurly"/>
<!-- Right braces should be on start of a new line (default value) -->
<module name="RightCurly"/>
<!-- ##### Indentation / Whitespace requirements ##### -->
<!-- Require 4-space indentation (default value) -->
<module name="Indentation"/>
<!-- Whitespace should exist around all major tokens -->
<module name="WhitespaceAround">
<!-- However, make an exception for empty constructors, methods, types, etc. -->
<property name="allowEmptyConstructors" value="true"/>
<property name="allowEmptyMethods" value="true"/>
<property name="allowEmptyTypes" value="true"/>
<property name="allowEmptyLoops" value="true"/>
</module>
<!-- Validate whitespace around Generics (angle brackets) per typical conventions
http://checkstyle.sourceforge.net/config_whitespace.html#GenericWhitespace -->
<module name="GenericWhitespace"/>
<!-- ##### Requirements for "switch" statements ##### -->
<!-- "switch" statements MUST have a "default" clause -->
<module name="MissingSwitchDefault"/>
<!-- "case" clauses in switch statements MUST include break, return, throw or continue -->
<module name="FallThrough"/>
<!-- ##### Other / Miscellaneous requirements ##### -->
<!-- Require utility classes do not have a public constructor -->
<module name="HideUtilityClassConstructor"/>
<!-- Require each variable declaration is its own statement on its own line -->
<module name="MultipleVariableDeclarations"/>
<!-- Each line of code can only include one statement -->
<module name="OneStatementPerLine"/>
<!-- Require that "catch" statements are not empty (must at least contain a comment) -->
<module name="EmptyCatchBlock"/>
</module>
</module>
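The two regular expressions in the removed configuration above are worth unpacking: the trailing-whitespace pattern uses a negative lookbehind so Javadoc continuation lines ending in `* ` pass, and the `LineLength` `ignorePattern` exempts package/import statements, URLs, and `{@link}` tags. A small sketch in Python's `re` (whose syntax accepts these patterns as-is; Checkstyle itself uses `java.util.regex`, and its exact matching semantics may differ slightly):

```python
import re

# Trailing-whitespace rule from the RegexpSingleline module: (?<!\*) lets a
# Javadoc continuation line ending in "* " pass, while an asterisk followed
# by two or more spaces is still flagged.
trailing_ws = re.compile(r"(?<!\*)\s+$|\*\s\s+$")
assert trailing_ws.search("int x = 1;   ")   # ordinary trailing spaces: flagged
assert not trailing_ws.search(" * ")         # Javadoc continuation: allowed
assert trailing_ws.search(" *   ")           # asterisk plus extra spaces: flagged

# LineLength exemptions: a line over 120 characters is ignored if it matches
# the ignorePattern anywhere.
ignore = re.compile(r"^package.*|^import.*|http://|https://|@link")

def violates(line):
    return len(line) > 120 and not ignore.search(line)

assert violates("x = " + "a" * 130)                            # plain long code
assert not violates("// see https://example.org/" + "a" * 130) # URL line: exempt
```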

docker-compose-cli.yml

@@ -0,0 +1,33 @@
#
# The contents of this file are subject to the license and copyright
# detailed in the LICENSE and NOTICE files at the root of the source
# tree and available online at
#
# http://www.dspace.org/license/
#
version: "3.7"
services:
dspace-cli:
image: "dspace/dspace-cli:${DSPACE_VER:-dspace-6_x}"
container_name: dspace-cli
build:
context: .
dockerfile: Dockerfile.cli.jdk8
#environment:
volumes:
- ./dspace/src/main/docker-compose/local.cfg:/dspace/config/local.cfg
- assetstore:/dspace/assetstore
entrypoint: /dspace/bin/dspace
command: help
networks:
- dspacenet
tty: true
stdin_open: true
volumes:
assetstore:
networks:
dspacenet:
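The `dspace-cli` service above mounts `./dspace/src/main/docker-compose/local.cfg` into the container. That file is not shown in this changeset; a minimal sketch of what it might contain for this setup (the keys are standard DSpace 6 `local.cfg` settings, but the values here are assumptions matching the service names in these compose files):

```
# Minimal local.cfg sketch (assumed values; the real file lives at
# dspace/src/main/docker-compose/local.cfg in the source tree).
dspace.dir = /dspace
dspace.hostname = localhost
dspace.baseUrl = http://localhost:8080
# "dspacedb" is the database service name resolvable on the dspacenet network.
db.url = jdbc:postgresql://dspacedb:5432/dspace
db.username = dspace
db.password = dspace
```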

docker-compose.yml

@@ -0,0 +1,53 @@
#
# The contents of this file are subject to the license and copyright
# detailed in the LICENSE and NOTICE files at the root of the source
# tree and available online at
#
# http://www.dspace.org/license/
#
version: '3.7'
networks:
dspacenet:
services:
dspace:
container_name: dspace
depends_on:
- dspacedb
image: "${DOCKER_OWNER:-dspace}/dspace:${DSPACE_VER:-dspace-6_x-jdk8-test}"
build:
context: .
dockerfile: Dockerfile.jdk8-test
networks:
dspacenet:
ports:
- published: 8080
target: 8080
stdin_open: true
tty: true
volumes:
- ./dspace/src/main/docker-compose/local.cfg:/dspace/config/local.cfg
- ./dspace/src/main/docker-compose/xmlui.xconf:/dspace/config/xmlui.xconf
- assetstore:/dspace/assetstore
- solr_authority:/dspace/solr/authority/data
- solr_oai:/dspace/solr/oai/data
- solr_search:/dspace/solr/search/data
- solr_statistics:/dspace/solr/statistics/data
dspacedb:
container_name: dspacedb
environment:
PGDATA: /pgdata
image: dspace/dspace-postgres-pgcrypto
networks:
dspacenet:
stdin_open: true
tty: true
volumes:
- pgdata:/pgdata
volumes:
assetstore:
pgdata:
solr_authority:
solr_oai:
solr_search:
solr_statistics:

dspace-api/pom.xml

@@ -1,5 +1,4 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.dspace</groupId>
<artifactId>dspace-api</artifactId>
@@ -13,7 +12,7 @@
<parent>
<groupId>org.dspace</groupId>
<artifactId>dspace-parent</artifactId>
<version>7.0-SNAPSHOT</version>
<version>6.4</version>
<relativePath>..</relativePath>
</parent>
@@ -206,8 +205,7 @@
<executions>
<execution>
<id>setproperty</id>
<phase>generate-test-resources
</phase> <!-- XXX I think this should be 'initialize' - MHW -->
<phase>generate-test-resources</phase> <!-- XXX I think this should be 'initialize' - MHW -->
<goals>
<goal>execute</goal>
</goals>
@@ -268,12 +266,16 @@
<include>**/*.xsl</include>
<include>**/*.xmap</include>
</includes>
<excludes>
<exclude>**/node/node_modules/**</exclude>
</excludes>
</validationSet>
</validationSets>
</configuration>
</plugin>
<!-- Run Integration Testing! This plugin just kicks off the tests (when enabled). -->
<plugin>
<artifactId>maven-failsafe-plugin</artifactId>
@@ -292,6 +294,7 @@
</profile>
</profiles>
<dependencies>
<dependency>
<groupId>org.hibernate</groupId>
@@ -307,21 +310,10 @@
<groupId>org.hibernate</groupId>
<artifactId>hibernate-ehcache</artifactId>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator-cdi</artifactId>
<version>${hibernate-validator.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b10</version>
</dependency>
<dependency>
<groupId>org.dspace</groupId>
<artifactId>handle</artifactId>
@@ -344,12 +336,12 @@
<type>pom</type>
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
</dependency>
@@ -365,10 +357,6 @@
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-collections4</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-dbcp2</artifactId>
@@ -418,8 +406,8 @@
<artifactId>jdom</artifactId>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<groupId>ch.qos.reload4j</groupId>
<artifactId>reload4j</artifactId>
</dependency>
<dependency>
<groupId>oro</groupId>
@@ -449,6 +437,10 @@
<groupId>org.apache.poi</groupId>
<artifactId>poi-scratchpad</artifactId>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
</dependency>
<dependency>
<groupId>rome</groupId>
<artifactId>rome</artifactId>
@@ -511,12 +503,16 @@
<artifactId>h2</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.databene</groupId>
<artifactId>contiperf</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.rometools</groupId>
<artifactId>rome-modules</artifactId>
@@ -532,6 +528,14 @@
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
@@ -549,6 +553,10 @@
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
@@ -572,6 +580,14 @@
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
</dependency>
@@ -595,9 +611,9 @@
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-core</artifactId>
<version>4.10.4</version>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>1.4.0</version>
</dependency>
<dependency>
@@ -683,6 +699,7 @@
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.2</version>
</dependency>
<dependency>
<groupId>javax.inject</groupId>
@@ -719,20 +736,19 @@
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- S3 also wanted jackson... -->
<!-- For ORCID v2 integration -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<groupId>org.dspace</groupId>
<artifactId>orcid-jaxb-api</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20180130</version>
</dependency>
</dependencies>

ExtractingParams.java

@@ -19,6 +19,7 @@ package org.apache.solr.handler.extraction;
/**
* The various Solr Parameters names to use when extracting content.
*
**/
public interface ExtractingParams {
@@ -40,6 +41,8 @@ public interface ExtractingParams {
* <pre>fmap.title=solr.title</pre>
*
* In this example, the tika "title" metadata value will be added to a Solr field named "solr.title"
*
*
*/
public static final String MAP_PREFIX = "fmap.";
@@ -52,6 +55,7 @@ public interface ExtractingParams {
* boost.solr.title=2.5
* </pre>
* will boost the solr.title field for this document by 2.5
*
*/
public static final String BOOST_PREFIX = "boost.";
@@ -60,6 +64,7 @@ public interface ExtractingParams {
* <pre>
* literal.myField=Foo
* </pre>
*
*/
public static final String LITERALS_PREFIX = "literal.";
@@ -67,10 +72,10 @@ public interface ExtractingParams {
/**
* Restrict the extracted parts of a document to be indexed
* by passing in an XPath expression. All content that satisfies the XPath expr.
* will be passed to the {@link org.apache.solr.handler.extraction.SolrContentHandler}.
* will be passed to the {@link SolrContentHandler}.
* <p>
* See Tika's docs for what the extracted document looks like.
*
* <p>
* @see #CAPTURE_ELEMENTS
*/
public static final String XPATH_EXPRESSION = "xpath";
@@ -87,24 +92,20 @@ public interface ExtractingParams {
public static final String EXTRACT_FORMAT = "extractFormat";
/**
* Capture attributes separately according to the name of the element, instead of just adding them to the string
* buffer
* Capture attributes separately according to the name of the element, instead of just adding them to the string buffer
*/
public static final String CAPTURE_ATTRIBUTES = "captureAttr";
/**
* Literal field values will by default override other values such as metadata and content. Set this to false to
* revert to pre-4.0 behaviour
* Literal field values will by default override other values such as metadata and content. Set this to false to revert to pre-4.0 behaviour
*/
public static final String LITERALS_OVERRIDE = "literalsOverride";
/**
* Capture the specified fields (and everything included below it that isn't capture by some other capture field)
* separately from the default. This is different
* Capture the specified fields (and everything included below it that isn't capture by some other capture field) separately from the default. This is different
* then the case of passing in an XPath expression.
* <p>
* The Capture field is based on the localName returned to the
* {@link org.apache.solr.handler.extraction.SolrContentHandler}
* The Capture field is based on the localName returned to the {@link SolrContentHandler}
* by Tika, not to be confused by the mapped field. The field name can then
* be mapped into the index schema.
* <p>
@@ -119,6 +120,7 @@ public interface ExtractingParams {
* </pre>
* By passing in the p tag, you could capture all P tags separately from the rest of the text.
* Thus, in the example, the capture of the P tag would be: "some text here. more text"
*
*/
public static final String CAPTURE_ELEMENTS = "capture";

CommunityFiliator.java

@@ -35,7 +35,8 @@ import org.dspace.handle.service.HandleService;
* @version $Revision$
*/
public class CommunityFiliator {
public class CommunityFiliator
{
protected CommunityService communityService;
protected HandleService handleService;
@@ -46,10 +47,12 @@ public class CommunityFiliator {
}
/**
* @param argv the command line arguments given
*
* @param argv arguments
* @throws Exception if error
*/
public static void main(String[] argv) throws Exception {
public static void main(String[] argv) throws Exception
{
// create an options object and populate it
CommandLineParser parser = new PosixParser();
@@ -70,7 +73,8 @@ public class CommunityFiliator {
String parentID = null;
String childID = null;
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("CommunityFiliator\n", options);
System.out
@@ -81,37 +85,45 @@ public class CommunityFiliator {
System.exit(0);
}
if (line.hasOption('s')) {
if (line.hasOption('s'))
{
command = "set";
}
if (line.hasOption('r')) {
if (line.hasOption('r'))
{
command = "remove";
}
if (line.hasOption('p')) { // parent
if (line.hasOption('p')) // parent
{
parentID = line.getOptionValue('p');
}
if (line.hasOption('c')) { // child
if (line.hasOption('c')) // child
{
childID = line.getOptionValue('c');
}
// now validate
// must have a command set
if (command == null) {
if (command == null)
{
System.out
.println("Error - must run with either set or remove (run with -h flag for details)");
System.exit(1);
}
if ("set".equals(command) || "remove".equals(command)) {
if (parentID == null) {
if ("set".equals(command) || "remove".equals(command))
{
if (parentID == null)
{
System.out.println("Error - a parentID must be specified (run with -h flag for details)");
System.exit(1);
}
if (childID == null) {
if (childID == null)
{
System.out.println("Error - a childID must be specified (run with -h flag for details)");
System.exit(1);
}
@@ -123,39 +135,52 @@ public class CommunityFiliator {
// we are superuser!
c.turnOffAuthorisationSystem();
try {
try
{
// validate and resolve the parent and child IDs into communities
Community parent = filiator.resolveCommunity(c, parentID);
Community child = filiator.resolveCommunity(c, childID);
if (parent == null) {
if (parent == null)
{
System.out.println("Error, parent community cannot be found: "
+ parentID);
System.exit(1);
}
if (child == null) {
if (child == null)
{
System.out.println("Error, child community cannot be found: "
+ childID);
System.exit(1);
}
if ("set".equals(command)) {
if ("set".equals(command))
{
filiator.filiate(c, parent, child);
} else {
}
else
{
filiator.defiliate(c, parent, child);
}
} catch (SQLException sqlE) {
}
catch (SQLException sqlE)
{
System.out.println("Error - SQL exception: " + sqlE.toString());
} catch (AuthorizeException authE) {
}
catch (AuthorizeException authE)
{
System.out.println("Error - Authorize exception: "
+ authE.toString());
} catch (IOException ioE) {
}
catch (IOException ioE)
{
System.out.println("Error - IO exception: " + ioE.toString());
}
}
/**
*
* @param c context
* @param parent parent Community
* @param child child community
@@ -164,14 +189,15 @@ public class CommunityFiliator {
* @throws IOException if IO error
*/
public void filiate(Context c, Community parent, Community child)
throws SQLException, AuthorizeException, IOException {
throws SQLException, AuthorizeException, IOException
{
// check that a valid filiation would be established
// first test - proposed child must currently be an orphan (i.e.
// top-level)
Community childDad = CollectionUtils.isNotEmpty(child.getParentCommunities()) ? child.getParentCommunities()
.iterator().next() : null;
Community childDad = CollectionUtils.isNotEmpty(child.getParentCommunities()) ? child.getParentCommunities().iterator().next() : null;
if (childDad != null) {
if (childDad != null)
{
System.out.println("Error, child community: " + child.getID()
+ " already a child of: " + childDad.getID());
System.exit(1);
@@ -180,14 +206,12 @@ public class CommunityFiliator {
// second test - circularity: parent's parents can't include proposed
// child
List<Community> parentDads = parent.getParentCommunities();
for (int i = 0; i < parentDads.size(); i++) {
if (parentDads.get(i).getID().equals(child.getID())) {
System.out
.println("Error, circular parentage - child is parent of parent");
if (parentDads.contains(child))
{
System.out.println(
"Error, circular parentage - child is parent of parent");
System.exit(1);
}
}
// everything's OK
communityService.addSubcommunity(c, parent, child);
@@ -199,6 +223,7 @@ public class CommunityFiliator {
}
/**
*
* @param c context
* @param parent parent Community
* @param child child community
@@ -207,20 +232,12 @@ public class CommunityFiliator {
* @throws IOException if IO error
*/
public void defiliate(Context c, Community parent, Community child)
throws SQLException, AuthorizeException, IOException {
throws SQLException, AuthorizeException, IOException
{
// verify that child is indeed a child of parent
List<Community> parentKids = parent.getSubcommunities();
boolean isChild = false;
for (int i = 0; i < parentKids.size(); i++) {
if (parentKids.get(i).getID().equals(child.getID())) {
isChild = true;
break;
}
}
if (!isChild) {
if (!parentKids.contains(child))
{
System.out
.println("Error, child community not a child of parent community");
System.exit(1);
@@ -228,8 +245,8 @@ public class CommunityFiliator {
// OK remove the mappings - but leave the community, which will become
// top-level
child.getParentCommunities().remove(parent);
parent.getSubcommunities().remove(child);
child.removeParentCommunity(parent);
parent.removeSubCommunity(child);
communityService.update(c, child);
communityService.update(c, parent);
@@ -242,27 +259,31 @@ public class CommunityFiliator {
/**
* Find a community by ID
*
* @param c context
* @param communityID community ID
* @return Community object
* @throws SQLException if database error
*/
protected Community resolveCommunity(Context c, String communityID)
throws SQLException {
throws SQLException
{
Community community = null;
if (communityID.indexOf('/') != -1) {
if (communityID.indexOf('/') != -1)
{
// has a / must be a handle
community = (Community) handleService.resolveToObject(c,
communityID);
// ensure it's a community
if ((community == null)
|| (community.getType() != Constants.COMMUNITY)) {
|| (community.getType() != Constants.COMMUNITY))
{
community = null;
}
} else {
}
else
{
community = communityService.find(c, UUID.fromString(communityID));
}

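The CommunityFiliator change above replaces explicit ID-comparison loops with `List.contains`, which is only equivalent when `equals()` compares the same ID. A minimal sketch of that equivalence, using a hypothetical `Node` class rather than the real `Community` entity:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;

// Sketch: an entity whose equals() is based on its ID, so that
// List.contains() gives the same answer as a manual getID() loop.
class Node {
    final int id;
    Node(int id) { this.id = id; }
    @Override public boolean equals(Object o) {
        return o instanceof Node && ((Node) o).id == this.id;
    }
    @Override public int hashCode() { return Objects.hash(id); }
}

public class ContainsExample {
    public static void main(String[] args) {
        List<Node> parents = Arrays.asList(new Node(1), new Node(2));
        // Old style: loop over the list comparing IDs by hand.
        boolean found = false;
        for (Node n : parents) {
            if (n.id == 2) { found = true; break; }
        }
        // New style: delegate the ID comparison to equals() via contains().
        System.out.println(found == parents.contains(new Node(2)));
    }
}
```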
CreateAdministrator.java

@@ -15,6 +15,7 @@ import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.lang.StringUtils;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
@@ -41,12 +42,12 @@ import org.dspace.eperson.service.GroupService;
*
* @author Robert Tansley
* @author Richard Jones
*
* @version $Revision$
*/
public final class CreateAdministrator {
/**
* DSpace Context object
*/
public final class CreateAdministrator
{
/** DSpace Context object */
private final Context context;
protected EPersonService ePersonService;
@@ -56,11 +57,13 @@ public final class CreateAdministrator {
* For invoking via the command line. If called with no command line arguments,
* it will negotiate with the user for the administrator details
*
* @param argv the command line arguments given
* @param argv
* command-line arguments
* @throws Exception if error
*/
public static void main(String[] argv)
throws Exception {
throws Exception
{
CommandLineParser parser = new PosixParser();
Options options = new Options();
@@ -75,11 +78,14 @@ public final class CreateAdministrator {
CommandLine line = parser.parse(options, argv);
if (line.hasOption("e") && line.hasOption("f") && line.hasOption("l") &&
line.hasOption("c") && line.hasOption("p")) {
line.hasOption("c") && line.hasOption("p"))
{
ca.createAdministrator(line.getOptionValue("e"),
line.getOptionValue("f"), line.getOptionValue("l"),
line.getOptionValue("c"), line.getOptionValue("p"));
} else {
}
else
{
ca.negotiateAdministratorDetails();
}
}
@@ -90,7 +96,8 @@ public final class CreateAdministrator {
* @throws Exception if error
*/
protected CreateAdministrator()
throws Exception {
throws Exception
{
context = new Context();
groupService = EPersonServiceFactory.getInstance().getGroupService();
ePersonService = EPersonServiceFactory.getInstance().getEPersonService();
@@ -103,7 +110,8 @@ public final class CreateAdministrator {
* @throws Exception if error
*/
protected void negotiateAdministratorDetails()
throws Exception {
throws Exception
{
Console console = System.console();
System.out.println("Creating an initial administrator account");
@@ -117,14 +125,18 @@ public final class CreateAdministrator {
char[] password2 = null;
String language = I18nUtil.DEFAULTLOCALE.getLanguage();
while (!dataOK) {
while (!dataOK)
{
System.out.print("E-mail address: ");
System.out.flush();
email = console.readLine();
if (!StringUtils.isBlank(email)) {
if (!StringUtils.isBlank(email))
{
email = email.trim();
} else {
}
else
{
System.out.println("Please provide an email address.");
continue;
}
@@ -134,7 +146,8 @@ public final class CreateAdministrator {
firstName = console.readLine();
if (firstName != null) {
if (firstName != null)
{
firstName = firstName.trim();
}
@@ -143,19 +156,21 @@ public final class CreateAdministrator {
lastName = console.readLine();
if (lastName != null) {
if (lastName != null)
{
lastName = lastName.trim();
}
if (ConfigurationManager.getProperty("webui.supported.locales") != null) {
System.out.println("Select one of the following languages: " + ConfigurationManager
.getProperty("webui.supported.locales"));
if (ConfigurationManager.getProperty("webui.supported.locales") != null)
{
System.out.println("Select one of the following languages: " + ConfigurationManager.getProperty("webui.supported.locales"));
System.out.print("Language: ");
System.out.flush();
language = console.readLine();
if (language != null) {
if (language != null)
{
language = language.trim();
language = I18nUtil.getSupportedLocale(new Locale(language)).getLanguage();
}
@@ -173,20 +188,25 @@ public final class CreateAdministrator {
password2 = console.readPassword();
//TODO real password validation
if (password1.length > 1 && Arrays.equals(password1, password2)) {
if (password1.length > 1 && Arrays.equals(password1, password2))
{
// password OK
System.out.print("Is the above data correct? (y or n): ");
System.out.flush();
String s = console.readLine();
if (s != null) {
if (s != null)
{
s = s.trim();
if (s.toLowerCase().startsWith("y")) {
if (s.toLowerCase().startsWith("y"))
{
dataOK = true;
}
}
} else {
}
else
{
System.out.println("Passwords don't match");
}
}
@@ -208,11 +228,13 @@ public final class CreateAdministrator {
* @param last user's last name
* @param language preferred language
* @param pw desired password
*
* @throws Exception if error
*/
protected void createAdministrator(String email, String first, String last,
String language, String pw)
throws Exception {
throws Exception
{
// Of course we aren't an administrator yet so we need to
// circumvent authorisation
context.turnOffAuthorisationSystem();
@@ -220,7 +242,8 @@ public final class CreateAdministrator {
// Find administrator group
Group admins = groupService.findByName(context, Group.ADMIN);
if (admins == null) {
if (admins == null)
{
throw new IllegalStateException("Error, no admin group (group 1) found");
}
@@ -229,7 +252,8 @@ public final class CreateAdministrator {
// check if the email belongs to a registered user,
// if not create a new user with this email
if (eperson == null) {
if (eperson == null)
{
eperson = ePersonService.create(context);
eperson.setEmail(email);
eperson.setCanLogIn(true);

MetadataExporter.java

@@ -7,19 +7,7 @@
*/
package org.dspace.administer;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.cli.*;
import org.apache.xml.serialize.Method;
import org.apache.xml.serialize.OutputFormat;
import org.apache.xml.serialize.XMLSerializer;
@@ -31,6 +19,14 @@ import org.dspace.content.service.MetadataSchemaService;
import org.dspace.core.Context;
import org.xml.sax.SAXException;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/**
* @author Graham Triggs
@@ -48,17 +44,11 @@ import org.xml.sax.SAXException;
* </metadata-schemas>
* }
*/
public class MetadataExporter {
public class MetadataExporter
{
protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance()
.getMetadataSchemaService();
protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance()
.getMetadataFieldService();
/**
* Default constructor
*/
private MetadataExporter() { }
protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance().getMetadataSchemaService();
protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService();
/**
* @param args commandline arguments
@@ -68,8 +58,8 @@ public class MetadataExporter {
* @throws SQLException if database error
* @throws RegistryExportException if export error
*/
public static void main(String[] args)
throws ParseException, SQLException, IOException, SAXException, RegistryExportException {
public static void main(String[] args) throws ParseException, SQLException, IOException, SAXException, RegistryExportException
{
// create an options object and populate it
CommandLineParser parser = new PosixParser();
Options options = new Options();
@@ -80,14 +70,18 @@ public class MetadataExporter {
String file = null;
String schema = null;
if (line.hasOption('f')) {
if (line.hasOption('f'))
{
file = line.getOptionValue('f');
} else {
}
else
{
usage();
System.exit(0);
}
if (line.hasOption('s')) {
if (line.hasOption('s'))
{
schema = line.getOptionValue('s');
}
@@ -96,7 +90,6 @@ public class MetadataExporter {
/**
* Save a registry to a filepath
*
* @param file filepath
* @param schema schema definition to save
* @throws SQLException if database error
@@ -104,8 +97,8 @@ public class MetadataExporter {
* @throws SAXException if XML error
* @throws RegistryExportException if export error
*/
public static void saveRegistry(String file, String schema)
throws SQLException, IOException, SAXException, RegistryExportException {
public static void saveRegistry(String file, String schema) throws SQLException, IOException, SAXException, RegistryExportException
{
// create a context
Context context = new Context();
context.turnOffAuthorisationSystem();
@@ -125,22 +118,27 @@ public class MetadataExporter {
List<MetadataField> mdFields = null;
// If a single schema has been specified
if (schema != null && !"".equals(schema)) {
if (schema != null && !"".equals(schema))
{
// Get the id of that schema
MetadataSchema mdSchema = metadataSchemaService.find(context, schema);
if (mdSchema == null) {
if (mdSchema == null)
{
throw new RegistryExportException("no schema to export");
}
// Get the metadata fields only for the specified schema
mdFields = metadataFieldService.findAllInSchema(context, mdSchema);
} else {
}
else
{
// Get the metadata fields for all the schemas
mdFields = metadataFieldService.findAll(context);
}
// Output the metadata fields
for (MetadataField mdField : mdFields) {
for (MetadataField mdField : mdFields)
{
saveType(context, xmlSerializer, mdField);
}
@@ -153,7 +151,6 @@ public class MetadataExporter {
/**
* Serialize the schema registry. If the parameter 'schema' is null or empty, save all schemas
*
* @param context DSpace Context
* @param xmlSerializer XML serializer
* @param schema schema (may be null to save all)
@@ -161,18 +158,22 @@ public class MetadataExporter {
* @throws SAXException if XML error
* @throws RegistryExportException if export error
*/
public static void saveSchema(Context context, XMLSerializer xmlSerializer, String schema)
throws SQLException, SAXException, RegistryExportException {
if (schema != null && !"".equals(schema)) {
public static void saveSchema(Context context, XMLSerializer xmlSerializer, String schema) throws SQLException, SAXException, RegistryExportException
{
if (schema != null && !"".equals(schema))
{
// Find a single named schema
MetadataSchema mdSchema = metadataSchemaService.find(context, schema);
saveSchema(xmlSerializer, mdSchema);
} else {
}
else
{
// Find all schemas
List<MetadataSchema> mdSchemas = metadataSchemaService.findAll(context);
for (MetadataSchema mdSchema : mdSchemas) {
for (MetadataSchema mdSchema : mdSchemas)
{
saveSchema(xmlSerializer, mdSchema);
}
}
@@ -186,22 +187,25 @@ public class MetadataExporter {
* @throws SAXException if XML error
* @throws RegistryExportException if export error
*/
private static void saveSchema(XMLSerializer xmlSerializer, MetadataSchema mdSchema)
throws SAXException, RegistryExportException {
private static void saveSchema(XMLSerializer xmlSerializer, MetadataSchema mdSchema) throws SAXException, RegistryExportException
{
// If we haven't got a schema, it's an error
if (mdSchema == null) {
if (mdSchema == null)
{
throw new RegistryExportException("no schema to export");
}
String name = mdSchema.getName();
String namespace = mdSchema.getNamespace();
if (name == null || "".equals(name)) {
if (name == null || "".equals(name))
{
System.out.println("name is null, skipping");
return;
}
if (namespace == null || "".equals(namespace)) {
if (namespace == null || "".equals(namespace))
{
System.out.println("namespace is null, skipping");
return;
}
@@ -233,10 +237,11 @@ public class MetadataExporter {
* @throws SQLException if database error
* @throws IOException if IO error
*/
private static void saveType(Context context, XMLSerializer xmlSerializer, MetadataField mdField)
throws SAXException, RegistryExportException, SQLException, IOException {
private static void saveType(Context context, XMLSerializer xmlSerializer, MetadataField mdField) throws SAXException, RegistryExportException, SQLException, IOException
{
// If we haven't been given a field, it's an error
if (mdField == null) {
if (mdField == null)
{
throw new RegistryExportException("no field to export");
}
@@ -247,7 +252,8 @@ public class MetadataExporter {
String scopeNote = mdField.getScopeNote();
// We must have a schema and element
if (schemaName == null || element == null) {
if (schemaName == null || element == null)
{
throw new RegistryExportException("incomplete field information");
}
@@ -265,20 +271,26 @@ public class MetadataExporter {
xmlSerializer.endElement("element");
// Output the qualifier, if present
if (qualifier != null) {
if (qualifier != null)
{
xmlSerializer.startElement("qualifier", null);
xmlSerializer.characters(qualifier.toCharArray(), 0, qualifier.length());
xmlSerializer.endElement("qualifier");
} else {
}
else
{
xmlSerializer.comment("unqualified");
}
// Output the scope note, if present
if (scopeNote != null) {
if (scopeNote != null)
{
xmlSerializer.startElement("scope_note", null);
xmlSerializer.characters(scopeNote.toCharArray(), 0, scopeNote.length());
xmlSerializer.endElement("scope_note");
} else {
}
else
{
xmlSerializer.comment("no scope note");
}
@@ -286,29 +298,31 @@ public class MetadataExporter {
}
static Map<Integer, String> schemaMap = new HashMap<Integer, String>();
/**
* Helper method to retrieve a schema name for the field.
* Caches the name after looking up the id.
*
* @param context DSpace Context
* @param mdField DSpace metadata field
* @return name of schema
* @throws SQLException if database error
* @throws RegistryExportException if export error
*/
private static String getSchemaName(Context context, MetadataField mdField)
throws SQLException, RegistryExportException {
private static String getSchemaName(Context context, MetadataField mdField) throws SQLException, RegistryExportException
{
// Get name from cache
String name = schemaMap.get(mdField.getMetadataSchema().getID());
if (name == null) {
if (name == null)
{
// Name not retrieved before, so get the schema now
MetadataSchema mdSchema = metadataSchemaService.find(context, mdField.getMetadataSchema().getID());
if (mdSchema != null) {
if (mdSchema != null)
{
name = mdSchema.getName();
schemaMap.put(mdSchema.getID(), name);
} else {
}
else
{
// Can't find the schema
throw new RegistryExportException("Can't get schema name for field");
}
@@ -319,7 +333,8 @@ public class MetadataExporter {
/**
* Print the usage message to stdout
*/
public static void usage() {
public static void usage()
{
String usage = "Use this class with the following options:\n" +
" -f <xml output file> : specify the output file for the schemas\n" +
" -s <schema> : name of the schema to export\n";

MetadataImporter.java

@@ -9,6 +9,7 @@ package org.dspace.administer;
import java.io.IOException;
import java.sql.SQLException;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;
@@ -17,7 +18,9 @@ import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.xpath.XPathAPI;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
@@ -28,9 +31,11 @@ import org.dspace.content.service.MetadataSchemaService;
import org.dspace.core.Context;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;
/**
@@ -56,26 +61,17 @@ import org.xml.sax.SAXException;
* </dspace-dc-types>
* }
*/
public class MetadataImporter {
protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance()
.getMetadataSchemaService();
protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance()
.getMetadataFieldService();
public class MetadataImporter
{
protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance().getMetadataSchemaService();
protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService();
/**
* logging category
*/
/** logging category */
private static final Logger log = LoggerFactory.getLogger(MetadataImporter.class);
/**
* Default constructor
*/
private MetadataImporter() { }
/**
* main method for reading user input from the command line
*
* @param args the command line arguments given
* @param args arguments
* @throws ParseException if parse error
* @throws SQLException if database error
* @throws IOException if IO error
@@ -89,7 +85,8 @@ public class MetadataImporter {
public static void main(String[] args)
throws ParseException, SQLException, IOException, TransformerException,
ParserConfigurationException, AuthorizeException, SAXException,
NonUniqueMetadataException, RegistryImportException {
NonUniqueMetadataException, RegistryImportException
{
boolean forceUpdate = false;
// create an options object and populate it
@@ -100,9 +97,12 @@ public class MetadataImporter {
CommandLine line = parser.parse(options, args);
String file = null;
if (line.hasOption('f')) {
if (line.hasOption('f'))
{
file = line.getOptionValue('f');
} else {
}
else
{
usage();
System.exit(0);
}
@@ -127,10 +127,12 @@ public class MetadataImporter {
*/
public static void loadRegistry(String file, boolean forceUpdate)
throws SQLException, IOException, TransformerException, ParserConfigurationException,
AuthorizeException, SAXException, NonUniqueMetadataException, RegistryImportException {
AuthorizeException, SAXException, NonUniqueMetadataException, RegistryImportException
{
Context context = null;
try {
try
{
// create a context
context = new Context();
context.turnOffAuthorisationSystem();
@@ -142,7 +144,8 @@ public class MetadataImporter {
NodeList schemaNodes = XPathAPI.selectNodeList(document, "/dspace-dc-types/dc-schema");
// Add each one as a new format to the registry
for (int i = 0; i < schemaNodes.getLength(); i++) {
for (int i = 0; i < schemaNodes.getLength(); i++)
{
Node n = schemaNodes.item(i);
loadSchema(context, n, forceUpdate);
}
@@ -151,18 +154,20 @@ public class MetadataImporter {
NodeList typeNodes = XPathAPI.selectNodeList(document, "/dspace-dc-types/dc-type");
// Add each one as a new format to the registry
for (int i = 0; i < typeNodes.getLength(); i++) {
for (int i = 0; i < typeNodes.getLength(); i++)
{
Node n = typeNodes.item(i);
loadType(context, n);
}
context.restoreAuthSystemState();
context.complete();
} finally {
// Clean up our context, if it still exists & it was never completed
if (context != null && context.isValid()) {
context.abort();
}
finally
{
// Clean up our context, if it still exists & it was never completed
if(context!=null && context.isValid())
context.abort();
}
}
@@ -170,8 +175,10 @@ public class MetadataImporter {
* Process a node in the metadata registry XML file. If the
* schema already exists, it will not be recreated
*
* @param context DSpace context object
* @param node the node in the DOM tree
* @param context
* DSpace context object
* @param node
* the node in the DOM tree
* @throws SQLException if database error
* @throws IOException if IO error
* @throws TransformerException if transformer error
@@ -181,43 +188,51 @@ public class MetadataImporter {
*/
private static void loadSchema(Context context, Node node, boolean updateExisting)
throws SQLException, IOException, TransformerException,
AuthorizeException, NonUniqueMetadataException, RegistryImportException {
AuthorizeException, NonUniqueMetadataException, RegistryImportException
{
// Get the values
String name = RegistryImporter.getElementData(node, "name");
String namespace = RegistryImporter.getElementData(node, "namespace");
if (name == null || "".equals(name)) {
if (name == null || "".equals(name))
{
throw new RegistryImportException("Name of schema must be supplied");
}
if (namespace == null || "".equals(namespace)) {
if (namespace == null || "".equals(namespace))
{
throw new RegistryImportException("Namespace of schema must be supplied");
}
// check to see if the schema already exists
MetadataSchema s = metadataSchemaService.find(context, name);
if (s == null) {
if (s == null)
{
// Schema does not exist - create
log.info("Registering Schema " + name + " (" + namespace + ")");
metadataSchemaService.create(context, name, namespace);
} else {
}
else
{
// Schema exists - if it's the same namespace, allow the type imports to continue
if (s.getNamespace().equals(namespace)) {
if (s.getNamespace().equals(namespace))
{
// This schema already exists with this namespace, skipping it
return;
}
// It's a different namespace - have we been told to update?
if (updateExisting) {
if (updateExisting)
{
// Update the existing schema namespace and continue to type import
log.info("Updating Schema " + name + ": New namespace " + namespace);
s.setNamespace(namespace);
metadataSchemaService.update(context, s);
} else {
throw new RegistryImportException(
"Schema " + name + " already registered with different namespace " + namespace + ". Rerun with " +
"'update' option enabled if you wish to update this schema.");
}
else
{
throw new RegistryImportException("Schema " + name + " already registered with different namespace " + namespace + ". Rerun with 'update' option enabled if you wish to update this schema.");
}
}
@@ -228,8 +243,10 @@ public class MetadataImporter {
* be a "dc-type" node. If the type already exists, then it
* will not be reimported
*
* @param context DSpace context object
* @param node the node in the DOM tree
* @param context
* DSpace context object
* @param node
* the node in the DOM tree
* @throws SQLException if database error
* @throws IOException if IO error
* @throws TransformerException if transformer error
@@ -239,7 +256,8 @@ public class MetadataImporter {
*/
private static void loadType(Context context, Node node)
throws SQLException, IOException, TransformerException,
AuthorizeException, NonUniqueMetadataException, RegistryImportException {
AuthorizeException, NonUniqueMetadataException, RegistryImportException
{
// Get the values
String schema = RegistryImporter.getElementData(node, "schema");
String element = RegistryImporter.getElementData(node, "element");
@@ -247,7 +265,8 @@ public class MetadataImporter {
String scopeNote = RegistryImporter.getElementData(node, "scope_note");
// If the schema is not provided default to DC
if (schema == null) {
if (schema == null)
{
schema = MetadataSchema.DC_SCHEMA;
}
@@ -255,21 +274,22 @@ public class MetadataImporter {
// Find the matching schema object
MetadataSchema schemaObj = metadataSchemaService.find(context, schema);
if (schemaObj == null) {
if (schemaObj == null)
{
throw new RegistryImportException("Schema '" + schema + "' is not registered and does not exist.");
}
MetadataField mf = metadataFieldService.findByElement(context, schemaObj, element, qualifier);
if (mf != null) {
if (mf != null)
{
// Metadata field already exists, skipping it
return;
}
// Actually create this metadata field as it doesn't yet exist
String fieldName = schema + "." + element + "." + qualifier;
if (qualifier == null) {
if(qualifier==null)
fieldName = schema + "." + element;
}
log.info("Registering metadata field " + fieldName);
MetadataField field = metadataFieldService.create(context, schemaObj, element, qualifier, scopeNote);
metadataFieldService.update(context, field);
@@ -278,7 +298,8 @@ public class MetadataImporter {
/**
* Print the usage message to stdout
*/
public static void usage() {
public static void usage()
{
String usage = "Use this class with the following option:\n" +
" -f <xml source file> : specify which xml source file " +
"contains the DC fields to import.\n";
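Read together with the XPath expressions above (`/dspace-dc-types/dc-schema`, `/dspace-dc-types/dc-type`) and the element names pulled out by `loadSchema` and `loadType` (`name`, `namespace`, `schema`, `element`, `qualifier`, `scope_note`), the importer's input file has roughly this shape. The concrete values below are illustrative only, not taken from the source:

```xml
<dspace-dc-types>
  <dc-schema>
    <name>dc</name>
    <namespace>http://dublincore.org/documents/dcmi-terms/</namespace>
  </dc-schema>
  <dc-type>
    <schema>dc</schema>
    <element>title</element>
    <qualifier>alternative</qualifier>
    <scope_note>Varying form of the title.</scope_note>
  </dc-type>
</dspace-dc-types>
```

If the schema already exists under a different namespace, `loadSchema` only overwrites it when the `-u` (update) path sets `forceUpdate`, as the exception message in the diff spells out.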


@@ -12,11 +12,13 @@ package org.dspace.administer;
*
* An exception to report any problems with registry exports
*/
public class RegistryExportException extends Exception {
public class RegistryExportException extends Exception
{
/**
* Create an empty authorize exception
*/
public RegistryExportException() {
public RegistryExportException()
{
super();
}
@@ -25,7 +27,8 @@ public class RegistryExportException extends Exception {
*
* @param message exception message
*/
public RegistryExportException(String message) {
public RegistryExportException(String message)
{
super(message);
}
@@ -35,7 +38,8 @@ public class RegistryExportException extends Exception {
* @param message exception message
* @param e reference to Throwable
*/
public RegistryExportException(String message, Throwable e) {
public RegistryExportException(String message, Throwable e)
{
super(message, e);
}
@@ -44,7 +48,8 @@ public class RegistryExportException extends Exception {
*
* @param e reference to Throwable
*/
public RegistryExportException(Throwable e) {
public RegistryExportException(Throwable e)
{
super(e);
}


@@ -12,11 +12,13 @@ package org.dspace.administer;
*
* An exception to report any problems with registry imports
*/
public class RegistryImportException extends Exception {
public class RegistryImportException extends Exception
{
/**
* Create an empty authorize exception
*/
public RegistryImportException() {
public RegistryImportException()
{
super();
}
@@ -25,7 +27,8 @@ public class RegistryImportException extends Exception {
*
* @param message error message
*/
public RegistryImportException(String message) {
public RegistryImportException(String message)
{
super(message);
}
@@ -35,7 +38,8 @@ public class RegistryImportException extends Exception {
* @param message error message
* @param e throwable
*/
public RegistryImportException(String message, Throwable e) {
public RegistryImportException(String message, Throwable e)
{
super(message, e);
}
@@ -44,7 +48,8 @@ public class RegistryImportException extends Exception {
*
* @param e throwable
*/
public RegistryImportException(Throwable e) {
public RegistryImportException(Throwable e)
{
super(e);
}


@@ -9,15 +9,18 @@ package org.dspace.administer;
import java.io.File;
import java.io.IOException;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;
import org.apache.xpath.XPathAPI;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;
/**
@@ -28,24 +31,22 @@ import org.xml.sax.SAXException;
* I am the author, really I ripped these methods off from other
* classes
*/
public class RegistryImporter {
/**
* Default constructor
*/
private RegistryImporter() { }
public class RegistryImporter
{
/**
* Load in the XML from file.
*
* @param filename the filename to load from
* @param filename
* the filename to load from
*
* @return the DOM representation of the XML file
* @throws IOException if IO error
* @throws ParserConfigurationException if configuration parse error
* @throws SAXException if XML parse error
*/
public static Document loadXML(String filename)
throws IOException, ParserConfigurationException, SAXException {
throws IOException, ParserConfigurationException, SAXException
{
DocumentBuilder builder = DocumentBuilderFactory.newInstance()
.newDocumentBuilder();
@@ -66,17 +67,21 @@ public class RegistryImporter {
* </P>
* Why this isn't a core part of the XML API I do not know...
*
* @param parentElement the element, whose child element you want the CDATA from
* @param childName the name of the element you want the CDATA from
* @return the CDATA as a <code>String</code>
* @param parentElement
* the element, whose child element you want the CDATA from
* @param childName
* the name of the element you want the CDATA from
* @throws TransformerException if error
* @return the CDATA as a <code>String</code>
*/
public static String getElementData(Node parentElement, String childName)
throws TransformerException {
throws TransformerException
{
// Grab the child node
Node childNode = XPathAPI.selectSingleNode(parentElement, childName);
if (childNode == null) {
if (childNode == null)
{
// No child node, so no values
return null;
}
@@ -84,7 +89,8 @@ public class RegistryImporter {
// Get the #text
Node dataNode = childNode.getFirstChild();
if (dataNode == null) {
if (dataNode == null)
{
return null;
}
@@ -109,19 +115,23 @@ public class RegistryImporter {
* </P>
* Why this also isn't a core part of the XML API I do not know...
*
* @param parentElement the element, whose child element you want the CDATA from
* @param childName the name of the element you want the CDATA from
* @return the CDATA as a <code>String</code>
* @param parentElement
* the element, whose child element you want the CDATA from
* @param childName
* the name of the element you want the CDATA from
* @throws TransformerException if error
* @return the CDATA as a <code>String</code>
*/
public static String[] getRepeatedElementData(Node parentElement,
String childName) throws TransformerException {
String childName) throws TransformerException
{
// Grab the child node
NodeList childNodes = XPathAPI.selectNodeList(parentElement, childName);
String[] data = new String[childNodes.getLength()];
for (int i = 0; i < childNodes.getLength(); i++) {
for (int i = 0; i < childNodes.getLength(); i++)
{
// Get the #text node
Node dataNode = childNodes.item(i).getFirstChild();
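The two DOM helpers above are small enough to sketch standalone. This version swaps Xalan's `XPathAPI` (what the original imports) for the JDK's built-in `javax.xml.xpath`, which behaves the same for these simple child-name lookups; the class name and the sample document are invented for the example:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;

public class ElementDataSketch {
    // Mirror of getElementData(): select the named child node and return the
    // value of its first (#text) child, or null when either is missing.
    static String getElementData(Node parent, String childName) throws Exception {
        XPath xpath = XPathFactory.newInstance().newXPath();
        Node child = (Node) xpath.evaluate(childName, parent, XPathConstants.NODE);
        if (child == null) {
            return null;
        }
        Node data = child.getFirstChild();
        return (data == null) ? null : data.getNodeValue();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<dc-type><schema>dc</schema><element>title</element></dc-type>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println(getElementData(doc.getDocumentElement(), "schema"));
        System.out.println(getElementData(doc.getDocumentElement(), "qualifier"));
    }
}
```

Running it prints the text of the `schema` child on the first line and `null` on the second, matching the null-propagating behaviour the registry importers rely on for optional elements such as `qualifier`.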


@@ -12,6 +12,7 @@ import java.io.IOException;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Arrays;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
@@ -43,33 +44,29 @@ import org.xml.sax.SAXException;
* @author Robert Tansley
* @version $Revision$
*/
public class RegistryLoader {
/**
* log4j category
*/
public class RegistryLoader
{
/** log4j category */
private static Logger log = Logger.getLogger(RegistryLoader.class);
protected static BitstreamFormatService bitstreamFormatService = ContentServiceFactory.getInstance()
.getBitstreamFormatService();
/**
* Default constructor
*/
private RegistryLoader() { }
protected static BitstreamFormatService bitstreamFormatService = ContentServiceFactory.getInstance().getBitstreamFormatService();
/**
* For invoking via the command line
*
* @param argv the command line arguments given
* @param argv
* command-line arguments
* @throws Exception if error
*/
public static void main(String[] argv) throws Exception {
public static void main(String[] argv) throws Exception
{
String usage = "Usage: " + RegistryLoader.class.getName()
+ " (-bitstream | -metadata) registry-file.xml";
Context context = null;
try {
try
{
context = new Context();
// Can't update registries anonymously, so we need to turn off
@@ -77,12 +74,17 @@ public class RegistryLoader {
context.turnOffAuthorisationSystem();
// Work out what we're loading
if (argv[0].equalsIgnoreCase("-bitstream")) {
if (argv[0].equalsIgnoreCase("-bitstream"))
{
RegistryLoader.loadBitstreamFormats(context, argv[1]);
} else if (argv[0].equalsIgnoreCase("-metadata")) {
}
else if (argv[0].equalsIgnoreCase("-metadata"))
{
// Call MetadataImporter, as it handles Metadata schema updates
MetadataImporter.loadRegistry(argv[1], true);
} else {
}
else
{
System.err.println(usage);
}
@@ -90,29 +92,36 @@ public class RegistryLoader {
context.complete();
System.exit(0);
} catch (ArrayIndexOutOfBoundsException ae) {
}
catch (ArrayIndexOutOfBoundsException ae)
{
System.err.println(usage);
System.exit(1);
} catch (Exception e) {
}
catch (Exception e)
{
log.fatal(LogManager.getHeader(context, "error_loading_registries",
""), e);
System.err.println("Error: \n - " + e.getMessage());
System.exit(1);
} finally {
// Clean up our context, if it still exists & it was never completed
if (context != null && context.isValid()) {
context.abort();
}
finally
{
// Clean up our context, if it still exists & it was never completed
if(context!=null && context.isValid())
context.abort();
}
}
/**
* Load Bitstream Format metadata
*
* @param context DSpace context object
* @param filename the filename of the XML file to load
* @param context
* DSpace context object
* @param filename
* the filename of the XML file to load
* @throws SQLException if database error
* @throws IOException if IO error
* @throws TransformerException if transformer error
@@ -122,7 +131,8 @@ public class RegistryLoader {
*/
public static void loadBitstreamFormats(Context context, String filename)
throws SQLException, IOException, ParserConfigurationException,
SAXException, TransformerException, AuthorizeException {
SAXException, TransformerException, AuthorizeException
{
Document document = loadXML(filename);
// Get the nodes corresponding to formats
@@ -130,7 +140,8 @@ public class RegistryLoader {
"dspace-bitstream-types/bitstream-type");
// Add each one as a new format to the registry
for (int i = 0; i < typeNodes.getLength(); i++) {
for (int i = 0; i < typeNodes.getLength(); i++)
{
Node n = typeNodes.item(i);
loadFormat(context, n);
}
@@ -143,8 +154,10 @@ public class RegistryLoader {
* Process a node in the bitstream format registry XML file. The node must
* be a "bitstream-type" node
*
* @param context DSpace context object
* @param node the node in the DOM tree
* @param context
* DSpace context object
* @param node
* the node in the DOM tree
* @throws SQLException if database error
* @throws IOException if IO error
* @throws TransformerException if transformer error
@@ -152,7 +165,8 @@ public class RegistryLoader {
*/
private static void loadFormat(Context context, Node node)
throws SQLException, IOException, TransformerException,
AuthorizeException {
AuthorizeException
{
// Get the values
String mimeType = getElementData(node, "mimetype");
String shortDesc = getElementData(node, "short_description");
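From the XPath `dspace-bitstream-types/bitstream-type` and the two `getElementData` calls visible in this hunk, the bitstream format registry file looks roughly as follows. The PDF values are illustrative, and the real registry file may carry additional child elements that this hunk does not show:

```xml
<dspace-bitstream-types>
  <bitstream-type>
    <mimetype>application/pdf</mimetype>
    <short_description>Adobe PDF</short_description>
  </bitstream-type>
</dspace-bitstream-types>
```

Both `mimetype` and `short_description` must be unique: the code below this point checks for an existing format by MIME type first, then by short description, and only creates a new format when neither lookup matches.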
@@ -170,12 +184,14 @@ public class RegistryLoader {
BitstreamFormat exists = bitstreamFormatService.findByMIMEType(context, mimeType);
// If not found by mimeType, check by short description (since this must also be unique)
if (exists == null) {
if(exists==null)
{
exists = bitstreamFormatService.findByShortDescription(context, shortDesc);
}
// If it doesn't exist, create it..otherwise skip it.
if (exists == null) {
if(exists==null)
{
// Create the format object
BitstreamFormat format = bitstreamFormatService.create(context);
@@ -199,14 +215,16 @@ public class RegistryLoader {
/**
* Load in the XML from file.
*
* @param filename the filename to load from
* @return the DOM representation of the XML file
* @param filename
* the filename to load from
* @throws IOException if IO error
* @throws ParserConfigurationException if config error
* @throws SAXException if parser error
* @return the DOM representation of the XML file
*/
private static Document loadXML(String filename) throws IOException,
ParserConfigurationException, SAXException {
ParserConfigurationException, SAXException
{
DocumentBuilder builder = DocumentBuilderFactory.newInstance()
.newDocumentBuilder();
@@ -225,17 +243,21 @@ public class RegistryLoader {
* </P>
* Why this isn't a core part of the XML API I do not know...
*
* @param parentElement the element, whose child element you want the CDATA from
* @param childName the name of the element you want the CDATA from
* @return the CDATA as a <code>String</code>
* @param parentElement
* the element, whose child element you want the CDATA from
* @param childName
* the name of the element you want the CDATA from
* @throws TransformerException if transformer error
* @return the CDATA as a <code>String</code>
*/
private static String getElementData(Node parentElement, String childName)
throws TransformerException {
throws TransformerException
{
// Grab the child node
Node childNode = XPathAPI.selectSingleNode(parentElement, childName);
if (childNode == null) {
if (childNode == null)
{
// No child node, so no values
return null;
}
@@ -243,7 +265,8 @@ public class RegistryLoader {
// Get the #text
Node dataNode = childNode.getFirstChild();
if (dataNode == null) {
if (dataNode == null)
{
return null;
}
@@ -268,19 +291,23 @@ public class RegistryLoader {
* </P>
* Why this also isn't a core part of the XML API I do not know...
*
* @param parentElement the element, whose child element you want the CDATA from
* @param childName the name of the element you want the CDATA from
* @return the CDATA as a <code>String</code>
* @param parentElement
* the element, whose child element you want the CDATA from
* @param childName
* the name of the element you want the CDATA from
* @throws TransformerException if transformer error
* @return the CDATA as a <code>String</code>
*/
private static String[] getRepeatedElementData(Node parentElement,
String childName) throws TransformerException {
String childName) throws TransformerException
{
// Grab the child node
NodeList childNodes = XPathAPI.selectNodeList(parentElement, childName);
String[] data = new String[childNodes.getLength()];
for (int i = 0; i < childNodes.getLength(); i++) {
for (int i = 0; i < childNodes.getLength(); i++)
{
// Get the #text node
Node dataNode = childNodes.item(i).getFirstChild();


@@ -14,6 +14,7 @@ import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
@@ -21,8 +22,10 @@ import javax.xml.transform.TransformerException;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.DefaultParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.cli.ParseException;
import org.apache.xpath.XPathAPI;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Collection;
@@ -45,6 +48,7 @@ import org.xml.sax.SAXException;
* an XML file.
*
* The XML file structure needs to be:
* <p>
* {@code
* <import_structure>
* <community>
@@ -56,39 +60,37 @@ import org.xml.sax.SAXException;
* </community>
* </import_structure>
* }
* it can be arbitrarily deep, and supports all the metadata elements
* <p>
* It can be arbitrarily deep, and supports all the metadata elements
* that make up the community and collection metadata. See the system
* documentation for more details
* documentation for more details.
*
* @author Richard Jones
*
*/
public class StructBuilder {
/**
* the output xml document which will contain updated information about the
* imported structure
* The output XML document which will contain updated information about the
* imported structure.
*/
private static org.jdom.Document xmlOutput = new org.jdom.Document(new Element("imported_structure"));
private static final org.jdom.Document xmlOutput
= new org.jdom.Document(new Element("imported_structure"));
/**
* a hashtable to hold metadata for the collection being worked on
* A hash table to hold metadata for the collection being worked on.
*/
private static Map<String, String> collectionMap = new HashMap<String, String>();
private static final Map<String, String> collectionMap = new HashMap<>();
/**
* a hashtable to hold metadata for the community being worked on
* A hash table to hold metadata for the community being worked on.
*/
private static Map<String, String> communityMap = new HashMap<String, String>();
private static final Map<String, String> communityMap = new HashMap<>();
protected static CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService();
protected static CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService();
protected static EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService();
/**
* Default constructor
*/
private StructBuilder() { }
/**
* Main method to be run from the command line to import a structure into
* DSpace
@@ -101,19 +103,34 @@ public class StructBuilder {
* with the handle for each imported item added as an attribute.
*
* @param argv the command line arguments given
* @throws Exception if an error occurs
* @throws ParserConfigurationException passed through.
* @throws SQLException passed through.
*/
public static void main(String[] argv)
throws Exception {
CommandLineParser parser = new PosixParser();
throws ParserConfigurationException, SQLException {
CommandLineParser parser = new DefaultParser();
Options options = new Options();
options.addOption("f", "file", true, "file");
options.addOption("h", "help", false, "help");
options.addOption("?", "help");
options.addOption("f", "file", true, "input structure document");
options.addOption("e", "eperson", true, "eperson");
options.addOption("o", "output", true, "output");
options.addOption("o", "output", true, "output structure document");
CommandLine line = parser.parse(options, argv);
CommandLine line = null;
try {
line = parser.parse(options, argv);
} catch (ParseException ex) {
System.err.println(ex.getMessage());
usage(options);
System.exit(1);
}
if (line.hasOption('h') || line.hasOption('?')) {
usage(options);
System.exit(0);
}
String file = null;
String eperson = null;
@@ -132,22 +149,41 @@ public class StructBuilder {
}
if (output == null || eperson == null || file == null) {
usage();
System.exit(0);
usage(options);
System.exit(1);
}
// create a context
Context context = new Context();
// set the context
try {
context.setCurrentUser(ePersonService.findByEmail(context, eperson));
} catch (SQLException ex) {
System.err.format("That user could not be found: %s%n", ex.getMessage());
System.exit(1);
}
// load the XML
Document document = loadXML(file);
Document document = null;
try {
document = loadXML(file);
} catch (IOException ex) {
System.err.format("The input document could not be read: %s%n", ex.getMessage());
System.exit(1);
} catch (SAXException ex) {
System.err.format("The input document could not be parsed: %s%n", ex.getMessage());
System.exit(1);
}
        // run the preliminary validation, to be sure that the XML document
// is properly structured
try {
validate(document);
} catch (TransformerException ex) {
System.err.format("The input document is invalid: %s%n", ex.getMessage());
System.exit(1);
}
// load the mappings into the member variable hashmaps
communityMap.put("name", "name");
@@ -164,71 +200,83 @@ public class StructBuilder {
collectionMap.put("license", "license");
collectionMap.put("provenance", "provenance_description");
Element[] elements = new Element[]{};
try {
// get the top level community list
NodeList first = XPathAPI.selectNodeList(document, "/import_structure/community");
// run the import starting with the top level communities
Element[] elements = handleCommunities(context, first, null);
elements = handleCommunities(context, first, null);
} catch (TransformerException ex) {
System.err.format("Input content not understood: %s%n", ex.getMessage());
System.exit(1);
} catch (AuthorizeException ex) {
System.err.format("Not authorized: %s%n", ex.getMessage());
System.exit(1);
}
// generate the output
Element root = xmlOutput.getRootElement();
for (int i = 0; i < elements.length; i++) {
root.addContent(elements[i]);
for (Element element : elements) {
root.addContent(element);
}
// finally write the string into the output file
try {
BufferedWriter out = new BufferedWriter(new FileWriter(output));
try (BufferedWriter out = new BufferedWriter(new FileWriter(output));) {
out.write(new XMLOutputter().outputString(xmlOutput));
out.close();
} catch (IOException e) {
System.out.println("Unable to write to output file " + output);
System.exit(0);
System.exit(1);
}
context.complete();
}
/**
* Output the usage information
* Output the usage information.
*/
private static void usage() {
System.out.println("Usage: java StructBuilder -f <source XML file> -o <output file> -e <eperson email>");
System.out.println(
"Communities will be created from the top level, and a map of communities to handles will be returned in " +
"the output file");
return;
private static void usage(Options options) {
HelpFormatter helper = new HelpFormatter();
helper.printHelp("java StructBuilder -f <source XML file> -o <output file> -e <eperson email>",
"Load community/collection structure from a file.",
options,
"Communities will be created from the top level,"
+ " and a map of communities to handles will be returned"
+ " in the output file.");
}
/**
* Validate the XML document. This method does not return, but if validation
* fails it generates an error and ceases execution
* Validate the XML document. This method returns if the document is valid.
* If validation fails it generates an error and ceases execution.
*
* @param document the XML document object
* @throws TransformerException if transformer error
*
*/
private static void validate(org.w3c.dom.Document document)
throws TransformerException {
StringBuffer err = new StringBuffer();
StringBuilder err = new StringBuilder();
boolean trip = false;
err.append("The following errors were encountered parsing the source XML\n");
err.append("No changes have been made to the DSpace instance\n\n");
err.append("The following errors were encountered parsing the source XML.\n");
err.append("No changes have been made to the DSpace instance.\n\n");
NodeList first = XPathAPI.selectNodeList(document, "/import_structure/community");
if (first.getLength() == 0) {
err.append("-There are no top level communities in the source document");
err.append("-There are no top level communities in the source document.");
System.out.println(err.toString());
System.exit(0);
}
String errs = validateCommunities(first, 1);
if (errs != null) {
if (errs != null)
{
err.append(errs);
trip = true;
}
if (trip) {
if (trip)
{
System.out.println(err.toString());
System.exit(0);
}
@@ -236,27 +284,30 @@ public class StructBuilder {
/**
* Validate the communities section of the XML document. This returns a string
* containing any errors encountered, or null if there were no errors
* containing any errors encountered, or null if there were no errors.
*
* @param communities the NodeList of communities to validate
* @param level the level in the XML document that we are at, for the purposes
* of error reporting
*
* @return the errors that need to be generated by the calling method, or null if
* no errors.
*/
private static String validateCommunities(NodeList communities, int level)
throws TransformerException {
StringBuffer err = new StringBuffer();
StringBuilder err = new StringBuilder();
boolean trip = false;
String errs = null;
for (int i = 0; i < communities.getLength(); i++) {
for (int i = 0; i < communities.getLength(); i++)
{
Node n = communities.item(i);
NodeList name = XPathAPI.selectNodeList(n, "name");
if (name.getLength() != 1) {
String pos = Integer.toString(i + 1);
err.append("-The level " + level + " community in position " + pos);
err.append(" does not contain exactly one name field\n");
err.append("-The level ").append(level)
.append(" community in position ").append(pos)
.append(" does not contain exactly one name field.\n");
trip = true;
}
@@ -277,7 +328,8 @@ public class StructBuilder {
}
}
if (trip) {
if (trip)
{
errs = err.toString();
}
@@ -286,30 +338,34 @@ public class StructBuilder {
/**
* validate the collection section of the XML document. This generates a
* string containing any errors encountered, or returns null if no errors
* string containing any errors encountered, or returns null if no errors.
*
* @param collections a NodeList of collections to validate
* @param level the level in the XML document for the purposes of error reporting
*
* @return the errors to be generated by the calling method, or null if none
*/
private static String validateCollections(NodeList collections, int level)
throws TransformerException {
StringBuffer err = new StringBuffer();
StringBuilder err = new StringBuilder();
boolean trip = false;
String errs = null;
for (int i = 0; i < collections.getLength(); i++) {
for (int i = 0; i < collections.getLength(); i++)
{
Node n = collections.item(i);
NodeList name = XPathAPI.selectNodeList(n, "name");
if (name.getLength() != 1) {
String pos = Integer.toString(i + 1);
err.append("-The level " + level + " collection in position " + pos);
err.append(" does not contain exactly one name field\n");
err.append("-The level ").append(level)
.append(" collection in position ").append(pos)
.append(" does not contain exactly one name field.\n");
trip = true;
}
}
if (trip) {
if (trip)
{
errs = err.toString();
}
@@ -319,11 +375,14 @@ public class StructBuilder {
/**
* Load in the XML from file.
*
* @param filename the filename to load from
* @param filename
* the filename to load from
*
* @return the DOM representation of the XML file
*/
private static org.w3c.dom.Document loadXML(String filename)
throws IOException, ParserConfigurationException, SAXException {
throws IOException, ParserConfigurationException, SAXException
{
DocumentBuilder builder = DocumentBuilderFactory.newInstance()
.newDocumentBuilder();
@@ -336,15 +395,19 @@ public class StructBuilder {
* Return the String value of a Node
*
* @param node the node from which we want to extract the string value
*
* @return the string value of the node
*/
public static String getStringValue(Node node) {
public static String getStringValue(Node node)
{
String value = node.getNodeValue();
if (node.hasChildNodes()) {
if (node.hasChildNodes())
{
Node first = node.getFirstChild();
if (first.getNodeType() == Node.TEXT_NODE) {
if (first.getNodeType() == Node.TEXT_NODE)
{
return first.getNodeValue().trim();
}
}
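For an element node, `getNodeValue()` returns null; the text lives in the first `TEXT_NODE` child, which is why the method above checks the child's node type and trims it. A self-contained demo of that pattern (the sample XML string is made up for illustration):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Node;

// Demonstrates the getStringValue() pattern: fall back to getNodeValue(),
// but prefer the trimmed text of the first child when it is a text node.
public class NodeText {
    public static String getStringValue(Node node) {
        String value = node.getNodeValue();   // null for element nodes
        if (node.hasChildNodes()) {
            Node first = node.getFirstChild();
            if (first.getNodeType() == Node.TEXT_NODE) {
                return first.getNodeValue().trim();
            }
        }
        return value;
    }

    public static String demo() {
        try {
            Node name = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                    "<name>  My Community </name>".getBytes(StandardCharsets.UTF_8)))
                .getDocumentElement();
            return getStringValue(name);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());   // prints the trimmed element text
    }
}
```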
@@ -359,21 +422,26 @@ public class StructBuilder {
* @param context the context of the request
* @param communities a nodelist of communities to create along with their sub-structures
* @param parent the parent community of the nodelist of communities to create
*
* @return an element array containing additional information regarding the
* created communities (e.g. the handles they have been assigned)
*/
private static Element[] handleCommunities(Context context, NodeList communities, Community parent)
throws TransformerException, SQLException, Exception {
throws TransformerException, SQLException, AuthorizeException {
Element[] elements = new Element[communities.getLength()];
for (int i = 0; i < communities.getLength(); i++) {
for (int i = 0; i < communities.getLength(); i++)
{
Community community;
Element element = new Element("community");
// create the community or sub community
if (parent != null) {
if (parent != null)
{
community = communityService.create(parent, context);
} else {
}
else
{
community = communityService.create(null, context);
}
@@ -382,20 +450,20 @@ public class StructBuilder {
// now update the metadata
Node tn = communities.item(i);
for (Map.Entry<String, String> entry : communityMap.entrySet()) {
for (Map.Entry<String, String> entry : communityMap.entrySet())
{
NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey());
if (nl.getLength() == 1) {
if (nl.getLength() == 1)
{
communityService.setMetadata(context, community, entry.getValue(), getStringValue(nl.item(0)));
}
}
// FIXME: at the moment, if the community already exists by name
// then this will throw a PSQLException on a duplicate key
// violation
// Ideally we'd skip this row and continue to create sub
// communities
// and so forth where they don't exist, but it's proving
// difficult
// then this will throw an SQLException on a duplicate key
// violation.
// Ideally we'd skip this row and continue to create sub communities
// and so forth where they don't exist, but it's proving difficult
// to isolate the community that already exists without hitting
// the database directly.
communityService.update(context, community);
@@ -414,25 +482,29 @@ public class StructBuilder {
nameElement.setText(communityService.getMetadata(community, "name"));
element.addContent(nameElement);
if (communityService.getMetadata(community, "short_description") != null) {
if (communityService.getMetadata(community, "short_description") != null)
{
Element descriptionElement = new Element("description");
descriptionElement.setText(communityService.getMetadata(community, "short_description"));
element.addContent(descriptionElement);
}
if (communityService.getMetadata(community, "introductory_text") != null) {
if (communityService.getMetadata(community, "introductory_text") != null)
{
Element introElement = new Element("intro");
introElement.setText(communityService.getMetadata(community, "introductory_text"));
element.addContent(introElement);
}
if (communityService.getMetadata(community, "copyright_text") != null) {
if (communityService.getMetadata(community, "copyright_text") != null)
{
Element copyrightElement = new Element("copyright");
copyrightElement.setText(communityService.getMetadata(community, "copyright_text"));
element.addContent(copyrightElement);
}
if (communityService.getMetadata(community, "side_bar_text") != null) {
if (communityService.getMetadata(community, "side_bar_text") != null)
{
Element sidebarElement = new Element("sidebar");
sidebarElement.setText(communityService.getMetadata(community, "side_bar_text"));
element.addContent(sidebarElement);
@@ -447,10 +519,12 @@ public class StructBuilder {
Element[] collectionElements = handleCollections(context, collections, community);
int j;
for (j = 0; j < subCommunityElements.length; j++) {
for (j = 0; j < subCommunityElements.length; j++)
{
element.addContent(subCommunityElements[j]);
}
for (j = 0; j < collectionElements.length; j++) {
for (j = 0; j < collectionElements.length; j++)
{
element.addContent(collectionElements[j]);
}
@@ -466,14 +540,16 @@ public class StructBuilder {
* @param context the context of the request
* @param collections the node list of collections to be created
* @param parent the parent community to whom the collections belong
*
* @return an Element array containing additional information about the
* created collections (e.g. the handle)
*/
private static Element[] handleCollections(Context context, NodeList collections, Community parent)
throws TransformerException, SQLException, AuthorizeException, IOException, Exception {
throws TransformerException, SQLException, AuthorizeException {
Element[] elements = new Element[collections.getLength()];
for (int i = 0; i < collections.getLength(); i++) {
for (int i = 0; i < collections.getLength(); i++)
{
Element element = new Element("collection");
Collection collection = collectionService.create(context, parent);
@@ -482,9 +558,11 @@ public class StructBuilder {
// import the rest of the metadata
Node tn = collections.item(i);
for (Map.Entry<String, String> entry : collectionMap.entrySet()) {
for (Map.Entry<String, String> entry : collectionMap.entrySet())
{
NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey());
if (nl.getLength() == 1) {
if (nl.getLength() == 1)
{
collectionService.setMetadata(context, collection, entry.getValue(), getStringValue(nl.item(0)));
}
}
@@ -497,37 +575,43 @@ public class StructBuilder {
nameElement.setText(collectionService.getMetadata(collection, "name"));
element.addContent(nameElement);
if (collectionService.getMetadata(collection, "short_description") != null) {
if (collectionService.getMetadata(collection, "short_description") != null)
{
Element descriptionElement = new Element("description");
descriptionElement.setText(collectionService.getMetadata(collection, "short_description"));
element.addContent(descriptionElement);
}
if (collectionService.getMetadata(collection, "introductory_text") != null) {
if (collectionService.getMetadata(collection, "introductory_text") != null)
{
Element introElement = new Element("intro");
introElement.setText(collectionService.getMetadata(collection, "introductory_text"));
element.addContent(introElement);
}
if (collectionService.getMetadata(collection, "copyright_text") != null) {
if (collectionService.getMetadata(collection, "copyright_text") != null)
{
Element copyrightElement = new Element("copyright");
copyrightElement.setText(collectionService.getMetadata(collection, "copyright_text"));
element.addContent(copyrightElement);
}
if (collectionService.getMetadata(collection, "side_bar_text") != null) {
if (collectionService.getMetadata(collection, "side_bar_text") != null)
{
Element sidebarElement = new Element("sidebar");
sidebarElement.setText(collectionService.getMetadata(collection, "side_bar_text"));
element.addContent(sidebarElement);
}
if (collectionService.getMetadata(collection, "license") != null) {
if (collectionService.getMetadata(collection, "license") != null)
{
Element sidebarElement = new Element("license");
sidebarElement.setText(collectionService.getMetadata(collection, "license"));
element.addContent(sidebarElement);
}
if (collectionService.getMetadata(collection, "provenance_description") != null) {
if (collectionService.getMetadata(collection, "provenance_description") != null)
{
Element sidebarElement = new Element("provenance");
sidebarElement.setText(collectionService.getMetadata(collection, "provenance_description"));
element.addContent(sidebarElement);


@@ -7,93 +7,67 @@
*/
package org.dspace.app.bulkedit;
import org.dspace.content.Item;
import org.dspace.content.Collection;
import java.util.ArrayList;
import java.util.List;
import org.dspace.content.Collection;
import org.dspace.content.Item;
/**
* Utility class to store changes to item that may occur during a batch edit.
*
* @author Stuart Lewis
*/
public class BulkEditChange {
/**
* The item these changes relate to
*/
public class BulkEditChange
{
/** The item these changes relate to */
private Item item;
/**
* The List of hashtables with the new elements
*/
/** The List of hashtables with the new elements */
private List<BulkEditMetadataValue> adds;
/**
* The List of hashtables with the removed elements
*/
/** The List of hashtables with the removed elements */
private List<BulkEditMetadataValue> removes;
/**
* The List of hashtables with the unchanged elements
*/
/** The List of hashtables with the unchanged elements */
private List<BulkEditMetadataValue> constant;
/**
* The List of the complete set of new values (constant + adds)
*/
/** The List of the complete set of new values (constant + adds) */
private List<BulkEditMetadataValue> complete;
/**
* The list of old collections the item used to be mapped to
*/
/** The list of old collections the item used to be mapped to */
private List<Collection> oldMappedCollections;
/**
* The list of new collections the item has been mapped into
*/
/** The list of new collections the item has been mapped into */
private List<Collection> newMappedCollections;
/**
* The old owning collection
*/
/** The old owning collection */
private Collection oldOwningCollection;
/**
* The new owning collection
*/
/** The new owning collection */
private Collection newOwningCollection;
/**
* Is this a new item
*/
/** Is this a new item */
private boolean newItem;
/**
* Has this item been deleted?
*/
/** Has this item been deleted? */
private boolean deleted;
/**
* Has this item been withdrawn?
*/
/** Has this item been withdrawn? */
private boolean withdrawn;
/**
* Has this item been reinstated?
*/
/** Has this item been reinstated? */
private boolean reinstated;
/**
* Have any changes actually been made?
*/
/** Have any changes actually been made? */
private boolean empty;
/**
* Initialise a change holder for a new item
*/
public BulkEditChange() {
public BulkEditChange()
{
// Set the item to be null
item = null;
newItem = true;
@@ -115,7 +89,8 @@ public class BulkEditChange {
*
* @param i The Item to store
*/
public BulkEditChange(Item i) {
public BulkEditChange(Item i)
{
// Store the item
item = i;
newItem = false;
@@ -135,7 +110,8 @@ public class BulkEditChange {
*
* @param i The item
*/
public void setItem(Item i) {
public void setItem(Item i)
{
// Store the item
item = i;
}
@@ -145,7 +121,8 @@ public class BulkEditChange {
*
* @param dcv The value to add
*/
public void registerAdd(BulkEditMetadataValue dcv) {
public void registerAdd(BulkEditMetadataValue dcv)
{
// Add the added value
adds.add(dcv);
complete.add(dcv);
@@ -157,7 +134,8 @@ public class BulkEditChange {
*
* @param dcv The value to remove
*/
public void registerRemove(BulkEditMetadataValue dcv) {
public void registerRemove(BulkEditMetadataValue dcv)
{
// Add the removed value
removes.add(dcv);
empty = false;
@@ -168,7 +146,8 @@ public class BulkEditChange {
*
* @param dcv The value to keep unchanged
*/
public void registerConstant(BulkEditMetadataValue dcv) {
public void registerConstant(BulkEditMetadataValue dcv)
{
// Add the unchanged value
constant.add(dcv);
complete.add(dcv);
@@ -179,7 +158,8 @@ public class BulkEditChange {
*
* @param c The new mapped Collection
*/
public void registerNewMappedCollection(Collection c) {
public void registerNewMappedCollection(Collection c)
{
// Add the new owning Collection
newMappedCollections.add(c);
empty = false;
@@ -190,22 +170,27 @@ public class BulkEditChange {
*
* @param c The old mapped Collection
*/
public void registerOldMappedCollection(Collection c) {
public void registerOldMappedCollection(Collection c)
{
// Add the old mapped Collection (if it isn't already listed and isn't the old owning collection)
boolean found = false;
if ((this.getOldOwningCollection() != null) &&
(this.getOldOwningCollection().getHandle().equals(c.getHandle()))) {
(this.getOldOwningCollection().getHandle().equals(c.getHandle())))
{
found = true;
}
for (Collection collection : oldMappedCollections) {
if (collection.getHandle().equals(c.getHandle())) {
for (Collection collection : oldMappedCollections)
{
if (collection.getHandle().equals(c.getHandle()))
{
found = true;
}
}
if (!found) {
if (!found)
{
oldMappedCollections.add(c);
empty = false;
}
@@ -217,7 +202,8 @@ public class BulkEditChange {
* @param oldC The old owning collection
* @param newC The new owning collection
*/
public void changeOwningCollection(Collection oldC, Collection newC) {
public void changeOwningCollection(Collection oldC, Collection newC)
{
// Store the old owning collection
oldOwningCollection = oldC;
@@ -231,7 +217,8 @@ public class BulkEditChange {
*
* @param newC The new owning collection
*/
public void setOwningCollection(Collection newC) {
public void setOwningCollection(Collection newC)
{
// Store the new owning collection
newOwningCollection = newC;
//empty = false;
@@ -242,7 +229,8 @@ public class BulkEditChange {
*
* @return The item
*/
public Item getItem() {
public Item getItem()
{
// Return the item
return item;
}
@@ -252,7 +240,8 @@ public class BulkEditChange {
*
* @return the list of elements and their values that have been added.
*/
public List<BulkEditMetadataValue> getAdds() {
public List<BulkEditMetadataValue> getAdds()
{
// Return the array
return adds;
}
@@ -262,7 +251,8 @@ public class BulkEditChange {
*
* @return the list of elements and their values that have been removed.
*/
public List<BulkEditMetadataValue> getRemoves() {
public List<BulkEditMetadataValue> getRemoves()
{
// Return the array
return removes;
}
@@ -272,7 +262,8 @@ public class BulkEditChange {
*
* @return the list of unchanged values
*/
public List<BulkEditMetadataValue> getConstant() {
public List<BulkEditMetadataValue> getConstant()
{
// Return the array
return constant;
}
@@ -282,7 +273,8 @@ public class BulkEditChange {
*
* @return the list of all values
*/
public List<BulkEditMetadataValue> getComplete() {
public List<BulkEditMetadataValue> getComplete()
{
// Return the array
return complete;
}
@@ -292,7 +284,8 @@ public class BulkEditChange {
*
* @return the list of new mapped collections
*/
public List<Collection> getNewMappedCollections() {
public List<Collection> getNewMappedCollections()
{
// Return the array
return newMappedCollections;
}
@@ -302,7 +295,8 @@ public class BulkEditChange {
*
* @return the list of old mapped collections
*/
public List<Collection> getOldMappedCollections() {
public List<Collection> getOldMappedCollections()
{
// Return the array
return oldMappedCollections;
}
@@ -312,7 +306,8 @@ public class BulkEditChange {
*
* @return the old owning collection
*/
public Collection getOldOwningCollection() {
public Collection getOldOwningCollection()
{
// Return the old owning collection
return oldOwningCollection;
}
@@ -322,7 +317,8 @@ public class BulkEditChange {
*
* @return the new owning collection
*/
public Collection getNewOwningCollection() {
public Collection getNewOwningCollection()
{
// Return the new owning collection
return newOwningCollection;
}
@@ -332,7 +328,8 @@ public class BulkEditChange {
*
* @return Whether or not this is for a new item
*/
public boolean isNewItem() {
public boolean isNewItem()
{
// Return the new item status
return newItem;
}
@@ -342,7 +339,8 @@ public class BulkEditChange {
*
* @return Whether or not this is for a deleted item
*/
public boolean isDeleted() {
public boolean isDeleted()
{
// Return the deleted status
return deleted;
}
@@ -361,7 +359,8 @@ public class BulkEditChange {
*
* @return Whether or not this is for a withdrawn item
*/
public boolean isWithdrawn() {
public boolean isWithdrawn()
{
// Return the withdrawn status
return withdrawn;
}
@@ -380,7 +379,8 @@ public class BulkEditChange {
*
* @return Whether or not this is for a reinstated item
*/
public boolean isReinstated() {
public boolean isReinstated()
{
// Return the reinstated status
return reinstated;
}
@@ -399,7 +399,8 @@ public class BulkEditChange {
*
* @return Whether or not changes have been made
*/
public boolean hasChanges() {
public boolean hasChanges()
{
return !empty;
}
}


@@ -7,26 +7,7 @@
*/
package org.dspace.app.bulkedit;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.commons.lang3.StringUtils;
import org.dspace.authority.AuthorityValue;
import org.dspace.authority.factory.AuthorityServiceFactory;
import org.dspace.authority.service.AuthorityValueService;
@@ -35,14 +16,19 @@ import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.authority.Choices;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.content.service.MetadataFieldService;
import org.dspace.content.service.MetadataSchemaService;
import org.dspace.content.authority.Choices;
import org.dspace.core.Context;
import org.dspace.services.factory.DSpaceServicesFactory;
import java.util.*;
import java.util.regex.Pattern;
import java.util.regex.Matcher;
import java.io.*;
/**
* Utility class to read and write CSV files
*
@@ -57,69 +43,45 @@ import org.dspace.services.factory.DSpaceServicesFactory;
*
* @author Stuart Lewis
*/
public class DSpaceCSV implements Serializable {
/**
* The headings of the CSV file
*/
public class DSpaceCSV implements Serializable
{
/** The headings of the CSV file */
protected List<String> headings;
/**
* An array list of CSV lines
*/
/** An array list of CSV lines */
protected List<DSpaceCSVLine> lines;
/**
* A counter of how many CSV lines this object holds
*/
/** A counter of how many CSV lines this object holds */
protected int counter;
/**
* The value separator (defaults to double pipe '||')
*/
/** The value separator (defaults to double pipe '||') */
protected String valueSeparator;
/**
* The value separator in an escaped form for using in regexes
*/
/** The value separator in an escaped form for using in regexes */
protected String escapedValueSeparator;
/**
* The field separator (defaults to comma)
*/
/** The field separator (defaults to comma) */
protected String fieldSeparator;
/**
* The field separator in an escaped form for using in regexes
*/
/** The field separator in an escaped form for using in regexes */
protected String escapedFieldSeparator;
/**
* The authority separator (defaults to double colon '::')
*/
/** The authority separator (defaults to double colon '::') */
protected String authoritySeparator;
/**
* The authority separator in an escaped form for using in regexes
*/
/** The authority separator in an escaped form for using in regexes */
protected String escapedAuthoritySeparator;
protected transient final ItemService itemService = ContentServiceFactory.getInstance().getItemService();
protected transient final MetadataSchemaService metadataSchemaService =
ContentServiceFactory.getInstance().getMetadataSchemaService();
protected transient final MetadataFieldService metadataFieldService =
ContentServiceFactory.getInstance().getMetadataFieldService();
protected transient final AuthorityValueService authorityValueService =
AuthorityServiceFactory.getInstance().getAuthorityValueService();
protected transient final MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance().getMetadataSchemaService();
protected transient final MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService();
protected transient final AuthorityValueService authorityValueService = AuthorityServiceFactory.getInstance().getAuthorityValueService();
/**
* Whether to export all metadata such as handles and provenance information
*/
/** Whether to export all metadata such as handles and provenance information */
protected boolean exportAll;
/**
* A list of metadata elements to ignore
*/
/** A list of metadata elements to ignore */
protected Map<String, String> ignore;
@@ -128,7 +90,8 @@ public class DSpaceCSV implements Serializable {
*
* @param exportAll Whether to export all metadata such as handles and provenance information
*/
public DSpaceCSV(boolean exportAll) {
public DSpaceCSV(boolean exportAll)
{
// Initialise the class
init();
@@ -141,37 +104,48 @@ public class DSpaceCSV implements Serializable {
*
* @param f The file to read from
* @param c The DSpace Context
*
* @throws Exception thrown if there is an error reading or processing the file
*/
public DSpaceCSV(File f, Context c) throws Exception {
public DSpaceCSV(File f, Context c) throws Exception
{
// Initialise the class
init();
// Open the CSV file
BufferedReader input = null;
try {
try
{
input = new BufferedReader(new InputStreamReader(new FileInputStream(f),"UTF-8"));
// Read the heading line
String head = input.readLine();
String[] headingElements = head.split(escapedFieldSeparator);
int columnCounter = 0;
for (String element : headingElements) {
for (String element : headingElements)
{
columnCounter++;
// Remove surrounding quotes if there are any
if ((element.startsWith("\"")) && (element.endsWith("\""))) {
if ((element.startsWith("\"")) && (element.endsWith("\"")))
{
element = element.substring(1, element.length() - 1);
}
// Store the heading
if ("collection".equals(element)) {
if ("collection".equals(element))
{
// Store the heading
headings.add(element);
} else if ("action".equals(element)) { // Store the action
}
// Store the action
else if ("action".equals(element))
{
// Store the heading
headings.add(element);
} else if (!"id".equals(element)) {
}
else if (!"id".equals(element))
{
String authorityPrefix = "";
AuthorityValue authorityValueType = authorityValueService.getAuthorityValueType(element);
if (authorityValueType != null) {
@@ -206,8 +180,7 @@ public class DSpaceCSV implements Serializable {
}
// Check that the metadata element exists in the schema
MetadataField foundField = metadataFieldService
.findByElement(c, foundSchema, metadataElement, metadataQualifier);
MetadataField foundField = metadataFieldService.findByElement(c, foundSchema, metadataElement, metadataQualifier);
if (foundField == null) {
throw new MetadataImportInvalidHeadingException(clean[0],
MetadataImportInvalidHeadingException.ELEMENT,
@@ -223,7 +196,8 @@ public class DSpaceCSV implements Serializable {
StringBuilder lineBuilder = new StringBuilder();
String lineRead;
while ((lineRead = input.readLine()) != null) {
while ((lineRead = input.readLine()) != null)
{
if (lineBuilder.length() > 0) {
// Already have a previously read value - add this line
lineBuilder.append("\n").append(lineRead);
@@ -262,8 +236,11 @@ public class DSpaceCSV implements Serializable {
addItem(lineRead);
}
}
} finally {
if (input != null) {
}
finally
{
if (input != null)
{
input.close();
}
}
@@ -272,7 +249,8 @@ public class DSpaceCSV implements Serializable {
/**
* Initialise this class with values from dspace.cfg
*/
protected void init() {
protected void init()
{
// Set the value separator
setValueSeparator();
@@ -295,16 +273,13 @@ public class DSpaceCSV implements Serializable {
ignore = new HashMap<>();
// Specify default values
String[] defaultValues =
new String[] {
"dc.date.accessioned, dc.date.available, dc.date.updated, dc.description.provenance"
};
String[] toIgnoreArray =
DSpaceServicesFactory.getInstance()
.getConfigurationService()
.getArrayProperty("bulkedit.ignore-on-export", defaultValues);
for (String toIgnoreString : toIgnoreArray) {
if (!"".equals(toIgnoreString.trim())) {
String[] defaultValues = new String[]{"dc.date.accessioned", "dc.date.available",
"dc.date.updated", "dc.description.provenance"};
String[] toIgnoreArray = DSpaceServicesFactory.getInstance().getConfigurationService().getArrayProperty("bulkedit.ignore-on-export", defaultValues);
for (String toIgnoreString : toIgnoreArray)
{
if (!"".equals(toIgnoreString.trim()))
{
ignore.put(toIgnoreString.trim(), toIgnoreString.trim());
}
}
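The ignore list above is built by splitting a comma-separated property, trimming each entry, and keeping non-empty entries as map keys so `okToExport()` can do constant-time lookups. A sketch of that setup, with the split done by hand (in the real code `getArrayProperty` performs the comma split):

```java
import java.util.HashMap;
import java.util.Map;

// Builds the "ignore on export" lookup from a comma-separated config value,
// mirroring the init() logic: trim each entry, skip empties, key == value.
public class IgnoreList {
    public static Map<String, String> build(String configured) {
        Map<String, String> ignore = new HashMap<>();
        for (String s : configured.split(",")) {
            if (!s.trim().isEmpty()) {
                ignore.put(s.trim(), s.trim());
            }
        }
        return ignore;
    }

    public static void main(String[] args) {
        Map<String, String> ignore = build(
            "dc.date.accessioned, dc.date.available, dc.date.updated, dc.description.provenance");
        System.out.println(ignore.containsKey("dc.date.available"));
    }
}
```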
@@ -332,13 +307,16 @@ public class DSpaceCSV implements Serializable {
*
* If not set, defaults to double pipe '||'
*/
private void setValueSeparator() {
private void setValueSeparator()
{
// Get the value separator
valueSeparator = DSpaceServicesFactory.getInstance().getConfigurationService()
.getProperty("bulkedit.valueseparator");
if ((valueSeparator != null) && (!"".equals(valueSeparator.trim()))) {
valueSeparator = DSpaceServicesFactory.getInstance().getConfigurationService().getProperty("bulkedit.valueseparator");
if ((valueSeparator != null) && (!"".equals(valueSeparator.trim())))
{
valueSeparator = valueSeparator.trim();
} else {
}
else
{
valueSeparator = "||";
}
@@ -358,22 +336,32 @@ public class DSpaceCSV implements Serializable {
* Special values are 'tab', 'hash' and 'semicolon' which will
* get substituted from the text to the value.
*/
private void setFieldSeparator() {
private void setFieldSeparator()
{
// Get the value separator
fieldSeparator = DSpaceServicesFactory.getInstance().getConfigurationService()
.getProperty("bulkedit.fieldseparator");
if ((fieldSeparator != null) && (!"".equals(fieldSeparator.trim()))) {
fieldSeparator = DSpaceServicesFactory.getInstance().getConfigurationService().getProperty("bulkedit.fieldseparator");
if ((fieldSeparator != null) && (!"".equals(fieldSeparator.trim())))
{
fieldSeparator = fieldSeparator.trim();
if ("tab".equals(fieldSeparator)) {
if ("tab".equals(fieldSeparator))
{
fieldSeparator = "\t";
} else if ("semicolon".equals(fieldSeparator)) {
}
else if ("semicolon".equals(fieldSeparator))
{
fieldSeparator = ";";
} else if ("hash".equals(fieldSeparator)) {
}
else if ("hash".equals(fieldSeparator))
{
fieldSeparator = "#";
} else {
}
else
{
fieldSeparator = fieldSeparator.trim();
}
} else {
}
else
{
fieldSeparator = ",";
}
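The separator lookup above accepts the symbolic names `tab`, `semicolon` and `hash` for characters that are awkward to put in `dspace.cfg`, uses any other non-blank value verbatim, and falls back to a comma. A condensed sketch of that decision table:

```java
// Sketch of setFieldSeparator(): symbolic names are substituted, anything
// else is used as-is after trimming, and an unset/blank property means ",".
public class Separators {
    public static String fieldSeparator(String configured) {
        if (configured == null || configured.trim().isEmpty()) {
            return ",";                      // default when not configured
        }
        switch (configured.trim()) {
            case "tab":       return "\t";
            case "semicolon": return ";";
            case "hash":      return "#";
            default:          return configured.trim();
        }
    }

    public static void main(String[] args) {
        System.out.println(fieldSeparator("tab").equals("\t"));
    }
}
```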
@@ -390,13 +378,16 @@ public class DSpaceCSV implements Serializable {
*
* If not set, defaults to double colon '::'
*/
private void setAuthoritySeparator() {
private void setAuthoritySeparator()
{
// Get the value separator
authoritySeparator = DSpaceServicesFactory.getInstance().getConfigurationService()
.getProperty("bulkedit.authorityseparator");
if ((authoritySeparator != null) && (!"".equals(authoritySeparator.trim()))) {
authoritySeparator = DSpaceServicesFactory.getInstance().getConfigurationService().getProperty("bulkedit.authorityseparator");
if ((authoritySeparator != null) && (!"".equals(authoritySeparator.trim())))
{
authoritySeparator = authoritySeparator.trim();
} else {
}
else
{
authoritySeparator = "::";
}
@@ -410,9 +401,11 @@ public class DSpaceCSV implements Serializable {
* Add a DSpace item to the CSV file
*
* @param i The DSpace item
*
* @throws Exception if something goes wrong with adding the Item
*/
public final void addItem(Item i) throws Exception {
public final void addItem(Item i) throws Exception
{
// If the item does not have an "owningCollection" then the below "getHandle()" call will fail
// This should not happen but is here for safety.
if (i.getOwningCollection() == null) {
@@ -428,42 +421,49 @@ public class DSpaceCSV implements Serializable {
// Add in any mapped collections
List<Collection> collections = i.getCollections();
for (Collection c : collections) {
for (Collection c : collections)
{
// Only add if it is not the owning collection
if (!c.getHandle().equals(owningCollectionHandle)) {
if (!c.getHandle().equals(owningCollectionHandle))
{
line.add("collection", c.getHandle());
}
}
// Populate it
List<MetadataValue> md = itemService.getMetadata(i, Item.ANY, Item.ANY, Item.ANY, Item.ANY);
for (MetadataValue value : md) {
for (MetadataValue value : md)
{
MetadataField metadataField = value.getMetadataField();
MetadataSchema metadataSchema = metadataField.getMetadataSchema();
// Get the key (schema.element)
String key = metadataSchema.getName() + "." + metadataField.getElement();
// Add the qualifier if there is one (schema.element.qualifier)
if (metadataField.getQualifier() != null) {
if (metadataField.getQualifier() != null)
{
key = key + "." + metadataField.getQualifier();
}
// Add the language if there is one (schema.element.qualifier[language])
//if ((value.language != null) && (!"".equals(value.language)))
if (value.getLanguage() != null) {
if (value.getLanguage() != null)
{
key = key + "[" + value.getLanguage() + "]";
}
// Store the item
if (exportAll || okToExport(metadataField)) {
if (exportAll || okToExport(metadataField))
{
// Add authority and confidence if authority is not null
String mdValue = value.getValue();
if (value.getAuthority() != null && !"".equals(value.getAuthority())) {
mdValue += authoritySeparator + value.getAuthority() + authoritySeparator + (value
.getConfidence() != -1 ? value.getConfidence() : Choices.CF_ACCEPTED);
if (value.getAuthority() != null && !"".equals(value.getAuthority()))
{
mdValue += authoritySeparator + value.getAuthority() + authoritySeparator + (value.getConfidence() != -1 ? value.getConfidence() : Choices.CF_ACCEPTED);
}
line.add(key, mdValue);
if (!headings.contains(key)) {
if (!headings.contains(key))
{
headings.add(key);
}
}
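When a metadata value carries an authority key, the export above writes it as `value::authority::confidence`, substituting an accepted-confidence constant when the stored confidence is unset (-1). A hypothetical illustration; the separator default and the `CF_ACCEPTED` value of 600 are assumptions based on DSpace's defaults, and the sample value/authority strings are invented:

```java
// Encodes a metadata value with its authority key and confidence the way
// DSpaceCSV.addItem() does: value<sep>authority<sep>confidence, where an
// unset confidence (-1) is replaced by the accepted-confidence constant.
public class AuthorityEncode {
    static final String SEP = "::";        // default bulkedit.authorityseparator
    static final int CF_ACCEPTED = 600;    // assumption: Choices.CF_ACCEPTED

    public static String encode(String value, String authority, int confidence) {
        if (authority == null || authority.isEmpty()) {
            return value;                  // plain values are written as-is
        }
        return value + SEP + authority + SEP
            + (confidence != -1 ? confidence : CF_ACCEPTED);
    }

    public static void main(String[] args) {
        System.out.println(encode("Lewis, Stuart", "orcid:0000-0001", -1));
    }
}
```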
@@ -478,10 +478,12 @@ public class DSpaceCSV implements Serializable {
* @param line The line of elements
* @throws Exception Thrown if an error occurs when adding the item
*/
public final void addItem(String line) throws Exception {
public final void addItem(String line) throws Exception
{
// Check to see if the last character is a field separator, which hides the last empty column
boolean last = false;
if (line.endsWith(fieldSeparator)) {
if (line.endsWith(fieldSeparator))
{
// Add a space to the end, then remove it later
last = true;
line += " ";
@@ -494,12 +496,15 @@ public class DSpaceCSV implements Serializable {
// Merge parts with embedded separators
boolean alldone = false;
while (!alldone) {
while (!alldone)
{
boolean found = false;
int i = 0;
for (String part : bits) {
for (String part : bits)
{
int bitcounter = part.length() - part.replaceAll("\"", "").length();
if ((part.startsWith("\"")) && ((!part.endsWith("\"")) || ((bitcounter & 1) == 1))) {
if ((part.startsWith("\"")) && ((!part.endsWith("\"")) || ((bitcounter & 1) == 1)))
{
found = true;
String add = bits.get(i) + fieldSeparator + bits.get(i + 1);
bits.remove(i);
@@ -514,8 +519,10 @@ public class DSpaceCSV implements Serializable {
// Deal with quotes around the elements
int i = 0;
for (String part : bits) {
if ((part.startsWith("\"")) && (part.endsWith("\""))) {
for (String part : bits)
{
if ((part.startsWith("\"")) && (part.endsWith("\"")))
{
part = part.substring(1, part.length() - 1);
bits.set(i, part);
}
@@ -524,8 +531,10 @@ public class DSpaceCSV implements Serializable {
// Remove embedded quotes
i = 0;
for (String part : bits) {
if (part.contains("\"\"")) {
for (String part : bits)
{
if (part.contains("\"\""))
{
part = part.replaceAll("\"\"", "\"");
bits.set(i, part);
}
@@ -537,12 +546,18 @@ public class DSpaceCSV implements Serializable {
DSpaceCSVLine csvLine;
// Is this an existing item, or a new item (where id = '+')
if ("+".equals(id)) {
if ("+".equals(id))
{
csvLine = new DSpaceCSVLine();
} else {
try {
}
else
{
try
{
csvLine = new DSpaceCSVLine(UUID.fromString(id));
} catch (NumberFormatException nfe) {
}
catch (NumberFormatException nfe)
{
System.err.println("Invalid item identifier: " + id);
System.err.println("Please check your CSV file for information. " +
"Item id must be numeric, or a '+' to add a new item");
@@ -552,10 +567,13 @@ public class DSpaceCSV implements Serializable {
// Add the rest of the parts
i = 0;
for (String part : bits) {
if (i > 0) {
for (String part : bits)
{
if (i > 0)
{
// Is this a last empty item?
if ((last) && (i == headings.size())) {
if ((last) && (i == headings.size()))
{
part = "";
}
@@ -567,8 +585,10 @@ public class DSpaceCSV implements Serializable {
}
csvLine.add(headings.get(i - 1), null);
String[] elements = part.split(escapedValueSeparator);
for (String element : elements) {
if ((element != null) && (!"".equals(element))) {
for (String element : elements)
{
if ((element != null) && (!"".equals(element)))
{
csvLine.add(headings.get(i - 1), element);
}
}
@@ -584,7 +604,8 @@ public class DSpaceCSV implements Serializable {
*
* @return The lines
*/
public final List<DSpaceCSVLine> getCSVLines() {
public final List<DSpaceCSVLine> getCSVLines()
{
// Return the lines
return lines;
}
@@ -594,19 +615,22 @@ public class DSpaceCSV implements Serializable {
*
* @return the array of CSV formatted Strings
*/
public final String[] getCSVLinesAsStringArray() {
public final String[] getCSVLinesAsStringArray()
{
// Create the headings line
String[] csvLines = new String[counter + 1];
csvLines[0] = "id" + fieldSeparator + "collection";
List<String> headingsCopy = new ArrayList<>(headings);
Collections.sort(headingsCopy);
for (String value : headingsCopy) {
for (String value : headingsCopy)
{
csvLines[0] = csvLines[0] + fieldSeparator + value;
}
Iterator<DSpaceCSVLine> i = lines.iterator();
int c = 1;
while (i.hasNext()) {
while (i.hasNext())
{
csvLines[c++] = i.next().toCSV(headingsCopy, fieldSeparator, valueSeparator);
}
@@ -617,9 +641,11 @@ public class DSpaceCSV implements Serializable {
* Save the CSV file to the given filename
*
* @param filename The filename to save the CSV file to
*
* @throws IOException Thrown if an error occurs when writing the file
*/
public final void save(String filename) throws IOException {
public final void save(String filename) throws IOException
{
// Save the file
BufferedWriter out = new BufferedWriter(
new OutputStreamWriter(
@@ -640,10 +666,12 @@ public class DSpaceCSV implements Serializable {
* @param md The Metadatum to examine
* @return Whether or not it is OK to export this element
*/
protected boolean okToExport(MetadataField md) {
protected boolean okToExport(MetadataField md)
{
// Now compare with the list to ignore
String key = md.getMetadataSchema().getName() + "." + md.getElement();
if (md.getQualifier() != null) {
if (md.getQualifier() != null)
{
key += "." + md.getQualifier();
}
if (ignore.get(key) != null) {
@@ -659,7 +687,8 @@ public class DSpaceCSV implements Serializable {
*
* @return The headings
*/
public List<String> getHeadings() {
public List<String> getHeadings()
{
return headings;
}
@@ -669,11 +698,13 @@ public class DSpaceCSV implements Serializable {
* @return The formatted String as a csv
*/
@Override
public final String toString() {
public final String toString()
{
// Return the csv as one long string
StringBuilder csvLines = new StringBuilder();
String[] lines = this.getCSVLinesAsStringArray();
for (String line : lines) {
for (String line : lines)
{
csvLines.append(line).append("\n");
}
return csvLines.toString();
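The quote-merging loop in `DSpaceCSV.addItem` above first splits a CSV line on the field separator and then re-joins parts whose quoted field contained that separator. A minimal standalone sketch of that re-merge step (the class and method names here are hypothetical, not DSpace's own):

```java
import java.util.ArrayList;
import java.util.List;

public class CsvFieldMerge {
    // Re-merge naively split CSV parts whose quoted field contained the
    // separator, mirroring the while/for loop in DSpaceCSV.addItem: a part
    // that opens a quote but does not close it (or has an odd quote count)
    // is glued back onto the next part.
    static List<String> merge(List<String> bits, String sep) {
        boolean allDone = false;
        while (!allDone) {
            boolean found = false;
            for (int i = 0; i < bits.size(); i++) {
                String part = bits.get(i);
                int quotes = part.length() - part.replace("\"", "").length();
                if (part.startsWith("\"") && (!part.endsWith("\"") || (quotes & 1) == 1)) {
                    // Rejoin this part with the next one and restart the scan
                    bits.set(i, part + sep + bits.get(i + 1));
                    bits.remove(i + 1);
                    found = true;
                    break;
                }
            }
            allDone = !found;
        }
        return bits;
    }

    public static void main(String[] args) {
        // '123,"Smith, John",title' split on "," produces four parts;
        // merge() restores the quoted field to a single part.
        List<String> bits = new ArrayList<>(List.of("123", "\"Smith", " John\"", "title"));
        System.out.println(merge(bits, ","));
    }
}
```

The odd-quote-count check handles fields that contain escaped (doubled) quotes, which is why the original code counts quotes rather than only testing the trailing character.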

@@ -7,41 +7,30 @@
*/
package org.dspace.app.bulkedit;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import java.util.UUID;
import org.dspace.authority.AuthorityValue;
import org.dspace.authority.factory.AuthorityServiceFactory;
import org.dspace.authority.service.AuthorityValueService;
import java.io.Serializable;
import java.util.*;
/**
* Utility class to store a line from a CSV file
*
* @author Stuart Lewis
*/
public class DSpaceCSVLine implements Serializable {
/**
* The item id of the item represented by this line. -1 is for a new item
*/
public class DSpaceCSVLine implements Serializable
{
/** The item id of the item represented by this line. -1 is for a new item */
private final UUID id;
/**
* The elements in this line in a hashtable, keyed by the metadata type
*/
/** The elements in this line in a hashtable, keyed by the metadata type */
private final Map<String, ArrayList> items;
protected transient final AuthorityValueService authorityValueService
= AuthorityServiceFactory.getInstance().getAuthorityValueService();
/**
* ensuring that the order-sensible columns of the csv are processed in the correct order
*/
/** ensuring that the order-sensible columns of the csv are processed in the correct order */
private transient final Comparator<? super String> headerComparator = new Comparator<String>() {
@Override
public int compare(String md1, String md2) {
@@ -52,7 +41,8 @@ public class DSpaceCSVLine implements Serializable {
int compare;
if (source1 == null && source2 != null) {
compare = -1;
} else if (source1 != null && source2 == null) {
}
else if (source1 != null && source2 == null) {
compare = 1;
} else {
// the order of the rest does not matter
@@ -67,7 +57,8 @@ public class DSpaceCSVLine implements Serializable {
*
* @param itemId The item ID of the line
*/
public DSpaceCSVLine(UUID itemId) {
public DSpaceCSVLine(UUID itemId)
{
// Store the ID + separator, and initialise the hashtable
this.id = itemId;
items = new TreeMap<>(headerComparator);
@@ -77,7 +68,8 @@ public class DSpaceCSVLine implements Serializable {
/**
* Create a new CSV line for a new item
*/
public DSpaceCSVLine() {
public DSpaceCSVLine()
{
// Set the ID to be null, and initialise the hashtable
this.id = null;
this.items = new TreeMap<>(headerComparator);
@@ -88,7 +80,8 @@ public class DSpaceCSVLine implements Serializable {
*
* @return The item ID
*/
public UUID getID() {
public UUID getID()
{
// Return the ID
return id;
}
@@ -99,14 +92,17 @@ public class DSpaceCSVLine implements Serializable {
* @param key The metadata key (e.g. dc.contributor.author)
* @param value The metadata value
*/
public void add(String key, String value) {
public void add(String key, String value)
{
// Create the array list if we need to
if (items.get(key) == null) {
if (items.get(key) == null)
{
items.put(key, new ArrayList<String>());
}
// Store the item if it is not null
if (value != null) {
if (value != null)
{
items.get(key).add(value);
}
}
@@ -117,7 +113,8 @@ public class DSpaceCSVLine implements Serializable {
* @param key The metadata key
* @return All the elements that match
*/
public List<String> get(String key) {
public List<String> get(String key)
{
// Return any relevant values
return items.get(key);
}
@@ -127,7 +124,8 @@ public class DSpaceCSVLine implements Serializable {
*
* @return The action (may be blank, 'withdraw', 'reinstate' or 'delete')
*/
public String getAction() {
public String getAction()
{
if (items.containsKey("action")) {
ArrayList actions = items.get("action");
if (actions.size() > 0) {
@@ -142,7 +140,8 @@ public class DSpaceCSVLine implements Serializable {
*
* @return An enumeration of all the keys
*/
public Set<String> keys() {
public Set<String> keys()
{
// Return the keys
return items.keySet();
}
@@ -155,7 +154,8 @@ public class DSpaceCSVLine implements Serializable {
* @param valueSeparator separator between metadata values (within a field)
* @return The CSV formatted String
*/
protected String toCSV(List<String> headings, String fieldSeparator, String valueSeparator) {
protected String toCSV(List<String> headings, String fieldSeparator, String valueSeparator)
{
StringBuilder bits = new StringBuilder();
// Add the id
@@ -163,10 +163,12 @@ public class DSpaceCSVLine implements Serializable {
bits.append(valueToCSV(items.get("collection"),valueSeparator));
// Add the rest of the elements
for (String heading : headings) {
for (String heading : headings)
{
bits.append(fieldSeparator);
List<String> values = items.get(heading);
if (values != null && !"collection".equals(heading)) {
if (values != null && !"collection".equals(heading))
{
bits.append(valueToCSV(values, valueSeparator));
}
}
@@ -181,22 +183,29 @@ public class DSpaceCSVLine implements Serializable {
* @param valueSeparator value separator
* @return The line as a CSV formatted String
*/
protected String valueToCSV(List<String> values, String valueSeparator) {
protected String valueToCSV(List<String> values, String valueSeparator)
{
// Check there is some content
if (values == null) {
if (values == null)
{
return "";
}
// Get on with the work
String s;
if (values.size() == 1) {
if (values.size() == 1)
{
s = values.get(0);
} else {
}
else
{
// Concatenate any fields together
StringBuilder str = new StringBuilder();
for (String value : values) {
if (str.length() > 0) {
for (String value : values)
{
if (str.length() > 0)
{
str.append(valueSeparator);
}

@@ -7,46 +7,37 @@
*/
package org.dspace.app.bulkedit;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import org.apache.commons.cli.*;
import com.google.common.collect.Iterators;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.content.*;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.handle.factory.HandleServiceFactory;
import java.util.ArrayList;
import java.sql.SQLException;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Set;
/**
* Metadata exporter to allow the batch export of metadata into a file
*
* @author Stuart Lewis
*/
public class MetadataExport {
/**
* The items to export
*/
public class MetadataExport
{
/** The items to export */
protected Iterator<Item> toExport;
protected ItemService itemService;
protected Context context;
/**
* Whether to export all metadata, or just normally edited metadata
*/
/** Whether to export all metadata, or just normally edited metadata */
protected boolean exportAll;
protected MetadataExport() {
@@ -60,7 +51,8 @@ public class MetadataExport {
* @param toExport The ItemIterator of items to export
* @param exportAll whether to export all metadata or not (include handle, provenance etc)
*/
public MetadataExport(Context c, Iterator<Item> toExport, boolean exportAll) {
public MetadataExport(Context c, Iterator<Item> toExport, boolean exportAll)
{
itemService = ContentServiceFactory.getInstance().getItemService();
// Store the export settings
@@ -76,15 +68,19 @@ public class MetadataExport {
* @param toExport The Community to export
* @param exportAll whether to export all metadata or not (include handle, provenance etc)
*/
public MetadataExport(Context c, Community toExport, boolean exportAll) {
public MetadataExport(Context c, Community toExport, boolean exportAll)
{
itemService = ContentServiceFactory.getInstance().getItemService();
try {
try
{
// Try to export the community
this.toExport = buildFromCommunity(c, toExport, 0);
this.exportAll = exportAll;
this.context = c;
} catch (SQLException sqle) {
}
catch (SQLException sqle)
{
// Something went wrong...
System.err.println("Error running exporter:");
sqle.printStackTrace(System.err);
@@ -98,23 +94,26 @@ public class MetadataExport {
* @param context DSpace context
* @param community The community to build from
* @param indent How many spaces to use when writing out the names of items added
* @return The list of item ids
* @return Iterator over the Collection of item ids
* @throws SQLException if database error
*/
protected Iterator<Item> buildFromCommunity(Context context, Community community, int indent)
throws SQLException {
Set<Item> result = new HashSet<>();
// Add all the collections
List<Collection> collections = community.getCollections();
Iterator<Item> result = null;
for (Collection collection : collections) {
for (int i = 0; i < indent; i++) {
System.out.print(" ");
}
Iterator<Item> items = itemService.findByCollection(context, collection);
result = addItemsToResult(result, items);
while (items.hasNext()) {
result.add(items.next());
}
}
// Add all the sub-communities
List<Community> communities = community.getSubcommunities();
for (Community subCommunity : communities) {
@@ -122,20 +121,12 @@ public class MetadataExport {
System.out.print(" ");
}
Iterator<Item> items = buildFromCommunity(context, subCommunity, indent + 1);
result = addItemsToResult(result, items);
while (items.hasNext()) {
result.add(items.next());
}
}
return result;
}
private Iterator<Item> addItemsToResult(Iterator<Item> result, Iterator<Item> items) {
if (result == null) {
result = items;
} else {
result = Iterators.concat(result, items);
}
return result;
return result.iterator();
}
/**
@@ -143,14 +134,17 @@ public class MetadataExport {
*
* @return the exported CSV lines
*/
public DSpaceCSV export() {
try {
public DSpaceCSV export()
{
try
{
Context.Mode originalMode = context.getCurrentMode();
context.setMode(Context.Mode.READ_ONLY);
// Process each item
DSpaceCSV csv = new DSpaceCSV(exportAll);
while (toExport.hasNext()) {
while (toExport.hasNext())
{
Item item = toExport.next();
csv.addItem(item);
context.uncacheEntity(item);
@@ -159,7 +153,9 @@ public class MetadataExport {
context.setMode(originalMode);
// Return the results
return csv;
} catch (Exception e) {
}
catch (Exception e)
{
// Something went wrong...
System.err.println("Error exporting to CSV:");
e.printStackTrace();
@@ -173,7 +169,8 @@ public class MetadataExport {
* @param options The command line options the user gave
* @param exitCode the system exit code to use
*/
private static void printHelp(Options options, int exitCode) {
private static void printHelp(Options options, int exitCode)
{
// print the help message
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("MetadataExport\n", options);
@@ -188,7 +185,8 @@ public class MetadataExport {
* @param argv the command line arguments given
* @throws Exception if error occurs
*/
public static void main(String[] argv) throws Exception {
public static void main(String[] argv) throws Exception
{
// Create an options object and populate it
CommandLineParser parser = new PosixParser();
@@ -196,26 +194,30 @@ public class MetadataExport {
options.addOption("i", "id", true, "ID or handle of thing to export (item, collection, or community)");
options.addOption("f", "file", true, "destination where you want file written");
options.addOption("a", "all", false,
"include all metadata fields that are not normally changed (e.g. provenance)");
options.addOption("a", "all", false, "include all metadata fields that are not normally changed (e.g. provenance)");
options.addOption("h", "help", false, "help");
CommandLine line = null;
try {
try
{
line = parser.parse(options, argv);
} catch (ParseException pe) {
}
catch (ParseException pe)
{
System.err.println("Error with commands.");
printHelp(options, 1);
System.exit(0);
}
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
printHelp(options, 0);
}
// Check a filename is given
if (!line.hasOption('f')) {
if (!line.hasOption('f'))
{
System.err.println("Required parameter -f missing!");
printHelp(options, 1);
}
@@ -235,31 +237,42 @@ public class MetadataExport {
ContentServiceFactory contentServiceFactory = ContentServiceFactory.getInstance();
// Check we have an item OK
ItemService itemService = contentServiceFactory.getItemService();
if (!line.hasOption('i')) {
if (!line.hasOption('i'))
{
System.out.println("Exporting whole repository WARNING: May take some time!");
exporter = new MetadataExport(c, itemService.findAll(c), exportAll);
} else {
}
else
{
String handle = line.getOptionValue('i');
DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService().resolveToObject(c, handle);
if (dso == null) {
if (dso == null)
{
System.err.println("Item '" + handle + "' does not resolve to an item in your repository!");
printHelp(options, 1);
}
if (dso.getType() == Constants.ITEM) {
if (dso.getType() == Constants.ITEM)
{
System.out.println("Exporting item '" + dso.getName() + "' (" + handle + ")");
List<Item> item = new ArrayList<>();
item.add((Item) dso);
exporter = new MetadataExport(c, item.iterator(), exportAll);
} else if (dso.getType() == Constants.COLLECTION) {
}
else if (dso.getType() == Constants.COLLECTION)
{
System.out.println("Exporting collection '" + dso.getName() + "' (" + handle + ")");
Collection collection = (Collection)dso;
toExport = itemService.findByCollection(c, collection);
exporter = new MetadataExport(c, toExport, exportAll);
} else if (dso.getType() == Constants.COMMUNITY) {
}
else if (dso.getType() == Constants.COMMUNITY)
{
System.out.println("Exporting community '" + dso.getName() + "' (" + handle + ")");
exporter = new MetadataExport(c, (Community)dso, exportAll);
} else {
}
else
{
System.err.println("Error identifying '" + handle + "'");
System.exit(1);
}
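`MetadataExport.buildFromCommunity` above walks a community tree: it gathers the items of every collection, then recurses into each sub-community, de-duplicating with a `Set` in one side of this diff. A toy sketch of that traversal, with hypothetical stand-in classes instead of `org.dspace.content.Community`/`Collection`/`Item`:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class CommunityWalk {
    // Hypothetical stand-in for the community/collection tree: each node has
    // sub-communities and collections, and each collection is a list of item ids.
    static class Node {
        List<Node> subCommunities = new ArrayList<>();
        List<List<String>> collections = new ArrayList<>();
    }

    // Depth-first item gathering, de-duplicated with a Set so an item mapped
    // into several collections is exported only once.
    static Set<String> buildFromCommunity(Node community) {
        Set<String> result = new LinkedHashSet<>();
        for (List<String> collection : community.collections) {
            result.addAll(collection);
        }
        for (Node sub : community.subCommunities) {
            result.addAll(buildFromCommunity(sub));
        }
        return result;
    }

    public static void main(String[] args) {
        Node root = new Node();
        root.collections.add(List.of("item-1", "item-2"));
        Node sub = new Node();
        sub.collections.add(List.of("item-2", "item-3")); // item-2 mapped into two collections
        root.subCommunities.add(sub);
        System.out.println(buildFromCommunity(root)); // [item-1, item-2, item-3]
    }
}
```

The other side of the diff chains iterators instead of collecting into a set, which streams items lazily but does not de-duplicate mapped items.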

@@ -12,13 +12,15 @@ package org.dspace.app.bulkedit;
*
* @author Stuart Lewis
*/
public class MetadataImportException extends Exception {
public class MetadataImportException extends Exception
{
/**
* Instantiate a new MetadataImportException
*
* @param message the error message
*/
public MetadataImportException(String message) {
public MetadataImportException(String message)
{
super(message);
}
@@ -28,7 +30,8 @@ public class MetadataImportException extends Exception {
* @param message the error message
* @param exception the root cause
*/
public MetadataImportException(String message, Exception exception) {
public MetadataImportException(String message, Exception exception)
{
super(message, exception);
}
}

@@ -12,40 +12,27 @@ package org.dspace.app.bulkedit;
*
* @author Stuart Lewis
*/
public class MetadataImportInvalidHeadingException extends Exception {
/**
* The type of error (schema or element)
*/
public class MetadataImportInvalidHeadingException extends Exception
{
/** The type of error (schema or element) */
private int type;
/**
* The bad heading
*/
/** The bad heading */
private String badHeading;
/**
* The column number
*/
/** The column number */
private int column;
/**
* Error with the schema
*/
/** Error with the schema */
public static final int SCHEMA = 0;
/**
* Error with the element
*/
/** Error with the element */
public static final int ELEMENT = 1;
/**
* Error with a missing header
*/
/** Error with a missing header */
public static final int MISSING = 98;
/**
* Error with the whole entry
*/
/** Error with the whole entry */
public static final int ENTRY = 99;
@@ -56,7 +43,8 @@ public class MetadataImportInvalidHeadingException extends Exception {
* @param theType the type of the error
* @param theColumn column number
*/
public MetadataImportInvalidHeadingException(String message, int theType, int theColumn) {
public MetadataImportInvalidHeadingException(String message, int theType, int theColumn)
{
super(message);
badHeading = message;
type = theType;
@@ -68,7 +56,8 @@ public class MetadataImportInvalidHeadingException extends Exception {
*
* @return the type of the exception
*/
public String getType() {
public String getType()
{
return "" + type;
}
@@ -77,7 +66,8 @@ public class MetadataImportInvalidHeadingException extends Exception {
*
* @return the invalid heading
*/
public String getBadHeader() {
public String getBadHeader()
{
return badHeading;
}
@@ -86,7 +76,8 @@ public class MetadataImportInvalidHeadingException extends Exception {
*
* @return the invalid column number
*/
public int getColumn() {
public int getColumn()
{
return column;
}
@@ -96,14 +87,19 @@ public class MetadataImportInvalidHeadingException extends Exception {
* @return The exception message
*/
@Override
public String getMessage() {
if (type == SCHEMA) {
public String getMessage()
{
if (type == SCHEMA)
{
return "Unknown metadata schema in column " + column + ": " + badHeading;
} else if (type == ELEMENT) {
} else if (type == ELEMENT)
{
return "Unknown metadata element in column " + column + ": " + badHeading;
} else if (type == MISSING) {
} else if (type == MISSING)
{
return "Row with missing header: column " + column;
} else {
} else
{
return "Bad metadata declaration in column" + column + ": " + badHeading;
}
}

@@ -9,11 +9,7 @@ package org.dspace.app.checker;
import java.io.FileNotFoundException;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.List;
import java.util.UUID;
import java.util.*;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
@@ -24,15 +20,7 @@ import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.log4j.Logger;
import org.dspace.checker.BitstreamDispatcher;
import org.dspace.checker.CheckerCommand;
import org.dspace.checker.HandleDispatcher;
import org.dspace.checker.IteratorDispatcher;
import org.dspace.checker.LimitedCountDispatcher;
import org.dspace.checker.LimitedDurationDispatcher;
import org.dspace.checker.ResultsLogger;
import org.dspace.checker.ResultsPruner;
import org.dspace.checker.SimpleDispatcher;
import org.dspace.checker.*;
import org.dspace.content.Bitstream;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.BitstreamService;
@@ -47,7 +35,8 @@ import org.dspace.core.Utils;
* @author Grace Carpenter
* @author Nathan Sarr
*/
public final class ChecksumChecker {
public final class ChecksumChecker
{
private static final Logger LOG = Logger.getLogger(ChecksumChecker.class);
private static final BitstreamService bitstreamService = ContentServiceFactory.getInstance().getBitstreamService();
@@ -55,13 +44,16 @@ public final class ChecksumChecker {
/**
* Blanked off constructor, this class should be used as a command line
* tool.
*
*/
private ChecksumChecker() {
private ChecksumChecker()
{
}
/**
* Command line access to the checksum package.
*
* @param args
* <dl>
* <dt>-h</dt>
* <dd>Print help on command line options</dd>
@@ -80,8 +72,6 @@ public final class ChecksumChecker {
* <dt>-p</dt>
* <dd>Don't prune results before running checker</dd>
* </dl>
*
* @param args the command line arguments given
* @throws SQLException if error
*/
public static void main(String[] args) throws SQLException {
@@ -108,22 +98,27 @@ public final class ChecksumChecker {
options.addOption(useBitstreamIds);
options.addOption("p", "prune", false, "Prune configuration file");
options.addOption(OptionBuilder
options
.addOption(OptionBuilder
.withArgName("prune")
.hasOptionalArgs(1)
.withDescription(
"Prune old results (optionally using specified properties file for configuration)")
.create('p'));
try {
try
{
line = parser.parse(options, args);
} catch (ParseException e) {
}
catch (ParseException e)
{
LOG.fatal(e);
System.exit(1);
}
// user asks for help
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
printHelp(options);
}
Context context = null;
@@ -132,13 +127,17 @@ public final class ChecksumChecker {
// Prune stage
if (line.hasOption('p')) {
if (line.hasOption('p'))
{
ResultsPruner rp = null;
try {
try
{
rp = (line.getOptionValue('p') != null) ? ResultsPruner
.getPruner(context, line.getOptionValue('p')) : ResultsPruner
.getDefaultPruner(context);
} catch (FileNotFoundException e) {
}
catch (FileNotFoundException e)
{
LOG.error("File not found", e);
System.exit(1);
}
@@ -153,47 +152,68 @@ public final class ChecksumChecker {
// process should loop infinitely through
// most_recent_checksum table
if (line.hasOption('l')) {
if (line.hasOption('l'))
{
dispatcher = new SimpleDispatcher(context, processStart, false);
} else if (line.hasOption('L')) {
}
else if (line.hasOption('L'))
{
dispatcher = new SimpleDispatcher(context, processStart, true);
} else if (line.hasOption('b')) {
}
else if (line.hasOption('b'))
{
// check only specified bitstream(s)
String[] ids = line.getOptionValues('b');
List<Bitstream> bitstreams = new ArrayList<>(ids.length);
for (int i = 0; i < ids.length; i++) {
try {
for (int i = 0; i < ids.length; i++)
{
try
{
bitstreams.add(bitstreamService.find(context, UUID.fromString(ids[i])));
} catch (NumberFormatException nfe) {
}
catch (NumberFormatException nfe)
{
System.err.println("The following argument: " + ids[i]
+ " is not an integer");
System.exit(0);
}
}
dispatcher = new IteratorDispatcher(bitstreams.iterator());
} else if (line.hasOption('a')) {
}
else if (line.hasOption('a'))
{
dispatcher = new HandleDispatcher(context, line.getOptionValue('a'));
} else if (line.hasOption('d')) {
}
else if (line.hasOption('d'))
{
// run checker process for specified duration
try {
try
{
dispatcher = new LimitedDurationDispatcher(
new SimpleDispatcher(context, processStart, true), new Date(
System.currentTimeMillis()
+ Utils.parseDuration(line
.getOptionValue('d'))));
} catch (Exception e) {
}
catch (Exception e)
{
LOG.fatal("Couldn't parse " + line.getOptionValue('d')
+ " as a duration: ", e);
System.exit(0);
}
} else if (line.hasOption('c')) {
}
else if (line.hasOption('c'))
{
int count = Integer.valueOf(line.getOptionValue('c'));
// run checker process for specified number of bitstreams
dispatcher = new LimitedCountDispatcher(new SimpleDispatcher(
context, processStart, false), count);
} else {
}
else
{
dispatcher = new LimitedCountDispatcher(new SimpleDispatcher(
context, processStart, false), 1);
}
@@ -201,7 +221,8 @@ public final class ChecksumChecker {
ResultsLogger logger = new ResultsLogger(processStart);
CheckerCommand checker = new CheckerCommand(context);
// verbose reporting
if (line.hasOption('v')) {
if (line.hasOption('v'))
{
checker.setReportVerbose(true);
}
@@ -223,19 +244,24 @@ public final class ChecksumChecker {
*
* @param options that are available for the user
*/
private static void printHelp(Options options) {
private static void printHelp(Options options)
{
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("Checksum Checker\n", options);
System.out.println("\nSpecify a duration for checker process, using s(seconds),"
System.out
.println("\nSpecify a duration for checker process, using s(seconds),"
+ "m(minutes), or h(hours): ChecksumChecker -d 30s"
+ " OR ChecksumChecker -d 30m"
+ " OR ChecksumChecker -d 2h");
System.out.println("\nSpecify bitstream IDs: ChecksumChecker -b 13 15 17 20");
System.out
.println("\nSpecify bitstream IDs: ChecksumChecker -b 13 15 17 20");
System.out.println("\nLoop once through all bitstreams: "
+ "ChecksumChecker -l");
System.out.println("\nLoop continuously through all bitstreams: ChecksumChecker -L");
System.out.println("\nCheck a defined number of bitstreams: ChecksumChecker -c 10");
System.out
.println("\nLoop continuously through all bitstreams: ChecksumChecker -L");
System.out
.println("\nCheck a defined number of bitstreams: ChecksumChecker -c 10");
System.out.println("\nReport all processing (verbose)(default reports only errors): ChecksumChecker -v");
System.out.println("\nDefault (no arguments) is equivalent to '-c 1'");
System.exit(0);
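The checker's `-d` option above accepts durations such as `30s`, `30m`, or `2h` and hands them to `Utils.parseDuration`. A minimal sketch of what such a parser does, assuming a numeric value followed by a one-letter unit (this is an illustration, not DSpace's `org.dspace.core.Utils` implementation):

```java
public class Durations {
    // Parse "30s" / "30m" / "2h" into milliseconds: split off the trailing
    // unit character, parse the rest as a number, scale by the unit.
    static long parseDuration(String duration) {
        long value = Long.parseLong(duration.substring(0, duration.length() - 1));
        char unit = duration.charAt(duration.length() - 1);
        switch (unit) {
            case 's': return value * 1_000L;
            case 'm': return value * 60_000L;
            case 'h': return value * 3_600_000L;
            default:  throw new IllegalArgumentException("Unknown unit: " + unit);
        }
    }

    public static void main(String[] args) {
        System.out.println(parseDuration("30s")); // 30000
        System.out.println(parseDuration("2h"));  // 7200000
    }
}
```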

@@ -7,12 +7,12 @@
*/
package org.dspace.app.configuration;
import java.io.File;
import java.net.MalformedURLException;
import org.dspace.kernel.config.SpringLoader;
import org.dspace.services.ConfigurationService;
import java.io.File;
import java.net.MalformedURLException;
/**
* @author Kevin Van de Velde (kevin at atmire dot com)
*/

@@ -7,17 +7,7 @@
*/
package org.dspace.app.harvest;
import java.io.IOException;
import java.sql.SQLException;
import java.util.Iterator;
import java.util.List;
import java.util.UUID;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.cli.*;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Collection;
import org.dspace.content.DSpaceObject;
@@ -37,21 +27,27 @@ import org.dspace.harvest.OAIHarvester;
import org.dspace.harvest.factory.HarvestServiceFactory;
import org.dspace.harvest.service.HarvestedCollectionService;
import java.io.IOException;
import java.sql.SQLException;
import java.util.Iterator;
import java.util.List;
import java.util.UUID;
/**
* Test class for harvested collections.
*
* @author Alexey Maslov
*/
public class Harvest {
public class Harvest
{
private static Context context;
private static final HarvestedCollectionService harvestedCollectionService =
HarvestServiceFactory.getInstance().getHarvestedCollectionService();
private static final HarvestedCollectionService harvestedCollectionService = HarvestServiceFactory.getInstance().getHarvestedCollectionService();
private static final EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService();
private static final CollectionService collectionService =
ContentServiceFactory.getInstance().getCollectionService();
private static final CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService();
public static void main(String[] argv) throws Exception {
public static void main(String[] argv) throws Exception
{
// create an options object and populate it
CommandLineParser parser = new PosixParser();
@@ -60,26 +56,18 @@ public class Harvest {
options.addOption("p", "purge", false, "delete all items in the collection");
options.addOption("r", "run", false, "run the standard harvest procedure");
options.addOption("g", "ping", false, "test the OAI server and set");
options.addOption("o", "once", false, "run the harvest procedure with specified parameters");
options.addOption("s", "setup", false, "Set the collection up for harvesting");
options.addOption("S", "start", false, "start the harvest loop");
options.addOption("R", "reset", false, "reset harvest status on all collections");
options.addOption("P", "purge", false, "purge all harvestable collections");
options.addOption("P", "purgeAll", false, "purge all harvestable collections");
options.addOption("e", "eperson", true,
"eperson");
options.addOption("c", "collection", true,
"harvesting collection (handle or id)");
options.addOption("t", "type", true,
"type of harvesting (0 for none)");
options.addOption("a", "address", true,
"address of the OAI-PMH server");
options.addOption("i", "oai_set_id", true,
"id of the PMH set representing the harvested collection");
options.addOption("m", "metadata_format", true,
"the name of the desired metadata format for harvesting, resolved to namespace and " +
"crosswalk in dspace.cfg");
options.addOption("e", "eperson", true, "eperson");
options.addOption("c", "collection", true, "harvesting collection (handle or id)");
options.addOption("t", "type", true, "type of harvesting (0 for none)");
options.addOption("a", "address", true, "address of the OAI-PMH server");
options.addOption("i", "oai_set_id", true, "id of the PMH set representing the harvested collection");
options.addOption("m", "metadata_format", true, "the name of the desired metadata format for harvesting, resolved to namespace and crosswalk in dspace.cfg");
options.addOption("h", "help", false, "help");
@@ -93,21 +81,25 @@ public class Harvest {
String metadataKey = null;
int harvestType = 0;
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("Harvest\n", options);
System.out.println("\nPING OAI server: Harvest -g -a oai_source -i oai_set_id");
System.out.println(
"RUNONCE harvest with arbitrary options: Harvest -o -e eperson -c collection -t harvest_type -a " +
"oai_source -i oai_set_id -m metadata_format");
System.out.println(
"SETUP a collection for harvesting: Harvest -s -c collection -t harvest_type -a oai_source -i " +
"oai_set_id -m metadata_format");
System.out.println("RUN harvest once: Harvest -r -e eperson -c collection");
System.out.println("START harvest scheduler: Harvest -S");
System.out.println("RESET all harvest status: Harvest -R");
System.out.println("PURGE a collection of items and settings: Harvest -p -e eperson -c collection");
System.out.println("PURGE all harvestable collections: Harvest -P -e eperson");
System.out
.println("\nPING OAI server: Harvest -g -a oai_source -i oai_set_id");
System.out
.println("SETUP a collection for harvesting: Harvest -s -c collection -t harvest_type -a oai_source -i oai_set_id -m metadata_format");
System.out
.println("RUN harvest once: Harvest -r -e eperson -c collection");
System.out
.println("START harvest scheduler: Harvest -S");
System.out
.println("RESET all harvest status: Harvest -R");
System.out
.println("PURGE a collection of items and settings: Harvest -p -e eperson -c collection");
System.out
.println("PURGE all harvestable collections: Harvest -P -e eperson");
System.exit(0);
@@ -125,9 +117,6 @@ public class Harvest {
if (line.hasOption('g')) {
command = "ping";
}
if (line.hasOption('o')) {
command = "runOnce";
}
if (line.hasOption('S')) {
command = "start";
}
@@ -167,13 +156,17 @@ public class Harvest {
// Check our options
if (command == null) {
if (command == null)
{
System.out
.println("Error - no parameters specified (run with -h flag for details)");
System.exit(1);
} else if ("run".equals(command)) {
}
// Run a single harvest cycle on a collection using saved settings.
if (collection == null || eperson == null) {
else if ("run".equals(command))
{
if (collection == null || eperson == null)
{
System.out
.println("Error - a target collection and eperson must be provided");
System.out.println(" (run with -h flag for details)");
@@ -181,33 +174,43 @@ public class Harvest {
}
harvester.runHarvest(collection, eperson);
} else if ("start".equals(command)) {
}
// start the harvest loop
else if ("start".equals(command))
{
startHarvester();
} else if ("reset".equals(command)) {
}
// reset harvesting status
else if ("reset".equals(command))
{
resetHarvesting();
} else if ("purgeAll".equals(command)) {
}
// purge all collections that are set up for harvesting (obviously for testing purposes only)
if (eperson == null) {
else if ("purgeAll".equals(command))
{
if (eperson == null)
{
System.out
.println("Error - an eperson must be provided");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
System.out.println("Starting to purge all harvesting collections");
List<HarvestedCollection> harvestedCollections = harvestedCollectionService.findAll(context);
for (HarvestedCollection harvestedCollection : harvestedCollections) {
System.out.println(
"Purging the following collections (deleting items and resetting harvest status): " +
harvestedCollection
.getCollection().getID().toString());
for (HarvestedCollection harvestedCollection : harvestedCollections)
{
System.out.println("Purging the following collections (deleting items and resetting harvest status): " + harvestedCollection.getCollection().getID().toString());
harvester.purgeCollection(harvestedCollection.getCollection().getID().toString(), eperson);
}
context.complete();
} else if ("purge".equals(command)) {
}
// Delete all items in a collection. Useful for testing fresh harvests.
if (collection == null || eperson == null) {
else if ("purge".equals(command))
{
if (collection == null || eperson == null)
{
System.out
.println("Error - a target collection and eperson must be provided");
System.out.println(" (run with -h flag for details)");
@@ -218,34 +221,45 @@ public class Harvest {
context.complete();
//TODO: implement this... remove all items and remember to unset "last-harvested" settings
} else if ("config".equals(command)) {
}
// Configure a collection with the three main settings
if (collection == null) {
else if ("config".equals(command))
{
if (collection == null)
{
System.out.println("Error - a target collection must be provided");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
if (oaiSource == null || oaiSetID == null) {
if (oaiSource == null || oaiSetID == null)
{
System.out.println("Error - both the OAI server address and OAI set id must be specified");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
if (metadataKey == null) {
System.out
.println("Error - a metadata key (commonly the prefix) must be specified for this collection");
if (metadataKey == null)
{
System.out.println("Error - a metadata key (commonly the prefix) must be specified for this collection");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
harvester.configureCollection(collection, harvestType, oaiSource, oaiSetID, metadataKey);
} else if ("ping".equals(command)) {
if (oaiSource == null || oaiSetID == null) {
}
else if ("ping".equals(command))
{
if (oaiSource == null || oaiSetID == null)
{
System.out.println("Error - both the OAI server address and OAI set id must be specified");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
pingResponder(oaiSource, oaiSetID, metadataKey);
} else {
System.out.println("Error - your command '" + command + "' was not recognized properly");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
}
@@ -260,30 +274,40 @@ public class Harvest {
try {
// is the ID a handle?
if (collectionID != null) {
if (collectionID.indexOf('/') != -1) {
if (collectionID != null)
{
if (collectionID.indexOf('/') != -1)
{
// string has a / so it must be a handle - try and resolve it
dso = HandleServiceFactory.getInstance().getHandleService().resolveToObject(context, collectionID);
// resolved, now make sure it's a collection
if (dso == null || dso.getType() != Constants.COLLECTION) {
if (dso == null || dso.getType() != Constants.COLLECTION)
{
targetCollection = null;
} else {
}
else
{
targetCollection = (Collection) dso;
}
} else {
// not a handle, try and treat it as an integer collection database ID
System.out.println("Looking up by id: " + collectionID + ", parsed as '" + Integer
.parseInt(collectionID) + "', " + "in context: " + context);
}
// not a handle, try and treat it as an integer collection
// database ID
else
{
// not a handle, try and treat it as an collection database UUID
System.out.println("Looking up by UUID: " + collectionID + ", " + "in context: " + context);
targetCollection = collectionService.find(context, UUID.fromString(collectionID));
}
}
// was the collection valid?
if (targetCollection == null) {
if (targetCollection == null)
{
System.out.println("Cannot resolve " + collectionID + " to collection");
System.exit(1);
}
} catch (SQLException se) {
}
catch (SQLException se) {
se.printStackTrace();
}
@@ -291,8 +315,7 @@ public class Harvest {
}
private void configureCollection(String collectionID, int type, String oaiSource, String oaiSetId,
String mdConfigId) {
private void configureCollection(String collectionID, int type, String oaiSource, String oaiSetId, String mdConfigId) {
System.out.println("Running: configure collection");
Collection collection = resolveCollection(collectionID);
@@ -310,12 +333,15 @@ public class Harvest {
harvestedCollectionService.update(context, hc);
context.restoreAuthSystemState();
context.complete();
} catch (Exception e) {
}
catch (Exception e) {
System.out.println("Changes could not be committed");
e.printStackTrace();
System.exit(1);
} finally {
if (context != null) {
}
finally {
if (context != null)
{
context.restoreAuthSystemState();
}
}
@@ -329,11 +355,11 @@ public class Harvest {
* @param email
*/
private void purgeCollection(String collectionID, String email) {
System.out.println(
"Purging collection of all items and resetting last_harvested and harvest_message: " + collectionID);
System.out.println("Purging collection of all items and resetting last_harvested and harvest_message: " + collectionID);
Collection collection = resolveCollection(collectionID);
try {
try
{
EPerson eperson = ePersonService.findByEmail(context, email);
context.setCurrentUser(eperson);
context.turnOffAuthorisationSystem();
@@ -346,7 +372,9 @@ public class Harvest {
Item item = it.next();
System.out.println("Deleting: " + item.getHandle());
collectionService.removeItem(context, collection, item);
context.uncacheEntity(item);// Dispatch events every 50 items
context.uncacheEntity(item);
// Dispatch events every 50 items
if (i%50 == 0) {
context.dispatchEvents();
i=0;
@@ -363,11 +391,13 @@ public class Harvest {
}
context.restoreAuthSystemState();
context.dispatchEvents();
} catch (Exception e) {
}
catch (Exception e) {
System.out.println("Changes could not be committed");
e.printStackTrace();
System.exit(1);
} finally {
}
finally {
context.restoreAuthSystemState();
}
}
@@ -386,7 +416,8 @@ public class Harvest {
HarvestedCollection hc = harvestedCollectionService.find(context, collection);
harvester = new OAIHarvester(context, collection, hc);
System.out.println("success. ");
} catch (HarvestingException hex) {
}
catch (HarvestingException hex) {
System.out.print("failed. ");
System.out.println(hex.getMessage());
throw new IllegalStateException("Unable to harvest", hex);
@@ -403,11 +434,14 @@ public class Harvest {
context.setCurrentUser(eperson);
harvester.runHarvest();
context.complete();
} catch (SQLException e) {
}
catch (SQLException e) {
throw new IllegalStateException("Failed to run harvester", e);
} catch (AuthorizeException e) {
}
catch (AuthorizeException e) {
throw new IllegalStateException("Failed to run harvester", e);
} catch (IOException e) {
}
catch (IOException e) {
throw new IllegalStateException("Failed to run harvester", e);
}
@@ -415,22 +449,24 @@ public class Harvest {
}
/**
* Resets harvest_status and harvest_start_time flags for all collections that have a row in the
* harvested_collections table
* Resets harvest_status and harvest_start_time flags for all collections that have a row in the harvested_collections table
*/
private static void resetHarvesting() {
System.out.print("Resetting harvest status flag on all collections... ");
try {
try
{
List<HarvestedCollection> harvestedCollections = harvestedCollectionService.findAll(context);
for (HarvestedCollection harvestedCollection : harvestedCollections) {
for (HarvestedCollection harvestedCollection : harvestedCollections)
{
//hc.setHarvestResult(null,"");
harvestedCollection.setHarvestStartTime(null);
harvestedCollection.setHarvestStatus(HarvestedCollection.STATUS_READY);
harvestedCollectionService.update(context, harvestedCollection);
}
System.out.println("success. ");
} catch (Exception ex) {
}
catch (Exception ex) {
System.out.println("failed. ");
ex.printStackTrace();
}
@@ -439,12 +475,15 @@ public class Harvest {
/**
* Starts up the harvest scheduler. Terminating this process will stop the scheduler.
*/
private static void startHarvester() {
try {
private static void startHarvester()
{
try
{
System.out.print("Starting harvest loop... ");
HarvestServiceFactory.getInstance().getHarvestSchedulingService().startNewScheduler();
System.out.println("running. ");
} catch (Exception ex) {
}
catch (Exception ex) {
ex.printStackTrace();
}
}
@@ -456,29 +495,30 @@ public class Harvest {
* @param set name of an item set.
* @param metadataFormat local prefix name, or null for "dc".
*/
private static void pingResponder(String server, String set, String metadataFormat) {
private static void pingResponder(String server, String set, String metadataFormat)
{
List<String> errors;
System.out.print("Testing basic PMH access: ");
errors = OAIHarvester.verifyOAIharvester(server, set,
(null != metadataFormat) ? metadataFormat : "dc", false);
if (errors.isEmpty()) {
if (errors.isEmpty())
System.out.println("OK");
} else {
for (String error : errors) {
else
{
for (String error : errors)
System.err.println(error);
}
}
System.out.print("Testing ORE support: ");
errors = OAIHarvester.verifyOAIharvester(server, set,
(null != metadataFormat) ? metadataFormat : "dc", true);
if (errors.isEmpty()) {
if (errors.isEmpty())
System.out.println("OK");
} else {
for (String error : errors) {
else
{
for (String error : errors)
System.err.println(error);
}
}
}
}
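The Harvest diff above repeatedly restyles the same flag-to-command dispatch. As a minimal, runnable sketch of that pattern (stdlib only — not DSpace code; the flag letters and command strings are taken from the options in the diff, the `commandFor` helper is hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sketch of Harvest.main's dispatch: each boolean flag selects one command
 * string; an unknown flag yields null, which Harvest reports as an
 * unrecognized command before exiting.
 */
public class HarvestDispatchSketch {
    static String commandFor(String flag) {
        Map<String, String> commands = new LinkedHashMap<>();
        commands.put("r", "run");      // run the standard harvest procedure
        commands.put("g", "ping");     // test the OAI server and set
        commands.put("S", "start");    // start the harvest loop
        commands.put("R", "reset");    // reset harvest status on all collections
        commands.put("P", "purgeAll"); // purge all harvestable collections
        commands.put("p", "purge");    // delete all items in the collection
        commands.put("s", "config");   // set the collection up for harvesting
        return commands.get(flag);     // null plays the "not recognized" role
    }

    public static void main(String[] args) {
        System.out.println(commandFor("g"));
        System.out.println(commandFor("x"));
    }
}
```

The real class builds these flags with commons-cli `Options.addOption` and branches on `line.hasOption(...)`; the map here only illustrates the mapping.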


@@ -7,17 +7,7 @@
*/
package org.dspace.app.itemexport;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.UUID;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.cli.*;
import org.dspace.app.itemexport.factory.ItemExportServiceFactory;
import org.dspace.app.itemexport.service.ItemExportService;
import org.dspace.content.Collection;
@@ -30,6 +20,8 @@ import org.dspace.core.Context;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;
import java.util.*;
/**
* Item exporter to create simple AIPs for DSpace content. Currently exports
* individual items, or entire collections. For instructions on use, see
@@ -53,21 +45,17 @@ import org.dspace.handle.service.HandleService;
*/
public class ItemExportCLITool {
protected static ItemExportService itemExportService = ItemExportServiceFactory.getInstance()
.getItemExportService();
protected static ItemExportService itemExportService = ItemExportServiceFactory.getInstance().getItemExportService();
protected static HandleService handleService = HandleServiceFactory.getInstance().getHandleService();
protected static ItemService itemService = ContentServiceFactory.getInstance().getItemService();
protected static CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService();
/**
* Default constructor
*/
private ItemExportCLITool() { }
/*
*
*/
public static void main(String[] argv) throws Exception {
public static void main(String[] argv) throws Exception
{
// create an options object and populate it
CommandLineParser parser = new PosixParser();
@@ -77,8 +65,7 @@ public class ItemExportCLITool {
options.addOption("i", "id", true, "ID or handle of thing to export");
options.addOption("d", "dest", true,
"destination where you want items to go");
options.addOption("m", "migrate", false,
"export for migration (remove handle and metadata that will be re-created in new system)");
options.addOption("m", "migrate", false, "export for migration (remove handle and metadata that will be re-created in new system)");
options.addOption("n", "number", true,
"sequence number to begin exporting items with");
options.addOption("z", "zip", true, "export as zip file (specify filename e.g. export.zip)");
@@ -99,7 +86,8 @@ public class ItemExportCLITool {
Item myItem = null;
Collection mycollection = null;
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("ItemExport\n", options);
System.out
@@ -110,65 +98,79 @@ public class ItemExportCLITool {
System.exit(0);
}
if (line.hasOption('t')) { // type
if (line.hasOption('t')) // type
{
typeString = line.getOptionValue('t');
if ("ITEM".equals(typeString)) {
if ("ITEM".equals(typeString))
{
myType = Constants.ITEM;
} else if ("COLLECTION".equals(typeString)) {
}
else if ("COLLECTION".equals(typeString))
{
myType = Constants.COLLECTION;
}
}
if (line.hasOption('i')) { // id
if (line.hasOption('i')) // id
{
myIDString = line.getOptionValue('i');
}
if (line.hasOption('d')) { // dest
if (line.hasOption('d')) // dest
{
destDirName = line.getOptionValue('d');
}
if (line.hasOption('n')) { // number
if (line.hasOption('n')) // number
{
seqStart = Integer.parseInt(line.getOptionValue('n'));
}
boolean migrate = false;
if (line.hasOption('m')) { // number
if (line.hasOption('m')) // number
{
migrate = true;
}
boolean zip = false;
String zipFileName = "";
if (line.hasOption('z')) {
if (line.hasOption('z'))
{
zip = true;
zipFileName = line.getOptionValue('z');
}
boolean excludeBitstreams = false;
if (line.hasOption('x')) {
if (line.hasOption('x'))
{
excludeBitstreams = true;
}
// now validate the args
if (myType == -1) {
if (myType == -1)
{
System.out
.println("type must be either COLLECTION or ITEM (-h for help)");
System.exit(1);
}
if (destDirName == null) {
if (destDirName == null)
{
System.out
.println("destination directory must be set (-h for help)");
System.exit(1);
}
if (seqStart == -1) {
if (seqStart == -1)
{
System.out
.println("sequence start number must be set (-h for help)");
System.exit(1);
}
if (myIDString == null) {
if (myIDString == null)
{
System.out
.println("ID must be set to either a database ID or a handle (-h for help)");
System.exit(1);
@@ -177,62 +179,82 @@ public class ItemExportCLITool {
Context c = new Context(Context.Mode.READ_ONLY);
c.turnOffAuthorisationSystem();
if (myType == Constants.ITEM) {
if (myType == Constants.ITEM)
{
// first, is myIDString a handle?
if (myIDString.indexOf('/') != -1) {
if (myIDString.indexOf('/') != -1)
{
myItem = (Item) handleService.resolveToObject(c, myIDString);
if ((myItem == null) || (myItem.getType() != Constants.ITEM)) {
if ((myItem == null) || (myItem.getType() != Constants.ITEM))
{
myItem = null;
}
} else {
}
else
{
myItem = itemService.find(c, UUID.fromString(myIDString));
}
if (myItem == null) {
if (myItem == null)
{
System.out
.println("Error, item cannot be found: " + myIDString);
}
} else {
if (myIDString.indexOf('/') != -1) {
}
else
{
if (myIDString.indexOf('/') != -1)
{
// has a / must be a handle
mycollection = (Collection) handleService.resolveToObject(c,
myIDString);
// ensure it's a collection
if ((mycollection == null)
|| (mycollection.getType() != Constants.COLLECTION)) {
|| (mycollection.getType() != Constants.COLLECTION))
{
mycollection = null;
}
} else if (myIDString != null) {
}
else if (myIDString != null)
{
mycollection = collectionService.find(c, UUID.fromString(myIDString));
}
if (mycollection == null) {
if (mycollection == null)
{
System.out.println("Error, collection cannot be found: "
+ myIDString);
System.exit(1);
}
}
if (zip) {
if (zip)
{
Iterator<Item> items;
if (myItem != null) {
if (myItem != null)
{
List<Item> myItems = new ArrayList<>();
myItems.add(myItem);
items = myItems.iterator();
} else {
}
else
{
System.out.println("Exporting from collection: " + myIDString);
items = itemService.findByCollection(c, mycollection);
}
itemExportService.exportAsZip(c, items, destDirName, zipFileName, seqStart, migrate, excludeBitstreams);
} else {
if (myItem != null) {
}
else
{
if (myItem != null)
{
// it's only a single item
itemExportService
.exportItem(c, Collections.singletonList(myItem).iterator(), destDirName, seqStart, migrate,
excludeBitstreams);
} else {
itemExportService.exportItem(c, Collections.singletonList(myItem).iterator(), destDirName, seqStart, migrate, excludeBitstreams);
}
else
{
System.out.println("Exporting from collection: " + myIDString);
// it's a collection, so do a bunch of items
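Both `Harvest.resolveCollection` and `ItemExportCLITool` above use the same ID-resolution rule: a `/` marks a handle, anything else is parsed as a UUID. A runnable sketch of just that rule (the `classify` helper and `Kind` enum are illustrative assumptions, not DSpace API):

```java
import java.util.UUID;

/**
 * Sketch of the shared ID rule: handles contain '/', e.g. "123456789/42"
 * (resolved via HandleService.resolveToObject); other strings are treated
 * as database UUIDs (resolved via find(context, UUID.fromString(id))).
 */
public class IdKindSketch {
    enum Kind { HANDLE, UUID_ID, INVALID }

    static Kind classify(String id) {
        if (id.indexOf('/') != -1) {
            return Kind.HANDLE;      // string has a '/', so it must be a handle
        }
        try {
            UUID.fromString(id);     // not a handle: try a collection/item UUID
            return Kind.UUID_ID;
        } catch (IllegalArgumentException e) {
            return Kind.INVALID;     // the real code would report and exit
        }
    }

    public static void main(String[] args) {
        System.out.println(classify("123456789/42"));
        System.out.println(classify("f47ac10b-58cc-4372-a567-0e02b2c3d479"));
        System.out.println(classify("not-an-id"));
    }
}
```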


@@ -10,17 +10,20 @@ package org.dspace.app.itemexport;
/**
* An exception that can be thrown when error occur during item export
*/
public class ItemExportException extends Exception {
public class ItemExportException extends Exception
{
public static final int EXPORT_TOO_LARGE = 0;
private int reason;
public ItemExportException(int r, String message) {
public ItemExportException(int r, String message)
{
super(message);
reason = r;
}
public int getReason() {
public int getReason()
{
return reason;
}
}
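ItemExportException above carries a numeric reason code alongside the message. A self-contained sketch of how a caller might throw and branch on it (the `checkSize` helper and its limit are hypothetical; only the exception shape mirrors the diff):

```java
/**
 * Demo of ItemExportException's reason-code pattern: throw with
 * EXPORT_TOO_LARGE, then branch on getReason() in the handler.
 */
public class ItemExportExceptionDemo {
    static class ItemExportException extends Exception {
        public static final int EXPORT_TOO_LARGE = 0;
        private final int reason;

        public ItemExportException(int r, String message) {
            super(message);
            reason = r;
        }

        public int getReason() {
            return reason;
        }
    }

    static String checkSize(long bytes, long limit) {
        try {
            if (bytes > limit) {
                throw new ItemExportException(ItemExportException.EXPORT_TOO_LARGE,
                        "export of " + bytes + " bytes exceeds limit " + limit);
            }
            return "ok";
        } catch (ItemExportException e) {
            return e.getReason() == ItemExportException.EXPORT_TOO_LARGE
                    ? "too large" : "error";
        }
    }

    public static void main(String[] args) {
        System.out.println(checkSize(10, 100));
        System.out.println(checkSize(500, 100));
    }
}
```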


@@ -11,8 +11,7 @@ import org.dspace.app.itemexport.service.ItemExportService;
import org.dspace.services.factory.DSpaceServicesFactory;
/**
* Abstract factory to get services for the itemexport package, use ItemExportServiceFactory.getInstance() to
* retrieve an implementation
* Abstract factory to get services for the itemexport package, use ItemExportServiceFactory.getInstance() to retrieve an implementation
*
* @author kevinvandevelde at atmire.com
*/
@@ -21,7 +20,6 @@ public abstract class ItemExportServiceFactory {
public abstract ItemExportService getItemExportService();
public static ItemExportServiceFactory getInstance(){
return DSpaceServicesFactory.getInstance().getServiceManager()
.getServiceByName("itemExportServiceFactory", ItemExportServiceFactory.class);
return DSpaceServicesFactory.getInstance().getServiceManager().getServiceByName("itemExportServiceFactory", ItemExportServiceFactory.class);
}
}
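ItemExportServiceFactory.getInstance() above resolves a bean by name and type from DSpace's service manager. A stdlib stand-in for that lookup pattern (the bean name "itemExportServiceFactory" comes from the diff; the registry map is an assumption, not DSpace's Spring-backed ServiceManager API):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Sketch of getServiceByName(name, type): look a bean up by name and return
 * it only if it is an instance of the expected type, else null.
 */
public class ServiceLookupSketch {
    static final Map<String, Object> REGISTRY = new HashMap<>();

    static <T> T getServiceByName(String name, Class<T> type) {
        Object bean = REGISTRY.get(name);
        return type.isInstance(bean) ? type.cast(bean) : null;
    }

    interface ItemExportServiceFactory {
        String describe();
    }

    public static void main(String[] args) {
        // register a stand-in bean under the name used in the diff
        REGISTRY.put("itemExportServiceFactory",
                (ItemExportServiceFactory) () -> "item export factory");
        ItemExportServiceFactory f =
                getServiceByName("itemExportServiceFactory", ItemExportServiceFactory.class);
        System.out.println(f.describe());
    }
}
```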


@@ -11,8 +11,7 @@ import org.dspace.app.itemexport.service.ItemExportService;
import org.springframework.beans.factory.annotation.Autowired;
/**
* Factory implementation to get services for the itemexport package, use ItemExportServiceFactory.getInstance() to
* retrieve an implementation
* Factory implementation to get services for the itemexport package, use ItemExportServiceFactory.getInstance() to retrieve an implementation
*
* @author kevinvandevelde at atmire.com
*/


@@ -7,17 +7,17 @@
*/
package org.dspace.app.itemexport.service;
import java.io.InputStream;
import java.util.Date;
import java.util.Iterator;
import java.util.List;
import javax.mail.MessagingException;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;
import javax.mail.MessagingException;
import java.io.InputStream;
import java.util.Date;
import java.util.Iterator;
import java.util.List;
/**
* Item exporter to create simple AIPs for DSpace content. Currently exports
* individual items, or entire collections. For instructions on use, see
@@ -71,8 +71,10 @@ public interface ItemExportService {
* Convenience method to export a single Community, Collection, or
* Item
*
* @param dso - the dspace object to export
* @param context - the dspace context
* @param dso
* - the dspace object to export
* @param context
* - the dspace context
* @param migrate Whether to use the migrate option or not
* @throws Exception if error
*/
@@ -83,8 +85,10 @@ public interface ItemExportService {
* Convenience method to export a List of dspace objects (Community,
* Collection or Item)
*
* @param dsObjects - List containing dspace objects
* @param context - the dspace context
* @param dsObjects
* - List containing dspace objects
* @param context
* - the dspace context
* @param migrate Whether to use the migrate option or not
* @throws Exception if error
*/
@@ -95,9 +99,12 @@ public interface ItemExportService {
* Convenience method to export a single Community, Collection, or
* Item
*
* @param dso - the dspace object to export
* @param context - the dspace context
* @param additionalEmail - cc email to use
* @param dso
* - the dspace object to export
* @param context
* - the dspace context
* @param additionalEmail
* - cc email to use
* @param migrate Whether to use the migrate option or not
* @throws Exception if error
*/
@@ -108,9 +115,12 @@ public interface ItemExportService {
* Convenience method to export a List of dspace objects (Community,
* Collection or Item)
*
* @param dsObjects - List containing dspace objects
* @param context - the dspace context
* @param additionalEmail - cc email to use
* @param dsObjects
* - List containing dspace objects
* @param context
* - the dspace context
* @param additionalEmail
* - cc email to use
* @param migrate Whether to use the migrate option or not
* @throws Exception if error
*/
@@ -122,8 +132,10 @@ public interface ItemExportService {
* Create a file name based on the date and eperson
*
* @param type Type of object (as string)
* @param eperson - eperson who requested export and will be able to download it
* @param date - the date the export process was created
* @param eperson
* - eperson who requested export and will be able to download it
* @param date
* - the date the export process was created
* @return String representing the file name in the form of
* 'export_yyy_MMM_dd_count_epersonID'
* @throws Exception if error
@@ -136,7 +148,8 @@ public interface ItemExportService {
* Use config file entry for org.dspace.app.itemexport.download.dir and id
* of the eperson to create a download directory name
*
* @param ePerson - the eperson who requested export archive
* @param ePerson
* - the eperson who requested export archive
* @return String representing a directory in the form of
* org.dspace.app.itemexport.download.dir/epersonID
* @throws Exception if error
@@ -157,8 +170,10 @@ public interface ItemExportService {
/**
* Used to read the export archive. Intended for download.
*
* @param fileName the name of the file to download
* @param eperson the eperson requesting the download
* @param fileName
* the name of the file to download
* @param eperson
* the eperson requesting the download
* @return an input stream of the file to be downloaded
* @throws Exception if error
*/
@@ -169,9 +184,10 @@ public interface ItemExportService {
* Get the file size of the export archive represented by the file name.
*
* @param context DSpace context
* @param fileName name of the file to get the size.
* @return size as long
* @param fileName
* name of the file to get the size.
* @throws Exception if error
* @return size as long
*/
public long getExportFileSize(Context context, String fileName) throws Exception;
@@ -179,10 +195,11 @@ public interface ItemExportService {
* Get the last modified date of the export archive represented by the file name.
*
* @param context DSpace context
* @param fileName name of the file to get the size.
* @param fileName
* name of the file to get the size.
* @return date as long
* @throws Exception if error
* @see java.io.File#lastModified()
* @throws Exception if error
*/
public long getExportFileLastModified(Context context, String fileName)
throws Exception;
@@ -192,8 +209,10 @@ public interface ItemExportService {
* who created it When requested for download this method can check if the
* person requesting it is the same one that created it
*
* @param context dspace context
* @param fileName the file name to check auths for
* @param context
* dspace context
* @param fileName
* the file name to check auths for
* @return true if it is the same person false otherwise
*/
public boolean canDownload(Context context, String fileName);
@@ -215,7 +234,8 @@ public interface ItemExportService {
* uses the config file entry 'org.dspace.app.itemexport.life.span.hours' to
* determine if the current exports are too old and need purging
*
* @param eperson - the eperson to clean up
* @param eperson
* - the eperson to clean up
* @throws Exception if error
*/
public void deleteOldExportArchives(EPerson eperson) throws Exception;
@@ -236,9 +256,12 @@ public interface ItemExportService {
* communication with email instead. Send a success email once the export
* archive is complete and ready for download
*
* @param context - the current Context
* @param eperson - eperson to send the email to
* @param fileName - the file name to be downloaded. It is added to the url in
* @param context
* - the current Context
* @param eperson
* - eperson to send the email to
* @param fileName
* - the file name to be downloaded. It is added to the url in
* the email
* @throws MessagingException if error
*/
@@ -251,8 +274,10 @@ public interface ItemExportService {
* communication with email instead. Send an error email if the export
* archive fails
*
* @param eperson - EPerson to send the error message to
* @param error - the error message
* @param eperson
* - EPerson to send the error message to
* @param error
* - the error message
* @throws MessagingException if error
*/
public void emailErrorMessage(EPerson eperson, String error)
@@ -260,7 +285,6 @@ public interface ItemExportService {
/**
* Zip source to target
*
* @param strSource source file
* @param target target file
* @throws Exception if error


@@ -7,20 +7,23 @@
*/
package org.dspace.app.itemimport;
import gr.ekt.bte.core.DataLoader;
import gr.ekt.bte.core.TransformationEngine;
import gr.ekt.bte.dataloader.FileDataLoader;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import gr.ekt.bte.core.DataLoader;
import gr.ekt.bte.core.TransformationEngine;
import gr.ekt.bte.dataloader.FileDataLoader;
/**
* This class acts as a Service in the procedure to batch import using the Biblio-Transformation-Engine
*/
public class BTEBatchImportService {
public class BTEBatchImportService
{
TransformationEngine transformationEngine;
Map<String, DataLoader> dataLoaders = new HashMap<String, DataLoader>();
@@ -29,31 +32,31 @@ public class BTEBatchImportService {
/**
* Default constructor
*/
public BTEBatchImportService() {
public BTEBatchImportService()
{
super();
}
/**
* Setter method for dataLoaders parameter
*
* @param dataLoaders map of data loaders
*/
public void setDataLoaders(Map<String, DataLoader> dataLoaders) {
public void setDataLoaders(Map<String, DataLoader> dataLoaders)
{
this.dataLoaders = dataLoaders;
}
/**
* Get data loaders
*
* @return the map of DataLoaders
*/
public Map<String, DataLoader> getDataLoaders() {
public Map<String, DataLoader> getDataLoaders()
{
return dataLoaders;
}
/**
* Get output map
*
* @return the outputMapping
*/
public Map<String, String> getOutputMap() {
@@ -62,7 +65,6 @@ public class BTEBatchImportService {
/**
* Setter method for the outputMapping
*
* @param outputMap the output mapping
*/
public void setOutputMap(Map<String, String> outputMap) {
@@ -71,7 +73,6 @@ public class BTEBatchImportService {
/**
* Get transformation engine
*
* @return transformation engine
*/
public TransformationEngine getTransformationEngine() {
@@ -80,7 +81,6 @@ public class BTEBatchImportService {
/**
* set transformation engine
*
* @param transformationEngine transformation engine
*/
public void setTransformationEngine(TransformationEngine transformationEngine) {
@@ -89,7 +89,6 @@ public class BTEBatchImportService {
/**
* Getter of file data loaders
*
* @return List of file data loaders
*/
public List<String> getFileDataLoaders(){


@@ -20,6 +20,7 @@ import java.util.List;
/**
* @author kstamatis
*
*/
public class BatchUpload {
@@ -34,7 +35,6 @@ public class BatchUpload {
/**
* Initialize with directory
*
* @param dirPath directory path
*/
public BatchUpload(String dirPath) {
@@ -45,7 +45,6 @@ public class BatchUpload {
/**
* Initialize with directory
*
* @param dir directory path
*/
public BatchUpload(File dir) {
@@ -56,7 +55,6 @@ public class BatchUpload {
/**
* Initialize with directory
*
* @param dir directory path
*/
private void initializeWithFile(File dir){
@@ -98,7 +96,6 @@ public class BatchUpload {
/**
* Count lines in file
*
* @param filename file name
* @return lines in file
* @throws IOException if IO error
@@ -109,12 +106,11 @@ public class BatchUpload {
String lineRead = "";
while ((lineRead = reader.readLine()) != null) {
String[] parts = lineRead.split(" ");
if (parts.length > 1) {
if (parts.length > 1)
handlesImported.add(parts[1].trim());
} else {
else
handlesImported.add(lineRead);
}
}
cnt = reader.getLineNumber();
reader.close();
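The mapfile format this loop reads — one `itemDirectory handle` pair per line, with a bare line kept as-is — can be sketched as a standalone parser. This is a hypothetical illustration class, not part of the DSpace API:

```java
import java.util.ArrayList;
import java.util.List;

public class MapfileParser {
    // Mirrors the BatchUpload loop: each mapfile line is "itemDir handle";
    // keep the handle when present, otherwise keep the whole line.
    public static List<String> parseHandles(String mapfileContents) {
        List<String> handles = new ArrayList<>();
        for (String line : mapfileContents.split("\n")) {
            String[] parts = line.split(" ");
            if (parts.length > 1) {
                handles.add(parts[1].trim());
            } else {
                handles.add(line);
            }
        }
        return handles;
    }
}
```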
@@ -123,7 +119,6 @@ public class BatchUpload {
/**
* Read a file
*
* @param filename file name
* @throws IOException if IO error
*/
@@ -135,9 +130,11 @@ public class BatchUpload {
if (lineRead.startsWith("\tat ")) {
this.errorMsgHTML += "<span class=\"batchimport-error-tab\">" + lineRead + "</span><br/>";
} else if (lineRead.startsWith("Caused by")) {
this.errorMsgHTML += "<span class=\"batchimport-error-caused\">" + lineRead + "</span><br/>";
} else {
this.errorMsgHTML += lineRead + "<br/>";
}
}
@@ -146,7 +143,6 @@ public class BatchUpload {
/**
* Get date
*
* @return Date
*/
public Date getDate() {
@@ -155,7 +151,6 @@ public class BatchUpload {
/**
* Get path to directory
*
* @return directory
*/
public File getDir() {
@@ -164,7 +159,6 @@ public class BatchUpload {
/**
* Whether successful
*
* @return true or false
*/
public boolean isSuccessful() {
@@ -173,7 +167,6 @@ public class BatchUpload {
/**
* Get items imported
*
* @return number of items
*/
public int getItemsImported() {
@@ -182,7 +175,6 @@ public class BatchUpload {
/**
* Get total items
*
* @return total
*/
public int getTotalItems() {
@@ -191,7 +183,6 @@ public class BatchUpload {
/**
* Get formatted date (DD/MM/YY)
*
* @return date as string
*/
public String getDateFormatted(){
@@ -202,7 +193,6 @@ public class BatchUpload {
/**
* Get handles of imported files
*
* @return list of handles
*/
public List<String> getHandlesImported() {
@@ -211,7 +201,6 @@ public class BatchUpload {
/**
* Get error message
*
* @return error message
*/
public String getErrorMsg() {
@@ -220,7 +209,6 @@ public class BatchUpload {
/**
* Get error message as HTML
*
* @return error message string as HTML
*/
public String getErrorMsgHTML() {


@@ -7,17 +7,7 @@
*/
package org.dspace.app.itemimport;
import java.io.File;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.UUID;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.dspace.app.itemimport.factory.ItemImportServiceFactory;
import org.dspace.app.itemimport.service.ItemImportService;
import org.dspace.content.Collection;
@@ -31,6 +21,12 @@ import org.dspace.eperson.service.EPersonService;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;
/**
* Import items into DSpace. The conventional use is to upload files by copying
* them. DSpace writes the item's bitstreams into its assetstore. Metadata is
@@ -51,17 +47,12 @@ public class ItemImportCLITool {
private static boolean template = false;
private static final CollectionService collectionService = ContentServiceFactory.getInstance()
.getCollectionService();
private static final EPersonService epersonService = EPersonServiceFactory.getInstance().getEPersonService();
private static final HandleService handleService = HandleServiceFactory.getInstance().getHandleService();
/**
* Default constructor
*/
private ItemImportCLITool() { }
public static void main(String[] argv) throws Exception {
Date startTime = new Date();
int status = 0;
@@ -117,17 +108,13 @@ public class ItemImportCLITool {
System.out
.println("\nadding items: ItemImport -a -e eperson -c collection -s sourcedir -m mapfile");
System.out
.println(
"\nadding items from zip file: ItemImport -a -e eperson -c collection -s sourcedir -z " +
"filename.zip -m mapfile");
System.out
.println("replacing items: ItemImport -r -e eperson -c collection -s sourcedir -m mapfile");
System.out
.println("deleting items: ItemImport -d -e eperson -m mapfile");
System.out
.println(
"If multiple collections are specified, the first collection will be the one that owns the " +
"item.");
System.exit(0);
}
@@ -168,19 +155,23 @@ public class ItemImportCLITool {
template = true;
}
if (line.hasOption('s')) { // source
sourcedir = line.getOptionValue('s');
}
if (line.hasOption('m')) { // mapfile
mapfile = line.getOptionValue('m');
}
if (line.hasOption('e')) { // eperson
eperson = line.getOptionValue('e');
}
if (line.hasOption('c')) { // collections
collections = line.getOptionValues('c');
}
@@ -236,8 +227,7 @@ public class ItemImportCLITool {
commandLineCollections = false;
}
} else if ("add-bte".equals(command)) {
//Source dir can be null, the user can specify the parameters for his loader in the Spring XML
// configuration file
if (mapfile == null) {
System.out
@@ -260,9 +250,7 @@ public class ItemImportCLITool {
if (bteInputType == null) {
System.out
.println(
"Error - an input type (tsv, csv, ris, endnote, bibtex or any other type you have " +
"specified in BTE Spring XML configuration file) must be specified");
System.out.println(" (run with -h flag for details)");
System.exit(1);
}
@@ -349,8 +337,10 @@ public class ItemImportCLITool {
|| (mycollections.get(i).getType() != Constants.COLLECTION)) {
mycollections.set(i, null);
}
} else if (collections[i] != null) {
// not a handle, try and treat it as an integer collection database ID
mycollections.set(i, collectionService.find(c, UUID.fromString(collections[i])));
}
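The collection arguments resolved above are accepted either as handles (`prefix/suffix`) or as UUID strings. That distinction can be sketched in isolation — a hypothetical helper, not DSpace code (note that although the original comment still says "integer collection database ID", the actual call is `UUID.fromString`):

```java
import java.util.UUID;

public class CollectionRef {
    // Classify a CLI collection argument the way the loop above does:
    // anything containing '/' is treated as a handle, otherwise it must
    // parse as a UUID.
    public static String kindOf(String arg) {
        if (arg.indexOf('/') != -1) {
            return "handle";
        }
        try {
            UUID.fromString(arg);
            return "uuid";
        } catch (IllegalArgumentException e) {
            return "unknown";
        }
    }
}
```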
@@ -404,13 +394,11 @@ public class ItemImportCLITool {
try {
if (zip) {
System.gc();
System.out.println(
"Deleting temporary zip directory: " + myloader.getTempWorkDirFile().getAbsolutePath());
myloader.cleanupZipTemp();
}
} catch (Exception ex) {
System.out.println("Unable to delete temporary zip archive location: " + myloader.getTempWorkDirFile()
.getAbsolutePath());
}
@@ -421,9 +409,7 @@ public class ItemImportCLITool {
Date endTime = new Date();
System.out.println("Started: " + startTime.getTime());
System.out.println("Ended: " + endTime.getTime());
System.out.println(
"Elapsed time: " + ((endTime.getTime() - startTime.getTime()) / 1000) + " secs ("
+ (endTime.getTime() - startTime.getTime()) + " msecs)");
}
System.exit(status);
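The timing output above is simple millisecond arithmetic on the two `Date` objects; pulled out as a pure function it looks like this (hypothetical helper class, for illustration only):

```java
import java.util.Date;

public class Elapsed {
    // Same arithmetic as the CLI's final timing line: whole seconds,
    // then the raw millisecond count in parentheses.
    public static String format(Date start, Date end) {
        long ms = end.getTime() - start.getTime();
        return "Elapsed time: " + (ms / 1000) + " secs (" + ms + " msecs)";
    }
}
```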


@@ -11,8 +11,7 @@ import org.dspace.app.itemimport.service.ItemImportService;
import org.dspace.services.factory.DSpaceServicesFactory;
/**
* Abstract factory to get services for the itemimport package, use ItemImportService.getInstance() to retrieve an
* implementation
*
* @author kevinvandevelde at atmire.com
*/
@@ -21,7 +20,6 @@ public abstract class ItemImportServiceFactory {
public abstract ItemImportService getItemImportService();
public static ItemImportServiceFactory getInstance() {
return DSpaceServicesFactory.getInstance().getServiceManager()
.getServiceByName("itemImportServiceFactory", ItemImportServiceFactory.class);
}
}
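The factory above delegates to a named-service lookup. A minimal sketch of that service-locator pattern, as a stand-in for DSpace's real `ServiceManager` (class and method names here are illustrative, not the DSpace API):

```java
import java.util.HashMap;
import java.util.Map;

// Minimal named-service registry: services are stored under a string name
// and retrieved with a type token, as the factory's getServiceByName call does.
public class NamedServiceRegistry {
    private final Map<String, Object> services = new HashMap<>();

    public void register(String name, Object service) {
        services.put(name, service);
    }

    public <T> T getServiceByName(String name, Class<T> type) {
        // Class.cast gives a checked cast instead of an unchecked warning.
        return type.cast(services.get(name));
    }
}
```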


@@ -11,8 +11,7 @@ import org.dspace.app.itemimport.service.ItemImportService;
import org.springframework.beans.factory.annotation.Autowired;
/**
* Factory implementation to get services for the itemimport package, use ItemImportService.getInstance() to retrieve
* an implementation
*
* @author kevinvandevelde at atmire.com
*/


@@ -7,16 +7,16 @@
*/
package org.dspace.app.itemimport.service;
import java.io.File;
import java.io.IOException;
import java.util.List;
import javax.mail.MessagingException;
import org.dspace.app.itemimport.BatchUpload;
import org.dspace.content.Collection;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;
/**
* Import items into DSpace. The conventional use is to upload files by copying
* them. DSpace writes the item's bitstreams into its assetstore. Metadata is
@@ -37,6 +37,7 @@ public interface ItemImportService {
/**
*
* @param c DSpace Context
* @param mycollections List of Collections
* @param sourceDir source location
@@ -44,12 +45,10 @@ public interface ItemImportService {
* @param template whether to use template item
* @throws Exception if error
*/
public void addItemsAtomic(Context c, List<Collection> mycollections, String sourceDir, String mapFile,
boolean template) throws Exception;
/**
* Add items
*
* @param c DSpace Context
* @param mycollections List of Collections
* @param sourceDir source location
@@ -62,7 +61,6 @@ public interface ItemImportService {
/**
* Unzip a file
*
* @param zipfile file
* @return unzip location
* @throws IOException if error
@@ -71,7 +69,6 @@ public interface ItemImportService {
/**
* Unzip a file to a destination
*
* @param zipfile file
* @param destDir destination directory
* @return unzip location
@@ -81,7 +78,6 @@ public interface ItemImportService {
/**
* Unzip a file in a specific source directory
*
* @param sourcedir source directory
* @param zipfilename file name
* @return unzip location
@@ -90,8 +86,8 @@ public interface ItemImportService {
public String unzip(String sourcedir, String zipfilename) throws IOException;
/**
* Given a public URL to a zip file that has the Simple Archive Format, this method imports the contents to DSpace
*
* @param url The public URL of the zip file
* @param owningCollection The owning collection the items will belong to
* @param collections The collections the created items will be inserted to, apart from the owning one
@@ -101,8 +97,7 @@ public interface ItemImportService {
* @param template whether to use template item
* @throws Exception if error
*/
public void processUIImport(String url, Collection owningCollection, String[] collections, String resumeDir,
String inputType, Context context, boolean template) throws Exception;
/**
* Since the BTE batch import is done in a new thread we are unable to communicate
@@ -110,9 +105,12 @@ public interface ItemImportService {
* communication with email instead. Send a success email once the batch
* import is complete
*
* @param context - the current Context
* @param eperson - eperson to send the email to
* @param fileName - the filepath to the mapfile created by the batch import
* @throws MessagingException if error
*/
public void emailSuccessMessage(Context context, EPerson eperson,
@@ -124,8 +122,10 @@ public interface ItemImportService {
* communication with email instead. Send an error email if the batch
* import fails
*
* @param eperson - EPerson to send the error message to
* @param error - the error message
* @throws MessagingException if error
*/
public void emailErrorMessage(EPerson eperson, String error)
@@ -134,7 +134,6 @@ public interface ItemImportService {
/**
* Get imports available for a person
*
* @param eperson EPerson object
* @return List of batch uploads
* @throws Exception if error
@@ -144,7 +143,6 @@ public interface ItemImportService {
/**
* Get import upload directory
*
* @param ePerson EPerson object
* @return directory
* @throws Exception if error
@@ -154,7 +152,6 @@ public interface ItemImportService {
/**
* Delete a batch by ID
*
* @param c DSpace Context
* @param uploadId identifier
* @throws Exception if error
@@ -163,7 +160,6 @@ public interface ItemImportService {
/**
* Replace items
*
* @param c DSpace Context
* @param mycollections List of Collections
* @param sourcedir source directory
@@ -171,12 +167,10 @@ public interface ItemImportService {
* @param template whether to use template item
* @throws Exception if error
*/
public void replaceItems(Context c, List<Collection> mycollections, String sourcedir, String mapfile,
boolean template) throws Exception;
/**
* Delete items via mapfile
*
* @param c DSpace Context
* @param mapfile map file
* @throws Exception if error
@@ -185,7 +179,6 @@ public interface ItemImportService {
/**
* Add items
*
* @param c DSpace Context
* @param mycollections List of Collections
* @param sourcedir source directory
@@ -195,23 +188,19 @@ public interface ItemImportService {
* @param workingDir working directory
* @throws Exception if error
*/
public void addBTEItems(Context c, List<Collection> mycollections, String sourcedir, String mapfile,
boolean template, String bteInputType, String workingDir) throws Exception;
/**
* Get temporary work directory
*
* @return directory as string
*/
public String getTempWorkDir();
/**
* Get temporary work directory (as File)
*
* @return directory as File
* @throws java.io.IOException if the directory cannot be created.
*/
public File getTempWorkDirFile() throws IOException;
/**
* Cleanup
@@ -220,21 +209,18 @@ public interface ItemImportService {
/**
* Set test flag
*
* @param isTest true or false
*/
public void setTest(boolean isTest);
/**
* Set resume flag
*
* @param isResume true or false
*/
public void setResume(boolean isResume);
/**
* Set use workflow
*
* @param useWorkflow whether to enable workflow
*/
public void setUseWorkflow(boolean useWorkflow);
@@ -246,7 +232,6 @@ public interface ItemImportService {
/**
* Set quiet flag
*
* @param isQuiet true or false
*/
public void setQuiet(boolean isQuiet);


@@ -25,6 +25,7 @@ import org.springframework.beans.factory.annotation.Autowired;
* based on the existence of bitstreams within the ORIGINAL bundle.
*
* @author Kostas Stamatis
*
*/
public class ItemMarkingAvailabilityBitstreamStrategy implements ItemMarkingExtractor {
@@ -48,14 +49,16 @@ public class ItemMarkingAvailabilityBitstreamStrategy implements ItemMarkingExtr
markInfo.setImageName(nonAvailableImageName);
return markInfo;
} else {
Bundle originalBundle = bundles.iterator().next();
if (originalBundle.getBitstreams().size() == 0) {
ItemMarkingInfo markInfo = new ItemMarkingInfo();
markInfo.setImageName(nonAvailableImageName);
return markInfo;
} else {
Bitstream bitstream = originalBundle.getBitstreams().get(0);
ItemMarkingInfo signInfo = new ItemMarkingInfo();
@@ -63,6 +66,7 @@ public class ItemMarkingAvailabilityBitstreamStrategy implements ItemMarkingExtr
signInfo.setTooltip(bitstream.getName());
String bsLink = "";
bsLink = bsLink + "bitstream/"


@@ -20,6 +20,7 @@ import org.dspace.core.Context;
* based on the collection the items belong to
*
* @author Kostas Stamatis
*
*/
public class ItemMarkingCollectionStrategy implements ItemMarkingExtractor {


@@ -16,6 +16,7 @@ import org.dspace.core.Context;
* Interface to abstract the strategy for item signing
*
* @author Kostas Stamatis
*
*/
public interface ItemMarkingExtractor {
public ItemMarkingInfo getItemMarkingInfo(Context context, Item item)


@@ -11,6 +11,7 @@ package org.dspace.app.itemmarking;
* Simple DTO to transfer data about the marking info for an item
*
* @author Kostas Stamatis
*
*/
public class ItemMarkingInfo {
private String imageName;


@@ -24,6 +24,7 @@ import org.springframework.beans.factory.annotation.Autowired;
* metadata field
*
* @author Kostas Stamatis
*
*/
public class ItemMarkingMetadataStrategy implements ItemMarkingExtractor {
@@ -40,9 +41,11 @@ public class ItemMarkingMetadataStrategy implements ItemMarkingExtractor {
public ItemMarkingInfo getItemMarkingInfo(Context context, Item item)
throws SQLException {
if (metadataField != null && mapping != null) {
List<MetadataValue> vals = itemService.getMetadataByMetadataString(item, metadataField);
if (vals.size() > 0) {
for (MetadataValue value : vals) {
String type = value.getValue();
if (mapping.containsKey(type)){


@@ -16,6 +16,8 @@ import java.util.Map;
* Order of actions is very important for correct processing. This implementation
* supports an iterator that returns the actions in the order in which they are
* put in. Adding the same action a second time has no effect on this order.
*
*
*/
public class ActionManager implements Iterable<UpdateAction> {
@@ -24,17 +26,18 @@ public class ActionManager implements Iterable<UpdateAction> {
/**
* Get update action
*
* @param actionClass UpdateAction class
* @return instantiation of UpdateAction class
* @throws InstantiationException if instantiation error
* @throws IllegalAccessException if illegal access error
*/
public UpdateAction getUpdateAction(Class<? extends UpdateAction> actionClass)
throws InstantiationException, IllegalAccessException {
UpdateAction action = registry.get(actionClass);
if (action == null) {
action = actionClass.newInstance();
registry.put(actionClass, action);
}
@@ -43,9 +46,11 @@ public class ActionManager implements Iterable<UpdateAction> {
}
/**
*
* @return whether any actions have been registered with this manager
*/
public boolean hasActions() {
return !registry.isEmpty();
}
@@ -56,23 +61,28 @@ public class ActionManager implements Iterable<UpdateAction> {
* @return iterator for UpdateActions
*/
@Override
public Iterator<UpdateAction> iterator() {
return new Iterator<UpdateAction>() {
private Iterator<Class<? extends UpdateAction>> itr = registry.keySet().iterator();
@Override
public boolean hasNext() {
return itr.hasNext();
}
@Override
public UpdateAction next() {
return registry.get(itr.next());
}
//not supported
@Override
public void remove() {
throw new UnsupportedOperationException();
}
};


@@ -19,11 +19,7 @@ import java.util.List;
import org.dspace.authorize.AuthorizeException;
import org.dspace.authorize.factory.AuthorizeServiceFactory;
import org.dspace.authorize.service.AuthorizeService;
import org.dspace.content.Bitstream;
import org.dspace.content.BitstreamFormat;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.BitstreamFormatService;
import org.dspace.content.service.InstallItemService;
@@ -34,16 +30,18 @@ import org.dspace.eperson.service.GroupService;
/**
* Action to add bitstreams listed in item contents file to the item in DSpace
*
*
*/
public class AddBitstreamsAction extends UpdateBitstreamsAction {
protected AuthorizeService authorizeService = AuthorizeServiceFactory.getInstance().getAuthorizeService();
protected BitstreamFormatService bitstreamFormatService = ContentServiceFactory.getInstance()
.getBitstreamFormatService();
protected GroupService groupService = EPersonServiceFactory.getInstance().getGroupService();
protected InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService();
public AddBitstreamsAction() {
//empty
}
@@ -63,13 +61,15 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
@Override
public void execute(Context context, ItemArchive itarch, boolean isTest,
boolean suppressUndo) throws IllegalArgumentException,
ParseException, IOException, AuthorizeException, SQLException {
Item item = itarch.getItem();
File dir = itarch.getDirectory();
List<ContentsEntry> contents = MetadataUtilities.readContentsFile(new File(dir, ItemUpdate.CONTENTS_FILE));
if (contents.isEmpty()) {
ItemUpdate.pr("Contents is empty - no bitstreams to add");
return;
}
@@ -78,30 +78,36 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
String[] files = dir.list(ItemUpdate.fileFilter);
List<String> fileList = new ArrayList<String>();
for (String filename : files) {
fileList.add(filename);
ItemUpdate.pr("file: " + filename);
}
for (ContentsEntry ce : contents) {
//validate match to existing file in archive
if (!fileList.contains(ce.filename)) {
throw new IllegalArgumentException("File listed in contents is missing: " + ce.filename);
}
}
int bitstream_bundles_updated = 0;
//now okay to add
for (ContentsEntry ce : contents) {
String targetBundleName = addBitstream(context, itarch, item, dir, ce, suppressUndo, isTest);
if (!targetBundleName.equals("")
&& !targetBundleName.equals("THUMBNAIL")
&& !targetBundleName.equals("TEXT")) {
bitstream_bundles_updated++;
}
}
if (alterProvenance && bitstream_bundles_updated > 0) {
DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");
String append = ". Added " + Integer.toString(bitstream_bundles_updated)
@@ -113,7 +119,6 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
/**
* Add bitstream
*
* @param context DSpace Context
* @param itarch Item Archive
* @param item DSpace Item
@@ -130,7 +135,8 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
*/
protected String addBitstream(Context context, ItemArchive itarch, Item item, File dir,
ContentsEntry ce, boolean suppressUndo, boolean isTest)
throws IOException, IllegalArgumentException, SQLException, AuthorizeException, ParseException {
ItemUpdate.pr("contents entry for bitstream: " + ce.toString());
File f = new File(dir, ce.filename);
@@ -140,29 +146,40 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
Bitstream bs = null;
String newBundleName = ce.bundlename;
if (ce.bundlename == null) { // should be required but default convention established
if (ce.filename.equals("license.txt")) {
newBundleName = "LICENSE";
} else {
newBundleName = "ORIGINAL";
}
}
ItemUpdate.pr(" Bitstream " + ce.filename + " to be added to bundle: " + newBundleName);
if (!isTest) {
// find the bundle
List<Bundle> bundles = itemService.getBundles(item, newBundleName);
Bundle targetBundle = null;
if (bundles.size() < 1) {
// not found, create a new one
targetBundle = bundleService.create(context, item, newBundleName);
} else {
//verify bundle + name are not duplicates
for (Bundle b : bundles) {
List<Bitstream> bitstreams = b.getBitstreams();
for (Bitstream bsm : bitstreams) {
if (bsm.getName().equals(ce.filename)) {
throw new IllegalArgumentException("Duplicate bundle + filename cannot be added: "
+ b.getName() + " + " + bsm.getName());
}
@@ -181,14 +198,17 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
BitstreamFormat fmt = bitstreamFormatService.guessFormat(context, bs);
bitstreamService.setFormat(context, bs, fmt);
if (ce.description != null) {
bs.setDescription(context, ce.description);
}
if ((ce.permissionsActionId != -1) && (ce.permissionsGroupName != null)) {
Group group = groupService.findByName(context, ce.permissionsGroupName);
if (group != null) {
authorizeService.removeAllPolicies(context, bs); // remove the default policy
authorizeService.createResourcePolicy(context, bs, group, null, ce.permissionsActionId, null);
}
@@ -197,7 +217,8 @@ public class AddBitstreamsAction extends UpdateBitstreamsAction {
//update after all changes are applied
bitstreamService.update(context, bs);
if (!suppressUndo) {
itarch.addUndoDeleteContents(bs.getID());
}
return targetBundle.getName();
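The default bundle convention applied in `addBitstream` above — no bundle named means `license.txt` goes to LICENSE and everything else to ORIGINAL — can be isolated as a pure function (hypothetical helper, extracted for illustration):

```java
public class BundleNames {
    // Default bundle convention from AddBitstreamsAction: an explicit
    // bundle name wins; otherwise license files go to LICENSE and all
    // other files to ORIGINAL.
    public static String defaultBundle(String bundlename, String filename) {
        if (bundlename != null) {
            return bundlename;
        }
        return "license.txt".equals(filename) ? "LICENSE" : "ORIGINAL";
    }
}
```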


@@ -11,10 +11,11 @@ import java.sql.SQLException;
import java.util.List;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.MetadataFieldService;
import org.dspace.content.service.MetadataSchemaService;
@@ -22,11 +23,11 @@ import org.dspace.core.Context;
/**
* Action to add metadata to item
*
*/
public class AddMetadataAction extends UpdateMetadataAction {
protected MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance()
.getMetadataSchemaService();
protected MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService();
/**
@@ -41,68 +42,79 @@ public class AddMetadataAction extends UpdateMetadataAction {
*/
@Override
public void execute(Context context, ItemArchive itarch, boolean isTest,
boolean suppressUndo) throws AuthorizeException, SQLException {
Item item = itarch.getItem();
String dirname = itarch.getDirectoryName();
for (DtoMetadata dtom : itarch.getMetadataFields()) {
for (String f : targetFields) {
if (dtom.matches(f, false)) {
// match against metadata for this field/value in repository
// qualifier must be strictly matched, possibly null
List<MetadataValue> ardcv = null;
ardcv = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
boolean found = false;
for (MetadataValue dcv : ardcv) {
if (dcv.getValue().equals(dtom.value)) {
found = true;
break;
}
}
if (found) {
ItemUpdate.pr("Warning: No new metadata found to add to item " + dirname
+ " for element " + f);
} else {
if (isTest) {
ItemUpdate.pr("Metadata to add: " + dtom.toString());
//validity tests that would occur in actual processing
// If we're just testing the import, let's check that the actual metadata field exists.
MetadataSchema foundSchema = metadataSchemaService.find(context, dtom.schema);
if (foundSchema == null) {
ItemUpdate.pr("ERROR: schema '"
+ dtom.schema + "' was not found in the registry; found on item " + dirname);
} else {
MetadataField foundField = metadataFieldService
.findByElement(context, foundSchema, dtom.element, dtom.qualifier);
if (foundField == null) {
ItemUpdate.pr("ERROR: Metadata field: '" + dtom.schema + "." + dtom.element + "."
+ dtom.qualifier + "' not found in registry; found on item " + dirname);
}
}
} else {
itemService.addMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language,
dtom.value);
ItemUpdate.pr("Metadata added: " + dtom.toString());
if (!suppressUndo) {
//itarch.addUndoDtom(dtom);
//ItemUpdate.pr("Undo metadata: " + dtom);
// add all as a replace record to be preceded by delete
for (MetadataValue dcval : ardcv) {
MetadataField metadataField = dcval.getMetadataField();
MetadataSchema metadataSchema = metadataField.getMetadataSchema();
itarch.addUndoMetadataField(DtoMetadata.create(metadataSchema.getName(), metadataField.getElement(),
metadataField.getQualifier(), dcval.getLanguage(), dcval.getValue()));
}
}


@@ -7,17 +7,17 @@
*/
package org.dspace.app.itemupdate;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import org.dspace.content.Bitstream;
/**
* Filter interface to be used by ItemUpdate
* to determine which bitstreams in an Item
are acceptable for removal.
*
*/
public abstract class BitstreamFilter {
@@ -33,20 +33,26 @@ public abstract class BitstreamFilter {
public abstract boolean accept(Bitstream bitstream) throws BitstreamFilterException;
/**
*
* @param filepath - The complete path for the properties file
* @throws IOException if IO error
*/
public void initProperties(String filepath)
throws IOException {
props = new Properties();
InputStream in = null;
try {
in = new FileInputStream(filepath);
props.load(in);
} finally {
if (in != null) {
in.close();
}
}


@@ -15,12 +15,14 @@ import org.dspace.content.Bundle;
/**
* BitstreamFilter implementation to filter by bundle name
*
*/
public class BitstreamFilterByBundleName extends BitstreamFilter {
protected String bundleName;
public BitstreamFilterByBundleName() {
//empty
}
@@ -28,27 +30,36 @@ public class BitstreamFilterByBundleName extends BitstreamFilter {
* Filter bitstream based on bundle name found in properties file
*
* @param bitstream Bitstream
* @return whether bitstream is in bundle
* @throws BitstreamFilterException if filter error
* @return whether bitstream is in bundle
*
*/
@Override
public boolean accept(Bitstream bitstream)
throws BitstreamFilterException {
if (bundleName == null) {
throws BitstreamFilterException
{
if (bundleName == null)
{
bundleName = props.getProperty("bundle");
if (bundleName == null) {
if (bundleName == null)
{
throw new BitstreamFilterException("Property 'bundle' not found.");
}
}
try {
try
{
List<Bundle> bundles = bitstream.getBundles();
for (Bundle b : bundles) {
if (b.getName().equals(bundleName)) {
for (Bundle b : bundles)
{
if (b.getName().equals(bundleName))
{
return true;
}
}
} catch (SQLException e) {
}
catch(SQLException e)
{
throw new BitstreamFilterException(e);
}
return false;


@@ -7,20 +7,21 @@
*/
package org.dspace.app.itemupdate;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.regex.*;
import org.dspace.content.Bitstream;
/**
* BitstreamFilter implementation to filter by filename pattern
*
*/
public class BitstreamFilterByFilename extends BitstreamFilter {
protected Pattern pattern;
protected String filenameRegex;
public BitstreamFilterByFilename() {
public BitstreamFilterByFilename()
{
//empty
}
@@ -30,13 +31,16 @@ public class BitstreamFilterByFilename extends BitstreamFilter {
*
* @param bitstream Bitstream
* @return whether bitstream name matches the regular expression
* @throws BitstreamFilterException if filter error
* @exception BitstreamFilterException if filter error
*/
@Override
public boolean accept(Bitstream bitstream) throws BitstreamFilterException {
if (filenameRegex == null) {
public boolean accept(Bitstream bitstream) throws BitstreamFilterException
{
if (filenameRegex == null)
{
filenameRegex = props.getProperty("filename");
if (filenameRegex == null) {
if (filenameRegex == null)
{
throw new BitstreamFilterException("BitstreamFilter property 'filename' not found.");
}
pattern = Pattern.compile(filenameRegex);


@@ -9,25 +9,28 @@ package org.dspace.app.itemupdate;
/**
* Exception class for BitstreamFilters
*
*/
public class BitstreamFilterException extends Exception {
public class BitstreamFilterException extends Exception
{
private static final long serialVersionUID = 1L;
public BitstreamFilterException() {
}
public BitstreamFilterException() {}
/**
*
* @param msg exception message
*/
public BitstreamFilterException(String msg) {
public BitstreamFilterException(String msg)
{
super(msg);
}
/**
*
* @param e exception
*/
public BitstreamFilterException(Exception e) {
public BitstreamFilterException(Exception e)
{
super(e);
}


@@ -8,8 +8,7 @@
package org.dspace.app.itemupdate;
import java.text.ParseException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.regex.*;
import org.dspace.core.Constants;
@@ -27,8 +26,11 @@ import org.dspace.core.Constants;
* permissions: -[r|w] ['group name']
* description: <the description of the file>
* }
*
*
*/
public class ContentsEntry {
public class ContentsEntry
{
public static final String HDR_BUNDLE = "bundle:";
public static final String HDR_PERMISSIONS = "permissions:";
public static final String HDR_DESCRIPTION = "description:";
@@ -45,7 +47,8 @@ public class ContentsEntry {
String bundlename,
int permissionsActionId,
String permissionsGroupName,
String description) {
String description)
{
this.filename = filename;
this.bundlename = bundlename;
this.permissionsActionId = permissionsActionId;
@@ -61,7 +64,8 @@ public class ContentsEntry {
* @throws ParseException if parse error
*/
public static ContentsEntry parse(String line)
throws ParseException {
throws ParseException
{
String[] ar = line.split("\t");
ItemUpdate.pr("ce line split: " + ar.length);
@@ -71,33 +75,46 @@ public class ContentsEntry {
String groupName = null;
int actionId = -1;
if (ar.length > 1) {
for (int i = 1; i < ar.length; i++) {
if (ar.length > 1)
{
for (int i=1; i < ar.length; i++)
{
ItemUpdate.pr("ce " + i + " : " + ar[i]);
if (ar[i].startsWith(HDR_BUNDLE)) {
if (ar[i].startsWith(HDR_BUNDLE))
{
arp[1] = ar[i].substring(HDR_BUNDLE.length()).trim();
} else if (ar[i].startsWith(HDR_PERMISSIONS)) {
}
else if (ar[i].startsWith(HDR_PERMISSIONS))
{
arp[2] = ar[i].substring(HDR_PERMISSIONS.length()).trim();
// parse into actionId and group name
Matcher m = permissionsPattern.matcher(arp[2]);
if (m.matches()) {
if (m.matches())
{
String action = m.group(1); //
if (action.equals("r")) {
if (action.equals("r"))
{
actionId = Constants.READ;
} else if (action.equals("w")) {
}
else if (action.equals("w"))
{
actionId = Constants.WRITE;
}
groupName = m.group(2).trim();
}
} else if (ar[i].startsWith(HDR_DESCRIPTION)) {
}
else if (ar[i].startsWith(HDR_DESCRIPTION))
{
arp[3] = ar[i].substring(HDR_DESCRIPTION.length()).trim();
} else {
}
else
{
throw new ParseException("Unknown text in contents file: " + ar[i], 0);
}
}
@@ -105,23 +122,30 @@ public class ContentsEntry {
return new ContentsEntry(arp[0], arp[1], actionId, groupName, arp[3]);
}
public String toString() {
public String toString()
{
StringBuilder sb = new StringBuilder(filename);
if (bundlename != null) {
if (bundlename != null)
{
sb.append(HDR_BUNDLE).append(" ").append(bundlename);
}
if (permissionsGroupName != null) {
if (permissionsGroupName != null)
{
sb.append(HDR_PERMISSIONS);
if (permissionsActionId == Constants.READ) {
if (permissionsActionId == Constants.READ)
{
sb.append(" -r ");
} else if (permissionsActionId == Constants.WRITE) {
}
else if (permissionsActionId == Constants.WRITE)
{
sb.append(" -w ");
}
sb.append(permissionsGroupName);
}
if (description != null) {
if (description != null)
{
sb.append(HDR_DESCRIPTION).append(" ").append(description);
}


@@ -14,10 +14,7 @@ import java.text.ParseException;
import java.util.List;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.content.*;
import org.dspace.core.Context;
/**
@@ -26,8 +23,10 @@ import org.dspace.core.Context;
* Undo not supported for this UpdateAction
*
* Derivatives of the bitstream to be deleted are not also deleted
*
*/
public class DeleteBitstreamsAction extends UpdateBitstreamsAction {
public class DeleteBitstreamsAction extends UpdateBitstreamsAction
{
/**
* Delete bitstream from item
*
@@ -44,46 +43,65 @@ public class DeleteBitstreamsAction extends UpdateBitstreamsAction {
@Override
public void execute(Context context, ItemArchive itarch, boolean isTest,
boolean suppressUndo) throws IllegalArgumentException, IOException,
SQLException, AuthorizeException, ParseException {
SQLException, AuthorizeException, ParseException
{
File f = new File(itarch.getDirectory(), ItemUpdate.DELETE_CONTENTS_FILE);
if (!f.exists()) {
if (!f.exists())
{
ItemUpdate.pr("Warning: Delete_contents file for item " + itarch.getDirectoryName() + " not found.");
} else {
}
else
{
List<String> list = MetadataUtilities.readDeleteContentsFile(f);
if (list.isEmpty()) {
if (list.isEmpty())
{
ItemUpdate.pr("Warning: empty delete_contents file for item " + itarch.getDirectoryName() );
} else {
for (String id : list) {
try {
}
else
{
for (String id : list)
{
try
{
Bitstream bs = bitstreamService.findByIdOrLegacyId(context, id);
if (bs == null) {
if (bs == null)
{
ItemUpdate.pr("Bitstream not found by id: " + id);
} else {
}
else
{
List<Bundle> bundles = bs.getBundles();
for (Bundle b : bundles) {
if (isTest) {
for (Bundle b : bundles)
{
if (isTest)
{
ItemUpdate.pr("Delete bitstream with id = " + id);
} else {
}
else
{
bundleService.removeBitstream(context, b, bs);
ItemUpdate.pr("Deleted bitstream with id = " + id);
}
}
if (alterProvenance) {
if (alterProvenance)
{
DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");
String append = "Bitstream " + bs.getName() + " deleted on " + DCDate
.getCurrent() + "; ";
String append = "Bitstream " + bs.getName() + " deleted on " + DCDate.getCurrent() + "; ";
Item item = bundles.iterator().next().getItems().iterator().next();
ItemUpdate.pr("Append provenance with: " + append);
if (!isTest) {
if (!isTest)
{
MetadataUtilities.appendMetadata(context, item, dtom, false, append);
}
}
}
} catch (SQLException e) {
}
catch(SQLException e)
{
ItemUpdate.pr("Error finding bitstream from id: " + id + " : " + e.toString());
}
}


@@ -14,10 +14,7 @@ import java.util.ArrayList;
import java.util.List;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.content.*;
import org.dspace.core.Context;
/**
@@ -28,6 +25,8 @@ import org.dspace.core.Context;
*
* Note: Multiple filters are impractical if trying to manage multiple properties files
* in a commandline environment
*
*
*/
public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {
@@ -38,16 +37,17 @@ public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {
*
* @param filter BitstreamFilter
*/
public void setBitstreamFilter(BitstreamFilter filter) {
public void setBitstreamFilter(BitstreamFilter filter)
{
this.filter = filter;
}
/**
* Get filter
*
* @return filter
*/
public BitstreamFilter getBitstreamFilter() {
public BitstreamFilter getBitstreamFilter()
{
return filter;
}
@@ -67,25 +67,33 @@ public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {
@Override
public void execute(Context context, ItemArchive itarch, boolean isTest,
boolean suppressUndo) throws AuthorizeException,
BitstreamFilterException, IOException, ParseException, SQLException {
BitstreamFilterException, IOException, ParseException, SQLException
{
List<String> deleted = new ArrayList<String>();
Item item = itarch.getItem();
List<Bundle> bundles = item.getBundles();
for (Bundle b : bundles) {
for (Bundle b : bundles)
{
List<Bitstream> bitstreams = b.getBitstreams();
String bundleName = b.getName();
for (Bitstream bs : bitstreams) {
if (filter.accept(bs)) {
if (isTest) {
for (Bitstream bs : bitstreams)
{
if (filter.accept(bs))
{
if (isTest)
{
ItemUpdate.pr("Delete from bundle " + bundleName + " bitstream " + bs.getName()
+ " with id = " + bs.getID());
} else {
}
else
{
//provenance is not maintained for derivative bitstreams
if (!bundleName.equals("THUMBNAIL") && !bundleName.equals("TEXT")) {
if (!bundleName.equals("THUMBNAIL") && !bundleName.equals("TEXT"))
{
deleted.add(bs.getName());
}
bundleService.removeBitstream(context, b, bs);
@@ -96,11 +104,13 @@ public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {
}
}
if (alterProvenance && !deleted.isEmpty()) {
if (alterProvenance && !deleted.isEmpty())
{
StringBuilder sb = new StringBuilder(" Bitstreams deleted on ");
sb.append(DCDate.getCurrent()).append(": ");
for (String s : deleted) {
for (String s : deleted)
{
sb.append(s).append(", ");
}
@@ -108,7 +118,8 @@ public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {
ItemUpdate.pr("Append provenance with: " + sb.toString());
if (!isTest) {
if (!isTest)
{
MetadataUtilities.appendMetadata(context, item, dtom, false, sb.toString());
}
}


@@ -12,14 +12,16 @@ import java.text.ParseException;
import java.util.List;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.Item;
import org.dspace.core.Context;
/**
* Action to delete metadata
*
*
*/
public class DeleteMetadataAction extends UpdateMetadataAction {
@@ -38,22 +40,26 @@ public class DeleteMetadataAction extends UpdateMetadataAction {
public void execute(Context context, ItemArchive itarch, boolean isTest,
boolean suppressUndo) throws AuthorizeException, ParseException, SQLException {
Item item = itarch.getItem();
for (String f : targetFields) {
for (String f : targetFields)
{
DtoMetadata dummy = DtoMetadata.create(f, Item.ANY, "");
List<MetadataValue> ardcv = itemService.getMetadataByMetadataString(item, f);
ItemUpdate.pr("Metadata to be deleted: ");
for (MetadataValue dcv : ardcv) {
for (MetadataValue dcv : ardcv)
{
ItemUpdate.pr(" " + MetadataUtilities.getDCValueString(dcv));
}
if (!isTest) {
if (!suppressUndo) {
for (MetadataValue dcv : ardcv) {
if (!isTest)
{
if (!suppressUndo)
{
for (MetadataValue dcv : ardcv)
{
MetadataField metadataField = dcv.getMetadataField();
MetadataSchema metadataSchema = metadataField.getMetadataSchema();
itarch.addUndoMetadataField(
DtoMetadata.create(metadataSchema.getName(), metadataField.getElement(),
itarch.addUndoMetadataField(DtoMetadata.create(metadataSchema.getName(), metadataField.getElement(),
metadataField.getQualifier(), dcv.getLanguage(), dcv.getValue()));
}
}


@@ -11,10 +11,12 @@ import java.util.Properties;
/**
* Bitstream filter to delete from TEXT bundle
*
*/
public class DerivativeTextBitstreamFilter extends BitstreamFilterByBundleName {
public DerivativeTextBitstreamFilter() {
public DerivativeTextBitstreamFilter()
{
props = new Properties();
props.setProperty("bundle", "TEXT");
}


@@ -8,7 +8,6 @@
package org.dspace.app.itemupdate;
import java.text.ParseException;
import org.dspace.content.Item;
/**
@@ -17,19 +16,22 @@ import org.dspace.content.Item;
*
* Adds some utility methods
*
* Really not at all general enough but supports Dublin Core and the compound form notation {@code <schema>
* .<element>[.<qualifier>]}
* Really not at all general enough but supports Dublin Core and the compound form notation {@code <schema>.<element>[.<qualifier>]}
*
* Does not support wildcard for qualifier
*
*
*/
class DtoMetadata {
class DtoMetadata
{
final String schema;
final String element;
final String qualifier;
final String language;
final String value;
protected DtoMetadata(String schema, String element, String qualifier, String language, String value) {
protected DtoMetadata(String schema, String element, String qualifier, String language, String value)
{
this.schema = schema;
this.element = element;
this.qualifier = qualifier;
@@ -40,6 +42,7 @@ class DtoMetadata {
/**
* Factory method
*
*
* @param schema not null, not empty - 'dc' is the standard case
* @param element not null, not empty
* @param qualifier null; don't allow empty string or * indicating 'any'
@@ -53,8 +56,10 @@ class DtoMetadata {
String qualifier,
String language,
String value)
throws IllegalArgumentException {
if ((qualifier != null) && (qualifier.equals(Item.ANY) || qualifier.equals(""))) {
throws IllegalArgumentException
{
if ((qualifier != null) && (qualifier.equals(Item.ANY) || qualifier.equals("")))
{
throw new IllegalArgumentException("Invalid qualifier: " + qualifier);
}
return new DtoMetadata(schema, element, qualifier, language, value);
@@ -63,6 +68,7 @@ class DtoMetadata {
/**
* Factory method to create metadata object
*
*
* @param compoundForm of the form <schema>.<element>[.<qualifier>]
* @param language null or empty
* @param value value
@@ -70,11 +76,13 @@ class DtoMetadata {
* @throws IllegalArgumentException if arg error
*/
public static DtoMetadata create(String compoundForm, String language, String value)
throws ParseException, IllegalArgumentException {
throws ParseException, IllegalArgumentException
{
String[] ar = MetadataUtilities.parseCompoundForm(compoundForm);
String qual = null;
if (ar.length > 2) {
if (ar.length > 2)
{
qual = ar[2];
}
@@ -85,44 +93,56 @@ class DtoMetadata {
* Determine if this metadata field matches the specified type:
* schema.element or schema.element.qualifier
*
*
* @param compoundForm of the form <schema>.<element>[.<qualifier>|.*]
* @param wildcard allow wildcards in compoundForm param
* @return whether matches
*/
public boolean matches(String compoundForm, boolean wildcard) {
public boolean matches(String compoundForm, boolean wildcard)
{
String[] ar = compoundForm.split("\\s*\\.\\s*"); //MetadataUtilities.parseCompoundForm(compoundForm);
if ((ar.length < 2) || (ar.length > 3)) {
if ((ar.length < 2) || (ar.length > 3))
{
return false;
}
if (!this.schema.equals(ar[0]) || !this.element.equals(ar[1])) {
if (!this.schema.equals(ar[0]) || !this.element.equals(ar[1]))
{
return false;
}
if (ar.length == 2) {
if (this.qualifier != null) {
if (ar.length == 2)
{
if (this.qualifier != null)
{
return false;
}
}
if (ar.length == 3) {
if (this.qualifier == null) {
if (ar.length == 3)
{
if (this.qualifier == null)
{
return false;
}
if (wildcard && ar[2].equals(Item.ANY)) {
if (wildcard && ar[2].equals(Item.ANY))
{
return true;
}
if (!this.qualifier.equals(ar[2])) {
if (!this.qualifier.equals(ar[2]))
{
return false;
}
}
return true;
}
public String toString() {
public String toString()
{
String s = "\tSchema: " + schema + " Element: " + element;
if (qualifier != null) {
if (qualifier != null)
{
s+= " Qualifier: " + qualifier;
}
s+= " Language: " + ((language == null) ? "[null]" : language);
@@ -131,7 +151,8 @@ class DtoMetadata {
return s;
}
public String getValue() {
public String getValue()
{
return value;
}


@@ -10,11 +10,12 @@ package org.dspace.app.itemupdate;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.FilenameFilter;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.sql.SQLException;
@@ -22,13 +23,14 @@ import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.UUID;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerConfigurationException;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.TransformerConfigurationException;
import org.apache.log4j.Logger;
import org.dspace.app.util.LocalSchemaFilenameFilter;
@@ -38,6 +40,7 @@ import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.Context;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;
import org.w3c.dom.Document;
@@ -45,6 +48,7 @@ import org.w3c.dom.Document;
/**
* Encapsulates the Item in the context of the DSpace Archive Format
*
*/
public class ItemArchive {
private static final Logger log = Logger.getLogger(ItemArchive.class);
@@ -67,13 +71,13 @@ public class ItemArchive {
protected ItemService itemService;
//constructors
protected ItemArchive() {
protected ItemArchive()
{
handleService = HandleServiceFactory.getInstance().getHandleService();
itemService = ContentServiceFactory.getInstance().getItemService();
}
/**
* factory method
/** factory method
*
* Minimal requirements for dublin_core.xml for this application
* is the presence of dc.identifier.uri
@@ -85,38 +89,48 @@ public class ItemArchive {
* if null, the default is the handle in the dc.identifier.uri field
* @return ItemArchive object
* @throws Exception if error
*
*/
public static ItemArchive create(Context context, File dir, String itemField)
throws Exception {
throws Exception
{
ItemArchive itarch = new ItemArchive();
itarch.dir = dir;
itarch.dirname = dir.getName();
InputStream is = null;
try {
try
{
is = new FileInputStream(new File(dir, DUBLIN_CORE_XML));
itarch.dtomList = MetadataUtilities.loadDublinCore(getDocumentBuilder(), is);
//The code to search for local schema files was copied from org.dspace.app.itemimport
// .ItemImportServiceImpl.java
//The code to search for local schema files was copied from org.dspace.app.itemimport.ItemImportServiceImpl.java
File file[] = dir.listFiles(new LocalSchemaFilenameFilter());
for (int i = 0; i < file.length; i++) {
for (int i = 0; i < file.length; i++)
{
is = new FileInputStream(file[i]);
itarch.dtomList.addAll(MetadataUtilities.loadDublinCore(getDocumentBuilder(), is));
}
} finally {
if (is != null) {
}
finally
{
if (is != null)
{
is.close();
}
}
ItemUpdate.pr("Loaded metadata with " + itarch.dtomList.size() + " fields");
if (itemField == null) {
if (itemField == null)
{
itarch.item = itarch.itemFromHandleInput(context); // sets the item instance var and seeds the undo list
} else {
}
else
{
itarch.item = itarch.itemFromMetadataField(context, itemField);
}
if (itarch.item == null) {
if (itarch.item == null)
{
throw new Exception("Item not instantiated: " + itarch.dirname);
}
@@ -126,8 +140,10 @@ public class ItemArchive {
}
protected static DocumentBuilder getDocumentBuilder()
throws ParserConfigurationException {
if (builder == null) {
throws ParserConfigurationException
{
if (builder == null)
{
builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
}
return builder;
@@ -135,13 +151,14 @@ public class ItemArchive {
/**
* Getter for Transformer
*
* @return Transformer
* @throws TransformerConfigurationException if config error
*/
protected Transformer getTransformer()
throws TransformerConfigurationException {
if (transformer == null) {
throws TransformerConfigurationException
{
if (transformer == null)
{
transformer = TransformerFactory.newInstance().newTransformer();
}
return transformer;
@@ -149,55 +166,55 @@ public class ItemArchive {
/**
* Getter for the DSpace item referenced in the archive
*
* @return DSpace item
*/
public Item getItem() {
public Item getItem()
{
return item;
}
/**
* Getter for directory in archive on disk
*
* @return directory in archive
*/
public File getDirectory() {
public File getDirectory()
{
return dir;
}
/**
* Getter for directory name in archive
*
* @return directory name in archive
*/
public String getDirectoryName() {
public String getDirectoryName()
{
return dirname;
}
/**
* Add metadata field to undo list
*
* @param dtom DtoMetadata (represents metadata field)
*/
public void addUndoMetadataField(DtoMetadata dtom) {
public void addUndoMetadataField(DtoMetadata dtom)
{
this.undoDtomList.add(dtom);
}
/**
* Getter for list of metadata fields
*
* @return list of metadata fields
*/
public List<DtoMetadata> getMetadataFields() {
public List<DtoMetadata> getMetadataFields()
{
return dtomList;
}
/**
* Add bitstream id to delete contents file
*
* @param bitstreamId bitstream ID
*/
public void addUndoDeleteContents(UUID bitstreamId) {
public void addUndoDeleteContents(UUID bitstreamId)
{
this.undoAddContents.add(bitstreamId);
}
@@ -207,15 +224,16 @@ public class ItemArchive {
* This is the default implementation
* that uses the dc.identifier.uri metadatafield
* that contains the item handle as its value
*
* @param context DSpace Context
* @throws SQLException if database error
* @throws Exception if error
*/
private Item itemFromHandleInput(Context context)
throws SQLException, Exception {
throws SQLException, Exception
{
DtoMetadata dtom = getMetadataField("dc.identifier.uri");
if (dtom == null) {
if (dtom == null)
{
throw new Exception("No dc.identier.uri field found for handle");
}
@@ -223,7 +241,8 @@ public class ItemArchive {
String uri = dtom.value;
if (!uri.startsWith(ItemUpdate.HANDLE_PREFIX)) {
if (!uri.startsWith(ItemUpdate.HANDLE_PREFIX))
{
throw new Exception("dc.identifier.uri for item " + uri
+ " does not begin with prefix: " + ItemUpdate.HANDLE_PREFIX);
}
@@ -231,9 +250,12 @@ public class ItemArchive {
String handle = uri.substring(ItemUpdate.HANDLE_PREFIX.length());
DSpaceObject dso = handleService.resolveToObject(context, handle);
if (dso instanceof Item) {
if (dso instanceof Item)
{
item = (Item) dso;
} else {
}
else
{
ItemUpdate.pr("Warning: item not instantiated");
throw new IllegalArgumentException("Item " + handle + " not instantiated.");
}
@@ -244,50 +266,55 @@ public class ItemArchive {
* Find and instantiate Item from the dublin_core.xml based
* on the specified itemField for the item identifier,
*
*
* @param context - the DSpace context
* @param itemField - the compound form of the metadata element <schema>.<element>.<qualifier>
* @throws SQLException if database error
* @throws Exception if error
*/
private Item itemFromMetadataField(Context context, String itemField)
throws SQLException, AuthorizeException, Exception {
throws SQLException, AuthorizeException, Exception
{
DtoMetadata dtom = getMetadataField(itemField);
Item item = null;
if (dtom == null) {
if (dtom == null)
{
throw new IllegalArgumentException("No field found for item identifier field: " + itemField);
}
ItemUpdate.prv("Metadata field to match for item: " + dtom.toString());
this.addUndoMetadataField(dtom); //seed the undo list with the identifier field
Iterator<Item> itr = itemService
.findByMetadataField(context, dtom.schema, dtom.element, dtom.qualifier, dtom.value);
Iterator<Item> itr = itemService.findByMetadataField(context, dtom.schema, dtom.element, dtom.qualifier, dtom.value);
int count = 0;
while (itr.hasNext()) {
while (itr.hasNext())
{
item = itr.next();
count++;
}
ItemUpdate.prv("items matching = " + count );
if (count != 1) {
if (count != 1)
{
throw new Exception ("" + count + " items matching item identifier: " + dtom.value);
}
return item;
}
/**
* Get DtoMetadata field
*
* @param compoundForm compound form
* @return DtoMetadata field
*/
private DtoMetadata getMetadataField(String compoundForm) {
for (DtoMetadata dtom : dtomList) {
if (dtom.matches(compoundForm, false)) {
private DtoMetadata getMetadataField(String compoundForm)
{
for (DtoMetadata dtom : dtomList)
{
if (dtom.matches(compoundForm, false))
{
return dtom;
}
}
@@ -306,35 +333,46 @@ public class ItemArchive {
*/
public void writeUndo(File undoDir)
throws IOException, ParserConfigurationException, TransformerConfigurationException,
TransformerException, FileNotFoundException {
TransformerException, FileNotFoundException
{
// create directory for item
File dir = new File(undoDir, dirname);
if (!dir.exists() && !dir.mkdir()) {
if (!dir.exists() && !dir.mkdir())
{
log.error("Unable to create undo directory");
}
OutputStream out = null;
try {
try
{
out = new FileOutputStream(new File(dir, "dublin_core.xml"));
Document doc = MetadataUtilities.writeDublinCore(getDocumentBuilder(), undoDtomList);
MetadataUtilities.writeDocument(doc, getTransformer(), out);
// if undo has delete bitstream
if (undoAddContents.size() > 0) {
if (undoAddContents.size() > 0)
{
PrintWriter pw = null;
try {
try
{
File f = new File(dir, ItemUpdate.DELETE_CONTENTS_FILE);
pw = new PrintWriter(new BufferedWriter(new FileWriter(f)));
for (UUID i : undoAddContents) {
for (UUID i : undoAddContents)
{
pw.println(i);
}
} finally {
}
finally
{
pw.close();
}
}
} finally {
if (out != null) {
}
finally
{
if (out != null)
{
out.close();
}
}


@@ -28,6 +28,7 @@ import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.lang.StringUtils;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
@@ -36,8 +37,11 @@ import org.dspace.core.Context;
import org.dspace.eperson.EPerson;
import org.dspace.eperson.factory.EPersonServiceFactory;
import org.dspace.eperson.service.EPersonService;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;
/**
*
* Provides some batch editing capabilities for items in DSpace:
* Metadata fields - Add, Delete
* Bitstreams - Add, Delete
@@ -62,7 +66,9 @@ import org.dspace.eperson.service.EPersonService;
* Some of this has been placed into the MetadataUtilities class
* for possible reuse elsewhere.
*
*
* @author W. Hays based on a conceptual design by R. Rodgers
*
*/
public class ItemUpdate {
@@ -78,28 +84,33 @@ public class ItemUpdate {
protected static final EPersonService epersonService = EPersonServiceFactory.getInstance().getEPersonService();
protected static final ItemService itemService = ContentServiceFactory.getInstance().getItemService();
protected static final HandleService handleService = HandleServiceFactory.getInstance().getHandleService();
static {
static
{
filterAliases.put("ORIGINAL", "org.dspace.app.itemupdate.OriginalBitstreamFilter");
filterAliases
.put("ORIGINAL_AND_DERIVATIVES", "org.dspace.app.itemupdate.OriginalWithDerivativesBitstreamFilter");
filterAliases.put("ORIGINAL_AND_DERIVATIVES", "org.dspace.app.itemupdate.OriginalWithDerivativesBitstreamFilter");
filterAliases.put("TEXT", "org.dspace.app.itemupdate.DerivativeTextBitstreamFilter");
filterAliases.put("THUMBNAIL", "org.dspace.app.itemupdate.ThumbnailBitstreamFilter");
}
// File listing filter to check for folders
static FilenameFilter directoryFilter = new FilenameFilter() {
static FilenameFilter directoryFilter = new FilenameFilter()
{
@Override
public boolean accept(File dir, String n) {
public boolean accept(File dir, String n)
{
File f = new File(dir.getAbsolutePath() + File.separatorChar + n);
return f.isDirectory();
}
};
// File listing filter to check for files (not directories)
static FilenameFilter fileFilter = new FilenameFilter() {
static FilenameFilter fileFilter = new FilenameFilter()
{
@Override
public boolean accept(File dir, String n) {
public boolean accept(File dir, String n)
{
File f = new File(dir.getAbsolutePath() + File.separatorChar + n);
return (f.isFile());
}
@@ -111,9 +122,11 @@ public class ItemUpdate {
protected String eperson;
/**
* @param argv the command line arguments given
*
* @param argv commandline args
*/
public static void main(String[] argv) {
public static void main(String[] argv)
{
// create an options object and populate it
CommandLineParser parser = new PosixParser();
@@ -124,23 +137,20 @@ public class ItemUpdate {
options.addOption("s", "source", true, "root directory of source dspace archive ");
//actions on items
options.addOption("a", "addmetadata", true,
"add metadata specified for each item; multiples separated by semicolon ';'");
options.addOption("a", "addmetadata", true, "add metadata specified for each item; multiples separated by semicolon ';'");
options.addOption("d", "deletemetadata", true, "delete metadata specified for each item");
options.addOption("A", "addbitstreams", false, "add bitstreams as specified for each item");
// extra work to get optional argument
Option delBitstreamOption = new Option("D", "deletebitstreams", true,
"delete bitstreams as specified for each item");
Option delBitstreamOption = new Option("D", "deletebitstreams", true, "delete bitstreams as specified for each item");
delBitstreamOption.setOptionalArg(true);
delBitstreamOption.setArgName("BitstreamFilter");
options.addOption(delBitstreamOption);
//other params
options.addOption("e", "eperson", true, "email of eperson doing the update");
options.addOption("i", "itemfield", true,
"optional metadata field that containing item identifier; default is dc.identifier.uri");
options.addOption("i", "itemfield", true, "optional metadata field that containing item identifier; default is dc.identifier.uri");
options.addOption("F", "filter-properties", true, "filter class name; only for deleting bitstream");
options.addOption("v", "verbose", false, "verbose logging");
@@ -158,10 +168,12 @@ public class ItemUpdate {
Context context = null;
ItemUpdate iu = new ItemUpdate();
try {
try
{
CommandLine line = parser.parse(options, argv);
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("ItemUpdate", options);
pr("");
@@ -175,79 +187,91 @@ public class ItemUpdate {
System.exit(0);
}
if (line.hasOption('v')) {
if (line.hasOption('v'))
{
verbose = true;
}
if (line.hasOption('P')) {
if (line.hasOption('P'))
{
alterProvenance = false;
pr("Suppressing changes to Provenance field option");
}
iu.eperson = line.getOptionValue('e'); // db ID or email
if (!line.hasOption('s')) { // item specific changes from archive dir
if (!line.hasOption('s')) // item specific changes from archive dir
{
pr("Missing source archive option");
System.exit(1);
}
String sourcedir = line.getOptionValue('s');
if (line.hasOption('t')) { //test
if (line.hasOption('t')) //test
{
isTest = true;
pr("**Test Run** - not actually updating items.");
}
if (line.hasOption('i')) {
if (line.hasOption('i'))
{
itemField = line.getOptionValue('i');
}
if (line.hasOption('d')) {
if (line.hasOption('d'))
{
String[] targetFields = line.getOptionValues('d');
DeleteMetadataAction delMetadataAction = (DeleteMetadataAction) iu.actionMgr
.getUpdateAction(DeleteMetadataAction.class);
DeleteMetadataAction delMetadataAction = (DeleteMetadataAction) iu.actionMgr.getUpdateAction(DeleteMetadataAction.class);
delMetadataAction.addTargetFields(targetFields);
//undo is an add
for (String field : targetFields) {
for (String field : targetFields)
{
iu.undoActionList.add(" -a " + field + " ");
}
pr("Delete metadata for fields: ");
for (String s : targetFields) {
for (String s : targetFields)
{
pr(" " + s);
}
}
if (line.hasOption('a')) {
if (line.hasOption('a'))
{
String[] targetFields = line.getOptionValues('a');
AddMetadataAction addMetadataAction = (AddMetadataAction) iu.actionMgr
.getUpdateAction(AddMetadataAction.class);
AddMetadataAction addMetadataAction = (AddMetadataAction) iu.actionMgr.getUpdateAction(AddMetadataAction.class);
addMetadataAction.addTargetFields(targetFields);
//undo is a delete followed by an add of a replace record for target fields
for (String field : targetFields) {
for (String field : targetFields)
{
iu.undoActionList.add(" -d " + field + " ");
}
for (String field : targetFields) {
for (String field : targetFields)
{
iu.undoActionList.add(" -a " + field + " ");
}
pr("Add metadata for fields: ");
for (String s : targetFields) {
for (String s : targetFields)
{
pr(" " + s);
}
}
if (line.hasOption('D')) { // undo not supported
if (line.hasOption('D')) // undo not supported
{
pr("Delete bitstreams ");
String[] filterNames = line.getOptionValues('D');
if ((filterNames != null) && (filterNames.length > 1)) {
if ((filterNames != null) && (filterNames.length > 1))
{
pr("Error: Only one filter can be a used at a time.");
System.exit(1);
}
@@ -255,71 +279,84 @@ public class ItemUpdate {
String filterName = line.getOptionValue('D');
pr("Filter argument: " + filterName);
if (filterName == null) { // indicates using delete_contents files
DeleteBitstreamsAction delAction = (DeleteBitstreamsAction) iu.actionMgr
.getUpdateAction(DeleteBitstreamsAction.class);
if (filterName == null) // indicates using delete_contents files
{
DeleteBitstreamsAction delAction = (DeleteBitstreamsAction) iu.actionMgr.getUpdateAction(DeleteBitstreamsAction.class);
delAction.setAlterProvenance(alterProvenance);
} else {
}
else
{
// check if param is on ALIAS list
String filterClassname = filterAliases.get(filterName);
if (filterClassname == null) {
if (filterClassname == null)
{
filterClassname = filterName;
}
BitstreamFilter filter = null;
try {
try
{
Class<?> cfilter = Class.forName(filterClassname);
pr("BitstreamFilter class to instantiate: " + cfilter.toString());
filter = (BitstreamFilter) cfilter.newInstance(); //unfortunate cast, an erasure consequence
} catch (Exception e) {
}
catch(Exception e)
{
pr("Error: Failure instantiating bitstream filter class: " + filterClassname);
System.exit(1);
}
String filterPropertiesName = line.getOptionValue('F');
if (filterPropertiesName != null) { //not always required
try {
if (filterPropertiesName != null) //not always required
{
try
{
// TODO try multiple relative locations, e.g. source dir
if (!filterPropertiesName.startsWith("/")) {
if (!filterPropertiesName.startsWith("/"))
{
filterPropertiesName = sourcedir + File.separator + filterPropertiesName;
}
filter.initProperties(filterPropertiesName);
} catch (Exception e) {
pr("Error: Failure finding properties file for bitstream filter class: " +
filterPropertiesName);
}
catch(Exception e)
{
pr("Error: Failure finding properties file for bitstream filter class: " + filterPropertiesName);
System.exit(1);
}
}
DeleteBitstreamsByFilterAction delAction =
(DeleteBitstreamsByFilterAction) iu.actionMgr
.getUpdateAction(DeleteBitstreamsByFilterAction.class);
(DeleteBitstreamsByFilterAction) iu.actionMgr.getUpdateAction(DeleteBitstreamsByFilterAction.class);
delAction.setAlterProvenance(alterProvenance);
delAction.setBitstreamFilter(filter);
//undo not supported
}
}
if (line.hasOption('A')) {
if (line.hasOption('A'))
{
pr("Add bitstreams ");
AddBitstreamsAction addAction = (AddBitstreamsAction) iu.actionMgr
.getUpdateAction(AddBitstreamsAction.class);
AddBitstreamsAction addAction = (AddBitstreamsAction) iu.actionMgr.getUpdateAction(AddBitstreamsAction.class);
addAction.setAlterProvenance(alterProvenance);
iu.undoActionList.add(" -D "); // delete_contents file will be written, no arg required
}
if (!iu.actionMgr.hasActions()) {
if (!iu.actionMgr.hasActions())
{
pr("Error - an action must be specified");
System.exit(1);
} else {
}
else
{
pr("Actions to be performed: ");
for (UpdateAction ua : iu.actionMgr) {
for (UpdateAction ua : iu.actionMgr)
{
pr(" " + ua.getClass().getName());
}
}
@@ -330,28 +367,32 @@ public class ItemUpdate {
iu.setEPerson(context, iu.eperson);
context.turnOffAuthorisationSystem();
HANDLE_PREFIX = ConfigurationManager.getProperty("handle.canonical.prefix");
if (HANDLE_PREFIX == null || HANDLE_PREFIX.length() == 0) {
HANDLE_PREFIX = "http://hdl.handle.net/";
}
HANDLE_PREFIX = handleService.getCanonicalPrefix();
iu.processArchive(context, sourcedir, itemField, metadataIndexName, alterProvenance, isTest);
context.complete(); // complete all transactions
} catch (Exception e) {
if (context != null && context.isValid()) {
}
catch (Exception e)
{
if (context != null && context.isValid())
{
context.abort();
}
e.printStackTrace();
pr(e.toString());
status = 1;
} finally {
}
finally {
context.restoreAuthSystemState();
}
if (isTest) {
if (isTest)
{
pr("***End of Test Run***");
} else {
}
else
{
pr("End.");
}
@@ -360,7 +401,6 @@ public class ItemUpdate {
/**
* process an archive
*
* @param context DSpace Context
* @param sourceDirPath source path
* @param itemField item field
@@ -371,11 +411,13 @@ public class ItemUpdate {
*/
protected void processArchive(Context context, String sourceDirPath, String itemField,
String metadataIndexName, boolean alterProvenance, boolean isTest)
throws Exception {
throws Exception
{
// open and process the source directory
File sourceDir = new File(sourceDirPath);
if ((sourceDir == null) || !sourceDir.exists() || !sourceDir.isDirectory()) {
if ((sourceDir == null) || !sourceDir.exists() || !sourceDir.isDirectory())
{
pr("Error, cannot open archive source directory " + sourceDirPath);
throw new Exception("error with archive source directory " + sourceDirPath);
}
@@ -386,73 +428,90 @@ public class ItemUpdate {
//Undo is suppressed to prevent undo of undo
boolean suppressUndo = false;
File fSuppressUndo = new File(sourceDir, SUPPRESS_UNDO_FILENAME);
if (fSuppressUndo.exists()) {
if (fSuppressUndo.exists())
{
suppressUndo = true;
}
File undoDir = null; //sibling directory of source archive
if (!suppressUndo && !isTest) {
if (!suppressUndo && !isTest)
{
undoDir = initUndoArchive(sourceDir);
}
int itemCount = 0;
int successItemCount = 0;
for (String dirname : dircontents) {
for (String dirname : dircontents)
{
itemCount++;
pr("");
pr("processing item " + dirname);
try {
try
{
ItemArchive itarch = ItemArchive.create(context, new File(sourceDir, dirname), itemField);
for (UpdateAction action : actionMgr) {
for (UpdateAction action : actionMgr)
{
pr("action: " + action.getClass().getName());
action.execute(context, itarch, isTest, suppressUndo);
if (!isTest && !suppressUndo) {
if (!isTest && !suppressUndo)
{
itarch.writeUndo(undoDir);
}
}
if (!isTest) {
if (!isTest)
{
Item item = itarch.getItem();
itemService.update(context, item); //need to update before commit
context.uncacheEntity(item);
}
ItemUpdate.pr("Item " + dirname + " completed");
successItemCount++;
} catch (Exception e) {
}
catch(Exception e)
{
pr("Exception processing item " + dirname + ": " + e.toString());
e.printStackTrace();
}
}
if (!suppressUndo && !isTest) {
if (!suppressUndo && !isTest)
{
StringBuilder sb = new StringBuilder("dsrun org.dspace.app.itemupdate.ItemUpdate ");
sb.append(" -e ").append(this.eperson);
sb.append(" -s ").append(undoDir);
if (itemField != null) {
if (itemField != null)
{
sb.append(" -i ").append(itemField);
}
if (!alterProvenance) {
if (!alterProvenance)
{
sb.append(" -P ");
}
if (isTest) {
if (isTest)
{
sb.append(" -t ");
}
for (String actionOption : undoActionList) {
for (String actionOption : undoActionList)
{
sb.append(actionOption);
}
PrintWriter pw = null;
try {
try
{
File cmdFile = new File (undoDir.getParent(), undoDir.getName() + "_command.sh");
pw = new PrintWriter(new BufferedWriter(new FileWriter(cmdFile)));
pw.println(sb.toString());
} finally {
}
finally
{
pw.close();
}
}
@@ -464,6 +523,7 @@ public class ItemUpdate {
/**
*
* to avoid overwriting the undo source tree on repeated processing
* sequence numbers are added and checked
*
@@ -473,33 +533,38 @@ public class ItemUpdate {
* @throws IOException if IO error
*/
protected File initUndoArchive(File sourceDir)
throws FileNotFoundException, IOException {
throws FileNotFoundException, IOException
{
File parentDir = sourceDir.getCanonicalFile().getParentFile();
if (parentDir == null) {
throw new FileNotFoundException(
"Parent directory of archive directory not found; unable to write UndoArchive; no processing " +
"performed");
if (parentDir == null)
{
throw new FileNotFoundException("Parent directory of archive directory not found; unable to write UndoArchive; no processing performed");
}
String sourceDirName = sourceDir.getName();
int seqNo = 1;
File undoDir = new File(parentDir, "undo_" + sourceDirName + "_" + seqNo);
while (undoDir.exists()) {
while (undoDir.exists())
{
undoDir = new File(parentDir, "undo_" + sourceDirName+ "_" + ++seqNo); //increment
}
// create root directory
if (!undoDir.mkdir()) {
if (!undoDir.mkdir())
{
pr("ERROR creating Undo Archive directory " + undoDir.getCanonicalPath());
throw new IOException("ERROR creating Undo Archive directory " + undoDir.getCanonicalPath());
}
//Undo is suppressed to prevent undo of undo
File fSuppressUndo = new File(undoDir, ItemUpdate.SUPPRESS_UNDO_FILENAME);
try {
try
{
fSuppressUndo.createNewFile();
} catch (IOException e) {
}
catch(IOException e)
{
pr("ERROR creating Suppress Undo File " + e.toString());
throw e;
}
@@ -510,29 +575,33 @@ public class ItemUpdate {
/**
* Set EPerson doing import
*
* @param context DSpace Context
* @param eperson EPerson obj
* @throws Exception if error
*/
protected void setEPerson(Context context, String eperson)
throws Exception {
if (eperson == null) {
throws Exception
{
if (eperson == null)
{
pr("Error - an eperson to do the importing must be specified");
pr(" (run with -h flag for details)");
throw new Exception("EPerson not specified.");
}
throw new Exception("EPerson not specified."); }
EPerson myEPerson = null;
if (eperson.indexOf('@') != -1) {
if (eperson.indexOf('@') != -1)
{
// @ sign, must be an email
myEPerson = epersonService.findByEmail(context, eperson);
} else {
}
else
{
myEPerson = epersonService.find(context, UUID.fromString(eperson));
}
if (myEPerson == null) {
if (myEPerson == null)
{
pr("Error, eperson cannot be found: " + eperson);
throw new Exception("Invalid EPerson");
}
@@ -544,20 +613,21 @@ public class ItemUpdate {
* poor man's logging
* As with ItemImport, API logging goes through log4j to the DSpace.log files
* whereas the batch logging goes to the console to be captured there.
*
* @param s String
*/
static void pr(String s) {
static void pr(String s)
{
System.out.println(s);
}
/**
* print if verbose flag is set
*
* @param s String
*/
static void prv(String s) {
if (verbose) {
static void prv(String s)
{
if (verbose)
{
System.out.println(s);
}
}
@@ -11,13 +11,14 @@ import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.sql.SQLException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.Result;
@@ -30,14 +31,10 @@ import javax.xml.transform.stream.StreamResult;
import org.apache.commons.lang.StringUtils;
import org.apache.xpath.XPathAPI;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.*;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
@@ -46,6 +43,9 @@ import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;
import org.dspace.authorize.AuthorizeException;
import org.dspace.core.ConfigurationManager;
/**
* Miscellaneous methods for metadata handling that build on the API
@@ -53,37 +53,37 @@ import org.xml.sax.SAXException;
* in context in ItemUpdate.
*
* The XML methods were based on those in ItemImport
*
*
*/
public class MetadataUtilities {
protected static final ItemService itemService = ContentServiceFactory.getInstance().getItemService();
/**
* Default constructor
*/
private MetadataUtilities() { }
/**
*
* Working around Item API to delete a value-specific Metadatum
* For a given element/qualifier/lang:
* get all DCValues
* clear (i.e. delete) all of these DCValues
For a given element/qualifier/lang:
get all DCValues
clear (i.e. delete) all of these DCValues
* add them back, minus the one to actually delete
*
* @param context DSpace Context
* @param item Item Object
* @param dtom metadata field
* @param isLanguageStrict whether strict or not
* @return true if metadata field is found with matching value and was deleted
* @throws SQLException if database error
* @return true if metadata field is found with matching value and was deleted
*/
public static boolean deleteMetadataByValue(Context context, Item item, DtoMetadata dtom, boolean isLanguageStrict)
throws SQLException {
public static boolean deleteMetadataByValue(Context context, Item item, DtoMetadata dtom, boolean isLanguageStrict) throws SQLException {
List<MetadataValue> ar = null;
if (isLanguageStrict) { // get all for given type
if (isLanguageStrict)
{ // get all for given type
ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
} else {
}
else
{
ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
}
@@ -91,18 +91,26 @@ public class MetadataUtilities {
//build new set minus the one to delete
List<String> vals = new ArrayList<String>();
for (MetadataValue dcv : ar) {
if (dcv.getValue().equals(dtom.value)) {
for (MetadataValue dcv : ar)
{
if (dcv.getValue().equals(dtom.value))
{
found = true;
} else {
}
else
{
vals.add(dcv.getValue());
}
}
if (found) { //remove all for given type ??synchronize this block??
if (isLanguageStrict) {
if (found) //remove all for given type ??synchronize this block??
{
if (isLanguageStrict)
{
itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
} else {
}
else
{
itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
}
@@ -128,34 +136,46 @@ public class MetadataUtilities {
List<MetadataValue> ar = null;
// get all values for given element/qualifier
if (isLanguageStrict) { // get all for given element/qualifier
if (isLanguageStrict) // get all for given element/qualifier
{
ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
} else {
}
else
{
ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
}
if (ar.size() == 0) {
if (ar.size() == 0)
{
throw new IllegalArgumentException("Metadata to append to not found");
}
int idx = 0; //index of field to change
if (ar.size() > 1) { //need to pick one, can't be sure it's the last one
if (ar.size() > 1) //need to pick one, can't be sure it's the last one
{
// TODO maybe get highest id ?
}
//build new set minus the one to delete
List<String> vals = new ArrayList<String>();
for (int i = 0; i < ar.size(); i++) {
if (i == idx) {
for (int i=0; i < ar.size(); i++)
{
if (i == idx)
{
vals.add(ar.get(i).getValue() + textToAppend);
} else {
}
else
{
vals.add(ar.get(i).getValue());
}
}
if (isLanguageStrict) {
if (isLanguageStrict)
{
itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
} else {
}
else
{
itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
}
@@ -178,7 +198,8 @@ public class MetadataUtilities {
*/
public static List<DtoMetadata> loadDublinCore(DocumentBuilder docBuilder, InputStream is)
throws SQLException, IOException, ParserConfigurationException,
SAXException, TransformerException, AuthorizeException {
SAXException, TransformerException, AuthorizeException
{
Document document = docBuilder.parse(is);
List<DtoMetadata> dtomList = new ArrayList<DtoMetadata>();
@@ -188,43 +209,55 @@ public class MetadataUtilities {
String schema = null;
NodeList metadata = XPathAPI.selectNodeList(document, "/dublin_core");
Node schemaAttr = metadata.item(0).getAttributes().getNamedItem("schema");
if (schemaAttr == null) {
if (schemaAttr == null)
{
schema = MetadataSchema.DC_SCHEMA;
} else {
}
else
{
schema = schemaAttr.getNodeValue();
}
// Get the nodes corresponding to formats
NodeList dcNodes = XPathAPI.selectNodeList(document, "/dublin_core/dcvalue");
for (int i = 0; i < dcNodes.getLength(); i++) {
for (int i = 0; i < dcNodes.getLength(); i++)
{
Node n = dcNodes.item(i);
String value = getStringValue(n).trim();
// compensate for empty value getting read as "null", which won't display
if (value == null) {
if (value == null)
{
value = "";
}
String element = getAttributeValue(n, "element");
if (element != null) {
if (element != null)
{
element = element.trim();
}
String qualifier = getAttributeValue(n, "qualifier");
if (qualifier != null) {
if (qualifier != null)
{
qualifier = qualifier.trim();
}
String language = getAttributeValue(n, "language");
if (language != null) {
if (language != null)
{
language = language.trim();
}
if ("none".equals(qualifier) || "".equals(qualifier)) {
if ("none".equals(qualifier) || "".equals(qualifier))
{
qualifier = null;
}
// a goofy default, but consistent with DSpace treatment elsewhere
if (language == null) {
if (language == null)
{
language = "en";
} else if ("".equals(language)) {
}
else if ("".equals(language))
{
language = ConfigurationManager.getProperty("default.language");
}
@@ -246,23 +279,31 @@ public class MetadataUtilities {
* @throws TransformerException if transformer error
*/
public static Document writeDublinCore(DocumentBuilder docBuilder, List<DtoMetadata> dtomList)
throws ParserConfigurationException, TransformerConfigurationException, TransformerException {
throws ParserConfigurationException, TransformerConfigurationException, TransformerException
{
Document doc = docBuilder.newDocument();
Element root = doc.createElement("dublin_core");
doc.appendChild(root);
for (DtoMetadata dtom : dtomList) {
for (DtoMetadata dtom : dtomList)
{
Element mel = doc.createElement("dcvalue");
mel.setAttribute("element", dtom.element);
if (dtom.qualifier == null) {
if (dtom.qualifier == null)
{
mel.setAttribute("qualifier", "none");
} else {
}
else
{
mel.setAttribute("qualifier", dtom.qualifier);
}
if (StringUtils.isEmpty(dtom.language)) {
if (StringUtils.isEmpty(dtom.language))
{
mel.setAttribute("language", "en");
} else {
}
else
{
mel.setAttribute("language", dtom.language);
}
mel.setTextContent(dtom.value);
@@ -274,7 +315,6 @@ public class MetadataUtilities {
/**
* write xml document to output stream
*
* @param doc XML Document
* @param transformer XML Transformer
* @param out OutputStream
@@ -282,29 +322,32 @@ public class MetadataUtilities {
* @throws TransformerException if Transformer error
*/
public static void writeDocument(Document doc, Transformer transformer, OutputStream out)
throws IOException, TransformerException {
throws IOException, TransformerException
{
Source src = new DOMSource(doc);
Result dest = new StreamResult(out);
transformer.transform(src, dest);
}
// XML utility methods
// XML utility methods
/**
* Lookup an attribute from a DOM node.
*
* @param n Node
* @param name name
* @return attribute value
*/
private static String getAttributeValue(Node n, String name) {
private static String getAttributeValue(Node n, String name)
{
NamedNodeMap nm = n.getAttributes();
for (int i = 0; i < nm.getLength(); i++) {
for (int i = 0; i < nm.getLength(); i++)
{
Node node = nm.item(i);
if (name.equals(node.getNodeName())) {
if (name.equals(node.getNodeName()))
{
return node.getNodeValue();
}
}
@@ -314,17 +357,19 @@ public class MetadataUtilities {
/**
* Return the String value of a Node.
*
* @param node node
* @return string value
*/
private static String getStringValue(Node node) {
private static String getStringValue(Node node)
{
String value = node.getNodeValue();
if (node.hasChildNodes()) {
if (node.hasChildNodes())
{
Node first = node.getFirstChild();
if (first.getNodeType() == Node.TEXT_NODE) {
if (first.getNodeType() == Node.TEXT_NODE)
{
return first.getNodeValue();
}
}
@@ -343,27 +388,36 @@ public class MetadataUtilities {
* @throws ParseException if parse error
*/
public static List<ContentsEntry> readContentsFile(File f)
throws FileNotFoundException, IOException, ParseException {
throws FileNotFoundException, IOException, ParseException
{
List<ContentsEntry> list = new ArrayList<ContentsEntry>();
BufferedReader in = null;
try {
try
{
in = new BufferedReader(new FileReader(f));
String line = null;
while ((line = in.readLine()) != null) {
while ((line = in.readLine()) != null)
{
line = line.trim();
if ("".equals(line)) {
if ("".equals(line))
{
continue;
}
ItemUpdate.pr("Contents entry: " + line);
list.add(ContentsEntry.parse(line));
}
} finally {
try {
}
finally
{
try
{
in.close();
} catch (IOException e) {
}
catch(IOException e)
{
//skip
}
}
@@ -372,33 +426,43 @@ public class MetadataUtilities {
}
/**
*
* @param f file
* @return list of lines as strings
* @throws FileNotFoundException if file doesn't exist
* @throws IOException if IO Error
*/
public static List<String> readDeleteContentsFile(File f)
throws FileNotFoundException, IOException {
throws FileNotFoundException, IOException
{
List<String> list = new ArrayList<>();
BufferedReader in = null;
try {
try
{
in = new BufferedReader(new FileReader(f));
String line = null;
while ((line = in.readLine()) != null) {
while ((line = in.readLine()) != null)
{
line = line.trim();
if ("".equals(line)) {
if ("".equals(line))
{
continue;
}
list.add(line);
}
} finally {
try {
}
finally
{
try
{
in.close();
} catch (IOException e) {
}
catch(IOException e)
{
//skip
}
}
@@ -412,28 +476,29 @@ public class MetadataUtilities {
* @param dcv MetadataValue
* @return string displaying elements of the Metadatum
*/
public static String getDCValueString(MetadataValue dcv) {
public static String getDCValueString(MetadataValue dcv)
{
MetadataField metadataField = dcv.getMetadataField();
MetadataSchema metadataSchema = metadataField.getMetadataSchema();
return "schema: " + metadataSchema.getName() + "; element: " + metadataField
.getElement() + "; qualifier: " + metadataField.getQualifier() +
return "schema: " + metadataSchema.getName() + "; element: " + metadataField.getElement() + "; qualifier: " + metadataField.getQualifier() +
"; language: " + dcv.getLanguage() + "; value: " + dcv.getValue();
}
/**
* Return compound form of a metadata field (i.e. schema.element.qualifier)
*
* @param schema schema
* @param element element
* @param qualifier qualifier
* @return a String representation of the two- or three-part form of a metadata element
* e.g. dc.identifier.uri
*/
public static String getCompoundForm(String schema, String element, String qualifier) {
public static String getCompoundForm(String schema, String element, String qualifier)
{
StringBuilder sb = new StringBuilder();
sb.append(schema).append(".").append(element);
if (qualifier != null) {
if (qualifier != null)
{
sb.append(".").append(qualifier);
}
return sb.toString();
@@ -446,16 +511,20 @@ public class MetadataUtilities {
* @param compoundForm compound form of metadata field
* @return String Array
* @throws ParseException if validity checks fail
*
*/
public static String[] parseCompoundForm(String compoundForm)
throws ParseException {
throws ParseException
{
String[] ar = compoundForm.split("\\s*\\.\\s*"); //trim ends
if ("".equals(ar[0])) {
if ("".equals(ar[0]))
{
throw new ParseException("schema is empty string: " + compoundForm, 0);
}
if ((ar.length < 2) || (ar.length > 3) || "".equals(ar[1])) {
if ((ar.length < 2) || (ar.length > 3) || "".equals(ar[1]))
{
throw new ParseException("element is malformed or empty string: " + compoundForm, 0);
}
@@ -18,29 +18,37 @@ import org.dspace.content.Bundle;
* Also delete all derivative bitstreams, i.e.
* all bitstreams in the TEXT and THUMBNAIL bundles
*/
public class OriginalBitstreamFilter extends BitstreamFilterByBundleName {
public OriginalBitstreamFilter() {
public class OriginalBitstreamFilter extends BitstreamFilterByBundleName
{
public OriginalBitstreamFilter()
{
//empty
}
/**
* Tests bitstreams for containment in an ORIGINAL bundle
*
* @param bitstream Bitstream
* @return true if the bitstream is in the ORIGINAL bundle
*
* @throws BitstreamFilterException if filter error
*/
@Override
public boolean accept(Bitstream bitstream)
throws BitstreamFilterException {
try {
throws BitstreamFilterException
{
try
{
List<Bundle> bundles = bitstream.getBundles();
for (Bundle bundle : bundles) {
if (bundle.getName().equals("ORIGINAL")) {
for (Bundle bundle : bundles)
{
if (bundle.getName().equals("ORIGINAL"))
{
return true;
}
}
} catch (SQLException e) {
}
catch(SQLException e)
{
throw new BitstreamFilterException(e);
}
return false;
@@ -18,10 +18,12 @@ import org.dspace.content.Bundle;
* Also delete all derivative bitstreams, i.e.
* all bitstreams in the TEXT and THUMBNAIL bundles
*/
public class OriginalWithDerivativesBitstreamFilter extends BitstreamFilter {
public class OriginalWithDerivativesBitstreamFilter extends BitstreamFilter
{
protected String[] bundlesToEmpty = { "ORIGINAL", "TEXT", "THUMBNAIL" };
public OriginalWithDerivativesBitstreamFilter() {
public OriginalWithDerivativesBitstreamFilter()
{
//empty
}
@@ -29,22 +31,29 @@ public class OriginalWithDerivativesBitstreamFilter extends BitstreamFilter {
* Tests bitstream for membership in specified bundles (ORIGINAL, TEXT, THUMBNAIL)
*
* @param bitstream Bitstream
* @return true if bitstream is in specified bundles
* @throws BitstreamFilterException if error
* @return true if bitstream is in specified bundles
*/
@Override
public boolean accept(Bitstream bitstream)
throws BitstreamFilterException {
try {
throws BitstreamFilterException
{
try
{
List<Bundle> bundles = bitstream.getBundles();
for (Bundle b : bundles) {
for (String bn : bundlesToEmpty) {
if (b.getName().equals(bn)) {
for (Bundle b : bundles)
{
for (String bn : bundlesToEmpty)
{
if (b.getName().equals(bn))
{
return true;
}
}
}
} catch (SQLException e) {
}
catch(SQLException e)
{
throw new BitstreamFilterException(e);
}
return false;
@@ -11,10 +11,12 @@ import java.util.Properties;
/**
* Bitstream filter targetting the THUMBNAIL bundle
*
*/
public class ThumbnailBitstreamFilter extends BitstreamFilterByBundleName {
public ThumbnailBitstreamFilter() {
public ThumbnailBitstreamFilter()
{
props = new Properties();
props.setProperty("bundle", "THUMBNAIL");
}
@@ -13,8 +13,10 @@ import org.dspace.core.Context;
/**
* Interface for actions to update an item
*
*/
public interface UpdateAction {
public interface UpdateAction
{
public ItemService itemService = ContentServiceFactory.getInstance().getItemService();
@@ -13,6 +13,8 @@ import org.dspace.content.service.BundleService;
/**
* Base class for Bitstream actions
*
*
*/
public abstract class UpdateBitstreamsAction implements UpdateAction {
@@ -25,18 +27,20 @@ public abstract class UpdateBitstreamsAction implements UpdateAction {
/**
* Set variable to indicate that the dc.description.provenance field may
* be changed as a result of Bitstream changes by ItemUpdate
*
* @param alterProvenance whether to alter provenance
*/
public void setAlterProvenance(boolean alterProvenance) {
public void setAlterProvenance(boolean alterProvenance)
{
this.alterProvenance = alterProvenance;
}
/**
*
* @return boolean value to indicate whether the dc.description.provenance field may
* be changed as a result of Bitstream changes by ItemUpdate
*/
public boolean getAlterProvenance() {
public boolean getAlterProvenance()
{
return alterProvenance;
}
@@ -17,6 +17,8 @@ import java.util.Set;
* on which to apply the action when the method execute is called.
*
* Implemented as a Set to avoid problems with duplicates
*
*
*/
public abstract class UpdateMetadataAction implements UpdateAction {
@@ -37,7 +39,8 @@ public abstract class UpdateMetadataAction implements UpdateAction {
* @param targetFields Set of target fields to update
*/
public void addTargetFields(Set<String> targetFields) {
for (String tf : targetFields) {
for (String tf : targetFields)
{
this.targetFields.add(tf);
}
@@ -45,11 +48,11 @@ public abstract class UpdateMetadataAction implements UpdateAction {
/**
* Add array of target fields to update
*
* @param targetFields array of target fields to update
*/
public void addTargetFields(String[] targetFields) {
for (String tf : targetFields) {
for (String tf : targetFields)
{
this.targetFields.add(tf);
}
@@ -15,29 +15,29 @@ import java.io.Reader;
import java.io.StreamTokenizer;
import java.util.ArrayList;
import java.util.List;
import org.jdom.Document;
/**
*
* @author mwood
*/
public class CommandRunner {
public class CommandRunner
{
/**
* Default constructor
*/
private CommandRunner() { }
/**
* @param args the command line arguments given
*
* @param args commandline args
* @throws IOException if IO error
* @throws FileNotFoundException if file doesn't exist
*/
public static void main(String[] args)
throws FileNotFoundException, IOException {
if (args.length > 0) {
throws FileNotFoundException, IOException
{
if (args.length > 0)
{
runManyCommands(args[0]);
} else {
}
else
{
runManyCommands("-");
}
// There is no sensible way to use the status returned by runManyCommands().
@@ -58,11 +58,15 @@ public class CommandRunner {
* @throws FileNotFoundException if file doesn't exist
*/
static int runManyCommands(String script)
throws FileNotFoundException, IOException {
throws FileNotFoundException, IOException
{
Reader input;
if ("-".equals(script)) {
if ("-".equals(script))
{
input = new InputStreamReader(System.in);
} else {
}
else
{
input = new FileReader(script);
}
@@ -85,16 +89,22 @@ public class CommandRunner {
int status = 0;
List<String> tokens = new ArrayList<String>();
Document commandConfigs = ScriptLauncher.getConfig();
while (StreamTokenizer.TT_EOF != tokenizer.nextToken()) {
if (StreamTokenizer.TT_EOL == tokenizer.ttype) {
if (tokens.size() > 0) {
while (StreamTokenizer.TT_EOF != tokenizer.nextToken())
{
if (StreamTokenizer.TT_EOL == tokenizer.ttype)
{
if (tokens.size() > 0)
{
status = ScriptLauncher.runOneCommand(commandConfigs, tokens.toArray(new String[tokens.size()]));
if (status > 0) {
if (status > 0)
{
break;
}
tokens.clear();
}
} else {
}
else
{
tokens.add(tokenizer.sval);
}
}
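The CommandRunner hunks above reflow a loop that reads a script (from a file or from stdin when the argument is "-"), splits each line into whitespace-separated tokens with a StreamTokenizer, and hands each non-empty line to ScriptLauncher.runOneCommand, stopping on the first non-zero status. A minimal standalone sketch of that tokenize-per-line step (class and method names here are illustrative, not DSpace API):

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StreamTokenizer;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class CommandTokenizerSketch {
    // Split a script into one String[] of tokens per non-empty line,
    // mirroring CommandRunner's StreamTokenizer loop.
    static List<String[]> splitIntoCommands(Reader input) throws IOException {
        StreamTokenizer tokenizer = new StreamTokenizer(input);
        tokenizer.resetSyntax();                // start from a blank syntax table
        tokenizer.whitespaceChars(0, ' ');      // control chars and space separate tokens
        tokenizer.wordChars('!', 255);          // everything printable is part of a word
        tokenizer.eolIsSignificant(true);       // report end-of-line as TT_EOL
        List<String[]> commands = new ArrayList<>();
        List<String> tokens = new ArrayList<>();
        while (StreamTokenizer.TT_EOF != tokenizer.nextToken()) {
            if (StreamTokenizer.TT_EOL == tokenizer.ttype) {
                if (!tokens.isEmpty()) {        // skip blank lines
                    commands.add(tokens.toArray(new String[0]));
                    tokens.clear();
                }
            } else if (tokenizer.sval != null) {
                tokens.add(tokenizer.sval);
            }
        }
        if (!tokens.isEmpty()) {                // last line may lack a trailing newline
            commands.add(tokens.toArray(new String[0]));
        }
        return commands;
    }

    public static void main(String[] args) throws IOException {
        List<String[]> cmds = splitIntoCommands(
            new StringReader("index-discovery -b\nfilter-media\n"));
        System.out.println(cmds.size() + " commands");
    }
}
```

In the real class each token array would then go to ScriptLauncher.runOneCommand; here the split is shown in isolation.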

View File

@@ -12,7 +12,6 @@ import java.io.IOException;
import java.lang.reflect.Method;
import java.util.List;
import java.util.TreeMap;
import org.dspace.servicemanager.DSpaceKernelImpl;
import org.dspace.servicemanager.DSpaceKernelInit;
import org.dspace.services.RequestService;
@@ -26,17 +25,11 @@ import org.jdom.input.SAXBuilder;
* @author Stuart Lewis
* @author Mark Diggory
*/
public class ScriptLauncher {
/**
* The service manager kernel
*/
public class ScriptLauncher
{
/** The service manager kernel */
private static transient DSpaceKernelImpl kernelImpl;
/**
* Default constructor
*/
private ScriptLauncher() { }
/**
* Execute the DSpace script launcher
*
@@ -45,18 +38,25 @@ public class ScriptLauncher {
* @throws FileNotFoundException if file doesn't exist
*/
public static void main(String[] args)
throws FileNotFoundException, IOException {
throws FileNotFoundException, IOException
{
// Initialise the service manager kernel
try {
try
{
kernelImpl = DSpaceKernelInit.getKernel(null);
if (!kernelImpl.isRunning()) {
if (!kernelImpl.isRunning())
{
kernelImpl.start();
}
} catch (Exception e) {
} catch (Exception e)
{
// Failed to start so destroy it and log and throw an exception
try {
try
{
kernelImpl.destroy();
} catch (Exception e1) {
}
catch (Exception e1)
{
// Nothing to do
}
String message = "Failure during kernel init: " + e.getMessage();
@@ -69,7 +69,8 @@ public class ScriptLauncher {
Document commandConfigs = getConfig();
// Check that there is at least one argument (if not display command options)
if (args.length < 1) {
if (args.length < 1)
{
System.err.println("You must provide at least one command argument");
display(commandConfigs);
System.exit(1);
@@ -80,7 +81,8 @@ public class ScriptLauncher {
status = runOneCommand(commandConfigs, args);
// Destroy the service kernel if it is still alive
if (kernelImpl != null) {
if (kernelImpl != null)
{
kernelImpl.destroy();
kernelImpl = null;
}
@@ -88,29 +90,28 @@ public class ScriptLauncher {
System.exit(status);
}
protected static int runOneCommand(Document commandConfigs, String[] args) {
return runOneCommand(commandConfigs, args, kernelImpl);
}
/**
* Recognize and execute a single command.
*
* @param commandConfigs Document
* @param args the command line arguments given
* @param doc Document
* @param args arguments
*/
public static int runOneCommand(Document commandConfigs, String[] args, DSpaceKernelImpl kernelImpl) {
static int runOneCommand(Document commandConfigs, String[] args)
{
String request = args[0];
Element root = commandConfigs.getRootElement();
List<Element> commands = root.getChildren("command");
Element command = null;
for (Element candidate : commands) {
if (request.equalsIgnoreCase(candidate.getChild("name").getValue())) {
for (Element candidate : commands)
{
if (request.equalsIgnoreCase(candidate.getChild("name").getValue()))
{
command = candidate;
break;
}
}
if (null == command) {
if (null == command)
{
// The command wasn't found
System.err.println("Command not found: " + args[0]);
display(commandConfigs);
@@ -119,26 +120,33 @@ public class ScriptLauncher {
// Run each step
List<Element> steps = command.getChildren("step");
for (Element step : steps) {
for (Element step : steps)
{
// Instantiate the class
Class target = null;
// Is it the special case 'dsrun' where the user provides the class name?
String className;
if ("dsrun".equals(request)) {
if (args.length < 2) {
if ("dsrun".equals(request))
{
if (args.length < 2)
{
System.err.println("Error in launcher.xml: Missing class name");
return 1;
}
className = args[1];
} else {
}
else {
className = step.getChild("class").getValue();
}
try {
try
{
target = Class.forName(className,
true,
Thread.currentThread().getContextClassLoader());
} catch (ClassNotFoundException e) {
}
catch (ClassNotFoundException e)
{
System.err.println("Error in launcher.xml: Invalid class name: " + className);
return 1;
}
@@ -149,20 +157,26 @@ public class ScriptLauncher {
Class[] argTypes = {useargs.getClass()};
boolean passargs = true;
if ((step.getAttribute("passuserargs") != null) &&
("false".equalsIgnoreCase(step.getAttribute("passuserargs").getValue()))) {
("false".equalsIgnoreCase(step.getAttribute("passuserargs").getValue())))
{
passargs = false;
}
if ((args.length == 1) || (("dsrun".equals(request)) && (args.length == 2)) || (!passargs)) {
if ((args.length == 1) || (("dsrun".equals(request)) && (args.length == 2)) || (!passargs))
{
useargs = new String[0];
} else {
}
else
{
// The number of arguments to ignore
// If dsrun is the command, ignore the next, as it is the class name not an arg
int x = 1;
if ("dsrun".equals(request)) {
if ("dsrun".equals(request))
{
x = 2;
}
String[] argsnew = new String[useargs.length - x];
for (int i = x; i < useargs.length; i++) {
for (int i = x; i < useargs.length; i++)
{
argsnew[i - x] = useargs[i];
}
useargs = argsnew;
@@ -170,13 +184,16 @@ public class ScriptLauncher {
// Add any extra properties
List<Element> bits = step.getChildren("argument");
if (step.getChild("argument") != null) {
if (step.getChild("argument") != null)
{
String[] argsnew = new String[useargs.length + bits.size()];
int i = 0;
for (Element arg : bits) {
for (Element arg : bits)
{
argsnew[i++] = arg.getValue();
}
for (; i < bits.size() + useargs.length; i++) {
for (; i < bits.size() + useargs.length; i++)
{
argsnew[i] = useargs[i - bits.size()];
}
useargs = argsnew;
@@ -185,7 +202,8 @@ public class ScriptLauncher {
// Establish the request service startup
RequestService requestService = kernelImpl.getServiceManager().getServiceByName(
RequestService.class.getName(), RequestService.class);
if (requestService == null) {
if (requestService == null)
{
throw new IllegalStateException(
"Could not get the DSpace RequestService to start the request transaction");
}
@@ -195,7 +213,8 @@ public class ScriptLauncher {
requestService.startRequest();
// Run the main() method
try {
try
{
Object[] arguments = {useargs};
// Useful for debugging, so left in the code...
@@ -211,7 +230,9 @@ public class ScriptLauncher {
// ensure we close out the request (happy request)
requestService.endRequest(null);
} catch (Exception e) {
}
catch (Exception e)
{
// Failure occurred in the request so we destroy it
requestService.endRequest(e);
@@ -232,20 +253,20 @@ public class ScriptLauncher {
*
* @return The XML configuration file Document
*/
protected static Document getConfig() {
return getConfig(kernelImpl);
}
public static Document getConfig(DSpaceKernelImpl kernelImpl) {
protected static Document getConfig()
{
// Load the launcher configuration file
String config = kernelImpl.getConfigurationService().getProperty("dspace.dir") +
System.getProperty("file.separator") + "config" +
System.getProperty("file.separator") + "launcher.xml";
SAXBuilder saxBuilder = new SAXBuilder();
Document doc = null;
try {
try
{
doc = saxBuilder.build(config);
} catch (Exception e) {
}
catch (Exception e)
{
System.err.println("Unable to load the launcher configuration file: [dspace]/config/launcher.xml");
System.err.println(e.getMessage());
e.printStackTrace();
@@ -256,10 +277,10 @@ public class ScriptLauncher {
/**
* Display the commands that the current launcher config file knows about
*
* @param commandConfigs configs as Document
*/
private static void display(Document commandConfigs) {
private static void display(Document commandConfigs)
{
// List all command elements
List<Element> commands = commandConfigs.getRootElement().getChildren("command");
@@ -267,13 +288,15 @@ public class ScriptLauncher {
// We cannot just use commands.sort() because it tries to remove and
// reinsert Elements within other Elements, and that doesn't work.
TreeMap<String, Element> sortedCommands = new TreeMap<>();
for (Element command : commands) {
for (Element command : commands)
{
sortedCommands.put(command.getChild("name").getValue(), command);
}
// Display the sorted list
System.out.println("Usage: dspace [command-name] {parameters}");
for (Element command : sortedCommands.values()) {
for (Element command : sortedCommands.values())
{
System.out.println(" - " + command.getChild("name").getValue() +
": " + command.getChild("description").getValue());
}
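The argument-handling hunk above drops the leading entries of the user's argument array before passing the rest to a step's main(): one entry (the command name) for a normal command, two (command name plus class name) when the command is "dsrun". A self-contained sketch of that shift, under the simplifying assumption that passuserargs is true (the method name shiftUserArgs is illustrative):

```java
import java.util.Arrays;

public class LauncherArgsSketch {
    // Drop the first x entries: 1 for a normal command (the command name),
    // 2 for "dsrun" (command name plus the user-supplied class name).
    static String[] shiftUserArgs(String[] args) {
        int x = "dsrun".equals(args[0]) ? 2 : 1;
        if (args.length <= x) {
            return new String[0];               // nothing left to pass through
        }
        return Arrays.copyOfRange(args, x, args.length);
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(
            shiftUserArgs(new String[] {"dsrun", "org.example.Tool", "-v"})));
    }
}
```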

View File

@@ -7,12 +7,12 @@
*/
package org.dspace.app.mediafilter;
import java.awt.image.BufferedImage;
import java.awt.Color;
import java.awt.Font;
import java.awt.FontMetrics;
import java.awt.Graphics2D;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
/**
* Class to attach a footer to an image using ImageMagick.
@@ -20,7 +20,8 @@ import java.awt.image.BufferedImage;
* This version of the code is basically Ninh's but reorganised a little. Used with permission.
*/
public class Brand {
public class Brand
{
private int brandWidth;
private int brandHeight;
private Font font;
@@ -33,11 +34,13 @@ public class Brand {
* @param brandHeight height of the footer in pixels
* @param font font to use for text on the footer
* @param xOffset number of pixels text should be indented from left-hand side of footer
*
*/
public Brand(int brandWidth,
int brandHeight,
Font font,
int xOffset) {
int xOffset)
{
this.brandWidth = brandWidth;
this.brandHeight = brandHeight;
this.font = font;
@@ -52,34 +55,46 @@ public class Brand {
* the image is resized such that brandLeftText will not fit. <code>null</code> if not
* required
* @param brandRightText text that should appear in the bottom right of the image
*
* @return BufferedImage a BufferedImage object describing the brand image file
*/
public BufferedImage create(String brandLeftText,
String shortLeftText,
String brandRightText) {
String brandRightText)
{
BrandText[] allBrandText = null;
BufferedImage brandImage =
new BufferedImage(brandWidth, brandHeight, BufferedImage.TYPE_INT_RGB);
if (brandWidth >= 350) {
allBrandText = new BrandText[] {
if (brandWidth >= 350)
{
allBrandText = new BrandText[]
{
new BrandText(BrandText.BL, brandLeftText),
new BrandText(BrandText.BR, brandRightText)
};
} else if (brandWidth >= 190) {
allBrandText = new BrandText[] {
}
else if (brandWidth >= 190)
{
allBrandText = new BrandText[]
{
new BrandText(BrandText.BL, shortLeftText),
new BrandText(BrandText.BR, brandRightText)
};
} else {
allBrandText = new BrandText[] {
}
else
{
allBrandText = new BrandText[]
{
new BrandText(BrandText.BR, brandRightText)
};
}
if (allBrandText != null && allBrandText.length > 0) {
for (int i = 0; i < allBrandText.length; ++i) {
if (allBrandText != null && allBrandText.length > 0)
{
for (int i = 0; i < allBrandText.length; ++i)
{
drawImage(brandImage, allBrandText[i]);
}
}
@@ -96,37 +111,48 @@ public class Brand {
* position within the brand
*/
private void drawImage(BufferedImage brandImage,
BrandText brandText) {
BrandText brandText)
{
int imgWidth = brandImage.getWidth();
int imgHeight = brandImage.getHeight();
int bx, by, tx, ty, bWidth, bHeight;
Graphics2D g2 = brandImage.createGraphics();
g2.setFont(font);
FontMetrics fm = g2.getFontMetrics();
int bWidth = fm.stringWidth(brandText.getText()) + xOffset * 2 + 1;
int bHeight = fm.getHeight();
int bx = 0;
int by = 0;
bWidth = fm.stringWidth(brandText.getText()) + xOffset * 2 + 1;
bHeight = fm.getHeight();
if (brandText.getLocation().equals(BrandText.TL)) {
bx = 0;
by = 0;
} else if (brandText.getLocation().equals(BrandText.TR)) {
if (brandText.getLocation().equals(BrandText.TL))
{
bx = 0;
by = 0;
}
else if (brandText.getLocation().equals(BrandText.TR))
{
bx = imgWidth - bWidth;
by = 0;
} else if (brandText.getLocation().equals(BrandText.BL)) {
}
else if (brandText.getLocation().equals(BrandText.BL))
{
bx = 0;
by = imgHeight - bHeight;
} else if (brandText.getLocation().equals(BrandText.BR)) {
}
else if (brandText.getLocation().equals(BrandText.BR))
{
bx = imgWidth - bWidth;
by = imgHeight - bHeight;
}
Rectangle box = new Rectangle(bx, by, bWidth, bHeight);
int tx = bx + xOffset;
int ty = by + fm.getAscent();
tx = bx + xOffset;
ty = by + fm.getAscent();
g2.setColor(Color.black);
g2.fill(box);
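The drawImage hunk above places the footer box in one of four corners based on the BrandText location code ("tl", "tr", "bl", "br"). The placement arithmetic can be sketched without any AWT drawing (class and method names are illustrative):

```java
public class BrandBoxSketch {
    // Compute the top-left (bx, by) of a footer box of size (bWidth, bHeight)
    // inside an image of size (imgWidth, imgHeight), for the four corner
    // codes used by BrandText: "tl", "tr", "bl", "br".
    static int[] boxOrigin(String location, int imgWidth, int imgHeight,
                           int bWidth, int bHeight) {
        int bx = location.endsWith("r") ? imgWidth - bWidth : 0;   // right edge?
        int by = location.startsWith("b") ? imgHeight - bHeight : 0; // bottom edge?
        return new int[] {bx, by};
    }

    public static void main(String[] args) {
        int[] br = boxOrigin("br", 400, 300, 120, 20);
        System.out.println(br[0] + "," + br[1]);  // 280,280
    }
}
```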

View File

@@ -13,22 +13,15 @@ package org.dspace.app.mediafilter;
* This is a copy of Picture Australia's PiObj class re-organised with methods.
* Thanks to Ninh Nguyen at the National Library for providing the original source.
*/
class BrandText {
/**
* Bottom Left
*/
class BrandText
{
/** Bottom Left */
public static final String BL = "bl";
/**
* Bottom Right
*/
/** Bottom Right */
public static final String BR = "br";
/**
* Top Left
*/
/** Top Left */
public static final String TL = "tl";
/**
* Top Right
*/
/** Top Right */
public static final String TR = "tr";
private String location;
@@ -41,7 +34,8 @@ class BrandText {
* @param location one of the class location constants e.g. <code>Identifier.BL</code>
* @param text the text associated with the location
*/
public BrandText(String location, String text) {
public BrandText(String location, String text)
{
this.location = location;
this.text = text;
}
@@ -51,7 +45,8 @@ class BrandText {
*
* @return String one the class location constants e.g. <code>Identifier.BL</code>
*/
public String getLocation() {
public String getLocation()
{
return location;
}
@@ -61,7 +56,8 @@ class BrandText {
*
* @return String the text associated with the Identifier object
*/
public String getText() {
public String getText()
{
return text;
}
@@ -71,7 +67,8 @@ class BrandText {
*
* @param location one of the class location constants
*/
public void setLocation(String location) {
public void setLocation(String location)
{
this.location = location;
}
@@ -81,7 +78,8 @@ class BrandText {
*
* @param text any text string (typically a branding or identifier)
*/
public void setText(String text) {
public void setText(String text)
{
this.text = text;
}
}

View File

@@ -7,13 +7,16 @@
*/
package org.dspace.app.mediafilter;
import java.awt.image.BufferedImage;
import java.awt.image.*;
import java.io.InputStream;
import javax.imageio.ImageIO;
import org.dspace.content.Item;
import org.dspace.core.ConfigurationManager;
import org.dspace.app.mediafilter.JPEGFilter;
/**
* Filter image bitstreams, scaling the image to be within the bounds of
* thumbnail.maxwidth, thumbnail.maxheight, the size we want our thumbnail to be
@@ -21,17 +24,21 @@ import org.dspace.core.ConfigurationManager;
*
* @author Jason Sherman jsherman@usao.edu
*/
public class BrandedPreviewJPEGFilter extends MediaFilter {
public class BrandedPreviewJPEGFilter extends MediaFilter
{
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".preview.jpg";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "BRANDED_PREVIEW";
}
@@ -39,7 +46,8 @@ public class BrandedPreviewJPEGFilter extends MediaFilter {
* @return String bitstreamformat
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "JPEG";
}
@@ -47,21 +55,25 @@ public class BrandedPreviewJPEGFilter extends MediaFilter {
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Generated Branded Preview";
}
/**
* @param currentItem item
* @param source source input stream
* @param source
* source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
// read in bitstream's image
BufferedImage buf = ImageIO.read(source);
@@ -79,8 +91,6 @@ public class BrandedPreviewJPEGFilter extends MediaFilter {
int brandFontPoint = ConfigurationManager.getIntProperty("webui.preview.brand.fontpoint");
JPEGFilter jpegFilter = new JPEGFilter();
return jpegFilter
.getThumbDim(currentItem, buf, verbose, xmax, ymax, blurring, hqscaling, brandHeight, brandFontPoint,
brandFont);
return jpegFilter.getThumbDim(currentItem, buf, verbose, xmax, ymax, blurring, hqscaling, brandHeight, brandFontPoint, brandFont);
}
}

View File

@@ -11,11 +11,12 @@ import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;
import org.apache.log4j.Logger;
import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
import org.apache.poi.hssf.extractor.ExcelExtractor;
import org.apache.poi.xssf.extractor.XSSFExcelExtractor;
import org.apache.log4j.Logger;
import org.dspace.content.Item;
/*
@@ -34,32 +35,40 @@ import org.dspace.content.Item;
* filter.org.dspace.app.mediafilter.ExcelFilter.inputFormats = Microsoft Excel, Microsoft Excel XML
*
*/
public class ExcelFilter extends MediaFilter {
public class ExcelFilter extends MediaFilter
{
private static Logger log = Logger.getLogger(ExcelFilter.class);
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".txt";
}
/**
* @return String bundle name
*
*/
public String getBundleName() {
public String getBundleName()
{
return "TEXT";
}
/**
* @return String bitstream format
*
*
*/
public String getFormatString() {
public String getFormatString()
{
return "Text";
}
/**
* @return String description
*/
public String getDescription() {
public String getDescription()
{
return "Extracted text";
}
@@ -67,29 +76,38 @@ public class ExcelFilter extends MediaFilter {
* @param item item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item item, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
String extractedText = null;
try {
try
{
POITextExtractor theExtractor = ExtractorFactory.createExtractor(source);
if (theExtractor instanceof ExcelExtractor) {
if (theExtractor instanceof ExcelExtractor)
{
// for xls file
extractedText = (theExtractor).getText();
} else if (theExtractor instanceof XSSFExcelExtractor) {
}
else if (theExtractor instanceof XSSFExcelExtractor)
{
// for xlsx file
extractedText = (theExtractor).getText();
}
} catch (Exception e) {
}
catch (Exception e)
{
log.error("Error filtering bitstream: " + e.getMessage(), e);
throw e;
}
if (extractedText != null) {
if (extractedText != null)
{
// generate an input stream with the extracted text
return IOUtils.toInputStream(extractedText, StandardCharsets.UTF_8);
}

View File

@@ -18,11 +18,13 @@ import org.dspace.core.Context;
* from one format to another. This interface should be implemented by any class
* which defines a "filter" to be run by the MediaFilterManager.
*/
public interface FormatFilter {
public interface FormatFilter
{
/**
* Get a filename for a newly created filtered bitstream
*
* @param sourceName name of source bitstream
* @param sourceName
* name of source bitstream
* @return filename generated by the filter - for example, document.pdf
* becomes document.pdf.txt
*/
@@ -51,8 +53,10 @@ public interface FormatFilter {
* Read the source stream and produce the filtered content.
*
* @param item Item
* @param source input stream
* @param source
* input stream
* @param verbose verbosity flag
*
* @return result of filter's transformation as a byte stream.
* @throws Exception if error
*/
@@ -67,10 +71,12 @@ public interface FormatFilter {
* is necessary). Return false if bitstream should be skipped
* for any reason.
*
*
* @param c context
* @param item item containing bitstream to process
* @param source source bitstream to be processed
* @param verbose verbose mode
*
* @return true if bitstream processing should continue,
* false if this bitstream should be skipped
* @throws Exception if error
@@ -86,9 +92,13 @@ public interface FormatFilter {
* is necessary). Return false if bitstream should be skipped
* for some reason.
*
* @param c context
* @param item item containing bitstream to process
* @param generatedBitstream the bitstream which was generated by
*
* @param c
* context
* @param item
* item containing bitstream to process
* @param generatedBitstream
* the bitstream which was generated by
* this filter.
* @throws Exception if error
*/

View File

@@ -7,31 +7,36 @@
*/
package org.dspace.app.mediafilter;
import org.dspace.content.Item;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.swing.text.Document;
import javax.swing.text.html.HTMLEditorKit;
import org.dspace.content.Item;
/*
*
* to do: helpful error messages - can't find mediafilter.cfg - can't
* instantiate filter - bitstream format doesn't exist
*
*/
public class HTMLFilter extends MediaFilter {
public class HTMLFilter extends MediaFilter
{
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".txt";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "TEXT";
}
@@ -39,7 +44,8 @@ public class HTMLFilter extends MediaFilter {
* @return String bitstreamformat
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "Text";
}
@@ -47,7 +53,8 @@ public class HTMLFilter extends MediaFilter {
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Extracted text";
}
@@ -55,12 +62,14 @@ public class HTMLFilter extends MediaFilter {
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
// try and read the document - set to ignore character set directive,
// assuming that the input stream is already set properly (I hope)
HTMLEditorKit kit = new HTMLEditorKit();
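HTMLFilter's extraction approach, visible in the hunk above, is to parse the HTML into a Swing Document via HTMLEditorKit (with charset directives ignored) and then read back the plain text. A minimal sketch of that round trip (extractText is an illustrative name, not the filter's API):

```java
import java.io.StringReader;
import javax.swing.text.Document;
import javax.swing.text.html.HTMLEditorKit;

public class HtmlTextSketch {
    // Extract plain text from HTML the way HTMLFilter does: parse into a
    // Swing Document, ignoring any charset directive in the markup, then
    // read the document's text body.
    static String extractText(String html) throws Exception {
        HTMLEditorKit kit = new HTMLEditorKit();
        Document doc = kit.createDefaultDocument();
        doc.putProperty("IgnoreCharsetDirective", Boolean.TRUE);
        kit.read(new StringReader(html), doc, 0);
        return doc.getText(0, doc.getLength());
    }

    public static void main(String[] args) throws Exception {
        System.out.println(
            extractText("<html><body><p>hello world</p></body></html>").trim());
    }
}
```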

View File

@@ -7,41 +7,48 @@
*/
package org.dspace.app.mediafilter;
import org.dspace.content.Item;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import org.dspace.content.Item;
/**
* Filter image bitstreams, scaling the image to be within the bounds of
* thumbnail.maxwidth, thumbnail.maxheight, the size we want our thumbnail to be
* no bigger than. Creates only JPEGs.
*/
public class ImageMagickImageThumbnailFilter extends ImageMagickThumbnailFilter {
public class ImageMagickImageThumbnailFilter extends ImageMagickThumbnailFilter
{
/**
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
File f = inputStreamToTempFile(source, "imthumb", ".tmp");
File f2 = null;
try {
try
{
f2 = getThumbnailFile(f, verbose);
byte[] bytes = Files.readAllBytes(f2.toPath());
return new ByteArrayInputStream(bytes);
} finally {
}
finally
{
//noinspection ResultOfMethodCallIgnored
f.delete();
if (f2 != null) {
if (f2 != null)
{
//noinspection ResultOfMethodCallIgnored
f2.delete();
}
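The try/finally hunks above follow a consistent pattern in the ImageMagick filters: spill the source stream to a temp file (im4java works on files, not streams), run the conversion, and delete every temp file in finally. A sketch of the pattern with the conversion step stubbed out (sizeViaTempFile is illustrative; the real code calls inputStreamToTempFile and getThumbnailFile):

```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class TempFileSketch {
    // Copy an InputStream to a temp file, hand it to a processing step, and
    // delete the temp file in finally so failures never leak files.
    static long sizeViaTempFile(InputStream source) throws Exception {
        File f = File.createTempFile("imthumb", ".tmp");
        try {
            Files.copy(source, f.toPath(), StandardCopyOption.REPLACE_EXISTING);
            return f.length();                  // stand-in for the real conversion
        } finally {
            f.delete();                         // always clean up, even on error
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[] {1, 2, 3, 4};
        System.out.println(sizeViaTempFile(new ByteArrayInputStream(data)));
    }
}
```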

View File

@@ -7,33 +7,39 @@
*/
package org.dspace.app.mediafilter;
import org.dspace.content.Item;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;
import org.dspace.content.Item;
public class ImageMagickPdfThumbnailFilter extends ImageMagickThumbnailFilter {
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
File f = inputStreamToTempFile(source, "impdfthumb", ".pdf");
File f2 = null;
File f3 = null;
try {
try
{
f2 = getImageFile(f, 0, verbose);
f3 = getThumbnailFile(f2, verbose);
byte[] bytes = Files.readAllBytes(f3.toPath());
return new ByteArrayInputStream(bytes);
} finally {
}
finally
{
//noinspection ResultOfMethodCallIgnored
f.delete();
if (f2 != null) {
if (f2 != null)
{
//noinspection ResultOfMethodCallIgnored
f2.delete();
}
if (f3 != null) {
if (f3 != null)
{
//noinspection ResultOfMethodCallIgnored
f3.delete();
}

View File

@@ -14,19 +14,22 @@ import java.io.InputStream;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;
import javax.imageio.ImageIO;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.im4java.core.ConvertCmd;
import org.im4java.core.Info;
import org.im4java.core.IM4JavaException;
import org.im4java.core.IMOperation;
import org.im4java.core.Info;
import org.im4java.process.ProcessStarter;
import org.dspace.core.ConfigurationManager;
/**
* Filter image bitstreams, scaling the image to be within the bounds of
* thumbnail.maxwidth, thumbnail.maxheight, the size we want our thumbnail to be
@@ -75,6 +78,7 @@ public abstract class ImageMagickThumbnailFilter extends MediaFilter {
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
@@ -143,7 +147,7 @@ public abstract class ImageMagickThumbnailFilter extends MediaFilter {
// PDFs using the CMYK color system can be handled specially if
// profiles are defined
if (cmyk_profile != null && srgb_profile != null) {
Info imageInfo = new Info(f.getAbsolutePath(), true);
Info imageInfo = new Info(f.getAbsolutePath() + s, true);
String imageClass = imageInfo.getImageClass();
if (imageClass.contains("CMYK")) {
op.profile(cmyk_profile);
@@ -166,11 +170,10 @@ public abstract class ImageMagickThumbnailFilter extends MediaFilter {
String n = bit.getName();
if (n != null) {
if (nsrc != null) {
if (!n.startsWith(nsrc)) {
if (!n.startsWith(nsrc))
continue;
}
}
}
String description = bit.getDescription();
// If anything other than a generated thumbnail
// is found, halt processing

View File

@@ -7,18 +7,16 @@
*/
package org.dspace.app.mediafilter;
import java.awt.Color;
import java.awt.Font;
import java.awt.Graphics2D;
import java.awt.Color;
import java.awt.image.*;
import java.awt.RenderingHints;
import java.awt.Transparency;
import java.awt.image.BufferedImage;
import java.awt.image.BufferedImageOp;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;
import java.awt.Font;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import javax.imageio.ImageIO;
import org.dspace.content.Item;
@@ -31,17 +29,21 @@ import org.dspace.core.ConfigurationManager;
*
* @author Jason Sherman jsherman@usao.edu
*/
public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats {
public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
{
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".jpg";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "THUMBNAIL";
}
@@ -49,7 +51,8 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
* @return String bitstreamformat
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "JPEG";
}
@@ -57,7 +60,8 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Generated Thumbnail";
}
@@ -65,12 +69,14 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
// read in bitstream's image
BufferedImage buf = ImageIO.read(source);
@@ -78,7 +84,8 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
}
public InputStream getThumb(Item currentItem, BufferedImage buf, boolean verbose)
throws Exception {
throws Exception
{
// get config params
float xmax = (float) ConfigurationManager
.getIntProperty("thumbnail.maxwidth");
@@ -92,28 +99,30 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
return getThumbDim(currentItem, buf, verbose, xmax, ymax, blurring, hqscaling, 0, 0, null);
}
public InputStream getThumbDim(Item currentItem, BufferedImage buf, boolean verbose, float xmax, float ymax,
boolean blurring, boolean hqscaling, int brandHeight, int brandFontPoint,
String brandFont)
throws Exception {
public InputStream getThumbDim(Item currentItem, BufferedImage buf, boolean verbose, float xmax, float ymax, boolean blurring, boolean hqscaling, int brandHeight, int brandFontPoint, String brandFont)
throws Exception
{
// now get the image dimensions
float xsize = (float) buf.getWidth(null);
float ysize = (float) buf.getHeight(null);
// if verbose flag is set, print out dimensions
// to STDOUT
if (verbose) {
if (verbose)
{
System.out.println("original size: " + xsize + "," + ysize);
}
// scale by x first if needed
if (xsize > xmax) {
if (xsize > xmax)
{
// calculate scaling factor so that xsize * scale = new size (max)
float scale_factor = xmax / xsize;
// if verbose flag is set, print out extracted text
// to STDOUT
if (verbose) {
if (verbose)
{
System.out.println("x scale factor: " + scale_factor);
}
@@ -124,13 +133,15 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
// if verbose flag is set, print out extracted text
// to STDOUT
if (verbose) {
if (verbose)
{
System.out.println("size after fitting to maximum width: " + xsize + "," + ysize);
}
}
// scale by y if needed
if (ysize > ymax) {
if (ysize > ymax)
{
float scale_factor = ymax / ysize;
// now reduce x size
@@ -140,7 +151,8 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
}
// if verbose flag is set, print details to STDOUT
if (verbose) {
if (verbose)
{
System.out.println("size after fitting to maximum height: " + xsize + ", "
+ ysize);
}
@@ -151,14 +163,16 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
// Use blurring if selected in config.
// a little blur before scaling does wonders for keeping moire in check.
if (blurring) {
if (blurring)
{
// send the buffered image off to get blurred.
buf = getBlurredInstance((BufferedImage) buf);
}
// Use high quality scaling method if selected in config.
// this has a definite performance penalty.
if (hqscaling) {
if (hqscaling)
{
// send the buffered image off to get an HQ downscale.
buf = getScaledInstance((BufferedImage) buf, (int) xsize, (int) ysize,
(Object) RenderingHints.VALUE_INTERPOLATION_BICUBIC, (boolean) true);
@@ -190,27 +204,32 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
@Override
public String[] getInputMIMETypes() {
public String[] getInputMIMETypes()
{
return ImageIO.getReaderMIMETypes();
}
@Override
public String[] getInputDescriptions() {
public String[] getInputDescriptions()
{
return null;
}
@Override
public String[] getInputExtensions() {
public String[] getInputExtensions()
{
// Temporarily disabled as JDK 1.6 only
// return ImageIO.getReaderFileSuffixes();
return null;
}
public BufferedImage getNormalizedInstance(BufferedImage buf) {
public BufferedImage getNormalizedInstance(BufferedImage buf)
{
int type = (buf.getTransparency() == Transparency.OPAQUE) ?
BufferedImage.TYPE_INT_RGB : BufferedImage.TYPE_INT_ARGB_PRE;
int w = buf.getWidth();
int h = buf.getHeight();
int w, h;
w = buf.getWidth();
h = buf.getHeight();
BufferedImage normal = new BufferedImage(w, h, type);
Graphics2D g2d = normal.createGraphics();
g2d.drawImage(buf, 0, 0, w, h, Color.WHITE, null);
@@ -225,7 +244,8 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
* @param buf buffered image
* @return updated BufferedImage
*/
public BufferedImage getBlurredInstance(BufferedImage buf) {
public BufferedImage getBlurredInstance(BufferedImage buf)
{
buf = getNormalizedInstance(buf);
// kernel for blur op
@@ -267,12 +287,12 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
int targetWidth,
int targetHeight,
Object hint,
boolean higherQuality) {
boolean higherQuality)
{
int type = (buf.getTransparency() == Transparency.OPAQUE) ?
BufferedImage.TYPE_INT_RGB : BufferedImage.TYPE_INT_ARGB;
BufferedImage scalebuf = (BufferedImage)buf;
int w;
int h;
int w, h;
if (higherQuality) {
// Use multi-step technique: start with original size, then
// scale down in multiple passes with drawImage()
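The scaling passes in JPEGFilter above first clamp the width, then the height, so that the thumbnail always fits inside both maxima while keeping its aspect ratio. A minimal, self-contained sketch of that fit-to-bounds arithmetic (class and method names here are illustrative only, not part of the DSpace API):

```java
// Hypothetical standalone sketch of JPEGFilter's fit-to-maximum scaling logic.
public class FitToBounds {
    // Returns {width, height} scaled to fit within xmax/ymax, preserving aspect ratio.
    static float[] fit(float xsize, float ysize, float xmax, float ymax) {
        if (xsize > xmax) {
            float scaleFactor = xmax / xsize;
            xsize = xmax;
            ysize *= scaleFactor; // reduce height by the same factor
        }
        if (ysize > ymax) {
            float scaleFactor = ymax / ysize;
            ysize = ymax;
            xsize *= scaleFactor; // reduce width by the same factor
        }
        return new float[] {xsize, ysize};
    }

    public static void main(String[] args) {
        float[] r = fit(2000f, 1000f, 400f, 400f);
        System.out.println((int) r[0] + "x" + (int) r[1]); // prints "400x200"
    }
}
```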


@@ -18,7 +18,8 @@ import org.dspace.core.Context;
* by the MediaFilterManager. More complex filters should likely implement the FormatFilter
* interface directly, so that they can define their own pre/postProcessing methods.
*/
public abstract class MediaFilter implements FormatFilter {
public abstract class MediaFilter implements FormatFilter
{
/**
* Perform any pre-processing of the source bitstream *before* the actual
* filtering takes place in MediaFilterManager.processBitstream().
@@ -27,17 +28,20 @@ public abstract class MediaFilter implements FormatFilter {
* is necessary). Return false if bitstream should be skipped
* for any reason.
*
*
* @param c context
* @param item item containing bitstream to process
* @param source source bitstream to be processed
* @param verbose verbose mode
*
* @return true if bitstream processing should continue,
* false if this bitstream should be skipped
* @throws Exception if error
*/
@Override
public boolean preProcessBitstream(Context c, Item item, Bitstream source, boolean verbose)
throws Exception {
throws Exception
{
return true; //default to no pre-processing
}
@@ -49,15 +53,20 @@ public abstract class MediaFilter implements FormatFilter {
* is necessary). Return false if bitstream should be skipped
* for some reason.
*
* @param c context
* @param item item containing bitstream to process
* @param generatedBitstream the bitstream which was generated by
*
* @param c
* context
* @param item
* item containing bitstream to process
* @param generatedBitstream
* the bitstream which was generated by
* this filter.
* @throws Exception if error
* @throws java.lang.Exception
*/
@Override
public void postProcessBitstream(Context c, Item item, Bitstream generatedBitstream)
throws Exception {
throws Exception
{
//default to no post-processing necessary
}
}
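The abstract MediaFilter class above supplies no-op defaults for preProcessBitstream and postProcessBitstream, so simple filters only implement the actual filtering while MediaFilterManager drives the hooks. A heavily simplified, self-contained sketch of that hook pattern (the interface and all names below are illustrative; the real DSpace FormatFilter methods take Context, Item, and Bitstream parameters):

```java
// Hypothetical sketch of the pre-process / process / post-process hook pattern.
interface Filter {
    default boolean preProcess(String source) { return true; } // return false to skip
    String process(String source);                             // the actual filtering
    default void postProcess(String result) { }                // optional cleanup
}

public class HookDemo {
    // Mirrors MediaFilterService: skip when preProcess vetoes, else filter and post-process.
    static String run(Filter f, String source) {
        if (!f.preProcess(source)) {
            return null; // bitstream skipped
        }
        String result = f.process(source);
        f.postProcess(result);
        return result;
    }

    public static void main(String[] args) {
        Filter upper = s -> s.toUpperCase();
        System.out.println(run(upper, "thumbnail")); // prints "THUMBNAIL"
    }
}
```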


@@ -7,22 +7,7 @@
*/
package org.dspace.app.mediafilter;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.MissingArgumentException;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.OptionBuilder;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.lang.ArrayUtils;
import org.apache.commons.cli.*;
import org.dspace.app.mediafilter.factory.MediaFilterServiceFactory;
import org.dspace.app.mediafilter.service.MediaFilterService;
import org.dspace.content.Collection;
@@ -34,6 +19,9 @@ import org.dspace.core.Context;
import org.dspace.core.SelfNamedPlugin;
import org.dspace.core.factory.CoreServiceFactory;
import org.dspace.handle.factory.HandleServiceFactory;
import java.util.*;
import org.apache.commons.lang.ArrayUtils;
import org.dspace.services.factory.DSpaceServicesFactory;
/**
@@ -56,12 +44,8 @@ public class MediaFilterCLITool {
//suffix (in dspace.cfg) for input formats supported by each filter
private static final String INPUT_FORMATS_SUFFIX = "inputFormats";
/**
* Default constructor
*/
private MediaFilterCLITool() { }
public static void main(String[] argv) throws Exception {
public static void main(String[] argv) throws Exception
{
// set headless for non-gui workstations
System.setProperty("java.awt.headless", "true");
@@ -115,39 +99,48 @@ public class MediaFilterCLITool {
Map<String, List<String>> filterFormats = new HashMap<>();
CommandLine line = null;
try {
try
{
line = parser.parse(options, argv);
} catch (MissingArgumentException e) {
}
catch(MissingArgumentException e)
{
System.out.println("ERROR: " + e.getMessage());
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("MediaFilterManager\n", options);
System.exit(1);
}
if (line.hasOption('h')) {
if (line.hasOption('h'))
{
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("MediaFilterManager\n", options);
System.exit(0);
}
if (line.hasOption('v')) {
if (line.hasOption('v'))
{
isVerbose = true;
}
isQuiet = line.hasOption('q');
if (line.hasOption('f')) {
if (line.hasOption('f'))
{
isForce = true;
}
if (line.hasOption('i')) {
if (line.hasOption('i'))
{
identifier = line.getOptionValue('i');
}
if (line.hasOption('m')) {
if (line.hasOption('m'))
{
max2Process = Integer.parseInt(line.getOptionValue('m'));
if (max2Process <= 1) {
if (max2Process <= 1)
{
System.out.println("Invalid maximum value '" +
line.getOptionValue('m') + "' - ignoring");
max2Process = Integer.MAX_VALUE;
@@ -155,21 +148,24 @@ public class MediaFilterCLITool {
}
String filterNames[] = null;
if (line.hasOption('p')) {
if(line.hasOption('p'))
{
//specified which media filter plugins we are using
filterNames = line.getOptionValues('p');
if (filterNames == null || filterNames.length == 0) { //display error, since no plugins specified
if(filterNames==null || filterNames.length==0)
{ //display error, since no plugins specified
System.err.println("\nERROR: -p (-plugin) option requires at least one plugin to be specified.\n" +
"(e.g. MediaFilterManager -p \"Word Text Extractor\",\"PDF Text Extractor\")\n");
HelpFormatter myhelp = new HelpFormatter();
myhelp.printHelp("MediaFilterManager\n", options);
System.exit(1);
}
} else {
}
else
{
//retrieve list of all enabled media filter plugins!
filterNames = DSpaceServicesFactory.getInstance().getConfigurationService()
.getArrayProperty(MEDIA_FILTER_PLUGINS_KEY);
filterNames = DSpaceServicesFactory.getInstance().getConfigurationService().getArrayProperty(MEDIA_FILTER_PLUGINS_KEY);
}
MediaFilterService mediaFilterService = MediaFilterServiceFactory.getInstance().getMediaFilterService();
@@ -182,16 +178,17 @@ public class MediaFilterCLITool {
List<FormatFilter> filterList = new ArrayList<FormatFilter>();
//set up each filter
for (int i = 0; i < filterNames.length; i++) {
for(int i=0; i< filterNames.length; i++)
{
//get filter of this name & add to list of filters
FormatFilter filter = (FormatFilter) CoreServiceFactory.getInstance().getPluginService()
.getNamedPlugin(FormatFilter.class, filterNames[i]);
if (filter == null) {
System.err.println(
"\nERROR: Unknown MediaFilter specified (either from command-line or in dspace.cfg): '" +
filterNames[i] + "'");
FormatFilter filter = (FormatFilter) CoreServiceFactory.getInstance().getPluginService().getNamedPlugin(FormatFilter.class, filterNames[i]);
if(filter==null)
{
System.err.println("\nERROR: Unknown MediaFilter specified (either from command-line or in dspace.cfg): '" + filterNames[i] + "'");
System.exit(1);
} else {
}
else
{
filterList.add(filter);
String filterClassName = filter.getClass().getName();
@@ -203,7 +200,8 @@ public class MediaFilterCLITool {
//each "named" plugin that it defines.
//So, we have to look for every key that fits the
//following format: filter.<class-name>.<plugin-name>.inputFormats
if (SelfNamedPlugin.class.isAssignableFrom(filter.getClass())) {
if( SelfNamedPlugin.class.isAssignableFrom(filter.getClass()) )
{
//Get the plugin instance name for this class
pluginName = ((SelfNamedPlugin) filter).getPluginInstanceName();
}
@@ -221,28 +219,31 @@ public class MediaFilterCLITool {
"." + INPUT_FORMATS_SUFFIX);
//add to internal map of filters to supported formats
if (ArrayUtils.isNotEmpty(formats)) {
if (ArrayUtils.isNotEmpty(formats))
{
//For SelfNamedPlugins, map key is:
// <class-name><separator><plugin-name>
//For other MediaFilters, map key is just:
// <class-name>
filterFormats.put(filterClassName +
(pluginName != null ? MediaFilterService.FILTER_PLUGIN_SEPARATOR +
pluginName : ""),
(pluginName!=null ? MediaFilterService.FILTER_PLUGIN_SEPARATOR + pluginName : ""),
Arrays.asList(formats));
}
}//end if filter!=null
}//end for
//If verbose, print out loaded mediafilter info
if (isVerbose) {
if(isVerbose)
{
System.out.println("The following MediaFilters are enabled: ");
Iterator<String> i = filterFormats.keySet().iterator();
while (i.hasNext()) {
while(i.hasNext())
{
String filterName = i.next();
System.out.println("Full Filter Name: " + filterName);
String pluginName = null;
if (filterName.contains(MediaFilterService.FILTER_PLUGIN_SEPARATOR)) {
if(filterName.contains(MediaFilterService.FILTER_PLUGIN_SEPARATOR))
{
String[] fields = filterName.split(MediaFilterService.FILTER_PLUGIN_SEPARATOR);
filterName=fields[0];
pluginName=fields[1];
@@ -260,11 +261,13 @@ public class MediaFilterCLITool {
//Retrieve list of identifiers to skip (if any)
String skipIds[] = null;
if (line.hasOption('s')) {
if(line.hasOption('s'))
{
//specified which identifiers to skip when processing
skipIds = line.getOptionValues('s');
if (skipIds == null || skipIds.length == 0) { //display error, since no identifiers specified to skip
if(skipIds==null || skipIds.length==0)
{ //display error, since no identifiers specified to skip
System.err.println("\nERROR: -s (-skip) option requires at least one identifier to SKIP.\n" +
"Make sure to separate multiple identifiers with a comma!\n" +
"(e.g. MediaFilterManager -s 123456789/34,123456789/323)\n");
@@ -279,24 +282,29 @@ public class MediaFilterCLITool {
Context c = null;
try {
try
{
c = new Context();
// have to be super-user to do the filtering
c.turnOffAuthorisationSystem();
// now apply the filters
if (identifier == null) {
if (identifier == null)
{
mediaFilterService.applyFiltersAllItems(c);
} else {
// restrict application scope to identifier
}
else // restrict application scope to identifier
{
DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService().resolveToObject(c, identifier);
if (dso == null) {
if (dso == null)
{
throw new IllegalArgumentException("Cannot resolve "
+ identifier + " to a DSpace object");
}
switch (dso.getType()) {
switch (dso.getType())
{
case Constants.COMMUNITY:
mediaFilterService.applyFiltersCommunity(c, (Community) dso);
break;
@@ -306,17 +314,20 @@ public class MediaFilterCLITool {
case Constants.ITEM:
mediaFilterService.applyFiltersItem(c, (Item) dso);
break;
default:
break;
}
}
c.complete();
c = null;
} catch (Exception e) {
}
catch (Exception e)
{
status = 1;
} finally {
if (c != null) {
}
finally
{
if (c != null)
{
c.abort();
}
}
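MediaFilterCLITool's main method above follows the usual DSpace Context lifecycle: complete() on success, null out the reference, and abort() in finally so a failure rolls back uncommitted work. A self-contained sketch of that open/complete/abort shape (Tx here is a stand-in class, not the real org.dspace.core.Context):

```java
// Hypothetical sketch of the Context complete/abort pattern used in the CLI tool.
public class TxDemo {
    static class Tx {
        boolean completed, aborted;
        void complete() { completed = true; }
        void abort()    { aborted = true; }
    }

    // Returns 0 on success, 1 on failure, mirroring the tool's exit status.
    static int runWork(boolean fail) {
        int status = 0;
        Tx c = null;
        try {
            c = new Tx();
            if (fail) {
                throw new RuntimeException("simulated filtering error");
            }
            c.complete();
            c = null;            // finished cleanly, so finally must not abort
        } catch (Exception e) {
            status = 1;
        } finally {
            if (c != null) {
                c.abort();       // roll back anything uncommitted
            }
        }
        return status;
    }

    public static void main(String[] args) {
        System.out.println(runWork(false) + "," + runWork(true)); // prints "0,1"
    }
}
```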


@@ -7,28 +7,11 @@
*/
package org.dspace.app.mediafilter;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.dspace.app.mediafilter.service.MediaFilterService;
import org.dspace.authorize.service.AuthorizeService;
import org.dspace.content.Bitstream;
import org.dspace.content.BitstreamFormat;
import org.dspace.content.Bundle;
import org.dspace.content.*;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.content.service.BitstreamFormatService;
import org.dspace.content.service.BitstreamService;
import org.dspace.content.service.BundleService;
import org.dspace.content.service.CollectionService;
import org.dspace.content.service.CommunityService;
import org.dspace.content.service.ItemService;
import org.dspace.content.service.*;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.core.SelfNamedPlugin;
@@ -38,6 +21,10 @@ import org.dspace.services.ConfigurationService;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.annotation.Autowired;
import java.io.IOException;
import java.io.InputStream;
import java.util.*;
/**
* MediaFilterManager is the class that invokes the media/format filters over the
* repository's content. A few command line flags affect the operation of the
@@ -47,7 +34,8 @@ import org.springframework.beans.factory.annotation.Autowired;
* scope to a community, collection or item; and -m [max] limits processing to a
* maximum number of items.
*/
public class MediaFilterServiceImpl implements MediaFilterService, InitializingBean {
public class MediaFilterServiceImpl implements MediaFilterService, InitializingBean
{
@Autowired(required = true)
protected AuthorizeService authorizeService;
@Autowired(required = true)
@@ -85,14 +73,14 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
protected boolean isQuiet = false;
protected boolean isForce = false; // default to not forced
protected MediaFilterServiceImpl() {
protected MediaFilterServiceImpl()
{
}
@Override
public void afterPropertiesSet() throws Exception {
String[] publicPermissionFilters = configurationService
.getArrayProperty("filter.org.dspace.app.mediafilter.publicPermission");
String[] publicPermissionFilters = configurationService.getArrayProperty("filter.org.dspace.app.mediafilter.publicPermission");
if(publicPermissionFilters != null) {
for(String filter : publicPermissionFilters) {
@@ -102,8 +90,10 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
}
@Override
public void applyFiltersAllItems(Context context) throws Exception {
if (skipList != null) {
public void applyFiltersAllItems(Context context) throws Exception
{
if(skipList!=null)
{
//if a skip-list exists, we need to filter community-by-community
//so we can respect what is in the skip-list
List<Community> topLevelCommunities = communityService.findAllTop(context);
@@ -111,10 +101,13 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
for (Community topLevelCommunity : topLevelCommunities) {
applyFiltersCommunity(context, topLevelCommunity);
}
} else {
}
else
{
//otherwise, just find every item and process
Iterator<Item> itemIterator = itemService.findAll(context);
while (itemIterator.hasNext() && processed < max2Process) {
while (itemIterator.hasNext() && processed < max2Process)
{
applyFiltersItem(context, itemIterator.next());
}
}
@@ -122,8 +115,10 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
@Override
public void applyFiltersCommunity(Context context, Community community)
throws Exception { //only apply filters if community not in skip-list
if (!inSkipList(community.getHandle())) {
throws Exception
{ //only apply filters if community not in skip-list
if(!inSkipList(community.getHandle()))
{
List<Community> subcommunities = community.getSubcommunities();
for (Community subcommunity : subcommunities) {
applyFiltersCommunity(context, subcommunity);
@@ -138,25 +133,31 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
@Override
public void applyFiltersCollection(Context context, Collection collection)
throws Exception {
throws Exception
{
//only apply filters if collection not in skip-list
if (!inSkipList(collection.getHandle())) {
if(!inSkipList(collection.getHandle()))
{
Iterator<Item> itemIterator = itemService.findAllByCollection(context, collection);
while (itemIterator.hasNext() && processed < max2Process) {
while (itemIterator.hasNext() && processed < max2Process)
{
applyFiltersItem(context, itemIterator.next());
}
}
}
@Override
public void applyFiltersItem(Context c, Item item) throws Exception {
public void applyFiltersItem(Context c, Item item) throws Exception
{
//only apply filters if item not in skip-list
if (!inSkipList(item.getHandle())) {
if(!inSkipList(item.getHandle()))
{
//cache this item in MediaFilterManager
//so it can be accessed by MediaFilters as necessary
currentItem = item;
if (filterItem(c, item)) {
if (filterItem(c, item))
{
// increment processed count
++processed;
}
@@ -167,7 +168,8 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
}
@Override
public boolean filterItem(Context context, Item myItem) throws Exception {
public boolean filterItem(Context context, Item myItem) throws Exception
{
// get 'original' bundles
List<Bundle> myBundles = itemService.getBundles(myItem, "ORIGINAL");
boolean done = false;
@@ -184,7 +186,8 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
@Override
public boolean filterBitstream(Context context, Item myItem,
Bitstream myBitstream) throws Exception {
Bitstream myBitstream) throws Exception
{
boolean filtered = false;
// iterate through filter classes. A single format may be actioned
@@ -220,7 +223,7 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
} catch (Exception e) {
String handle = myItem.getHandle();
List<Bundle> bundles = myBitstream.getBundles();
long size = myBitstream.getSize();
long size = myBitstream.getSizeBytes();
String checksum = myBitstream.getChecksum() + " (" + myBitstream.getChecksumAlgorithm() + ")";
int assetstore = myBitstream.getStoreNumber();
@@ -299,9 +302,11 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
@Override
public boolean processBitstream(Context context, Item item, Bitstream source, FormatFilter formatFilter)
throws Exception {
throws Exception
{
//do pre-processing of this bitstream, and if it fails, skip this bitstream!
if (!formatFilter.preProcessBitstream(context, item, source, isVerbose)) {
if(!formatFilter.preProcessBitstream(context, item, source, isVerbose))
{
return false;
}
@@ -310,29 +315,29 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
// get bitstream filename, calculate destination filename
String newName = formatFilter.getFilteredName(source.getName());
Bitstream existingBitstream = null; // is there an existing rendition?
Bundle targetBundle = null; // bundle we're modifying
List<Bundle> bundles = itemService.getBundles(item, formatFilter.getBundleName());
// check if destination bitstream exists
if (bundles.size() > 0) {
Bundle existingBundle = null;
Bitstream existingBitstream = null;
List<Bundle> bundles = itemService.getBundles(item, formatFilter.getBundleName());
if (bundles.size() > 0)
{
// only finds the last match (FIXME?)
for (Bundle bundle : bundles) {
List<Bitstream> bitstreams = bundle.getBitstreams();
for (Bitstream bitstream : bitstreams) {
if (bitstream.getName().trim().equals(newName.trim())) {
targetBundle = bundle;
existingBundle = bundle;
existingBitstream = bitstream;
}
}
}
}
// if exists and overwrite = false, exit
if (!overWrite && (existingBitstream != null)) {
if (!isQuiet) {
if (!overWrite && (existingBitstream != null))
{
if (!isQuiet)
{
System.out.println("SKIPPED: bitstream " + source.getID()
+ " (item: " + item.getHandle() + ") because '" + newName + "' already exists");
}
@@ -345,40 +350,44 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
+ " (item: " + item.getHandle() + ")");
}
InputStream destStream;
try {
System.out.println("File: " + newName);
destStream = formatFilter.getDestinationStream(item, bitstreamService.retrieve(context, source), isVerbose);
// start filtering of the bitstream, using try with resource to close all InputStreams properly
try (
// get the source stream
InputStream srcStream = bitstreamService.retrieve(context, source);
// filter the source stream to produce the destination stream
// this is the hard work, check for OutOfMemoryErrors at the end of the try clause.
InputStream destStream = formatFilter.getDestinationStream(item, srcStream, isVerbose);
) {
if (destStream == null) {
if (!isQuiet) {
System.out.println("SKIPPED: bitstream " + source.getID()
+ " (item: " + item.getHandle() + ") because filtering was unsuccessful");
}
return false;
}
} catch (OutOfMemoryError oome) {
System.out.println("!!! OutOfMemoryError !!!");
return false;
}
Bundle targetBundle; // bundle we're modifying
if (bundles.size() < 1)
{
// create new bundle if needed
if (bundles.size() < 1) {
targetBundle = bundleService.create(context, item, formatFilter.getBundleName());
} else {
// take the first match
}
else
{
// take the first match as we already looked out for the correct bundle name
targetBundle = bundles.get(0);
}
// create bitstream to store the filter result
Bitstream b = bitstreamService.create(context, targetBundle, destStream);
// Now set the format and name of the bitstream
// set the name, source and description of the bitstream
b.setName(context, newName);
b.setSource(context, "Written by FormatFilter " + formatFilter.getClass().getName() +
" on " + DCDate.getCurrent() + " (GMT).");
b.setDescription(context, formatFilter.getDescription());
// Find the proper format
// Set the format of the bitstream
BitstreamFormat bf = bitstreamFormatService.findByShortDescription(context,
formatFilter.getFormatString());
bitstreamService.setFormat(context, b, bf);
@@ -398,36 +407,49 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
authorizeService.inheritPolicies(context, source, b);
}
// fixme - set date?
// we are overwriting, so remove old bitstream
if (existingBitstream != null) {
bundleService.removeBitstream(context, targetBundle, existingBitstream);
//do post-processing of the generated bitstream
formatFilter.postProcessBitstream(context, item, b);
} catch (OutOfMemoryError oome) {
System.out.println("!!! OutOfMemoryError !!!");
}
if (!isQuiet) {
// fixme - set date?
// we are overwriting, so remove old bitstream
if (existingBitstream != null)
{
bundleService.removeBitstream(context, existingBundle, existingBitstream);
}
if (!isQuiet)
{
System.out.println("FILTERED: bitstream " + source.getID()
+ " (item: " + item.getHandle() + ") and created '" + newName + "'");
}
//do post-processing of the generated bitstream
formatFilter.postProcessBitstream(context, item, b);
return true;
}
@Override
public Item getCurrentItem() {
public Item getCurrentItem()
{
return currentItem;
}
@Override
public boolean inSkipList(String identifier) {
if (skipList != null && skipList.contains(identifier)) {
if (!isQuiet) {
public boolean inSkipList(String identifier)
{
if(skipList!=null && skipList.contains(identifier))
{
if (!isQuiet)
{
System.out.println("SKIP-LIST: skipped bitstreams within identifier " + identifier);
}
return true;
} else {
}
else
{
return false;
}
}
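The processBitstream change above converts the source and destination streams to try-with-resources, so both are closed automatically even when filtering fails partway. A minimal, self-contained sketch of that conversion (the filter step here is a trivial stand-in for formatFilter.getDestinationStream):

```java
import java.io.*;

// Hypothetical sketch of the try-with-resources stream handling introduced above.
public class TwrDemo {
    // Both the source stream and the reader wrapping it are closed automatically.
    static String filter(InputStream src) throws IOException {
        try (
            InputStream srcStream = src;
            BufferedReader dest = new BufferedReader(new InputStreamReader(srcStream, "UTF-8"))
        ) {
            // stand-in for the real filtering work
            return dest.readLine().toUpperCase();
        }
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("scaled".getBytes("UTF-8"));
        System.out.println(filter(in)); // prints "SCALED"
    }
}
```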


@@ -7,10 +7,8 @@
*/
package org.dspace.app.mediafilter;
import java.awt.image.BufferedImage;
import java.awt.image.*;
import java.io.InputStream;
import javax.imageio.ImageIO;
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.rendering.PDFRenderer;
import org.dspace.content.Item;
@@ -25,17 +23,21 @@ import org.dspace.content.Item;
* @author Ivan Masár helix84@centrum.sk
* @author Jason Sherman jsherman@usao.edu
*/
public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFormats {
public class PDFBoxThumbnail extends MediaFilter
{
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".jpg";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "THUMBNAIL";
}
@@ -43,7 +45,8 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
* @return String bitstreamformat
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "JPEG";
}
@@ -51,7 +54,8 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Generated Thumbnail";
}
@@ -59,12 +63,14 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
PDDocument doc = PDDocument.load(source);
PDFRenderer renderer = new PDFRenderer(doc);
BufferedImage buf = renderer.renderImage(0);
@@ -74,21 +80,4 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
JPEGFilter jpegFilter = new JPEGFilter();
return jpegFilter.getThumb(currentItem, buf, verbose);
}
@Override
public String[] getInputMIMETypes() {
return ImageIO.getReaderMIMETypes();
}
@Override
public String[] getInputDescriptions() {
return null;
}
@Override
public String[] getInputExtensions() {
// Temporarily disabled as JDK 1.6 only
// return ImageIO.getReaderFileSuffixes();
return null;
}
}


@@ -28,20 +28,24 @@ import org.dspace.core.ConfigurationManager;
* instantiate filter - bitstream format doesn't exist
*
*/
public class PDFFilter extends MediaFilter {
public class PDFFilter extends MediaFilter
{
private static Logger log = Logger.getLogger(PDFFilter.class);
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".txt";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "TEXT";
}
@@ -49,7 +53,8 @@ public class PDFFilter extends MediaFilter {
* @return String bitstreamformat
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "Text";
}
@@ -57,7 +62,8 @@ public class PDFFilter extends MediaFilter {
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Extracted text";
}
@@ -65,13 +71,16 @@ public class PDFFilter extends MediaFilter {
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
try {
throws Exception
{
try
{
boolean useTemporaryFile = ConfigurationManager.getBooleanProperty("pdffilter.largepdfs", false);
// get input stream from bitstream
@@ -83,43 +92,62 @@ public class PDFFilter extends MediaFilter {
File tempTextFile = null;
ByteArrayOutputStream byteStream = null;
if (useTemporaryFile) {
if (useTemporaryFile)
{
tempTextFile = File.createTempFile("dspacepdfextract" + source.hashCode(), ".txt");
tempTextFile.deleteOnExit();
writer = new OutputStreamWriter(new FileOutputStream(tempTextFile));
} else {
}
else
{
byteStream = new ByteArrayOutputStream();
writer = new OutputStreamWriter(byteStream);
}
try {
try
{
pdfDoc = PDDocument.load(source);
pts.writeText(pdfDoc, writer);
} finally {
try {
if (pdfDoc != null) {
}
finally
{
try
{
if (pdfDoc != null)
{
pdfDoc.close();
}
} catch (Exception e) {
}
catch(Exception e)
{
log.error("Error closing PDF file: " + e.getMessage(), e);
}
try {
try
{
writer.close();
} catch (Exception e) {
}
catch(Exception e)
{
log.error("Error closing temporary extract file: " + e.getMessage(), e);
}
}
if (useTemporaryFile) {
if (useTemporaryFile)
{
return new FileInputStream(tempTextFile);
} else {
}
else
{
byte[] bytes = byteStream.toByteArray();
return new ByteArrayInputStream(bytes);
}
} catch (OutOfMemoryError oome) {
}
catch (OutOfMemoryError oome)
{
log.error("Error parsing PDF document " + oome.getMessage(), oome);
if (!ConfigurationManager.getBooleanProperty("pdffilter.skiponmemoryexception", false)) {
if (!ConfigurationManager.getBooleanProperty("pdffilter.skiponmemoryexception", false))
{
throw oome;
}
}
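PDFFilter above chooses its buffer based on the pdffilter.largepdfs setting: a temporary file for large extractions, an in-memory byte array otherwise. A self-contained sketch of that choice under simplified assumptions (the method and class names are illustrative, and the real filter writes via a PDFTextStripper rather than a plain string):

```java
import java.io.*;

// Hypothetical sketch of the "large file -> temp file, otherwise -> memory" buffering choice.
public class BufferChoice {
    static InputStream extract(String text, boolean useTemporaryFile) throws IOException {
        if (useTemporaryFile) {
            // spill to disk so huge extractions don't exhaust the heap
            File tmp = File.createTempFile("dspacepdfextract", ".txt");
            tmp.deleteOnExit();
            try (Writer w = new OutputStreamWriter(new FileOutputStream(tmp), "UTF-8")) {
                w.write(text);
            }
            return new FileInputStream(tmp);
        } else {
            // small result: keep it entirely in memory
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (Writer w = new OutputStreamWriter(bytes, "UTF-8")) {
                w.write(text);
            }
            return new ByteArrayInputStream(bytes.toByteArray());
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] buf = extract("extracted text", false).readAllBytes();
        System.out.println(new String(buf, "UTF-8")); // prints "extracted text"
    }
}
```

Either branch returns a plain InputStream, so the caller never needs to know which buffering strategy was used.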


@@ -8,8 +8,8 @@
package org.dspace.app.mediafilter;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.IOException;
import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
@@ -23,45 +23,55 @@ import org.slf4j.LoggerFactory;
* Extract flat text from Microsoft Word documents (.doc, .docx).
*/
public class PoiWordFilter
extends MediaFilter {
extends MediaFilter
{
private static final Logger LOG = LoggerFactory.getLogger(PoiWordFilter.class);
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".txt";
}
@Override
public String getBundleName() {
public String getBundleName()
{
return "TEXT";
}
@Override
public String getFormatString() {
public String getFormatString()
{
return "Text";
}
@Override
public String getDescription() {
public String getDescription()
{
return "Extracted text";
}
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
String text;
try {
try
{
// get input stream from bitstream, pass to filter, get string back
POITextExtractor extractor = ExtractorFactory.createExtractor(source);
text = extractor.getText();
} catch (IOException | OpenXML4JException | XmlException e) {
}
catch (IOException | OpenXML4JException | XmlException e)
{
System.err.format("Invalid File Format: %s%n", e.getMessage());
LOG.error("Unable to parse the bitstream: ", e);
throw e;
}
// if verbose flag is set, print out extracted text to STDOUT
if (verbose) {
if (verbose)
{
System.out.println(text);
}


@@ -10,31 +10,36 @@ package org.dspace.app.mediafilter;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import org.apache.log4j.Logger;
import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
import org.apache.poi.hslf.extractor.PowerPointExtractor;
import org.apache.poi.xslf.extractor.XSLFPowerPointExtractor;
import org.apache.poi.hslf.extractor.PowerPointExtractor;
import org.apache.poi.POITextExtractor;
import org.apache.log4j.Logger;
import org.dspace.content.Item;
/*
* TODO: Allow user to configure extraction of only text or only notes
*
*/
public class PowerPointFilter extends MediaFilter {
public class PowerPointFilter extends MediaFilter
{
private static Logger log = Logger.getLogger(PowerPointFilter.class);
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".txt";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "TEXT";
}
@@ -44,7 +49,8 @@ public class PowerPointFilter extends MediaFilter {
* TODO: Check that this is correct
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "Text";
}
@@ -52,7 +58,8 @@ public class PowerPointFilter extends MediaFilter {
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Extracted text";
}
@@ -60,14 +67,17 @@ public class PowerPointFilter extends MediaFilter {
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
try {
try
{
String extractedText = null;
new ExtractorFactory();
@@ -78,22 +88,29 @@ public class PowerPointFilter extends MediaFilter {
// require different classes and APIs for text extraction
// If this is a PowerPoint XML file, extract accordingly
if (pptExtractor instanceof XSLFPowerPointExtractor) {
if (pptExtractor instanceof XSLFPowerPointExtractor)
{
// The true method arguments indicate that text from
// the slides and the notes is desired
extractedText = ((XSLFPowerPointExtractor) pptExtractor)
.getText(true, true);
} else if (pptExtractor instanceof PowerPointExtractor) { // Legacy PowerPoint files
}
// Legacy PowerPoint files
else if (pptExtractor instanceof PowerPointExtractor)
{
extractedText = ((PowerPointExtractor) pptExtractor).getText()
+ " " + ((PowerPointExtractor) pptExtractor).getNotes();
}
if (extractedText != null) {
if (extractedText != null)
{
// if verbose flag is set, print out extracted text
// to STDOUT
if (verbose) {
if (verbose)
{
System.out.println(extractedText);
}
@@ -103,7 +120,9 @@ public class PowerPointFilter extends MediaFilter {
return bais;
}
} catch (Exception e) {
}
catch (Exception e)
{
log.error("Error filtering bitstream: " + e.getMessage(), e);
throw e;
}
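The hunk above ends with the filter wrapping extracted text in a `ByteArrayInputStream` and returning it. A minimal, POI-free sketch of that contract — `extractText` here is a hypothetical stand-in for the real `(X)SLFPowerPointExtractor` call, which needs the POI jars:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class TextStreamSketch {
    // Hypothetical stand-in for the POI extractor call; the real filter
    // derives this text from the PowerPoint bitstream via POITextExtractor.
    static String extractText(InputStream source) {
        return "slide text plus notes";
    }

    // Mirrors getDestinationStream: extract, optionally echo, re-wrap as a stream.
    static InputStream filter(InputStream source, boolean verbose) {
        String extractedText = extractText(source);
        if (verbose) {
            // if verbose flag is set, print out extracted text to STDOUT
            System.out.println(extractedText);
        }
        return new ByteArrayInputStream(extractedText.getBytes(StandardCharsets.UTF_8));
    }
}
```

The shape matters more than the names: extraction happens once, and the caller always receives a fresh `InputStream` over the text, regardless of which extractor branch produced it.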


@@ -11,7 +11,8 @@ package org.dspace.app.mediafilter;
* Interface to allow filters to register the input formats they handle
* (useful for exposing underlying capabilities of libraries used)
*/
public interface SelfRegisterInputFormats {
public interface SelfRegisterInputFormats
{
public String[] getInputMIMETypes();
public String[] getInputDescriptions();
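A filter advertising its formats through this interface might look like the sketch below. The interface is copied from the file above; the implementing class and its return values are hypothetical illustrations, not code from the repository:

```java
// Interface as it appears in SelfRegisterInputFormats above.
interface SelfRegisterInputFormats {
    String[] getInputMIMETypes();
    String[] getInputDescriptions();
}

// Hypothetical filter registering the formats it can handle; real filters
// report whatever their underlying extraction library actually supports.
public class PowerPointFormats implements SelfRegisterInputFormats {
    @Override
    public String[] getInputMIMETypes() {
        return new String[] { "application/vnd.ms-powerpoint" };
    }

    @Override
    public String[] getInputDescriptions() {
        return new String[] { "Microsoft PowerPoint" };
    }
}
```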


@@ -8,10 +8,11 @@
package org.dspace.app.mediafilter;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.IOException;
import org.apache.log4j.Logger;
import org.dspace.content.Item;
import org.textmining.extraction.TextExtractor;
import org.textmining.extraction.word.WordTextExtractorFactory;
@@ -22,20 +23,24 @@ import org.textmining.extraction.word.WordTextExtractorFactory;
* instantiate filter - bitstream format doesn't exist.
*
*/
public class WordFilter extends MediaFilter {
public class WordFilter extends MediaFilter
{
private static Logger log = Logger.getLogger(WordFilter.class);
@Override
public String getFilteredName(String oldFilename) {
public String getFilteredName(String oldFilename)
{
return oldFilename + ".txt";
}
/**
* @return String bundle name
*
*/
@Override
public String getBundleName() {
public String getBundleName()
{
return "TEXT";
}
@@ -43,7 +48,8 @@ public class WordFilter extends MediaFilter {
* @return String bitstreamformat
*/
@Override
public String getFormatString() {
public String getFormatString()
{
return "Text";
}
@@ -51,7 +57,8 @@ public class WordFilter extends MediaFilter {
* @return String description
*/
@Override
public String getDescription() {
public String getDescription()
{
return "Extracted text";
}
@@ -59,22 +66,26 @@ public class WordFilter extends MediaFilter {
* @param currentItem item
* @param source source input stream
* @param verbose verbose mode
*
* @return InputStream the resulting input stream
* @throws Exception if error
*/
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
throws Exception
{
// get input stream from bitstream
// pass to filter, get string back
try {
try
{
WordTextExtractorFactory factory = new WordTextExtractorFactory();
TextExtractor e = factory.textExtractor(source);
String extractedText = e.getText();
// if verbose flag is set, print out extracted text
// to STDOUT
if (verbose) {
if (verbose)
{
System.out.println(extractedText);
}
@@ -83,7 +94,9 @@ public class WordFilter extends MediaFilter {
ByteArrayInputStream bais = new ByteArrayInputStream(textBytes);
return bais; // will this work? or will the byte array be out of scope?
} catch (IOException ioe) {
}
catch (IOException ioe)
{
System.out.println("Invalid Word Format");
log.error("Error detected - Word File format not recognized: "
+ ioe.getMessage(), ioe);
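The `// will this work? or will the byte array be out of scope?` comment in the hunk above asks whether the backing byte array survives the method return. It does: `ByteArrayInputStream` holds a reference to the array, so the data stays reachable as long as the stream does. A self-contained demonstration:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ScopeCheck {
    // The local byte[] variable goes out of scope when this returns, but the
    // returned stream keeps a reference to the array, so it is not collected.
    static InputStream wrap(String text) {
        byte[] textBytes = text.getBytes(StandardCharsets.UTF_8);
        return new ByteArrayInputStream(textBytes);
    }

    public static void main(String[] args) throws java.io.IOException {
        InputStream in = wrap("extracted text");
        // The bytes are still readable here, after wrap() has returned.
        System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
    }
}
```

Scope and object lifetime are separate things in Java: an object lives as long as any reference to it is reachable, so the pattern in `WordFilter` is safe.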


@@ -11,8 +11,7 @@ import org.dspace.app.mediafilter.service.MediaFilterService;
import org.dspace.services.factory.DSpaceServicesFactory;
/**
* Abstract factory to get services for the mediafilter package, use MediaFilterServiceFactory.getInstance() to
* retrieve an implementation
* Abstract factory to get services for the mediafilter package, use MediaFilterServiceFactory.getInstance() to retrieve an implementation
*
* @author kevinvandevelde at atmire.com
*/
@@ -21,7 +20,6 @@ public abstract class MediaFilterServiceFactory {
public abstract MediaFilterService getMediaFilterService();
public static MediaFilterServiceFactory getInstance(){
return DSpaceServicesFactory.getInstance().getServiceManager()
.getServiceByName("mediaFilterServiceFactory", MediaFilterServiceFactory.class);
return DSpaceServicesFactory.getInstance().getServiceManager().getServiceByName("mediaFilterServiceFactory", MediaFilterServiceFactory.class);
}
}
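The factory above resolves the `MediaFilterService` by name from the DSpace service manager. A self-contained sketch of that named-lookup pattern — the `ServiceManager` here is a hypothetical map-backed stand-in for illustration, not the DSpace class:

```java
import java.util.HashMap;
import java.util.Map;

public class ServiceLookupSketch {
    // Hypothetical stand-in for DSpace's service manager: resolves a named
    // service and casts it to the requested interface.
    static class ServiceManager {
        private final Map<String, Object> services = new HashMap<>();

        void register(String name, Object service) {
            services.put(name, service);
        }

        <T> T getServiceByName(String name, Class<T> type) {
            return type.cast(services.get(name));
        }
    }

    interface MediaFilterService {
        String describe();
    }

    public static void main(String[] args) {
        ServiceManager manager = new ServiceManager();
        manager.register("mediaFilterService",
                (MediaFilterService) () -> "filters media bitstreams");
        MediaFilterService service =
                manager.getServiceByName("mediaFilterService", MediaFilterService.class);
        System.out.println(service.describe());
    }
}
```

The indirection lets the concrete implementation be swapped in configuration without touching callers, which is why the factory returns the interface rather than instantiating an implementation directly.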


@@ -11,8 +11,7 @@ import org.dspace.app.mediafilter.service.MediaFilterService;
import org.springframework.beans.factory.annotation.Autowired;
/**
* Factory implementation to get services for the mediafilter package, use MediaFilterServiceFactory.getInstance() to
* retrieve an implementation
* Factory implementation to get services for the mediafilter package, use MediaFilterServiceFactory.getInstance() to retrieve an implementation
*
* @author kevinvandevelde at atmire.com
*/
