Compare commits


866 Commits

Author SHA1 Message Date
Terry Brady
cfc200f2c7 Update README.md 2018-06-22 14:33:35 -07:00
Andrea Bollini
211a9e215f Merge pull request #2066 from 4Science/DS-3915
DS-3915 RestResourceController hides not existing endpoints under other errors
2018-05-28 08:36:26 +02:00
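The DS-3915 merge above concerns `RestResourceController` surfacing requests for non-existent endpoints as the wrong error. A minimal sketch of the intended behavior — an unknown sub-path should yield 404 Not Found rather than being hidden under another error — with illustrative names that are not the actual DSpace code:

```java
import java.util.Set;

// Hypothetical sketch of the DS-3915 behavior described above.
// KNOWN_SUBPATHS and statusFor are illustrative names, not DSpace APIs.
public class SubPathResolver {
    private static final Set<String> KNOWN_SUBPATHS =
            Set.of("items", "collections", "bitstreams");

    /** Map a requested sub-path to the HTTP status it should produce. */
    public static int statusFor(String subPath) {
        return KNOWN_SUBPATHS.contains(subPath) ? 200 : 404;
    }

    public static void main(String[] args) {
        System.out.println(statusFor("items"));   // existing endpoint
        System.out.println(statusFor("nosuch"));  // non-existent endpoint maps to 404
    }
}
```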
Andrea Bollini
b248e3d1f3 Merge pull request #2061 from ppmdo/DS-3911
DS-3911 Endpoint to delete bitstreams
2018-05-28 08:29:51 +02:00
Andrea Bollini
41341eb9b9 DS-3915 proper handling of not existent subPaths 2018-05-20 21:13:20 +02:00
Andrea Bollini
3e76aa8972 DS-3915 Integration Test to expose the issue 2018-05-20 19:16:46 +02:00
Andrea Bollini
829ee0687a Merge pull request #2060 from Georgetown-University-Libraries/ds3650r3808
[DS-3808] Return 204 Status when return json object is empty
2018-05-20 18:29:17 +02:00
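The DS-3808 merge above changes responses to 204 No Content when the JSON object that would be returned is empty. A hedged sketch of that rule, assuming a simple list-shaped result; this is an illustration, not the actual DSpace implementation:

```java
import java.util.List;

// Hypothetical illustration of the DS-3808 rule: an empty (or absent)
// result body is reported as 204 No Content instead of 200 OK.
public class EmptyResponseStatus {
    public static int statusFor(List<?> results) {
        return (results == null || results.isEmpty()) ? 204 : 200;
    }

    public static void main(String[] args) {
        System.out.println(statusFor(List.of("item1"))); // non-empty body: 200
        System.out.println(statusFor(List.of()));        // empty body: 204
    }
}
```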
Andrea Bollini
36fbcb1b24 Better exception handling 2018-05-20 16:10:15 +02:00
Andrea Bollini
a4331dc4df Merge branch 'DS-3911' of https://github.com/ppmdo/DSpace into ppmdo-DS-3911 2018-05-20 12:30:45 +02:00
Andrea Bollini
75e1074123 Merge pull request #2053 from Georgetown-University-Libraries/ds3783a
[DS-3783] Change http return code when required parameter is missing
2018-05-20 11:44:27 +02:00
Andrea Bollini
a323bba76e Replaced Spring Param annotation with our own, add more tests 2018-05-19 23:29:37 +02:00
Pablo Prieto
c66167e220 Changes based on review 2018-05-19 12:06:47 -05:00
Andrea Bollini
6225a353ef Merge branch 'ds3783a' of https://github.com/Georgetown-University-Libraries/DSpace into Georgetown-University-Libraries-ds3783a 2018-05-19 16:26:26 +02:00
Andrea Bollini
f10801ecca Merge pull request #2052 from tantz001/DS-3905
DS-3905; add a custom RestController
2018-05-18 22:14:52 +02:00
Andrea Bollini
fa11b172b7 Merge pull request #2057 from TAMULib/ds3777
DS-3777: Allow only POST on the authn/login endpoint.
2018-05-18 20:29:44 +02:00
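The DS-3777 merge above restricts the `authn/login` endpoint to POST. A minimal sketch of the resulting behavior — any other method is rejected with 405 Method Not Allowed — using illustrative names rather than the actual DSpace controller code:

```java
// Hypothetical sketch of DS-3777: only POST is accepted on authn/login.
// LoginMethodGuard is an illustrative name, not a DSpace class.
public class LoginMethodGuard {
    public static int statusFor(String httpMethod) {
        return "POST".equals(httpMethod) ? 200 : 405;
    }

    public static void main(String[] args) {
        System.out.println(statusFor("POST")); // accepted
        System.out.println(statusFor("GET"));  // rejected with 405
    }
}
```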
Andrea Bollini
fe0a397548 Add IT for IdentifierRestController 2018-05-18 20:16:50 +02:00
Andrea Bollini
344459ad60 Fix checkstyle issue 2018-05-18 19:21:44 +02:00
Andrea Bollini
d2ff33da6d Fix endpoint registration in the root document 2018-05-18 19:13:12 +02:00
Andrea Bollini
17ad4050d2 Merge branch 'DS-3905' of https://github.com/tantz001/DSpace into tantz001-DS-3905 2018-05-18 17:25:25 +02:00
Andrea Bollini
2f06d6e7e6 POST is required for the login endpoint 2018-05-18 17:15:02 +02:00
Andrea Bollini
c2d2136e2c Fix test failures and add test for GET method 2018-05-18 17:14:29 +02:00
Andrea Bollini
0d4582484b Fix code style issues 2018-05-18 17:05:17 +02:00
Andrea Bollini
eee6529930 Merge branch 'ds3777' of https://github.com/TAMULib/DSpace into TAMULib-ds3777 2018-05-18 15:53:00 +02:00
Pablo Prieto
b26d8613e3 Rebase unchanged files 2018-05-17 23:36:25 -05:00
Pablo Prieto
263c867ced Integration Tests for Bitstreams DELETE endpoint. 2018-05-17 23:28:30 -05:00
Pablo Prieto
c4bfd5ef83 Half Integration Tests 2018-05-17 01:14:55 -05:00
Bill Tantzen
350f431eee removed unused imports; reordered imports. 2018-05-16 11:03:22 -05:00
Bill Tantzen
f1cd0af629 remove <p> from license text blank line. 2018-05-15 14:37:17 -05:00
Terry W Brady
eb9d2d1f49 comment todo method 2018-05-15 12:22:52 -07:00
Terry W Brady
42d0a223d2 review suggestion 2018-05-15 12:04:38 -07:00
Terry W Brady
141be6e338 verify return structure of empty repos 2018-05-15 12:02:38 -07:00
Pablo Prieto
b6aa7308aa Draft Integration Tests 2018-05-15 13:49:45 -05:00
Terry W Brady
d4df6d12c6 Simplify cases where 204 is returned 2018-05-15 11:05:20 -07:00
Pablo Prieto
db512feb4e Fixed pom.xml 2018-05-14 22:20:47 -05:00
Pablo Prieto
edf4fc5669 Fixed pom.xml 2018-05-14 22:19:32 -05:00
Pablo Prieto
cd8c6b20ec Merge branch 'DS-3911' of https://github.com/ppmdo/DSpace into DS-3911 2018-05-14 22:16:28 -05:00
Pablo Prieto
14712be757 Removed local changes. 2018-05-14 22:15:41 -05:00
Pablo Prieto
92c01ebb66 No message 2018-05-14 22:09:55 -05:00
Pablo Prieto
6987ebb71d Removed local pom.xml modifications 2018-05-14 22:08:47 -05:00
Pablo Prieto
6bda178838 Removed application.properties local modifications 2018-05-14 22:07:33 -05:00
Pablo Prieto
67cc795026 Return 422 error code when trying to delete logo bitstreams. 2018-05-14 21:57:33 -05:00
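The commit above introduces a 422 response when a client tries to delete a logo bitstream. A hedged sketch of that status decision, assuming deletion of an ordinary bitstream otherwise returns 204; names are illustrative, not the actual DSpace code:

```java
// Hypothetical sketch of the rule in the commit above: deleting an
// ordinary bitstream succeeds (204), but a bitstream serving as a
// community/collection logo is refused with 422 Unprocessable Entity.
public class BitstreamDeleteStatus {
    public static int statusFor(boolean isLogo) {
        return isLogo ? 422 : 204;
    }

    public static void main(String[] args) {
        System.out.println(statusFor(false)); // regular bitstream: deleted
        System.out.println(statusFor(true));  // logo bitstream: refused
    }
}
```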
Terry W Brady
5c95001c09 check req params 2018-05-14 15:42:48 -07:00
Terry W Brady
42253e7d6e force test pass 2018-05-14 14:04:59 -07:00
Terry W Brady
c32d21378a Update test cases for 204 change 2018-05-14 13:34:57 -07:00
Terry W Brady
e9083bdfa3 req param annotation 2018-05-14 13:12:24 -07:00
Terry W Brady
da1917539a 204 logic 2018-05-14 12:35:27 -07:00
Pablo Prieto
2d8a9fb315 Full working implementation.
TODO: Raise error when trying to delete logos
2018-05-14 14:20:10 -05:00
Pablo Prieto
e23076da09 Fixed UUID - id mapping bug of RestResourceController.
First DELETE implementation on bitstream endpoint.

TODO: Investigate why the endpoint triggers but doesn't end in the bitstream deletion.
2018-05-14 09:10:43 -05:00
Andrea Bollini
e806bd7727 Merge pull request #2051 from Georgetown-University-Libraries/ds3650
[DS-3650] Add subcommunities link to community object
2018-05-14 15:52:34 +02:00
James Creel
a59e15a2b3 Web Security ignores not-allowed methods on authn/login endpoint 2018-05-12 12:29:29 -05:00
James Creel
9e7a56954c New authn controller endpoint to intercept not allowed methods 2018-05-12 12:26:27 -05:00
James Creel
1d52ec86cd removing GET from allowed methods on authn/login controller endpoint 2018-05-12 11:53:53 -05:00
Bill Tantzen
a7d89fbb58 removed unnecessary classes; changed from problematic {prefix}/{suffix} path to params-based path; changed response to 501 when IdentifierNotResolvable. 2018-05-11 15:21:59 -05:00
Bill Tantzen
195bc622e9 reformat to DSpace style guidelines. 2018-05-10 15:32:48 -05:00
Bill Tantzen
76d60c2e94 simplified method signature; converted DSpaceObject to DSpaceObjectRest; leveraged ControllerLinkBuilder.linkTo() to construct the redirect. 2018-05-10 11:53:31 -05:00
Terry W Brady
892667d623 handle improperly formatted parent param 2018-05-09 16:19:18 -07:00
Terry W Brady
d7739c376f missing parameter exception 2018-05-09 15:35:36 -07:00
Terry W Brady
3ab889670e rm unneeded decl 2018-05-09 15:05:47 -07:00
Terry W Brady
ed5c43226a change ret code for empty 2018-05-09 14:53:24 -07:00
Terry W Brady
48de581514 comment placeholder 2018-05-09 14:26:04 -07:00
Terry W Brady
c455c59df0 activate ignored test 2018-05-09 14:22:04 -07:00
kshepherd
8c9e6afa53 Merge pull request #2032 from mwoodiupui/DS-3795
[DS-3795] Update jackson-databind due to known vulnerabilities.
2018-05-10 08:58:21 +12:00
Terry W Brady
d29666d74f placeholder for 204 checks 2018-05-09 13:05:59 -07:00
Terry W Brady
6b983bfee9 format exception output 2018-05-08 16:31:42 -07:00
Terry W Brady
07e59b82a5 Add 422 return 2018-05-08 16:20:04 -07:00
Terry W Brady
69c62d33ca add initial exception 2018-05-08 15:55:41 -07:00
Bill Tantzen
e5641262ef DS-3905; add a custom RestController 2018-05-08 15:12:41 -05:00
Mark H. Wood
5cda7abed4 [DS-3795] Update zookeeper (even though not used) to address known vulnerabilities. 2018-05-08 12:28:55 -04:00
Mark H. Wood
a3fb9f5c3a [DS-3795] Update ant to address known vulnerabilities. 2018-05-08 12:28:55 -04:00
Mark H. Wood
310e5d8579 [DS-3795] Update jackson-databind due to known vulnerabilities. 2018-05-08 12:28:55 -04:00
Tim Donohue
598f35b267 Merge pull request #2050 from Georgetown-University-Libraries/ds3903m
[DS-3903] Port to master, Legacy Rest Jar Resolution
2018-05-08 11:01:00 -05:00
Tim Donohue
a5a3dda2fd Remove obsolete comment. Minor alignment fix 2018-05-08 09:01:27 -05:00
Terry W Brady
e5eb901065 clarify specific checks for desc objs 2018-05-07 15:54:53 -07:00
Terry W Brady
a7c81129f4 update based on review 2018-05-07 15:45:01 -07:00
Terry W Brady
d631f33d64 Handle empty child arrays 2018-05-07 15:38:53 -07:00
kshepherd
d64ebe2788 Merge pull request #2001 from minurmin/DS-3511-master
DS-3511: Fix HTTP 500 errors on REST API bitstream updates (7.0)
2018-05-08 09:39:02 +12:00
Terry W Brady
71281d2f09 mirror link.collections tests 2018-05-07 14:34:21 -07:00
Terry W Brady
cc48340fcc rem unnecessary ref 2018-05-07 10:48:36 -07:00
Terry W Brady
74ddde6c7a normalize endpoint link name 2018-05-07 09:44:13 -07:00
Terry W Brady
4dd3bcbbfd add converter 2018-05-07 09:30:27 -07:00
Terry W Brady
129a6a5d1f subcomm links 2018-05-07 09:08:09 -07:00
Terry W Brady
d3f150e2f9 pr port 2018-05-07 08:25:30 -07:00
Tim Donohue
b60fecbb0a Merge pull request #2047 from kshepherd/ds-2862_fix_legacy_embargo_checks_master_port
[DS-2862] Apply DSPR#2000 changes manually and fix style for master port
2018-05-03 10:59:31 -05:00
Kim Shepherd
c96e1598ec [DS-2862] Fix line length of comment on line 150 to conform to checkstyle rules 2018-05-03 19:12:29 +12:00
Kim Shepherd
da117687e8 [DS-2862] Apply DSPR#2000 changes manually and fix style for master port (cherry-pick resulted in conflicts due to other changes in master) 2018-05-03 18:42:58 +12:00
kshepherd
295da5e14a Merge pull request #2002 from KingKrimmson/DS-3877
DS-3877 Trim bitstream name during filter-media comparison
2018-05-03 17:51:35 +12:00
kshepherd
a6fd4c07f4 Merge pull request #2040 from kshepherd/ds-3734_last_modified_timestamp_master_port
DS-3734 Fixed missing trigger of item last modified date when adding …
2018-04-30 13:38:09 +12:00
Kim Shepherd
68cca2c11c [DS-3734] Fix checkstyle issues for master port 2018-04-30 11:59:20 +12:00
Christian Scheible
0f752b4f62 DS-3734 Fixed missing trigger of item last modified date when adding a bitstream. 2018-04-30 11:31:53 +12:00
kshepherd
c293756a45 Merge pull request #2037 from kshepherd/ds-3556_ds-3733_xalan_rollback_master_port
DS-3556 / DS-3733 Xalan rollback master port
2018-04-26 08:37:48 +12:00
Per Broman
e445801de3 [DS-3556] Rollback of Xalan from 2.7.2 to 2.7.0 to fix DS-3556 and DS-3733 2018-04-25 19:53:27 +00:00
Ilja Sidoroff
426d21a1c6 Add option to select citation page format (LETTER or A4) 2018-04-24 01:14:52 +00:00
kshepherd
49185db7e9 Merge pull request #1997 from mwoodiupui/DS-3832
[DS-3832] Upgrade to GeoIP2.
2018-04-19 09:49:35 +12:00
Mark H. Wood
9d5910c010 [DS-3832] Don't spew stack traces for simple exceptions. 2018-04-18 16:42:38 -04:00
Tim Donohue
81b99a0619 Merge pull request #2023 from kshepherd/ds-3800_master_port
[DS-3800] master port of pull/1914
2018-04-16 09:14:22 -05:00
Kim Shepherd
36c7c678f6 [DS-3800] Port to master -- removed commented lines accidentally left in 2018-04-15 17:19:42 +12:00
Kim Shepherd
4a6d9ce11e [DS-3800] Port to master -- some changes by hand due to checkstyle differences 2018-04-15 17:16:51 +12:00
Tim Donohue
b00a914b8f Merge pull request #2022 from kshepherd/ds-3662_master_port
[DS-3662] DSpace 'logging in' without password or with non-existent e…
2018-04-13 08:40:37 -05:00
Kim Shepherd
04181ef271 [DS-3662] DSpace 'logging in' without password or with non-existent e-mail using Shib and Password authentication (master port) 2018-04-13 12:07:14 +12:00
kshepherd
8218b1ce24 Merge pull request #2019 from tdonohue/DS-3770-master-port
[DS-3770] always uncache item after performed curation task (port to master)
2018-04-13 09:55:51 +12:00
Tim Donohue
191f6ace28 [DS-3770] always uncache item after performed curation task for better performance 2018-04-12 15:24:03 +00:00
kshepherd
d30f455c23 Merge pull request #1912 from kardeiz/fix-workflow-sql
[DS-3788] drop indexes, update, recreate
2018-04-12 08:19:11 +12:00
kshepherd
1d2705bb0a Merge pull request #2015 from kshepherd/ds-3710_master_port
DS-3710 Fix ehcache config conflict (port to master)
2018-04-09 17:11:41 +12:00
Chris Herron
fd4608c9e0 DS-3710 Fix ehcache config conflict 2018-04-09 16:39:43 +12:00
kshepherd
b0d3a0f6c7 Merge pull request #1940 from mwoodiupui/DS-3694
[DS-3694] Clean up EHCache configuration mess
2018-04-09 15:44:48 +12:00
Tim Donohue
f203ddcd4c Merge pull request #1900 from jmarton/DS-3790
[DS-3790] include remote IP value in debug log for IP authentication
2018-04-03 10:51:56 -05:00
Jozsef Marton
4a4bd26479 [DS-3790] include remote IP and useProxies configuration value in debug log for IP authentication 2018-04-02 23:49:34 +02:00
Tim Donohue
68901a2319 Merge pull request #1991 from Georgetown-University-Libraries/ds3811m2
[DS-3811] port pr 1934 to master
2018-04-02 10:16:51 -05:00
Tim Donohue
b05c94d0b4 Merge pull request #2009 from Georgetown-University-Libraries/ds3880
[DS-3880] ignore .checkstyle
2018-03-29 13:51:52 -05:00
Mark H. Wood
79d11cbe5b [DS-3832] Clean up more references to v1 database. 2018-03-29 13:29:06 -04:00
Terry Brady
cfb8cbff91 move change to eclipse section 2018-03-29 10:04:38 -07:00
Mark H. Wood
d628ffee38 [DS-3832] Fetch and use GeoLite v2 City database. 2018-03-29 12:49:44 -04:00
Terry W Brady
dc61dc8a17 ignore .checkstyle 2018-03-28 21:26:06 -07:00
Terry W Brady
55b8826646 port pr to master 2018-03-28 21:21:55 -07:00
kshepherd
7d30cef57b Merge pull request #1804 from saiful-semantic/patch-1
Change Content-Type in OAI-PMH Response
2018-03-29 11:03:41 +13:00
Chris Herron
80337f8d6f DS-3877 Trim bitstream name during filter-media comparison (prevent duplicate bitstream generation) 2018-03-28 10:27:27 -04:00
Mark H. Wood
6b9cfec8c6 [DS-3832] Recast test support classes. 2018-03-27 21:14:49 -04:00
Miika Nurminen
22bb9a5751 DS-3511: Add context.complete() to updateBitstreamData and deleteBitstreamPolicy to prevent HTTP 500 response on REST updates 2018-03-27 20:56:43 +03:00
Mark H. Wood
074337f167 [DS-3832] Upgrade to GeoIP2. 2018-03-21 15:49:38 -04:00
Tim Donohue
0fb23c677e Merge pull request #1945 from tdonohue/latest_postgres_jdbc
DS-3854: Update to latest PostgreSQL JDBC driver
2018-03-21 13:47:57 -05:00
Tim Donohue
d44b9aa29d Merge pull request #1985 from mwoodiupui/DS-3868
[DS-3868] Missing Hibernate dependencies -- 'ant fresh_install' fails
2018-03-21 13:44:22 -05:00
Tim Donohue
c3d567c31e Merge pull request #1989 from Georgetown-University-Libraries/ds3704m
[DS-3704] port pr 1854 to master
2018-03-16 15:29:20 -05:00
Tim Donohue
b00dbd07d6 Merge pull request #1988 from Georgetown-University-Libraries/ds3714m
[DS-3714] port pr 1862 to master
2018-03-16 15:24:55 -05:00
Tim Donohue
8fb55e95f2 Merge pull request #1987 from Georgetown-University-Libraries/ds3783m
[DS-3713] Master port of PR 1863
2018-03-16 15:21:18 -05:00
Terry W Brady
5f5688e5fd port pr 1854 2018-03-16 12:43:28 -07:00
Terry W Brady
06e99ebfb7 fix checkstyle 2018-03-16 12:30:59 -07:00
Terry W Brady
8f689efc4e port pr 1862 2018-03-16 12:15:15 -07:00
Terry W Brady
2779205c0a port from 6x 2018-03-15 15:03:59 -07:00
Mark H. Wood
dff155a9d9 [DS-3868] Use a newer version of hibernate-validator*, symbolize version, move to parent. 2018-03-15 10:58:57 -04:00
Mark H. Wood
0a84b32b17 [DS-3868] Add dependencies needed in non-JEE environment. 2018-03-15 09:53:49 -04:00
Tim Donohue
b5f86ebb13 Merge pull request #1975 from tdonohue/code_style_for_unit_tests
Update Unit Tests to align with new Code Style
2018-03-09 10:34:12 -06:00
PTrottier
79f3c9e753 [DS-3841] REBASE enable CORS header for security controller (#1983)
* set current user on the context too

* enable cors on security configuration, add exposition of authorization header

* Address style and re add cors

* Fix cors() line

* Remove redundant code for setting eperson
2018-03-08 16:43:27 -06:00
Tim Donohue
4a5dfcff64 Per feedback, perform checkstyle *after* tests by changing bind phase to verify. 2018-03-08 11:38:09 -06:00
Tim Donohue
47b9e04a36 Fix dspace-spring-rest unit tests to align with code style 2018-02-28 11:38:35 -06:00
Tim Donohue
5777acb743 Fix dspace-services unit tests to align with code style 2018-02-28 11:38:34 -06:00
Tim Donohue
2d425c4518 Fix dspace-oai unit tests to align with code style 2018-02-28 11:38:34 -06:00
Tim Donohue
3e45d113dc Fix dspace-api unit tests to align with code style 2018-02-28 11:38:22 -06:00
Tim Donohue
2063632bac Enable Checkstyle for Unit Test source code. Also enable suppression configuration to turn off indentation checks for unit tests 2018-02-28 10:49:29 -06:00
Tim Donohue
d7095844a8 Update to latest PostgreSQL JDBC driver 2018-02-26 20:28:09 +00:00
Tim Donohue
8019087c55 Merge pull request #1967 from mwoodiupui/DS-3852-7_x
[DS-3852] OAI indexer message not helpful in locating problems
2018-02-26 09:31:22 -06:00
Mark H. Wood
a2314579e9 [DS-3852] Give more information about the item just indexed, to help identify it in case of problems.
07b050c
2018-02-24 17:46:46 -05:00
Mark H. Wood
f765acf844 [DS-3694] Exclude problematic JAR pulled into lib/ by dspace-rest 2018-02-22 11:47:23 -05:00
Mark H. Wood
b1742340c3 [DS-3694] Replace an inappropriate system property with a local Hibernate/EHCache property 2018-02-22 11:47:23 -05:00
Mark H. Wood
033b023b81 [DS-3694] Remove huge blocks of commentary that obscure the settings.
This includes much sample configuration for distributed caching which
DSpace has never used, and documentation of EHCache itself.  If you
want to know more about EHCache, consult its own documentation sources.
2018-02-22 11:47:23 -05:00
Mark H. Wood
781a26b56d [DS-3694] Configure CacheManagers to suppress version check.
Don't set corresponding system property -- it's the wrong way to do
this in Servlet code, and anyway the manager is already initialized
before we can set it.
Remove fallback code that should never execute because injection is @Required.
2018-02-22 11:43:51 -05:00
Tim Donohue
b2f83303d6 Merge pull request #1955 from mwoodiupui/DS-3836
[DS-3836] dspace-rest old dependencies break command line
2018-02-22 10:27:58 -06:00
Tim Donohue
fa6dc2d07d Merge pull request #1952 from DSpace/code-style-fixes
Align Java code with new Code Style
2018-02-22 09:27:11 -06:00
Tim Donohue
ac0ef0594c Enable Checkstyle verification for all builds. 2018-02-21 15:23:59 -06:00
Tim Donohue
2d5b66df96 Minor bug fix for dspace-api updates 2018-02-21 15:23:59 -06:00
Tim Donohue
8a48f782ea Fix dspace-api module per new code style 2018-02-21 15:23:58 -06:00
Tim Donohue
8ffc97f7f9 Fix dspace-swordv2 module per new code style 2018-02-21 14:56:49 -06:00
Tim Donohue
9c5e9e8f4e Fix dspace-sword per new code style 2018-02-21 14:56:49 -06:00
Tim Donohue
1055dde94c Fix dspace-solr module per new code style 2018-02-21 14:56:49 -06:00
Tim Donohue
f1058802f2 Fix dspace-services module per new code style 2018-02-21 14:56:48 -06:00
Tim Donohue
e09d047227 Restore newlines at end of a few files. These were accidentally removed in previous commits and caused coveralls issues in Travis. 2018-02-21 14:56:48 -06:00
Tim Donohue
2c3ac0af7c Fix (legacy) dspace-rest module per new code style 2018-02-21 14:56:48 -06:00
Tim Donohue
3e5f3acd79 Fix dspace-rdf module per new code style 2018-02-21 14:56:47 -06:00
Tim Donohue
739d598f14 Fix dspace-oai module per new code style 2018-02-21 14:56:47 -06:00
Tim Donohue
de4e8bc3d1 Fix dspace-spring-rest module per new code style 2018-02-21 14:56:46 -06:00
Tim Donohue
14a29bb303 Updated version of Checkstyle and maven-checkstyle-plugin 2018-02-21 12:24:39 -06:00
Tim Donohue
fd28ccd5a8 Minor modifications per testing against codebase. Remove indentation exception for throws. Disable Javadocs test for public methods. 2018-02-21 12:24:39 -06:00
Tim Donohue
a6ef3ff10e Minor enhancements based on comments from others. 2018-02-21 12:24:39 -06:00
Tim Donohue
288a33a70d Initial version of CheckStyle configuration 2018-02-21 12:24:39 -06:00
kshepherd
112019f403 Merge pull request #1961 from hardyoyo/DS-3839-revised-master
[DS-3839] moved the autoorient IM op to the top of the operations list
2018-02-20 16:04:46 +13:00
Hardy Pottinger
4cc58b6764 [DS-3839] moved the autoorient IM op to the top of the operations list, where it belongs 2018-02-19 17:40:58 -06:00
Andrea Bollini
724e4cbc98 Merge pull request #1939 from atmire/DS-3817
DS-3817: Support the DSpace Cover page functionality in the bitstream endpoint
2018-02-16 17:54:28 +01:00
Andrea Bollini
dcc08406fc Merge pull request #1927 from Georgetown-University-Libraries/ds3764d
[DS-3764] Return Shibboleth SSO url when login fails, add re-direct to HAL login
2018-02-16 17:53:00 +01:00
Tim Donohue
25faee65c4 Merge pull request #1957 from hardyoyo/DS-3839-support-autoorient-for-imagemagick-thumbnails-dspace-master
[DS-3839] forward-porting DSPR#1956 for master branch
2018-02-16 09:30:53 -06:00
Hardy Pottinger
471adb44dd [DS-3839] forward-porting DSPR#1956 for master branch 2018-02-16 08:21:45 -06:00
Andrea Bollini
2a2fe0f352 Merge pull request #1898 from atmire/DS-3762_write-missing-tests
[DS-3762] fixed test for subCommunities
2018-02-15 17:57:22 +01:00
Andrea Bollini
5e9007015a remove final slash 2018-02-15 16:46:06 +01:00
Mark H. Wood
9d63206877 [DS-3836] Align jersey artifact versions through a property; upgrade jersey-spring3 to jersey-spring4. 2018-02-14 11:36:03 -05:00
Tim Donohue
8fa2aa34ba Merge pull request #1953 from mwoodiupui/DS-3524-2
[DS-3524] Align versions of Apache Poi artifacts using a property.
2018-02-14 09:58:47 -06:00
Mark H. Wood
b75e39db7c [DS-3524] Align versions of Apache Poi artifacts using a property.
I think that someone updated the Poi versions while I was converging
on the old versions.  Testing the branch can't find that, but testing
the merged trunk did.  We need to be more consistent in using symbolic
<version> values when related artifacts should be on the same version.
2018-02-14 10:28:31 -05:00
Mark H. Wood
022ef1866d Merge pull request #1936 from mwoodiupui/DS-3524
[DS-3524] Restore maven-enforcer-plugin on rest7
2018-02-14 08:27:23 -05:00
Tim Donohue
fb8eb032d8 Merge pull request #1950 from mwoodiupui/DS-3833
[DS-3833] Logging: remove Cocoon debris, merge Spring with main log
2018-02-09 09:25:20 -06:00
Mark H. Wood
0f72fe6a22 [DS-3833] Remove log appender A3; Spring will default to A1. 2018-02-08 09:41:55 -05:00
Mark H. Wood
29907299da Merge pull request #1933 from mwoodiupui/DS-3434-7
[DS-3434] DSpace fails to start when a database connection pool is supplied through JNDI
2018-02-07 14:19:50 -05:00
Mark H. Wood
a8cd35dd13 Merge pull request #1944 from mwoodiupui/DS-3720
[DS-3720] Remove all traces of ContiPerf.
2018-02-07 12:08:04 -05:00
Tim Donohue
c3a4eee031 Merge pull request #1930 from Georgetown-University-Libraries/ds3807a
[DS-3807 ] Get legacy /rest service running on master branch
2018-02-05 12:37:26 -06:00
Mark H. Wood
b001b21013 Merge pull request #1948 from mwoodiupui/DS-3803
[DS-3803] Remove db.jndi setting from dspace.cfg
2018-02-05 12:17:46 -05:00
Alan Orth
ed6b7b3e24 DS-3803 Remove db.jndi setting from dspace.cfg
As of DSpace 6.x this setting is no longer used and is not customizable
by the user. Now DSpace always looks for a pool named "jdbc/dspace" in
JNDI and falls back to creating a pool with the db.* settings located
in dspace.cfg.

See: https://wiki.duraspace.org/display/DSDOC6x/Configuration+Reference
See: dspace/config/spring/api/core-hibernate.xml
See: https://jira.duraspace.org/browse/DS-3434
2018-02-05 10:59:17 -05:00
Tim Donohue
974b409c38 Merge pull request #1943 from tdonohue/update-poi
DS-3795: Update Apache POI library to latest version
2018-02-02 10:08:37 -06:00
Mark H. Wood
a1aae4a82b [DS-3720] Remove all traces of ContiPerf. 2018-02-01 16:37:02 -05:00
Tim Donohue
3145cf6877 DS-3795: Update Apache POI library to latest version 2018-02-01 16:52:47 +00:00
Luigi Andrea Pascarelli
e2fdc751c1 Merge pull request #1929 from 4Science/DS-3755
[DS-3755] integration test for submission configuration endpoint
2018-02-01 17:40:03 +01:00
Terry W Brady
254cc413a9 Add int test 2018-01-31 16:42:27 -08:00
Luigi Andrea Pascarelli
ddd2fff23d DS-3755 introduce constant to define request mapping regex 2018-01-31 11:51:38 +01:00
Luigi Andrea Pascarelli
9e3e653906 DS-3755 remove try/catch to avoid unexpected NullPointerException 2018-01-31 11:28:14 +01:00
Luigi Andrea Pascarelli
77dab9c056 DS-3755 fix message 2018-01-31 11:09:38 +01:00
Luigi Andrea Pascarelli
6bf5019cdf DS-3755 minor changes 2018-01-30 19:22:02 +01:00
Luigi Andrea Pascarelli
51823d1a9e DS-3755 remove api class; add javadoc; 2018-01-30 19:18:49 +01:00
Luigi Andrea Pascarelli
34269359c4 DS-3755 rename method in differenceInSubmissionFields (was "diff") 2018-01-30 17:45:32 +01:00
Luigi Andrea Pascarelli
5f1e6705ab DS-3755 minor comment 2018-01-30 17:40:14 +01:00
Luigi Andrea Pascarelli
4a38057756 DS-3755 add javadoc 2018-01-30 17:09:43 +01:00
Luigi Andrea Pascarelli
13a36607d2 DS-3755 renamed class into RestAddressableModel (was DirectlyAddressableRestModel)
DS-3755 add missing interface declaration (RestAddressableModel) for the search rest class
2018-01-30 11:36:10 +01:00
Luigi Andrea Pascarelli
feb2709682 DS-3755 remove the dspace.ui configuration (maybe in a future PR a dspace.path or a dspace.rest.url could be introduced) 2018-01-29 20:51:38 +01:00
Luigi Andrea Pascarelli
d690e677b8 DS-3755 split method to cut down on a number of changes in the PR 2018-01-29 20:29:12 +01:00
Yana De Pauw
79208c777e DS-3817: DSpace citation cover page - small fixes 2018-01-29 15:31:34 +01:00
Yana De Pauw
028e72d1b6 DS-3817: Support the DSpace Cover page functionality in the bitstream content endpoint 2018-01-29 15:01:29 +01:00
Mark H. Wood
d0fe0349ee [DS-3524] Converge some common dependencies' versions. 2018-01-26 17:11:13 -05:00
Mark H. Wood
591ec33278 [DS-3524] Re-enable enforcement of dependency convergence. 2018-01-26 15:01:04 -05:00
Luigi Andrea Pascarelli
8630f12990 DS-3755 change response content for upload (now return the full object);
DS-3755 add links for the resource returned by the POST and PATCH method;
2018-01-26 18:08:21 +01:00
Luigi Andrea Pascarelli
4a615ec234 DS-3755 add missed link support for embedded object findrelation 2018-01-26 14:05:12 +01:00
Luigi Andrea Pascarelli
3a09cd5f46 DS-3755 add test to check links on embeddedpage when the resource is retrieved with findrel 2018-01-26 13:28:21 +01:00
Luigi Andrea Pascarelli
0cfa569b4e DS-3755 add missed link support 2018-01-26 11:54:28 +01:00
Terry W Brady
fe3a7722d1 remove methods that are not applicable 2018-01-25 15:29:45 -08:00
Mark H. Wood
e5b30f3398 [DS-3434] DSpace fails to start when a database connection pool is supplied through JNDI 2018-01-25 13:59:31 -05:00
Tim Donohue
e4f579319d Adding in name of bean to comment. 2018-01-25 12:30:20 -06:00
Terry Brady
a0df98776b comment updated 2018-01-25 10:18:26 -08:00
Luigi Andrea Pascarelli
19f44661eb DS-3755 fix compilation errors 2018-01-25 17:21:35 +01:00
Luigi Andrea Pascarelli
a42ca56675 remove no sense context parameter 2018-01-25 17:21:02 +01:00
Luigi Andrea Pascarelli
87ddad01be Merge DS-3755 with master branch 2018-01-25 16:44:00 +01:00
Terry W Brady
815569b28f comment out bean due to service start issue 2018-01-24 14:32:06 -08:00
Terry Brady
04dd484357 Merge pull request #1907 from hardyoyo/DS-3455-remove-elastic-search-stats
[DS-3455] refactored out all mention of Elastic Search, code and configs,
2018-01-24 13:52:02 -08:00
Hardy Pottinger
9c71e4e2aa [DS-3455] added back accidentally-deleted comment at the request of Mark Wood 2018-01-24 14:33:49 -06:00
Luigi Andrea Pascarelli
c2a81a4c46 DS-3755 implemented integration test for submission configuration 2018-01-24 19:25:24 +01:00
Terry W Brady
96bb5eeca7 Update the description for loginPageURL 2018-01-24 10:22:56 -08:00
Terry W Brady
80f3df4740 Consolidate success handler 2018-01-24 10:13:31 -08:00
Terry W Brady
ec7781156c refine return url 2018-01-24 10:06:22 -08:00
Terry W Brady
dfba61fa87 referer target 2018-01-24 09:56:24 -08:00
Terry W Brady
2ac477e6af Update target return url 2018-01-24 09:38:47 -08:00
Terry W Brady
235b85e201 use loginPageURL 2018-01-24 09:16:22 -08:00
Terry W Brady
3d135a4114 Add redirect behavior 2018-01-23 16:36:59 -08:00
Terry W Brady
fa94297a80 fix location
Refine location path
2018-01-23 16:36:26 -08:00
Terry W Brady
a69bf2e21b Add location header
format location

Host vs Origin
2018-01-23 16:35:51 -08:00
Tim Donohue
cecf494428 Merge pull request #1909 from hardyoyo/DS-3795-bump-jackson-version
[DS-3795] bumped google-http-client-jackson2 to 1.23.0
2018-01-23 14:27:52 -06:00
Luigi Andrea Pascarelli
6cebabdc6d DS-3699 add implementation for "collections" linked entities endpoint 2018-01-23 19:51:39 +01:00
Tim Donohue
42894a5e39 Merge pull request #1921 from Georgetown-University-Libraries/ds3804
[DS-3804] Jar dependency issues running DSpace 7x REST
2018-01-23 11:10:48 -06:00
Hardy Pottinger
1605aebc95 [DS-3795] whoops, let's pick an actual version number for google-api-services-analytics, heh 2018-01-22 14:46:58 -06:00
Hardy Pottinger
a419ff7608 [DS-3795] bumped other Google API dependencies to 1.23.0, as per suggestion of tdonohue 2018-01-22 14:29:09 -06:00
Tim Donohue
694f31cad1 Merge pull request #1908 from hardyoyo/bump-commons-fileupload-version
[DS-3795] increased version for commons-fileupload
2018-01-19 11:05:22 -06:00
Terry W Brady
00757b862c remove jackson exclusion per review comments 2018-01-19 08:47:14 -08:00
Terry W Brady
7a31731f2f Add comment per review 2018-01-19 08:34:43 -08:00
Terry Brady
4e156c6594 Merge pull request #1920 from atmire/DS-3542_REST-authentication-empty-request-error
DS-3542: Fix REST authentication error with empty request + test
2018-01-19 08:15:42 -08:00
Tim Donohue
72e441f55a Merge pull request #1924 from mwoodiupui/DS-3667-7
[DS-3667] Document fundamental persistence support classes.
2018-01-19 08:35:53 -06:00
Mark H. Wood
da0e2ba073 [DS-3667] Take up PR comments and document another class. 2018-01-18 15:53:36 -05:00
Mark H. Wood
f778467f82 [DS-3667] Document fundamental persistence support classes. 2018-01-18 15:49:36 -05:00
Terry W Brady
3c8744cc9a Add jackson version 2018-01-18 10:24:24 -08:00
Terry W Brady
e57b4fbe3b old ver of jackson in oai 2018-01-18 10:04:04 -08:00
Andrea Bollini
871acdf5bb Merge pull request #1923 from atmire/DS-3489_Fix-EmbeddedPage-links
DS-3489: Fix links of resources in embedded page
2018-01-18 18:02:48 +01:00
Tom Desair
36af89cbaa DS-3489: Make sure the HalLinkService also visits all elements inside an EmbeddedPage 2018-01-18 17:20:11 +01:00
Andrea Bollini
8d7c6bfbc1 Merge pull request #1881 from atmire/DS-3489_Read-only-Search-REST-endpoint
DS-3489: Read-only search rest endpoint
2018-01-18 12:24:06 +01:00
Terry W Brady
b643df8ed4 exclude logback 2018-01-17 15:50:09 -08:00
Terry W Brady
64443e0a45 jul-to-slf4j excl 2018-01-17 15:19:32 -08:00
Terry W Brady
8d663ba02a slf4j exclusion 2018-01-17 14:54:29 -08:00
Tom Desair
787473674d DS-3542: Fix REST authentication error with empty request + test 2018-01-17 11:00:51 +01:00
Jacob Brown
ac0e9e23eb drop indexes, update, recreate 2018-01-05 11:24:36 -06:00
Hardy Pottinger
e9396a9d05 [DS-3795] bumped google-http-client-jackson2 to 1.23.0 2018-01-03 12:00:16 -06:00
Hardy Pottinger
0a00367b0f increased version for commons-fileupload 2018-01-02 14:37:37 -06:00
Hardy Pottinger
8fb788dcc5 refactored out all mention of Elastic Search, code and configs,
removed ES dependency from the dspace-api pom.xml, added lucene-core 4.10.3 instead
2018-01-02 12:23:31 -06:00
Raf Ponsaerts
2250ae4efb [DS-3762] applied the requested changes 2017-12-19 17:40:37 +01:00
Luigi Andrea Pascarelli
af9ce48ffb fix delete READ policy if patch 2017-12-19 15:42:18 +01:00
Andrea Bollini
50d802f299 Merge pull request #1897 from 4Science/DS-3787
DS-3787 Impossible to retrieve the community or collection logo
2017-12-14 16:03:36 +01:00
Tom Desair
f6b7805b88 DS-3489: Fixes after rebase on master 2017-12-13 23:39:12 +01:00
Tom Desair
a45709ae90 DS-3489: Move link creation to HAL link factory 2017-12-13 20:52:47 +01:00
Tom Desair
5297c50eeb DS-3489: Update tests to new builders 2017-12-13 20:52:47 +01:00
Raf Ponsaerts
443757a6a8 [DS-3489] fixed some issues with the implementation and fixed the tests 2017-12-13 20:52:47 +01:00
Raf Ponsaerts
f76df428f4 [DS-3489] Applied the fixes with regards to the metadata field name in the sort options, the extra api/discover/search/facets endpoint, the removal of hasMore field and the rename of searchResults to objects 2017-12-13 20:52:47 +01:00
Tom Desair
a0433f72ef DS-3489: Added HAL link factories for AuthorityEntryResource and SubmissionSectionResource 2017-12-13 20:52:34 +01:00
Raf Ponsaerts
f9041df1c8 [DS-3489] added test for the hitHighlights to the DiscoverQueryBuilderTest class 2017-12-13 20:52:34 +01:00
Raf Ponsaerts
1dfec4d4e0 DS-3489: Made the next tests and fixed the hitHighlight implementation 2017-12-13 20:52:34 +01:00
Raf Ponsaerts
cd1a84210f DS-3489: Finished writing the tests for DiscoveryRestController and wrote commentary for them 2017-12-13 20:52:34 +01:00
Tom Desair
5e9e1c78f9 DS-3489: Added unit test for DiscoverQueryBuilder 2017-12-13 20:52:34 +01:00
Tom Desair
413a52aff5 DS-3489: Embedded facet entries should also expose links 2017-12-13 20:52:34 +01:00
Tom Desair
b75aa96a3a DS-3489: Embedded facet entries should also expose links 2017-12-13 20:52:33 +01:00
Tom Desair
89968e291e DS-3489: Fix sorting search results 2017-12-13 20:52:33 +01:00
Tom Desair
cc93b8a64b DS-3489: Fix date facet values results 2017-12-13 20:52:33 +01:00
Tom Desair
b1675c3aa8 DS-3489: Minor refactoring and code cleanup 2017-12-13 20:52:33 +01:00
Tom Desair
81b833c817 DS-3489: Do not expose content twice 2017-12-13 20:52:33 +01:00
Tom Desair
1799a59aa2 DS-3489: Fix facet search link 2017-12-13 20:52:33 +01:00
Tom Desair
c839c1ce03 DS-3489: Fix facet results pagination 2017-12-13 20:52:33 +01:00
Tom Desair
09fc4ee934 DS-3489: Refactor HAL Link factories and HALResource 2017-12-13 20:52:33 +01:00
Tom Desair
cf73560c9e DS-3489: Refactored HAL Link factories 2017-12-13 20:52:33 +01:00
Tom Desair
659a6f4cb2 DS-3489: Update search objects response document 2017-12-13 20:52:33 +01:00
Tom Desair
0a5e5f3052 DS-3489: Fix facet value pagination 2017-12-13 20:52:32 +01:00
Tom Desair
c96886b598 DS-3489: Move DSpaceResource links to LinkFactory 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
a6d8808c49 DS-3489: Implemented a REST API for the facets values 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
3d67337685 DS-3489 Started working on exposing the facet values 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
6d913aa05b [DS-3489] Wrote the tests and cleaned up the code 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
f85daef824 [DS-3489] Implemented the facet configuration in the rest API. Currently showing name and type in the list of facets 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
bc488316d7 [DS-3489] Wrote tests for the added classes. Tests which require a dspace kernel or spring are left out 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
b9f6334e8c [DS-3489] Made the implementation for REST with the factory methods included 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
978b3d75c9 DS-3489: Bugfixed the links in the root level of the rest API, included all files 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
548407fdd7 DS-3489: Added an @Override equals and hashCode function to rest classes 2017-12-13 20:52:32 +01:00
Raf Ponsaerts
ee34bb07e7 DS-3489: Added license headers for the build to pass and changed a test to be consistent with a new implementation 2017-12-13 20:52:31 +01:00
Raf Ponsaerts
fbbead6137 DS-3489: Bugfixed the links in the root level of the rest API 2017-12-13 20:52:31 +01:00
Raf Ponsaerts
324403b663 DS-3489: Wrote tests for the added classes. Tests which require a dspace kernel or spring are left out 2017-12-13 20:52:31 +01:00
Raf Ponsaerts
01a9d8507b DS-3489: Made the implementation for REST with the factory methods included 2017-12-13 20:52:31 +01:00
Tom Desair
53cdcddd8e DS-3489: Adding facet values to search result part 2
DS-3489: Added trace logging
2017-12-13 20:52:31 +01:00
Tom Desair
78d85140a9 DS-3489: Adding facet values to search result 2017-12-13 20:52:31 +01:00
Tom Desair
93bc51383a DS-3489: Correct the SearchFilterResolver and sorting validation
DS-3489: self-link fix
2017-12-13 20:52:31 +01:00
Tom Desair
9359244854 DS-3489 Use authority label when searching on authority ID 2017-12-13 20:52:31 +01:00
Tom Desair
7df6201fd4 DS-3489: Converting Discovery results to REST HAL objects part 2 2017-12-13 20:52:31 +01:00
Tom Desair
f37a7868ab DS-3489: Converting Discovery results to REST HAL objects part 1
DS-3696: Improve base REST url implementation

DS-3696: Calculate link outside loop
2017-12-13 20:52:31 +01:00
Tom Desair
7871b618c4 DS-3489: DiscoverQueryBuilder implementation part 1 2017-12-13 20:52:30 +01:00
Tom Desair
6b66f1282b DS-3489: Add discover link to Root links 2017-12-13 20:52:30 +01:00
Tom Desair
4299ff9ec3 DS-3489: Added DiscoveryRestController and custom SearchFilterResolver REST parameter resolver 2017-12-13 20:52:30 +01:00
Luigi Andrea Pascarelli
90c6ed1478 D4CRIS-339 first implementation to delete method support; added delete for workspaceitem 2017-12-13 20:30:47 +01:00
Luigi Andrea Pascarelli
151d45d0d0 D4CRIS-416 first implementation of the contract (missing bitstream move operation and replace operation on attribute at accessConditions level) 2017-12-13 20:11:41 +01:00
Raf Ponsaerts
a63dde05e0 [DS-3762] fixed test for subCommunities 2017-12-13 17:38:41 +01:00
Luigi Andrea Pascarelli
d521163760 D4CRIS-416 fix set declared field with reflection 2017-12-12 22:51:36 +01:00
Andrea Bollini
a76500937b Merge pull request #22 from atmire/DS-3787
DS-3787: Cleanup deleted bitstreams after test
2017-12-12 17:30:24 +01:00
Luigi Andrea Pascarelli
f6b07a316a D4CRIS-416 make acceptanceDate readonly; change path and named operations to add/remove/replace license to comply with the new contract 2017-12-12 17:28:49 +01:00
Luigi Andrea Pascarelli
8ee3257c28 D4CRIS-416 refactoring to accept "from" attribute from operation; 2017-12-12 15:52:57 +01:00
Luigi Andrea Pascarelli
2e215bfd17 D4CRIS-416 refactoring to accept "from" attribute from operation; introduce constants to discriminate patch operation implementation; finalize move operation for metadata; 2017-12-12 15:19:57 +01:00
Luigi Andrea Pascarelli
e8a17a0aac D4CRIS-416 introduce move operation for metadata (describe step and upload step) 2017-12-12 13:18:35 +01:00
Tom Desair
386327369f DS-3787: Cleanup deleted bitstreams after test 2017-12-12 10:22:25 +01:00
Andrea Bollini
ad5463a664 DS-3787 set a default name for unnamed bitstreams (logos) 2017-12-12 01:21:15 +01:00
Andrea Bollini
c56a459a65 DS-3787 add tests to show the bug 2017-12-12 01:20:22 +01:00
Andrea Bollini
f051b764b5 Merge pull request #1893 from atmire/DS-3762_write-missing-tests
DS-3762: Write integration tests for initial endpoints
2017-12-11 23:43:37 +01:00
Luigi Andrea Pascarelli
54288d241a D4CRIS-416 introduce model class to manage patch operation deserialization 2017-12-11 17:04:31 +01:00
Luigi Andrea Pascarelli
4ff9027190 D4CRIS-416 manage request to add last element 2017-12-11 16:30:42 +01:00
Raf Ponsaerts
1271eb2a3a [DS-3762] implemented requested changes 2017-12-11 14:41:33 +01:00
Raf Ponsaerts
e61e6e7a1e [DS-3762] implemented changes 2017-12-11 14:20:45 +01:00
Andrea Bollini
aaafc1887b Merge pull request #1894 from atmire/DS-3781_Only-check-Flyway-once
DS-3781: Make sure Flyway is only executed once
2017-12-09 16:02:55 +01:00
Luigi Andrea Pascarelli
bdc4a19ad0 D4CRIS-416 first implementation to respect rest contract 2017-12-08 20:53:23 +01:00
Luigi Andrea Pascarelli
94372e65af D4CRIS-416 add comment with example 2017-12-08 20:52:38 +01:00
Luigi Andrea Pascarelli
16999ed800 D4CRIS-416 add stub move patch operation for metadata 2017-12-08 20:52:21 +01:00
Luigi Andrea Pascarelli
a31fa62bbc D4CRIS-416 first implementation to manage add/remove/replace workspaceitem-data-metadata as rest contract (not tested yet) 2017-12-07 22:51:31 +01:00
Luigi Andrea Pascarelli
db9fea71cc D4CRIS-416 try to manage path including "files" 2017-12-07 22:50:09 +01:00
Luigi Andrea Pascarelli
58b97597f0 D4CRIS-416 implements shifting to right 2017-12-07 22:48:58 +01:00
Luigi Andrea Pascarelli
2af59d1047 D4CRIS-357 implement validations 2017-12-06 17:38:04 +01:00
Luigi Andrea Pascarelli
e34e0cda9a add missing license header 2017-12-06 17:37:16 +01:00
Luigi Andrea Pascarelli
cb4121539a try to handle rights exception 2017-12-06 16:33:58 +01:00
Luigi Andrea Pascarelli
787d8e4a6f start to manage exception errors codes as provided into the Rest Contract for DSpace 7 2017-12-06 16:18:21 +01:00
Tom Desair
2d537abd7e DS-3762: Added test for Context.close() in try-with-resources block 2017-12-06 00:47:50 +01:00
Tom Desair
707bbe7108 DS-3781: Added test for Context.updateDatabase 2017-12-06 00:39:46 +01:00
Luigi Andrea Pascarelli
cef912143a DS-3743 fix exposition of qualdropvalue 2017-12-05 17:28:53 +01:00
Raf Ponsaerts
f4c07c947d [DS-3762] cleaned up unnecessary code and comments 2017-12-05 17:21:00 +01:00
Tom Desair
1dde73f55b DS-3762: Correct @author tag 2017-12-05 15:30:25 +01:00
Tom Desair
a478867804 [DS-3762] Adjust tests to new Builder API 2017-12-05 15:10:57 +01:00
Raf Ponsaerts
e070c2bd79 [DS-3762] made the findAuthorized tests to assert that no values get returned 2017-12-05 14:51:59 +01:00
Raf Ponsaerts
37016e6e9c [DS-3762] Added tests on every endpoint that a findOne lookup with a wrong UUID fails with a 404 error message 2017-12-05 14:51:59 +01:00
Raf Ponsaerts
fa1f2993c8 [DS-3762] Added extra tests using the new builders 2017-12-05 14:51:58 +01:00
Jonas Van Goolen
50c8e08da6 [DS-3762] Author update, + CRUDService implementation pre-loading 2017-12-05 14:51:58 +01:00
Jonas Van Goolen
c97f8c4320 [DS-3762] Refactoring for 1 generalised Builder 2017-12-05 14:51:58 +01:00
Raf Ponsaerts
3c8533d607 [DS-3762] Merge of all the tests and fixed for the travis build 2017-12-05 14:51:31 +01:00
Raf Ponsaerts
eaaffd403b [DS-3762] made a test for site endpoint 2017-12-05 14:51:24 +01:00
Raf Ponsaerts
078b71010a [DS-3762] added a page check to the metadataschemas endpoint and finalized the tests 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
907501c280 [DS-3762] added clause in the metadataschemamatcher and added a new, ignored, test 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
9a1ab3e9f6 [DS-3762] added tests for metadataschemas and added license headers for previous test classes 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
f90772cb4a [DS-3762] added page checks to the item tests, finalized the ItemRestRepository tests 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
63a62b9c36 [DS-3762] added a page check to the metadatafield, finalized the metadatafield endpoint tests 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
06cae1627e [DS-3762] finalized Eperson tests 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
00d3e095fb [DS-3762] added more clauses to the metadatafield tests 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
f4b6b5d480 [DS-3762] added extra clauses for the EPerson matcher 2017-12-05 14:51:23 +01:00
Raf Ponsaerts
70529b154c [DS-3762] Finalized community endpoint tests, added test for /search/top and /search/subCommunities, though the latter one is on ignore 2017-12-05 14:51:22 +01:00
Raf Ponsaerts
e9b8457289 [DS-3762] added a not empty check on the collections of the community 2017-12-05 14:51:22 +01:00
Raf Ponsaerts
759d987332 [DS-3762] added an extra clause on the dc.title metadata field for the communities 2017-12-05 14:51:22 +01:00
Raf Ponsaerts
033a734fb5 [DS-3762] finalized testing for the CollectionRestRepository 2017-12-05 14:51:22 +01:00
Raf Ponsaerts
81e91d7f54 [DS-3762] added an extra clause in the collection matcher 2017-12-05 14:51:22 +01:00
Raf Ponsaerts
7b002d93eb [DS-3762] added clauses to the bitstream matcher 2017-12-05 14:51:22 +01:00
Jonas Van Goolen
1a6100bce1 [DS-3762] Additional testing, taking into account that "Unknown" needs to exist 2017-12-05 14:51:22 +01:00
Jonas Van Goolen
20a3b03b4f [DS-3762] Commit before generalising AbstractBuilder to also include the "DSpaceCRUDService" 2017-12-05 14:51:07 +01:00
Jonas Van Goolen
e4b164dc9c [DS-3762] Tests for api/eperson/groups endpoint 2017-12-05 14:50:59 +01:00
Tom Desair
364b940ac2 DS-3762: Fixed and added bitstream tests 2017-12-05 14:50:59 +01:00
Raf Ponsaerts
d2a7824ebf [DS-3762] added license headers 2017-12-05 14:50:59 +01:00
Raf Ponsaerts
6cfdcd3923 [DS-3762] wrote tests for the items endpoint 2017-12-05 14:50:59 +01:00
Raf Ponsaerts
dd3df70b0d [DS-3762] added pagination test for the community endpoint 2017-12-05 14:50:58 +01:00
Raf Ponsaerts
043882d079 [DS-3762] wrote tests for the community endpoint, the /api/core/communities/search is still missing as there is no implementation for it 2017-12-05 14:50:58 +01:00
Raf Ponsaerts
13c3da82f9 [DS-3762] added pagination test for the collection endpoint 2017-12-05 14:50:58 +01:00
Raf Ponsaerts
3c408de9c5 [DS-3762] Wrote tests for the collection endpoint. /api/core/collections/search is not implemented yet and thus not tested 2017-12-05 14:50:49 +01:00
Raf Ponsaerts
37a4e05b77 [DS-3762] added pagination test for bitstreams endpoint 2017-12-05 14:50:49 +01:00
Raf Ponsaerts
e2ff7fa2e8 [DS-3762] wrote all possible, accepting, tests for the bitstreams endpoint. See comments for failures 2017-12-05 14:50:48 +01:00
Raf Ponsaerts
00638f1b68 [DS-3762] wrote test for metadatafield endpoint 2017-12-05 14:50:48 +01:00
Raf Ponsaerts
f6d3568eaa [DS-3762] added pagination test and license headers 2017-12-05 14:50:48 +01:00
Raf Ponsaerts
729b8cc9a8 [DS-3762] added tests for the eperson endpoint and created the logic to automatically delete objects that were build with the provided builders 2017-12-05 14:50:48 +01:00
Tom Desair
0fcc056c99 DS-3781: Restore DSpace 6 code and make sure Flyway is only executed once 2017-12-04 21:02:31 +01:00
Andrea Bollini
ec6449f0d6 Merge branch 'atmire-DS-3651_Range-Header-support' 2017-12-03 04:48:42 +01:00
Andrea Bollini
52782e0342 Merge branch 'DS-3651_Range-Header-support' of https://github.com/atmire/DSpace into atmire-DS-3651_Range-Header-support 2017-12-03 03:36:02 +01:00
Andrea Bollini
8311ae207d Merge pull request #1873 from atmire/POC_stateless_sessions
DS-3542: Stateless sessions authentication
2017-12-02 22:11:03 +01:00
Luigi Andrea Pascarelli
14a99945c5 DS-3699 fix exposition of authority for field with different valuepairs in different form definition 2017-12-01 19:19:06 +01:00
Luigi Andrea Pascarelli
53dd2c2d24 DS-3699 add language configuration for field in submission form 2017-12-01 17:43:02 +01:00
Luigi Andrea Pascarelli
5e55dfa58f fix setup of language element tag on inputform (fix the wrong behaviour that allowed only common_iso_languages instead of a custom list) 2017-12-01 17:37:19 +01:00
Tom Desair
71b0bc4cc1 DS-3651: Correct Atmire @author tag 2017-11-30 14:11:00 +01:00
Tom Desair
ad0187ff8a DS-3651: Close database connection during download/streaming 2017-11-30 09:26:39 +01:00
Tom Desair
41a59ce03c DS-3651: Remove unnecessary catch 2017-11-29 17:59:31 +01:00
Tom Desair
b0bb8ebf9e DS-3651: Correct ControllerAdvice exception handling + tests 2017-11-29 17:51:48 +01:00
frederic
9b65d72fed DS-3651 Throw exceptions instead of catching 2017-11-29 13:34:02 +01:00
frederic
22ced228e8 DS-3651 Exceptions with @ControllerAdvice 2017-11-29 13:34:02 +01:00
Tom Desair
14e9bce865 DS-3651: Fixing imports and authors 2017-11-29 13:34:02 +01:00
Tom Desair
8fe17da0cf DS-3651: Check bitstream authorizations + tests 2017-11-29 13:34:02 +01:00
Tom Desair
f9236d75f8 DS-3651: Fix imports 2017-11-29 13:34:02 +01:00
Tom Desair
df950ea475 DS-3651: Finished integration tests 2017-11-29 13:34:01 +01:00
Tom Desair
5165ab00d1 DS-3651: Adding integration tests 2017-11-29 13:33:59 +01:00
Tom Desair
26dfe8feba DS-3651: Update unit test 2017-11-29 13:33:05 +01:00
Tom Desair
fd4dde3c6f DS-3651: Log event when downloading bitstream via new REST API 2017-11-29 13:33:05 +01:00
frederic
39d202c4f7 DS-3651 Unit tests for MultipartFileSender 2017-11-29 13:33:05 +01:00
frederic
4c747a2a28 DS-3651 Fix If-Match implementation 2017-11-29 13:33:05 +01:00
frederic
40e6e955a1 DS-3651 Multirange fix 2017-11-29 13:33:05 +01:00
frederic
a02bda791f DS-3651 MultiPartFileSender testclass 2017-11-29 13:33:04 +01:00
Tom Desair
71f0dc5ca8 DS-3651: Call the event service on bitstream download 2017-11-29 13:33:04 +01:00
Tom Desair
91ae2e55d3 DS-3651: Small improvements 2017-11-29 13:33:04 +01:00
Tom Desair
a58e65025c DS-3651: Small refactoring 2017-11-29 13:33:04 +01:00
frederic
34e6676129 DS-3651 configurable buffer 2017-11-29 13:33:04 +01:00
frederic
db54bc12f2 DS-3651: several improvements 2017-11-29 13:33:04 +01:00
frederic
eb65a3fc73 DS-3651: Fixes and refactoring 2017-11-29 13:33:04 +01:00
frederic
4cedffc3be DS-3651: MultipartFileSender with inputstreams 2017-11-29 13:33:03 +01:00
Andrea Bollini
cea8ccd271 fix test failure due to the change in the name used for the embedded list on collection resource endpoints 2017-11-28 23:54:36 +01:00
Andrea Bollini
e9738d9353 fix compilation issues related to the Exception specialization 2017-11-28 19:13:27 +01:00
Hardy Pottinger
b578b03595 [DS-3757] increase default clamav socket timeout to 6 minutes (#1886) 2017-11-27 10:19:00 -06:00
Andrea Bollini
c639608ec2 Merge remote-tracking branch 'origin/master' into D4CRIS-338
# Conflicts:
#	dspace-api/src/main/java/org/dspace/app/util/SubmissionInfo.java
#	dspace-api/src/main/java/org/dspace/submit/model/SelectableMetadata.java
#	dspace-api/src/main/java/org/dspace/submit/step/DescribeStep.java
#	dspace-api/src/main/java/org/dspace/submit/step/SelectCollectionStep.java
#	dspace-api/src/main/java/org/dspace/submit/step/StartSubmissionLookupStep.java
2017-11-23 17:36:03 +01:00
Tom Desair
5b0c101b65 DS-3542 Code cleanup 2017-11-23 12:31:16 +01:00
Tom Desair
044e243802 DS-3542 Improve login form; Fix Flyway; Fix authn links 2017-11-23 00:03:43 +01:00
Tom Desair
ca37648417 Merge branch 'master' into POC_stateless_sessions 2017-11-22 18:03:52 +01:00
Tom Desair
430d074549 DS-3542 Authentication status link fixes 2017-11-22 17:44:18 +01:00
frederic
516aa41bf6 DS-3542 logout login bugfix 2017-11-22 16:22:32 +01:00
frederic
437d3bebc7 DS-3542 HAL browser Links for Authn 2017-11-22 14:01:57 +01:00
Luigi Andrea Pascarelli
b483e6f28d Merge branch 'D4CRIS-338' of https://github.com/4Science/DSpace into D4CRIS-338 2017-11-22 12:51:02 +01:00
frederic
69827e202d DS-3542 fixed test endpoints 2017-11-22 10:01:06 +01:00
Andrea Bollini
792fe8f9d1 improve the default configuration, renamed in the plural form 2017-11-22 09:12:41 +01:00
Andrea Bollini
064b4b853d improve the default configuration, renamed in the plural form 2017-11-22 09:10:38 +01:00
Luigi Andrea Pascarelli
d6b67c4bac D4CRIS-354 DS-3759 implemented move collection during submission (remove the metadata not anymore managed in the new submission form) 2017-11-21 20:04:10 +01:00
Luigi Andrea Pascarelli
8ec23cbb4f D4CRIS-354 manage empty path 2017-11-21 20:01:53 +01:00
frederic
8f92363110 DS-3542 move endpoints to authn 2017-11-21 18:07:04 +01:00
frederic
6d2e475577 DS-3542 more tests, improvements and comments 2017-11-21 17:56:54 +01:00
frederic
1c9ee80289 DS-3542 Testclasses 2017-11-21 14:06:11 +01:00
frederic
41691cbe0e DS-3542 Integration/unit tests + small changes 2017-11-21 14:05:35 +01:00
Luigi Andrea Pascarelli
b187c1d9fc enabling Cross Origin Requests for all methods 2017-11-20 16:10:47 +01:00
frederic
a14c4403a1 DS-3542 missing license header 2017-11-20 13:08:56 +01:00
frederic
0f393e7981 Merge branch 'POC_stateless_sessions' of github.com:atmire/DSpace into POC_stateless_sessions 2017-11-20 09:58:48 +01:00
frederic
80c26906e9 DS-3542 configurable encryption 2017-11-17 17:59:16 +01:00
Luigi Andrea Pascarelli
bb1f873489 expose subgroups as links relation 2017-11-17 17:58:48 +01:00
Tom Desair
e2169fb3ae DS-3542: Add support for testing with authentication 2017-11-17 17:13:53 +01:00
Luigi Andrea Pascarelli
56533d4c0f D4CRIS-338 change resource name for accesscondition to resourcepolicy 2017-11-17 16:28:02 +01:00
Luigi Andrea Pascarelli
6003012d40 D4CRIS-338 refactoring accesscondition to resourcepolicy; 2017-11-17 15:31:56 +01:00
frederic
a86f10cfd5 DS-3542 extra documentation for signing key 2017-11-17 14:25:55 +01:00
Luigi Andrea Pascarelli
aa2bf54f39 D4CRIS-338 refactoring accesscondition to resourcepolicy; implemented add resourcepolicy PATCH; change remove PATCH index basis 2017-11-17 13:23:46 +01:00
marsaoua
5d507ed23b DS-3729: Set the Bitstream deletion flag in the database in case of an item deletion (#1874) 2017-11-16 13:46:02 -06:00
frederic
28eb61f42d DS-3542 fix size salt 2017-11-16 17:37:10 +01:00
Andrea Bollini
00ba22e0dd Merge pull request #1852 from 4Science/DS-3699
DS-3699 expose submission configuration over REST
2017-11-16 15:17:05 +01:00
Luigi Andrea Pascarelli
c5164b937d D4CRIS-338 rename dto accesscondition into dto resourcepolicy 2017-11-16 12:45:53 +01:00
Luigi Andrea Pascarelli
52ce4afe2a D4CRIS-338 add move patch operation; add license and author 2017-11-16 12:32:07 +01:00
Luigi Andrea Pascarelli
a23a4d1c93 D4CRIS-338 use constant to work with "ORIGINAL" bundle 2017-11-16 12:17:41 +01:00
Luigi Andrea Pascarelli
4870d75afa D4CRIS-338 refactoring to implement PATCH operation model; manage upload section operation; 2017-11-15 19:50:12 +01:00
frederic
bf8d63c1e3 DS-3542 JWT encryption and compression 2017-11-15 17:17:55 +01:00
frederic
2e650d8964 DS-3542 Pass by authorization header 2017-11-15 16:15:10 +01:00
Luigi Andrea Pascarelli
c9b4270af2 add support for rel/relid with minus and underscore 2017-11-15 11:39:40 +01:00
Luigi Andrea Pascarelli
3e77e73fc2 D4CRIS-338 save granted date into dcterms.accessRights (was dc.rights.date) 2017-11-15 11:39:13 +01:00
Luigi Andrea Pascarelli
7a3532993f Merge branch 'D4CRIS-338' of https://github.com/4Science/DSpace into D4CRIS-338
# Conflicts:
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/RestResourceController.java
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/submit/step/DescribeStep.java
2017-11-15 00:37:46 +01:00
Luigi Andrea Pascarelli
fc13ad5725 D4CRIS-338 add implementation to grant license 2017-11-15 00:31:30 +01:00
Luigi Andrea Pascarelli
1e1416143d D4CRIS-338 add implementation for save metadata; add stub logic for patch processing 2017-11-14 23:47:28 +01:00
Luigi Andrea Pascarelli
75cbe4d093 D4CRIS-338 add support for patch update [rfc6902] 2017-11-14 23:46:24 +01:00
Luigi Andrea Pascarelli
6505523477 Use stringutils to check empty/null string 2017-11-14 23:43:58 +01:00
Andrea Bollini
b82e9bbbf2 Merge branch 'D4CRIS-338' of https://github.com/4Science/DSpace into D4CRIS-338
# Conflicts:
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/RestResourceController.java
2017-11-14 18:40:01 +01:00
Andrea Bollini
84f0810fc7 separate the DirectlyAddressableRestModel from the base RestModel 2017-11-14 18:34:23 +01:00
Andrea Bollini
6e3395a997 D4CRIS-338 expose the metadata section in the upload as a link to a submissionForm 2017-11-14 18:33:31 +01:00
Luigi Andrea Pascarelli
7ef0d2992e D4CRIS-338 D4CRIS-345 add support for true/false status and error message; removed support for multifiles 2017-11-14 13:00:00 +01:00
Andrea Bollini
bd2ba93635 rename AccessCondition Link Repository 2017-11-13 20:52:39 +01:00
Andrea Bollini
4e54d528d8 fix camel case 2017-11-13 20:52:16 +01:00
Luigi Andrea Pascarelli
3fd4b01980 D4CRIS-338 fix name attribute 2017-11-13 19:54:48 +01:00
Luigi Andrea Pascarelli
781a567764 D4CRIS-338 D4CRIS-345 add first support for upload file; first implementation for upload bitstream during submission 2017-11-13 19:36:04 +01:00
Andrea Bollini
0e9d7521ab rename the DefaultAccessCondition classes in AccessCondition 2017-11-13 19:29:08 +01:00
Andrea Bollini
e386f0d49e allow to set the collection resource name also for not DSpaceRestResource 2017-11-13 18:31:18 +01:00
Andrea Bollini
4a0bd31483 fix wrong link creation in linked resources 2017-11-13 18:30:45 +01:00
Andrea Bollini
280d41fd3d Merge commit '8883cf874da22f3fe459c08b0dcf5bd173f1a4d8' into D4CRIS-338
# Conflicts:
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/converter/AccessConditionsConverter.java
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/model/CollectionRest.java
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/model/DefaultAccessConditionRest.java
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/model/hateoas/CollectionResource.java
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/model/hateoas/DefaultAccessConditionResource.java
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/repository/DefaultAccessConditionRestLinkRepository.java
2017-11-13 18:11:02 +01:00
Andrea Bollini
a4567e4297 cleanup the use of the LinkRestRepository 2017-11-13 15:49:37 +01:00
Tom Desair
561d577939 DS-3542 Fix salt reuse + Authorization header + licenses 2017-11-13 11:24:38 +01:00
frederic
3fb72221ca DS-3542 Reuse salt if not expired 2017-11-13 09:45:23 +01:00
Luigi Andrea Pascarelli
8883cf874d D4CRIS-338 add license 2017-11-10 23:18:45 +01:00
Luigi Andrea Pascarelli
bf04bcc9cf D4CRIS-338 refactoring 2017-11-10 23:16:10 +01:00
Luigi Andrea Pascarelli
f940cffd3e D4CRIS-338 move service to the api 2017-11-10 22:14:04 +01:00
Luigi Andrea Pascarelli
5a2666637b D4CRIS-338 refactoring object model for endpoint default access condition; implementation of upload section config endpoint 2017-11-10 20:58:03 +01:00
Luigi Andrea Pascarelli
9945617fd3 D4CRIS-338 implementation of endpoint for collection default access condition 2017-11-09 20:02:11 +01:00
Luigi Andrea Pascarelli
994da90f80 DS-3699 fix after change to isFieldPresent 2017-11-09 19:18:02 +01:00
Luigi Andrea Pascarelli
d409331f8c Add support for single linked object 2017-11-09 19:03:51 +01:00
frederic
dd8fbedf90 DS-3542 refresh token via login endpoint 2017-11-09 12:43:47 +01:00
Luigi Andrea Pascarelli
af27a978eb minor organize import 2017-11-09 12:17:01 +01:00
Luigi Andrea Pascarelli
53ab0e8d8e D4CRIS-338 implementation of endpoint for collection license 2017-11-09 12:16:29 +01:00
Luigi Andrea Pascarelli
3f04880ef5 introduce return single object from linkrepository method 2017-11-09 12:13:20 +01:00
Andrea Bollini
6b63b6ea41 DS-3699 Introduce a dedicated exception for the SubmissionConfigReader failures 2017-11-08 23:20:24 +01:00
Andrea Bollini
e6f68668fd DS-3699 Add comment about the scope of the SelectableMetadata class 2017-11-08 23:14:15 +01:00
Luigi Andrea Pascarelli
0886e1ed99 DS-3740 add license and java doc 2017-11-08 17:57:32 +01:00
Luigi Andrea Pascarelli
72afc3107a DS-3743 add java documentation 2017-11-08 17:57:13 +01:00
Luigi Andrea Pascarelli
f66228b42a Merge branch 'DS-3743' into DS-3740
# Conflicts:
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/repository/WorkspaceItemRestRepository.java
2017-11-08 17:38:04 +01:00
Luigi Andrea Pascarelli
a44034fc9c DS-3743 remove json ignore from "id" attribute 2017-11-08 17:26:05 +01:00
Luigi Andrea Pascarelli
83da4d9fe7 DS-3743 implement upload and license step section 2017-11-08 17:24:45 +01:00
frederic
79ada4ea0f DS-3542 token to Authorization header and hal browser login 2017-11-08 16:19:49 +01:00
frederic
2350c1e758 DS-3542 Fix eperson link 2017-11-08 14:25:03 +01:00
frederic
2dcfa28ae3 DS-3542 eperson embedded in status rest 2017-11-08 13:11:11 +01:00
Luigi Andrea Pascarelli
23c3f3ed58 DS-3743 implementation of findbysubmitter 2017-11-08 09:45:14 +01:00
Luigi Andrea Pascarelli
b54d0f09a9 DS-3743 metadata exposition key with dot separator; add @JsonAnyGetter annotation to remove serialization of metadata label 2017-11-08 09:30:09 +01:00
Luigi Andrea Pascarelli
7ac8722ce0 DS-3743 modify getdata now return serializable 2017-11-07 20:55:43 +01:00
Luigi Andrea Pascarelli
9b48854db7 DS-3743 fix initialize reader for submission config; add missing setcollection and setsubmissiondefinition into the converter for WSI 2017-11-07 19:32:26 +01:00
Luigi Andrea Pascarelli
0112af21bc DS-3740 refactoring method 2017-11-07 19:08:38 +01:00
Luigi Andrea Pascarelli
7be035f553 Merge branch 'DS-3743' into DS-3740 2017-11-07 19:05:37 +01:00
Luigi Andrea Pascarelli
20e4ca3617 DS-3740 reintroduce collection rest attribute in workspaceitem rest object model 2017-11-07 19:05:05 +01:00
Luigi Andrea Pascarelli
8c7685b987 Merge branch 'DS-3743' into DS-3740
# Conflicts:
#	dspace-spring-rest/src/main/java/org/dspace/app/rest/model/WorkspaceItemRest.java
2017-11-07 19:02:28 +01:00
Luigi Andrea Pascarelli
d47f5a0334 DS-3743 add missing license 2017-11-07 19:00:49 +01:00
Andrea Bollini
0f4ccf2a46 fix mapping for entities with string ID 2017-11-07 18:59:18 +01:00
Andrea Bollini
7f84c4b082 avoid ambiguous mapping when a digit-only id coexists with an alphanumeric id 2017-11-07 18:59:08 +01:00
Luigi Andrea Pascarelli
595c0444eb DS-3743 introduce SectionData 2017-11-07 18:58:19 +01:00
Luigi Andrea Pascarelli
624258d159 DS-3740 stub implementation of update 2017-11-07 18:40:46 +01:00
frederic
a8582ea478 DS-3542 feedback fixes 2017-11-07 16:16:22 +01:00
Luigi Andrea Pascarelli
06d4c7270c DS-3740 add commit transaction on create method 2017-11-07 15:57:34 +01:00
Andrea Bollini
ddb545ee1a fix mapping for entities with string ID 2017-11-07 15:12:00 +01:00
Luigi Andrea Pascarelli
f7895619f4 DS-3740 move fake turnoff to create fake user 2017-11-07 13:15:49 +01:00
Luigi Andrea Pascarelli
8dcfebad10 DS-3740 add missed commit to save the fake user 2017-11-07 13:04:20 +01:00
Luigi Andrea Pascarelli
fcf57ed177 Merge branch 'D4CRIS-337' of https://github.com/4Science/DSpace into D4CRIS-337 2017-11-07 12:18:57 +01:00
Andrea Bollini
db33e62bc4 avoid ambiguous mapping when a digit-only id coexists with an alphanumeric id 2017-11-07 12:10:56 +01:00
Luigi Andrea Pascarelli
9139a8421e DS-3740 add missed set submission definition 2017-11-07 12:04:18 +01:00
frederic
89e3e9f720 Merge branch 'POC_stateless_sessions' of github.com:atmire/DSpace into POC_stateless_sessions 2017-11-07 12:03:10 +01:00
frederic
c77bcf5d8a DS-3542 extensible jwt claims 2017-11-07 12:02:55 +01:00
Luigi Andrea Pascarelli
6eab911779 DS-3740 refactoring use of the fake user move into rest 2017-11-07 11:56:51 +01:00
Luigi Andrea Pascarelli
1b278ec1dd DS-3740 the init of workspaceitem is delegated to an external service 2017-11-07 11:55:53 +01:00
Luigi Andrea Pascarelli
0539c9246d DS-3740 remove method doPreProcessing (step have todo no operations in this phase) 2017-11-07 11:53:47 +01:00
Luigi Andrea Pascarelli
f3d09ee813 DS-3740 minor comment 2017-11-07 09:39:21 +01:00
Luigi Andrea Pascarelli
363320a38c DS-3740 introduced dry run with a fake user (NOTE: this is experimental code to work without authorization) 2017-11-06 19:20:24 +01:00
Luigi Andrea Pascarelli
16108d2cf9 DS-3740 design preprocessing step method; implemented preprocessing into SelectCollectionStep 2017-11-06 19:19:19 +01:00
Luigi Andrea Pascarelli
d4cc4351d9 DS-3740 use standardize utility method 2017-11-06 18:27:58 +01:00
Luigi Andrea Pascarelli
f5215931dc DS-3740 implement POST method, add create workspaceitem by default 2017-11-06 17:45:11 +01:00
Tom Desair
c5d4f9d5b1 DS-3542: Build works locally, try again on Travis 2017-11-06 17:05:50 +01:00
Tom Desair
6249f3f08f DS-3542: Fixes after rebase 2017-11-06 14:50:24 +01:00
Tom Desair
8fa5394f9d DS-3542: Fix autowiring 2017-11-06 14:08:56 +01:00
Tom Desair
1c8de472e7 DS-3542: Cleanup SessionImpl 2017-11-06 14:08:55 +01:00
Tom Desair
eed50c363d DS-3542: Cleanup session service. TODO SessionImpl 2017-11-06 14:08:55 +01:00
Tom Desair
8dcdcb1ebd DS-3542: Refactoring and comments 2017-11-06 14:08:55 +01:00
frederic
e0b9f90aed DS-3542 Extra documentation and licenses 2017-11-06 14:08:55 +01:00
frederic
2cf4baf3ac DS-3542 remove use of sessions 2017-11-06 14:08:55 +01:00
frederic
3630c817e7 DS-3542 expiration time configurable 2017-11-06 14:08:43 +01:00
frederic
ec0be65d34 DS-3542 expiration time configurable 2017-11-06 14:08:01 +01:00
frederic
bc055f1612 DS-3542 More detailed status end point and other improvements 2017-11-06 14:06:35 +01:00
frederic
5f25ea2387 DS-3542 Logout endpoint and various improvements 2017-11-06 14:06:34 +01:00
frederic
02c2612889 DS-3542 status endpoint 2017-11-06 14:06:34 +01:00
frederic
3e366bf442 DS-3542: invalided sessions with salt 2017-11-06 14:06:30 +01:00
frederic
da3a941e0a DS-3542: first login with jwt concept 2017-11-06 14:06:20 +01:00
Luigi Andrea Pascarelli
b2b035ca09 DS-3740 comment in patch method; add rest step to define data exposure (getData) 2017-11-06 13:05:15 +01:00
Luigi Andrea Pascarelli
910dfcce80 DS-3740 remove old DTO class for submission process 2017-11-03 19:58:04 +01:00
Luigi Andrea Pascarelli
4716128ac3 Merge branch 'DS-3699' into DS-3740 2017-11-03 19:00:06 +01:00
Luigi Andrea Pascarelli
216d7481bb DS-3699 enhance wrapper to pass integration test;
DS-3484 catch generic Exception
2017-11-03 17:00:16 +01:00
Luigi Andrea Pascarelli
cdd9049b6b Merge branch 'master' into DS-3699 2017-11-03 11:15:42 +01:00
Luigi Andrea Pascarelli
8af7a74646 DS-3740 stub refactoring for processing step; 2017-11-03 10:59:25 +01:00
Luigi Andrea Pascarelli
6523b2faf7 Merge branch 'DS-3699' into DS-3740 2017-11-03 10:27:09 +01:00
Luigi Andrea Pascarelli
4014c317e3 DS-3699 minor comment adjustments; add javadoc to tokenize and standardize utility methods 2017-11-03 09:55:03 +01:00
Luigi Andrea Pascarelli
ea79d17df1 DS-3699 replace with the standardize method 2017-11-03 09:53:51 +01:00
Luigi Andrea Pascarelli
6bed6ac158 DS-3740 finalize fromModel converter method; start refactoring for processing step; 2017-11-03 09:19:11 +01:00
Luigi Andrea Pascarelli
d41ccfacc1 DS-3740 introduce findBySubmitter 2017-11-03 09:03:29 +01:00
Luigi Andrea Pascarelli
eb9b2b6010 stub: introduce PATCH, attempting to manage it as per https://tools.ietf.org/html/rfc6902 2017-11-03 09:02:25 +01:00
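RFC 6902 ("JSON Patch"), referenced in the commit above, describes a patch as a list of operations applied against a JSON document. A minimal sketch of the "add"/"replace" semantics — illustrative Python, not the Java code this commit stubs out, and with a made-up item document:

```python
# A minimal RFC 6902 (JSON Patch) sketch covering only the "add" and
# "replace" operations on object members; a real implementation also
# handles "remove", "move", "copy", "test", array indices, and the
# ~0/~1 escape sequences in path tokens.
def apply_patch(doc, operations):
    for op in operations:
        tokens = [t for t in op["path"].split("/") if t]
        target = doc
        for token in tokens[:-1]:
            target = target[token]
        if op["op"] in ("add", "replace"):
            target[tokens[-1]] = op["value"]
        else:
            raise NotImplementedError(op["op"])
    return doc

# Hypothetical item document; field names are illustrative only
item = {"metadata": {"dc.title": "Old title"}}
apply_patch(item, [
    {"op": "replace", "path": "/metadata/dc.title", "value": "New title"},
])
```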
Luigi Andrea Pascarelli
dcacb03993 format code 2017-11-03 09:00:41 +01:00
Andrea Bollini
716912f275 Merge pull request #1866 from atmire/DS-3484_Spring-mvc-test-REST-API
DS-3484: REST API integration tests with Spring MVC test
2017-11-03 08:31:57 +01:00
Luigi Andrea Pascarelli
03007a8f6a Merge branch 'DS-3740' of https://github.com/4Science/DSpace into DS-3740 2017-11-02 12:45:21 +01:00
Andrea Bollini
e71c036111 DS-3740 initial draft of the workspaceitems endpoint 2017-11-02 12:28:04 +01:00
Tom Desair
0aa663479b DS-3484: Improve ApplicationContext initialization 2017-11-02 11:19:13 +01:00
Tom Desair
84edb50afc DS-3484: Added pagination test + extra logging + Solr Service bug 2017-11-02 11:18:50 +01:00
Raf Ponsaerts
255b8a7e8e [DS-3484] Added tests for the BrowseEntryResourceIT controller 2017-11-02 11:18:50 +01:00
Tom Desair
6a7c60c121 DS-3484: Refactoring builders and extended Browse by title test 2017-11-02 11:18:50 +01:00
Tom Desair
9529e5634f DS-3484: Added SOLR integration for the REST integration tests 2017-11-02 11:18:49 +01:00
Tom Desair
876d0e9a82 DS-3484: Initial setup of Spring MVC test + RootRestResourceControllerTest 2017-11-02 11:18:49 +01:00
Luigi Andrea Pascarelli
f755f59d38 Merge pull request #19 from AlexanderS/DS-3699
DS-3724: Missing readonly in input-forms.dtd
2017-11-02 10:43:10 +01:00
Alexander Sulfrian
377fefbfe7 DS-3724: Missing readonly in input-forms.dtd 2017-11-01 18:40:13 +01:00
Tim Donohue
260e4178df Merge pull request #1865 from atmire/DS-3711_JaCoCo-code-coverage-maven-plugin
DS-3711: JaCoCo maven plugin
2017-11-02 01:22:36 +10:00
Luigi Andrea Pascarelli
779f561e68 DS-3699 use authorities.keySet() here and get rid of the authorityNames class field; replace the logic in the makefield method with the Utils.standardize static method; 2017-10-31 21:14:47 +01:00
Luigi Andrea Pascarelli
f2c8574204 DS-3699 use stringutils 2017-10-31 21:13:22 +01:00
Luigi Andrea Pascarelli
3a96b940fb [DS-3699] change name of the DTD into submission-forms.xml test file 2017-10-25 17:13:26 +02:00
Luigi Andrea Pascarelli
22e9844987 [DS-3699] manage value-pairs and vocabulary authorities with both the controlled and required configuration set to true 2017-10-25 17:03:45 +02:00
Luigi Andrea Pascarelli
2f5d18ab04 [DS-3699] misc refactoring; fix typo; add logger; remove test configuration; 2017-10-25 11:59:13 +02:00
Luigi Andrea Pascarelli
5ce58f4f86 [DS-3699] renamed endpoint (remove the dash from the name) 2017-10-23 11:48:41 +02:00
Luigi Andrea Pascarelli
90b916f782 [DS-3699] renamed attribute submission section 2017-10-23 11:22:38 +02:00
Luigi Andrea Pascarelli
14808d7700 [DS-3699] remove model rest page for input form; add use of constant to match section type for submission-forms 2017-10-23 10:57:46 +02:00
Luigi Andrea Pascarelli
2770a13fa2 [DS-3699] change name to input-forms.dtd; replace all occurrences of input-forms.xml in Java files; replace type for step definition (WAS input-form, NOW submission-form is required) 2017-10-23 10:55:39 +02:00
Luigi Andrea Pascarelli
74539f3458 [DS-3699] finalizing refactoring; improved authority endpoint; 2017-10-20 15:53:39 +02:00
Luigi Andrea Pascarelli
94cf2bd1f0 [DS-3699] refactoring class name 2017-10-20 15:33:24 +02:00
Luigi Andrea Pascarelli
e7254cf62c DS-3699 auto register choice plugin for value-pairs and controlled vocabulary 2017-10-17 12:06:23 +02:00
Luigi Andrea Pascarelli
ba5ccb1099 DS-3699 changed file name (NOW submission-forms.xml WAS input-forms.xml) 2017-10-17 11:29:34 +02:00
Luigi Andrea Pascarelli
8df24f883d DS-3699 renamed endpoint and file for input-forms (now is called submission-forms); modified category for configuration service (WAS configuration NOW config) 2017-10-16 19:44:08 +02:00
Luigi Andrea Pascarelli
e5a97ff7f6 remove unused object 2017-10-16 10:53:19 +02:00
Luigi Andrea Pascarelli
fde9032ad3 fix throw exception 2017-10-16 10:26:42 +02:00
Luigi Andrea Pascarelli
b447f89c40 D4CRIS-342 enable authority for value-pairs common_types (for test scope); update input-form.xml (for test scope) 2017-10-13 19:07:09 +02:00
Luigi Andrea Pascarelli
114aad03f0 D4CRIS-342 control input-type with custom rule 2017-10-13 18:33:09 +02:00
Luigi Andrea Pascarelli
129824bdfa D4CRIS-342 cleanup item-submission; remove collection and complete required step; 2017-10-13 17:27:48 +02:00
Luigi Andrea Pascarelli
e3aeb3d35b D4CRIS-342 cleanup item-submission; add required attribute "id" into the step element 2017-10-13 16:18:48 +02:00
Luigi Andrea Pascarelli
88365ccd11 add comment in the gitignore to explain rebel.xml rule 2017-10-13 15:24:22 +02:00
Luigi Andrea Pascarelli
d941883ddb D4CRIS-342 change named link from panel to inputform (was input-form now is configuration) 2017-10-13 12:35:43 +02:00
Luigi Andrea Pascarelli
f875c33648 D4CRIS-342 change named endpoint from panel to submission-panel 2017-10-13 12:35:43 +02:00
Luigi Andrea Pascarelli
80236758fc D4CRIS-336 exposition of vocabulary as authority 2017-10-13 12:35:43 +02:00
Luigi Andrea Pascarelli
dbddc1f7f0 D4CRIS-342 finalize endpoint to expose link from panel to inputform 2017-10-13 12:35:43 +02:00
Luigi Andrea Pascarelli
0678b91f19 D4CRIS-342 start point to develop panel rest endpoint 2017-10-13 12:35:42 +02:00
Luigi Andrea Pascarelli
d0aee26ac7 D4CRIS-342 respect the new contract for submission definition and input form 2017-10-13 12:35:42 +02:00
Luigi Andrea Pascarelli
7b21429fc6 D4CRIS-336 refactoring endpoint to retrieve authority; finalize dynamic discovery of valuepairs and vocabulary as authority; exposition of relid; D4CRIS-342 improve selectablemetadata; add check in the inputform converter to add authority only if the authority has choices; 2017-10-13 12:35:42 +02:00
Luigi Andrea Pascarelli
e7c3c9d8bc D4CRIS-336 improving endpoint; Add wrapper for authority derived from valuepairs or vocabulary 2017-10-13 12:35:41 +02:00
Luigi Andrea Pascarelli
6a46863580 D4CRIS-336 remove requestbody parameter 2017-10-13 12:35:41 +02:00
Luigi Andrea Pascarelli
9545ed5afe D4CRIS-336 remove requestbody parameter 2017-10-13 12:35:41 +02:00
Luigi Andrea Pascarelli
6dc5e1953d D4CRIS-336 try to inject body request parameter 2017-10-13 12:35:41 +02:00
Luigi Andrea Pascarelli
e13b6fe35a D4CRIS-336 finalize authority rest resource; starting point for authority entry rest 2017-10-13 12:35:40 +02:00
Luigi Andrea Pascarelli
d890cbb44f Refactoring (create package and move annotation); fix typo 2017-10-13 12:35:40 +02:00
Luigi Andrea Pascarelli
424541fd01 D4CRIS-335 change naming convention following a suggestion from abollini 2017-10-13 12:35:40 +02:00
Luigi Andrea Pascarelli
23eb15c1ad D4CRIS-336 add authority rest endpoint; add stub for retrieve authority entries 2017-10-13 12:35:39 +02:00
Luigi Andrea Pascarelli
a05f6a646a DS-3699 add community service with autowired 2017-10-13 12:35:38 +02:00
Luigi Andrea Pascarelli
3bc94a28f2 add rebel.xml to the gitignore 2017-10-13 12:35:38 +02:00
Andrea Bollini
2ef05eb448 DS-3699 embedding authority information in the inputform 2017-10-13 12:33:45 +02:00
Andrea Bollini
8c83c2d419 DS-3699 first implementation input-form endpoint 2017-10-13 12:33:45 +02:00
Andrea Bollini
d026d72017 DS-3699 add support for regex validation, page heading and mandatory flag 2017-10-13 12:33:45 +02:00
Andrea Bollini
2594415db7 DS-3699 code cleanup 2017-10-13 12:33:44 +02:00
Andrea Bollini
a42a18c4cb DS-3699 renamed getInputs in getInputsByCollection 2017-10-13 12:33:44 +02:00
Andrea Bollini
a54d0ba47e DS-3699 add findByCollection to the submission definition repository 2017-10-13 12:33:44 +02:00
Andrea Bollini
d9fc8bea05 DS-3699 flag the default submission config 2017-10-13 12:33:44 +02:00
Andrea Bollini
5247bc86e3 DS-3699 Expose the item-submission configuration over the new REST 2017-10-13 12:33:44 +02:00
Tom Desair
205ad24d3b DS-3711: Retrigger build after Coveralls 404 error 2017-10-13 10:06:08 +02:00
Tom Desair
6fa8832f8c DS-3711: Move aggregated reports to DSpace Assembly and Configuration module 2017-10-12 22:52:49 +02:00
Tom Desair
eea9993b8a DS-3711: Configured JaCoCo and Coveralls code coverage tools
DS-3711: Added and configured JaCoCo maven plugin for DSpace API module

DS-3711: Attempt to configure the Coveralls plugin

DS-3711: Updated coveralls-maven-plugin configuration for DSpace API

Added comment

Added dspace-oai coverage reporting + updated coveralls config to not run locally

DS-3711: Correct skip-coveralls configuration

DS-3711: Second attempt to correct skip-coveralls configuration

DS-3711: Attempt to aggregate JaCoCo reports for coverall

DS-3711: Second attempt to aggregate JaCoCo reports for coveralls

Fix maven dependencies

DS-3711: Attempt to fix coveralls No source found for error
2017-10-12 22:51:48 +02:00
Terry Brady
48aa148bf7 Merge pull request #1861 from tdonohue/add_error_prone_checks
DS-3712: Add Error Prone checks to catch common Java code mistakes
2017-10-11 13:13:33 -07:00
Andrea Bollini
0bc719eb7e Merge pull request #1843 from atmire/DS-3697_Autowire-DSpace-service-in-Spring-REST-webapp
DS-3697: Enable autowiring of DSpace Services in the Spring REST webapp
2017-10-11 13:47:49 +02:00
Andrea Bollini
a1c46a488e Merge pull request #1726 from 4Science/DS-3544
DS-3544 add support to search methods for repositories
2017-10-10 21:56:33 +02:00
Tim Donohue
3fceaf46c3 Bug fix to Unit Tests as reported by ErrorProne 2017-10-05 17:28:22 +00:00
Tim Donohue
74cff74c3e Bug fixes to OAI-PMH as reported by Error Prone 2017-10-05 17:05:31 +00:00
Tim Donohue
5edb6f4a7c Bug fixes to REST API as reported by Error Prone 2017-10-05 17:05:04 +00:00
Tim Donohue
22f7690a19 Bug fixes to Java API reported by ErrorProne 2017-10-05 17:04:31 +00:00
Tim Donohue
9ba039a51a Enable ErrorProne in POM 2017-10-05 17:04:16 +00:00
Tom Desair
e2d36236c0 DS-3697: More imports cleaning 2017-10-04 13:04:44 +02:00
Tom Desair
84c9341381 DS-3697: Restore DSpace Kernel startup on Servlet Context init + fix imports 2017-10-04 13:00:12 +02:00
Tom Desair
6628d872d8 DS-3697: Enable autowiring of DSpace Services in the Spring REST webapp 2017-10-04 12:37:14 +02:00
Andrea Bollini
5b2b2b6764 DS-3544 code cleanup 2017-09-30 14:43:24 +02:00
Andrea Bollini
5200b203ac DS-3544 fix repository methods to return Page instead of List 2017-09-30 14:43:03 +02:00
Andrea Bollini
4ead87a23a DS-3544 refactored the repository-related code into a utils class 2017-09-30 14:42:03 +02:00
Andrea Bollini
5bd215c1cf DS-3544 add support for search methods that return a single instance 2017-09-30 11:53:29 +02:00
Andrea Bollini
e34375e6cb DS-3544 only expose the search endpoint if there are search methods 2017-09-30 11:53:12 +02:00
Andrea Bollini
31929bd2e0 DS-3544 fix (fake) pagination of top communities add subCommunities search method 2017-09-30 11:51:52 +02:00
Andrea Bollini
8f0be17158 DS-3544 add support for custom search methods signatures 2017-09-30 11:51:52 +02:00
Andrea Bollini
5adeadcdff DS-3544 add support to search methods for repositories 2017-09-30 11:51:52 +02:00
Pascal-Nicolas Becker
a786dd74c8 DS-3627: Cleanup utility leaves files in assetstore 2017-09-29 17:47:41 +02:00
Andrea Bollini
09d58f5e73 DS-3701 explicitly set the doi table name to lowercase 2017-09-27 09:04:19 +02:00
Tom Desair
ca63b4f3e1 DS-3698 Correct HAL browser context path 2017-09-26 21:11:56 +02:00
Tom Desair (Atmire)
0d6688e91b DS-3696: Absolute links in root resource (#1841)
* Return absolute URLs in the Links section of the ROOT resource

* DS-3696: Root context should return absolute URLs

* DS-3696: Improve base REST url implementation

* DS-3696: Calculate link outside loop
2017-09-26 19:16:58 +02:00
Andrea Bollini
83446ab8eb Merge pull request #1809 from DSpace/bitstream-retrieve-renamed
HAL link to the bitstream content renamed
2017-09-25 10:26:48 +02:00
Pascal-Nicolas Becker
bd0c458a19 DS-3680: Remove problematic uncaching. Also see DS-3681 as follow-up. 2017-08-30 20:45:09 +00:00
Pascal-Nicolas Becker
5510415ab6 DS-3680: clarify that we need to dispatch events before committing 2017-08-30 20:44:58 +00:00
Pascal-Nicolas Becker
74b75d546a Revert "Events must be dispatched after commit() to ensure they can retrieve latest data from DB"
This reverts commit 646936a3d8.
2017-08-30 20:43:40 +00:00
Terry Brady
fd934e6e3b [DS-3602] Ensure Consistent Use of Legacy Id in Usage Queries (#1782)
* ensure that owning Item,Coll,Comm use legacy consistently

* scopeId query

* refine queries

* alter id query

* Commenting the behavior of the id / legacyId search

* Address duplicate disp for DSO w legacy and uuid stats
2017-08-17 13:56:24 +00:00
Tim Donohue
4b017e6d7d Replace dispatchEvents() call with an actual commit() to ensure changes are saved 2017-08-16 20:58:29 +00:00
Tim Donohue
058e0c7e39 Events must be dispatched after commit() to ensure they can retrieve latest data from DB 2017-08-16 20:58:18 +00:00
Tim Donohue
43b841fcee DS-3648: Don't uncache submitter and related groups. Also DS-3656: Flush changes before evict() 2017-08-16 20:54:52 +00:00
Mark H. Wood
dd1f64a1e3 Merge pull request #1827 from hardyoyo/DS-3674-provide-inputforms-config-for-plugin-test
[DS-3674] copied over input-forms.xml to the test config folder
2017-08-15 15:35:21 -04:00
Hardy Pottinger
293a1e15f9 [DS-3674] copied over input-forms.xml to the test config folder 2017-08-15 16:56:28 +00:00
Terry W Brady
981c126c74 Normalize space 2017-08-09 13:20:25 -07:00
Terry W Brady
64634be53f Port PR1817, Only request image info if color space 2017-08-09 13:20:03 -07:00
Alexander Sulfrian
388558792c DS-3660: Fix discovery reindex on metadata change
Stored objects may get evicted from the session cache and end up in a detached
state. Lazy-loaded fields are then inaccessible and throw an exception on access.

Before using objects they have to be reloaded (retrieved from the
database and associated with the session again).
2017-08-09 19:31:27 +00:00
Tim Donohue
342f190231 DS-3659: Ensure readonly connections can never rollback 2017-08-03 13:52:57 +00:00
Tim Donohue
e4cef702fe Merge pull request #1783 from 4Science/bitstream-retrieve
Add Bitstream /content endpoint
2017-07-14 23:54:29 +10:00
Andrea Bollini
56650ee49a remove old retrieve link name 2017-07-14 12:08:28 +02:00
Andrea Bollini
89050bea57 Keep the HAL link name in sync with the endpoint path
Also keep the old link name "retrieve" in place to stay backward compatible
2017-07-14 12:06:35 +02:00
Andrea Bollini
bf304a6292 rename endpoint to retrieve the bitstream content in /content 2017-07-14 10:52:20 +02:00
Andrea Bollini
7d177168f7 add link to the logo from communities and collections and the link from the community to its collection 2017-07-14 10:52:20 +02:00
Andrea Bollini
8cdee35e1f initial draft to add the functionality to retrieve the bitstream content 2017-07-14 10:52:20 +02:00
Andrea Bollini
e4bc0f028e DS-3483 add support to embed linked resource collections in the HAL document
This is an initial draft that requires further refinement. By default all the collection properties
are now embedded in the response; linked entities listed in the LinksRest annotation of the repository are
included only if specified in the resource wrapper instantiation and supported for embedding by
the link repository (i.e. the relation has a suitable default or doesn't depend on additional parameters)
2017-07-14 10:52:20 +02:00
Andrea Bollini
0d21ce0c7d Merge pull request #1808 from tdonohue/travis-cleanup
Speed up Travis CI build by removing Mirage2 dependencies
2017-07-14 10:51:24 +02:00
Tim Donohue
e34b6980ec Speed up Travis CI build by removing installation of Mirage2 dependencies 2017-07-13 21:20:23 +00:00
Tim Donohue
c069ec0621 Pin versions of SASS and Compass that Travis uses 2017-07-13 17:01:06 +00:00
Tom Desair
f65f9ab2ea DS-3127: Update test assert descriptions of GoogleBitstreamComparatorTest 2017-07-11 21:02:30 +00:00
Tom Desair
bcc6ebd894 DS-3127: Prevent database updates when directly manipulating the bitstream list of a bundle 2017-07-11 21:02:13 +00:00
Tom Desair
260f346a74 Fix IT tests 2017-07-11 20:42:03 +00:00
Tom Desair
fcb91d6771 DS-3127: Process review feedback and fix tests 2017-07-11 20:41:53 +00:00
frederic
0ffc3c9a27 ported DS-3558 from dspace 5 to dspace6 2017-07-11 20:41:38 +00:00
Tom Desair
5cb3bc81dc DS-3632: Prevent the use of the locate function as this seems to give inconsistent results 2017-07-11 20:32:20 +00:00
Tom Desair
d267857f59 DS-3632: Changed the update-handle-prefix script so that it does not change the handle suffix 2017-07-11 20:32:02 +00:00
Tim Donohue
78ef7f9243 DS-3431 : Fix broken tests by removing nullifying of global eperson 2017-07-11 16:37:35 +00:00
Pascal-Nicolas Becker
04ec199ff3 [DS-3431] Harden DSpace's BasicWorkflowService 2017-07-11 16:37:19 +00:00
Saiful Amin
0ceb003cd3 Change Content-Type in OAI-Response
As per OAI 2.0 spec (3.1.2.1) the response content type *must* be text/xml.
http://www.openarchives.org/OAI/openarchivesprotocol.html#HTTPResponseFormat

Our OAI client is rejecting the response from the DSpace OAI server.
2017-07-11 21:02:05 +05:30
Tom Desair
b0ed059ab5 DS-3628: Check READ resource policies for items returned by the REST find-by-metadata-field endpoint 2017-07-07 19:49:30 +00:00
Tim Donohue
fdb4780a3a DS-3619: AuthorizeService.getAuthorizedGroups(...) should check dates (cherry-picked from ef626b4 on dspace-6_x) 2017-07-07 19:42:03 +00:00
Tim Donohue
243e5e3371 Merge pull request #1729 from SophiaGoldberg/master
[DS-3397] Fix error when getting bitstream policies in REST API
2017-07-07 02:46:37 +10:00
Tim Donohue
54d9c799e1 Merge pull request #1797 from atmire/DS-3563-master_Missing-index-metadatavalue-resource-type-id
DS-3563: Fix Oracle Flyway migration error
2017-07-06 01:34:09 +10:00
Tim Donohue
1d884a6798 Merge pull request #1793 from tomdesair/DS-3579-master_Context-mode-and-cache-management-CLI-commands
DS-3579: Context mode and cache management for CLI commands
2017-07-06 01:10:39 +10:00
Tom Desair
50358e9b8e DS-3563: Fix Oracle Flyway migration error 2017-07-05 14:03:46 +02:00
Tom Desair
8a35828e1e DS-3579: Attempt to speed up the IT tests to prevent failures caused by the average execution time exceeding the requirement. 2017-07-04 15:36:59 +02:00
Tom Desair
65ed7923c6 Update HibernateDBConnection to use Hibernate 5 API 2017-07-04 14:32:05 +02:00
Tom Desair
10e028918a Port DS-3579_Context-mode-and-cache-management-CLI-commands to master 2017-07-03 14:27:43 +02:00
Terry Brady
207a9e64be Merge pull request #1757 from rivaldi8/DS-3245-csv-linebreaks
DS-3245: CSV linebreaks not supported by Bulkedit
2017-06-28 15:46:42 -07:00
Mark H. Wood
08df09f778 Merge pull request #1708 from christian-scheible/DS-3568
DS-3568. UTF-8 characters are now supported in configuration files
2017-06-22 16:28:44 -04:00
Andrea Bollini
c087675025 Merge pull request #1775 from 4Science/browse
DS-3618 Expose the browse system over the new REST API
2017-06-21 12:45:57 +02:00
Andrea Bollini
05808b4d75 add missing license 2017-06-19 15:26:52 +02:00
Andrea Bollini
bedfd92cde DS-3618 fix items link in the BrowseEntryResource 2017-06-15 16:15:31 +02:00
Andrea Bollini
d04973efac move the spring-data-rest base path to root to avoid conflicts with our rest controller
the spring-data-rest base path is currently used only by the HAL browser. It now responds to /browser/** so as to avoid conflicts with our /api/{apiCategory}/{model}/** mapping
2017-06-15 15:40:38 +02:00
Tom Desair
f148fb3de5 DS-3572: Renamed epersonInGroup to isEPersonInGroup 2017-06-13 15:56:01 +02:00
Tom Desair
59087eed9b DS-3572: Restored behaviour of GroupService.isMember and moved new behaviour to GroupService.isParentOf 2017-06-13 15:55:59 +02:00
Tom Desair
aeb3af9b2f DS-3572: Fix bug where normal group membership is ignored if special groups are present + added tests 2017-06-13 15:55:48 +02:00
Tom Desair
ff8923b315 Improve tests + make GroupService.isMember method more performant for special groups 2017-06-13 15:55:46 +02:00
Tom Desair
20736c2821 Fix DSpace AIP IT tests: Set correct membership for admin 2017-06-13 15:55:27 +02:00
Tom Desair
17c4d7ba9d Attempt to fix constraint violation 2017-06-13 15:55:27 +02:00
Tom Desair
083eae3e9f Restore GroupServiceImpl.isMember logic + fix tests 2017-06-13 15:55:27 +02:00
Pascal-Nicolas Becker
4f03cbfe09 DS-3572: Adding simple unit test for DS-3572. 2017-06-13 15:55:26 +02:00
Pascal-Nicolas Becker
ff4b7f00fa DS-3572: Check authorization for a specified user instead of currentUser 2017-06-13 15:55:23 +02:00
Tim Donohue
56dcd2d915 Merge pull request #1769 from tdonohue/DS-3552-master
DS-3552: Port to "master" of read only context and hibernate improvements (#1694)
2017-06-12 09:44:53 -05:00
Tom Desair (Atmire)
1ccd6d1e13 Ds 3552 read only context and hibernate improvements (#1694)
* Refactor READ ONLY mode in Context and adjust hibernate settings accordingly

* Set Context in READ-ONLY mode when retrieving community lists

* Fix Hibernate EHCache configuration + fix some Hibernate warnings

* Cache authorized actions and group membership when Context is in READ-ONLY mode

* Set default Context mode

* Let ConfigurableBrowse use a READ-ONLY context

* Add 2nd level cache support for Site and EPerson DSpaceObjects

* Added 2nd level caching for Community and Collection

* Fix tests and license checks

* Cache collection and community queries

* Small refactorings + backwards compatibility

* Set Context to READ-ONLY for JSPUI submissions and 'select collection' step

* OAI improvements part 1

* OAI indexing improvements part 1

* OAI indexing improvements part 2

* DS-3552: Only uncache resource policies in AuthorizeService when in read-only

* DS-3552: Additional comment on caching handles

* DS-3552: Fix cache leakage in SolrServiceResourceRestrictionPlugin

* DS-3552: Clear the read-only cache when switching Context modes

* DS-3552: Correct Group 2nd level cache size

* DS-3552: Always clear the cache, except when going from READ_ONLY to READ_ONLY
2017-06-12 13:50:45 +00:00
Andrea Bollini
5e6fd80c9b DS-3618 Expose the browse system over the new REST API 2017-06-10 15:49:34 +02:00
Andrea Bollini
e41bacf892 DS-3618 introduced the LinkRestRepository concept 2017-06-10 15:38:34 +02:00
Andrea Bollini
55c20cf550 DS-3617 Pluralization of model not respected in the HAL _links and endpoints mapping 2017-06-10 15:35:08 +02:00
Tim Donohue
9f46a1b812 Merge pull request #1768 from tdonohue/DS-3406-master
Forward port of DS-3406 and DS-3599 (Collection/Community alpha sorting) to master
2017-06-09 13:38:21 -07:00
frederic
7f252b6289 DS-3406 unit tests for getCollection/getCommunity for different dspace objects 2017-06-09 19:49:57 +00:00
Tom Desair
4ae462041a Revert imports 2017-06-09 19:49:48 +00:00
Tom Desair
77e507d0b2 DS-3406: Remove unnecessary commit 2017-06-09 19:49:41 +00:00
Tom Desair
027a5a68f9 Fix integration tests. Remove Hibernate Sort annotations as a collection name can change and this breaks the Set semantics 2017-06-09 19:49:31 +00:00
Tom Desair
cc3342894b Fix bug so that comparator can be used for sets 2017-06-09 19:49:21 +00:00
Tom Desair
d39d8134d0 Fixing tests 2017-06-09 19:49:11 +00:00
Yana De Pauw
a588d42f5a DS-3406: Ordering sub communities and collections 2017-06-09 19:48:54 +00:00
Tom Desair
821678dae4 DS-3406: Resolve review feedback 2017-06-09 19:46:44 +00:00
Tom Desair
73e6724ac4 DS-3406: Sort communities and collections in-memory using a comparator 2017-06-09 19:46:34 +00:00
Alan Orth
27255735c4 DS-3517 Allow improved handling of CMYK PDFs
Allow ImageMagick to generate thumbnails with more accurate colors
for PDFs using the CMYK color system. This adds two options to the
dspace.cfg where the user can optionally specify paths to CMYK and
RGB color profiles if they are available on their system (they are
provided by Ghostscript 9.x).

Uses im4java's Info class to determine the color system being used
by the PDF.

See: http://im4java.sourceforge.net/docs/dev-guide.html
2017-06-08 00:11:52 +02:00
Tim Donohue
e16f6a80d5 Refactor BundleServiceImpl.setOrder() to be more failsafe. Update Tests to prove out (previously these new tests failed) 2017-06-06 23:19:53 +02:00
Tim Donohue
d3d8471756 Create a valid unit test for BundleServiceImpl.setOrder() method 2017-06-06 23:19:53 +02:00
kshepherd
95c983faef Merge pull request #1763 from Georgetown-University-Libraries/ds3563m
[DS-3563] Port PR to master
2017-06-06 12:36:57 +12:00
Tom Desair
02d6b57631 DS-3563: Conditional create index for Oracle 2017-06-02 13:26:03 -07:00
Tom Desair
8c734d99ba DS-3563 Added missing index on metadatavalue.resource_type_id 2017-06-02 13:24:13 -07:00
Alexander Sulfrian
f28c8ced67 DS-3281: Start workflow for REST submissions
If an item is submitted through the REST API (via POST on
/{collection_id}/items) the item should not be published immediately,
but should be approved via the defined workflow.
2017-05-31 20:53:52 +00:00
Terry Brady
5032daf297 Merge pull request #1758 from Georgetown-University-Libraries/ds3594m
[DS-3594] Unit Test Clean Up for PostGres
2017-05-31 11:11:59 -07:00
Terry Brady
b2e7c2039a Add comment for null check during sort 2017-05-31 09:04:55 -07:00
Terry Brady
9bd0a58342 Avoid NPE 2017-05-31 09:04:55 -07:00
Terry Brady
c29745d421 Make destroy more forgiving of test failures 2017-05-31 09:04:54 -07:00
Terry Brady
c5414e836e Avoid handle collision in persistent db 2017-05-31 09:04:11 -07:00
Terry Brady
c1844fe960 change parameter setting for db portability 2017-05-31 09:03:55 -07:00
Àlex Magaz Graça
d9175c3c38 DS-3245: CSV linebreaks not supported by Bulkedit
When a multiline field contained empty lines, the importer stopped
reading the file. This reverts a change in 53d387fed so that reading stops
when the end of the file has been reached instead.

Fixes https://jira.duraspace.org/browse/DS-3245
2017-05-31 15:52:23 +02:00
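The DS-3245 fix above concerns CSV records whose quoted fields span multiple lines, including empty ones. A quick illustration — using Python's csv module, not DSpace's bulkedit parser — of why "stop at a blank line" truncates such records while "read to end of file" does not:

```python
import csv
import io

# A quoted CSV field may legally contain line breaks, even empty lines;
# a parser that treats a blank line as end-of-input would truncate the
# second record below.
data = 'id,description\n1,"first line\n\nsecond line"\n'
rows = list(csv.reader(io.StringIO(data)))
```

Here `rows` holds the header plus one two-field record; the blank line survives inside the quoted `description` value instead of ending the parse.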
Tim Donohue
7ab631f419 Merge pull request #1754 from 4Science/master
DS-3593 fix missing category for the metadata registries that prevent…
2017-05-25 10:28:38 -07:00
Andrea Bollini
28363680d5 DS-3593 fix missing category for the metadata registries that prevent the new REST app to start 2017-05-25 19:12:12 +02:00
Andrea Bollini
9eaed69d5f Merge pull request #1750 from 4Science/DS-3593
DS-3593 allow aggregation of endpoints by category / Include UUID in serialization
2017-05-25 18:29:56 +02:00
Tim Donohue
bc17a391cd Merge pull request #1662 from minurmin/DS-3463
[DS-3463] Fix IP authentication for anonymous users
2017-05-18 13:11:45 -07:00
Andrea Bollini
4c3793cf50 DS-3593 allow aggregation of endpoints by category
include UUID in the object serialization
2017-05-18 19:12:10 +02:00
Andrea Bollini
27809b5757 Merge pull request #1749 from 4Science/DS-3588
DS-3588 configuration of CORS headers for the API
2017-05-18 17:43:19 +02:00
Andrea Bollini
e32e900316 DS-3588 configuration of CORS headers for the API 2017-05-18 16:28:05 +02:00
Tim Donohue
6c192c53f7 Merge pull request #1742 from Frederic-Atmire/DS-3558-dspace7
DS-3558 Port from dspace 6 to dspace 7
2017-05-15 12:28:53 -07:00
frederic
a5851ce0db DS-3558 Port from dspace 6 to dspace 7 2017-05-05 11:32:00 +02:00
Tim Donohue
9f2121007d Merge pull request #1736 from cjuergen/DS-3585-master
Fix for DS-3585
2017-04-26 11:09:35 -07:00
cjuergen
cd7231a9f4 Fix for DS-3585 2017-04-26 17:39:59 +02:00
Tim Donohue
afe11d7ee2 Merge pull request #1728 from Georgetown-University-Libraries/ds3575m
[DS-3575] master port: Rename misleading find method in ResourcePolicyService
2017-04-21 13:47:55 -07:00
Andrea Bollini
473311de5c Merge pull request #1719 from 4Science/DS-3577
DS-3577 Metadata Fields and Schemas READ-ONLY endpoints
2017-04-21 11:06:21 +02:00
Andrea Bollini
a5b182c924 Merge pull request #1722 from 4Science/DS-3578
DS-3578 Endpoints to retrieve EPerson and Groups
2017-04-21 11:05:44 +02:00
Sophia Goldberg
eb1d5cf0b5 [DS-3397] Add null checks to EPerson and Group 2017-04-20 21:11:27 +01:00
Pascal-Nicolas Becker
4b5eaf483b apply PR1714 to master 2017-04-20 12:19:59 -07:00
Mark H. Wood
2accc41fe7 Merge pull request #1725 from mwoodiupui/DS-3564-master
[DS-3564] Limit maximum idle database connections by default (port from dspace-6_x)

Copied from approved patch to dspace-6_x branch.
2017-04-20 14:53:55 -04:00
Andrea Bollini
5303785686 removed commented out code 2017-04-20 19:53:31 +02:00
Andrea Bollini
3cb40a1b7b DS-3578 prefer group name over eperson group 2017-04-20 19:46:21 +02:00
Mark H. Wood
80c2722c7e [DS-3564] Limit maximum idle database connections by default (port from dspace-6_x) 2017-04-20 13:03:36 -04:00
Tim Donohue
177ef1febf Merge pull request #1721 from Georgetown-University-Libraries/ds3516m
[DS-3516] master Port ImageMagick PDF Thumbnail class should only process PDFs
2017-04-20 06:57:18 -07:00
Andrea Bollini
0dcb02ee0c DS-3578 Endpoints to retrieve EPerson and Groups
implemented pagination in the eperson and group service
2017-04-20 01:12:59 +02:00
Alan Orth
dbb3f8d6d6 Port PR1709 to master 2017-04-19 14:50:36 -07:00
Andrea Bollini
44d821ae7e DS-3577 Metadata Fields and Schemas READ-ONLY endpoints 2017-04-19 23:07:41 +02:00
Tim Donohue
773e5a3653 Minor updates to README warning about DSpace 7
Correct links, add note welcoming contributors, etc.
2017-04-13 12:02:39 -05:00
Christian Scheible
e022835d75 DS-3568. UTF-8 characters are now supported in configuration files 2017-04-12 15:42:24 +02:00
Andrea Bollini
8af93d8375 Merge pull request #1680 from Georgetown-University-Libraries/rest7site
Add endpoints for the Site object
2017-04-02 15:07:42 +02:00
Terry W Brady
e4134ffa48 Apply review comments 2017-03-30 14:09:09 -07:00
Tim Donohue
54805a8318 Merge pull request #1622 from toniprieto/DS-2947-DIM-repeats-authority-and-confidence-fix
[DS-2947] DIM crosswalks repeats authority & confidence values in the metadata values
2017-03-24 11:15:40 -05:00
Tim Donohue
65139d5453 Update README.md
Add a warning about lack of UI on master
2017-03-23 17:04:26 -05:00
Tim Donohue
cf5bd04447 Merge pull request #1679 from DSpace/rest7
Merge the Rest7 branch back to the master
2017-03-23 11:38:27 -05:00
Andrea Bollini
aaa57dd4fa Merge pull request #1666 from Georgetown-University-Libraries/rest7ds3515
[DS-3515] REST 7 Pagination for items and bitstreams
2017-03-23 17:00:57 +01:00
Andrea Bollini
89cc7d7f01 Merge pull request #1678 from 4Science/DS-3513
DS-3513 Provide an entry endpoint that exposes the known endpoints via the HAL browser
2017-03-23 16:53:01 +01:00
Andrea Bollini
b66066ffb6 restore dspace-solr profile 2017-03-23 16:38:27 +01:00
Mark H. Wood
e51dfe1f6e [DS-1140] Add configuration data 2017-03-22 21:49:33 +00:00
Mark H. Wood
a8bf670261 [DS-1140] Add unit test. 2017-03-22 21:49:24 +00:00
Mark H. Wood
3e4af4ec4a [DS-1140] No need to treat old and new Word formats differently 2017-03-22 21:49:16 +00:00
Mark H. Wood
ec9cc6e038 [DS-1140] New POI-based MS Word extractor and some comment cleanup 2017-03-22 21:49:08 +00:00
Terry W Brady
d921a9c9f2 Add support for Site object 2017-03-20 14:48:20 -07:00
Andrea Bollini
62bd9ed80d DS-3513 proper setting of the initialized attribute 2017-03-17 12:50:40 +01:00
Andrea Bollini
a2dc17e7ae DS-3513 initial implementation of the Root Rest Controller 2017-03-16 23:38:38 +01:00
Andrea Bollini
b6a9b0b51d Merge pull request #1677 from 4Science/rest7-updated
Merge current master in the rest7
2017-03-16 23:10:56 +01:00
Andrea Bollini
b20ce90b60 Merge branch 'master' of https://github.com/DSpace/DSpace into rest7
Conflicts:
	dspace/modules/pom.xml
	dspace/modules/spring-rest/pom.xml
	pom.xml
2017-03-16 21:52:54 +01:00
Tim Donohue
4ab43d6097 Merge pull request #997 from rradillen/DS-2299
DS-2299 useProxies and X-Forwarded-FOR
2017-03-15 14:18:19 -05:00
Terry W Brady
172c0228c1 dao pagination support for item and bitstream 2017-03-13 11:53:24 -07:00
Tim Donohue
5a2fdabaaf Merge pull request #1675 from mwoodiupui/rest7
[DS-3482] Fix test failure and update dependencies
2017-03-10 10:40:04 -06:00
Mark H. Wood
22987c10b2 [DS-3482] Force all Ehcache users that I can find to use a shared CacheManager. 2017-03-10 10:38:20 -05:00
Mark H. Wood
697433085f [DS-3482] Accept any type of DBConnection.
DSpace.getSingletonService(Class) looks for a service of type Class
named Class.getName().  I don't see how this ever worked, since
DBConnection is an interface, and an implementation will have some
other name.  Instead request any service which is instanceof
DBConnection.  There should be only one.

This patch uncovers a slew of new NPEs in dspace-api tests, which are
happening in dspace-services.  We have multiple Ehcache CacheManager
instances for some reason, and newer versions of Ehcache throw a fit
when you try to create multiple managers with the same configuration.
2017-03-10 10:31:22 -05:00
Mark H. Wood
0248b1471d [DS-3423] Update a few dependencies and fix license headers 2017-03-09 15:30:00 -05:00
Terry Brady
424f01ed65 [DS-3348] Drop date check in EmbargoService (#1542)
* Drop date check in EmbargoService

* Revise comment per review
2017-03-08 12:28:25 -06:00
Terry Brady
e7fbbc44c8 Merge pull request #1543 from terrywbrady/ds3334
[DS-3334] Allow reverse-chronological sort of submitters
2017-03-08 09:03:49 -08:00
Andrea Bollini
ce801dfac9 Merge pull request #1661 from Georgetown-University-Libraries/rest7ds3488
[DS-3488] REST7: Create Item Converter
2017-03-05 10:05:27 +01:00
Andrea Bollini
52f284ea5a add awareness of the pagination 2017-03-02 14:16:29 -08:00
Andrea Bollini
ea41259274 exclude linked resources in the default serialization 2017-03-02 14:16:29 -08:00
Terry W Brady
7c081176c9 Add item, collection, community to rest7 2017-03-02 14:15:52 -08:00
Terry Brady
d3e62e4496 Update README.md 2017-03-02 10:13:34 -08:00
Miika Nurminen
314dddafc7 [DS-3463] Fix IP authentication for anonymous users
Added group membership check based on context even if no eperson is found. Affects file downloads in (at least) xmlui.
2017-03-01 19:43:48 +02:00
Terry W Brady
57be1ee8bf Simple item properties 2017-02-28 15:52:19 -08:00
Tim Donohue
5a26f17a3d Workaround for travis-ci/travis-ci#4629 2017-02-27 21:37:52 +00:00
Mark H. Wood
96db228d7b [DS-3378] Patch to restore lost indices, from Adan Roman 2017-02-23 16:36:11 -05:00
Mark H. Wood
a0f4cb734a Merge pull request #1596 from tomdesair/DS-3367_Configurable-Workflow-authorization-denied-error
DS-3367: Fix authorization error on claim by non-admin user
2017-02-23 16:02:56 -05:00
Tim Donohue
280dce2470 Merge pull request #1646 from 4Science/DS-3482
Add the spring-rest overlay to allow building of the new rest7
2017-02-23 10:34:40 -06:00
Terry Brady
1a1b765283 Merge pull request #1595 from tomdesair/DS-2952_SOLR-full-text-indexing-multiple-bitstreams
DS-2952 SOLR full text indexing multiple bitstreams
2017-02-22 08:10:40 -08:00
Tim Donohue
f7ce229c84 Merge pull request #1577 from helix84/DS-2302-license-headers-cleanup
DS-2302 license headers cleanup
2017-02-21 15:12:18 -06:00
Tim Donohue
6a70c9a3c1 Merge pull request #1648 from tdonohue/DS-3422
DS-3422 Remove XMLUI and JSPUI from the official distribution
2017-02-20 16:29:37 -06:00
Andrea Bollini
d6c25c57d9 DS-3422 Remove XMLUI and JSPUI from the official distribution 2017-02-20 21:39:09 +00:00
Bram Luyten
27cd68271e DS-2840 switch logging to debug
Changes INFO level sidebar facet transformer log entries to DEBUG
2017-02-18 14:18:54 +01:00
Andrea Bollini
dd7406b8ce Add the spring-rest overlay to allow building of the new rest7
temporary turn off mvn-enforcer dependency convergence
temporary turn off dspace-solr (to migrate to an external SOLR6)
2017-02-17 18:44:07 +01:00
Andrea Bollini
8786987fb6 Add support for resources that use Integer ID instead than UUID
The rest resource controller now have separate mapping for resources with Integer identifier and UUID identifier. Each mapping delegate to the same generic method handler.
Add information about the requested rels to the DSpaceResource wrapper so to allow future inclusion of lazy loaded reference
Better handling of 404 and catch of some NPE
2017-02-16 15:23:26 +01:00
Andrea Bollini
e911a11c42 fix the name of the new rest webapp 2017-02-16 15:18:22 +01:00
Tim Donohue
c8084a2594 Merge pull request #1553 from 4Science/DS-3356-v6
DS-3356 add turnoff authz system
2017-02-15 16:08:56 -06:00
Andrea Bollini
d6c5aafbff Add initial informations about the new REST webapp 2017-02-12 19:23:54 +01:00
Andrea Bollini
44d34b397f Add license header and initial javadoc to describe the classes 2017-02-12 17:32:15 +01:00
Andrea Bollini
b9e885ab98 remove jdk7 env as DSpace7 will use JAVA8 language features 2017-02-12 16:34:36 +01:00
Andrea Bollini
3a21c4b315 Utils is now a singleton. DSpaceResource automatically adds links to embedded resources 2017-02-12 16:29:03 +01:00
Andrea Bollini
b7ce0389db Allow DSpaceResource to automatically discover links to other resources 2017-02-12 16:24:24 +01:00
Andrea Bollini
e151d86421 add support to serve multiple core entities from the RestResourceController
managed 404 for unavailable endpoints
add self link to single resources in a collection
2017-02-12 16:24:23 +01:00
Andrea Bollini
e992e00080 Initial skeleton for Spring MVC + HATEOAS 2017-02-12 16:24:02 +01:00
Andrea Bollini
e5f9aa67b9 update spring and hibernate dependencies 2017-02-12 16:22:55 +01:00
Andrea Bollini
c9807df70e Merge branch 'master' of https://github.com/DSpace/DSpace into DS-3422
# Conflicts:
#	dspace-xmlui/src/main/java/org/dspace/app/xmlui/aspect/submission/submit/AccessStepUtil.java
#	dspace-xmlui/src/main/java/org/dspace/app/xmlui/objectmanager/ItemAdapter.java
2017-02-12 16:17:07 +01:00
Tim Donohue
be862676ac Merge pull request #1644 from Georgetown-University-Libraries/ds3436m
DS-3436 Sharding SOLR cores corrupts multivalued fields for master
2017-02-09 08:48:03 -06:00
Tom Desair
17fea3b22a Port PR 1613 2017-02-08 14:03:48 -08:00
Tim Donohue
505565f802 Merge pull request #1630 from idmgroup/log-beansexception-master
Add exception message in log when a bean cannot be loaded
2017-02-08 10:49:31 -06:00
Roeland Dillen
61b9a5ef7f add guard code in case no dot is present in bitsream name 2017-02-08 10:41:49 -06:00
Tim Donohue
e91b1e964c Merge pull request #1642 from cjuergen/DS-3479
DS-3479 prevent the import of empty metadata
2017-02-08 10:13:05 -06:00
Mark H. Wood
dd4ca00071 [DS-3469] virus scan during submission attempts to read uploaded bitstream as anonymous user, which fails (#1632)
* [DS-3469] Add the current session context to the curation task run.

* [DS-3469] Log how I/O failed, not just that it did.

* [DS-3469] Keep reference to Bundle from which we just removed the Bitstream instead of expecting the List of Bundle to be unaltered.

* [DS-3469] Finish switching from e.getMessage() to e

* [DS-3469] Note the side effect of calling curate() with a Context.
2017-02-08 10:00:59 -06:00
Terry Brady
90cb82922f [DS-3456 ] Fix Command Line Parameters for statistics import/export tools (master) (#1635)
* Migrate 6x patch to master

* whitespace normalize

* more whitespace
2017-02-08 09:44:10 -06:00
Tim Donohue
eb35470128 Merge pull request #1634 from Georgetown-University-Libraries/ds3457m
[DS-3457] Address tomcat hang when multiple solr shards exist (master)
2017-02-08 09:32:10 -06:00
Terry W Brady
268660b5d9 whitespace 2017-02-07 09:17:54 -08:00
Mark H. Wood
d48e4ed432 Merge pull request #1580 from arvoConsultores/DS-3410
Deleting columns drop indexes
2017-02-06 16:11:01 -05:00
Tim Donohue
ecb19a737c Merge pull request #1631 from idmgroup/junitignore-abstract-tests-master
annotate unit tests so they can be run directly from a Junit runner
2017-02-06 08:00:54 -06:00
Mark H. Wood
0553aa5aa8 Merge pull request #1579 from arvoConsultores/DS-3409
DS-3409 Handle of collections and communities are lost
2017-02-06 08:25:51 -05:00
cjuergen
c323310c27 DS-3479 prevent the import of empty metadata 2017-02-06 13:40:38 +01:00
Hardy Pottinger
7d7a8108a1 Merge pull request #1636 from hardyoyo/DS-3475-add-assetstore.dir-to-dspace.cfg
[DS-3475] added back assetstore.dir configuration to dspace.cfg
2017-02-01 15:05:18 -06:00
Tom Desair
41deb20043 DS-3446: Remove policies only after the bitstream has been updated (otherwise the current user has not WRITE rights) 2017-02-02 10:02:32 +13:00
Hardy Pottinger
4b9095ae80 [DS-3475] adding more guidance to example local.cfg as per suggestion of Tim Donohue 2017-02-01 20:19:35 +00:00
Hardy Pottinger
425caf5656 [DS-3475] added back assetstore.dir configuration to dspace.cfg 2017-02-01 19:40:53 +00:00
Terry W Brady
9e84117557 apply 6x commits 2017-02-01 08:59:56 -08:00
Terry W Brady
1af506d35f apply 6x commits 2017-02-01 08:57:26 -08:00
Bram Luyten
190cbd5d76 Adding stackoverflow support option 2017-02-01 10:51:39 +01:00
Colin Delacroix
a17bf59b9d annotate unit tests so they can be run directly from eclipse junit4 runner 2017-01-30 11:27:39 +01:00
Colin Delacroix
c7b22165aa Add exception message in log when a bean cannot be loaded 2017-01-30 10:42:55 +01:00
Terry Brady
305cd55f25 [DS-3468] Ignore bin directory created by eclipse (#1626)
* Exclude top level /bin directory built by Eclipse
2017-01-26 16:30:17 +01:00
Toni Prieto
6b23729b17 [DS-2947] DIM crosswalks repeats authority & confidence values in the metadata values 2017-01-19 22:34:43 +01:00
Tim Donohue
717fda5d0b Merge pull request #1620 from hardyoyo/DS-3445
DS-3445 Only add "ResultCode" if not default
2017-01-19 11:55:26 -06:00
Jonas Van Goolen
c2c98e886e DS-3445 Only add "ResultCode" if not default 2017-01-19 17:36:53 +00:00
helix84
ba23d25b22 Merge pull request #1618 from AndrewBennet/master
[DS-3460] Fix incorrect REST documentation
2017-01-17 21:26:38 +01:00
Andrew Bennet
cf2d2a4d37 [DS-3460] Fix incorrect REST documentation 2017-01-17 17:11:31 +00:00
Bram Luyten
2b764be9ca Merge pull request #1602 from cjuergen/DS-3440
Fix for DS-3440
2017-01-06 18:22:54 +01:00
cjuergen
d95902b680 Fix for DS-3440 replacing sword-server config reference with
swordv2-server in swordv2
2017-01-05 13:03:53 +01:00
Tim Donohue
740486cf1c Merge pull request #1599 from samuelcambien/DS-3435
DS-3435 possible nullpointerexception at AccessStepUtil$populateEmbar…
2017-01-04 14:38:20 -06:00
samuel
9cd511def4 DS-3435 possible nullpointerexception at AccessStepUtil$populateEmbargoDetail 2017-01-03 12:38:56 +01:00
Andrea Bollini
0421fb4d75 DS-3422 Remove XMLUI and JSPUI from the official distribution 2017-01-03 09:26:45 +01:00
Tom Desair
92b0a5b5a0 DS-3367: Fix authorization error when non-admin users claim a configurable workflow task 2017-01-02 14:02:36 +01:00
Tom Desair
f5e07ba956 DS-2952: Added missing license 2016-12-30 13:35:01 +01:00
Tom Desair
c50c3006dd DS-2952: Only prepend new line if we have an actual input stream 2016-12-30 00:43:53 +01:00
Tom Desair
df1f81bf9d DS-2952: Small improvements to FullTextContentStreams and added a unit test for it 2016-12-30 00:16:25 +01:00
Tom Desair
3b2d8f3669 DS-2952: Use a SequenceInputStream to add the content of multiple full text bitstreams to SOLR 2016-12-28 23:47:12 +01:00
Terry Brady
3ed55464e9 fix typo in comment 2016-12-14 08:28:52 -08:00
Terry Brady
e5211103ec First attempt to resort submitters 2016-12-14 08:28:14 -08:00
Mark H. Wood
1137d4562c Merge pull request #1575 from mwoodiupui/DS-2707
[DS-2707] Poor messaging when batch upload directory cannot be created
2016-12-14 10:16:35 -05:00
aroman
0f49cf29fb Extracted to file 2016-12-01 10:16:47 +01:00
aroman
f22cb7da91 Revert "Handle of collections and communities are lost"
This reverts commit 7defb4b6c9.
2016-12-01 10:12:36 +01:00
aroman
59fd980254 Extracted to external sql 2016-12-01 10:08:36 +01:00
aroman
5ee1f786ab Revert "Deleting collumns drop indexes"
This reverts commit f0e8afef54.
2016-12-01 10:08:09 +01:00
aroman
f0e8afef54 Deleting collumns drop indexes 2016-12-01 09:30:23 +01:00
aroman
7defb4b6c9 Handle of collections and communities are lost 2016-12-01 09:11:47 +01:00
Peter Dietz
e0bd496e64 DS-2302 Remove license information from config files
Also removed excess text
2016-11-24 15:35:31 +01:00
Peter Dietz
2eb63a54e2 Remove boiler license from xmlui.xconf, also trim the inline text 2016-11-24 15:29:55 +01:00
Mark H. Wood
1d3a20cfd4 [DS-2707] Throw an exception if a working directory cannot be created, report it in UI.
Add lots of debug logging, including duplication of some console
output that of course won't appear when running in a webapp.

Also realigned some nearby code that was hard to read.
2016-11-22 11:49:57 -05:00
helix84
462ed4437c Merge pull request #1559 from helix84/DS-3363-csv-import-error-messages
DS-3363 CSV import error says "row", means "column"
2016-11-14 18:24:38 +01:00
helix84
a60a1f61e8 Merge branch 'master' into DS-3363-csv-import-error-messages 2016-11-14 18:04:18 +01:00
helix84
030242a7ff DS-3386 Travis build on both Java 7 and 8 (#1571) 2016-11-14 10:57:50 -06:00
Tim Donohue
f2e7cbf8bc Merge pull request #1570 from helix84/DS-3384-travis-oraclejdk8
DS-3384 tell Travis CI to build using Oracle JDK 8
2016-11-14 10:29:42 -06:00
Ivan Masár
8383ad0228 DS-3384 tell Travis CI to build using Oracle JDK 8 2016-11-14 16:49:38 +01:00
helix84
288be9bbc6 Merge pull request #1569 from helix84/reenable-doclint
DS-3384 re-enable DocLint
2016-11-13 19:47:39 +01:00
Ivan Masár
f7c0577a7c revert doclint-java8-disable 2016-11-13 19:18:49 +01:00
Ivan Masár
cae5570643 whitespace per Coding Conventions 2016-11-13 19:18:23 +01:00
Ivan Masár
0b0fab6404 javadoc fixme, whitespace fixes 2016-11-13 19:18:13 +01:00
helix84
dfbaa9074e Merge pull request #1567 from mwoodiupui/DS-3308
Clean up remaining doclint errors and warnings.
2016-11-13 19:17:31 +01:00
Mark H. Wood
d1f321f345 Clean up remaining doclint errors and warnings.
Throughout dspace-sword I was baffled by the validation aspect, so
you'll see lots of "UNKNOWN.  PLEASE DOCUMENT." that I wrote just to
shut doclint up.  If you know how this works, please amend my
placeholders.
2016-11-11 11:48:17 -05:00
Ivan Masár
269af71afb javadoc (doclint) and whitespace fixes
includes whitespace fixes as per Coding Conventions
2016-11-10 13:34:37 +01:00
Ivan Masár
db1641cfa9 fix JavaDoc @author 2016-11-10 13:34:37 +01:00
Ivan Masár
c92b1d8009 typo: xforwarderfor -> xforwardedfor 2016-11-01 16:19:41 +01:00
Mark H. Wood
5ff4ccfedc After 6.0 release: move master to 7.0-SNAPSHOT 2016-10-28 10:15:54 -04:00
Pascal-Nicolas Becker
f53f3e45cc Revert "prepare for next development iteration"
This reverts commit 543d2098ca.
2016-10-24 17:30:24 +02:00
Pascal-Nicolas Becker
1cb94c2799 Revert "Also moving mirage2 module to 7.0-SNAPSHOT"
This reverts commit 5c4e7ac327.
2016-10-24 17:30:20 +02:00
Pascal-Nicolas Becker
75260e6d17 Revert "Revert "Also moving mirage2 module to 7.0-SNAPSHOT""
This reverts commit 661db4c484.
2016-10-24 17:11:43 +02:00
Pascal-Nicolas Becker
661db4c484 Revert "Also moving mirage2 module to 7.0-SNAPSHOT"
This reverts commit 5c4e7ac327.
2016-10-24 17:11:34 +02:00
Pascal-Nicolas Becker
5c4e7ac327 Also moving mirage2 module to 7.0-SNAPSHOT 2016-10-24 16:51:24 +02:00
Pascal-Nicolas Becker
543d2098ca prepare for next development iteration 2016-10-24 16:45:36 +02:00
Pascal-Nicolas Becker
c17f85e4b3 [maven-release-plugin] prepare for next development iteration 2016-10-24 14:40:29 +02:00
Ivan Masár
244ec0c214 DS-3363 CSV import error says "row", means "column" 2016-10-24 09:48:24 +02:00
Luigi Andrea Pascarelli
127f79f2ef DS-3356 add turnoff authz system 2016-10-07 13:24:35 +02:00
Roeland Dillen
39a1e6fa19 default value in dspace.cfg but not commented out 2016-05-19 11:41:58 +02:00
Roeland Dillen
bde4acf1e4 log message if useProxies is disabled and X-Forwarded-For is set 2016-05-19 11:37:34 +02:00
3470 changed files with 127671 additions and 352154 deletions

.gitignore (6 lines changed)

@@ -6,8 +6,10 @@ tags
 ## Ignore project files created by Eclipse
 .settings/
+/bin/
 .project
 .classpath
+.checkstyle
 ## Ignore project files created by IntelliJ IDEA
 *.iml
@@ -24,7 +26,6 @@ dist/
 nbdist/
 nbactions.xml
 nb-configuration.xml
-META-INF/
 ## Ignore all *.properties file in root folder, EXCEPT build.properties (the default)
 ## KEPT FOR BACKWARDS COMPATIBILITY WITH 5.x (build.properties is now replaced with local.cfg)
@@ -38,3 +39,6 @@ META-INF/
 ##Mac noise
 .DS_Store
+##Ignore JRebel project configuration
+rebel.xml

.travis.yml

@@ -5,27 +5,23 @@ env:
   # Give Maven 1GB of memory to work with
   - MAVEN_OPTS=-Xmx1024M
+jdk:
+  # DS-3384 Oracle JDK 8 has DocLint enabled by default.
+  # Let's use this to catch any newly introduced DocLint issues.
+  - oraclejdk8
+## Should we run into any problems with oraclejdk8 on Travis, we may try the following workaround.
+## https://docs.travis-ci.com/user/languages/java#Testing-Against-Multiple-JDKs
+## https://github.com/travis-ci/travis-ci/issues/3259#issuecomment-130860338
+#addons:
+#  apt:
+#    packages:
+#    - oracle-java8-installer
 # Install prerequisites for building Mirage2 more rapidly
 before_install:
-  # Install Node.js 6.5.0 & print version info
-  - nvm install 6.5.0
-  - node --version
-  # Install npm 3.10.8 & print version info
-  - npm install -g npm@3.10.8
-  - npm --version
-  # Install Bower
-  - npm install -g bower
-  # Install Grunt & print version info
-  - npm install -g grunt && npm install -g grunt-cli
-  - grunt --version
-  # Print ruby version info (should be installed)
-  - ruby -v
-  # Install Sass & print version info
-  - gem install sass
-  - sass -v
-  # Install Compass & print version info
-  - gem install compass
-  - compass version
+  # Remove outdated settings.xml from Travis builds. Workaround for https://github.com/travis-ci/travis-ci/issues/4629
+  - rm ~/.m2/settings.xml
 # Skip install stage, as we'll do it below
 install: "echo 'Skipping install stage, dependencies will be downloaded during build and test stages.'"
@@ -43,8 +39,6 @@ script:
   # -V => Display Maven version info before build
   # -Dsurefire.rerunFailingTestsCount=2 => try again for flakey tests, and keep track of/report on number of retries
   - "mvn clean install license:check -Dmaven.test.skip=false -DskipITs=false -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2"
-  # 2. [Assemble DSpace] Ensure assembly process works (from [src]/dspace/), including Mirage 2
-  # -Dmirage2.on=true => Build Mirage2
-  # -Dmirage2.deps.included=false => Don't include Mirage2 build dependencies (We installed them in before_install)
+  # 2. [Assemble DSpace] Ensure overlay & assembly process works (from [src]/dspace/)
   # -P !assembly => SKIP the actual building of [src]/dspace/dspace-installer (as it can be memory intensive)
-  - "cd dspace && mvn package -Dmirage2.on=true -Dmirage2.deps.included=false -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2"
+  - "cd dspace && mvn package -P !assembly -B -V -Dsurefire.rerunFailingTestsCount=2"

LICENSES_THIRD_PARTY

@@ -1,19 +1,19 @@
 DSpace uses third-party libraries which may be distributed under different
 licenses. We have listed all of these third party libraries and their licenses
 below. This file can be regenerated at any time by simply running:
     mvn clean verify -Dthird.party.licenses=true
 You must agree to the terms of these licenses, in addition to the DSpace
 source code license, in order to use this software.
 ---------------------------------------------------
 Third party Java libraries listed by License type.
 PLEASE NOTE: Some dependencies may be listed under multiple licenses if they
 are dual-licensed. This is especially true of anything listed as
 "GNU General Public Library" below, as DSpace actually does NOT allow for any
 dependencies that are solely released under GPL terms. For more info see:
 https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
 ---------------------------------------------------
@@ -199,8 +199,6 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
 * Woodstox (org.codehaus.woodstox:woodstox-core-asl:4.1.4 - http://woodstox.codehaus.org)
 * Woodstox (org.codehaus.woodstox:wstx-asl:3.2.0 - http://woodstox.codehaus.org)
 * Woodstox (org.codehaus.woodstox:wstx-asl:3.2.7 - http://woodstox.codehaus.org)
-* databene ContiPerf (org.databene:contiperf:2.3.4 - http://databene.org/contiperf)
-* elasticsearch (org.elasticsearch:elasticsearch:1.4.0 - http://nexus.sonatype.org/oss-repository-hosting.html/elasticsearch)
 * flyway-core (org.flywaydb:flyway-core:4.0.3 - https://flywaydb.org/flyway-core)
 * Ogg and Vorbis for Java, Core (org.gagravarr:vorbis-java-core:0.1 - https://github.com/Gagravarr/VorbisJava)
 * Apache Tika plugin for Ogg, Vorbis and FLAC (org.gagravarr:vorbis-java-tika:0.1 - https://github.com/Gagravarr/VorbisJava)
@@ -300,7 +298,7 @@ https://wiki.duraspace.org/display/DSPACE/Code+Contribution+Guidelines
 * Repackaged Cocoon Servlet Service Implementation (org.dspace.dependencies.cocoon:dspace-cocoon-servlet-service-impl:1.0.3 - http://projects.dspace.org/dspace-pom/dspace-cocoon-servlet-service-impl)
 * DSpace Kernel :: Additions and Local Customizations (org.dspace.modules:additions:6.0-rc4-SNAPSHOT - https://github.com/dspace/DSpace/modules/additions)
 * Hamcrest All (org.hamcrest:hamcrest-all:1.3 - https://github.com/hamcrest/JavaHamcrest/hamcrest-all)
-* Hamcrest Core (org.hamcrest:hamcrest-core:1.3 - https://github.com/hamcrest/JavaHamcrest/hamcrest-core)
+* Hamcrest Core (org.hamcrest:hamcrest-all:1.3 - https://github.com/hamcrest/JavaHamcrest/hamcrest-all)
 * JBibTeX (org.jbibtex:jbibtex:1.0.10 - http://www.jbibtex.org)
 * ASM Core (org.ow2.asm:asm:4.1 - http://asm.objectweb.org/asm/)
 * ASM Analysis (org.ow2.asm:asm-analysis:4.1 - http://asm.objectweb.org/asm-analysis/)

README.md

@@ -1,6 +1,9 @@
 # DSpace
+## NOTE: The rest-tutorial branch has been created to support the [DSpace 7 REST documentation](https://dspace-labs.github.io/DSpace7RestTutorial/walkthrough/intro)
+- This branch provides stable, referencable line numbers in code
 [![Build Status](https://travis-ci.org/DSpace/DSpace.png?branch=master)](https://travis-ci.org/DSpace/DSpace)
 [DSpace Documentation](https://wiki.duraspace.org/display/DSDOC/) |
@@ -9,9 +12,20 @@
 [Support](https://wiki.duraspace.org/display/DSPACE/Support)
 DSpace open source software is a turnkey repository application used by more than
-1000+ organizations and institutions worldwide to provide durable access to digital resources.
+2,000 organizations and institutions worldwide to provide durable access to digital resources.
 For more information, visit http://www.dspace.org/
+***
+:warning: **Work on DSpace 7 has begun on our `master` branch.** This means that there is temporarily NO user interface on this `master` branch. DSpace 7 will feature a new, unified [Angular](https://angular.io/) user interface, along with an enhanced, rebuilt REST API. The latest status of this work can be found on the [DSpace 7 UI Working Group](https://wiki.duraspace.org/display/DSPACE/DSpace+7+UI+Working+Group) page. Additionally, the codebases can be found in the following places:
+* DSpace 7 REST API work is occurring on the [`master` branch](https://github.com/DSpace/DSpace/tree/master/dspace-spring-rest) of this repository.
+* The REST Contract is being documented at https://github.com/DSpace/Rest7Contract
+* DSpace 7 Angular UI work is occurring at https://github.com/DSpace/dspace-angular
+**If you would like to get involved in our DSpace 7 development effort, we welcome new contributors.** Just join one of our meetings or get in touch via Slack. See the [DSpace 7 UI Working Group](https://wiki.duraspace.org/display/DSPACE/DSpace+7+UI+Working+Group) wiki page for more info.
+**If you are looking for the ongoing maintenance work for DSpace 6 (or prior releases)**, you can find that work on the corresponding maintenance branch (e.g. [`dspace-6_x`](https://github.com/DSpace/DSpace/tree/dspace-6_x)) in this repository.
+***
 ## Downloads
 The latest release of DSpace can be downloaded from the [DSpace website](http://www.dspace.org/latest-release/) or from [GitHub](https://github.com/DSpace/DSpace/releases).
@@ -53,6 +67,8 @@ We welcome everyone to participate in these lists:
 * [dspace-tech@googlegroups.com](https://groups.google.com/d/forum/dspace-tech) : Technical support mailing list. See also our guide for [How to troubleshoot an error](https://wiki.duraspace.org/display/DSPACE/Troubleshoot+an+error).
 * [dspace-devel@googlegroups.com](https://groups.google.com/d/forum/dspace-devel) : Developers / Development mailing list
+Great Q&A is also available under the [DSpace tag on Stackoverflow](http://stackoverflow.com/questions/tagged/dspace)
 Additional support options are listed at https://wiki.duraspace.org/display/DSPACE/Support
 DSpace also has an active service provider network. If you'd rather hire a service provider to

checkstyle-suppressions.xml (new file)

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.2//EN"
"http://checkstyle.sourceforge.net/dtds/suppressions_1_2.dtd">
<suppressions>
<!-- Temporarily suppress indentation checks for all Tests -->
<!-- TODO: We should have these turned on. But, currently there's a known bug with indentation checks
on JMockIt Expectations blocks and similar. See https://github.com/checkstyle/checkstyle/issues/3739 -->
<suppress checks="Indentation" files="src[/\\]test[/\\]java"/>
</suppressions>

checkstyle.xml (new file, 144 lines)

@@ -0,0 +1,144 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE module PUBLIC
"-//Puppy Crawl//DTD Check Configuration 1.3//EN"
"http://checkstyle.sourceforge.net/dtds/configuration_1_3.dtd">
<!--
DSpace CodeStyle Requirements
1. 4-space indents for Java, and 2-space indents for XML. NO TABS ALLOWED.
2. K&R style braces required. Braces required on all blocks.
3. Do not use wildcard imports (e.g. import java.util.*). Duplicated or unused imports also not allowed.
4. Javadocs should exist for all public classes and methods. (Methods rule is unenforced at this time.) Keep it short and to the point
5. Maximum line length is 120 characters (except for long URLs, packages or imports)
6. No trailing spaces allowed (except in comments)
7. Tokens should be surrounded by whitespace (see http://checkstyle.sourceforge.net/config_whitespace.html#WhitespaceAround)
8. Each source file must include our license header (validated separately by license-maven-plugin, see pom.xml)
For more information on CheckStyle configurations below, see: http://checkstyle.sourceforge.net/checks.html
-->
<module name="Checker">
<!-- Configure checker to use UTF-8 encoding -->
<property name="charset" value="UTF-8"/>
<!-- Configure checker to run on files with these extensions -->
<property name="fileExtensions" value="java, properties, cfg, xml"/>
<!-- Suppression configurations in checkstyle-suppressions.xml in same directory -->
<module name="SuppressionFilter">
<property name="file" value="${checkstyle.suppressions.file}" default="checkstyle-suppressions.xml"/>
</module>
<!-- No tab characters ('\t') allowed in the source code -->
<module name="FileTabCharacter">
<property name="eachLine" value="true"/>
<property name="fileExtensions" value="java, properties, cfg, css, js, xml"/>
</module>
<!-- No Trailing Whitespace, except on lines that only have an asterisk (e.g. Javadoc comments) -->
<module name="RegexpSingleline">
<property name="format" value="(?&lt;!\*)\s+$|\*\s\s+$"/>
<property name="message" value="Line has trailing whitespace"/>
<property name="fileExtensions" value="java, properties, cfg, css, js, xml"/>
</module>
<!-- Allow individual lines of code to be excluded from these rules, if they are annotated
with @SuppressWarnings. See also SuppressWarningsHolder below -->
<module name="SuppressWarningsFilter" />
<!-- Check individual Java source files for specific rules -->
<module name="TreeWalker">
<!-- Maximum line length is 120 characters -->
<module name="LineLength">
<property name="max" value="120"/>
<!-- Only exceptions for packages, imports, URLs, and JavaDoc {@link} tags -->
<property name="ignorePattern" value="^package.*|^import.*|http://|https://|@link"/>
</module>
<!-- Highlight any TODO or FIXME comments in info messages -->
<module name="TodoComment">
<property name="severity" value="info"/>
<property name="format" value="(TODO)|(FIXME)"/>
</module>
<!-- Do not report errors on any lines annotated with @SuppressWarnings -->
<module name="SuppressWarningsHolder"/>
<!-- ##### Import statement requirements ##### -->
<!-- Star imports (e.g. import java.util.*) are NOT ALLOWED -->
<module name="AvoidStarImport"/>
<!-- Redundant import statements are NOT ALLOWED -->
<module name="RedundantImport"/>
<!-- Unused import statements are NOT ALLOWED -->
<module name="UnusedImports"/>
<!-- Ensure imports appear alphabetically and grouped -->
<module name="CustomImportOrder">
<property name="sortImportsInGroupAlphabetically" value="true"/>
<property name="separateLineBetweenGroups" value="true"/>
<property name="customImportOrderRules" value="STATIC###STANDARD_JAVA_PACKAGE###THIRD_PARTY_PACKAGE"/>
</module>
<!-- ##### Javadocs requirements ##### -->
<!-- Requirements for Javadocs for classes/interfaces -->
<module name="JavadocType">
<!-- All public classes/interfaces MUST HAVE Javadocs -->
<property name="scope" value="public"/>
<!-- Add an exception for anonymous inner classes -->
<property name="excludeScope" value="anoninner"/>
<!-- Ignore errors related to unknown tags -->
<property name="allowUnknownTags" value="true"/>
<!-- Allow params tags to be optional -->
<property name="allowMissingParamTags" value="false"/>
</module>
<!-- Requirements for Javadocs for methods -->
<module name="JavadocMethod">
<!-- All public methods MUST HAVE Javadocs -->
<!-- <property name="scope" value="public"/> -->
<!-- TODO: Above rule has been disabled because of large amount of missing public method Javadocs -->
<property name="scope" value="nothing"/>
<!-- Allow RuntimeExceptions to be undeclared -->
<property name="allowUndeclaredRTE" value="true"/>
<!-- Allow params, throws and return tags to be optional -->
<property name="allowMissingParamTags" value="true"/>
<property name="allowMissingThrowsTags" value="true"/>
<property name="allowMissingReturnTag" value="true"/>
</module>
<!-- ##### Requirements for K&R Style braces ##### -->
<!-- Code blocks MUST HAVE braces, even single line statements (if, while, etc) -->
<module name="NeedBraces"/>
<!-- Left braces should be at the end of current line (default value)-->
<module name="LeftCurly"/>
<!-- Right braces should be on start of a new line (default value) -->
<module name="RightCurly"/>
<!-- ##### Indentation / Whitespace requirements ##### -->
<!-- Require 4-space indentation (default value) -->
<module name="Indentation"/>
<!-- Whitespace should exist around all major tokens -->
<module name="WhitespaceAround">
<!-- However, make an exception for empty constructors, methods, types, etc. -->
<property name="allowEmptyConstructors" value="true"/>
<property name="allowEmptyMethods" value="true"/>
<property name="allowEmptyTypes" value="true"/>
<property name="allowEmptyLoops" value="true"/>
</module>
<!-- Validate whitespace around Generics (angle brackets) per typical conventions
http://checkstyle.sourceforge.net/config_whitespace.html#GenericWhitespace -->
<module name="GenericWhitespace"/>
<!-- ##### Requirements for "switch" statements ##### -->
<!-- "switch" statements MUST have a "default" clause -->
<module name="MissingSwitchDefault"/>
<!-- "case" clauses in switch statements MUST include break, return, throw or continue -->
<module name="FallThrough"/>
<!-- ##### Other / Miscellaneous requirements ##### -->
<!-- Require utility classes do not have a public constructor -->
<module name="HideUtilityClassConstructor"/>
<!-- Require each variable declaration is its own statement on its own line -->
<module name="MultipleVariableDeclarations"/>
<!-- Each line of code can only include one statement -->
<module name="OneStatementPerLine"/>
<!-- Require that "catch" statements are not empty (must at least contain a comment) -->
<module name="EmptyCatchBlock"/>
</module>
</module>
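The trailing-whitespace rule above is just a regular expression, so it can be exercised outside of Checkstyle. A minimal sketch in plain Java (the class name is hypothetical; the pattern is copied verbatim from the `RegexpSingleline` module):

```java
import java.util.regex.Pattern;

public class TrailingWhitespaceCheck {
    // Same pattern as the RegexpSingleline module in checkstyle.xml:
    // flag trailing whitespace, unless it is the single space that follows
    // a Javadoc continuation asterisk (e.g. " * ").
    private static final Pattern TRAILING =
        Pattern.compile("(?<!\\*)\\s+$|\\*\\s\\s+$");

    public static boolean hasTrailingWhitespace(String line) {
        return TRAILING.matcher(line).find();
    }

    public static void main(String[] args) {
        System.out.println(hasTrailingWhitespace("int x = 5;   ")); // true
        System.out.println(hasTrailingWhitespace(" * "));           // false
        System.out.println(hasTrailingWhitespace("int x = 5;"));    // false
    }
}
```

The negative lookbehind `(?<!\*)` is what exempts a lone `" * "` Javadoc line while still flagging ordinary code lines with trailing blanks.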

@@ -1,4 +1,5 @@
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
     <modelVersion>4.0.0</modelVersion>
     <groupId>org.dspace</groupId>
     <artifactId>dspace-api</artifactId>
@@ -12,7 +13,7 @@
     <parent>
         <groupId>org.dspace</groupId>
         <artifactId>dspace-parent</artifactId>
-        <version>6.0</version>
+        <version>7.0-SNAPSHOT</version>
         <relativePath>..</relativePath>
     </parent>
@@ -205,14 +206,15 @@
                 <executions>
                     <execution>
                         <id>setproperty</id>
-                        <phase>generate-test-resources</phase> <!-- XXX I think this should be 'initialize' - MHW -->
+                        <phase>generate-test-resources
+                        </phase> <!-- XXX I think this should be 'initialize' - MHW -->
                         <goals>
                             <goal>execute</goal>
                         </goals>
                         <configuration>
                             <source>
-                                project.properties['agnostic.build.dir']=project.build.directory.replace(File.separator,'/');
+                                project.properties['agnostic.build.dir'] = project.build.directory.replace(File.separator, '/');
                                 println("Initializing Maven property 'agnostic.build.dir' to: " + project.properties['agnostic.build.dir']);
                             </source>
                         </configuration>
                     </execution>
@@ -239,40 +241,39 @@
                 <artifactId>xml-maven-plugin</artifactId>
                 <version>1.0.1</version>
                 <executions>
                     <execution>
                         <id>validate-ALL-xml-and-xsl</id>
                         <phase>process-test-resources</phase>
                         <goals>
                             <goal>validate</goal>
                         </goals>
                     </execution>
                 </executions>
                 <configuration>
                     <validationSets>
                         <!-- validate ALL XML and XSL config files in the testing folder -->
                         <validationSet>
                             <dir>${agnostic.build.dir}/testing</dir>
                             <includes>
                                 <include>**/*.xml</include>
                                 <include>**/*.xsl</include>
                                 <include>**/*.xconf</include>
                             </includes>
                         </validationSet>
                         <!-- validate ALL XML and XSL files throughout the project -->
                         <validationSet>
                             <dir>${root.basedir}</dir>
                             <includes>
                                 <include>**/*.xml</include>
                                 <include>**/*.xsl</include>
                                 <include>**/*.xmap</include>
                             </includes>
                         </validationSet>
                     </validationSets>
                 </configuration>
             </plugin>
             <!-- Run Integration Testing! This plugin just kicks off the tests (when enabled). -->
             <plugin>
                 <artifactId>maven-failsafe-plugin</artifactId>
@@ -291,7 +292,6 @@
         </profile>
     </profiles>
     <dependencies>
         <dependency>
             <groupId>org.hibernate</groupId>
@@ -304,13 +304,24 @@
             </exclusions>
         </dependency>
         <dependency>
             <groupId>org.hibernate</groupId>
             <artifactId>hibernate-ehcache</artifactId>
         </dependency>
+        <dependency>
+            <groupId>org.hibernate</groupId>
+            <artifactId>hibernate-validator-cdi</artifactId>
+            <version>${hibernate-validator.version}</version>
+        </dependency>
         <dependency>
             <groupId>org.springframework</groupId>
             <artifactId>spring-orm</artifactId>
         </dependency>
+        <dependency>
+            <groupId>org.glassfish</groupId>
+            <artifactId>javax.el</artifactId>
+            <version>3.0.1-b10</version>
+        </dependency>
         <dependency>
             <groupId>org.dspace</groupId>
             <artifactId>handle</artifactId>
@@ -354,6 +365,10 @@
             <groupId>commons-collections</groupId>
             <artifactId>commons-collections</artifactId>
         </dependency>
+        <dependency>
+            <groupId>org.apache.commons</groupId>
+            <artifactId>commons-collections4</artifactId>
+        </dependency>
         <dependency>
             <groupId>org.apache.commons</groupId>
             <artifactId>commons-dbcp2</artifactId>
@@ -415,8 +430,8 @@
             <artifactId>pdfbox</artifactId>
         </dependency>
         <dependency>
             <groupId>org.apache.pdfbox</groupId>
             <artifactId>fontbox</artifactId>
         </dependency>
         <dependency>
             <groupId>org.bouncycastle</groupId>
@@ -434,10 +449,6 @@
             <groupId>org.apache.poi</groupId>
             <artifactId>poi-scratchpad</artifactId>
         </dependency>
-        <dependency>
-            <groupId>org.apache.poi</groupId>
-            <artifactId>poi-ooxml</artifactId>
-        </dependency>
         <dependency>
             <groupId>rome</groupId>
             <artifactId>rome</artifactId>
@@ -501,10 +512,11 @@
             <scope>test</scope>
         </dependency>
         <dependency>
-            <groupId>org.databene</groupId>
-            <artifactId>contiperf</artifactId>
+            <groupId>org.mockito</groupId>
+            <artifactId>mockito-core</artifactId>
             <scope>test</scope>
         </dependency>
         <dependency>
             <groupId>org.rometools</groupId>
             <artifactId>rome-modules</artifactId>
@@ -568,9 +580,9 @@
             <artifactId>commons-configuration</artifactId>
         </dependency>
         <dependency>
-            <groupId>com.maxmind.geoip</groupId>
-            <artifactId>geoip-api</artifactId>
-            <version>1.3.0</version>
+            <groupId>com.maxmind.geoip2</groupId>
+            <artifactId>geoip2</artifactId>
+            <version>2.11.0</version>
         </dependency>
         <dependency>
             <groupId>org.apache.ant</groupId>
@@ -583,9 +595,9 @@
         </dependency>
         <dependency>
-            <groupId>org.elasticsearch</groupId>
-            <artifactId>elasticsearch</artifactId>
-            <version>1.4.0</version>
+            <groupId>org.apache.lucene</groupId>
+            <artifactId>lucene-core</artifactId>
+            <version>4.10.4</version>
         </dependency>
         <dependency>
@@ -671,7 +683,6 @@
         <dependency>
             <groupId>joda-time</groupId>
             <artifactId>joda-time</artifactId>
-            <version>2.9.2</version>
         </dependency>
         <dependency>
             <groupId>javax.inject</groupId>
@@ -696,7 +707,7 @@
         <dependency>
             <groupId>org.glassfish.jersey.core</groupId>
             <artifactId>jersey-client</artifactId>
-            <version>2.22.1</version>
+            <version>${jersey.version}</version>
         </dependency>
         <!-- S3 -->
         <dependency>
@@ -718,17 +729,10 @@
         <dependency>
             <groupId>com.fasterxml.jackson.core</groupId>
             <artifactId>jackson-core</artifactId>
-            <version>2.7.0</version>
         </dependency>
         <dependency>
             <groupId>com.fasterxml.jackson.core</groupId>
             <artifactId>jackson-databind</artifactId>
-            <version>2.7.0</version>
-        </dependency>
-        <dependency>
-            <groupId>com.fasterxml.jackson.core</groupId>
-            <artifactId>jackson-annotations</artifactId>
-            <version>2.7.0</version>
         </dependency>
     </dependencies>
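The `setproperty` execution above initializes `agnostic.build.dir` by replacing the platform path separator with `/`. A sketch of the same transformation in plain Java (the POM does this in Groovy with `project.build.directory` and `File.separator`; the class and method names here are hypothetical):

```java
public class AgnosticPathDemo {
    // Mirror of the POM's Groovy snippet: make a build directory path
    // platform-agnostic by replacing the separator with '/'.
    public static String agnostic(String dir, String separator) {
        return dir.replace(separator, "/");
    }

    public static void main(String[] args) {
        // On Windows the build directory uses backslashes:
        System.out.println(agnostic("C:\\projects\\dspace-api\\target", "\\"));
        // prints C:/projects/dspace-api/target
    }
}
```

On POSIX systems the replacement is a no-op, which is why the property is safe to use in cross-platform configuration such as the `xml-maven-plugin` validation sets.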

@@ -19,147 +19,145 @@ package org.apache.solr.handler.extraction;
 /**
  * The various Solr Parameters names to use when extracting content.
- *
  **/
 public interface ExtractingParams {
     /**
      * Map all generated attribute names to field names with lowercase and underscores.
      */
     public static final String LOWERNAMES = "lowernames";
     /**
      * if true, ignore TikaException (give up to extract text but index meta data)
      */
     public static final String IGNORE_TIKA_EXCEPTION = "ignoreTikaException";
     /**
      * The param prefix for mapping Tika metadata to Solr fields.
      * <p>
      * To map a field, add a name like:
      * <pre>fmap.title=solr.title</pre>
      *
      * In this example, the tika "title" metadata value will be added to a Solr field named "solr.title"
-     *
-     *
      */
     public static final String MAP_PREFIX = "fmap.";
     /**
      * The boost value for the name of the field. The boost can be specified by a name mapping.
      * <p>
      * For example
      * <pre>
      * map.title=solr.title
      * boost.solr.title=2.5
      * </pre>
      * will boost the solr.title field for this document by 2.5
-     *
      */
     public static final String BOOST_PREFIX = "boost.";
     /**
      * Pass in literal values to be added to the document, as in
      * <pre>
      * literal.myField=Foo
      * </pre>
-     *
      */
     public static final String LITERALS_PREFIX = "literal.";
     /**
      * Restrict the extracted parts of a document to be indexed
      * by passing in an XPath expression. All content that satisfies the XPath expr.
-     * will be passed to the {@link SolrContentHandler}.
+     * will be passed to the {@link org.apache.solr.handler.extraction.SolrContentHandler}.
      * <p>
      * See Tika's docs for what the extracted document looks like.
-     * <p>
+     *
      * @see #CAPTURE_ELEMENTS
      */
     public static final String XPATH_EXPRESSION = "xpath";
     /**
      * Only extract and return the content, do not index it.
      */
     public static final String EXTRACT_ONLY = "extractOnly";
     /**
      * Content output format if extractOnly is true. Default is "xml", alternative is "text".
      */
     public static final String EXTRACT_FORMAT = "extractFormat";
     /**
-     * Capture attributes separately according to the name of the element, instead of just adding them to the string buffer
+     * Capture attributes separately according to the name of the element, instead of just adding them to the string
+     * buffer
      */
     public static final String CAPTURE_ATTRIBUTES = "captureAttr";
     /**
-     * Literal field values will by default override other values such as metadata and content. Set this to false to revert to pre-4.0 behaviour
+     * Literal field values will by default override other values such as metadata and content. Set this to false to
+     * revert to pre-4.0 behaviour
      */
     public static final String LITERALS_OVERRIDE = "literalsOverride";
     /**
-     * Capture the specified fields (and everything included below it that isn't capture by some other capture field) separately from the default. This is different
+     * Capture the specified fields (and everything included below it that isn't capture by some other capture field)
+     * separately from the default. This is different
      * then the case of passing in an XPath expression.
      * <p>
-     * The Capture field is based on the localName returned to the {@link SolrContentHandler}
+     * The Capture field is based on the localName returned to the
+     * {@link org.apache.solr.handler.extraction.SolrContentHandler}
      * by Tika, not to be confused by the mapped field. The field name can then
      * be mapped into the index schema.
      * <p>
     * For instance, a Tika document may look like:
      * <pre>
      * &lt;html&gt;
      * ...
      * &lt;body&gt;
      * &lt;p&gt;some text here. &lt;div&gt;more text&lt;/div&gt;&lt;/p&gt;
      * Some more text
      * &lt;/body&gt;
      * </pre>
      * By passing in the p tag, you could capture all P tags separately from the rest of the t
      * Thus, in the example, the capture of the P tag would be: "some text here. more text"
-     *
      */
     public static final String CAPTURE_ELEMENTS = "capture";
     /**
      * The type of the stream. If not specified, Tika will use mime type detection.
      */
     public static final String STREAM_TYPE = "stream.type";
     /**
      * Optional. The file name. If specified, Tika can take this into account while
      * guessing the MIME type.
      */
     public static final String RESOURCE_NAME = "resource.name";
     /**
      * Optional. The password for this resource. Will be used instead of the rule based password lookup mechanisms
      */
     public static final String RESOURCE_PASSWORD = "resource.password";
     /**
      * Optional. If specified, the prefix will be prepended to all Metadata, such that it would be possible
      * to setup a dynamic field to automatically capture it
      */
     public static final String UNKNOWN_FIELD_PREFIX = "uprefix";
     /**
      * Optional. If specified and the name of a potential field cannot be determined, the default Field specified
      * will be used instead.
      */
     public static final String DEFAULT_FIELD = "defaultField";
     /**
      * Optional. If specified, loads the file as a source for password lookups for Tika encrypted documents.
      * <p>
      * File format is Java properties format with one key=value per line.
      * The key is evaluated as a regex against the file name, and the value is the password
      * The rules are evaluated top-bottom, i.e. the first match will be used
      * If you want a fallback password to be always used, supply a .*=&lt;defaultmypassword&gt; at the end
      */
     public static final String PASSWORD_MAP_FILE = "passwordsFile";
 }
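The constants above are prefixes that get combined with caller-supplied suffixes to form request parameter names. A sketch of assembling the parameters from the Javadoc examples as plain strings (this is illustrative only, not Solr or DSpace API code; the class and method names are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ExtractingParamsDemo {
    // Prefixes copied from ExtractingParams above.
    static final String MAP_PREFIX = "fmap.";
    static final String BOOST_PREFIX = "boost.";
    static final String LITERALS_PREFIX = "literal.";

    // Build the parameters for mapping Tika's "title" metadata to
    // "solr.title", boosting it, and adding a literal field, mirroring
    // the examples in the Javadoc.
    public static Map<String, String> exampleParams() {
        Map<String, String> params = new LinkedHashMap<>();
        params.put(MAP_PREFIX + "title", "solr.title");  // fmap.title=solr.title
        params.put(BOOST_PREFIX + "solr.title", "2.5");  // boost.solr.title=2.5
        params.put(LITERALS_PREFIX + "myField", "Foo");  // literal.myField=Foo
        return params;
    }

    public static void main(String[] args) {
        exampleParams().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

In a real request these entries would be sent as query parameters to the extracting handler; only the prefix strings themselves come from the interface.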

@@ -30,13 +30,12 @@ import org.dspace.handle.service.HandleService;
 /**
  * A command-line tool for setting/removing community/sub-community
  * relationships. Takes community DB Id or handle arguments as inputs.
  *
  * @author rrodgers
  * @version $Revision$
  */
-public class CommunityFiliator
-{
+public class CommunityFiliator {
     protected CommunityService communityService;
     protected HandleService handleService;
@@ -47,12 +46,10 @@ public class CommunityFiliator
     }
     /**
-     *
-     * @param argv arguments
+     * @param argv the command line arguments given
      * @throws Exception if error
      */
-    public static void main(String[] argv) throws Exception
-    {
+    public static void main(String[] argv) throws Exception {
         // create an options object and populate it
         CommandLineParser parser = new PosixParser();
@@ -60,11 +57,11 @@ public class CommunityFiliator
         options.addOption("s", "set", false, "set a parent/child relationship");
         options.addOption("r", "remove", false,
                           "remove a parent/child relationship");
         options.addOption("p", "parent", true,
                           "parent community (handle or database ID)");
         options.addOption("c", "child", true,
                           "child community (handle or databaseID)");
         options.addOption("h", "help", false, "help");
         CommandLine line = parser.parse(options, argv);
@@ -73,57 +70,48 @@ public class CommunityFiliator
         String parentID = null;
         String childID = null;
-        if (line.hasOption('h'))
-        {
+        if (line.hasOption('h')) {
             HelpFormatter myhelp = new HelpFormatter();
             myhelp.printHelp("CommunityFiliator\n", options);
             System.out
                 .println("\nestablish a relationship: CommunityFiliator -s -p parentID -c childID");
             System.out
                 .println("remove a relationship: CommunityFiliator -r -p parentID -c childID");
             System.exit(0);
         }
-        if (line.hasOption('s'))
-        {
+        if (line.hasOption('s')) {
             command = "set";
         }
-        if (line.hasOption('r'))
-        {
+        if (line.hasOption('r')) {
             command = "remove";
         }
-        if (line.hasOption('p')) // parent
-        {
+        if (line.hasOption('p')) { // parent
             parentID = line.getOptionValue('p');
         }
-        if (line.hasOption('c')) // child
-        {
+        if (line.hasOption('c')) { // child
             childID = line.getOptionValue('c');
         }
         // now validate
         // must have a command set
-        if (command == null)
-        {
+        if (command == null) {
             System.out
                 .println("Error - must run with either set or remove (run with -h flag for details)");
             System.exit(1);
         }
-        if ("set".equals(command) || "remove".equals(command))
-        {
-            if (parentID == null)
-            {
+        if ("set".equals(command) || "remove".equals(command)) {
+            if (parentID == null) {
                 System.out.println("Error - a parentID must be specified (run with -h flag for details)");
                 System.exit(1);
             }
-            if (childID == null)
-            {
+            if (childID == null) {
                 System.out.println("Error - a childID must be specified (run with -h flag for details)");
                 System.exit(1);
             }
@@ -135,71 +123,57 @@ public class CommunityFiliator
         // we are superuser!
         c.turnOffAuthorisationSystem();
-        try
-        {
+        try {
             // validate and resolve the parent and child IDs into commmunities
             Community parent = filiator.resolveCommunity(c, parentID);
             Community child = filiator.resolveCommunity(c, childID);
-            if (parent == null)
-            {
+            if (parent == null) {
                 System.out.println("Error, parent community cannot be found: "
                                        + parentID);
                 System.exit(1);
             }
-            if (child == null)
-            {
+            if (child == null) {
                 System.out.println("Error, child community cannot be found: "
                                        + childID);
                 System.exit(1);
             }
-            if ("set".equals(command))
-            {
+            if ("set".equals(command)) {
                 filiator.filiate(c, parent, child);
-            }
-            else
-            {
+            } else {
                 filiator.defiliate(c, parent, child);
             }
-        }
-        catch (SQLException sqlE)
-        {
+        } catch (SQLException sqlE) {
             System.out.println("Error - SQL exception: " + sqlE.toString());
-        }
-        catch (AuthorizeException authE)
-        {
+        } catch (AuthorizeException authE) {
             System.out.println("Error - Authorize exception: "
                                    + authE.toString());
-        }
-        catch (IOException ioE)
-        {
+        } catch (IOException ioE) {
             System.out.println("Error - IO exception: " + ioE.toString());
         }
     }
     /**
-     *
-     * @param c context
+     * @param c      context
      * @param parent parent Community
      * @param child  child community
      * @throws SQLException if database error
      * @throws AuthorizeException if authorize error
      * @throws IOException if IO error
      */
     public void filiate(Context c, Community parent, Community child)
-        throws SQLException, AuthorizeException, IOException
-    {
+        throws SQLException, AuthorizeException, IOException {
         // check that a valid filiation would be established
         // first test - proposed child must currently be an orphan (i.e.
         // top-level)
-        Community childDad = CollectionUtils.isNotEmpty(child.getParentCommunities()) ? child.getParentCommunities().iterator().next() : null;
+        Community childDad = CollectionUtils.isNotEmpty(child.getParentCommunities()) ? child.getParentCommunities()
+            .iterator().next() : null;
-        if (childDad != null)
-        {
+        if (childDad != null) {
             System.out.println("Error, child community: " + child.getID()
                                    + " already a child of: " + childDad.getID());
             System.exit(1);
         }
@@ -207,12 +181,10 @@ public class CommunityFiliator
         // child
         List<Community> parentDads = parent.getParentCommunities();
-        for (int i = 0; i < parentDads.size(); i++)
-        {
-            if (parentDads.get(i).getID().equals(child.getID()))
-            {
+        for (int i = 0; i < parentDads.size(); i++) {
+            if (parentDads.get(i).getID().equals(child.getID())) {
                 System.out
                     .println("Error, circular parentage - child is parent of parent");
                 System.exit(1);
             }
         }
@@ -223,39 +195,34 @@ public class CommunityFiliator
         // complete the pending transaction
         c.complete();
         System.out.println("Filiation complete. Community: '" + parent.getID()
                                + "' is parent of community: '" + child.getID() + "'");
     }
     /**
-     *
-     * @param c context
+     * @param c      context
      * @param parent parent Community
      * @param child  child community
      * @throws SQLException if database error
     * @throws AuthorizeException if authorize error
      * @throws IOException if IO error
      */
     public void defiliate(Context c, Community parent, Community child)
-        throws SQLException, AuthorizeException, IOException
-    {
+        throws SQLException, AuthorizeException, IOException {
         // verify that child is indeed a child of parent
         List<Community> parentKids = parent.getSubcommunities();
         boolean isChild = false;
-        for (int i = 0; i < parentKids.size(); i++)
-        {
-            if (parentKids.get(i).getID().equals(child.getID()))
-            {
+        for (int i = 0; i < parentKids.size(); i++) {
+            if (parentKids.get(i).getID().equals(child.getID())) {
                 isChild = true;
                 break;
             }
         }
-        if (!isChild)
-        {
+        if (!isChild) {
             System.out
                 .println("Error, child community not a child of parent community");
             System.exit(1);
         }
@@ -269,37 +236,33 @@ public class CommunityFiliator
         // complete the pending transaction
         c.complete();
         System.out.println("Defiliation complete. Community: '" + child.getID()
                                + "' is no longer a child of community: '" + parent.getID()
                                + "'");
     }
     /**
      * Find a community by ID
-     * @param c context
+     *
+     * @param c           context
      * @param communityID community ID
      * @return Community object
      * @throws SQLException if database error
      */
     protected Community resolveCommunity(Context c, String communityID)
throws SQLException throws SQLException {
{
Community community = null; Community community = null;
if (communityID.indexOf('/') != -1) if (communityID.indexOf('/') != -1) {
{
// has a / must be a handle // has a / must be a handle
community = (Community) handleService.resolveToObject(c, community = (Community) handleService.resolveToObject(c,
communityID); communityID);
// ensure it's a community // ensure it's a community
if ((community == null) if ((community == null)
|| (community.getType() != Constants.COMMUNITY)) || (community.getType() != Constants.COMMUNITY)) {
{
community = null; community = null;
} }
} } else {
else
{
community = communityService.find(c, UUID.fromString(communityID)); community = communityService.find(c, UUID.fromString(communityID));
} }


@@ -15,7 +15,6 @@ import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
@@ -34,21 +33,21 @@ import org.dspace.eperson.service.GroupService;
 * <P>
 * Alternatively, it can be used to take the email, first name, last name and
 * desired password as arguments thus:
 *
 * CreateAdministrator -e [email] -f [first name] -l [last name] -p [password]
 *
 * This is particularly convenient for automated deploy scripts that require an
 * initial administrator, for example, before deployment can be completed
 *
 * @author Robert Tansley
 * @author Richard Jones
 * @version $Revision$
 */
public final class CreateAdministrator {
    /**
     * DSpace Context object
     */
    private final Context context;

    protected EPersonService ePersonService;
    protected GroupService groupService;
@@ -56,222 +55,199 @@ public final class CreateAdministrator
    /**
     * For invoking via the command line. If called with no command line arguments,
     * it will negotiate with the user for the administrator details
     *
     * @param argv the command line arguments given
     * @throws Exception if error
     */
    public static void main(String[] argv)
        throws Exception {
        CommandLineParser parser = new PosixParser();
        Options options = new Options();

        CreateAdministrator ca = new CreateAdministrator();

        options.addOption("e", "email", true, "administrator email address");
        options.addOption("f", "first", true, "administrator first name");
        options.addOption("l", "last", true, "administrator last name");
        options.addOption("c", "language", true, "administrator language");
        options.addOption("p", "password", true, "administrator password");

        CommandLine line = parser.parse(options, argv);

        if (line.hasOption("e") && line.hasOption("f") && line.hasOption("l") &&
            line.hasOption("c") && line.hasOption("p")) {
            ca.createAdministrator(line.getOptionValue("e"),
                line.getOptionValue("f"), line.getOptionValue("l"),
                line.getOptionValue("c"), line.getOptionValue("p"));
        } else {
            ca.negotiateAdministratorDetails();
        }
    }

    /**
     * constructor, which just creates an object with a ready context
     *
     * @throws Exception if error
     */
    protected CreateAdministrator()
        throws Exception {
        context = new Context();
        groupService = EPersonServiceFactory.getInstance().getGroupService();
        ePersonService = EPersonServiceFactory.getInstance().getEPersonService();
    }

    /**
     * Method which will negotiate with the user via the command line to
     * obtain the administrator's details
     *
     * @throws Exception if error
     */
    protected void negotiateAdministratorDetails()
        throws Exception {
        Console console = System.console();

        System.out.println("Creating an initial administrator account");

        boolean dataOK = false;

        String email = null;
        String firstName = null;
        String lastName = null;
        char[] password1 = null;
        char[] password2 = null;
        String language = I18nUtil.DEFAULTLOCALE.getLanguage();

        while (!dataOK) {
            System.out.print("E-mail address: ");
            System.out.flush();

            email = console.readLine();

            if (!StringUtils.isBlank(email)) {
                email = email.trim();
            } else {
                System.out.println("Please provide an email address.");
                continue;
            }

            System.out.print("First name: ");
            System.out.flush();

            firstName = console.readLine();

            if (firstName != null) {
                firstName = firstName.trim();
            }

            System.out.print("Last name: ");
            System.out.flush();

            lastName = console.readLine();

            if (lastName != null) {
                lastName = lastName.trim();
            }

            if (ConfigurationManager.getProperty("webui.supported.locales") != null) {
                System.out.println("Select one of the following languages: " + ConfigurationManager
                    .getProperty("webui.supported.locales"));
                System.out.print("Language: ");
                System.out.flush();

                language = console.readLine();

                if (language != null) {
                    language = language.trim();
                    language = I18nUtil.getSupportedLocale(new Locale(language)).getLanguage();
                }
            }

            System.out.println("Password will not display on screen.");
            System.out.print("Password: ");
            System.out.flush();

            password1 = console.readPassword();

            System.out.print("Again to confirm: ");
            System.out.flush();

            password2 = console.readPassword();

            //TODO real password validation
            if (password1.length > 1 && Arrays.equals(password1, password2)) {
                // password OK
                System.out.print("Is the above data correct? (y or n): ");
                System.out.flush();

                String s = console.readLine();

                if (s != null) {
                    s = s.trim();
                    if (s.toLowerCase().startsWith("y")) {
                        dataOK = true;
                    }
                }
            } else {
                System.out.println("Passwords don't match");
            }
        }

        // if we make it to here, we are ready to create an administrator
        createAdministrator(email, firstName, lastName, language, String.valueOf(password1));

        //Cleaning arrays that held password
        Arrays.fill(password1, ' ');
        Arrays.fill(password2, ' ');
    }

    /**
     * Create the administrator with the given details. If the user
     * already exists then they are simply upped to administrator status
     *
     * @param email    the email for the user
     * @param first    user's first name
     * @param last     user's last name
     * @param language preferred language
     * @param pw       desired password
     * @throws Exception if error
     */
    protected void createAdministrator(String email, String first, String last,
                                       String language, String pw)
        throws Exception {
        // Of course we aren't an administrator yet so we need to
        // circumvent authorisation
        context.turnOffAuthorisationSystem();

        // Find administrator group
        Group admins = groupService.findByName(context, Group.ADMIN);

        if (admins == null) {
            throw new IllegalStateException("Error, no admin group (group 1) found");
        }

        // Create the administrator e-person
        EPerson eperson = ePersonService.findByEmail(context, email);

        // check if the email belongs to a registered user,
        // if not create a new user with this email
        if (eperson == null) {
            eperson = ePersonService.create(context);
            eperson.setEmail(email);
            eperson.setCanLogIn(true);
            eperson.setRequireCertificate(false);
            eperson.setSelfRegistered(false);
        }

        eperson.setLastName(context, last);
        eperson.setFirstName(context, first);
        eperson.setLanguage(context, language);
        ePersonService.setPassword(eperson, pw);

        ePersonService.update(context, eperson);

        groupService.addMember(context, admins, eperson);
        groupService.update(context, admins);

        context.complete();

        System.out.println("Administrator account created");
    }
}
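The javadoc above notes that the argument form is convenient for automated deploy scripts. As a sketch of that, assuming a stock DSpace install where the `[dspace]/bin/dspace` launcher maps `create-administrator` to this class (the install path and all account values below are hypothetical placeholders), a deploy script might assemble the call like this:

```shell
# Hypothetical values for illustration only; substitute real deployment details.
DSPACE_HOME="/dspace"            # assumed install directory
ADMIN_EMAIL="admin@example.org"
ADMIN_FIRST="Site"
ADMIN_LAST="Administrator"
ADMIN_LANG="en"
ADMIN_PASS="changeme"

# Print the command rather than executing it, so this sketch is safe to run
# on a machine without DSpace installed.
echo "$DSPACE_HOME/bin/dspace" create-administrator \
    -e "$ADMIN_EMAIL" -f "$ADMIN_FIRST" -l "$ADMIN_LAST" \
    -c "$ADMIN_LANG" -p "$ADMIN_PASS"
```

Because all five options (`-e -f -l -c -p`) are supplied, `main` takes the non-interactive branch; omit any of them and the tool falls back to `negotiateAdministratorDetails()`.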


@@ -7,7 +7,19 @@
*/ */
package org.dspace.administer; package org.dspace.administer;
import org.apache.commons.cli.*; import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.xml.serialize.Method; import org.apache.xml.serialize.Method;
import org.apache.xml.serialize.OutputFormat; import org.apache.xml.serialize.OutputFormat;
import org.apache.xml.serialize.XMLSerializer; import org.apache.xml.serialize.XMLSerializer;
@@ -19,69 +31,63 @@ import org.dspace.content.service.MetadataSchemaService;
import org.dspace.core.Context; import org.dspace.core.Context;
import org.xml.sax.SAXException; import org.xml.sax.SAXException;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
/** /**
* @author Graham Triggs * @author Graham Triggs
* *
* This class creates an xml document as passed in the arguments and * This class creates an xml document as passed in the arguments and
* from the metadata schemas for the repository. * from the metadata schemas for the repository.
* *
* The form of the XML is as follows * The form of the XML is as follows
* {@code * {@code
* <metadata-schemas> * <metadata-schemas>
* <schema> * <schema>
* <name>dc</name> * <name>dc</name>
* <namespace>http://dublincore.org/documents/dcmi-terms/</namespace> * <namespace>http://dublincore.org/documents/dcmi-terms/</namespace>
* </schema> * </schema>
* </metadata-schemas> * </metadata-schemas>
* } * }
*/ */
public class MetadataExporter public class MetadataExporter {
{
protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance().getMetadataSchemaService(); protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance()
protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService(); .getMetadataSchemaService();
protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance()
.getMetadataFieldService();
/**
* Default constructor
*/
private MetadataExporter() { }
/** /**
* @param args commandline arguments * @param args commandline arguments
* @throws ParseException if parser error * @throws ParseException if parser error
* @throws SAXException if XML parse error * @throws SAXException if XML parse error
* @throws IOException if IO error * @throws IOException if IO error
* @throws SQLException if database error * @throws SQLException if database error
* @throws RegistryExportException if export error * @throws RegistryExportException if export error
*/ */
public static void main(String[] args) throws ParseException, SQLException, IOException, SAXException, RegistryExportException public static void main(String[] args)
{ throws ParseException, SQLException, IOException, SAXException, RegistryExportException {
// create an options object and populate it // create an options object and populate it
CommandLineParser parser = new PosixParser(); CommandLineParser parser = new PosixParser();
Options options = new Options(); Options options = new Options();
options.addOption("f", "file", true, "output xml file for registry"); options.addOption("f", "file", true, "output xml file for registry");
options.addOption("s", "schema", true, "the name of the schema to export"); options.addOption("s", "schema", true, "the name of the schema to export");
CommandLine line = parser.parse(options, args); CommandLine line = parser.parse(options, args);
String file = null; String file = null;
String schema = null; String schema = null;
if (line.hasOption('f')) if (line.hasOption('f')) {
{ file = line.getOptionValue('f');
file = line.getOptionValue('f'); } else {
}
else
{
usage(); usage();
System.exit(0); System.exit(0);
} }
if (line.hasOption('s')) if (line.hasOption('s')) {
{
schema = line.getOptionValue('s'); schema = line.getOptionValue('s');
} }
@@ -90,15 +96,16 @@ public class MetadataExporter
/** /**
* Save a registry to a filepath * Save a registry to a filepath
* @param file filepath *
* @param file filepath
* @param schema schema definition to save * @param schema schema definition to save
* @throws SQLException if database error * @throws SQLException if database error
* @throws IOException if IO error * @throws IOException if IO error
* @throws SAXException if XML error * @throws SAXException if XML error
* @throws RegistryExportException if export error * @throws RegistryExportException if export error
*/ */
public static void saveRegistry(String file, String schema) throws SQLException, IOException, SAXException, RegistryExportException public static void saveRegistry(String file, String schema)
{ throws SQLException, IOException, SAXException, RegistryExportException {
// create a context // create a context
Context context = new Context(); Context context = new Context();
context.turnOffAuthorisationSystem(); context.turnOffAuthorisationSystem();
@@ -106,113 +113,102 @@ public class MetadataExporter
OutputFormat xmlFormat = new OutputFormat(Method.XML, "UTF-8", true); OutputFormat xmlFormat = new OutputFormat(Method.XML, "UTF-8", true);
xmlFormat.setLineWidth(120); xmlFormat.setLineWidth(120);
xmlFormat.setIndent(4); xmlFormat.setIndent(4);
XMLSerializer xmlSerializer = new XMLSerializer(new BufferedWriter(new FileWriter(file)), xmlFormat); XMLSerializer xmlSerializer = new XMLSerializer(new BufferedWriter(new FileWriter(file)), xmlFormat);
// XMLSerializer xmlSerializer = new XMLSerializer(System.out, xmlFormat); // XMLSerializer xmlSerializer = new XMLSerializer(System.out, xmlFormat);
xmlSerializer.startDocument(); xmlSerializer.startDocument();
xmlSerializer.startElement("dspace-dc-types", null); xmlSerializer.startElement("dspace-dc-types", null);
// Save the schema definition(s) // Save the schema definition(s)
saveSchema(context, xmlSerializer, schema); saveSchema(context, xmlSerializer, schema);
List<MetadataField> mdFields = null; List<MetadataField> mdFields = null;
// If a single schema has been specified // If a single schema has been specified
if (schema != null && !"".equals(schema)) if (schema != null && !"".equals(schema)) {
{
// Get the id of that schema // Get the id of that schema
MetadataSchema mdSchema = metadataSchemaService.find(context, schema); MetadataSchema mdSchema = metadataSchemaService.find(context, schema);
if (mdSchema == null) if (mdSchema == null) {
{
throw new RegistryExportException("no schema to export"); throw new RegistryExportException("no schema to export");
} }
// Get the metadata fields only for the specified schema // Get the metadata fields only for the specified schema
mdFields = metadataFieldService.findAllInSchema(context, mdSchema); mdFields = metadataFieldService.findAllInSchema(context, mdSchema);
} } else {
else
{
// Get the metadata fields for all the schemas // Get the metadata fields for all the schemas
mdFields = metadataFieldService.findAll(context); mdFields = metadataFieldService.findAll(context);
} }
// Output the metadata fields // Output the metadata fields
for (MetadataField mdField : mdFields) for (MetadataField mdField : mdFields) {
{
saveType(context, xmlSerializer, mdField); saveType(context, xmlSerializer, mdField);
} }
xmlSerializer.endElement("dspace-dc-types"); xmlSerializer.endElement("dspace-dc-types");
xmlSerializer.endDocument(); xmlSerializer.endDocument();
// abort the context, as we shouldn't have changed it!! // abort the context, as we shouldn't have changed it!!
context.abort(); context.abort();
} }
/** /**
* Serialize the schema registry. If the parameter 'schema' is null or empty, save all schemas * Serialize the schema registry. If the parameter 'schema' is null or empty, save all schemas
* @param context DSpace Context *
* @param context DSpace Context
* @param xmlSerializer XML serializer * @param xmlSerializer XML serializer
* @param schema schema (may be null to save all) * @param schema schema (may be null to save all)
* @throws SQLException if database error * @throws SQLException if database error
* @throws SAXException if XML error * @throws SAXException if XML error
* @throws RegistryExportException if export error * @throws RegistryExportException if export error
*/ */
public static void saveSchema(Context context, XMLSerializer xmlSerializer, String schema) throws SQLException, SAXException, RegistryExportException public static void saveSchema(Context context, XMLSerializer xmlSerializer, String schema)
{ throws SQLException, SAXException, RegistryExportException {
if (schema != null && !"".equals(schema)) if (schema != null && !"".equals(schema)) {
{
// Find a single named schema // Find a single named schema
MetadataSchema mdSchema = metadataSchemaService.find(context, schema); MetadataSchema mdSchema = metadataSchemaService.find(context, schema);
saveSchema(xmlSerializer, mdSchema); saveSchema(xmlSerializer, mdSchema);
} } else {
else
{
// Find all schemas // Find all schemas
List<MetadataSchema> mdSchemas = metadataSchemaService.findAll(context); List<MetadataSchema> mdSchemas = metadataSchemaService.findAll(context);
for (MetadataSchema mdSchema : mdSchemas) for (MetadataSchema mdSchema : mdSchemas) {
{
saveSchema(xmlSerializer, mdSchema); saveSchema(xmlSerializer, mdSchema);
} }
} }
} }
/** /**
* Serialize a single schema (namespace) registry entry * Serialize a single schema (namespace) registry entry
* *
* @param xmlSerializer XML serializer * @param xmlSerializer XML serializer
* @param mdSchema DSpace metadata schema * @param mdSchema DSpace metadata schema
* @throws SAXException if XML error * @throws SAXException if XML error
* @throws RegistryExportException if export error * @throws RegistryExportException if export error
*/ */
private static void saveSchema(XMLSerializer xmlSerializer, MetadataSchema mdSchema) throws SAXException, RegistryExportException private static void saveSchema(XMLSerializer xmlSerializer, MetadataSchema mdSchema)
{ throws SAXException, RegistryExportException {
// If we haven't got a schema, it's an error // If we haven't got a schema, it's an error
if (mdSchema == null) if (mdSchema == null) {
{
throw new RegistryExportException("no schema to export"); throw new RegistryExportException("no schema to export");
} }
String name = mdSchema.getName(); String name = mdSchema.getName();
String namespace = mdSchema.getNamespace(); String namespace = mdSchema.getNamespace();
if (name == null || "".equals(name)) if (name == null || "".equals(name)) {
{
System.out.println("name is null, skipping"); System.out.println("name is null, skipping");
return; return;
} }
if (namespace == null || "".equals(namespace)) if (namespace == null || "".equals(namespace)) {
{
System.out.println("namespace is null, skipping"); System.out.println("namespace is null, skipping");
return; return;
} }
// Output the parent tag // Output the parent tag
xmlSerializer.startElement("dc-schema", null); xmlSerializer.startElement("dc-schema", null);
// Output the schema name // Output the schema name
xmlSerializer.startElement("name", null); xmlSerializer.startElement("name", null);
xmlSerializer.characters(name.toCharArray(), 0, name.length()); xmlSerializer.characters(name.toCharArray(), 0, name.length());
@@ -225,26 +221,25 @@ public class MetadataExporter
xmlSerializer.endElement("dc-schema"); xmlSerializer.endElement("dc-schema");
} }
/** /**
* Serialize a single metadata field registry entry to xml * Serialize a single metadata field registry entry to xml
* *
* @param context DSpace context * @param context DSpace context
* @param xmlSerializer xml serializer * @param xmlSerializer xml serializer
* @param mdField DSpace metadata field * @param mdField DSpace metadata field
* @throws SAXException if XML error * @throws SAXException if XML error
* @throws RegistryExportException if export error * @throws RegistryExportException if export error
* @throws SQLException if database error * @throws SQLException if database error
* @throws IOException if IO error * @throws IOException if IO error
*/ */
private static void saveType(Context context, XMLSerializer xmlSerializer, MetadataField mdField) throws SAXException, RegistryExportException, SQLException, IOException private static void saveType(Context context, XMLSerializer xmlSerializer, MetadataField mdField)
{ throws SAXException, RegistryExportException, SQLException, IOException {
// If we haven't been given a field, it's an error // If we haven't been given a field, it's an error
if (mdField == null) if (mdField == null) {
{
throw new RegistryExportException("no field to export"); throw new RegistryExportException("no field to export");
} }
// Get the data from the metadata field // Get the data from the metadata field
String schemaName = getSchemaName(context, mdField); String schemaName = getSchemaName(context, mdField);
String element = mdField.getElement(); String element = mdField.getElement();
@@ -252,8 +247,7 @@ public class MetadataExporter
String scopeNote = mdField.getScopeNote(); String scopeNote = mdField.getScopeNote();
// We must have a schema and element // We must have a schema and element
if (schemaName == null || element == null) if (schemaName == null || element == null) {
{
throw new RegistryExportException("incomplete field information"); throw new RegistryExportException("incomplete field information");
} }
@@ -271,73 +265,64 @@ public class MetadataExporter
xmlSerializer.endElement("element"); xmlSerializer.endElement("element");
// Output the qualifier, if present // Output the qualifier, if present
if (qualifier != null) if (qualifier != null) {
{
xmlSerializer.startElement("qualifier", null); xmlSerializer.startElement("qualifier", null);
xmlSerializer.characters(qualifier.toCharArray(), 0, qualifier.length()); xmlSerializer.characters(qualifier.toCharArray(), 0, qualifier.length());
xmlSerializer.endElement("qualifier"); xmlSerializer.endElement("qualifier");
} } else {
else
{
xmlSerializer.comment("unqualified"); xmlSerializer.comment("unqualified");
} }
// Output the scope note, if present // Output the scope note, if present
if (scopeNote != null) if (scopeNote != null) {
{
xmlSerializer.startElement("scope_note", null); xmlSerializer.startElement("scope_note", null);
            xmlSerializer.characters(scopeNote.toCharArray(), 0, scopeNote.length());
            xmlSerializer.endElement("scope_note");
        } else {
            xmlSerializer.comment("no scope note");
        }
        xmlSerializer.endElement("dc-type");
    }

    static Map<Integer, String> schemaMap = new HashMap<Integer, String>();

    /**
     * Helper method to retrieve a schema name for the field.
     * Caches the name after looking up the id.
     *
     * @param context DSpace Context
     * @param mdField DSpace metadata field
     * @return name of schema
     * @throws SQLException            if database error
     * @throws RegistryExportException if export error
     */
    private static String getSchemaName(Context context, MetadataField mdField)
        throws SQLException, RegistryExportException {
        // Get name from cache
        String name = schemaMap.get(mdField.getMetadataSchema().getID());

        if (name == null) {
            // Name not retrieved before, so get the schema now
            MetadataSchema mdSchema = metadataSchemaService.find(context, mdField.getMetadataSchema().getID());
            if (mdSchema != null) {
                name = mdSchema.getName();
                schemaMap.put(mdSchema.getID(), name);
            } else {
                // Can't find the schema
                throw new RegistryExportException("Can't get schema name for field");
            }
        }
        return name;
    }

    /**
     * Print the usage message to stdout
     */
    public static void usage() {
        String usage = "Use this class with the following options:\n" +
            " -f <xml output file> : specify the output file for the schemas\n" +
            " -s <schema> : name of the schema to export\n";
        System.out.println(usage);
    }
}
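The `getSchemaName()` helper above is a cache-on-miss lookup: consult the map first, fall back to the service, and remember the answer. A minimal, self-contained sketch of that pattern (the class name and the pluggable `lookup` function are hypothetical stand-ins for `metadataSchemaService.find()`):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.IntFunction;

public class SchemaNameCache {
    // Maps schema id -> schema name, filled on first lookup
    private final Map<Integer, String> cache = new HashMap<>();

    public String getName(int id, IntFunction<String> lookup) {
        String name = cache.get(id);
        if (name == null) {
            // Not seen before: resolve it and remember the result
            name = lookup.apply(id);
            if (name == null) {
                // Mirrors the RegistryExportException thrown in the real code
                throw new IllegalArgumentException("Can't get schema name for field");
            }
            cache.put(id, name);
        }
        return name;
    }
}
```

Note that `schemaMap` in the original is a static field shared across calls, so the cache lives for the length of the export run.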
@@ -9,7 +9,6 @@ package org.dspace.administer;
import java.io.IOException;
import java.sql.SQLException;

import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;
@@ -18,9 +17,7 @@ import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.xpath.XPathAPI;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
@@ -31,78 +28,81 @@ import org.dspace.content.service.MetadataSchemaService;
import org.dspace.core.Context;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;

/**
 * @author Richard Jones
 *
 * This class takes an xml document as passed in the arguments and
 * uses it to create metadata elements in the Metadata Registry if
 * they do not already exist
 *
 * The format of the XML file is as follows:
 *
 * {@code
 * <dspace-dc-types>
 * <dc-type>
 * <schema>icadmin</schema>
 * <element>status</element>
 * <qualifier>dateset</qualifier>
 * <scope_note>the workflow status of an item</scope_note>
 * </dc-type>
 *
 * [....]
 *
 * </dspace-dc-types>
 * }
 */
public class MetadataImporter {
    protected static MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance()
        .getMetadataSchemaService();
    protected static MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance()
        .getMetadataFieldService();

    /**
     * logging category
     */
    private static final Logger log = LoggerFactory.getLogger(MetadataImporter.class);

    /**
     * Default constructor
     */
    private MetadataImporter() { }

    /**
     * main method for reading user input from the command line
     *
     * @param args the command line arguments given
     * @throws ParseException               if parse error
     * @throws SQLException                 if database error
     * @throws IOException                  if IO error
     * @throws TransformerException         if transformer error
     * @throws ParserConfigurationException if config error
     * @throws AuthorizeException           if authorization error
     * @throws SAXException                 if parser error
     * @throws NonUniqueMetadataException   if duplicate metadata
     * @throws RegistryImportException      if import fails
     **/
    public static void main(String[] args)
        throws ParseException, SQLException, IOException, TransformerException,
        ParserConfigurationException, AuthorizeException, SAXException,
        NonUniqueMetadataException, RegistryImportException {
        boolean forceUpdate = false;

        // create an options object and populate it
        CommandLineParser parser = new PosixParser();
        Options options = new Options();
        options.addOption("f", "file", true, "source xml file for DC fields");
        options.addOption("u", "update", false, "update an existing schema");
        CommandLine line = parser.parse(options, args);

        String file = null;
        if (line.hasOption('f')) {
            file = line.getOptionValue('f');
        } else {
            usage();
            System.exit(0);
        }
@@ -110,29 +110,27 @@ public class MetadataImporter
        forceUpdate = line.hasOption('u');
        loadRegistry(file, forceUpdate);
    }

    /**
     * Load the data from the specified file path into the database
     *
     * @param file        the file path containing the source data
     * @param forceUpdate whether to force update
     * @throws SQLException                 if database error
     * @throws IOException                  if IO error
     * @throws TransformerException         if transformer error
     * @throws ParserConfigurationException if config error
     * @throws AuthorizeException           if authorization error
     * @throws SAXException                 if parser error
     * @throws NonUniqueMetadataException   if duplicate metadata
     * @throws RegistryImportException      if import fails
     */
    public static void loadRegistry(String file, boolean forceUpdate)
        throws SQLException, IOException, TransformerException, ParserConfigurationException,
        AuthorizeException, SAXException, NonUniqueMetadataException, RegistryImportException {
        Context context = null;

        try {
            // create a context
            context = new Context();
            context.turnOffAuthorisationSystem();
@@ -144,8 +142,7 @@ public class MetadataImporter
            NodeList schemaNodes = XPathAPI.selectNodeList(document, "/dspace-dc-types/dc-schema");

            // Add each one as a new format to the registry
            for (int i = 0; i < schemaNodes.getLength(); i++) {
                Node n = schemaNodes.item(i);
                loadSchema(context, n, forceUpdate);
            }
@@ -154,110 +151,95 @@ public class MetadataImporter
            NodeList typeNodes = XPathAPI.selectNodeList(document, "/dspace-dc-types/dc-type");

            // Add each one as a new format to the registry
            for (int i = 0; i < typeNodes.getLength(); i++) {
                Node n = typeNodes.item(i);
                loadType(context, n);
            }
            context.restoreAuthSystemState();
            context.complete();
        } finally {
            // Clean up our context, if it still exists & it was never completed
            if (context != null && context.isValid()) {
                context.abort();
            }
        }
    }

    /**
     * Process a node in the metadata registry XML file. If the
     * schema already exists, it will not be recreated
     *
     * @param context        DSpace context object
     * @param node           the node in the DOM tree
     * @param updateExisting whether to update the namespace of an existing schema
     * @throws SQLException               if database error
     * @throws IOException                if IO error
     * @throws TransformerException       if transformer error
     * @throws AuthorizeException         if authorization error
     * @throws NonUniqueMetadataException if duplicate metadata
     * @throws RegistryImportException    if import fails
     */
    private static void loadSchema(Context context, Node node, boolean updateExisting)
        throws SQLException, IOException, TransformerException,
        AuthorizeException, NonUniqueMetadataException, RegistryImportException {
        // Get the values
        String name = RegistryImporter.getElementData(node, "name");
        String namespace = RegistryImporter.getElementData(node, "namespace");

        if (name == null || "".equals(name)) {
            throw new RegistryImportException("Name of schema must be supplied");
        }

        if (namespace == null || "".equals(namespace)) {
            throw new RegistryImportException("Namespace of schema must be supplied");
        }

        // check to see if the schema already exists
        MetadataSchema s = metadataSchemaService.find(context, name);

        if (s == null) {
            // Schema does not exist - create
            log.info("Registering Schema " + name + " (" + namespace + ")");
            metadataSchemaService.create(context, name, namespace);
        } else {
            // Schema exists - if it's the same namespace, allow the type imports to continue
            if (s.getNamespace().equals(namespace)) {
                // This schema already exists with this namespace, skipping it
                return;
            }

            // It's a different namespace - have we been told to update?
            if (updateExisting) {
                // Update the existing schema namespace and continue to type import
                log.info("Updating Schema " + name + ": New namespace " + namespace);
                s.setNamespace(namespace);
                metadataSchemaService.update(context, s);
            } else {
                throw new RegistryImportException(
                    "Schema " + name + " already registered with different namespace " + namespace + ". Rerun with " +
                        "'update' option enabled if you wish to update this schema.");
            }
        }
    }

    /**
     * Process a node in the metadata registry XML file. The node must
     * be a "dc-type" node. If the type already exists, then it
     * will not be reimported
     *
     * @param context DSpace context object
     * @param node    the node in the DOM tree
     * @throws SQLException               if database error
     * @throws IOException                if IO error
     * @throws TransformerException       if transformer error
     * @throws AuthorizeException         if authorization error
     * @throws NonUniqueMetadataException if duplicate metadata
     * @throws RegistryImportException    if import fails
     */
    private static void loadType(Context context, Node node)
        throws SQLException, IOException, TransformerException,
        AuthorizeException, NonUniqueMetadataException, RegistryImportException {
        // Get the values
        String schema = RegistryImporter.getElementData(node, "schema");
        String element = RegistryImporter.getElementData(node, "element");
@@ -265,44 +247,41 @@ public class MetadataImporter
        String scopeNote = RegistryImporter.getElementData(node, "scope_note");

        // If the schema is not provided default to DC
        if (schema == null) {
            schema = MetadataSchema.DC_SCHEMA;
        }

        // Find the matching schema object
        MetadataSchema schemaObj = metadataSchemaService.find(context, schema);

        if (schemaObj == null) {
            throw new RegistryImportException("Schema '" + schema + "' is not registered and does not exist.");
        }

        MetadataField mf = metadataFieldService.findByElement(context, schemaObj, element, qualifier);
        if (mf != null) {
            // Metadata field already exists, skipping it
            return;
        }

        // Actually create this metadata field as it doesn't yet exist
        String fieldName = schema + "." + element + "." + qualifier;
        if (qualifier == null) {
            fieldName = schema + "." + element;
        }
        log.info("Registering metadata field " + fieldName);
        MetadataField field = metadataFieldService.create(context, schemaObj, element, qualifier, scopeNote);
        metadataFieldService.update(context, field);
    }

    /**
     * Print the usage message to stdout
     */
    public static void usage() {
        String usage = "Use this class with the following option:\n" +
            " -f <xml source file> : specify which xml source file " +
            "contains the DC fields to import.\n";
        System.out.println(usage);
    }
}
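`loadRegistry()` above walks every `/dspace-dc-types/dc-type` node and pulls child element text out of each one, defaulting the schema to `dc` when none is given. A minimal, self-contained sketch of that traversal (using the JDK's built-in `javax.xml.xpath` rather than the Xalan `XPathAPI` helper the DSpace code uses; the class name and XML snippet are illustrative only):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class DcTypesSketch {
    public static void main(String[] args) throws Exception {
        String xml = "<dspace-dc-types>"
            + "<dc-type><schema>icadmin</schema><element>status</element>"
            + "<qualifier>dateset</qualifier></dc-type>"
            + "<dc-type><element>title</element></dc-type>"
            + "</dspace-dc-types>";
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Select every dc-type node, as loadRegistry() does
        NodeList types = (NodeList) xpath.evaluate("/dspace-dc-types/dc-type",
            doc, XPathConstants.NODESET);
        for (int i = 0; i < types.getLength(); i++) {
            Node type = types.item(i);
            // Missing children come back null, mirroring getElementData()
            String schema = text(xpath, type, "schema");
            String element = text(xpath, type, "element");
            // Default to the DC schema when none is supplied, as loadType() does
            System.out.println((schema == null ? "dc" : schema) + "." + element);
        }
    }

    // Relative lookup of a single child's text, null if the child is absent
    static String text(XPath xpath, Node parent, String child) throws Exception {
        Node n = (Node) xpath.evaluate(child, parent, XPathConstants.NODE);
        return (n == null) ? null : n.getTextContent();
    }
}
```

The same null-or-text contract is what lets `loadType()` treat an absent `<qualifier>` as an unqualified field.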

@@ -12,45 +12,40 @@ package org.dspace.administer;
 *
 * An exception to report any problems with registry exports
 */
public class RegistryExportException extends Exception {
    /**
     * Create an empty registry export exception
     */
    public RegistryExportException() {
        super();
    }

    /**
     * Create an exception with only a message
     *
     * @param message exception message
     */
    public RegistryExportException(String message) {
        super(message);
    }

    /**
     * Create an exception with an inner exception and a message
     *
     * @param message exception message
     * @param e       reference to Throwable
     */
    public RegistryExportException(String message, Throwable e) {
        super(message, e);
    }

    /**
     * Create an exception with an inner exception
     *
     * @param e reference to Throwable
     */
    public RegistryExportException(Throwable e) {
        super(e);
    }
}

@@ -12,45 +12,40 @@ package org.dspace.administer;
 *
 * An exception to report any problems with registry imports
 */
public class RegistryImportException extends Exception {
    /**
     * Create an empty registry import exception
     */
    public RegistryImportException() {
        super();
    }

    /**
     * Create an exception with only a message
     *
     * @param message error message
     */
    public RegistryImportException(String message) {
        super(message);
    }

    /**
     * Create an exception with an inner exception and a message
     *
     * @param message error message
     * @param e       throwable
     */
    public RegistryImportException(String message, Throwable e) {
        super(message, e);
    }

    /**
     * Create an exception with an inner exception
     *
     * @param e throwable
     */
    public RegistryImportException(Throwable e) {
        super(e);
    }
}

@@ -9,18 +9,15 @@ package org.dspace.administer;
import java.io.File;
import java.io.IOException;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;

import org.apache.xpath.XPathAPI;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;

/**
@@ -31,30 +28,32 @@ import org.xml.sax.SAXException;
 * I am the author, really I ripped these methods off from other
 * classes
 */
public class RegistryImporter {
    /**
     * Default constructor
     */
    private RegistryImporter() { }

    /**
     * Load in the XML from file.
     *
     * @param filename the filename to load from
     * @return the DOM representation of the XML file
     * @throws IOException                  if IO error
     * @throws ParserConfigurationException if configuration parse error
     * @throws SAXException                 if XML parse error
     */
    public static Document loadXML(String filename)
        throws IOException, ParserConfigurationException, SAXException {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance()
                                                        .newDocumentBuilder();

        Document document = builder.parse(new File(filename));

        return document;
    }

    /**
     * Get the CDATA of a particular element. For example, if the XML document
     * contains:
@@ -66,22 +65,18 @@ public class RegistryImporter
     * return <code>application/pdf</code>.
     * </P>
     * Why this isn't a core part of the XML API I do not know...
     *
     * @param parentElement the element, whose child element you want the CDATA from
     * @param childName     the name of the element you want the CDATA from
     * @return the CDATA as a <code>String</code>
     * @throws TransformerException if error
     */
    public static String getElementData(Node parentElement, String childName)
        throws TransformerException {
        // Grab the child node
        Node childNode = XPathAPI.selectSingleNode(parentElement, childName);

        if (childNode == null) {
            // No child node, so no values
            return null;
        }
@@ -89,8 +84,7 @@ public class RegistryImporter
        // Get the #text
        Node dataNode = childNode.getFirstChild();

        if (dataNode == null) {
            return null;
        }
@@ -106,32 +100,28 @@ public class RegistryImporter
     * <P>
     * <code>
     * &lt;foo&gt;
     * &lt;bar&gt;val1&lt;/bar&gt;
     * &lt;bar&gt;val2&lt;/bar&gt;
     * &lt;/foo&gt;
     * </code>
     * passing this the <code>foo</code> node and <code>bar</code> will
     * return <code>val1</code> and <code>val2</code>.
     * </P>
     * Why this also isn't a core part of the XML API I do not know...
     *
     * @param parentElement the element, whose child element you want the CDATA from
     * @param childName     the name of the element you want the CDATA from
     * @return the CDATA as a <code>String</code>
     * @throws TransformerException if error
     */
    public static String[] getRepeatedElementData(Node parentElement,
                                                  String childName) throws TransformerException {
        // Grab the child node
        NodeList childNodes = XPathAPI.selectNodeList(parentElement, childName);

        String[] data = new String[childNodes.getLength()];

        for (int i = 0; i < childNodes.getLength(); i++) {
            // Get the #text node
            Node dataNode = childNodes.item(i).getFirstChild();

@@ -12,7 +12,6 @@ import java.io.IOException;
import java.sql.SQLException; import java.sql.SQLException;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Arrays; import java.util.Arrays;
import javax.xml.parsers.DocumentBuilder; import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory; import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException; import javax.xml.parsers.ParserConfigurationException;
@@ -40,33 +39,37 @@ import org.xml.sax.SAXException;
* <code>RegistryLoader -bitstream bitstream-formats.xml</code> * <code>RegistryLoader -bitstream bitstream-formats.xml</code>
* <P> * <P>
* <code>RegistryLoader -dc dc-types.xml</code> * <code>RegistryLoader -dc dc-types.xml</code>
* *
* @author Robert Tansley * @author Robert Tansley
* @version $Revision$ * @version $Revision$
*/ */
public class RegistryLoader public class RegistryLoader {
{ /**
/** log4j category */ * log4j category
*/
private static Logger log = Logger.getLogger(RegistryLoader.class); private static Logger log = Logger.getLogger(RegistryLoader.class);
protected static BitstreamFormatService bitstreamFormatService = ContentServiceFactory.getInstance().getBitstreamFormatService(); protected static BitstreamFormatService bitstreamFormatService = ContentServiceFactory.getInstance()
.getBitstreamFormatService();
/**
* Default constructor
*/
private RegistryLoader() { }
/** /**
* For invoking via the command line * For invoking via the command line
* *
* @param argv * @param argv the command line arguments given
* command-line arguments
* @throws Exception if error * @throws Exception if error
*/ */
public static void main(String[] argv) throws Exception public static void main(String[] argv) throws Exception {
{
String usage = "Usage: " + RegistryLoader.class.getName() String usage = "Usage: " + RegistryLoader.class.getName()
+ " (-bitstream | -metadata) registry-file.xml"; + " (-bitstream | -metadata) registry-file.xml";
Context context = null; Context context = null;
try try {
{
context = new Context(); context = new Context();
// Can't update registries anonymously, so we need to turn off // Can't update registries anonymously, so we need to turn off
@@ -74,17 +77,12 @@ public class RegistryLoader
context.turnOffAuthorisationSystem(); context.turnOffAuthorisationSystem();
// Work out what we're loading // Work out what we're loading
if (argv[0].equalsIgnoreCase("-bitstream")) if (argv[0].equalsIgnoreCase("-bitstream")) {
{
RegistryLoader.loadBitstreamFormats(context, argv[1]); RegistryLoader.loadBitstreamFormats(context, argv[1]);
} } else if (argv[0].equalsIgnoreCase("-metadata")) {
else if (argv[0].equalsIgnoreCase("-metadata"))
{
// Call MetadataImporter, as it handles Metadata schema updates // Call MetadataImporter, as it handles Metadata schema updates
MetadataImporter.loadRegistry(argv[1], true); MetadataImporter.loadRegistry(argv[1], true);
} } else {
else
{
System.err.println(usage); System.err.println(usage);
} }
@@ -92,81 +90,69 @@ public class RegistryLoader
context.complete(); context.complete();
System.exit(0); System.exit(0);
} } catch (ArrayIndexOutOfBoundsException ae) {
catch (ArrayIndexOutOfBoundsException ae)
{
System.err.println(usage); System.err.println(usage);
System.exit(1); System.exit(1);
} } catch (Exception e) {
catch (Exception e)
{
log.fatal(LogManager.getHeader(context, "error_loading_registries", log.fatal(LogManager.getHeader(context, "error_loading_registries",
""), e); ""), e);
System.err.println("Error: \n - " + e.getMessage()); System.err.println("Error: \n - " + e.getMessage());
System.exit(1); System.exit(1);
} } finally {
finally
{
// Clean up our context, if it still exists & it was never completed // Clean up our context, if it still exists & it was never completed
if(context!=null && context.isValid()) if (context != null && context.isValid()) {
context.abort(); context.abort();
}
} }
} }
    /**
     * Load Bitstream Format metadata
     *
     * @param context DSpace context object
     * @param filename the filename of the XML file to load
     * @throws SQLException if database error
     * @throws IOException if IO error
     * @throws TransformerException if transformer error
     * @throws ParserConfigurationException if config error
     * @throws AuthorizeException if authorization error
     * @throws SAXException if parser error
     */
    public static void loadBitstreamFormats(Context context, String filename)
        throws SQLException, IOException, ParserConfigurationException,
        SAXException, TransformerException, AuthorizeException {
        Document document = loadXML(filename);

        // Get the nodes corresponding to formats
        NodeList typeNodes = XPathAPI.selectNodeList(document,
                "dspace-bitstream-types/bitstream-type");

        // Add each one as a new format to the registry
        for (int i = 0; i < typeNodes.getLength(); i++) {
            Node n = typeNodes.item(i);
            loadFormat(context, n);
        }

        log.info(LogManager.getHeader(context, "load_bitstream_formats",
                "number_loaded=" + typeNodes.getLength()));
    }

    /**
     * Process a node in the bitstream format registry XML file. The node must
     * be a "bitstream-type" node
     *
     * @param context DSpace context object
     * @param node the node in the DOM tree
     * @throws SQLException if database error
     * @throws IOException if IO error
     * @throws TransformerException if transformer error
     * @throws AuthorizeException if authorization error
     */
    private static void loadFormat(Context context, Node node)
        throws SQLException, IOException, TransformerException,
        AuthorizeException {
        // Get the values
        String mimeType = getElementData(node, "mimetype");
        String shortDesc = getElementData(node, "short_description");
@@ -182,16 +168,14 @@ public class RegistryLoader
        // Check if this format already exists in our registry (by mime type)
        BitstreamFormat exists = bitstreamFormatService.findByMIMEType(context, mimeType);

        // If not found by mimeType, check by short description (since this must also be unique)
        if (exists == null) {
            exists = bitstreamFormatService.findByShortDescription(context, shortDesc);
        }

        // If it doesn't exist, create it..otherwise skip it.
        if (exists == null) {
            // Create the format object
            BitstreamFormat format = bitstreamFormatService.create(context);
@@ -214,19 +198,17 @@ public class RegistryLoader
    /**
     * Load in the XML from file.
     *
     * @param filename the filename to load from
     * @return the DOM representation of the XML file
     * @throws IOException if IO error
     * @throws ParserConfigurationException if config error
     * @throws SAXException if parser error
     */
    private static Document loadXML(String filename) throws IOException,
        ParserConfigurationException, SAXException {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder();

        return builder.parse(new File(filename));
    }
@@ -242,22 +224,18 @@ public class RegistryLoader
     * return <code>application/pdf</code>.
     * </P>
     * Why this isn't a core part of the XML API I do not know...
     *
     * @param parentElement the element, whose child element you want the CDATA from
     * @param childName the name of the element you want the CDATA from
     * @return the CDATA as a <code>String</code>
     * @throws TransformerException if transformer error
     */
    private static String getElementData(Node parentElement, String childName)
        throws TransformerException {
        // Grab the child node
        Node childNode = XPathAPI.selectSingleNode(parentElement, childName);

        if (childNode == null) {
            // No child node, so no values
            return null;
        }
@@ -265,8 +243,7 @@ public class RegistryLoader
        // Get the #text
        Node dataNode = childNode.getFirstChild();

        if (dataNode == null) {
            return null;
        }
@@ -282,32 +259,28 @@ public class RegistryLoader
     * <P>
     * <code>
     * &lt;foo&gt;
     * &lt;bar&gt;val1&lt;/bar&gt;
     * &lt;bar&gt;val2&lt;/bar&gt;
     * &lt;/foo&gt;
     * </code>
     * passing this the <code>foo</code> node and <code>bar</code> will
     * return <code>val1</code> and <code>val2</code>.
     * </P>
     * Why this also isn't a core part of the XML API I do not know...
     *
     * @param parentElement the element, whose child element you want the CDATA from
     * @param childName the name of the element you want the CDATA from
     * @return the CDATA as a <code>String</code>
     * @throws TransformerException if transformer error
     */
    private static String[] getRepeatedElementData(Node parentElement,
            String childName) throws TransformerException {
        // Grab the child node
        NodeList childNodes = XPathAPI.selectNodeList(parentElement, childName);

        String[] data = new String[childNodes.getLength()];

        for (int i = 0; i < childNodes.getLength(); i++) {
            // Get the #text node
            Node dataNode = childNodes.item(i).getFirstChild();
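The CDATA helpers above rely on Xalan's `XPathAPI`. As a rough, self-contained sketch of the same behavior, the demo below uses the JDK's standard `javax.xml.xpath` package instead; the class and method names (`ElementDataDemo`, `elementData`, `repeatedElementData`) are illustrative stand-ins, not DSpace code.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class ElementDataDemo {
    // Mirrors getElementData: text of the first matching child, or null if absent.
    static String elementData(Node parent, String childName) throws Exception {
        Node child = (Node) XPathFactory.newInstance().newXPath()
                .evaluate(childName, parent, XPathConstants.NODE);
        if (child == null || child.getFirstChild() == null) {
            return null;
        }
        return child.getFirstChild().getNodeValue();
    }

    // Mirrors getRepeatedElementData: text of every matching child element.
    static String[] repeatedElementData(Node parent, String childName) throws Exception {
        NodeList children = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate(childName, parent, XPathConstants.NODESET);
        String[] data = new String[children.getLength()];
        for (int i = 0; i < children.getLength(); i++) {
            Node text = children.item(i).getFirstChild();
            data[i] = (text == null) ? null : text.getNodeValue();
        }
        return data;
    }

    public static void main(String[] args) throws Exception {
        // The <foo>/<bar> example from the Javadoc above.
        String xml = "<foo><bar>val1</bar><bar>val2</bar></foo>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        Node foo = doc.getDocumentElement();
        System.out.println(elementData(foo, "bar"));                           // val1
        System.out.println(String.join(",", repeatedElementData(foo, "bar"))); // val1,val2
    }
}
```

Passing the `foo` node and `bar` yields `val1` from the single-value helper and both `val1` and `val2` from the repeated-value helper, matching the documented behavior.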


@@ -14,7 +14,6 @@ import java.io.IOException;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
@@ -44,115 +43,119 @@ import org.xml.sax.SAXException;
/**
 * This class deals with importing community and collection structures from
 * an XML file.
 *
 * The XML file structure needs to be:
 * {@code
 * <import_structure>
 * <community>
 * <name>....</name>
 * <community>...</community>
 * <collection>
 * <name>....</name>
 * </collection>
 * </community>
 * </import_structure>
 * }
 * it can be arbitrarily deep, and supports all the metadata elements
 * that make up the community and collection metadata. See the system
 * documentation for more details.
 *
 * @author Richard Jones
 */
public class StructBuilder {
    /**
     * The output XML document which will contain updated information about the
     * imported structure.
     */
    private static org.jdom.Document xmlOutput = new org.jdom.Document(new Element("imported_structure"));

    /**
     * A hashtable to hold metadata for the collection being worked on.
     */
    private static Map<String, String> collectionMap = new HashMap<String, String>();

    /**
     * A hashtable to hold metadata for the community being worked on.
     */
    private static Map<String, String> communityMap = new HashMap<String, String>();

    protected static CommunityService communityService = ContentServiceFactory.getInstance().getCommunityService();
    protected static CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService();
    protected static EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService();

    /**
     * Default constructor
     */
    private StructBuilder() { }
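As a rough illustration of the `import_structure` format described in the class Javadoc above, this standalone sketch parses a minimal document and counts the top-level `<community>` nodes the way the importer's preliminary validation does. `StructureCheckDemo` and its helper are hypothetical names, and it uses the JDK's `javax.xml.xpath` rather than Xalan's `XPathAPI`.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class StructureCheckDemo {
    // Selects /import_structure/community, the same XPath validate() starts from.
    static int topLevelCommunities(Document doc) throws Exception {
        NodeList first = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate("/import_structure/community", doc, XPathConstants.NODESET);
        return first.getLength();
    }

    public static void main(String[] args) throws Exception {
        // A minimal instance of the documented structure: one community
        // holding one collection, each with exactly one <name>.
        String xml = "<import_structure>"
                + "<community><name>Top</name>"
                + "<collection><name>Papers</name></collection>"
                + "</community>"
                + "</import_structure>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println(topLevelCommunities(doc)); // 1
    }
}
```

A document with zero top-level communities would fail the importer's validation outright, so this count is the first thing worth checking against a source file.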
    /**
     * Main method to be run from the command line to import a structure into
     * DSpace.
     *
     * This is of the form:
     *
     * {@code StructBuilder -f [xml source] -e [administrator email] -o [output file]}
     *
     * The output file will contain exactly the same as the source XML document, but
     * with the handle for each imported item added as an attribute.
     *
     * @param argv the command line arguments given
     * @throws Exception if an error occurs
     */
    public static void main(String[] argv)
        throws Exception {
        CommandLineParser parser = new PosixParser();

        Options options = new Options();

        options.addOption("f", "file", true, "file");
        options.addOption("e", "eperson", true, "eperson");
        options.addOption("o", "output", true, "output");

        CommandLine line = parser.parse(options, argv);

        String file = null;
        String eperson = null;
        String output = null;

        if (line.hasOption('f')) {
            file = line.getOptionValue('f');
        }
        if (line.hasOption('e')) {
            eperson = line.getOptionValue('e');
        }
        if (line.hasOption('o')) {
            output = line.getOptionValue('o');
        }

        if (output == null || eperson == null || file == null) {
            usage();
            System.exit(0);
        }

        // create a context
        Context context = new Context();

        // set the context
        context.setCurrentUser(ePersonService.findByEmail(context, eperson));

        // load the XML
        Document document = loadXML(file);

        // run the preliminary validation, to be sure that the XML document
        // is properly structured
        validate(document);

        // load the mappings into the member variable hashmaps
        communityMap.put("name", "name");
        communityMap.put("description", "short_description");
        communityMap.put("intro", "introductory_text");
        communityMap.put("copyright", "copyright_text");
        communityMap.put("sidebar", "side_bar_text");

        collectionMap.put("name", "name");
        collectionMap.put("description", "short_description");
        collectionMap.put("intro", "introductory_text");
@@ -160,267 +163,232 @@ public class StructBuilder
        collectionMap.put("sidebar", "side_bar_text");
        collectionMap.put("license", "license");
        collectionMap.put("provenance", "provenance_description");

        // get the top level community list
        NodeList first = XPathAPI.selectNodeList(document, "/import_structure/community");

        // run the import starting with the top level communities
        Element[] elements = handleCommunities(context, first, null);

        // generate the output
        Element root = xmlOutput.getRootElement();
        for (int i = 0; i < elements.length; i++) {
            root.addContent(elements[i]);
        }

        // finally write the string into the output file
        try {
            BufferedWriter out = new BufferedWriter(new FileWriter(output));
            out.write(new XMLOutputter().outputString(xmlOutput));
            out.close();
        } catch (IOException e) {
            System.out.println("Unable to write to output file " + output);
            System.exit(0);
        }

        context.complete();
    }

    /**
     * Output the usage information.
     */
    private static void usage() {
        System.out.println("Usage: java StructBuilder -f <source XML file> -o <output file> -e <eperson email>");
        System.out.println(
            "Communities will be created from the top level, and a map of communities to handles will be returned in " +
            "the output file");
        return;
    }

    /**
     * Validate the XML document. This method does not return, but if validation
     * fails it generates an error and ceases execution.
     *
     * @param document the XML document object
     * @throws TransformerException if transformer error
     */
    private static void validate(org.w3c.dom.Document document)
        throws TransformerException {
        StringBuffer err = new StringBuffer();
        boolean trip = false;

        err.append("The following errors were encountered parsing the source XML\n");
        err.append("No changes have been made to the DSpace instance\n\n");

        NodeList first = XPathAPI.selectNodeList(document, "/import_structure/community");
        if (first.getLength() == 0) {
            err.append("-There are no top level communities in the source document");
            System.out.println(err.toString());
            System.exit(0);
        }

        String errs = validateCommunities(first, 1);
        if (errs != null) {
            err.append(errs);
            trip = true;
        }

        if (trip) {
            System.out.println(err.toString());
            System.exit(0);
        }
    }

    /**
     * Validate the communities section of the XML document. This returns a string
     * containing any errors encountered, or null if there were no errors.
     *
     * @param communities the NodeList of communities to validate
     * @param level the level in the XML document that we are at, for the purposes
     *              of error reporting
     * @return the errors that need to be generated by the calling method, or null if
     *         no errors.
     */
    private static String validateCommunities(NodeList communities, int level)
        throws TransformerException {
        StringBuffer err = new StringBuffer();
        boolean trip = false;
        String errs = null;

        for (int i = 0; i < communities.getLength(); i++) {
            Node n = communities.item(i);
            NodeList name = XPathAPI.selectNodeList(n, "name");
            if (name.getLength() != 1) {
                String pos = Integer.toString(i + 1);
                err.append("-The level " + level + " community in position " + pos);
                err.append(" does not contain exactly one name field\n");
                trip = true;
            }

            // validate sub communities
            NodeList subCommunities = XPathAPI.selectNodeList(n, "community");
            String comErrs = validateCommunities(subCommunities, level + 1);
            if (comErrs != null) {
                err.append(comErrs);
                trip = true;
            }

            // validate collections
            NodeList collections = XPathAPI.selectNodeList(n, "collection");
            String colErrs = validateCollections(collections, level + 1);
            if (colErrs != null) {
                err.append(colErrs);
                trip = true;
            }
        }

        if (trip) {
            errs = err.toString();
        }

        return errs;
    }

    /**
     * Validate the collection section of the XML document. This generates a
     * string containing any errors encountered, or returns null if no errors.
     *
     * @param collections a NodeList of collections to validate
     * @param level the level in the XML document for the purposes of error reporting
     * @return the errors to be generated by the calling method, or null if none
     */
    private static String validateCollections(NodeList collections, int level)
        throws TransformerException {
        StringBuffer err = new StringBuffer();
        boolean trip = false;
        String errs = null;

        for (int i = 0; i < collections.getLength(); i++) {
            Node n = collections.item(i);
            NodeList name = XPathAPI.selectNodeList(n, "name");
            if (name.getLength() != 1) {
                String pos = Integer.toString(i + 1);
                err.append("-The level " + level + " collection in position " + pos);
                err.append(" does not contain exactly one name field\n");
                trip = true;
            }
        }

        if (trip) {
            errs = err.toString();
        }

        return errs;
    }

    /**
     * Load in the XML from file.
     *
     * @param filename the filename to load from
     * @return the DOM representation of the XML file
     */
    private static org.w3c.dom.Document loadXML(String filename)
        throws IOException, ParserConfigurationException, SAXException {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder();

        org.w3c.dom.Document document = builder.parse(new File(filename));

        return document;
    }

    /**
     * Return the String value of a Node.
     *
     * @param node the node from which we want to extract the string value
     * @return the string value of the node
     */
    public static String getStringValue(Node node) {
        String value = node.getNodeValue();

        if (node.hasChildNodes()) {
            Node first = node.getFirstChild();

            if (first.getNodeType() == Node.TEXT_NODE) {
                return first.getNodeValue().trim();
            }
        }

        return value;
    }
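The `getStringValue` helper above prefers the trimmed text of a node's first child when that child is a TEXT node, and falls back to the raw node value otherwise. A minimal standalone sketch of the same logic (`StringValueDemo` is an illustrative name, not DSpace code):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;

public class StringValueDemo {
    // Same logic as StructBuilder.getStringValue: prefer the trimmed text of
    // the first child when it is a TEXT_NODE, otherwise the raw node value.
    static String stringValue(Node node) {
        String value = node.getNodeValue();
        if (node.hasChildNodes()) {
            Node first = node.getFirstChild();
            if (first.getNodeType() == Node.TEXT_NODE) {
                return first.getNodeValue().trim();
            }
        }
        return value;
    }

    public static void main(String[] args) throws Exception {
        // For an element node, getNodeValue() is null, so the text child wins.
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        "<name>  My Community  </name>".getBytes(StandardCharsets.UTF_8)));
        System.out.println(stringValue(doc.getDocumentElement())); // My Community
    }
}
```

The trim matters in practice: whitespace-padded `<name>` elements in the source XML would otherwise be copied verbatim into community and collection metadata.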
    /**
     * Take a node list of communities and build the structure from them, delegating
     * to the relevant methods in this class for sub-communities and collections.
     *
     * @param context the context of the request
     * @param communities a nodelist of communities to create along with their sub-structures
     * @param parent the parent community of the nodelist of communities to create
     * @return an element array containing additional information regarding the
     *         created communities (e.g. the handles they have been assigned)
     */
    private static Element[] handleCommunities(Context context, NodeList communities, Community parent)
        throws TransformerException, SQLException, Exception {
        Element[] elements = new Element[communities.getLength()];

        for (int i = 0; i < communities.getLength(); i++) {
            Community community;
            Element element = new Element("community");

            // create the community or sub community
            if (parent != null) {
                community = communityService.create(parent, context);
            } else {
                community = communityService.create(null, context);
            }

            // default the short description to be an empty string
            communityService.setMetadata(context, community, "short_description", " ");

            // now update the metadata
            Node tn = communities.item(i);
            for (Map.Entry<String, String> entry : communityMap.entrySet()) {
                NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey());
                if (nl.getLength() == 1) {
                    communityService.setMetadata(context, community, entry.getValue(), getStringValue(nl.item(0)));
                }
            }

            // FIXME: at the moment, if the community already exists by name
            // then this will throw a PSQLException on a duplicate key
            // violation
@@ -431,7 +399,7 @@ public class StructBuilder
            // to isolate the community that already exists without hitting
            // the database directly.
            communityService.update(context, community);

            // build the element with the handle that identifies the new
            // community
            // along with all the information that we imported here
@@ -441,151 +409,134 @@ public class StructBuilder
            // case
            // we want to move it or make it switchable later
            element.setAttribute("identifier", community.getHandle());

            Element nameElement = new Element("name");
            nameElement.setText(communityService.getMetadata(community, "name"));
            element.addContent(nameElement);

            if (communityService.getMetadata(community, "short_description") != null) {
                Element descriptionElement = new Element("description");
                descriptionElement.setText(communityService.getMetadata(community, "short_description"));
                element.addContent(descriptionElement);
            }

            if (communityService.getMetadata(community, "introductory_text") != null) {
                Element introElement = new Element("intro");
                introElement.setText(communityService.getMetadata(community, "introductory_text"));
                element.addContent(introElement);
            }

            if (communityService.getMetadata(community, "copyright_text") != null) {
                Element copyrightElement = new Element("copyright");
                copyrightElement.setText(communityService.getMetadata(community, "copyright_text"));
                element.addContent(copyrightElement);
            }

            if (communityService.getMetadata(community, "side_bar_text") != null) {
                Element sidebarElement = new Element("sidebar");
                sidebarElement.setText(communityService.getMetadata(community, "side_bar_text"));
                element.addContent(sidebarElement);
            }

            // handle sub communities
            NodeList subCommunities = XPathAPI.selectNodeList(tn, "community");
            Element[] subCommunityElements = handleCommunities(context, subCommunities, community);

            // handle collections
            NodeList collections = XPathAPI.selectNodeList(tn, "collection");
            Element[] collectionElements = handleCollections(context, collections, community);

            int j;
            for (j = 0; j < subCommunityElements.length; j++) {
                element.addContent(subCommunityElements[j]);
            }
            for (j = 0; j < collectionElements.length; j++) {
                element.addContent(collectionElements[j]);
            }

            elements[i] = element;
        }

        return elements;
    }

    /**
     * Take a node list of collections and create the structure from them.
     *
     * @param context the context of the request
     * @param collections the node list of collections to be created
     * @param parent the parent community to whom the collections belong
*
* @return an Element array containing additional information about the * @return an Element array containing additional information about the
* created collections (e.g. the handle) * created collections (e.g. the handle)
*/ */
private static Element[] handleCollections(Context context, NodeList collections, Community parent) private static Element[] handleCollections(Context context, NodeList collections, Community parent)
throws TransformerException, SQLException, AuthorizeException, IOException, Exception throws TransformerException, SQLException, AuthorizeException, IOException, Exception {
{
Element[] elements = new Element[collections.getLength()]; Element[] elements = new Element[collections.getLength()];
for (int i = 0; i < collections.getLength(); i++) for (int i = 0; i < collections.getLength(); i++) {
{
Element element = new Element("collection"); Element element = new Element("collection");
Collection collection = collectionService.create(context, parent); Collection collection = collectionService.create(context, parent);
// default the short description to the empty string // default the short description to the empty string
collectionService.setMetadata(context, collection, "short_description", " "); collectionService.setMetadata(context, collection, "short_description", " ");
// import the rest of the metadata // import the rest of the metadata
Node tn = collections.item(i); Node tn = collections.item(i);
for (Map.Entry<String, String> entry : collectionMap.entrySet()) for (Map.Entry<String, String> entry : collectionMap.entrySet()) {
{
NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey()); NodeList nl = XPathAPI.selectNodeList(tn, entry.getKey());
if (nl.getLength() == 1) if (nl.getLength() == 1) {
{
collectionService.setMetadata(context, collection, entry.getValue(), getStringValue(nl.item(0))); collectionService.setMetadata(context, collection, entry.getValue(), getStringValue(nl.item(0)));
} }
} }
collectionService.update(context, collection); collectionService.update(context, collection);
element.setAttribute("identifier", collection.getHandle()); element.setAttribute("identifier", collection.getHandle());
Element nameElement = new Element("name"); Element nameElement = new Element("name");
nameElement.setText(collectionService.getMetadata(collection, "name")); nameElement.setText(collectionService.getMetadata(collection, "name"));
element.addContent(nameElement); element.addContent(nameElement);
if (collectionService.getMetadata(collection, "short_description") != null) if (collectionService.getMetadata(collection, "short_description") != null) {
{
Element descriptionElement = new Element("description"); Element descriptionElement = new Element("description");
descriptionElement.setText(collectionService.getMetadata(collection, "short_description")); descriptionElement.setText(collectionService.getMetadata(collection, "short_description"));
element.addContent(descriptionElement); element.addContent(descriptionElement);
} }
if (collectionService.getMetadata(collection, "introductory_text") != null) if (collectionService.getMetadata(collection, "introductory_text") != null) {
{
Element introElement = new Element("intro"); Element introElement = new Element("intro");
introElement.setText(collectionService.getMetadata(collection, "introductory_text")); introElement.setText(collectionService.getMetadata(collection, "introductory_text"));
element.addContent(introElement); element.addContent(introElement);
} }
if (collectionService.getMetadata(collection, "copyright_text") != null) if (collectionService.getMetadata(collection, "copyright_text") != null) {
{
Element copyrightElement = new Element("copyright"); Element copyrightElement = new Element("copyright");
copyrightElement.setText(collectionService.getMetadata(collection, "copyright_text")); copyrightElement.setText(collectionService.getMetadata(collection, "copyright_text"));
element.addContent(copyrightElement); element.addContent(copyrightElement);
} }
if (collectionService.getMetadata(collection, "side_bar_text") != null) if (collectionService.getMetadata(collection, "side_bar_text") != null) {
{
Element sidebarElement = new Element("sidebar"); Element sidebarElement = new Element("sidebar");
sidebarElement.setText(collectionService.getMetadata(collection, "side_bar_text")); sidebarElement.setText(collectionService.getMetadata(collection, "side_bar_text"));
element.addContent(sidebarElement); element.addContent(sidebarElement);
} }
if (collectionService.getMetadata(collection, "license") != null) if (collectionService.getMetadata(collection, "license") != null) {
{
Element sidebarElement = new Element("license"); Element sidebarElement = new Element("license");
sidebarElement.setText(collectionService.getMetadata(collection, "license")); sidebarElement.setText(collectionService.getMetadata(collection, "license"));
element.addContent(sidebarElement); element.addContent(sidebarElement);
} }
if (collectionService.getMetadata(collection, "provenance_description") != null) if (collectionService.getMetadata(collection, "provenance_description") != null) {
{
Element sidebarElement = new Element("provenance"); Element sidebarElement = new Element("provenance");
sidebarElement.setText(collectionService.getMetadata(collection, "provenance_description")); sidebarElement.setText(collectionService.getMetadata(collection, "provenance_description"));
element.addContent(sidebarElement); element.addContent(sidebarElement);
} }
elements[i] = element; elements[i] = element;
} }
return elements; return elements;
} }
} }
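The methods above select `community` and `collection` nodes from the structure XML with Apache's `XPathAPI` and build JDOM elements for the output map. The same node-selection pattern can be sketched with only the JDK's built-in XPath support — a standalone illustration, not the DSpace code itself (the `import_structure` root and element names are assumptions based on the structure-builder format shown here):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class StructureSketch {
    // Count the <collection> children of the first <community> node,
    // mirroring how handleCommunities/handleCollections walk the document.
    public static int countCollections(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new InputSource(new StringReader(xml)));
        NodeList communities = (NodeList) XPathFactory.newInstance().newXPath()
            .evaluate("/import_structure/community", doc, XPathConstants.NODESET);
        // Relative XPath, evaluated against the community node, just as the
        // importer evaluates "collection" against each community element.
        NodeList collections = (NodeList) XPathFactory.newInstance().newXPath()
            .evaluate("collection", communities.item(0), XPathConstants.NODESET);
        return collections.getLength();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<import_structure><community><name>Top</name>"
            + "<collection><name>A</name></collection>"
            + "<collection><name>B</name></collection>"
            + "</community></import_structure>";
        System.out.println(countCollections(xml)); // prints 2
    }
}
```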


@@ -7,67 +7,93 @@
 */
package org.dspace.app.bulkedit;

import java.util.ArrayList;
import java.util.List;

import org.dspace.content.Collection;
import org.dspace.content.Item;

/**
 * Utility class to store changes to item that may occur during a batch edit.
 *
 * @author Stuart Lewis
 */
public class BulkEditChange {
    /**
     * The item these changes relate to
     */
    private Item item;

    /**
     * The List of hashtables with the new elements
     */
    private List<BulkEditMetadataValue> adds;

    /**
     * The List of hashtables with the removed elements
     */
    private List<BulkEditMetadataValue> removes;

    /**
     * The List of hashtables with the unchanged elements
     */
    private List<BulkEditMetadataValue> constant;

    /**
     * The List of the complete set of new values (constant + adds)
     */
    private List<BulkEditMetadataValue> complete;

    /**
     * The list of old collections the item used to be mapped to
     */
    private List<Collection> oldMappedCollections;

    /**
     * The list of new collections the item has been mapped into
     */
    private List<Collection> newMappedCollections;

    /**
     * The old owning collection
     */
    private Collection oldOwningCollection;

    /**
     * The new owning collection
     */
    private Collection newOwningCollection;

    /**
     * Is this a new item
     */
    private boolean newItem;

    /**
     * Has this item been deleted?
     */
    private boolean deleted;

    /**
     * Has this item been withdrawn?
     */
    private boolean withdrawn;

    /**
     * Has this item been reinstated?
     */
    private boolean reinstated;

    /**
     * Have any changes actually been made?
     */
    private boolean empty;

    /**
     * Initialise a change holder for a new item
     */
    public BulkEditChange() {
        // Set the item to be null
        item = null;
        newItem = true;
@@ -89,8 +115,7 @@ public class BulkEditChange
     *
     * @param i The Item to store
     */
    public BulkEditChange(Item i) {
        // Store the item
        item = i;
        newItem = false;
@@ -110,8 +135,7 @@ public class BulkEditChange
     *
     * @param i The item
     */
    public void setItem(Item i) {
        // Store the item
        item = i;
    }
@@ -121,8 +145,7 @@ public class BulkEditChange
     *
     * @param dcv The value to add
     */
    public void registerAdd(BulkEditMetadataValue dcv) {
        // Add the added value
        adds.add(dcv);
        complete.add(dcv);
@@ -134,8 +157,7 @@ public class BulkEditChange
     *
     * @param dcv The value to remove
     */
    public void registerRemove(BulkEditMetadataValue dcv) {
        // Add the removed value
        removes.add(dcv);
        empty = false;
@@ -146,8 +168,7 @@ public class BulkEditChange
     *
     * @param dcv The value to keep unchanged
     */
    public void registerConstant(BulkEditMetadataValue dcv) {
        // Add the removed value
        constant.add(dcv);
        complete.add(dcv);
@@ -158,8 +179,7 @@ public class BulkEditChange
     *
     * @param c The new mapped Collection
     */
    public void registerNewMappedCollection(Collection c) {
        // Add the new owning Collection
        newMappedCollections.add(c);
        empty = false;
@@ -170,27 +190,22 @@ public class BulkEditChange
     *
     * @param c The old mapped Collection
     */
    public void registerOldMappedCollection(Collection c) {
        // Add the old owning Collection (if it isn't there already, or is an old collection)
        boolean found = false;

        if ((this.getOldOwningCollection() != null) &&
            (this.getOldOwningCollection().getHandle().equals(c.getHandle()))) {
            found = true;
        }

        for (Collection collection : oldMappedCollections) {
            if (collection.getHandle().equals(c.getHandle())) {
                found = true;
            }
        }

        if (!found) {
            oldMappedCollections.add(c);
            empty = false;
        }
@@ -202,8 +217,7 @@ public class BulkEditChange
     * @param oldC The old owning collection
     * @param newC The new owning collection
     */
    public void changeOwningCollection(Collection oldC, Collection newC) {
        // Store the old owning collection
        oldOwningCollection = oldC;
@@ -217,8 +231,7 @@ public class BulkEditChange
     *
     * @param newC The new owning collection
     */
    public void setOwningCollection(Collection newC) {
        // Store the new owning collection
        newOwningCollection = newC;
        //empty = false;
@@ -229,8 +242,7 @@ public class BulkEditChange
     *
     * @return The item
     */
    public Item getItem() {
        // Return the item
        return item;
    }
@@ -240,8 +252,7 @@ public class BulkEditChange
     *
     * @return the list of elements and their values that have been added.
     */
    public List<BulkEditMetadataValue> getAdds() {
        // Return the array
        return adds;
    }
@@ -251,8 +262,7 @@ public class BulkEditChange
     *
     * @return the list of elements and their values that have been removed.
     */
    public List<BulkEditMetadataValue> getRemoves() {
        // Return the array
        return removes;
    }
@@ -262,8 +272,7 @@ public class BulkEditChange
     *
     * @return the list of unchanged values
     */
    public List<BulkEditMetadataValue> getConstant() {
        // Return the array
        return constant;
    }
@@ -273,8 +282,7 @@ public class BulkEditChange
     *
     * @return the list of all values
     */
    public List<BulkEditMetadataValue> getComplete() {
        // Return the array
        return complete;
    }
@@ -284,8 +292,7 @@ public class BulkEditChange
     *
     * @return the list of new mapped collections
     */
    public List<Collection> getNewMappedCollections() {
        // Return the array
        return newMappedCollections;
    }
@@ -295,8 +302,7 @@ public class BulkEditChange
     *
     * @return the list of old mapped collections
     */
    public List<Collection> getOldMappedCollections() {
        // Return the array
        return oldMappedCollections;
    }
@@ -306,8 +312,7 @@ public class BulkEditChange
     *
     * @return the old owning collection
     */
    public Collection getOldOwningCollection() {
        // Return the old owning collection
        return oldOwningCollection;
    }
@@ -317,8 +322,7 @@ public class BulkEditChange
     *
     * @return the new owning collection
     */
    public Collection getNewOwningCollection() {
        // Return the new owning collection
        return newOwningCollection;
    }
@@ -328,8 +332,7 @@ public class BulkEditChange
     *
     * @return Whether or not this is for a new item
     */
    public boolean isNewItem() {
        // Return the new item status
        return newItem;
    }
@@ -339,8 +342,7 @@ public class BulkEditChange
     *
     * @return Whether or not this is for a deleted item
     */
    public boolean isDeleted() {
        // Return the new item status
        return deleted;
    }
@@ -359,8 +361,7 @@ public class BulkEditChange
     *
     * @return Whether or not this is for a withdrawn item
     */
    public boolean isWithdrawn() {
        // Return the new item status
        return withdrawn;
    }
@@ -379,8 +380,7 @@ public class BulkEditChange
     *
     * @return Whether or not this is for a reinstated item
     */
    public boolean isReinstated() {
        // Return the new item status
        return reinstated;
    }
@@ -399,8 +399,7 @@ public class BulkEditChange
     *
     * @return Whether or not changes have been made
     */
    public boolean hasChanges() {
        return !empty;
    }
}
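The class above follows one simple invariant: every `register*` and `changeOwningCollection` call both records the value and clears the `empty` flag, so `hasChanges()` reports true as soon as a single change has been seen. A minimal standalone sketch of that pattern (a hypothetical stand-in class using plain strings in place of `BulkEditMetadataValue`, not the DSpace class itself):

```java
import java.util.ArrayList;
import java.util.List;

public class ChangeSetSketch {
    private final List<String> adds = new ArrayList<>();
    private final List<String> removes = new ArrayList<>();
    // Starts true; flipped to false by any registration, never flipped back.
    private boolean empty = true;

    public void registerAdd(String value) {
        adds.add(value);
        empty = false;
    }

    public void registerRemove(String value) {
        removes.add(value);
        empty = false;
    }

    public boolean hasChanges() {
        return !empty;
    }

    public static void main(String[] args) {
        ChangeSetSketch change = new ChangeSetSketch();
        System.out.println(change.hasChanges()); // prints false
        change.registerAdd("dc.title: New title");
        System.out.println(change.hasChanges()); // prints true
    }
}
```

Tracking "has anything changed" as a flag rather than checking all the lists keeps `hasChanges()` O(1) no matter how many kinds of change the holder accumulates.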


@@ -7,7 +7,26 @@
 */
package org.dspace.app.bulkedit;

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.dspace.authority.AuthorityValue;
import org.dspace.authority.factory.AuthorityServiceFactory;
import org.dspace.authority.service.AuthorityValueService;
@@ -16,19 +35,14 @@ import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.authority.Choices;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.content.service.MetadataFieldService;
import org.dspace.content.service.MetadataSchemaService;
import org.dspace.core.Context;
import org.dspace.services.factory.DSpaceServicesFactory;

/**
 * Utility class to read and write CSV files
 *
@@ -38,50 +52,74 @@ import java.io.*;
 *
 * This class has been made serializable, as it is stored in a Session.
 * Is it wise to:
 * a) be putting this into a user's session?
 * b) holding an entire CSV upload in memory?
 *
 * @author Stuart Lewis
 */
public class DSpaceCSV implements Serializable {
    /**
     * The headings of the CSV file
     */
    protected List<String> headings;

    /**
     * An array list of CSV lines
     */
    protected List<DSpaceCSVLine> lines;

    /**
     * A counter of how many CSV lines this object holds
     */
    protected int counter;

    /**
     * The value separator (defaults to double pipe '||')
     */
    protected String valueSeparator;

    /**
     * The value separator in an escaped form for using in regexes
     */
    protected String escapedValueSeparator;

    /**
     * The field separator (defaults to comma)
     */
    protected String fieldSeparator;

    /**
     * The field separator in an escaped form for using in regexes
     */
    protected String escapedFieldSeparator;

    /**
     * The authority separator (defaults to double colon '::')
     */
    protected String authoritySeparator;

    /**
     * The authority separator in an escaped form for using in regexes
     */
    protected String escapedAuthoritySeparator;

    protected transient final ItemService itemService = ContentServiceFactory.getInstance().getItemService();

    protected transient final MetadataSchemaService metadataSchemaService =
        ContentServiceFactory.getInstance().getMetadataSchemaService();

    protected transient final MetadataFieldService metadataFieldService =
        ContentServiceFactory.getInstance().getMetadataFieldService();

    protected transient final AuthorityValueService authorityValueService =
        AuthorityServiceFactory.getInstance().getAuthorityValueService();

    /**
     * Whether to export all metadata such as handles and provenance information
     */
    protected boolean exportAll;

    /**
     * A list of metadata elements to ignore
     */
    protected Map<String, String> ignore;
@@ -90,8 +128,7 @@ public class DSpaceCSV implements Serializable
     *
     * @param exportAll Whether to export all metadata such as handles and provenance information
     */
    public DSpaceCSV(boolean exportAll) {
        // Initialise the class
        init();
@@ -104,48 +141,37 @@ public class DSpaceCSV implements Serializable
     *
     * @param f The file to read from
     * @param c The DSpace Context
     * @throws Exception thrown if there is an error reading or processing the file
     */
    public DSpaceCSV(File f, Context c) throws Exception {
        // Initialise the class
        init();

        // Open the CSV file
        BufferedReader input = null;
        try {
            input = new BufferedReader(new InputStreamReader(new FileInputStream(f), "UTF-8"));

            // Read the heading line
            String head = input.readLine();
            String[] headingElements = head.split(escapedFieldSeparator);
            int columnCounter = 0;
            for (String element : headingElements) {
                columnCounter++;

                // Remove surrounding quotes if there are any
                if ((element.startsWith("\"")) && (element.endsWith("\""))) {
                    element = element.substring(1, element.length() - 1);
                }

                // Store the heading
                if ("collection".equals(element)) {
                    // Store the heading
                    headings.add(element);
                } else if ("action".equals(element)) { // Store the action
                    // Store the heading
                    headings.add(element);
                } else if (!"id".equals(element)) {
                    String authorityPrefix = "";
                    AuthorityValue authorityValueType = authorityValueService.getAuthorityValueType(element);
                    if (authorityValueType != null) {
@@ -180,7 +206,8 @@ public class DSpaceCSV implements Serializable
                    }

                    // Check that the metadata element exists in the schema
                    MetadataField foundField = metadataFieldService
                        .findByElement(c, foundSchema, metadataElement, metadataQualifier);
                    if (foundField == null) {
                        throw new MetadataImportInvalidHeadingException(clean[0],
                                                                        MetadataImportInvalidHeadingException.ELEMENT,
@@ -196,8 +223,7 @@ public class DSpaceCSV implements Serializable
            StringBuilder lineBuilder = new StringBuilder();
            String lineRead;

            while ((lineRead = input.readLine()) != null) {
                if (lineBuilder.length() > 0) {
                    // Already have a previously read value - add this line
                    lineBuilder.append("\n").append(lineRead);
@@ -236,11 +262,8 @@ public class DSpaceCSV implements Serializable
                    addItem(lineRead);
                }
            }
        } finally {
            if (input != null) {
                input.close();
            }
        }
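The constructor's heading loop does two things per column: strip surrounding quotes, then classify the heading (`collection`, `action`, `id`, or a metadata field). The split-and-unquote step can be sketched self-contained — using `Pattern.quote` in place of the class's precomputed `escapedFieldSeparator`, so this is an illustration of the idea rather than the DSpaceCSV code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

public class HeadingParseSketch {
    // Split a CSV heading line on the field separator, then strip surrounding
    // quotes from each column, as the DSpaceCSV(File, Context) constructor does.
    public static List<String> parseHeadings(String head, String fieldSeparator) {
        List<String> headings = new ArrayList<>();
        for (String element : head.split(Pattern.quote(fieldSeparator))) {
            // Remove surrounding quotes if there are any
            if (element.startsWith("\"") && element.endsWith("\"")) {
                element = element.substring(1, element.length() - 1);
            }
            headings.add(element);
        }
        return headings;
    }

    public static void main(String[] args) {
        System.out.println(parseHeadings("id,collection,\"dc.title\"", ","));
        // prints [id, collection, dc.title]
    }
}
```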
@@ -249,8 +272,7 @@ public class DSpaceCSV implements Serializable
    /**
     * Initialise this class with values from dspace.cfg
     */
    protected void init() {
        // Set the value separator
        setValueSeparator();
@@ -273,13 +295,16 @@ public class DSpaceCSV implements Serializable
        ignore = new HashMap<>();

        // Specify default values
        String[] defaultValues =
            new String[] {
                "dc.date.accessioned, dc.date.available, dc.date.updated, dc.description.provenance"
            };
        String[] toIgnoreArray =
            DSpaceServicesFactory.getInstance()
                                 .getConfigurationService()
                                 .getArrayProperty("bulkedit.ignore-on-export", defaultValues);
        for (String toIgnoreString : toIgnoreArray) {
            if (!"".equals(toIgnoreString.trim())) {
                ignore.put(toIgnoreString.trim(), toIgnoreString.trim());
            }
        }
@@ -307,16 +332,13 @@ public class DSpaceCSV implements Serializable
     *
     * If not set, defaults to double pipe '||'
     */
    private void setValueSeparator() {
        // Get the value separator
        valueSeparator = DSpaceServicesFactory.getInstance().getConfigurationService()
                                              .getProperty("bulkedit.valueseparator");
        if ((valueSeparator != null) && (!"".equals(valueSeparator.trim()))) {
            valueSeparator = valueSeparator.trim();
        } else {
            valueSeparator = "||";
        }
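Because `String.split()` takes a regular expression, the raw `||` separator cannot be passed to it directly — `|` is a regex metacharacter. That is why the class keeps escaped variants of each separator alongside the configured values. A standalone sketch of splitting a multi-valued cell (using `Pattern.quote` rather than the class's own escaping code):

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class ValueSeparatorSketch {
    // Split a multi-valued CSV cell on the configured value separator
    // ('||' by default). Pattern.quote makes the separator a literal match,
    // serving the same role as DSpaceCSV's escapedValueSeparator.
    public static List<String> splitValues(String cell, String valueSeparator) {
        return Arrays.asList(cell.split(Pattern.quote(valueSeparator)));
    }

    public static void main(String[] args) {
        System.out.println(splitValues("Lewis, Stuart||Smith, Jane", "||"));
        // prints [Lewis, Stuart, Smith, Jane]
    }
}
```

Without the quoting, `"a||b".split("||")` would split on the empty-alternation regex and return single characters instead of the two values.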
@@ -336,32 +358,22 @@ public class DSpaceCSV implements Serializable
     * Special values are 'tab', 'hash' and 'semicolon' which will
     * get substituted from the text to the value.
     */
    private void setFieldSeparator() {
        // Get the value separator
        fieldSeparator = DSpaceServicesFactory.getInstance().getConfigurationService()
                                              .getProperty("bulkedit.fieldseparator");
        if ((fieldSeparator != null) && (!"".equals(fieldSeparator.trim()))) {
            fieldSeparator = fieldSeparator.trim();
            if ("tab".equals(fieldSeparator)) {
                fieldSeparator = "\t";
            } else if ("semicolon".equals(fieldSeparator)) {
                fieldSeparator = ";";
            } else if ("hash".equals(fieldSeparator)) {
                fieldSeparator = "#";
            } else {
                fieldSeparator = fieldSeparator.trim();
            }
        } else {
            fieldSeparator = ",";
        }
@@ -371,23 +383,20 @@ public class DSpaceCSV implements Serializable
        escapedFieldSeparator = match.replaceAll("\\\\$1");
    }

    /**
     * Set the authority separator for value with authority data.
     *
     * Is set in dspace.cfg as bulkedit.authorityseparator
     *
     * If not set, defaults to double colon '::'
     */
    private void setAuthoritySeparator() {
        // Get the value separator
        authoritySeparator = DSpaceServicesFactory.getInstance().getConfigurationService()
                                                  .getProperty("bulkedit.authorityseparator");
        if ((authoritySeparator != null) && (!"".equals(authoritySeparator.trim()))) {
            authoritySeparator = authoritySeparator.trim();
        } else {
            authoritySeparator = "::";
        }
@@ -401,11 +410,9 @@ public class DSpaceCSV implements Serializable
     * Add a DSpace item to the CSV file
     *
     * @param i The DSpace item
     * @throws Exception if something goes wrong with adding the Item
     */
    public final void addItem(Item i) throws Exception {
        // If the item does not have an "owningCollection" the below "getHandle()" call will fail
        // This should not happen but is here for safety.
        if (i.getOwningCollection() == null) {
@@ -421,49 +428,42 @@ public class DSpaceCSV implements Serializable
        // Add in any mapped collections
        List<Collection> collections = i.getCollections();
        for (Collection c : collections) {
            // Only add if it is not the owning collection
            if (!c.getHandle().equals(owningCollectionHandle)) {
                line.add("collection", c.getHandle());
            }
        }

        // Populate it
        List<MetadataValue> md = itemService.getMetadata(i, Item.ANY, Item.ANY, Item.ANY, Item.ANY);
        for (MetadataValue value : md) {
            MetadataField metadataField = value.getMetadataField();
            MetadataSchema metadataSchema = metadataField.getMetadataSchema();
            // Get the key (schema.element)
            String key = metadataSchema.getName() + "." + metadataField.getElement();

            // Add the qualifier if there is one (schema.element.qualifier)
            if (metadataField.getQualifier() != null) {
                key = key + "." + metadataField.getQualifier();
            }

            // Add the language if there is one (schema.element.qualifier[language])
            if (value.getLanguage() != null) {
                key = key + "[" + value.getLanguage() + "]";
            }

            // Store the item
            if (exportAll || okToExport(metadataField)) {
                // Add authority and confidence if authority is not null
                String mdValue = value.getValue();
                if (value.getAuthority() != null && !"".equals(value.getAuthority())) {
                    mdValue += authoritySeparator + value.getAuthority() + authoritySeparator + (value
                        .getConfidence() != -1 ? value.getConfidence() : Choices.CF_ACCEPTED);
                }
                line.add(key, mdValue);
                if (!headings.contains(key)) {
                    headings.add(key);
                }
            }
@@ -478,12 +478,10 @@ public class DSpaceCSV implements Serializable
     * @param line The line of elements
     * @throws Exception Thrown if an error occurs when adding the item
     */
    public final void addItem(String line) throws Exception {
        // Check to see if the last character is a field separator, which hides the last empty column
        boolean last = false;
        if (line.endsWith(fieldSeparator)) {
            // Add a space to the end, then remove it later
            last = true;
            line += " ";
        }
@@ -496,15 +494,12 @@ public class DSpaceCSV implements Serializable
        // Merge parts with embedded separators
        boolean alldone = false;
        while (!alldone) {
            boolean found = false;
            int i = 0;
            for (String part : bits) {
                int bitcounter = part.length() - part.replaceAll("\"", "").length();
                if ((part.startsWith("\"")) && ((!part.endsWith("\"")) || ((bitcounter & 1) == 1))) {
                    found = true;
                    String add = bits.get(i) + fieldSeparator + bits.get(i + 1);
                    bits.remove(i);
@@ -519,10 +514,8 @@ public class DSpaceCSV implements Serializable
        // Deal with quotes around the elements
        int i = 0;
        for (String part : bits) {
            if ((part.startsWith("\"")) && (part.endsWith("\""))) {
                part = part.substring(1, part.length() - 1);
                bits.set(i, part);
            }
@@ -531,10 +524,8 @@ public class DSpaceCSV implements Serializable
        // Remove embedded quotes
        i = 0;
        for (String part : bits) {
            if (part.contains("\"\"")) {
                part = part.replaceAll("\"\"", "\"");
                bits.set(i, part);
            }
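The two quote-handling passes in these hunks (strip one pair of surrounding quotes, then collapse doubled quotes) can be sketched as a small standalone helper. This is an illustration of the same cleanup steps, not DSpace's own class; names are made up.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the two quote-cleanup passes applied to each parsed CSV part:
// surrounding quotes are removed, and embedded "" is unescaped to ".
public class QuoteCleanupSketch {
    static void clean(List<String> bits) {
        for (int i = 0; i < bits.size(); i++) {
            String part = bits.get(i);
            // Deal with quotes around the element
            if (part.startsWith("\"") && part.endsWith("\"")) {
                part = part.substring(1, part.length() - 1);
            }
            // Remove embedded (doubled) quotes
            if (part.contains("\"\"")) {
                part = part.replaceAll("\"\"", "\"");
            }
            bits.set(i, part);
        }
    }

    public static void main(String[] args) {
        List<String> bits = new ArrayList<>();
        bits.add("\"say \"\"hi\"\"\"");
        clean(bits);
        System.out.println(bits.get(0)); // say "hi"
    }
}
```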
@@ -546,34 +537,25 @@ public class DSpaceCSV implements Serializable
        DSpaceCSVLine csvLine;

        // Is this an existing item, or a new item (where id = '+')
        if ("+".equals(id)) {
            csvLine = new DSpaceCSVLine();
        } else {
            try {
                csvLine = new DSpaceCSVLine(UUID.fromString(id));
            } catch (NumberFormatException nfe) {
                System.err.println("Invalid item identifier: " + id);
                System.err.println("Please check your CSV file for information. " +
                                       "Item id must be numeric, or a '+' to add a new item");
                throw (nfe);
            }
        }

        // Add the rest of the parts
        i = 0;
        for (String part : bits) {
            if (i > 0) {
                // Is this a last empty item?
                if ((last) && (i == headings.size())) {
                    part = "";
                }
@@ -585,10 +567,8 @@ public class DSpaceCSV implements Serializable
            }

            csvLine.add(headings.get(i - 1), null);
            String[] elements = part.split(escapedValueSeparator);
            for (String element : elements) {
                if ((element != null) && (!"".equals(element))) {
                    csvLine.add(headings.get(i - 1), element);
                }
            }
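The element loop above splits one CSV cell into multiple metadata values on the regex-escaped value separator, dropping empties. A minimal standalone sketch of that step, assuming the default "||" separator (whose escaped regex form is `\\|\\|`):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of splitting one multi-valued CSV cell on the escaped value
// separator, skipping empty elements as the hunk above does.
public class ValueSplitSketch {
    static List<String> split(String cell, String escapedValueSeparator) {
        List<String> out = new ArrayList<>();
        for (String element : cell.split(escapedValueSeparator)) {
            if (element != null && !"".equals(element)) {
                out.add(element);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(split("Smith, A.||Jones, B.", "\\|\\|"));
    }
}
```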
@@ -604,8 +584,7 @@ public class DSpaceCSV implements Serializable
     *
     * @return The lines
     */
    public final List<DSpaceCSVLine> getCSVLines() {
        // Return the lines
        return lines;
    }
@@ -615,22 +594,19 @@ public class DSpaceCSV implements Serializable
     *
     * @return the array of CSV formatted Strings
     */
    public final String[] getCSVLinesAsStringArray() {
        // Create the headings line
        String[] csvLines = new String[counter + 1];
        csvLines[0] = "id" + fieldSeparator + "collection";
        List<String> headingsCopy = new ArrayList<>(headings);
        Collections.sort(headingsCopy);
        for (String value : headingsCopy) {
            csvLines[0] = csvLines[0] + fieldSeparator + value;
        }

        Iterator<DSpaceCSVLine> i = lines.iterator();
        int c = 1;
        while (i.hasNext()) {
            csvLines[c++] = i.next().toCSV(headingsCopy, fieldSeparator, valueSeparator);
        }
@@ -641,15 +617,13 @@ public class DSpaceCSV implements Serializable
     * Save the CSV file to the given filename
     *
     * @param filename The filename to save the CSV file to
     * @throws IOException Thrown if an error occurs when writing the file
     */
    public final void save(String filename) throws IOException {
        // Save the file
        BufferedWriter out = new BufferedWriter(
            new OutputStreamWriter(
                new FileOutputStream(filename), "UTF-8"));
        for (String csvLine : getCSVLinesAsStringArray()) {
            out.write(csvLine + "\n");
        }
@@ -666,12 +640,10 @@ public class DSpaceCSV implements Serializable
     * @param md The Metadatum to examine
     * @return Whether or not it is OK to export this element
     */
    protected boolean okToExport(MetadataField md) {
        // Now compare with the list to ignore
        String key = md.getMetadataSchema().getName() + "." + md.getElement();
        if (md.getQualifier() != null) {
            key += "." + md.getQualifier();
        }
        if (ignore.get(key) != null) {
@@ -687,8 +659,7 @@ public class DSpaceCSV implements Serializable
     *
     * @return The headings
     */
    public List<String> getHeadings() {
        return headings;
    }
@@ -698,13 +669,11 @@ public class DSpaceCSV implements Serializable
     * @return The formatted String as a csv
     */
    @Override
    public final String toString() {
        // Return the csv as one long string
        StringBuilder csvLines = new StringBuilder();
        String[] lines = this.getCSVLinesAsStringArray();
        for (String line : lines) {
            csvLines.append(line).append("\n");
        }
        return csvLines.toString();


@@ -7,30 +7,41 @@
 */
package org.dspace.app.bulkedit;

import java.io.Serializable;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import java.util.UUID;

import org.dspace.authority.AuthorityValue;
import org.dspace.authority.factory.AuthorityServiceFactory;
import org.dspace.authority.service.AuthorityValueService;
/**
 * Utility class to store a line from a CSV file
 *
 * @author Stuart Lewis
 */
public class DSpaceCSVLine implements Serializable {
    /**
     * The item id of the item represented by this line. -1 is for a new item
     */
    private final UUID id;

    /**
     * The elements in this line in a hashtable, keyed by the metadata type
     */
    private final Map<String, ArrayList> items;

    protected transient final AuthorityValueService authorityValueService
        = AuthorityServiceFactory.getInstance().getAuthorityValueService();

    /**
     * ensuring that the order-sensible columns of the csv are processed in the correct order
     */
    private transient final Comparator<? super String> headerComparator = new Comparator<String>() {
        @Override
        public int compare(String md1, String md2) {
@@ -41,8 +52,7 @@ public class DSpaceCSVLine implements Serializable
            int compare;
            if (source1 == null && source2 != null) {
                compare = -1;
            } else if (source1 != null && source2 == null) {
                compare = 1;
            } else {
                // the order of the rest does not matter
@@ -57,8 +67,7 @@ public class DSpaceCSVLine implements Serializable
     *
     * @param itemId The item ID of the line
     */
    public DSpaceCSVLine(UUID itemId) {
        // Store the ID + separator, and initialise the hashtable
        this.id = itemId;
        items = new TreeMap<>(headerComparator);
@@ -68,8 +77,7 @@ public class DSpaceCSVLine implements Serializable
    /**
     * Create a new CSV line for a new item
     */
    public DSpaceCSVLine() {
        // Set the ID to be null, and initialise the hashtable
        this.id = null;
        this.items = new TreeMap<>(headerComparator);
@@ -80,8 +88,7 @@ public class DSpaceCSVLine implements Serializable
     *
     * @return The item ID
     */
    public UUID getID() {
        // Return the ID
        return id;
    }
@@ -89,20 +96,17 @@ public class DSpaceCSVLine implements Serializable
    /**
     * Add a new metadata value to this line
     *
     * @param key   The metadata key (e.g. dc.contributor.author)
     * @param value The metadata value
     */
    public void add(String key, String value) {
        // Create the array list if we need to
        if (items.get(key) == null) {
            items.put(key, new ArrayList<String>());
        }

        // Store the item if it is not null
        if (value != null) {
            items.get(key).add(value);
        }
    }
@@ -113,8 +117,7 @@ public class DSpaceCSVLine implements Serializable
     * @param key The metadata key
     * @return All the elements that match
     */
    public List<String> get(String key) {
        // Return any relevant values
        return items.get(key);
    }
@@ -124,12 +127,11 @@ public class DSpaceCSVLine implements Serializable
     *
     * @return The action (may be blank, 'withdraw', 'reinstate' or 'delete')
     */
    public String getAction() {
        if (items.containsKey("action")) {
            ArrayList actions = items.get("action");
            if (actions.size() > 0) {
                return ((String) actions.get(0)).trim();
            }
        }
        return "";
@@ -140,8 +142,7 @@ public class DSpaceCSVLine implements Serializable
     *
     * @return An enumeration of all the keys
     */
    public Set<String> keys() {
        // Return the keys
        return items.keySet();
    }
@@ -149,26 +150,23 @@ public class DSpaceCSVLine implements Serializable
    /**
     * Write this line out as a CSV formatted string, in the order given by the headings provided
     *
     * @param headings       The headings which define the order the elements must be presented in
     * @param fieldSeparator separator between metadata fields
     * @param valueSeparator separator between metadata values (within a field)
     * @return The CSV formatted String
     */
    protected String toCSV(List<String> headings, String fieldSeparator, String valueSeparator) {
        StringBuilder bits = new StringBuilder();

        // Add the id
        bits.append("\"").append(id).append("\"").append(fieldSeparator);
        bits.append(valueToCSV(items.get("collection"), valueSeparator));

        // Add the rest of the elements
        for (String heading : headings) {
            bits.append(fieldSeparator);
            List<String> values = items.get(heading);
            if (values != null && !"collection".equals(heading)) {
                bits.append(valueToCSV(values, valueSeparator));
            }
        }
@@ -179,33 +177,26 @@ public class DSpaceCSVLine implements Serializable
    /**
     * Internal method to create a CSV formatted String joining a given set of elements
     *
     * @param values         The values to create the string from
     * @param valueSeparator value separator
     * @return The line as a CSV formatted String
     */
    protected String valueToCSV(List<String> values, String valueSeparator) {
        // Check there is some content
        if (values == null) {
            return "";
        }

        // Get on with the work
        String s;
        if (values.size() == 1) {
            s = values.get(0);
        } else {
            // Concatenate any fields together
            StringBuilder str = new StringBuilder();
            for (String value : values) {
                if (str.length() > 0) {
                    str.append(valueSeparator);
                }


@@ -7,34 +7,46 @@
 */
package org.dspace.app.bulkedit;

import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import com.google.common.collect.Iterators;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.handle.factory.HandleServiceFactory;
/**
 * Metadata exporter to allow the batch export of metadata into a file
 *
 * @author Stuart Lewis
 */
public class MetadataExport {
    /**
     * The items to export
     */
    protected Iterator<Item> toExport;

    protected ItemService itemService;

    protected Context context;

    /**
     * Whether to export all metadata, or just normally edited metadata
     */
    protected boolean exportAll;

    protected MetadataExport() {
@@ -44,38 +56,35 @@ public class MetadataExport
    /**
     * Set up a new metadata export
     *
     * @param c         The Context
     * @param toExport  The ItemIterator of items to export
     * @param exportAll whether to export all metadata or not (include handle, provenance etc)
     */
    public MetadataExport(Context c, Iterator<Item> toExport, boolean exportAll) {
        itemService = ContentServiceFactory.getInstance().getItemService();

        // Store the export settings
        this.toExport = toExport;
        this.exportAll = exportAll;
        this.context = c;
    }

    /**
     * Method to export a community (and sub-communities and collections)
     *
     * @param c         The Context
     * @param toExport  The Community to export
     * @param exportAll whether to export all metadata or not (include handle, provenance etc)
     */
    public MetadataExport(Context c, Community toExport, boolean exportAll) {
        itemService = ContentServiceFactory.getInstance().getItemService();

        try {
            // Try to export the community
            this.toExport = buildFromCommunity(c, toExport, 0);
            this.exportAll = exportAll;
            this.context = c;
        } catch (SQLException sqle) {
            // Something went wrong...
            System.err.println("Error running exporter:");
            sqle.printStackTrace(System.err);
@@ -86,49 +95,43 @@ public class MetadataExport
    /**
     * Build an array list of item ids that are in a community (include sub-communities and collections)
     *
     * @param context   DSpace context
     * @param community The community to build from
     * @param indent    How many spaces to use when writing out the names of items added
     * @return The list of item ids
     * @throws SQLException if database error
     */
    protected Iterator<Item> buildFromCommunity(Context context, Community community, int indent)
        throws SQLException {
        // Add all the collections
        List<Collection> collections = community.getCollections();
        Iterator<Item> result = null;
        for (Collection collection : collections) {
            for (int i = 0; i < indent; i++) {
                System.out.print(" ");
            }

            Iterator<Item> items = itemService.findByCollection(context, collection);
            result = addItemsToResult(result, items);
        }
        // Add all the sub-communities
        List<Community> communities = community.getSubcommunities();
        for (Community subCommunity : communities) {
            for (int i = 0; i < indent; i++) {
                System.out.print(" ");
            }
            Iterator<Item> items = buildFromCommunity(context, subCommunity, indent + 1);
            result = addItemsToResult(result, items);
        }

        return result;
    }

    private Iterator<Item> addItemsToResult(Iterator<Item> result, Iterator<Item> items) {
        if (result == null) {
            result = items;
        } else {
            result = Iterators.concat(result, items);
        }
@@ -140,22 +143,23 @@ public class MetadataExport
     *
     * @return the exported CSV lines
     */
    public DSpaceCSV export() {
        try {
            Context.Mode originalMode = context.getCurrentMode();
            context.setMode(Context.Mode.READ_ONLY);

            // Process each item
            DSpaceCSV csv = new DSpaceCSV(exportAll);
            while (toExport.hasNext()) {
                Item item = toExport.next();
                csv.addItem(item);
                context.uncacheEntity(item);
            }
            context.setMode(originalMode);

            // Return the results
            return csv;
        } catch (Exception e) {
            // Something went wrong...
            System.err.println("Error exporting to CSV:");
            e.printStackTrace();
@@ -166,11 +170,10 @@ public class MetadataExport
    /**
     * Print the help message
     *
     * @param options  The command line options the user gave
     * @param exitCode the system exit code to use
     */
    private static void printHelp(Options options, int exitCode) {
        // print the help message
        HelpFormatter myhelp = new HelpFormatter();
        myhelp.printHelp("MetadataExport\n", options);
@@ -180,13 +183,12 @@ public class MetadataExport
    }

    /**
     * main method to run the metadata exporter
     *
     * @param argv the command line arguments given
     * @throws Exception if error occurs
     */
    public static void main(String[] argv) throws Exception {
        // Create an options object and populate it
        CommandLineParser parser = new PosixParser();
@@ -194,37 +196,33 @@ public class MetadataExport
        options.addOption("i", "id", true, "ID or handle of thing to export (item, collection, or community)");
        options.addOption("f", "file", true, "destination where you want file written");
        options.addOption("a", "all", false,
            "include all metadata fields that are not normally changed (e.g. provenance)");
        options.addOption("h", "help", false, "help");

        CommandLine line = null;

        try {
            line = parser.parse(options, argv);
        } catch (ParseException pe) {
            System.err.println("Error with commands.");
            printHelp(options, 1);
            System.exit(0);
        }

        if (line.hasOption('h')) {
            printHelp(options, 0);
        }

        // Check a filename is given
        if (!line.hasOption('f')) {
            System.err.println("Required parameter -f missing!");
            printHelp(options, 1);
        }
        String filename = line.getOptionValue('f');

        // Create a context
        Context c = new Context(Context.Mode.READ_ONLY);
        c.turnOffAuthorisationSystem();

        // The things we'll export
@@ -237,42 +235,31 @@ public class MetadataExport
        ContentServiceFactory contentServiceFactory = ContentServiceFactory.getInstance();

        // Check we have an item OK
        ItemService itemService = contentServiceFactory.getItemService();
        if (!line.hasOption('i')) {
            System.out.println("Exporting whole repository WARNING: May take some time!");
            exporter = new MetadataExport(c, itemService.findAll(c), exportAll);
        } else {
            String handle = line.getOptionValue('i');
            DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService().resolveToObject(c, handle);
            if (dso == null) {
                System.err.println("Item '" + handle + "' does not resolve to an item in your repository!");
                printHelp(options, 1);
            }

            if (dso.getType() == Constants.ITEM) {
                System.out.println("Exporting item '" + dso.getName() + "' (" + handle + ")");
                List<Item> item = new ArrayList<>();
                item.add((Item) dso);
                exporter = new MetadataExport(c, item.iterator(), exportAll);
            } else if (dso.getType() == Constants.COLLECTION) {
                System.out.println("Exporting collection '" + dso.getName() + "' (" + handle + ")");
                Collection collection = (Collection) dso;
                toExport = itemService.findByCollection(c, collection);
                exporter = new MetadataExport(c, toExport, exportAll);
            } else if (dso.getType() == Constants.COMMUNITY) {
                System.out.println("Exporting community '" + dso.getName() + "' (" + handle + ")");
                exporter = new MetadataExport(c, (Community) dso, exportAll);
            } else {
                System.err.println("Error identifying '" + handle + "'");
                System.exit(1);
            }
@@ -282,7 +269,7 @@ public class MetadataExport
        DSpaceCSV csv = exporter.export();

        // Save the files to the file
        csv.save(filename);

        // Finish off and tidy up
        c.restoreAuthSystemState();


@@ -12,26 +12,23 @@ package org.dspace.app.bulkedit;
 *
 * @author Stuart Lewis
 */
public class MetadataImportException extends Exception {
    /**
     * Instantiate a new MetadataImportException
     *
     * @param message the error message
     */
    public MetadataImportException(String message) {
        super(message);
    }

    /**
     * Instantiate a new MetadataImportException
     *
     * @param message the error message
     * @param exception the root cause
     */
    public MetadataImportException(String message, Exception exception) {
        super(message, exception);
    }
}


@@ -12,39 +12,51 @@ package org.dspace.app.bulkedit;
 *
 * @author Stuart Lewis
 */
public class MetadataImportInvalidHeadingException extends Exception {
    /**
     * The type of error (schema or element)
     */
    private int type;

    /**
     * The bad heading
     */
    private String badHeading;

    /**
     * The column number
     */
    private int column;

    /**
     * Error with the schema
     */
    public static final int SCHEMA = 0;

    /**
     * Error with the element
     */
    public static final int ELEMENT = 1;

    /**
     * Error with a missing header
     */
    public static final int MISSING = 98;

    /**
     * Error with the whole entry
     */
    public static final int ENTRY = 99;

    /**
     * Instantiate a new MetadataImportInvalidHeadingException
     *
     * @param message the error message
     * @param theType the type of the error
     * @param theColumn column number
     */
    public MetadataImportInvalidHeadingException(String message, int theType, int theColumn) {
        super(message);
        badHeading = message;
        type = theType;
@@ -54,10 +66,9 @@ public class MetadataImportInvalidHeadingException extends Exception
    /**
     * Get the type of the exception
     *
     * @return the type of the exception
     */
    public String getType() {
        return "" + type;
    }
@@ -66,8 +77,7 @@ public class MetadataImportInvalidHeadingException extends Exception
     *
     * @return the invalid heading
     */
    public String getBadHeader() {
        return badHeading;
    }
@@ -76,8 +86,7 @@ public class MetadataImportInvalidHeadingException extends Exception
     *
     * @return the invalid column number
     */
    public int getColumn() {
        return column;
    }
@@ -87,20 +96,15 @@ public class MetadataImportInvalidHeadingException extends Exception
     * @return The exception message
     */
    @Override
    public String getMessage() {
        if (type == SCHEMA) {
            return "Unknown metadata schema in column " + column + ": " + badHeading;
        } else if (type == ELEMENT) {
            return "Unknown metadata element in column " + column + ": " + badHeading;
        } else if (type == MISSING) {
            return "Row with missing header: column " + column;
        } else {
            return "Bad metadata declaration in column " + column + ": " + badHeading;
        }
    }
}


@@ -9,7 +9,11 @@ package org.dspace.app.checker;
import java.io.FileNotFoundException;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.List;
import java.util.UUID;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
@@ -20,7 +24,15 @@ import org.apache.commons.cli.Options;
import org.apache.commons.cli.ParseException;
import org.apache.commons.cli.PosixParser;
import org.apache.log4j.Logger;
import org.dspace.checker.BitstreamDispatcher;
import org.dspace.checker.CheckerCommand;
import org.dspace.checker.HandleDispatcher;
import org.dspace.checker.IteratorDispatcher;
import org.dspace.checker.LimitedCountDispatcher;
import org.dspace.checker.LimitedDurationDispatcher;
import org.dspace.checker.ResultsLogger;
import org.dspace.checker.ResultsPruner;
import org.dspace.checker.SimpleDispatcher;
import org.dspace.content.Bitstream;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.BitstreamService;
@@ -28,15 +40,14 @@ import org.dspace.core.Context;
import org.dspace.core.Utils;

/**
 * Command line access to the checksum checker. Options are listed in the
 * documentation for the main method.
 *
 * @author Jim Downing
 * @author Grace Carpenter
 * @author Nathan Sarr
 */
public final class ChecksumChecker {
    private static final Logger LOG = Logger.getLogger(ChecksumChecker.class);

    private static final BitstreamService bitstreamService = ContentServiceFactory.getInstance().getBitstreamService();
@@ -44,34 +55,33 @@ public final class ChecksumChecker
    /**
     * Blanked off constructor, this class should be used as a command line
     * tool.
     */
    private ChecksumChecker() {
    }

    /**
     * Command line access to the checksum package.
     * <dl>
     * <dt>-h</dt>
     * <dd>Print help on command line options</dd>
     * <dt>-l</dt>
     * <dd>loop through bitstreams once</dd>
     * <dt>-L</dt>
     * <dd>loop continuously through bitstreams</dd>
     * <dt>-d</dt>
     * <dd>specify duration of process run</dd>
     * <dt>-b</dt>
     * <dd>specify bitstream IDs</dd>
     * <dt>-a [handle_id]</dt>
     * <dd>check anything by handle</dd>
     * <dt>-e</dt>
     * <dd>Report only errors in the logs</dd>
     * <dt>-p</dt>
     * <dd>Don't prune results before running checker</dd>
     * </dl>
     *
     * @param args the command line arguments given
     * @throws SQLException if error
     */
    public static void main(String[] args) throws SQLException {
@@ -84,7 +94,7 @@ public final class ChecksumChecker
        options.addOption("l", "looping", false, "Loop once through bitstreams");
        options.addOption("L", "continuous", false,
            "Loop continuously through bitstreams");
        options.addOption("h", "help", false, "Help");
        options.addOption("d", "duration", true, "Checking duration");
        options.addOption("c", "count", true, "Check count");
@@ -92,33 +102,28 @@ public final class ChecksumChecker
        options.addOption("v", "verbose", false, "Report all processing");

        OptionBuilder.withArgName("bitstream-ids").hasArgs().withDescription(
            "Space separated list of bitstream ids");
        Option useBitstreamIds = OptionBuilder.create('b');
        options.addOption(useBitstreamIds);

        options.addOption("p", "prune", false, "Prune configuration file");
        options.addOption(OptionBuilder
            .withArgName("prune")
            .hasOptionalArgs(1)
            .withDescription(
                "Prune old results (optionally using specified properties file for configuration)")
            .create('p'));

        try {
            line = parser.parse(options, args);
        } catch (ParseException e) {
            LOG.fatal(e);
            System.exit(1);
        }

        // user asks for help
        if (line.hasOption('h')) {
            printHelp(options);
        }

        Context context = null;
@@ -127,23 +132,19 @@ public final class ChecksumChecker
        // Prune stage
        if (line.hasOption('p')) {
            ResultsPruner rp = null;
            try {
                rp = (line.getOptionValue('p') != null) ? ResultsPruner
                    .getPruner(context, line.getOptionValue('p')) : ResultsPruner
                    .getDefaultPruner(context);
            } catch (FileNotFoundException e) {
                LOG.error("File not found", e);
                System.exit(1);
            }
            int count = rp.prune();
            System.out.println("Pruned " + count
                + " old results from the database.");
        }

        Date processStart = Calendar.getInstance().getTime();
@@ -152,77 +153,55 @@ public final class ChecksumChecker
        // process should loop infinitely through
        // most_recent_checksum table
        if (line.hasOption('l')) {
            dispatcher = new SimpleDispatcher(context, processStart, false);
        } else if (line.hasOption('L')) {
            dispatcher = new SimpleDispatcher(context, processStart, true);
        } else if (line.hasOption('b')) {
            // check only specified bitstream(s)
            String[] ids = line.getOptionValues('b');
            List<Bitstream> bitstreams = new ArrayList<>(ids.length);

            for (int i = 0; i < ids.length; i++) {
                try {
                    bitstreams.add(bitstreamService.find(context, UUID.fromString(ids[i])));
                } catch (NumberFormatException nfe) {
                    System.err.println("The following argument: " + ids[i]
                        + " is not an integer");
                    System.exit(0);
                }
            }
            dispatcher = new IteratorDispatcher(bitstreams.iterator());
        } else if (line.hasOption('a')) {
            dispatcher = new HandleDispatcher(context, line.getOptionValue('a'));
        } else if (line.hasOption('d')) {
            // run checker process for specified duration
            try {
                dispatcher = new LimitedDurationDispatcher(
                    new SimpleDispatcher(context, processStart, true), new Date(
                    System.currentTimeMillis()
                        + Utils.parseDuration(line
                        .getOptionValue('d'))));
            } catch (Exception e) {
                LOG.fatal("Couldn't parse " + line.getOptionValue('d')
                    + " as a duration: ", e);
                System.exit(0);
            }
        } else if (line.hasOption('c')) {
            int count = Integer.valueOf(line.getOptionValue('c'));
            // run checker process for specified number of bitstreams
            dispatcher = new LimitedCountDispatcher(new SimpleDispatcher(
                context, processStart, false), count);
        } else {
            dispatcher = new LimitedCountDispatcher(new SimpleDispatcher(
                context, processStart, false), 1);
        }

        ResultsLogger logger = new ResultsLogger(processStart);
        CheckerCommand checker = new CheckerCommand(context);
        // verbose reporting
        if (line.hasOption('v')) {
            checker.setReportVerbose(true);
        }
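The checker builds its dispatcher by wrapping a base `SimpleDispatcher` in limiting decorators such as `LimitedCountDispatcher` and `LimitedDurationDispatcher`. A hedged sketch of that decorator idea, using hypothetical interfaces rather than the real `BitstreamDispatcher` API:

```java
// Hypothetical dispatcher interface: hands out the next work unit,
// or -1 when there is nothing left to process.
interface Dispatcher {
    int next();
}

// Endless base dispatcher, standing in for SimpleDispatcher.
class CountingDispatcher implements Dispatcher {
    private int n = 0;

    @Override
    public int next() {
        return n++;
    }
}

// Decorator standing in for LimitedCountDispatcher: cuts the chain
// off after a fixed number of work units.
class LimitedDispatcher implements Dispatcher {
    private final Dispatcher delegate;
    private int remaining;

    LimitedDispatcher(Dispatcher delegate, int limit) {
        this.delegate = delegate;
        this.remaining = limit;
    }

    @Override
    public int next() {
        if (remaining <= 0) {
            return -1;
        }
        remaining--;
        return delegate.next();
    }
}

public class DispatcherDemo {
    public static void main(String[] args) {
        // Equivalent in spirit to the '-c 3' branch: wrap the endless
        // dispatcher so only three units are handled.
        Dispatcher d = new LimitedDispatcher(new CountingDispatcher(), 3);
        int handled = 0;
        for (int id = d.next(); id != -1; id = d.next()) {
            handled++;
        }
        System.out.println("handled " + handled);  // handled 3
    }
}
```

Because each limit is its own wrapper, count and duration policies can be composed around the same base dispatcher without touching it.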
@@ -233,7 +212,7 @@ public final class ChecksumChecker
            context.complete();
            context = null;
        } finally {
            if (context != null) {
                context.abort();
            }
        }
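The surrounding try/finally guarantees the context ends in exactly one of two states: completed (committed and nulled out) or aborted in the `finally` block. A minimal sketch of that commit-or-abort shape, with an illustrative transaction class standing in for the DSpace `Context`:

```java
// Illustrative transaction: must end in exactly one of commit or abort.
class Tx {
    String state = "open";

    void complete() {
        state = "committed";
    }

    void abort() {
        state = "aborted";
    }
}

public class CompleteOrAbort {
    static String run(boolean fail) {
        Tx tx = new Tx();
        Tx open = tx;
        try {
            if (fail) {
                throw new RuntimeException("work failed");
            }
            tx.complete();
            open = null;            // mirrors `context = null` after complete()
        } catch (RuntimeException e) {
            // swallowed here for the sketch; fall through to finally
        } finally {
            if (open != null) {     // only abort if complete() never ran
                open.abort();
            }
        }
        return tx.state;
    }

    public static void main(String[] args) {
        System.out.println(run(false) + " / " + run(true));  // committed / aborted
    }
}
```

Nulling the reference after a successful `complete()` is what stops the `finally` block from aborting an already-committed transaction.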
@@ -241,27 +220,22 @@ public final class ChecksumChecker
    /**
     * Print the help options for the user
     *
     * @param options that are available for the user
     */
    private static void printHelp(Options options) {
        HelpFormatter myhelp = new HelpFormatter();

        myhelp.printHelp("Checksum Checker\n", options);
        System.out.println("\nSpecify a duration for checker process, using s(seconds),"
            + "m(minutes), or h(hours): ChecksumChecker -d 30s"
            + " OR ChecksumChecker -d 30m"
            + " OR ChecksumChecker -d 2h");
        System.out.println("\nSpecify bitstream IDs: ChecksumChecker -b 13 15 17 20");
        System.out.println("\nLoop once through all bitstreams: "
            + "ChecksumChecker -l");
        System.out.println("\nLoop continuously through all bitstreams: ChecksumChecker -L");
        System.out.println("\nCheck a defined number of bitstreams: ChecksumChecker -c 10");
        System.out.println("\nReport all processing (verbose)(default reports only errors): ChecksumChecker -v");
        System.out.println("\nDefault (no arguments) is equivalent to '-c 1'");
        System.exit(0);


@@ -7,12 +7,12 @@
 */
package org.dspace.app.configuration;

import java.io.File;
import java.net.MalformedURLException;

import org.dspace.kernel.config.SpringLoader;
import org.dspace.services.ConfigurationService;

/**
 * @author Kevin Van de Velde (kevin at atmire dot com)
 */
@@ -32,7 +32,7 @@ public class APISpringLoader implements SpringLoader {
        try {
            return new String[] {new File(filePath.toString()).toURI().toURL().toString() + XML_SUFFIX};
        } catch (MalformedURLException e) {
            return new String[0];
        }


@@ -21,37 +21,37 @@ import org.apache.commons.cli.PosixParser;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Collection;
import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.CollectionService;
import org.dspace.content.service.ItemService;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;
import org.dspace.eperson.factory.EPersonServiceFactory;
import org.dspace.eperson.service.EPersonService;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.harvest.HarvestedCollection;
import org.dspace.harvest.HarvestingException;
import org.dspace.harvest.OAIHarvester;
import org.dspace.harvest.factory.HarvestServiceFactory;
import org.dspace.harvest.service.HarvestedCollectionService;

/**
 * Test class for harvested collections.
 *
 * @author Alexey Maslov
 */
public class Harvest {
    private static Context context;

    private static final HarvestedCollectionService harvestedCollectionService =
        HarvestServiceFactory.getInstance().getHarvestedCollectionService();
    private static final EPersonService ePersonService = EPersonServiceFactory.getInstance().getEPersonService();
    private static final CollectionService collectionService =
        ContentServiceFactory.getInstance().getCollectionService();

    public static void main(String[] argv) throws Exception {
        // create an options object and populate it
        CommandLineParser parser = new PosixParser();
@@ -65,49 +65,50 @@ public class Harvest
        options.addOption("S", "start", false, "start the harvest loop");
        options.addOption("R", "reset", false, "reset harvest status on all collections");
        options.addOption("P", "purge", false, "purge all harvestable collections");

        options.addOption("e", "eperson", true,
            "eperson");
        options.addOption("c", "collection", true,
            "harvesting collection (handle or id)");
        options.addOption("t", "type", true,
            "type of harvesting (0 for none)");
        options.addOption("a", "address", true,
            "address of the OAI-PMH server");
        options.addOption("i", "oai_set_id", true,
            "id of the PMH set representing the harvested collection");
        options.addOption("m", "metadata_format", true,
            "the name of the desired metadata format for harvesting, resolved to namespace and " +
                "crosswalk in dspace.cfg");
        options.addOption("h", "help", false, "help");

        CommandLine line = parser.parse(options, argv);

        String command = null;
        String eperson = null;
        String collection = null;
        String oaiSource = null;
        String oaiSetID = null;
        String metadataKey = null;
        int harvestType = 0;

        if (line.hasOption('h')) {
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("Harvest\n", options);
            System.out.println("\nPING OAI server: Harvest -g -a oai_source -i oai_set_id");
            System.out.println(
                "RUNONCE harvest with arbitrary options: Harvest -o -e eperson -c collection -t harvest_type -a " +
                    "oai_source -i oai_set_id -m metadata_format");
            System.out.println(
                "SETUP a collection for harvesting: Harvest -s -c collection -t harvest_type -a oai_source -i " +
                    "oai_set_id -m metadata_format");
            System.out.println("RUN harvest once: Harvest -r -e eperson -c collection");
            System.out.println("START harvest scheduler: Harvest -S");
            System.out.println("RESET all harvest status: Harvest -R");
            System.out.println("PURGE a collection of items and settings: Harvest -p -e eperson -c collection");
            System.out.println("PURGE all harvestable collections: Harvest -P -e eperson");

            System.exit(0);
        }
@@ -137,7 +138,7 @@ public class Harvest
            command = "purgeAll";
        }

        if (line.hasOption('e')) {
            eperson = line.getOptionValue('e');
        }
@@ -147,7 +148,7 @@ public class Harvest
        if (line.hasOption('t')) {
            harvestType = Integer.parseInt(line.getOptionValue('t'));
        } else {
            harvestType = 0;
        }

        if (line.hasOption('a')) {
            oaiSource = line.getOptionValue('a');
@@ -158,106 +159,87 @@ public class Harvest
        if (line.hasOption('m')) {
            metadataKey = line.getOptionValue('m');
        }

        // Instantiate our class
        Harvest harvester = new Harvest();
        harvester.context = new Context(Context.Mode.BATCH_EDIT);

        // Check our options
        if (command == null) {
            System.out
                .println("Error - no parameters specified (run with -h flag for details)");
            System.exit(1);
        } else if ("run".equals(command)) {
            // Run a single harvest cycle on a collection using saved settings.
            if (collection == null || eperson == null) {
                System.out
                    .println("Error - a target collection and eperson must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            harvester.runHarvest(collection, eperson);
        } else if ("start".equals(command)) {
            // start the harvest loop
            startHarvester();
        } else if ("reset".equals(command)) {
            // reset harvesting status
            resetHarvesting();
        } else if ("purgeAll".equals(command)) {
            // purge all collections that are set up for harvesting (obviously for testing purposes only)
            if (eperson == null) {
                System.out
                    .println("Error - an eperson must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            List<HarvestedCollection> harvestedCollections = harvestedCollectionService.findAll(context);
            for (HarvestedCollection harvestedCollection : harvestedCollections) {
                System.out.println(
                    "Purging the following collections (deleting items and resetting harvest status): " +
                        harvestedCollection.getCollection().getID().toString());
                harvester.purgeCollection(harvestedCollection.getCollection().getID().toString(), eperson);
            }
            context.complete();
        } else if ("purge".equals(command)) {
            // Delete all items in a collection. Useful for testing fresh harvests.
            if (collection == null || eperson == null) {
                System.out
                    .println("Error - a target collection and eperson must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            harvester.purgeCollection(collection, eperson);
            context.complete();

            //TODO: implement this... remove all items and remember to unset "last-harvested" settings
        } else if ("config".equals(command)) {
            // Configure a collection with the three main settings
            if (collection == null) {
                System.out.println("Error - a target collection must be provided");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }
            if (oaiSource == null || oaiSetID == null) {
                System.out.println("Error - both the OAI server address and OAI set id must be specified");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }
            if (metadataKey == null) {
                System.out
                    .println("Error - a metadata key (commonly the prefix) must be specified for this collection");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
            }

            harvester.configureCollection(collection, harvestType, oaiSource, oaiSetID, metadataKey);
        } else if ("ping".equals(command)) {
            if (oaiSource == null || oaiSetID == null) {
                System.out.println("Error - both the OAI server address and OAI set id must be specified");
                System.out.println(" (run with -h flag for details)");
                System.exit(1);
@@ -266,183 +248,166 @@ public class Harvest
            pingResponder(oaiSource, oaiSetID, metadataKey);
        }
    }

    /*
     * Resolve the ID into a collection and check to see if its harvesting options are set. If so, return
     * the collection, if not, bail out.
     */
    private Collection resolveCollection(String collectionID) {
        DSpaceObject dso;
        Collection targetCollection = null;

        try {
            // is the ID a handle?
            if (collectionID != null) {
                if (collectionID.indexOf('/') != -1) {
                    // string has a / so it must be a handle - try and resolve it
                    dso = HandleServiceFactory.getInstance().getHandleService().resolveToObject(context, collectionID);

                    // resolved, now make sure it's a collection
                    if (dso == null || dso.getType() != Constants.COLLECTION) {
                        targetCollection = null;
                    } else {
                        targetCollection = (Collection) dso;
                    }
                } else {
                    // not a handle, try and treat it as an integer collection database ID
                    System.out.println("Looking up by id: " + collectionID + ", parsed as '" + Integer
                        .parseInt(collectionID) + "', " + "in context: " + context);
                    targetCollection = collectionService.find(context, UUID.fromString(collectionID));
                }
            }
            // was the collection valid?
            if (targetCollection == null) {
                System.out.println("Cannot resolve " + collectionID + " to collection");
                System.exit(1);
            }
        } catch (SQLException se) {
            se.printStackTrace();
        }

        return targetCollection;
    }

    private void configureCollection(String collectionID, int type, String oaiSource, String oaiSetId,
                                     String mdConfigId) {
        System.out.println("Running: configure collection");

        Collection collection = resolveCollection(collectionID);
        System.out.println(collection.getID());

        try {
            HarvestedCollection hc = harvestedCollectionService.find(context, collection);
            if (hc == null) {
                hc = harvestedCollectionService.create(context, collection);
            }

            context.turnOffAuthorisationSystem();
            hc.setHarvestParams(type, oaiSource, oaiSetId, mdConfigId);
            hc.setHarvestStatus(HarvestedCollection.STATUS_READY);
            harvestedCollectionService.update(context, hc);
            context.restoreAuthSystemState();
            context.complete();
        } catch (Exception e) {
            System.out.println("Changes could not be committed");
            e.printStackTrace();
            System.exit(1);
        } finally {
            if (context != null) {
                context.restoreAuthSystemState();
            }
        }
    }

    /**
     * Purges a collection of all harvest-related data and settings. All items in the collection will be deleted.
     *
     * @param collectionID
     * @param email
     */
    private void purgeCollection(String collectionID, String email) {
        System.out.println(
            "Purging collection of all items and resetting last_harvested and harvest_message: " + collectionID);
        Collection collection = resolveCollection(collectionID);

        try {
            EPerson eperson = ePersonService.findByEmail(context, email);
            context.setCurrentUser(eperson);
            context.turnOffAuthorisationSystem();

            ItemService itemService = ContentServiceFactory.getInstance().getItemService();
            Iterator<Item> it = itemService.findByCollection(context, collection);
            int i = 0;
            while (it.hasNext()) {
                i++;
                Item item = it.next();
                System.out.println("Deleting: " + item.getHandle());
                collectionService.removeItem(context, collection, item);
                context.uncacheEntity(item);
                // Dispatch events every 50 items
                if (i % 50 == 0) {
                    context.dispatchEvents();
                    i = 0;
                }
            }

            HarvestedCollection hc = harvestedCollectionService.find(context, collection);
            if (hc != null) {
                hc.setLastHarvested(null);
                hc.setHarvestMessage("");
                hc.setHarvestStatus(HarvestedCollection.STATUS_READY);
                hc.setHarvestStartTime(null);
                harvestedCollectionService.update(context, hc);
            }

            context.restoreAuthSystemState();
            context.dispatchEvents();
        } catch (Exception e) {
            System.out.println("Changes could not be committed");
            e.printStackTrace();
            System.exit(1);
        } finally {
            context.restoreAuthSystemState();
        }
    }

    /**
     * Run a single harvest cycle on the specified collection under the authorization of the supplied EPerson
     */
    private void runHarvest(String collectionID, String email) {
        System.out.println("Running: a harvest cycle on " + collectionID);

        System.out.print("Initializing the harvester... ");
        OAIHarvester harvester = null;
        try {
            Collection collection = resolveCollection(collectionID);
            HarvestedCollection hc = harvestedCollectionService.find(context, collection);
            harvester = new OAIHarvester(context, collection, hc);
            System.out.println("success. ");
        } catch (HarvestingException hex) {
            System.out.print("failed. ");
            System.out.println(hex.getMessage());
            throw new IllegalStateException("Unable to harvest", hex);
        } catch (SQLException se) {
            System.out.print("failed. ");
            System.out.println(se.getMessage());
            throw new IllegalStateException("Unable to access database", se);
        }

        try {
            // Harvest will not work for an anonymous user
            EPerson eperson = ePersonService.findByEmail(context, email);
            System.out.println("Harvest started... ");
            context.setCurrentUser(eperson);
            harvester.runHarvest();
            context.complete();
        } catch (SQLException e) {
            throw new IllegalStateException("Failed to run harvester", e);
        } catch (AuthorizeException e) {
            throw new IllegalStateException("Failed to run harvester", e);
        } catch (IOException e) {
            throw new IllegalStateException("Failed to run harvester", e);
        }
@@ -450,76 +415,70 @@ public class Harvest
    }

    /**
     * Resets harvest_status and harvest_start_time flags for all collections that have a row in the
     * harvested_collections table
     */
    private static void resetHarvesting() {
        System.out.print("Resetting harvest status flag on all collections... ");

        try {
            List<HarvestedCollection> harvestedCollections = harvestedCollectionService.findAll(context);
            for (HarvestedCollection harvestedCollection : harvestedCollections) {
                //hc.setHarvestResult(null,"");
                harvestedCollection.setHarvestStartTime(null);
                harvestedCollection.setHarvestStatus(HarvestedCollection.STATUS_READY);
                harvestedCollectionService.update(context, harvestedCollection);
            }
            System.out.println("success. ");
        } catch (Exception ex) {
            System.out.println("failed. ");
            ex.printStackTrace();
        }
    }

    /**
     * Starts up the harvest scheduler. Terminating this process will stop the scheduler.
     */
    private static void startHarvester() {
        try {
            System.out.print("Starting harvest loop... ");
            HarvestServiceFactory.getInstance().getHarvestSchedulingService().startNewScheduler();
            System.out.println("running. ");
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    /**
     * See if the responder is alive and working.
     *
     * @param server         address of the responder's host.
     * @param set            name of an item set.
     * @param metadataFormat local prefix name, or null for "dc".
     */
    private static void pingResponder(String server, String set, String metadataFormat) {
        List<String> errors;

        System.out.print("Testing basic PMH access: ");
        errors = OAIHarvester.verifyOAIharvester(server, set,
            (null != metadataFormat) ? metadataFormat : "dc", false);
        if (errors.isEmpty()) {
            System.out.println("OK");
        } else {
            for (String error : errors) {
                System.err.println(error);
            }
        }

        System.out.print("Testing ORE support: ");
        errors = OAIHarvester.verifyOAIharvester(server, set,
            (null != metadataFormat) ? metadataFormat : "dc", true);
        if (errors.isEmpty()) {
            System.out.println("OK");
        } else {
            for (String error : errors) {
                System.err.println(error);
            }
        }
    }
}


@@ -7,7 +7,17 @@
 */
package org.dspace.app.itemexport;

import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.UUID;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.dspace.app.itemexport.factory.ItemExportServiceFactory;
import org.dspace.app.itemexport.service.ItemExportService;
import org.dspace.content.Collection;
@@ -20,8 +30,6 @@ import org.dspace.core.Context;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;

/**
 * Item exporter to create simple AIPs for DSpace content. Currently exports
 * individual items, or entire collections. For instructions on use, see
@@ -45,17 +53,21 @@ import java.util.*;
 */
public class ItemExportCLITool {

    protected static ItemExportService itemExportService = ItemExportServiceFactory.getInstance()
                                                                                   .getItemExportService();
    protected static HandleService handleService = HandleServiceFactory.getInstance().getHandleService();
    protected static ItemService itemService = ContentServiceFactory.getInstance().getItemService();
    protected static CollectionService collectionService = ContentServiceFactory.getInstance().getCollectionService();

    /**
     * Default constructor
     */
    private ItemExportCLITool() { }

    /*
     *
     */
    public static void main(String[] argv) throws Exception {
        // create an options object and populate it
        CommandLineParser parser = new PosixParser();
@@ -64,10 +76,11 @@ public class ItemExportCLITool {
        options.addOption("t", "type", true, "type: COLLECTION or ITEM");
        options.addOption("i", "id", true, "ID or handle of thing to export");
        options.addOption("d", "dest", true,
                          "destination where you want items to go");
        options.addOption("m", "migrate", false,
                          "export for migration (remove handle and metadata that will be re-created in new system)");
        options.addOption("n", "number", true,
                          "sequence number to begin exporting items with");
        options.addOption("z", "zip", true, "export as zip file (specify filename e.g. export.zip)");
        options.addOption("h", "help", false, "help");
@@ -86,175 +99,140 @@ public class ItemExportCLITool {
        Item myItem = null;
        Collection mycollection = null;

        if (line.hasOption('h')) {
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("ItemExport\n", options);
            System.out
                .println("\nfull collection: ItemExport -t COLLECTION -i ID -d dest -n number");
            System.out
                .println("singleitem: ItemExport -t ITEM -i ID -d dest -n number");
            System.exit(0);
        }

        if (line.hasOption('t')) { // type
            typeString = line.getOptionValue('t');

            if ("ITEM".equals(typeString)) {
                myType = Constants.ITEM;
            } else if ("COLLECTION".equals(typeString)) {
                myType = Constants.COLLECTION;
            }
        }

        if (line.hasOption('i')) { // id
            myIDString = line.getOptionValue('i');
        }

        if (line.hasOption('d')) { // dest
            destDirName = line.getOptionValue('d');
        }

        if (line.hasOption('n')) { // number
            seqStart = Integer.parseInt(line.getOptionValue('n'));
        }

        boolean migrate = false;
        if (line.hasOption('m')) { // number
            migrate = true;
        }

        boolean zip = false;
        String zipFileName = "";
        if (line.hasOption('z')) {
            zip = true;
            zipFileName = line.getOptionValue('z');
        }

        boolean excludeBitstreams = false;
        if (line.hasOption('x')) {
            excludeBitstreams = true;
        }

        // now validate the args
        if (myType == -1) {
            System.out
                .println("type must be either COLLECTION or ITEM (-h for help)");
            System.exit(1);
        }

        if (destDirName == null) {
            System.out
                .println("destination directory must be set (-h for help)");
            System.exit(1);
        }

        if (seqStart == -1) {
            System.out
                .println("sequence start number must be set (-h for help)");
            System.exit(1);
        }

        if (myIDString == null) {
            System.out
                .println("ID must be set to either a database ID or a handle (-h for help)");
            System.exit(1);
        }

        Context c = new Context(Context.Mode.READ_ONLY);
        c.turnOffAuthorisationSystem();

        if (myType == Constants.ITEM) {
            // first, is myIDString a handle?
            if (myIDString.indexOf('/') != -1) {
                myItem = (Item) handleService.resolveToObject(c, myIDString);

                if ((myItem == null) || (myItem.getType() != Constants.ITEM)) {
                    myItem = null;
                }
            } else {
                myItem = itemService.find(c, UUID.fromString(myIDString));
            }

            if (myItem == null) {
                System.out
                    .println("Error, item cannot be found: " + myIDString);
            }
        } else {
            if (myIDString.indexOf('/') != -1) {
                // has a / must be a handle
                mycollection = (Collection) handleService.resolveToObject(c,
                                                                          myIDString);

                // ensure it's a collection
                if ((mycollection == null)
                    || (mycollection.getType() != Constants.COLLECTION)) {
                    mycollection = null;
                }
            } else if (myIDString != null) {
                mycollection = collectionService.find(c, UUID.fromString(myIDString));
            }

            if (mycollection == null) {
                System.out.println("Error, collection cannot be found: "
                                       + myIDString);
                System.exit(1);
            }
        }

        if (zip) {
            Iterator<Item> items;
            if (myItem != null) {
                List<Item> myItems = new ArrayList<>();
                myItems.add(myItem);
                items = myItems.iterator();
            } else {
                System.out.println("Exporting from collection: " + myIDString);
                items = itemService.findByCollection(c, mycollection);
            }
            itemExportService.exportAsZip(c, items, destDirName, zipFileName, seqStart, migrate, excludeBitstreams);
        } else {
            if (myItem != null) {
                // it's only a single item
                itemExportService
                    .exportItem(c, Collections.singletonList(myItem).iterator(), destDirName, seqStart, migrate,
                                excludeBitstreams);
            } else {
                System.out.println("Exporting from collection: " + myIDString);

                // it's a collection, so do a bunch of items


@@ -10,20 +10,17 @@ package org.dspace.app.itemexport;
/**
 * An exception that can be thrown when errors occur during item export
 */
public class ItemExportException extends Exception {
    public static final int EXPORT_TOO_LARGE = 0;

    private int reason;

    public ItemExportException(int r, String message) {
        super(message);
        reason = r;
    }

    public int getReason() {
        return reason;
    }
}


@@ -11,7 +11,8 @@ import org.dspace.app.itemexport.service.ItemExportService;
import org.dspace.services.factory.DSpaceServicesFactory;

/**
 * Abstract factory to get services for the itemexport package, use ItemExportServiceFactory.getInstance() to
 * retrieve an implementation
 *
 * @author kevinvandevelde at atmire.com
 */
@@ -19,7 +20,8 @@ public abstract class ItemExportServiceFactory {
    public abstract ItemExportService getItemExportService();

    public static ItemExportServiceFactory getInstance() {
        return DSpaceServicesFactory.getInstance().getServiceManager()
                                    .getServiceByName("itemExportServiceFactory", ItemExportServiceFactory.class);
    }
}


@@ -11,7 +11,8 @@ import org.dspace.app.itemexport.service.ItemExportService;
import org.springframework.beans.factory.annotation.Autowired;

/**
 * Factory implementation to get services for the itemexport package, use ItemExportServiceFactory.getInstance() to
 * retrieve an implementation
 *
 * @author kevinvandevelde at atmire.com
 */


@@ -7,16 +7,16 @@
 */
package org.dspace.app.itemexport.service;

import java.io.InputStream;
import java.util.Date;
import java.util.Iterator;
import java.util.List;
import javax.mail.MessagingException;

import org.dspace.content.DSpaceObject;
import org.dspace.content.Item;
import org.dspace.core.Context;
import org.dspace.eperson.EPerson;

/**
 * Item exporter to create simple AIPs for DSpace content. Currently exports
@@ -47,122 +47,109 @@ public interface ItemExportService {
    public static final String COMPRESSED_EXPORT_MIME_TYPE = "application/zip";

    public void exportItem(Context c, Iterator<Item> i,
                           String destDirName, int seqStart, boolean migrate,
                           boolean excludeBitstreams) throws Exception;

    /**
     * Method to perform an export and save it as a zip file.
     *
     * @param context           The DSpace Context
     * @param items             The items to export
     * @param destDirName       The directory to save the export in
     * @param zipFileName       The name to save the zip file as
     * @param seqStart          The first number in the sequence
     * @param migrate           Whether to use the migrate option or not
     * @param excludeBitstreams Whether to exclude bitstreams or not
     * @throws Exception if error
     */
    public void exportAsZip(Context context, Iterator<Item> items,
                            String destDirName, String zipFileName,
                            int seqStart, boolean migrate,
                            boolean excludeBitstreams) throws Exception;

    /**
     * Convenience method to export a single Community, Collection, or
     * Item
     *
     * @param dso     - the dspace object to export
     * @param context - the dspace context
     * @param migrate Whether to use the migrate option or not
     * @throws Exception if error
     */
    public void createDownloadableExport(DSpaceObject dso,
                                         Context context, boolean migrate) throws Exception;

    /**
     * Convenience method to export a List of dspace objects (Community,
     * Collection or Item)
     *
     * @param dsObjects - List containing dspace objects
     * @param context   - the dspace context
     * @param migrate   Whether to use the migrate option or not
     * @throws Exception if error
     */
    public void createDownloadableExport(List<DSpaceObject> dsObjects,
                                         Context context, boolean migrate) throws Exception;

    /**
     * Convenience method to export a single Community, Collection, or
     * Item
     *
     * @param dso             - the dspace object to export
     * @param context         - the dspace context
     * @param additionalEmail - cc email to use
     * @param migrate         Whether to use the migrate option or not
     * @throws Exception if error
     */
    public void createDownloadableExport(DSpaceObject dso,
                                         Context context, String additionalEmail, boolean migrate) throws Exception;

    /**
     * Convenience method to export a List of dspace objects (Community,
     * Collection or Item)
     *
     * @param dsObjects       - List containing dspace objects
     * @param context         - the dspace context
     * @param additionalEmail - cc email to use
     * @param migrate         Whether to use the migrate option or not
     * @throws Exception if error
     */
    public void createDownloadableExport(List<DSpaceObject> dsObjects,
                                         Context context, String additionalEmail, boolean migrate) throws Exception;

    /**
     * Create a file name based on the date and eperson
     *
     * @param type    Type of object (as string)
     * @param eperson - eperson who requested export and will be able to download it
     * @param date    - the date the export process was created
     * @return String representing the file name in the form of
     * 'export_yyy_MMM_dd_count_epersonID'
     * @throws Exception if error
     */
    public String assembleFileName(String type, EPerson eperson,
                                   Date date) throws Exception;

    /**
     * Use config file entry for org.dspace.app.itemexport.download.dir and id
     * of the eperson to create a download directory name
     *
     * @param ePerson - the eperson who requested export archive
     * @return String representing a directory in the form of
     * org.dspace.app.itemexport.download.dir/epersonID
     * @throws Exception if error
     */
    public String getExportDownloadDirectory(EPerson ePerson)
        throws Exception;

    /**
     * Returns config file entry for org.dspace.app.itemexport.work.dir
     *
     * @return String representing config file entry for
     * org.dspace.app.itemexport.work.dir
     * @throws Exception if error
     */
    public String getExportWorkDirectory() throws Exception;
@@ -170,49 +157,43 @@ public interface ItemExportService {
    /**
     * Used to read the export archive. Intended for download.
     *
     * @param fileName the name of the file to download
     * @param eperson  the eperson requesting the download
     * @return an input stream of the file to be downloaded
     * @throws Exception if error
     */
    public InputStream getExportDownloadInputStream(String fileName,
                                                    EPerson eperson) throws Exception;

    /**
     * Get the file size of the export archive represented by the file name.
     *
     * @param context  DSpace context
     * @param fileName name of the file to get the size.
     * @return size as long
     * @throws Exception if error
     */
    public long getExportFileSize(Context context, String fileName) throws Exception;

    /**
     * Get the last modified date of the export archive represented by the file name.
     *
     * @param context  DSpace context
     * @param fileName name of the file to get the size.
     * @return date as long
     * @throws Exception if error
     * @see java.io.File#lastModified()
     */
    public long getExportFileLastModified(Context context, String fileName)
        throws Exception;

    /**
     * The file name of the export archive contains the eperson id of the person
     * who created it. When requested for download, this method can check if the
     * person requesting it is the same one that created it.
     *
     * @param context  dspace context
     * @param fileName the file name to check auths for
     * @return true if it is the same person, false otherwise
     */
    public boolean canDownload(Context context, String fileName);
@@ -223,19 +204,18 @@ public interface ItemExportService {
     *
     * @param eperson EPerson object
     * @return a list of file names representing export archives that have been
     * processed
     * @throws Exception if error
     */
    public List<String> getExportsAvailable(EPerson eperson)
        throws Exception;

    /**
     * A clean up method that is run before a new export archive is created. It
     * uses the config file entry 'org.dspace.app.itemexport.life.span.hours' to
     * determine if the current exports are too old and need purging
     *
     * @param eperson - the eperson to clean up
     * @throws Exception if error
     */
    public void deleteOldExportArchives(EPerson eperson) throws Exception;
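The javadoc above only states that `org.dspace.app.itemexport.life.span.hours` bounds the age of kept archives. A self-contained sketch of the age test that implies, with the method name, signature, and millisecond arithmetic all assumptions for illustration rather than the actual DSpace implementation:

```java
public class AgeCheckDemo {
    // Hypothetical age check: an archive whose last-modified time is more
    // than lifeSpanHours before "now" is eligible for deletion.
    public static boolean isExpired(long lastModifiedMillis, long nowMillis, int lifeSpanHours) {
        long ageMillis = nowMillis - lastModifiedMillis;
        return ageMillis > lifeSpanHours * 60L * 60L * 1000L;
    }

    public static void main(String[] args) {
        long hour = 60L * 60L * 1000L;
        System.out.println(isExpired(0L, 49 * hour, 48)); // true: 49h old, 48h limit
        System.out.println(isExpired(0L, 47 * hour, 48)); // false: still within the span
    }
}
```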
@@ -256,17 +236,14 @@ public interface ItemExportService {
     * communication with email instead. Send a success email once the export
     * archive is complete and ready for download
     *
     * @param context  - the current Context
     * @param eperson  - eperson to send the email to
     * @param fileName - the file name to be downloaded. It is added to the url in
     *                 the email
     * @throws MessagingException if error
     */
    public void emailSuccessMessage(Context context, EPerson eperson,
                                    String fileName) throws MessagingException;

    /**
     * Since the archive is created in a new thread we are unable to communicate
@@ -274,19 +251,18 @@ public interface ItemExportService {
     * communication with email instead. Send an error email if the export
     * archive fails
     *
     * @param eperson - EPerson to send the error message to
     * @param error   - the error message
     * @throws MessagingException if error
     */
    public void emailErrorMessage(EPerson eperson, String error)
        throws MessagingException;

    /**
     * Zip source to target
     *
     * @param strSource source file
     * @param target    target file
     * @throws Exception if error
     */
    public void zip(String strSource, String target) throws Exception;
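The `assembleFileName` contract documented above can be sketched as a standalone method. Only the output shape `'export_yyy_MMM_dd_count_epersonID'` comes from the javadoc; the four-digit year pattern, the `Locale.US` month names, and the plain string `epersonId`/`count` parameters are assumptions for illustration, not the real signature:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class FileNameSketch {
    // Hypothetical stand-in for ItemExportService.assembleFileName():
    // builds "type_yyyy_MMM_dd_count_epersonID" from its parts.
    public static String assembleFileName(String type, String epersonId, Date date, int count) {
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy_MMM_dd", Locale.US);
        return type + "_" + sdf.format(date) + "_" + count + "_" + epersonId;
    }

    public static void main(String[] args) {
        String name = assembleFileName("export", "42", new Date(), 0);
        System.out.println(name); // e.g. export_2018_May_20_0_42
    }
}
```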


@@ -7,99 +7,100 @@
 */
package org.dspace.app.itemimport;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import gr.ekt.bte.core.DataLoader;
import gr.ekt.bte.core.TransformationEngine;
import gr.ekt.bte.dataloader.FileDataLoader;

/**
 * This class acts as a Service in the procedure to batch import using the Biblio-Transformation-Engine
 */
public class BTEBatchImportService {

    TransformationEngine transformationEngine;
    Map<String, DataLoader> dataLoaders = new HashMap<String, DataLoader>();
    Map<String, String> outputMap = new HashMap<String, String>();

    /**
     * Default constructor
     */
    public BTEBatchImportService() {
        super();
    }

    /**
     * Setter method for dataLoaders parameter
     *
     * @param dataLoaders map of data loaders
     */
    public void setDataLoaders(Map<String, DataLoader> dataLoaders) {
        this.dataLoaders = dataLoaders;
    }

    /**
     * Get data loaders
     *
     * @return the map of DataLoaders
     */
    public Map<String, DataLoader> getDataLoaders() {
        return dataLoaders;
    }

    /**
     * Get output map
     *
     * @return the outputMapping
     */
    public Map<String, String> getOutputMap() {
        return outputMap;
    }

    /**
     * Setter method for the outputMapping
     *
     * @param outputMap the output mapping
     */
    public void setOutputMap(Map<String, String> outputMap) {
        this.outputMap = outputMap;
    }

    /**
     * Get transformation engine
     *
     * @return transformation engine
     */
    public TransformationEngine getTransformationEngine() {
        return transformationEngine;
    }

    /**
     * Set transformation engine
     *
     * @param transformationEngine transformation engine
     */
    public void setTransformationEngine(TransformationEngine transformationEngine) {
        this.transformationEngine = transformationEngine;
    }

    /**
     * Getter of file data loaders
     *
     * @return List of file data loaders
     */
    public List<String> getFileDataLoaders() {
        List<String> result = new ArrayList<String>();

        for (String key : dataLoaders.keySet()) {
            DataLoader dl = dataLoaders.get(key);
            if (dl instanceof FileDataLoader) {
                result.add(key);
            }
        }
        return result;
    }
}


@@ -20,198 +20,210 @@ import java.util.List;
/**
 * @author kstamatis
 */
public class BatchUpload {

    private Date date;
    private File dir;
    private boolean successful;
    private int itemsImported;
    private int totalItems = 0;
    private List<String> handlesImported = new ArrayList<String>();
    private String errorMsg = "";
    private String errorMsgHTML = "";

    /**
     * Initialize with directory
     *
     * @param dirPath directory path
     */
    public BatchUpload(String dirPath) {
        this.initializeWithFile(new File(dirPath));
    }

    /**
     * Initialize with directory
     *
     * @param dir directory path
     */
    public BatchUpload(File dir) {
        this.initializeWithFile(dir);
    }

    /**
     * Initialize with directory
     *
     * @param dir directory path
     */
    private void initializeWithFile(File dir) {
        this.dir = dir;

        String dirName = dir.getName();
        long timeMillis = Long.parseLong(dirName);
        Calendar calendar = new GregorianCalendar();
        calendar.setTimeInMillis(timeMillis);
        this.date = calendar.getTime();

        try {
            this.itemsImported = countLines(dir + File.separator + "mapfile");
        } catch (IOException e) {
            e.printStackTrace();
        }

        for (File file : dir.listFiles()) {
            if (file.isDirectory()) {
                this.totalItems = file.list().length;
            }
        }

        this.successful = this.totalItems == this.itemsImported;

        //Parse possible error message
        File errorFile = new File(dir + File.separator + "error.txt");
        if (errorFile.exists()) {
            try {
                readFile(dir + File.separator + "error.txt");
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }

    /**
     * Count lines in file
     *
     * @param filename file name
     * @return lines in file
     * @throws IOException if IO error
     */
    private int countLines(String filename) throws IOException {
        LineNumberReader reader = new LineNumberReader(new FileReader(filename));
        int cnt = 0;
        String lineRead = "";
        while ((lineRead = reader.readLine()) != null) {
            String[] parts = lineRead.split(" ");
            if (parts.length > 1) {
                handlesImported.add(parts[1].trim());
            } else {
                handlesImported.add(lineRead);
            }
        }

        cnt = reader.getLineNumber();
        reader.close();
        return cnt;
    }

    /**
     * Read a file
     *
     * @param filename file name
     * @throws IOException if IO error
     */
    private void readFile(String filename) throws IOException {
        LineNumberReader reader = new LineNumberReader(new FileReader(filename));
        String lineRead = "";
        while ((lineRead = reader.readLine()) != null) {
            this.errorMsg += lineRead + "\n";

            if (lineRead.startsWith("\tat ")) {
                this.errorMsgHTML += "<span class=\"batchimport-error-tab\">" + lineRead + "</span><br/>";
            } else if (lineRead.startsWith("Caused by")) {
                this.errorMsgHTML += "<span class=\"batchimport-error-caused\">" + lineRead + "</span><br/>";
            } else {
                this.errorMsgHTML += lineRead + "<br/>";
            }
        }
        reader.close();
    }

    /**
     * Get date
     *
     * @return Date
     */
    public Date getDate() {
        return date;
    }

    /**
     * Get path to directory
     *
     * @return directory
     */
    public File getDir() {
        return dir;
    }

    /**
     * Whether successful
     *
     * @return true or false
     */
    public boolean isSuccessful() {
        return successful;
    }

    /**
     * Get items imported
     *
     * @return number of items
     */
    public int getItemsImported() {
        return itemsImported;
    }

    /**
     * Get total items
     *
     * @return total
     */
    public int getTotalItems() {
        return totalItems;
    }

    /**
     * Get formatted date (DD/MM/YY)
     *
     * @return date as string
     */
    public String getDateFormatted() {
        SimpleDateFormat df = new SimpleDateFormat("dd/MM/yyyy - HH:mm");
        return df.format(date);
    }

    /**
     * Get handles of imported files
     *
     * @return list of handles
     */
    public List<String> getHandlesImported() {
        return handlesImported;
    }

    /**
     * Get error message
     *
     * @return error message
     */
    public String getErrorMsg() {
        return errorMsg;
    }

    /**
     * Get error message as HTML
     *
     * @return error message string as HTML
     */
    public String getErrorMsgHTML() {
        return errorMsgHTML;
    }
}
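The mapfile parsing in `countLines()` (split each line on a space, keep the second token as the imported handle, fall back to the whole line) can be checked against an in-memory reader. A `BufferedReader` over a `StringReader` replaces the `FileReader` so no file is needed; the sample mapfile contents are invented:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class MapfileDemo {
    // Same parsing rule as BatchUpload.countLines(): split on a space and
    // collect the second token as the handle, falling back to the whole line.
    public static List<String> parseHandles(String mapfile) throws IOException {
        List<String> handles = new ArrayList<String>();
        BufferedReader reader = new BufferedReader(new StringReader(mapfile));
        String line;
        while ((line = reader.readLine()) != null) {
            String[] parts = line.split(" ");
            if (parts.length > 1) {
                handles.add(parts[1].trim());
            } else {
                handles.add(line);
            }
        }
        reader.close();
        return handles;
    }

    public static void main(String[] args) throws IOException {
        List<String> handles = parseHandles("item_000 123456789/10\nitem_001 123456789/11");
        System.out.println(handles); // [123456789/10, 123456789/11]
    }
}
```

Since `BatchUpload` marks an upload successful only when the mapfile line count equals the number of item directories, this per-line rule is what ultimately drives `isSuccessful()`.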


@@ -7,7 +7,17 @@
 */
package org.dspace.app.itemimport;

import java.io.File;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.UUID;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.dspace.app.itemimport.factory.ItemImportServiceFactory;
import org.dspace.app.itemimport.service.ItemImportService;
import org.dspace.content.Collection;
@@ -21,12 +31,6 @@ import org.dspace.eperson.service.EPersonService;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;

/**
 * Import items into DSpace. The conventional use is upload files by copying
 * them. DSpace writes the item's bitstreams into its assetstore. Metadata is
@@ -47,12 +51,17 @@ public class ItemImportCLITool {
    private static boolean template = false;

    private static final CollectionService collectionService = ContentServiceFactory.getInstance()
        .getCollectionService();
    private static final EPersonService epersonService = EPersonServiceFactory.getInstance().getEPersonService();
    private static final HandleService handleService = HandleServiceFactory.getInstance().getHandleService();

    /**
     * Default constructor
     */
    private ItemImportCLITool() { }

    public static void main(String[] argv) throws Exception {
        Date startTime = new Date();
        int status = 0;
@@ -66,24 +75,24 @@ public class ItemImportCLITool {
        options.addOption("b", "add-bte", false, "add items to DSpace via Biblio-Transformation-Engine (BTE)");
        options.addOption("r", "replace", false, "replace items in mapfile");
        options.addOption("d", "delete", false,
                          "delete items listed in mapfile");
        options.addOption("i", "inputtype", true, "input type in case of BTE import");
        options.addOption("s", "source", true, "source of items (directory)");
        options.addOption("z", "zip", true, "name of zip file");
        options.addOption("c", "collection", true,
                          "destination collection(s) Handle or database ID");
        options.addOption("m", "mapfile", true, "mapfile items in mapfile");
        options.addOption("e", "eperson", true,
                          "email of eperson doing importing");
        options.addOption("w", "workflow", false,
                          "send submission through collection's workflow");
        options.addOption("n", "notify", false,
                          "if sending submissions through the workflow, send notification emails");
        options.addOption("t", "test", false,
                          "test run - do not actually import items");
        options.addOption("p", "template", false, "apply template");
        options.addOption("R", "resume", false,
                          "resume a failed import (add only)");
        options.addOption("q", "quiet", false, "don't display metadata");
        options.addOption("h", "help", false, "help");
@@ -106,15 +115,19 @@ public class ItemImportCLITool {
         HelpFormatter myhelp = new HelpFormatter();
         myhelp.printHelp("ItemImport\n", options);
         System.out
             .println("\nadding items: ItemImport -a -e eperson -c collection -s sourcedir -m mapfile");
-        System.out
-            .println("\nadding items from zip file: ItemImport -a -e eperson -c collection -s sourcedir -z filename.zip -m mapfile");
+        System.out
+            .println(
+                "\nadding items from zip file: ItemImport -a -e eperson -c collection -s sourcedir -z " +
+                "filename.zip -m mapfile");
         System.out
             .println("replacing items: ItemImport -r -e eperson -c collection -s sourcedir -m mapfile");
         System.out
             .println("deleting items: ItemImport -d -e eperson -m mapfile");
-        System.out
-            .println("If multiple collections are specified, the first collection will be the one that owns the item.");
+        System.out
+            .println(
+                "If multiple collections are specified, the first collection will be the one that owns the " +
+                "item.");
         System.exit(0);
     }
@@ -155,30 +168,26 @@ public class ItemImportCLITool {
             template = true;
         }
-        if (line.hasOption('s')) // source
-        {
+        if (line.hasOption('s')) { // source
             sourcedir = line.getOptionValue('s');
         }
-        if (line.hasOption('m')) // mapfile
-        {
+        if (line.hasOption('m')) { // mapfile
             mapfile = line.getOptionValue('m');
         }
-        if (line.hasOption('e')) // eperson
-        {
+        if (line.hasOption('e')) { // eperson
             eperson = line.getOptionValue('e');
         }
-        if (line.hasOption('c')) // collections
-        {
+        if (line.hasOption('c')) { // collections
             collections = line.getOptionValues('c');
         }
         if (line.hasOption('R')) {
             isResume = true;
             System.out
                 .println("**Resume import** - attempting to import items not already imported");
         }
         if (line.hasOption('q')) {
@@ -189,7 +198,7 @@ public class ItemImportCLITool {
         String zipfilename = "";
         if (line.hasOption('z')) {
             zip = true;
-            zipfilename = sourcedir + System.getProperty("file.separator") + line.getOptionValue('z');
+            zipfilename = line.getOptionValue('z');
         }
         //By default assume collections will be given on the command line
@@ -198,26 +207,26 @@ public class ItemImportCLITool {
         // must have a command set
         if (command == null) {
             System.out
                 .println("Error - must run with either add, replace, or remove (run with -h flag for details)");
             System.exit(1);
         } else if ("add".equals(command) || "replace".equals(command)) {
             if (sourcedir == null) {
                 System.out
                     .println("Error - a source directory containing items must be set");
                 System.out.println(" (run with -h flag for details)");
                 System.exit(1);
             }
             if (mapfile == null) {
                 System.out
                     .println("Error - a map file to hold importing results must be specified");
                 System.out.println(" (run with -h flag for details)");
                 System.exit(1);
             }
             if (eperson == null) {
                 System.out
                     .println("Error - an eperson to do the importing must be specified");
                 System.out.println(" (run with -h flag for details)");
                 System.exit(1);
             }
@@ -227,18 +236,19 @@ public class ItemImportCLITool {
                 commandLineCollections = false;
             }
         } else if ("add-bte".equals(command)) {
-            //Source dir can be null, the user can specify the parameters for his loader in the Spring XML configuration file
+            //Source dir can be null, the user can specify the parameters for his loader in the Spring XML
+            // configuration file
             if (mapfile == null) {
                 System.out
                     .println("Error - a map file to hold importing results must be specified");
                 System.out.println(" (run with -h flag for details)");
                 System.exit(1);
             }
             if (eperson == null) {
                 System.out
                     .println("Error - an eperson to do the importing must be specified");
                 System.out.println(" (run with -h flag for details)");
                 System.exit(1);
             }
@@ -250,14 +260,16 @@ public class ItemImportCLITool {
             if (bteInputType == null) {
-                System.out
-                    .println("Error - an input type (tsv, csv, ris, endnote, bibtex or any other type you have specified in BTE Spring XML configuration file) must be specified");
+                System.out
+                    .println(
+                        "Error - an input type (tsv, csv, ris, endnote, bibtex or any other type you have " +
+                        "specified in BTE Spring XML configuration file) must be specified");
                 System.out.println(" (run with -h flag for details)");
                 System.exit(1);
             }
         } else if ("delete".equals(command)) {
             if (eperson == null) {
                 System.out
                     .println("Error - an eperson to do the importing must be specified");
                 System.exit(1);
             }
@@ -270,7 +282,7 @@ public class ItemImportCLITool {
         // can only resume for adds
         if (isResume && !"add".equals(command) && !"add-bte".equals(command)) {
             System.out
                 .println("Error - resume option only works with the --add or the --add-bte commands");
             System.exit(1);
         }
@@ -280,9 +292,9 @@ public class ItemImportCLITool {
         if (!isResume && "add".equals(command) && myFile.exists()) {
             System.out.println("Error - the mapfile " + mapfile
                 + " already exists.");
             System.out
                 .println("Either delete it or use --resume if attempting to resume an aborted import.");
             System.exit(1);
         }
@@ -294,7 +306,7 @@ public class ItemImportCLITool {
         myloader.setQuiet(isQuiet);
         // create a context
-        Context c = new Context();
+        Context c = new Context(Context.Mode.BATCH_EDIT);
         // find the EPerson, assign to context
         EPerson myEPerson = null;
@@ -330,24 +342,22 @@ public class ItemImportCLITool {
                     // string has a / so it must be a handle - try and resolve
                     // it
                     mycollections.add((Collection) handleService
                         .resolveToObject(c, collections[i]));
                     // resolved, now make sure it's a collection
                     if ((mycollections.get(i) == null)
                         || (mycollections.get(i).getType() != Constants.COLLECTION)) {
                         mycollections.set(i, null);
                     }
-                }
-                // not a handle, try and treat it as an integer collection
-                // database ID
-                else if (collections[i] != null) {
+                } else if (collections[i] != null) {
+                    // not a handle, try and treat it as an integer collection database ID
                     mycollections.set(i, collectionService.find(c, UUID.fromString(collections[i])));
                 }
                 // was the collection valid?
                 if (mycollections.get(i) == null) {
                     throw new IllegalArgumentException("Cannot resolve "
                         + collections[i] + " to collection");
                 }
                 // print progress info
@@ -358,7 +368,7 @@ public class ItemImportCLITool {
                 }
                 System.out.println(owningPrefix + " Collection: "
                     + mycollections.get(i).getName());
             }
         } // end of validating collections
@@ -394,11 +404,13 @@ public class ItemImportCLITool {
         try {
             if (zip) {
                 System.gc();
-                System.out.println("Deleting temporary zip directory: " + myloader.getTempWorkDirFile().getAbsolutePath());
+                System.out.println(
+                    "Deleting temporary zip directory: " + myloader.getTempWorkDirFile().getAbsolutePath());
                 myloader.cleanupZipTemp();
             }
         } catch (Exception ex) {
-            System.out.println("Unable to delete temporary zip archive location: " + myloader.getTempWorkDirFile().getAbsolutePath());
+            System.out.println("Unable to delete temporary zip archive location: " + myloader.getTempWorkDirFile()
+                .getAbsolutePath());
         }
@@ -409,7 +421,9 @@ public class ItemImportCLITool {
             Date endTime = new Date();
             System.out.println("Started: " + startTime.getTime());
             System.out.println("Ended: " + endTime.getTime());
-            System.out.println("Elapsed time: " + ((endTime.getTime() - startTime.getTime()) / 1000) + " secs (" + (endTime.getTime() - startTime.getTime()) + " msecs)");
+            System.out.println(
+                "Elapsed time: " + ((endTime.getTime() - startTime.getTime()) / 1000) + " secs (" + (endTime
+                    .getTime() - startTime.getTime()) + " msecs)");
         }
         System.exit(status);
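The elapsed-time hunk above only rewraps the arithmetic: the millisecond delta between two `Date` timestamps is printed whole and integer-divided by 1000 for whole seconds. As a standalone illustration of that computation (the class and method names here are hypothetical, not part of the commit):

```java
// Hypothetical sketch of the elapsed-time formatting used by ItemImportCLITool.
public class ElapsedTimeDemo {
    static String elapsed(long startMillis, long endMillis) {
        long deltaMs = endMillis - startMillis;
        // Integer division by 1000 truncates to whole seconds, matching the CLI output.
        return "Elapsed time: " + (deltaMs / 1000) + " secs (" + deltaMs + " msecs)";
    }

    public static void main(String[] args) {
        System.out.println(elapsed(1000L, 4500L));
    }
}
```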


@@ -11,7 +11,8 @@ import org.dspace.app.itemimport.service.ItemImportService;
 import org.dspace.services.factory.DSpaceServicesFactory;
 /**
- * Abstract factory to get services for the itemimport package, use ItemImportService.getInstance() to retrieve an implementation
+ * Abstract factory to get services for the itemimport package, use ItemImportService.getInstance() to retrieve an
+ * implementation
  *
  * @author kevinvandevelde at atmire.com
  */
@@ -19,7 +20,8 @@ public abstract class ItemImportServiceFactory {
     public abstract ItemImportService getItemImportService();
-    public static ItemImportServiceFactory getInstance(){
-        return DSpaceServicesFactory.getInstance().getServiceManager().getServiceByName("itemImportServiceFactory", ItemImportServiceFactory.class);
+    public static ItemImportServiceFactory getInstance() {
+        return DSpaceServicesFactory.getInstance().getServiceManager()
+            .getServiceByName("itemImportServiceFactory", ItemImportServiceFactory.class);
     }
 }


@@ -11,7 +11,8 @@ import org.dspace.app.itemimport.service.ItemImportService;
 import org.springframework.beans.factory.annotation.Autowired;
 /**
- * Factory implementation to get services for the itemimport package, use ItemImportService.getInstance() to retrieve an implementation
+ * Factory implementation to get services for the itemimport package, use ItemImportService.getInstance() to retrieve
+ * an implementation
  *
  * @author kevinvandevelde at atmire.com
  */


@@ -7,16 +7,16 @@
  */
 package org.dspace.app.itemimport.service;
+import java.io.File;
+import java.io.IOException;
+import java.util.List;
+import javax.mail.MessagingException;
 import org.dspace.app.itemimport.BatchUpload;
 import org.dspace.content.Collection;
 import org.dspace.core.Context;
 import org.dspace.eperson.EPerson;
-import javax.mail.MessagingException;
-import java.io.File;
-import java.io.IOException;
-import java.util.List;
 /**
  * Import items into DSpace. The conventional use is upload files by copying
  * them. DSpace writes the item's bitstreams into its assetstore. Metadata is
@@ -37,30 +37,32 @@ public interface ItemImportService {
     /**
-     *
      * @param c DSpace Context
      * @param mycollections List of Collections
      * @param sourceDir source location
      * @param mapFile map file
      * @param template whether to use template item
      * @throws Exception if error
      */
-    public void addItemsAtomic(Context c, List<Collection> mycollections, String sourceDir, String mapFile, boolean template) throws Exception;
+    public void addItemsAtomic(Context c, List<Collection> mycollections, String sourceDir, String mapFile,
+                               boolean template) throws Exception;
     /**
      * Add items
+     *
      * @param c DSpace Context
      * @param mycollections List of Collections
      * @param sourceDir source location
      * @param mapFile map file
      * @param template whether to use template item
      * @throws Exception if error
      */
     public void addItems(Context c, List<Collection> mycollections,
                          String sourceDir, String mapFile, boolean template) throws Exception;
     /**
      * Unzip a file
+     *
      * @param zipfile file
      * @return unzip location
      * @throws IOException if error
@@ -69,6 +71,7 @@ public interface ItemImportService {
     /**
      * Unzip a file to a destination
+     *
      * @param zipfile file
      * @param destDir destination directory
      * @return unzip location
@@ -78,7 +81,8 @@ public interface ItemImportService {
     /**
      * Unzip a file in a specific source directory
+     *
      * @param sourcedir source directory
      * @param zipfilename file name
      * @return unzip location
      * @throws IOException if error
@@ -86,18 +90,19 @@ public interface ItemImportService {
     public String unzip(String sourcedir, String zipfilename) throws IOException;
     /**
-     *
      * Given a public URL to a zip file that has the Simple Archive Format, this method imports the contents to DSpace
+     *
      * @param url The public URL of the zip file
      * @param owningCollection The owning collection the items will belong to
      * @param collections The collections the created items will be inserted to, apart from the owning one
      * @param resumeDir In case of a resume request, the directory that containsthe old mapfile and data
      * @param inputType The input type of the data (bibtex, csv, etc.), in case of local file
      * @param context The context
      * @param template whether to use template item
      * @throws Exception if error
      */
-    public void processUIImport(String url, Collection owningCollection, String[] collections, String resumeDir, String inputType, Context context, boolean template) throws Exception;
+    public void processUIImport(String url, Collection owningCollection, String[] collections, String resumeDir,
+                                String inputType, Context context, boolean template) throws Exception;
     /**
      * Since the BTE batch import is done in a new thread we are unable to communicate
@@ -105,16 +110,13 @@ public interface ItemImportService {
      * communication with email instead. Send a success email once the batch
      * import is complete
      *
-     * @param context
-     *            - the current Context
-     * @param eperson
-     *            - eperson to send the email to
-     * @param fileName
-     *            - the filepath to the mapfile created by the batch import
+     * @param context - the current Context
+     * @param eperson - eperson to send the email to
+     * @param fileName - the filepath to the mapfile created by the batch import
      * @throws MessagingException if error
      */
     public void emailSuccessMessage(Context context, EPerson eperson,
                                     String fileName) throws MessagingException;
     /**
      * Since the BTE batch import is done in a new thread we are unable to communicate
@@ -122,37 +124,38 @@ public interface ItemImportService {
      * communication with email instead. Send an error email if the batch
      * import fails
      *
-     * @param eperson
-     *            - EPerson to send the error message to
-     * @param error
-     *            - the error message
+     * @param eperson - EPerson to send the error message to
+     * @param error - the error message
      * @throws MessagingException if error
      */
     public void emailErrorMessage(EPerson eperson, String error)
         throws MessagingException;
     /**
      * Get imports available for a person
+     *
      * @param eperson EPerson object
      * @return List of batch uploads
      * @throws Exception if error
      */
     public List<BatchUpload> getImportsAvailable(EPerson eperson)
         throws Exception;
     /**
      * Get import upload directory
+     *
      * @param ePerson EPerson object
      * @return directory
      * @throws Exception if error
      */
     public String getImportUploadableDirectory(EPerson ePerson)
         throws Exception;
     /**
      * Delete a batch by ID
+     *
      * @param c DSpace Context
      * @param uploadId identifier
      * @throws Exception if error
      */
@@ -160,18 +163,21 @@ public interface ItemImportService {
     /**
      * Replace items
+     *
      * @param c DSpace Context
      * @param mycollections List of Collections
      * @param sourcedir source directory
      * @param mapfile map file
      * @param template whether to use template item
      * @throws Exception if error
      */
-    public void replaceItems(Context c, List<Collection> mycollections, String sourcedir, String mapfile, boolean template) throws Exception;
+    public void replaceItems(Context c, List<Collection> mycollections, String sourcedir, String mapfile,
+                             boolean template) throws Exception;
     /**
      * Delete items via mapfile
+     *
      * @param c DSpace Context
      * @param mapfile map file
      * @throws Exception if error
      */
@@ -179,28 +185,33 @@ public interface ItemImportService {
     /**
      * Add items
+     *
      * @param c DSpace Context
      * @param mycollections List of Collections
      * @param sourcedir source directory
      * @param mapfile map file
      * @param template whether to use template item
      * @param bteInputType The input type of the data (bibtex, csv, etc.), in case of local file
      * @param workingDir working directory
      * @throws Exception if error
      */
-    public void addBTEItems(Context c, List<Collection> mycollections, String sourcedir, String mapfile, boolean template, String bteInputType, String workingDir) throws Exception;
+    public void addBTEItems(Context c, List<Collection> mycollections, String sourcedir, String mapfile,
+                            boolean template, String bteInputType, String workingDir) throws Exception;
     /**
      * Get temporary work directory
+     *
      * @return directory as string
      */
     public String getTempWorkDir();
     /**
      * Get temporary work directory (as File)
+     *
      * @return directory as File
+     * @throws java.io.IOException if the directory cannot be created.
      */
-    public File getTempWorkDirFile();
+    public File getTempWorkDirFile() throws IOException;
     /**
      * Cleanup
@@ -209,18 +220,21 @@ public interface ItemImportService {
     /**
      * Set test flag
+     *
      * @param isTest true or false
      */
     public void setTest(boolean isTest);
     /**
      * Set resume flag
+     *
      * @param isResume true or false
      */
     public void setResume(boolean isResume);
     /**
      * Set use workflow
+     *
      * @param useWorkflow whether to enable workflow
      */
     public void setUseWorkflow(boolean useWorkflow);
@@ -232,6 +246,7 @@ public interface ItemImportService {
     /**
      * Set quiet flag
+     *
      * @param isQuiet true or false
      */
     public void setQuiet(boolean isQuiet);


@@ -23,75 +23,71 @@ import org.springframework.beans.factory.annotation.Autowired;
 /**
  * This is an item marking Strategy class that tries to mark an item availability
  * based on the existence of bitstreams within the ORIGINAL bundle.
  *
  * @author Kostas Stamatis
- *
  */
 public class ItemMarkingAvailabilityBitstreamStrategy implements ItemMarkingExtractor {
     private String availableImageName;
     private String nonAvailableImageName;
     @Autowired(required = true)
     protected ItemService itemService;
     public ItemMarkingAvailabilityBitstreamStrategy() {
     }
     @Override
     public ItemMarkingInfo getItemMarkingInfo(Context context, Item item)
         throws SQLException {
         List<Bundle> bundles = itemService.getBundles(item, "ORIGINAL");
-        if (bundles.size() == 0){
+        if (bundles.size() == 0) {
             ItemMarkingInfo markInfo = new ItemMarkingInfo();
             markInfo.setImageName(nonAvailableImageName);
             return markInfo;
-        }
-        else {
+        } else {
             Bundle originalBundle = bundles.iterator().next();
-            if (originalBundle.getBitstreams().size() == 0){
+            if (originalBundle.getBitstreams().size() == 0) {
                 ItemMarkingInfo markInfo = new ItemMarkingInfo();
                 markInfo.setImageName(nonAvailableImageName);
                 return markInfo;
-            }
-            else {
+            } else {
                 Bitstream bitstream = originalBundle.getBitstreams().get(0);
                 ItemMarkingInfo signInfo = new ItemMarkingInfo();
                 signInfo.setImageName(availableImageName);
                 signInfo.setTooltip(bitstream.getName());
                 String bsLink = "";
                 bsLink = bsLink + "bitstream/"
                     + item.getHandle() + "/"
                     + bitstream.getSequenceID() + "/";
                 try {
                     bsLink = bsLink + Util.encodeBitstreamName(bitstream.getName(), Constants.DEFAULT_ENCODING);
                 } catch (UnsupportedEncodingException e) {
                     e.printStackTrace();
                 }
                 signInfo.setLink(bsLink);
                 return signInfo;
             }
         }
     }
     public void setAvailableImageName(String availableImageName) {
         this.availableImageName = availableImageName;
     }
     public void setNonAvailableImageName(String nonAvailableImageName) {
         this.nonAvailableImageName = nonAvailableImageName;
     }
 }


@@ -18,33 +18,32 @@ import org.dspace.core.Context;
 /**
  * This is an item marking Strategy class that tries to mark an item
  * based on the collection the items belong to
  *
  * @author Kostas Stamatis
- *
  */
 public class ItemMarkingCollectionStrategy implements ItemMarkingExtractor {
     Map<String, ItemMarkingInfo> mapping = new HashMap<String, ItemMarkingInfo>();
     public ItemMarkingCollectionStrategy() {
     }
     @Override
     public ItemMarkingInfo getItemMarkingInfo(Context context, Item item)
         throws SQLException {
-        if (mapping!=null){
-            for (Collection collection : item.getCollections()){
-                if (mapping.containsKey(collection.getHandle())){
+        if (mapping != null) {
+            for (Collection collection : item.getCollections()) {
+                if (mapping.containsKey(collection.getHandle())) {
                     return mapping.get(collection.getHandle());
                 }
             }
         }
         return null;
     }
     public void setMapping(Map<String, ItemMarkingInfo> mapping) {
         this.mapping = mapping;
     }
 }


@@ -14,11 +14,10 @@ import org.dspace.core.Context;
 /**
  * Interface to abstract the strategy for item signing
  *
  * @author Kostas Stamatis
- *
  */
 public interface ItemMarkingExtractor {
     public ItemMarkingInfo getItemMarkingInfo(Context context, Item item)
         throws SQLException;
 }


@@ -9,49 +9,48 @@ package org.dspace.app.itemmarking;
 /**
  * Simple DTO to transfer data about the marking info for an item
  *
  * @author Kostas Stamatis
- *
  */
 public class ItemMarkingInfo {
     private String imageName;
     private String classInfo;
     private String tooltip;
     private String link;
     public ItemMarkingInfo() {
         super();
     }
     public String getImageName() {
         return imageName;
     }
     public void setImageName(String imageName) {
         this.imageName = imageName;
     }
     public String getTooltip() {
         return tooltip;
     }
     public void setTooltip(String tooltip) {
         this.tooltip = tooltip;
     }
     public String getLink() {
         return link;
     }
     public void setLink(String link) {
         this.link = link;
     }
     public String getClassInfo() {
         return classInfo;
     }
     public void setClassInfo(String classInfo) {
         this.classInfo = classInfo;
     }
 }


@@ -22,46 +22,43 @@ import org.springframework.beans.factory.annotation.Autowired;
 * This is an item marking Strategy class that tries to mark an item
 * based on the existence of a specific value within the values of a specific
 * metadata field
 *
 * @author Kostas Stamatis
 */
public class ItemMarkingMetadataStrategy implements ItemMarkingExtractor {

    @Autowired(required = true)
    protected ItemService itemService;

    private String metadataField;
    Map<String, ItemMarkingInfo> mapping = new HashMap<String, ItemMarkingInfo>();

    public ItemMarkingMetadataStrategy() {
    }

    @Override
    public ItemMarkingInfo getItemMarkingInfo(Context context, Item item)
        throws SQLException {
        if (metadataField != null && mapping != null) {
            List<MetadataValue> vals = itemService.getMetadataByMetadataString(item, metadataField);
            if (vals.size() > 0) {
                for (MetadataValue value : vals) {
                    String type = value.getValue();
                    if (mapping.containsKey(type)) {
                        return mapping.get(type);
                    }
                }
            }
        }
        return null;
    }

    public void setMetadataField(String metadataField) {
        this.metadataField = metadataField;
    }

    public void setMapping(Map<String, ItemMarkingInfo> mapping) {
        this.mapping = mapping;
    }
}
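The lookup above returns the marking info for the first metadata value that appears as a key in the configured mapping. A minimal, self-contained sketch of that behaviour (types simplified to `String` standing in for `ItemMarkingInfo`; the class and method names here are illustrative, not DSpace API):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Standalone sketch of the lookup in ItemMarkingMetadataStrategy:
// the first metadata value found as a key in the mapping wins.
public class MarkingLookupDemo {

    static String firstMarking(List<String> values, Map<String, String> mapping) {
        for (String v : values) {
            if (mapping.containsKey(v)) {
                return mapping.get(v);
            }
        }
        return null; // no value matched: the item stays unmarked
    }

    public static void main(String[] args) {
        Map<String, String> mapping = new HashMap<>();
        mapping.put("restricted", "lock-icon");
        System.out.println(firstMarking(List.of("open", "restricted"), mapping)); // lock-icon
    }
}
```

Note that ordering matters: if two values both appear in the mapping, the one that comes first in the item's metadata decides the marking.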


@@ -12,80 +12,70 @@ import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Container for UpdateActions
 * Order of actions is very important for correct processing. This implementation
 * supports an iterator that returns the actions in the order in which they are
 * put in. Adding the same action a second time has no effect on this order.
 */
public class ActionManager implements Iterable<UpdateAction> {

    protected Map<Class<? extends UpdateAction>, UpdateAction> registry
        = new LinkedHashMap<Class<? extends UpdateAction>, UpdateAction>();

    /**
     * Get update action
     *
     * @param actionClass UpdateAction class
     * @return instantiation of UpdateAction class
     * @throws InstantiationException if instantiation error
     * @throws IllegalAccessException if illegal access error
     */
    public UpdateAction getUpdateAction(Class<? extends UpdateAction> actionClass)
        throws InstantiationException, IllegalAccessException {
        UpdateAction action = registry.get(actionClass);
        if (action == null) {
            action = actionClass.newInstance();
            registry.put(actionClass, action);
        }
        return action;
    }

    /**
     * @return whether any actions have been registered with this manager
     */
    public boolean hasActions() {
        return !registry.isEmpty();
    }

    /**
     * This implementation guarantees the iterator order is the same as the order
     * in which updateActions have been added
     *
     * @return iterator for UpdateActions
     */
    @Override
    public Iterator<UpdateAction> iterator() {
        return new Iterator<UpdateAction>() {
            private Iterator<Class<? extends UpdateAction>> itr = registry.keySet().iterator();

            @Override
            public boolean hasNext() {
                return itr.hasNext();
            }

            @Override
            public UpdateAction next() {
                return registry.get(itr.next());
            }

            //not supported
            @Override
            public void remove() {
                throw new UnsupportedOperationException();
            }
        };
    }
}
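The ordering guarantee documented above rests entirely on `LinkedHashMap` semantics: iteration follows insertion order, and re-putting an existing key does not move it. A self-contained sketch of that property (the class name and the action-name strings are illustrative only, not DSpace identifiers):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch of the registry behaviour ActionManager relies on:
// LinkedHashMap iterates keys in insertion order, and re-putting an
// existing key leaves its position unchanged.
public class RegistryOrderDemo {

    static String registrationOrder() {
        Map<String, String> registry = new LinkedHashMap<>();
        registry.put("AddMetadataAction", "a");
        registry.put("AddBitstreamsAction", "b");
        registry.put("AddMetadataAction", "a"); // re-adding: order unchanged
        return String.join(",", registry.keySet());
    }

    public static void main(String[] args) {
        System.out.println(registrationOrder()); // AddMetadataAction,AddBitstreamsAction
    }
}
```

This is why `ActionManager` keys the registry by `Class` rather than storing a `List`: duplicates collapse for free while the first-registration order is preserved.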


@@ -19,7 +19,11 @@ import java.util.List;
import org.dspace.authorize.AuthorizeException;
import org.dspace.authorize.factory.AuthorizeServiceFactory;
import org.dspace.authorize.service.AuthorizeService;
import org.dspace.content.Bitstream;
import org.dspace.content.BitstreamFormat;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.BitstreamFormatService;
import org.dspace.content.service.InstallItemService;
@@ -29,201 +33,176 @@ import org.dspace.eperson.factory.EPersonServiceFactory;
import org.dspace.eperson.service.GroupService;

/**
 * Action to add bitstreams listed in item contents file to the item in DSpace
 */
public class AddBitstreamsAction extends UpdateBitstreamsAction {
    protected AuthorizeService authorizeService = AuthorizeServiceFactory.getInstance().getAuthorizeService();
    protected BitstreamFormatService bitstreamFormatService = ContentServiceFactory.getInstance()
                                                                                   .getBitstreamFormatService();
    protected GroupService groupService = EPersonServiceFactory.getInstance().getGroupService();
    protected InstallItemService installItemService = ContentServiceFactory.getInstance().getInstallItemService();

    public AddBitstreamsAction() {
        //empty
    }

    /**
     * Adds bitstreams from the archive as listed in the contents file.
     *
     * @param context      DSpace Context
     * @param itarch       Item Archive
     * @param isTest       test flag
     * @param suppressUndo undo flag
     * @throws IOException              if IO error
     * @throws IllegalArgumentException if arg exception
     * @throws SQLException             if database error
     * @throws AuthorizeException       if authorization error
     * @throws ParseException           if parse error
     */
    @Override
    public void execute(Context context, ItemArchive itarch, boolean isTest,
                        boolean suppressUndo) throws IllegalArgumentException,
        ParseException, IOException, AuthorizeException, SQLException {
        Item item = itarch.getItem();
        File dir = itarch.getDirectory();

        List<ContentsEntry> contents = MetadataUtilities.readContentsFile(new File(dir, ItemUpdate.CONTENTS_FILE));
        if (contents.isEmpty()) {
            ItemUpdate.pr("Contents is empty - no bitstreams to add");
            return;
        }

        ItemUpdate.pr("Contents bitstream count: " + contents.size());

        String[] files = dir.list(ItemUpdate.fileFilter);
        List<String> fileList = new ArrayList<String>();
        for (String filename : files) {
            fileList.add(filename);
            ItemUpdate.pr("file: " + filename);
        }

        for (ContentsEntry ce : contents) {
            //validate match to existing file in archive
            if (!fileList.contains(ce.filename)) {
                throw new IllegalArgumentException("File listed in contents is missing: " + ce.filename);
            }
        }
        int bitstream_bundles_updated = 0;

        //now okay to add
        for (ContentsEntry ce : contents) {
            String targetBundleName = addBitstream(context, itarch, item, dir, ce, suppressUndo, isTest);
            if (!targetBundleName.equals("")
                && !targetBundleName.equals("THUMBNAIL")
                && !targetBundleName.equals("TEXT")) {
                bitstream_bundles_updated++;
            }
        }

        if (alterProvenance && bitstream_bundles_updated > 0) {
            DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");

            String append = ". Added " + Integer.toString(bitstream_bundles_updated)
                + " bitstream(s) on " + DCDate.getCurrent() + " : "
                + installItemService.getBitstreamProvenanceMessage(context, item);
            MetadataUtilities.appendMetadata(context, item, dtom, false, append);
        }
    }

    /**
     * Add bitstream
     *
     * @param context      DSpace Context
     * @param itarch       Item Archive
     * @param item         DSpace Item
     * @param dir          directory
     * @param ce           contents entry for bitstream
     * @param suppressUndo undo flag
     * @param isTest       test flag
     * @return bundle name
     * @throws IOException              if IO error
     * @throws IllegalArgumentException if arg exception
     * @throws SQLException             if database error
     * @throws AuthorizeException       if authorization error
     * @throws ParseException           if parse error
     */
    protected String addBitstream(Context context, ItemArchive itarch, Item item, File dir,
                                  ContentsEntry ce, boolean suppressUndo, boolean isTest)
        throws IOException, IllegalArgumentException, SQLException, AuthorizeException, ParseException {
        ItemUpdate.pr("contents entry for bitstream: " + ce.toString());
        File f = new File(dir, ce.filename);

        // get an input stream
        BufferedInputStream bis = new BufferedInputStream(new FileInputStream(f));

        Bitstream bs = null;
        String newBundleName = ce.bundlename;

        if (ce.bundlename == null) { // should be required but default convention established
            if (ce.filename.equals("license.txt")) {
                newBundleName = "LICENSE";
            } else {
                newBundleName = "ORIGINAL";
            }
        }
        ItemUpdate.pr("  Bitstream " + ce.filename + " to be added to bundle: " + newBundleName);

        if (!isTest) {
            // find the bundle
            List<Bundle> bundles = itemService.getBundles(item, newBundleName);
            Bundle targetBundle = null;

            if (bundles.size() < 1) {
                // not found, create a new one
                targetBundle = bundleService.create(context, item, newBundleName);
            } else {
                //verify bundle + name are not duplicates
                for (Bundle b : bundles) {
                    List<Bitstream> bitstreams = b.getBitstreams();
                    for (Bitstream bsm : bitstreams) {
                        if (bsm.getName().equals(ce.filename)) {
                            throw new IllegalArgumentException("Duplicate bundle + filename cannot be added: "
                                + b.getName() + " + " + bsm.getName());
                        }
                    }
                }

                // select first bundle
                targetBundle = bundles.iterator().next();
            }

            bs = bitstreamService.create(context, targetBundle, bis);
            bs.setName(context, ce.filename);

            // Identify the format
            // FIXME - guessing format guesses license.txt incorrectly as a text file format!
            BitstreamFormat fmt = bitstreamFormatService.guessFormat(context, bs);
            bitstreamService.setFormat(context, bs, fmt);

            if (ce.description != null) {
                bs.setDescription(context, ce.description);
            }

            if ((ce.permissionsActionId != -1) && (ce.permissionsGroupName != null)) {
                Group group = groupService.findByName(context, ce.permissionsGroupName);

                if (group != null) {
                    authorizeService.removeAllPolicies(context, bs); // remove the default policy
                    authorizeService.createResourcePolicy(context, bs, group, null, ce.permissionsActionId, null);
                }
            }

            //update after all changes are applied
            bitstreamService.update(context, bs);

            if (!suppressUndo) {
                itarch.addUndoDeleteContents(bs.getID());
            }
            return targetBundle.getName();
        }
        return "";
    }
}
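The bundle-name defaulting convention in `addBitstream` (explicit bundle name wins; otherwise `license.txt` goes to `LICENSE` and everything else to `ORIGINAL`) can be sketched as a standalone helper. The class and method names below are hypothetical, for illustration only, not part of the DSpace API:

```java
// Standalone sketch of the bundle-name defaulting convention used in
// AddBitstreamsAction; names here are made up for illustration.
public class BundleNameDefaulting {

    // An explicit bundle name wins; license.txt falls back to LICENSE,
    // every other filename falls back to ORIGINAL.
    static String defaultBundleName(String bundlename, String filename) {
        if (bundlename != null) {
            return bundlename;
        }
        return filename.equals("license.txt") ? "LICENSE" : "ORIGINAL";
    }

    public static void main(String[] args) {
        System.out.println(defaultBundleName(null, "license.txt")); // LICENSE
        System.out.println(defaultBundleName(null, "thesis.pdf"));  // ORIGINAL
        System.out.println(defaultBundleName("TEXT", "thesis.pdf")); // TEXT
    }
}
```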


@@ -11,119 +11,107 @@ import java.sql.SQLException;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.MetadataFieldService;
import org.dspace.content.service.MetadataSchemaService;
import org.dspace.core.Context;

/**
 * Action to add metadata to item
 */
public class AddMetadataAction extends UpdateMetadataAction {

    protected MetadataSchemaService metadataSchemaService = ContentServiceFactory.getInstance()
                                                                                 .getMetadataSchemaService();
    protected MetadataFieldService metadataFieldService = ContentServiceFactory.getInstance().getMetadataFieldService();

    /**
     * Adds metadata specified in the source archive
     *
     * @param context      DSpace Context
     * @param itarch       item archive
     * @param isTest       test flag
     * @param suppressUndo undo flag
     * @throws AuthorizeException if authorization error
     * @throws SQLException       if database error
     */
    @Override
    public void execute(Context context, ItemArchive itarch, boolean isTest,
                        boolean suppressUndo) throws AuthorizeException, SQLException {
        Item item = itarch.getItem();
        String dirname = itarch.getDirectoryName();

        for (DtoMetadata dtom : itarch.getMetadataFields()) {
            for (String f : targetFields) {
                if (dtom.matches(f, false)) {
                    // match against metadata for this field/value in repository
                    // qualifier must be strictly matched, possibly null
                    List<MetadataValue> ardcv = null;
                    ardcv = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);

                    boolean found = false;
                    for (MetadataValue dcv : ardcv) {
                        if (dcv.getValue().equals(dtom.value)) {
                            found = true;
                            break;
                        }
                    }

                    if (found) {
                        ItemUpdate.pr("Warning: No new metadata found to add to item " + dirname
                                          + " for element " + f);
                    } else {
                        if (isTest) {
                            ItemUpdate.pr("Metadata to add: " + dtom.toString());
                            //validity tests that would occur in actual processing
                            // If we're just testing the import, let's check that the actual metadata field exists.
                            MetadataSchema foundSchema = metadataSchemaService.find(context, dtom.schema);

                            if (foundSchema == null) {
                                ItemUpdate.pr("ERROR: schema '"
                                                  + dtom.schema + "' was not found in the registry; found on item " +
                                                  dirname);
                            } else {
                                MetadataField foundField = metadataFieldService
                                    .findByElement(context, foundSchema, dtom.element, dtom.qualifier);

                                if (foundField == null) {
                                    ItemUpdate.pr("ERROR: Metadata field: '" + dtom.schema + "." + dtom.element + "."
                                                      + dtom.qualifier + "' not found in registry; found on item " +
                                                      dirname);
                                }
                            }
                        } else {
                            itemService
                                .addMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language,
                                             dtom.value);
                            ItemUpdate.pr("Metadata added: " + dtom.toString());

                            if (!suppressUndo) {
                                //itarch.addUndoDtom(dtom);
                                //ItemUpdate.pr("Undo metadata: " + dtom);

                                // add all as a replace record to be preceded by delete
                                for (MetadataValue dcval : ardcv) {
                                    MetadataField metadataField = dcval.getMetadataField();
                                    MetadataSchema metadataSchema = metadataField.getMetadataSchema();
                                    itarch.addUndoMetadataField(
                                        DtoMetadata.create(metadataSchema.getName(), metadataField.getElement(),
                                                           metadataField.getQualifier(), dcval.getLanguage(),
                                                           dcval.getValue()));
                                }
                            }
                        }
                    }
                    break;  // don't need to check if this field matches any other target fields
                }
            }
        }
    }
}


@@ -7,55 +7,49 @@
 */
package org.dspace.app.itemupdate;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import org.dspace.content.Bitstream;

/**
 * Filter interface to be used by ItemUpdate
 * to determine which bitstreams in an Item
 * are acceptable for removal.
 */
public abstract class BitstreamFilter {

    protected Properties props = null;

    /**
     * The filter method
     *
     * @param bitstream Bitstream
     * @return whether the bitstream matches the criteria
     * @throws BitstreamFilterException if filter error
     */
    public abstract boolean accept(Bitstream bitstream) throws BitstreamFilterException;

    /**
     * @param filepath - The complete path for the properties file
     * @throws IOException if IO error
     */
    public void initProperties(String filepath)
        throws IOException {
        props = new Properties();

        InputStream in = null;

        try {
            in = new FileInputStream(filepath);
            props.load(in);
        } finally {
            if (in != null) {
                in.close();
            }
        }
    }
}
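The `initProperties` pattern above loads filter configuration from a `java.util.Properties` file and closes the stream in a `finally` block. A self-contained sketch of the same round trip, using try-with-resources as a modern equivalent of the explicit close; the class name, file name, and property key below are made up for illustration:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

// Standalone sketch of the BitstreamFilter.initProperties() pattern;
// try-with-resources closes the stream automatically, even on error.
public class PropertiesLoadDemo {

    static Properties load(String filepath) throws IOException {
        Properties props = new Properties();
        try (InputStream in = new FileInputStream(filepath)) {
            props.load(in);
        }
        return props;
    }

    // Helper: write a one-entry properties file and read the value back.
    static String roundTrip(String key, String value) throws IOException {
        Path p = Files.createTempFile("filter", ".properties");
        Files.writeString(p, key + "=" + value + "\n");
        return load(p.toString()).getProperty(key);
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("bundle", "ORIGINAL")); // ORIGINAL
    }
}
```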


@@ -14,55 +14,44 @@ import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;

/**
 * BitstreamFilter implementation to filter by bundle name
 */
public class BitstreamFilterByBundleName extends BitstreamFilter {

    protected String bundleName;

    public BitstreamFilterByBundleName() {
        //empty
    }

    /**
     * Filter bitstream based on bundle name found in properties file
     *
     * @param bitstream Bitstream
     * @return whether bitstream is in bundle
     * @throws BitstreamFilterException if filter error
     */
    @Override
    public boolean accept(Bitstream bitstream)
        throws BitstreamFilterException {
        if (bundleName == null) {
            bundleName = props.getProperty("bundle");
            if (bundleName == null) {
                throw new BitstreamFilterException("Property 'bundle' not found.");
            }
        }

        try {
            List<Bundle> bundles = bitstream.getBundles();
            for (Bundle b : bundles) {
                if (b.getName().equals(bundleName)) {
                    return true;
                }
            }
        } catch (SQLException e) {
            throw new BitstreamFilterException(e);
        }
        return false;
    }
}


@@ -7,47 +7,43 @@
 */
package org.dspace.app.itemupdate;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.dspace.content.Bitstream;

/**
 * BitstreamFilter implementation to filter by filename pattern
 */
public class BitstreamFilterByFilename extends BitstreamFilter {

    protected Pattern pattern;
    protected String filenameRegex;

    public BitstreamFilterByFilename() {
        //empty
    }

    /**
     * Tests bitstream by matching the regular expression in the
     * properties against the bitstream name
     *
     * @param bitstream Bitstream
     * @return whether bitstream name matches the regular expression
     * @throws BitstreamFilterException if filter error
     */
    @Override
    public boolean accept(Bitstream bitstream) throws BitstreamFilterException {
        if (filenameRegex == null) {
            filenameRegex = props.getProperty("filename");
            if (filenameRegex == null) {
                throw new BitstreamFilterException("BitstreamFilter property 'filename' not found.");
            }
            pattern = Pattern.compile(filenameRegex);
        }

        Matcher m = pattern.matcher(bitstream.getName());
        return m.matches();
    }
}
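A subtlety in the filter above: `Matcher.matches()` requires the regular expression to cover the whole bitstream name, unlike `find()`, which accepts a partial match. A self-contained sketch of that semantics (the class name and filenames are illustrative, not from DSpace):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Standalone sketch of the matching semantics BitstreamFilterByFilename
// depends on: matches() anchors the pattern to the whole input string.
public class FilenameMatchDemo {

    static boolean accepts(String regex, String name) {
        Matcher m = Pattern.compile(regex).matcher(name);
        return m.matches();
    }

    public static void main(String[] args) {
        System.out.println(accepts(".*\\.pdf", "thesis.pdf")); // true: whole name covered
        System.out.println(accepts("\\.pdf", "thesis.pdf"));   // false: only a partial match
    }
}
```

So a `filename` property of `\.pdf` would silently match nothing; the pattern must be written as `.*\.pdf` to select PDF bitstreams.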


@@ -8,30 +8,27 @@
package org.dspace.app.itemupdate;

/**
 * Exception class for BitstreamFilters
 */
public class BitstreamFilterException extends Exception {
    private static final long serialVersionUID = 1L;

    public BitstreamFilterException() {
    }

    /**
     * @param msg exception message
     */
    public BitstreamFilterException(String msg) {
        super(msg);
    }

    /**
     * @param e exception
     */
    public BitstreamFilterException(Exception e) {
        super(e);
    }
}


@@ -8,148 +8,124 @@
package org.dspace.app.itemupdate;

import java.text.ParseException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.dspace.core.Constants;

/**
 * Holds the elements of a line in the Contents Entry file
 *
 * Based on private methods in ItemImport
 *
 * Lacking a spec or full documentation for the file format,
 * it looks from the source code that the ordering or elements is not fixed
 *
 * e.g.:
 * {@code
 * 48217870-MIT.pdf\tbundle: bundlename\tpermissions: -r 'MIT Users'\tdescription: Full printable version (MIT only)
 * permissions: -[r|w] ['group name']
 * description: <the description of the file>
 * }
 */
public class ContentsEntry {
    public static final String HDR_BUNDLE = "bundle:";
    public static final String HDR_PERMISSIONS = "permissions:";
    public static final String HDR_DESCRIPTION = "description:";

    public static final Pattern permissionsPattern = Pattern.compile("-([rw])\\s*'?([^']+)'?");

    final String filename;
    final String bundlename;
    final String permissionsGroupName;
    final int permissionsActionId;
    final String description;

    protected ContentsEntry(String filename,
                            String bundlename,
                            int permissionsActionId,
                            String permissionsGroupName,
                            String description) {
        this.filename = filename;
        this.bundlename = bundlename;
        this.permissionsActionId = permissionsActionId;
        this.permissionsGroupName = permissionsGroupName;
        this.description = description;
    }

    /**
     * Factory method parses a line from the Contents Entry file
     *
     * @param line line as string
     * @return the parsed ContentsEntry object
     * @throws ParseException if parse error
     */
    public static ContentsEntry parse(String line)
        throws ParseException {
        String[] ar = line.split("\t");
        ItemUpdate.pr("ce line split: " + ar.length);

        String[] arp = new String[4];
        arp[0] = ar[0];  //bitstream name doesn't have header and is always first

        String groupName = null;
        int actionId = -1;

        if (ar.length > 1) {
            for (int i = 1; i < ar.length; i++) {
                ItemUpdate.pr("ce " + i + " : " + ar[i]);

                if (ar[i].startsWith(HDR_BUNDLE)) {
                    arp[1] = ar[i].substring(HDR_BUNDLE.length()).trim();
                } else if (ar[i].startsWith(HDR_PERMISSIONS)) {
                    arp[2] = ar[i].substring(HDR_PERMISSIONS.length()).trim();

                    // parse into actionId and group name
                    Matcher m = permissionsPattern.matcher(arp[2]);
                    if (m.matches()) {
                        String action = m.group(1);
                        if (action.equals("r")) {
                            actionId = Constants.READ;
                        } else if (action.equals("w")) {
                            actionId = Constants.WRITE;
                        }
                        groupName = m.group(2).trim();
                    }
                } else if (ar[i].startsWith(HDR_DESCRIPTION)) {
                    arp[3] = ar[i].substring(HDR_DESCRIPTION.length()).trim();
                } else {
                    throw new ParseException("Unknown text in contents file: " + ar[i], 0);
                }
            }
        }

        return new ContentsEntry(arp[0], arp[1], actionId, groupName, arp[3]);
    }

    public String toString() {
        StringBuilder sb = new StringBuilder(filename);
        if (bundlename != null) {
            sb.append(HDR_BUNDLE).append(" ").append(bundlename);
        }

        if (permissionsGroupName != null) {
            sb.append(HDR_PERMISSIONS);
            if (permissionsActionId == Constants.READ) {
                sb.append(" -r ");
            } else if (permissionsActionId == Constants.WRITE) {
                sb.append(" -w ");
            }
            sb.append(permissionsGroupName);
        }

        if (description != null) {
            sb.append(HDR_DESCRIPTION).append(" ").append(description);
        }

        return sb.toString();
    }
}
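The `permissions:` token in a contents-entry line is parsed with the `permissionsPattern` regex shown above. A self-contained sketch of how that pattern splits a token into an action flag and a group name (the sample token is taken from the example in the class Javadoc):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PermissionsTokenDemo {
    // Same regex as ContentsEntry.permissionsPattern: -[r|w] ['group name']
    static final Pattern PERMISSIONS = Pattern.compile("-([rw])\\s*'?([^']+)'?");

    public static void main(String[] args) {
        // Token as it appears after the "permissions:" header has been stripped
        Matcher m = PERMISSIONS.matcher("-r 'MIT Users'");
        if (m.matches()) {
            System.out.println(m.group(1));        // action flag: r or w
            System.out.println(m.group(2).trim()); // group name with quotes stripped
        }
    }
}
```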

View File

@@ -14,99 +14,81 @@ import java.text.ParseException;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Action to delete bitstreams
 *
 * Undo not supported for this UpdateAction
 *
 * Derivatives of the bitstream to be deleted are not also deleted
 */
public class DeleteBitstreamsAction extends UpdateBitstreamsAction {
    /**
     * Delete bitstream from item
     *
     * @param context      DSpace Context
     * @param itarch       item archive
     * @param isTest       test flag
     * @param suppressUndo undo flag
     * @throws IOException              if IO error
     * @throws IllegalArgumentException if arg exception
     * @throws SQLException             if database error
     * @throws AuthorizeException       if authorization error
     * @throws ParseException           if parse error
     */
    @Override
    public void execute(Context context, ItemArchive itarch, boolean isTest,
                        boolean suppressUndo) throws IllegalArgumentException, IOException,
        SQLException, AuthorizeException, ParseException {
        File f = new File(itarch.getDirectory(), ItemUpdate.DELETE_CONTENTS_FILE);
        if (!f.exists()) {
            ItemUpdate.pr("Warning: Delete_contents file for item " + itarch.getDirectoryName() + " not found.");
        } else {
            List<String> list = MetadataUtilities.readDeleteContentsFile(f);
            if (list.isEmpty()) {
                ItemUpdate.pr("Warning: empty delete_contents file for item " + itarch.getDirectoryName());
            } else {
                for (String id : list) {
                    try {
                        Bitstream bs = bitstreamService.findByIdOrLegacyId(context, id);
                        if (bs == null) {
                            ItemUpdate.pr("Bitstream not found by id: " + id);
                        } else {
                            List<Bundle> bundles = bs.getBundles();
                            for (Bundle b : bundles) {
                                if (isTest) {
                                    ItemUpdate.pr("Delete bitstream with id = " + id);
                                } else {
                                    bundleService.removeBitstream(context, b, bs);
                                    ItemUpdate.pr("Deleted bitstream with id = " + id);
                                }
                            }

                            if (alterProvenance) {
                                DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");

                                String append = "Bitstream " + bs.getName() + " deleted on " + DCDate
                                    .getCurrent() + "; ";
                                Item item = bundles.iterator().next().getItems().iterator().next();
                                ItemUpdate.pr("Append provenance with: " + append);

                                if (!isTest) {
                                    MetadataUtilities.appendMetadata(context, item, dtom, false, append);
                                }
                            }
                        }
                    } catch (SQLException e) {
                        ItemUpdate.pr("Error finding bitstream from id: " + id + " : " + e.toString());
                    }
                }
            }
        }
    }
}

View File

@@ -14,115 +14,104 @@ import java.util.ArrayList;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Action to delete bitstreams using a specified filter implementing BitstreamFilter
 * Derivatives for the target bitstreams are not deleted.
 *
 * The dc.description.provenance field is amended to reflect the deletions
 *
 * Note: Multiple filters are impractical if trying to manage multiple properties files
 * in a commandline environment
 */
public class DeleteBitstreamsByFilterAction extends UpdateBitstreamsAction {

    protected BitstreamFilter filter;

    /**
     * Set filter
     *
     * @param filter BitstreamFilter
     */
    public void setBitstreamFilter(BitstreamFilter filter) {
        this.filter = filter;
    }

    /**
     * Get filter
     *
     * @return filter
     */
    public BitstreamFilter getBitstreamFilter() {
        return filter;
    }

    /**
     * Delete bitstream
     *
     * @param context      DSpace Context
     * @param itarch       item archive
     * @param isTest       test flag
     * @param suppressUndo undo flag
     * @throws IOException              if IO error
     * @throws SQLException             if database error
     * @throws AuthorizeException       if authorization error
     * @throws ParseException           if parse error
     * @throws BitstreamFilterException if filter error
     */
    @Override
    public void execute(Context context, ItemArchive itarch, boolean isTest,
                        boolean suppressUndo) throws AuthorizeException,
        BitstreamFilterException, IOException, ParseException, SQLException {
        List<String> deleted = new ArrayList<String>();

        Item item = itarch.getItem();
        List<Bundle> bundles = item.getBundles();

        for (Bundle b : bundles) {
            List<Bitstream> bitstreams = b.getBitstreams();
            String bundleName = b.getName();

            for (Bitstream bs : bitstreams) {
                if (filter.accept(bs)) {
                    if (isTest) {
                        ItemUpdate.pr("Delete from bundle " + bundleName + " bitstream " + bs.getName()
                                          + " with id = " + bs.getID());
                    } else {
                        //provenance is not maintained for derivative bitstreams
                        if (!bundleName.equals("THUMBNAIL") && !bundleName.equals("TEXT")) {
                            deleted.add(bs.getName());
                        }
                        bundleService.removeBitstream(context, b, bs);
                        ItemUpdate.pr("Deleted " + bundleName + " bitstream " + bs.getName()
                                          + " with id = " + bs.getID());
                    }
                }
            }
        }

        if (alterProvenance && !deleted.isEmpty()) {
            StringBuilder sb = new StringBuilder(" Bitstreams deleted on ");
            sb.append(DCDate.getCurrent()).append(": ");

            for (String s : deleted) {
                sb.append(s).append(", ");
            }

            DtoMetadata dtom = DtoMetadata.create("dc.description.provenance", "en", "");
            ItemUpdate.pr("Append provenance with: " + sb.toString());

            if (!isTest) {
                MetadataUtilities.appendMetadata(context, item, dtom, false, sb.toString());
            }
        }
    }
}
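When `alterProvenance` is set, the action above builds a provenance note by concatenating the deleted bitstream names after a timestamp. A standalone sketch of that string assembly (the fixed date string is a stand-in for `DCDate.getCurrent()`, which needs a live DSpace context):

```java
import java.util.Arrays;
import java.util.List;

public class ProvenanceNoteDemo {
    public static void main(String[] args) {
        List<String> deleted = Arrays.asList("a.pdf", "b.pdf");

        // Mirrors the StringBuilder logic in DeleteBitstreamsByFilterAction.execute
        StringBuilder sb = new StringBuilder(" Bitstreams deleted on ");
        sb.append("2018-05-20").append(": "); // stand-in for DCDate.getCurrent()
        for (String s : deleted) {
            sb.append(s).append(", ");
        }

        System.out.println(sb.toString());
    }
}
```

Note that, as in the original, the note keeps a trailing ", " after the last name.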

View File

@@ -12,60 +12,54 @@ import java.text.ParseException;
import java.util.List;

import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.core.Context;

/**
 * Action to delete metadata
 */
public class DeleteMetadataAction extends UpdateMetadataAction {

    /**
     * Delete metadata from item
     *
     * @param context      DSpace Context
     * @param itarch       Item Archive
     * @param isTest       test flag
     * @param suppressUndo undo flag
     * @throws SQLException       if database error
     * @throws AuthorizeException if authorization error
     * @throws ParseException     if parse error
     */
    @Override
    public void execute(Context context, ItemArchive itarch, boolean isTest,
                        boolean suppressUndo) throws AuthorizeException, ParseException, SQLException {
        Item item = itarch.getItem();
        for (String f : targetFields) {
            DtoMetadata dummy = DtoMetadata.create(f, Item.ANY, "");
            List<MetadataValue> ardcv = itemService.getMetadataByMetadataString(item, f);

            ItemUpdate.pr("Metadata to be deleted: ");
            for (MetadataValue dcv : ardcv) {
                ItemUpdate.pr("  " + MetadataUtilities.getDCValueString(dcv));
            }

            if (!isTest) {
                if (!suppressUndo) {
                    for (MetadataValue dcv : ardcv) {
                        MetadataField metadataField = dcv.getMetadataField();
                        MetadataSchema metadataSchema = metadataField.getMetadataSchema();
                        itarch.addUndoMetadataField(
                            DtoMetadata.create(metadataSchema.getName(), metadataField.getElement(),
                                               metadataField.getQualifier(), dcv.getLanguage(), dcv.getValue()));
                    }
                }

                itemService.clearMetadata(context, item, dummy.schema, dummy.element, dummy.qualifier, Item.ANY);
            }
        }
    }
}

View File

@@ -10,15 +10,13 @@ package org.dspace.app.itemupdate;
import java.util.Properties;

/**
 * Bitstream filter to delete from TEXT bundle
 */
public class DerivativeTextBitstreamFilter extends BitstreamFilterByBundleName {

    public DerivativeTextBitstreamFilter() {
        props = new Properties();
        props.setProperty("bundle", "TEXT");
    }
}

View File

@@ -8,152 +8,131 @@
package org.dspace.app.itemupdate;

import java.text.ParseException;

import org.dspace.content.Item;

/**
 * A data transfer object class enhancement of org.dspace.content.DCValue, which is deprecated
 * Name intended to not conflict with DSpace API classes for similar concepts but not usable in this context
 *
 * Adds some utility methods
 *
 * Really not at all general enough but supports Dublin Core and the compound form notation {@code <schema>
 * .<element>[.<qualifier>]}
 *
 * Does not support wildcard for qualifier
 */
class DtoMetadata {
    final String schema;
    final String element;
    final String qualifier;
    final String language;
    final String value;

    protected DtoMetadata(String schema, String element, String qualifier, String language, String value) {
        this.schema = schema;
        this.element = element;
        this.qualifier = qualifier;
        this.language = language;
        this.value = value;
    }

    /**
     * Factory method
     *
     * @param schema    not null, not empty - 'dc' is the standard case
     * @param element   not null, not empty
     * @param qualifier null; don't allow empty string or * indicating 'any'
     * @param language  null or empty
     * @param value     value
     * @return DtoMetadata object
     * @throws IllegalArgumentException if arg error
     */
    public static DtoMetadata create(String schema,
                                     String element,
                                     String qualifier,
                                     String language,
                                     String value)
        throws IllegalArgumentException {
        if ((qualifier != null) && (qualifier.equals(Item.ANY) || qualifier.equals(""))) {
            throw new IllegalArgumentException("Invalid qualifier: " + qualifier);
        }
        return new DtoMetadata(schema, element, qualifier, language, value);
    }

    /**
     * Factory method to create metadata object
     *
     * @param compoundForm of the form <schema>.<element>[.<qualifier>]
     * @param language     null or empty
     * @param value        value
     * @throws ParseException           if parse error
     * @throws IllegalArgumentException if arg error
     */
    public static DtoMetadata create(String compoundForm, String language, String value)
        throws ParseException, IllegalArgumentException {
        String[] ar = MetadataUtilities.parseCompoundForm(compoundForm);

        String qual = null;
        if (ar.length > 2) {
            qual = ar[2];
        }

        return create(ar[0], ar[1], qual, language, value);
    }

    /**
     * Determine if this metadata field matches the specified type:
     * schema.element or schema.element.qualifier
     *
     * @param compoundForm of the form <schema>.<element>[.<qualifier>|.*]
     * @param wildcard     allow wildcards in compoundForm param
     * @return whether matches
     */
    public boolean matches(String compoundForm, boolean wildcard) {
        String[] ar = compoundForm.split("\\s*\\.\\s*"); //MetadataUtilities.parseCompoundForm(compoundForm);

        if ((ar.length < 2) || (ar.length > 3)) {
            return false;
        }

        if (!this.schema.equals(ar[0]) || !this.element.equals(ar[1])) {
            return false;
        }

        if (ar.length == 2) {
            if (this.qualifier != null) {
                return false;
            }
        }

        if (ar.length == 3) {
            if (this.qualifier == null) {
                return false;
            }
            if (wildcard && ar[2].equals(Item.ANY)) {
                return true;
            }
            if (!this.qualifier.equals(ar[2])) {
                return false;
            }
        }

        return true;
    }

    public String toString() {
        String s = "\tSchema: " + schema + " Element: " + element;
        if (qualifier != null) {
            s += " Qualifier: " + qualifier;
        }
        s += " Language: " + ((language == null) ? "[null]" : language);
        s += " Value: " + value;
        return s;
    }

    public String getValue() {
        return value;
    }
}
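`DtoMetadata.matches()` splits the compound form on a dot, allowing optional whitespace around the separator. A self-contained sketch of that split for a fully qualified Dublin Core field:

```java
public class CompoundFormDemo {
    public static void main(String[] args) {
        // Same separator regex used by DtoMetadata.matches()
        String[] ar = "dc.description.provenance".split("\\s*\\.\\s*");

        System.out.println(ar.length); // schema, element, qualifier
        System.out.println(ar[0]);     // schema
        System.out.println(ar[2]);     // qualifier
    }
}
```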

View File

@@ -10,12 +10,11 @@ package org.dspace.app.itemupdate;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.sql.SQLException;
@@ -23,14 +22,13 @@ import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.UUID;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerConfigurationException;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;

import org.apache.log4j.Logger;
import org.dspace.app.util.LocalSchemaFilenameFilter;
@@ -40,20 +38,18 @@ import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.Context;
import org.dspace.handle.factory.HandleServiceFactory;
import org.dspace.handle.service.HandleService;
import org.w3c.dom.Document;

/**
 * Encapsulates the Item in the context of the DSpace Archive Format
 */
public class ItemArchive {
    private static final Logger log = Logger.getLogger(ItemArchive.class);

    public static final String DUBLIN_CORE_XML = "dublin_core.xml";

    protected static DocumentBuilder builder = null;
    protected Transformer transformer = null;
@@ -70,312 +66,278 @@ public class ItemArchive {
protected HandleService handleService; protected HandleService handleService;
protected ItemService itemService; protected ItemService itemService;
//constructors //constructors
protected ItemArchive() protected ItemArchive() {
{
handleService = HandleServiceFactory.getInstance().getHandleService(); handleService = HandleServiceFactory.getInstance().getHandleService();
itemService = ContentServiceFactory.getInstance().getItemService(); itemService = ContentServiceFactory.getInstance().getItemService();
} }
/** factory method /**
* * factory method
* Minimal requirements for dublin_core.xml for this application *
* is the presence of dc.identifier.uri * Minimal requirements for dublin_core.xml for this application
* which must contain the handle for the item * is the presence of dc.identifier.uri
* * which must contain the handle for the item
* @param context - The DSpace context *
     * @param context   - The DSpace context
     * @param dir       - The directory File in the source archive
     * @param itemField - The metadata field in which the Item identifier is located;
     *                  if null, the default is the handle in the dc.identifier.uri field
     * @return ItemArchive object
     * @throws Exception if error
     */
    public static ItemArchive create(Context context, File dir, String itemField)
        throws Exception {
        ItemArchive itarch = new ItemArchive();
        itarch.dir = dir;
        itarch.dirname = dir.getName();

        InputStream is = null;
        try {
            is = new FileInputStream(new File(dir, DUBLIN_CORE_XML));
            itarch.dtomList = MetadataUtilities.loadDublinCore(getDocumentBuilder(), is);

            //The code to search for local schema files was copied from org.dspace.app.itemimport
            // .ItemImportServiceImpl.java
            File[] file = dir.listFiles(new LocalSchemaFilenameFilter());
            for (int i = 0; i < file.length; i++) {
                is = new FileInputStream(file[i]);
                itarch.dtomList.addAll(MetadataUtilities.loadDublinCore(getDocumentBuilder(), is));
            }
        } finally {
            if (is != null) {
                is.close();
            }
        }

        ItemUpdate.pr("Loaded metadata with " + itarch.dtomList.size() + " fields");

        if (itemField == null) {
            itarch.item = itarch.itemFromHandleInput(context); // sets the item instance var and seeds the undo list
        } else {
            itarch.item = itarch.itemFromMetadataField(context, itemField);
        }

        if (itarch.item == null) {
            throw new Exception("Item not instantiated: " + itarch.dirname);
        }

        ItemUpdate.prv("item instantiated: " + itarch.item.getHandle());

        return itarch;
    }

    protected static DocumentBuilder getDocumentBuilder()
        throws ParserConfigurationException {
        if (builder == null) {
            builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        }
        return builder;
    }
    /**
     * Getter for Transformer
     *
     * @return Transformer
     * @throws TransformerConfigurationException if config error
     */
    protected Transformer getTransformer()
        throws TransformerConfigurationException {
        if (transformer == null) {
            transformer = TransformerFactory.newInstance().newTransformer();
        }
        return transformer;
    }
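The getters above cache one DocumentBuilder and one Transformer per class and reuse them on subsequent calls. A minimal stand-alone sketch of the same lazy-caching pattern; the checked exception is wrapped here only to keep the demo free of throws clauses, and, like the original command-line tool, it assumes single-threaded use:

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class LazyBuilderDemo {
    private static DocumentBuilder builder = null;

    // Same lazy-caching shape as the getters above: build once, return the cached instance after.
    static DocumentBuilder getDocumentBuilder() {
        if (builder == null) {
            try {
                builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            } catch (ParserConfigurationException e) {
                throw new IllegalStateException(e);
            }
        }
        return builder;
    }

    public static void main(String[] args) {
        // Repeated calls return the one cached instance.
        System.out.println(getDocumentBuilder() == getDocumentBuilder());
    }
}
```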
    /**
     * Getter for the DSpace item referenced in the archive
     *
     * @return DSpace item
     */
    public Item getItem() {
        return item;
    }

    /**
     * Getter for directory in archive on disk
     *
     * @return directory in archive
     */
    public File getDirectory() {
        return dir;
    }

    /**
     * Getter for directory name in archive
     *
     * @return directory name in archive
     */
    public String getDirectoryName() {
        return dirname;
    }

    /**
     * Add metadata field to undo list
     *
     * @param dtom DtoMetadata (represents metadata field)
     */
    public void addUndoMetadataField(DtoMetadata dtom) {
        this.undoDtomList.add(dtom);
    }

    /**
     * Getter for list of metadata fields
     *
     * @return list of metadata fields
     */
    public List<DtoMetadata> getMetadataFields() {
        return dtomList;
    }

    /**
     * Add bitstream id to delete contents file
     *
     * @param bitstreamId bitstream ID
     */
    public void addUndoDeleteContents(UUID bitstreamId) {
        this.undoAddContents.add(bitstreamId);
    }

    /**
     * Obtain item from DSpace based on handle.
     * This is the default implementation,
     * which uses the dc.identifier.uri metadata field
     * that contains the item handle as its value.
     *
     * @param context DSpace Context
     * @throws SQLException if database error
     * @throws Exception    if error
     */
    private Item itemFromHandleInput(Context context)
        throws SQLException, Exception {
        DtoMetadata dtom = getMetadataField("dc.identifier.uri");
        if (dtom == null) {
            throw new Exception("No dc.identifier.uri field found for handle");
        }

        this.addUndoMetadataField(dtom); //seed the undo list with the uri

        String uri = dtom.value;

        if (!uri.startsWith(ItemUpdate.HANDLE_PREFIX)) {
            throw new Exception("dc.identifier.uri for item " + uri
                                    + " does not begin with prefix: " + ItemUpdate.HANDLE_PREFIX);
        }

        String handle = uri.substring(ItemUpdate.HANDLE_PREFIX.length());

        DSpaceObject dso = handleService.resolveToObject(context, handle);
        if (dso instanceof Item) {
            item = (Item) dso;
        } else {
            ItemUpdate.pr("Warning: item not instantiated");
            throw new IllegalArgumentException("Item " + handle + " not instantiated.");
        }

        return item;
    }
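itemFromHandleInput() derives the handle by checking for and stripping ItemUpdate.HANDLE_PREFIX from the dc.identifier.uri value. A stand-alone sketch of that prefix check and extraction, using the fallback prefix that main() applies when handle.canonical.prefix is unset:

```java
public class HandlePrefixDemo {
    // Fallback prefix used by main() when handle.canonical.prefix is not configured.
    static final String HANDLE_PREFIX = "http://hdl.handle.net/";

    // Rejects URIs without the canonical prefix, otherwise returns the bare handle.
    static String handleFromUri(String uri) {
        if (!uri.startsWith(HANDLE_PREFIX)) {
            throw new IllegalArgumentException(
                "dc.identifier.uri " + uri + " does not begin with prefix: " + HANDLE_PREFIX);
        }
        return uri.substring(HANDLE_PREFIX.length());
    }

    public static void main(String[] args) {
        System.out.println(handleFromUri("http://hdl.handle.net/123456789/42")); // 123456789/42
    }
}
```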
    /**
     * Find and instantiate Item from the dublin_core.xml based
     * on the specified itemField for the item identifier.
     *
     * @param context   - the DSpace context
     * @param itemField - the compound form of the metadata element <schema>.<element>.<qualifier>
     * @throws SQLException if database error
     * @throws Exception    if error
     */
    private Item itemFromMetadataField(Context context, String itemField)
        throws SQLException, AuthorizeException, Exception {
        DtoMetadata dtom = getMetadataField(itemField);

        Item item = null;

        if (dtom == null) {
            throw new IllegalArgumentException("No field found for item identifier field: " + itemField);
        }
        ItemUpdate.prv("Metadata field to match for item: " + dtom.toString());

        this.addUndoMetadataField(dtom); //seed the undo list with the identifier field

        Iterator<Item> itr = itemService
            .findByMetadataField(context, dtom.schema, dtom.element, dtom.qualifier, dtom.value);
        int count = 0;
        while (itr.hasNext()) {
            item = itr.next();
            count++;
        }

        ItemUpdate.prv("items matching = " + count);
        if (count != 1) {
            throw new Exception("" + count + " items matching item identifier: " + dtom.value);
        }

        return item;
    }
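The method above requires the identifier field to match exactly one item: it drains the iterator, counts, and fails on any count other than 1. A generic stand-alone sketch of that drain-and-count check (IllegalStateException stands in for the original's plain Exception):

```java
import java.util.Iterator;
import java.util.List;

public class UniqueMatchDemo {
    // Drains the iterator and returns its only element; any other count is an error,
    // mirroring the count != 1 check in itemFromMetadataField.
    static <T> T requireExactlyOne(Iterator<T> itr) {
        T found = null;
        int count = 0;
        while (itr.hasNext()) {
            found = itr.next();
            count++;
        }
        if (count != 1) {
            throw new IllegalStateException(count + " items matching item identifier");
        }
        return found;
    }

    public static void main(String[] args) {
        System.out.println(requireExactlyOne(List.of("123456789/42").iterator()));
    }
}
```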
    /**
     * Get DtoMetadata field
     *
     * @param compoundForm compound form
     * @return DtoMetadata field
     */
    private DtoMetadata getMetadataField(String compoundForm) {
        for (DtoMetadata dtom : dtomList) {
            if (dtom.matches(compoundForm, false)) {
                return dtom;
            }
        }
        return null;
    }
    /**
     * Write undo directory and files to disk in archive format
     *
     * @param undoDir - the root directory of the undo archive
     * @throws IOException                       if IO error
     * @throws ParserConfigurationException      if config error
     * @throws TransformerConfigurationException if transformer config error
     * @throws TransformerException              if transformer error
     * @throws FileNotFoundException             if file not found
     */
    public void writeUndo(File undoDir)
        throws IOException, ParserConfigurationException, TransformerConfigurationException,
        TransformerException, FileNotFoundException {
        // create directory for item
        File dir = new File(undoDir, dirname);
        if (!dir.exists() && !dir.mkdir()) {
            log.error("Unable to create undo directory");
        }

        OutputStream out = null;

        try {
            out = new FileOutputStream(new File(dir, "dublin_core.xml"));
            Document doc = MetadataUtilities.writeDublinCore(getDocumentBuilder(), undoDtomList);
            MetadataUtilities.writeDocument(doc, getTransformer(), out);

            // if undo has delete bitstream
            if (undoAddContents.size() > 0) {
                PrintWriter pw = null;
                try {
                    File f = new File(dir, ItemUpdate.DELETE_CONTENTS_FILE);
                    pw = new PrintWriter(new BufferedWriter(new FileWriter(f)));
                    for (UUID i : undoAddContents) {
                        pw.println(i);
                    }
                } finally {
                    if (pw != null) {
                        pw.close();
                    }
                }
            }
        } finally {
            if (out != null) {
                out.close();
            }
        }
    }
} //end class
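writeUndo() closes its streams with manual try/finally blocks. A sketch of the delete_contents write using try-with-resources instead, round-tripping one UUID through a temp file; IOException is wrapped only for demo brevity, and the file name is a stand-in:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.UUID;

public class UndoContentsDemo {
    // Writes the ids to a temp delete_contents file (one per line) and reads the
    // first line back; try-with-resources closes the writer even on error.
    static String roundTrip(UUID id) {
        try {
            Path f = Files.createTempFile("delete_contents", ".txt");
            try (PrintWriter pw = new PrintWriter(new BufferedWriter(new FileWriter(f.toFile())))) {
                for (UUID i : List.of(id)) {
                    pw.println(i);
                }
            }
            return Files.readAllLines(f).get(0);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        UUID id = UUID.randomUUID();
        System.out.println(roundTrip(id).equals(id.toString()));
    }
}
```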
@@ -9,12 +9,18 @@ package org.dspace.app.itemupdate;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.io.FilenameFilter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;

import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
@@ -32,75 +38,68 @@ import org.dspace.eperson.factory.EPersonServiceFactory;
import org.dspace.eperson.service.EPersonService;
/**
 * Provides some batch editing capabilities for items in DSpace:
 * Metadata fields - Add, Delete
 * Bitstreams - Add, Delete
 *
 * The design has been for compatibility with ItemImporter
 * in the use of the DSpace archive format, which is used to
 * specify changes on a per-item basis. The directory names
 * corresponding to each item are arbitrary and will only be
 * used for logging purposes. The reference to the item is
 * from a required dc.identifier with the item handle to be
 * included in the dublin_core.xml (or similar metadata) file.
 *
 * Any combination of these actions is permitted in a single run of this class.
 * The order of actions is important when used in combination.
 * It is the responsibility of the calling class (here, ItemUpdate)
 * to register UpdateAction classes in the order in which they are
 * to be performed.
 *
 * It is unfortunate that so much code needs to be borrowed
 * from ItemImport as it is not reusable in private methods, etc.
 * Some of this has been placed into the MetadataUtilities class
 * for possible reuse elsewhere.
 *
 * @author W. Hays based on a conceptual design by R. Rodgers
 */
public class ItemUpdate {

    public static final String SUPPRESS_UNDO_FILENAME = "suppress_undo";

    public static final String CONTENTS_FILE = "contents";
    public static final String DELETE_CONTENTS_FILE = "delete_contents";

    public static String HANDLE_PREFIX = null;
    public static final Map<String, String> filterAliases = new HashMap<String, String>();

    public static boolean verbose = false;

    protected static final EPersonService epersonService = EPersonServiceFactory.getInstance().getEPersonService();
    protected static final ItemService itemService = ContentServiceFactory.getInstance().getItemService();

    static {
        filterAliases.put("ORIGINAL", "org.dspace.app.itemupdate.OriginalBitstreamFilter");
        filterAliases
            .put("ORIGINAL_AND_DERIVATIVES", "org.dspace.app.itemupdate.OriginalWithDerivativesBitstreamFilter");
        filterAliases.put("TEXT", "org.dspace.app.itemupdate.DerivativeTextBitstreamFilter");
        filterAliases.put("THUMBNAIL", "org.dspace.app.itemupdate.ThumbnailBitstreamFilter");
    }

    // File listing filter to check for folders
    static FilenameFilter directoryFilter = new FilenameFilter() {
        @Override
        public boolean accept(File dir, String n) {
            File f = new File(dir.getAbsolutePath() + File.separatorChar + n);
            return f.isDirectory();
        }
    };

    // File listing filter to check for files (not directories)
    static FilenameFilter fileFilter = new FilenameFilter() {
        @Override
        public boolean accept(File dir, String n) {
            File f = new File(dir.getAbsolutePath() + File.separatorChar + n);
            return (f.isFile());
        }
    };
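In main(), the -D argument is first looked up in the filterAliases map declared above and otherwise treated as a fully qualified class name, then instantiated reflectively. A self-contained sketch of that resolution; java.lang.StringBuilder is a hypothetical stand-in for a real BitstreamFilter class, and getDeclaredConstructor().newInstance() replaces the deprecated Class.newInstance() used in the original:

```java
import java.util.HashMap;
import java.util.Map;

public class AliasDemo {
    // Stand-in alias table; the real map points at DSpace BitstreamFilter classes.
    static final Map<String, String> filterAliases = new HashMap<>();

    static {
        filterAliases.put("TEXT", "java.lang.StringBuilder"); // hypothetical stand-in class
    }

    // Alias lookup falls back to treating the argument as a class name, then reflects.
    static Object resolveFilter(String name) {
        String className = filterAliases.getOrDefault(name, name);
        try {
            return Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException("Failure instantiating filter class: " + className, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(resolveFilter("TEXT").getClass().getName()); // java.lang.StringBuilder
    }
}
```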
@@ -110,520 +109,458 @@ public class ItemUpdate {
    protected ActionManager actionMgr = new ActionManager();
    protected List<String> undoActionList = new ArrayList<String>();
    protected String eperson;

    /**
     * @param argv the command line arguments given
     */
    public static void main(String[] argv) {
        // create an options object and populate it
        CommandLineParser parser = new PosixParser();

        Options options = new Options();

        //processing basis for determining items
        //item-specific changes with metadata in source directory with dublin_core.xml files
        options.addOption("s", "source", true, "root directory of source dspace archive ");

        //actions on items
        options.addOption("a", "addmetadata", true,
                          "add metadata specified for each item; multiples separated by semicolon ';'");
        options.addOption("d", "deletemetadata", true, "delete metadata specified for each item");

        options.addOption("A", "addbitstreams", false, "add bitstreams as specified for each item");

        // extra work to get optional argument
        Option delBitstreamOption = new Option("D", "deletebitstreams", true,
                                               "delete bitstreams as specified for each item");
        delBitstreamOption.setOptionalArg(true);
        delBitstreamOption.setArgName("BitstreamFilter");
        options.addOption(delBitstreamOption);

        //other params
        options.addOption("e", "eperson", true, "email of eperson doing the update");
        options.addOption("i", "itemfield", true,
                          "optional metadata field containing the item identifier; default is dc.identifier.uri");
        options.addOption("F", "filter-properties", true, "filter class name; only for deleting bitstream");
        options.addOption("v", "verbose", false, "verbose logging");

        //special run states
        options.addOption("t", "test", false, "test run - do not actually import items");
        options.addOption("P", "provenance", false, "suppress altering provenance field for bitstream changes");
        options.addOption("h", "help", false, "help");

        int status = 0;
        boolean isTest = false;
        boolean alterProvenance = true;
        String itemField = null;
        String metadataIndexName = null;

        Context context = null;
        ItemUpdate iu = new ItemUpdate();

        try {
            CommandLine line = parser.parse(options, argv);

            if (line.hasOption('h')) {
                HelpFormatter myhelp = new HelpFormatter();
                myhelp.printHelp("ItemUpdate", options);
                pr("");
                pr("Examples:");
                pr("    adding metadata:     ItemUpdate -e jsmith@mit.edu -s sourcedir -a dc.contributor -a dc.subject ");
                pr("    deleting metadata:   ItemUpdate -e jsmith@mit.edu -s sourcedir -d dc.description.other");
                pr("    adding bitstreams:   ItemUpdate -e jsmith@mit.edu -s sourcedir -A -i dc.identifier");
                pr("    deleting bitstreams: ItemUpdate -e jsmith@mit.edu -s sourcedir -D ORIGINAL ");
                pr("");

                System.exit(0);
            }

            if (line.hasOption('v')) {
                verbose = true;
            }

            if (line.hasOption('P')) {
                alterProvenance = false;
                pr("Suppressing changes to Provenance field option");
            }

            iu.eperson = line.getOptionValue('e'); // db ID or email

            if (!line.hasOption('s')) { // item specific changes from archive dir
                pr("Missing source archive option");
                System.exit(1);
            }
            String sourcedir = line.getOptionValue('s');

            if (line.hasOption('t')) { //test
                isTest = true;
                pr("**Test Run** - not actually updating items.");
            }

            if (line.hasOption('i')) {
                itemField = line.getOptionValue('i');
            }

            if (line.hasOption('d')) {
                String[] targetFields = line.getOptionValues('d');
                DeleteMetadataAction delMetadataAction = (DeleteMetadataAction) iu.actionMgr
                    .getUpdateAction(DeleteMetadataAction.class);
                delMetadataAction.addTargetFields(targetFields);

                //undo is an add
                for (String field : targetFields) {
                    iu.undoActionList.add(" -a " + field + " ");
                }

                pr("Delete metadata for fields: ");
                for (String s : targetFields) {
                    pr("    " + s);
                }
            }

            if (line.hasOption('a')) {
                String[] targetFields = line.getOptionValues('a');
                AddMetadataAction addMetadataAction = (AddMetadataAction) iu.actionMgr
                    .getUpdateAction(AddMetadataAction.class);
                addMetadataAction.addTargetFields(targetFields);

                //undo is a delete followed by an add of a replace record for target fields
                for (String field : targetFields) {
                    iu.undoActionList.add(" -d " + field + " ");
                }

                for (String field : targetFields) {
                    iu.undoActionList.add(" -a " + field + " ");
                }

                pr("Add metadata for fields: ");
                for (String s : targetFields) {
                    pr("    " + s);
                }
            }

            if (line.hasOption('D')) { // undo not supported
                pr("Delete bitstreams ");

                String[] filterNames = line.getOptionValues('D');
                if ((filterNames != null) && (filterNames.length > 1)) {
                    pr("Error: Only one filter can be used at a time.");
                    System.exit(1);
                }

                String filterName = line.getOptionValue('D');
                pr("Filter argument: " + filterName);

                if (filterName == null) { // indicates using delete_contents files
                    DeleteBitstreamsAction delAction = (DeleteBitstreamsAction) iu.actionMgr
                        .getUpdateAction(DeleteBitstreamsAction.class);
                    delAction.setAlterProvenance(alterProvenance);
                } else {
                    // check if param is on ALIAS list
                    String filterClassname = filterAliases.get(filterName);

                    if (filterClassname == null) {
                        filterClassname = filterName;
                    }

                    BitstreamFilter filter = null;
                    try {
                        Class<?> cfilter = Class.forName(filterClassname);
                        pr("BitstreamFilter class to instantiate: " + cfilter.toString());
                        filter = (BitstreamFilter) cfilter.newInstance(); //unfortunate cast, an erasure consequence
                    } catch (Exception e) {
                        pr("Error: Failure instantiating bitstream filter class: " + filterClassname);
                        System.exit(1);
                    }

                    String filterPropertiesName = line.getOptionValue('F');
                    if (filterPropertiesName != null) { //not always required
                        try {
                            // TODO try multiple relative locations, e.g. source dir
                            if (!filterPropertiesName.startsWith("/")) {
                                filterPropertiesName = sourcedir + File.separator + filterPropertiesName;
                            }
                            filter.initProperties(filterPropertiesName);
                        } catch (Exception e) {
                            pr("Error: Failure finding properties file for bitstream filter class: " +
                                   filterPropertiesName);
                            System.exit(1);
                        }
                    }

                    DeleteBitstreamsByFilterAction delAction =
                        (DeleteBitstreamsByFilterAction) iu.actionMgr
                            .getUpdateAction(DeleteBitstreamsByFilterAction.class);
                    delAction.setAlterProvenance(alterProvenance);
                    delAction.setBitstreamFilter(filter);
                    //undo not supported
                }
            }

            if (line.hasOption('A')) {
                pr("Add bitstreams ");
                AddBitstreamsAction addAction = (AddBitstreamsAction) iu.actionMgr
                    .getUpdateAction(AddBitstreamsAction.class);
                addAction.setAlterProvenance(alterProvenance);

                iu.undoActionList.add(" -D "); // delete_contents file will be written, no arg required
            }

            if (!iu.actionMgr.hasActions()) {
                pr("Error - an action must be specified");
                System.exit(1);
            } else {
                pr("Actions to be performed: ");

                for (UpdateAction ua : iu.actionMgr) {
                    pr("    " + ua.getClass().getName());
                }
            }

            pr("ItemUpdate - initializing run on " + (new Date()).toString());

            context = new Context(Context.Mode.BATCH_EDIT);
            iu.setEPerson(context, iu.eperson);
            context.turnOffAuthorisationSystem();

            HANDLE_PREFIX = ConfigurationManager.getProperty("handle.canonical.prefix");
            if (HANDLE_PREFIX == null || HANDLE_PREFIX.length() == 0) {
                HANDLE_PREFIX = "http://hdl.handle.net/";
            }

            iu.processArchive(context, sourcedir, itemField, metadataIndexName, alterProvenance, isTest);

            context.complete();  // complete all transactions
        } catch (Exception e) {
            if (context != null && context.isValid()) {
                context.abort();
            }
            e.printStackTrace();
            pr(e.toString());
            status = 1;
        } finally {
            if (context != null) {
                context.restoreAuthSystemState();
            }
        }

        if (isTest) {
            pr("***End of Test Run***");
        } else {
            pr("End.");
        }
        System.exit(status);
    }
    /**
     * process an archive
     *
     * @param context DSpace Context
     * @param sourceDirPath source path
     * @param itemField item field
     * @param metadataIndexName index name
     * @param alterProvenance whether to alter provenance
     * @param isTest test flag
     * @throws Exception if error
     */
    protected void processArchive(Context context, String sourceDirPath, String itemField,
                                  String metadataIndexName, boolean alterProvenance, boolean isTest)
        throws Exception {
        // open and process the source directory
        File sourceDir = new File(sourceDirPath);

        if ((sourceDir == null) || !sourceDir.exists() || !sourceDir.isDirectory()) {
            pr("Error, cannot open archive source directory " + sourceDirPath);
            throw new Exception("error with archive source directory " + sourceDirPath);
        }

        String[] dircontents = sourceDir.list(directoryFilter);  //just the names, not the path
        Arrays.sort(dircontents);

        //Undo is suppressed to prevent undo of undo
        boolean suppressUndo = false;
        File fSuppressUndo = new File(sourceDir, SUPPRESS_UNDO_FILENAME);
        if (fSuppressUndo.exists()) {
            suppressUndo = true;
        }

        File undoDir = null;  //sibling directory of source archive
        if (!suppressUndo && !isTest) {
            undoDir = initUndoArchive(sourceDir);
        }

        int itemCount = 0;
        int successItemCount = 0;

        for (String dirname : dircontents) {
            itemCount++;
            pr("");
            pr("processing item " + dirname);

            try {
                ItemArchive itarch = ItemArchive.create(context, new File(sourceDir, dirname), itemField);

                for (UpdateAction action : actionMgr) {
                    pr("action: " + action.getClass().getName());
                    action.execute(context, itarch, isTest, suppressUndo);
                    if (!isTest && !suppressUndo) {
                        itarch.writeUndo(undoDir);
                    }
                }
                if (!isTest) {
                    Item item = itarch.getItem();
                    itemService.update(context, item);  //need to update before commit
                    context.uncacheEntity(item);
                }
                ItemUpdate.pr("Item " + dirname + " completed");
                successItemCount++;
            } catch (Exception e) {
                pr("Exception processing item " + dirname + ": " + e.toString());
                e.printStackTrace();
            }
        }

        if (!suppressUndo && !isTest) {
            StringBuilder sb = new StringBuilder("dsrun org.dspace.app.itemupdate.ItemUpdate ");
            sb.append(" -e ").append(this.eperson);
            sb.append(" -s ").append(undoDir);

            if (itemField != null) {
                sb.append(" -i ").append(itemField);
            }

            if (!alterProvenance) {
                sb.append(" -P ");
            }
            if (isTest) {
                sb.append(" -t ");
            }

            for (String actionOption : undoActionList) {
                sb.append(actionOption);
            }

            PrintWriter pw = null;
            try {
                File cmdFile = new File(undoDir.getParent(), undoDir.getName() + "_command.sh");
                pw = new PrintWriter(new BufferedWriter(new FileWriter(cmdFile)));
                pw.println(sb.toString());
            } finally {
                pw.close();
            }
        }

        pr("");
        pr("Done processing.  Successful items: " + successItemCount + " of " + itemCount + " items in source archive");
        pr("");
    }
    /**
     * to avoid overwriting the undo source tree on repeated processing
     * sequence numbers are added and checked
     *
     * @param sourceDir - the original source directory
     * @return the directory of the undo archive
     * @throws FileNotFoundException if file doesn't exist
     * @throws IOException if IO error
     */
    protected File initUndoArchive(File sourceDir)
        throws FileNotFoundException, IOException {
        File parentDir = sourceDir.getCanonicalFile().getParentFile();
        if (parentDir == null) {
            throw new FileNotFoundException(
                "Parent directory of archive directory not found; unable to write UndoArchive; no processing " +
                    "performed");
        }

        String sourceDirName = sourceDir.getName();
        int seqNo = 1;

        File undoDir = new File(parentDir, "undo_" + sourceDirName + "_" + seqNo);
        while (undoDir.exists()) {
            undoDir = new File(parentDir, "undo_" + sourceDirName + "_" + ++seqNo); //increment
        }

        // create root directory
        if (!undoDir.mkdir()) {
            pr("ERROR creating Undo Archive directory " + undoDir.getCanonicalPath());
            throw new IOException("ERROR creating Undo Archive directory " + undoDir.getCanonicalPath());
        }

        //Undo is suppressed to prevent undo of undo
        File fSuppressUndo = new File(undoDir, ItemUpdate.SUPPRESS_UNDO_FILENAME);
        try {
            fSuppressUndo.createNewFile();
        } catch (IOException e) {
            pr("ERROR creating Suppress Undo File " + e.toString());
            throw e;
        }
        return undoDir;
    }
//private void write
    /**
     * Set EPerson doing import
     *
     * @param context DSpace Context
     * @param eperson EPerson obj
     * @throws Exception if error
     */
    protected void setEPerson(Context context, String eperson)
        throws Exception {
        if (eperson == null) {
            pr("Error - an eperson to do the importing must be specified");
            pr(" (run with -h flag for details)");
            throw new Exception("EPerson not specified.");
        }

        EPerson myEPerson = null;

        if (eperson.indexOf('@') != -1) {
            // @ sign, must be an email
            myEPerson = epersonService.findByEmail(context, eperson);
        } else {
            myEPerson = epersonService.find(context, UUID.fromString(eperson));
        }

        if (myEPerson == null) {
            pr("Error, eperson cannot be found: " + eperson);
            throw new Exception("Invalid EPerson");
        }

        context.setCurrentUser(myEPerson);
    }
    /**
     * poor man's logging
     * As with ItemImport, API logging goes through log4j to the DSpace.log files
     * whereas the batch logging goes to the console to be captured there.
     *
     * @param s String
     */
    static void pr(String s) {
        System.out.println(s);
    }
    /**
     * print if verbose flag is set
     *
     * @param s String
     */
    static void prv(String s) {
        if (verbose) {
            System.out.println(s);
        }
    }
} //end of class
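The undo-archive naming in `initUndoArchive` above probes sibling directories `undo_<source>_<seq>` until it finds an unused sequence number. A minimal standalone sketch of that scheme (hypothetical class and method names; a `Set` of names stands in for the `File.exists()` checks on disk):

```java
import java.util.HashSet;
import java.util.Set;

// Sketch only: mirrors the sequence-numbering loop in initUndoArchive,
// with a Set of names standing in for File.exists() checks on disk.
public class UndoDirNaming {
    static String nextUndoName(Set<String> existing, String sourceDirName) {
        int seqNo = 1;
        String candidate = "undo_" + sourceDirName + "_" + seqNo;
        while (existing.contains(candidate)) {
            candidate = "undo_" + sourceDirName + "_" + ++seqNo; // increment until free
        }
        return candidate;
    }

    public static void main(String[] args) {
        Set<String> existing = new HashSet<>();
        existing.add("undo_archive_1");
        existing.add("undo_archive_2");
        System.out.println(nextUndoName(existing, "archive")); // undo_archive_3
    }
}
```

The real method then writes the `SUPPRESS_UNDO_FILENAME` marker into the freshly created directory, so a later run against the undo archive cannot itself generate an undo.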
@@ -11,14 +11,13 @@ import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.SQLException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.Result;
@@ -31,10 +30,14 @@ import javax.xml.transform.stream.StreamResult;
import org.apache.commons.lang.StringUtils;
import org.apache.xpath.XPathAPI;
import org.dspace.authorize.AuthorizeException;
import org.dspace.content.Item;
import org.dspace.content.MetadataField;
import org.dspace.content.MetadataSchema;
import org.dspace.content.MetadataValue;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
@@ -43,492 +46,420 @@ import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;
/**
 * Miscellaneous methods for metadata handling that build on the API
 * which might have general utility outside of the specific use
 * in context in ItemUpdate.
 *
 * The XML methods were based on those in ItemImport
 */
public class MetadataUtilities {

    protected static final ItemService itemService = ContentServiceFactory.getInstance().getItemService();
    /**
     * Default constructor
     */
    private MetadataUtilities() { }

    /**
     * Working around Item API to delete a value-specific Metadatum
     * For a given element/qualifier/lang:
     * get all DCValues
     * clear (i.e. delete) all of these DCValues
     * add them back, minus the one to actually delete
     *
     * @param context DSpace Context
     * @param item Item Object
     * @param dtom metadata field
     * @param isLanguageStrict whether strict or not
     * @return true if metadata field is found with matching value and was deleted
     * @throws SQLException if database error
     */
    public static boolean deleteMetadataByValue(Context context, Item item, DtoMetadata dtom, boolean isLanguageStrict)
        throws SQLException {
        List<MetadataValue> ar = null;

        if (isLanguageStrict) {   // get all for given type
            ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
        } else {
            ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
        }

        boolean found = false;

        //build new set minus the one to delete
        List<String> vals = new ArrayList<String>();
        for (MetadataValue dcv : ar) {
            if (dcv.getValue().equals(dtom.value)) {
                found = true;
            } else {
                vals.add(dcv.getValue());
            }
        }

        if (found) {  //remove all for given type  ??synchronize this block??
            if (isLanguageStrict) {
                itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
            } else {
                itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
            }

            itemService.addMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language, vals);
        }
        return found;
    }
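`deleteMetadataByValue` cannot remove a single value directly, so it rebuilds the whole set: collect every value except the target, and only if the target was actually present clear the field and re-add the remainder. A standalone sketch of that rebuild step (hypothetical names; plain strings stand in for `MetadataValue`):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch only: the clear-and-re-add workaround from deleteMetadataByValue,
// reduced to plain strings. Returns the surviving values, or null when the
// target was not found (the real method then leaves the item untouched).
public class DeleteByValueDemo {
    static List<String> removeByValue(List<String> current, String target) {
        boolean found = false;
        List<String> vals = new ArrayList<>();
        for (String v : current) {
            if (v.equals(target)) {
                found = true;   // drop this one
            } else {
                vals.add(v);    // keep everything else
            }
        }
        return found ? vals : null;
    }

    public static void main(String[] args) {
        System.out.println(removeByValue(Arrays.asList("a", "b", "c"), "b")); // [a, c]
        System.out.println(removeByValue(Arrays.asList("a", "c"), "x"));      // null
    }
}
```

Note that, like the original, this drops every occurrence of a duplicated target value, since each match only sets the `found` flag and is skipped.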
    /**
     * Append text to value metadata field to item
     *
     * @param context DSpace Context
     * @param item DSpace Item
     * @param dtom metadata field
     * @param isLanguageStrict if strict
     * @param textToAppend text to append
     * @throws IllegalArgumentException - When target metadata field is not found
     * @throws SQLException if database error
     */
    public static void appendMetadata(Context context, Item item, DtoMetadata dtom, boolean isLanguageStrict,
                                      String textToAppend)
        throws IllegalArgumentException, SQLException {
        List<MetadataValue> ar = null;

        // get all values for given element/qualifier
        if (isLanguageStrict) {
            ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
        } else {
            ar = itemService.getMetadata(item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
        }

        if (ar.size() == 0) {
            throw new IllegalArgumentException("Metadata to append to not found");
        }

        int idx = 0; //index of field to change
        if (ar.size() > 1) { //need to pick one, can't be sure it's the last one
            // TODO maybe get highest id ?
        }

        //build new set minus the one to delete
        List<String> vals = new ArrayList<String>();
        for (int i = 0; i < ar.size(); i++) {
            if (i == idx) {
                vals.add(ar.get(i).getValue() + textToAppend);
            } else {
                vals.add(ar.get(i).getValue());
            }
        }

        if (isLanguageStrict) {
            itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language);
        } else {
            itemService.clearMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, Item.ANY);
        }
        itemService.addMetadata(context, item, dtom.schema, dtom.element, dtom.qualifier, dtom.language, vals);
    }
    /**
     * Modification of method from ItemImporter.loadDublinCore
     * as a Factory method
     *
     * @param docBuilder DocumentBuilder
     * @param is - InputStream of dublin_core.xml
     * @return list of DtoMetadata representing the metadata fields relating to an Item
     * @throws SQLException if database error
     * @throws IOException if IO error
     * @throws ParserConfigurationException if parser config error
     * @throws SAXException if XML error
     * @throws TransformerException if transformer error
     * @throws AuthorizeException if authorization error
     */
    public static List<DtoMetadata> loadDublinCore(DocumentBuilder docBuilder, InputStream is)
        throws SQLException, IOException, ParserConfigurationException,
        SAXException, TransformerException, AuthorizeException {
        Document document = docBuilder.parse(is);

        List<DtoMetadata> dtomList = new ArrayList<DtoMetadata>();

        // Get the schema, for backward compatibility we will default to the
        // dublin core schema if the schema name is not available in the import file
        String schema = null;
        NodeList metadata = XPathAPI.selectNodeList(document, "/dublin_core");
        Node schemaAttr = metadata.item(0).getAttributes().getNamedItem("schema");
        if (schemaAttr == null) {
            schema = MetadataSchema.DC_SCHEMA;
        } else {
            schema = schemaAttr.getNodeValue();
        }

        // Get the nodes corresponding to formats
        NodeList dcNodes = XPathAPI.selectNodeList(document, "/dublin_core/dcvalue");

        for (int i = 0; i < dcNodes.getLength(); i++) {
            Node n = dcNodes.item(i);
            String value = getStringValue(n).trim();
            // compensate for empty value getting read as "null", which won't display
            if (value == null) {
                value = "";
            }
            String element = getAttributeValue(n, "element");
            if (element != null) {
                element = element.trim();
            }
            String qualifier = getAttributeValue(n, "qualifier");
            if (qualifier != null) {
                qualifier = qualifier.trim();
            }
            String language = getAttributeValue(n, "language");
            if (language != null) {
                language = language.trim();
            }

            if ("none".equals(qualifier) || "".equals(qualifier)) {
                qualifier = null;
            }

            // a goofy default, but consistent with DSpace treatment elsewhere
            if (language == null) {
                language = "en";
            } else if ("".equals(language)) {
                language = ConfigurationManager.getProperty("default.language");
            }

            DtoMetadata dtom = DtoMetadata.create(schema, element, qualifier, language, value);
            ItemUpdate.pr(dtom.toString());
            dtomList.add(dtom);
        }
        return dtomList;
    }
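`loadDublinCore` expects a `dublin_core.xml` whose root carries an optional `schema` attribute and whose `dcvalue` children carry `element`, `qualifier`, and `language` attributes. A standalone sketch of that document shape parsed with plain DOM instead of `XPathAPI` (hypothetical class name; the `dc` fallback and the `qualifier="none"` normalization follow the method above):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch only: reads the dublin_core.xml shape consumed by loadDublinCore,
// returning "schema.element[.qualifier] = value" strings.
public class DublinCoreParseDemo {
    static List<String> parse(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));

            // Backward-compatible default, as in loadDublinCore
            String schema = doc.getDocumentElement().getAttribute("schema");
            if (schema.isEmpty()) {
                schema = "dc";
            }

            List<String> out = new ArrayList<>();
            NodeList dcNodes = doc.getElementsByTagName("dcvalue");
            for (int i = 0; i < dcNodes.getLength(); i++) {
                Element n = (Element) dcNodes.item(i);
                String qualifier = n.getAttribute("qualifier");
                if ("none".equals(qualifier) || qualifier.isEmpty()) {
                    qualifier = null;  // same normalization as loadDublinCore
                }
                out.add(schema + "." + n.getAttribute("element")
                    + (qualifier == null ? "" : "." + qualifier)
                    + " = " + n.getTextContent());
            }
            return out;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String xml = "<dublin_core schema=\"dc\">"
            + "<dcvalue element=\"title\" qualifier=\"none\">Sample Item</dcvalue>"
            + "<dcvalue element=\"identifier\" qualifier=\"uri\">http://hdl.handle.net/123/4</dcvalue>"
            + "</dublin_core>";
        parse(xml).forEach(System.out::println);
        // dc.title = Sample Item
        // dc.identifier.uri = http://hdl.handle.net/123/4
    }
}
```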
    /**
     * Write dublin_core.xml
     *
     * @param docBuilder DocumentBuilder
     * @param dtomList List of metadata fields
     * @return xml document
     * @throws ParserConfigurationException if parser config error
     * @throws TransformerConfigurationException if transformer config error
     * @throws TransformerException if transformer error
     */
    public static Document writeDublinCore(DocumentBuilder docBuilder, List<DtoMetadata> dtomList)
        throws ParserConfigurationException, TransformerConfigurationException, TransformerException {
        Document doc = docBuilder.newDocument();
        Element root = doc.createElement("dublin_core");
        doc.appendChild(root);

        for (DtoMetadata dtom : dtomList) {
            Element mel = doc.createElement("dcvalue");
            mel.setAttribute("element", dtom.element);
            if (dtom.qualifier == null) {
                mel.setAttribute("qualifier", "none");
            } else {
                mel.setAttribute("qualifier", dtom.qualifier);
            }

            if (StringUtils.isEmpty(dtom.language)) {
                mel.setAttribute("language", "en");
            } else {
                mel.setAttribute("language", dtom.language);
            }
            mel.setTextContent(dtom.value);
            root.appendChild(mel);
        }

        return doc;
    }
    /**
     * write xml document to output stream
     *
     * @param doc XML Document
     * @param transformer XML Transformer
     * @param out OutputStream
     * @throws IOException if IO Error
     * @throws TransformerException if Transformer error
     */
    public static void writeDocument(Document doc, Transformer transformer, OutputStream out)
        throws IOException, TransformerException {
        Source src = new DOMSource(doc);
        Result dest = new StreamResult(out);
        transformer.transform(src, dest);
    }
    // XML utility methods

    /**
     * Lookup an attribute from a DOM node.
     *
     * @param n Node
     * @param name name
     * @return attribute value
     */
    private static String getAttributeValue(Node n, String name) {
        NamedNodeMap nm = n.getAttributes();

        for (int i = 0; i < nm.getLength(); i++) {
            Node node = nm.item(i);
            if (name.equals(node.getNodeName())) {
                return node.getNodeValue();
            }
        }
        return "";
    }
    /**
     * Return the String value of a Node.
     *
     * @param node node
     * @return string value
     */
    private static String getStringValue(Node node) {
        String value = node.getNodeValue();

        if (node.hasChildNodes()) {
            Node first = node.getFirstChild();
            if (first.getNodeType() == Node.TEXT_NODE) {
                return first.getNodeValue();
            }
        }
        return value;
    }
    /**
     * Rewrite of ItemImport's functionality
     * but just the parsing of the file, not the processing of its elements.
     *
     * @param f file
     * @return list of ContentsEntry
     * @throws FileNotFoundException if file doesn't exist
     * @throws IOException if IO error
     * @throws ParseException if parse error
     */
    public static List<ContentsEntry> readContentsFile(File f)
        throws FileNotFoundException, IOException, ParseException {
        List<ContentsEntry> list = new ArrayList<ContentsEntry>();

        BufferedReader in = null;

        try {
            in = new BufferedReader(new FileReader(f));
            String line = null;

            while ((line = in.readLine()) != null) {
                line = line.trim();
                if ("".equals(line)) {
                    continue;
                }
                ItemUpdate.pr("Contents entry: " + line);
                list.add(ContentsEntry.parse(line));
            }
        } finally {
            try {
                in.close();
            } catch (IOException e) {
                //skip
            }
        }

        return list;
    }
    /**
     * @param f file
     * @return list of lines as strings
     * @throws FileNotFoundException if file doesn't exist
     * @throws IOException if IO Error
     */
    public static List<String> readDeleteContentsFile(File f)
        throws FileNotFoundException, IOException {
        List<String> list = new ArrayList<>();

        BufferedReader in = null;

        try {
            in = new BufferedReader(new FileReader(f));
            String line = null;

            while ((line = in.readLine()) != null) {
                line = line.trim();
                if ("".equals(line)) {
                    continue;
                }

                list.add(line);
            }
        } finally {
            try {
                in.close();
            } catch (IOException e) {
                //skip
            }
        }

        return list;
    }
    /**
     * Get display of Metadatum
     *
     * @param dcv MetadataValue
     * @return string displaying elements of the Metadatum
     */
    public static String getDCValueString(MetadataValue dcv) {
        MetadataField metadataField = dcv.getMetadataField();
        MetadataSchema metadataSchema = metadataField.getMetadataSchema();
        return "schema: " + metadataSchema.getName() + "; element: " + metadataField
            .getElement() + "; qualifier: " + metadataField.getQualifier() +
            "; language: " + dcv.getLanguage() + "; value: " + dcv.getValue();
    }
    /**
     * Return compound form of a metadata field (i.e. schema.element.qualifier)
     *
     * @param schema schema
     * @param element element
     * @param qualifier qualifier
     * @return a String representation of the two- or three-part form of a metadata element
     * e.g. dc.identifier.uri
     */
    public static String getCompoundForm(String schema, String element, String qualifier) {
        StringBuilder sb = new StringBuilder();
        sb.append(schema).append(".").append(element);

        if (qualifier != null) {
            sb.append(".").append(qualifier);
        }
        return sb.toString();
    }
    /**
     * Parses metadata field given in the form {@code <schema>.<element>[.<qualifier>|.*]}
     * checks for correct number of elements (2 or 3) and for empty strings
     *
     * @param compoundForm compound form of metadata field
     * @return String Array
     * @throws ParseException if validity checks fail
     */
    public static String[] parseCompoundForm(String compoundForm)
        throws ParseException {
        String[] ar = compoundForm.split("\\s*\\.\\s*"); //trim ends

        if ("".equals(ar[0])) {
            throw new ParseException("schema is empty string: " + compoundForm, 0);
        }

        if ((ar.length < 2) || (ar.length > 3) || "".equals(ar[1])) {
            throw new ParseException("element is malformed or empty string: " + compoundForm, 0);
        }

        return ar;
    }
}


@@ -13,45 +13,37 @@ import java.util.List;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;

/**
 * Filter all bitstreams in the ORIGINAL bundle
 * Also delete all derivative bitstreams, i.e.
 * all bitstreams in the TEXT and THUMBNAIL bundles
 */
public class OriginalBitstreamFilter extends BitstreamFilterByBundleName {
    public OriginalBitstreamFilter() {
        //empty
    }

    /**
     * Tests bitstreams for containment in an ORIGINAL bundle
     *
     * @param bitstream Bitstream
     * @return true if the bitstream is in the ORIGINAL bundle
     * @throws BitstreamFilterException if filter error
     */
    @Override
    public boolean accept(Bitstream bitstream)
        throws BitstreamFilterException {
        try {
            List<Bundle> bundles = bitstream.getBundles();
            for (Bundle bundle : bundles) {
                if (bundle.getName().equals("ORIGINAL")) {
                    return true;
                }
            }
        } catch (SQLException e) {
            throw new BitstreamFilterException(e);
        }
        return false;
    }
}


@@ -13,50 +13,41 @@ import java.util.List;
import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;

/**
 * Filter all bitstreams in the ORIGINAL bundle
 * Also delete all derivative bitstreams, i.e.
 * all bitstreams in the TEXT and THUMBNAIL bundles
 */
public class OriginalWithDerivativesBitstreamFilter extends BitstreamFilter {
    protected String[] bundlesToEmpty = {"ORIGINAL", "TEXT", "THUMBNAIL"};

    public OriginalWithDerivativesBitstreamFilter() {
        //empty
    }

    /**
     * Tests bitstream for membership in specified bundles (ORIGINAL, TEXT, THUMBNAIL)
     *
     * @param bitstream Bitstream
     * @return true if bitstream is in specified bundles
     * @throws BitstreamFilterException if error
     */
    @Override
    public boolean accept(Bitstream bitstream)
        throws BitstreamFilterException {
        try {
            List<Bundle> bundles = bitstream.getBundles();
            for (Bundle b : bundles) {
                for (String bn : bundlesToEmpty) {
                    if (b.getName().equals(bn)) {
                        return true;
                    }
                }
            }
        } catch (SQLException e) {
            throw new BitstreamFilterException(e);
        }
        return false;
    }
}
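Stripped of the DSpace `Bitstream`/`Bundle` types, the accept logic above reduces to a name match against `bundlesToEmpty`; a standalone sketch (the `BundleMatchDemo` class name is illustrative):

```java
public class BundleMatchDemo {
    // Same bundle list the filter empties
    static final String[] BUNDLES_TO_EMPTY = {"ORIGINAL", "TEXT", "THUMBNAIL"};

    // Core of accept(): true if the bundle name matches any target bundle
    static boolean accept(String bundleName) {
        for (String bn : BUNDLES_TO_EMPTY) {
            if (bundleName.equals(bn)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(accept("TEXT"));    // true
        System.out.println(accept("LICENSE")); // false
    }
}
```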


@@ -10,15 +10,13 @@ package org.dspace.app.itemupdate;
import java.util.Properties;

/**
 * Bitstream filter targeting the THUMBNAIL bundle
 */
public class ThumbnailBitstreamFilter extends BitstreamFilterByBundleName {

    public ThumbnailBitstreamFilter() {
        props = new Properties();
        props.setProperty("bundle", "THUMBNAIL");
    }
}


@@ -12,24 +12,22 @@ import org.dspace.content.service.ItemService;
import org.dspace.core.Context;

/**
 * Interface for actions to update an item
 */
public interface UpdateAction {
    public ItemService itemService = ContentServiceFactory.getInstance().getItemService();

    /**
     * Action to update item
     *
     * @param context DSpace context
     * @param itarch item archive
     * @param isTest test flag
     * @param suppressUndo undo flag
     * @throws Exception if error
     */
    public void execute(Context context, ItemArchive itarch, boolean isTest, boolean suppressUndo)
        throws Exception;
}


@@ -12,36 +12,32 @@ import org.dspace.content.service.BitstreamService;
import org.dspace.content.service.BundleService;

/**
 * Base class for Bitstream actions
 */
public abstract class UpdateBitstreamsAction implements UpdateAction {
    protected boolean alterProvenance = true;

    protected BundleService bundleService = ContentServiceFactory.getInstance().getBundleService();
    protected BitstreamService bitstreamService = ContentServiceFactory.getInstance().getBitstreamService();

    /**
     * Set variable to indicate that the dc.description.provenance field may
     * be changed as a result of Bitstream changes by ItemUpdate
     *
     * @param alterProvenance whether to alter provenance
     */
    public void setAlterProvenance(boolean alterProvenance) {
        this.alterProvenance = alterProvenance;
    }

    /**
     * @return boolean value to indicate whether the dc.description.provenance field may
     * be changed as a result of Bitstream changes by ItemUpdate
     */
    public boolean getAlterProvenance() {
        return alterProvenance;
    }
}


@@ -11,60 +11,57 @@ import java.util.HashSet;
import java.util.Set;

/**
 * This abstract subclass for metadata actions
 * maintains a collection for the target metadata fields
 * expressed as a string in the compound notation ( {@code <schema>.<element>.<qualifier>} )
 * on which to apply the action when the method execute is called.
 *
 * Implemented as a Set to avoid problems with duplicates
 */
public abstract class UpdateMetadataAction implements UpdateAction {

    protected Set<String> targetFields = new HashSet<String>();

    /**
     * Get target fields
     *
     * @return set of fields to update
     */
    public Set<String> getTargetFields() {
        return targetFields;
    }

    /**
     * Set target fields
     *
     * @param targetFields Set of target fields to update
     */
    public void addTargetFields(Set<String> targetFields) {
        for (String tf : targetFields) {
            this.targetFields.add(tf);
        }
    }

    /**
     * Add array of target fields to update
     *
     * @param targetFields array of target fields to update
     */
    public void addTargetFields(String[] targetFields) {
        for (String tf : targetFields) {
            this.targetFields.add(tf);
        }
    }

    /**
     * Add single field to update
     *
     * @param targetField target field to update
     */
    public void addTargetField(String targetField) {
        this.targetFields.add(targetField);
    }
}
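Because `targetFields` is a `HashSet`, adding the same compound field twice is harmless. A small standalone sketch of that dedup behaviour (plain `HashSet`, no DSpace types; the `TargetFieldsDemo` class name is illustrative):

```java
import java.util.HashSet;
import java.util.Set;

public class TargetFieldsDemo {
    // Same dedup behaviour as UpdateMetadataAction.addTargetFields(String[]):
    // a HashSet silently collapses repeated compound field names.
    static Set<String> addTargetFields(String[] fields) {
        Set<String> targetFields = new HashSet<>();
        for (String tf : fields) {
            targetFields.add(tf);
        }
        return targetFields;
    }

    public static void main(String[] args) {
        Set<String> fields = addTargetFields(
            new String[] {"dc.title", "dc.description.abstract", "dc.title"});
        System.out.println(fields.size()); // 2 — the duplicate dc.title collapses
    }
}
```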


@@ -15,29 +15,29 @@ import java.io.Reader;
import java.io.StreamTokenizer;
import java.util.ArrayList;
import java.util.List;

import org.jdom.Document;

/**
 * @author mwood
 */
public class CommandRunner {

    /**
     * Default constructor
     */
    private CommandRunner() { }

    /**
     * @param args the command line arguments given
     * @throws IOException if IO error
     * @throws FileNotFoundException if file doesn't exist
     */
    public static void main(String[] args)
        throws FileNotFoundException, IOException {
        if (args.length > 0) {
            runManyCommands(args[0]);
        } else {
            runManyCommands("-");
        }
        // There is no sensible way to use the status returned by runManyCommands().
@@ -54,19 +54,15 @@ public class CommandRunner
     *
     * @param script the file of command lines to be executed.
     * @return status code
     * @throws IOException if IO error
     * @throws FileNotFoundException if file doesn't exist
     */
    static int runManyCommands(String script)
        throws FileNotFoundException, IOException {
        Reader input;
        if ("-".equals(script)) {
            input = new InputStreamReader(System.in);
        } else {
            input = new FileReader(script);
        }
@@ -89,22 +85,16 @@ public class CommandRunner
        int status = 0;
        List<String> tokens = new ArrayList<String>();
        Document commandConfigs = ScriptLauncher.getConfig();
        while (StreamTokenizer.TT_EOF != tokenizer.nextToken()) {
            if (StreamTokenizer.TT_EOL == tokenizer.ttype) {
                if (tokens.size() > 0) {
                    status = ScriptLauncher.runOneCommand(commandConfigs, tokens.toArray(new String[tokens.size()]));
                    if (status > 0) {
                        break;
                    }
                    tokens.clear();
                }
            } else {
                tokens.add(tokenizer.sval);
            }
        }
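The per-line tokenizing in `runManyCommands` can be seen in isolation: collect word tokens until end-of-line, then treat the collected tokens as one command invocation. A sketch using the same `StreamTokenizer` pattern (the `TokenizerDemo` class and sample script lines are illustrative, not from DSpace):

```java
import java.io.IOException;
import java.io.StreamTokenizer;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class TokenizerDemo {
    // Same pattern as runManyCommands: gather word tokens until TT_EOL,
    // then emit the gathered tokens as one command line.
    static List<List<String>> splitLines(String script) throws IOException {
        StreamTokenizer tokenizer = new StreamTokenizer(new StringReader(script));
        tokenizer.eolIsSignificant(true); // report TT_EOL so lines can be split

        List<List<String>> lines = new ArrayList<>();
        List<String> tokens = new ArrayList<>();
        while (StreamTokenizer.TT_EOF != tokenizer.nextToken()) {
            if (StreamTokenizer.TT_EOL == tokenizer.ttype) {
                if (tokens.size() > 0) {
                    lines.add(new ArrayList<>(tokens)); // CommandRunner would run the command here
                    tokens.clear();
                }
            } else {
                tokens.add(tokenizer.sval);
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        List<List<String>> lines = splitLines("index discover\nfilter media\n");
        System.out.println(lines.size()); // one entry per non-empty script line
    }
}
```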


@@ -12,6 +12,7 @@ import java.io.IOException;
import java.lang.reflect.Method;
import java.util.List;
import java.util.TreeMap;

import org.dspace.servicemanager.DSpaceKernelImpl;
import org.dspace.servicemanager.DSpaceKernelInit;
import org.dspace.services.RequestService;
@@ -25,38 +26,37 @@ import org.jdom.input.SAXBuilder;
 * @author Stuart Lewis
 * @author Mark Diggory
 */
public class ScriptLauncher {
    /**
     * The service manager kernel
     */
    private static transient DSpaceKernelImpl kernelImpl;

    /**
     * Default constructor
     */
    private ScriptLauncher() { }

    /**
     * Execute the DSpace script launcher
     *
     * @param args Any parameters required to be passed to the scripts it executes
     * @throws IOException if IO error
     * @throws FileNotFoundException if file doesn't exist
     */
    public static void main(String[] args)
        throws FileNotFoundException, IOException {
        // Initialise the service manager kernel
        try {
            kernelImpl = DSpaceKernelInit.getKernel(null);
            if (!kernelImpl.isRunning()) {
                kernelImpl.start();
            }
        } catch (Exception e) {
            // Failed to start so destroy it and log and throw an exception
            try {
                kernelImpl.destroy();
            } catch (Exception e1) {
                // Nothing to do
            }
            String message = "Failure during kernel init: " + e.getMessage();
@@ -69,8 +69,7 @@ public class ScriptLauncher
        Document commandConfigs = getConfig();

        // Check that there is at least one argument (if not display command options)
        if (args.length < 1) {
            System.err.println("You must provide at least one command argument");
            display(commandConfigs);
            System.exit(1);
@@ -81,8 +80,7 @@ public class ScriptLauncher
        status = runOneCommand(commandConfigs, args);

        // Destroy the service kernel if it is still alive
        if (kernelImpl != null) {
            kernelImpl.destroy();
            kernelImpl = null;
        }
@@ -90,28 +88,29 @@ public class ScriptLauncher
        System.exit(status);
    }

    protected static int runOneCommand(Document commandConfigs, String[] args) {
        return runOneCommand(commandConfigs, args, kernelImpl);
    }

    /**
     * Recognize and execute a single command.
     *
     * @param commandConfigs Document
     * @param args the command line arguments given
     */
    public static int runOneCommand(Document commandConfigs, String[] args, DSpaceKernelImpl kernelImpl) {
        String request = args[0];
        Element root = commandConfigs.getRootElement();
        List<Element> commands = root.getChildren("command");
        Element command = null;
        for (Element candidate : commands) {
            if (request.equalsIgnoreCase(candidate.getChild("name").getValue())) {
                command = candidate;
                break;
            }
        }

        if (null == command) {
            // The command wasn't found
            System.err.println("Command not found: " + args[0]);
            display(commandConfigs);
@@ -120,33 +119,26 @@ public class ScriptLauncher
        // Run each step
        List<Element> steps = command.getChildren("step");
        for (Element step : steps) {
            // Instantiate the class
            Class target = null;

            // Is it the special case 'dsrun' where the user provides the class name?
            String className;
            if ("dsrun".equals(request)) {
                if (args.length < 2) {
                    System.err.println("Error in launcher.xml: Missing class name");
                    return 1;
                }
                className = args[1];
            } else {
                className = step.getChild("class").getValue();
            }
            try {
                target = Class.forName(className,
                                       true,
                                       Thread.currentThread().getContextClassLoader());
            } catch (ClassNotFoundException e) {
                System.err.println("Error in launcher.xml: Invalid class name: " + className);
                return 1;
            }
@@ -157,26 +149,20 @@ public class ScriptLauncher
            Class[] argTypes = {useargs.getClass()};
            boolean passargs = true;
            if ((step.getAttribute("passuserargs") != null) &&
                ("false".equalsIgnoreCase(step.getAttribute("passuserargs").getValue()))) {
                passargs = false;
            }
            if ((args.length == 1) || (("dsrun".equals(request)) && (args.length == 2)) || (!passargs)) {
                useargs = new String[0];
            } else {
                // The number of arguments to ignore
                // If dsrun is the command, ignore the next, as it is the class name not an arg
                int x = 1;
                if ("dsrun".equals(request)) {
                    x = 2;
                }
                String[] argsnew = new String[useargs.length - x];
                for (int i = x; i < useargs.length; i++) {
                    argsnew[i - x] = useargs[i];
                }
                useargs = argsnew;
@@ -184,16 +170,13 @@ public class ScriptLauncher
            // Add any extra properties
            List<Element> bits = step.getChildren("argument");
            if (step.getChild("argument") != null) {
                String[] argsnew = new String[useargs.length + bits.size()];
                int i = 0;
                for (Element arg : bits) {
                    argsnew[i++] = arg.getValue();
                }
                for (; i < bits.size() + useargs.length; i++) {
                    argsnew[i] = useargs[i - bits.size()];
                }
                useargs = argsnew;
@@ -201,11 +184,10 @@ public class ScriptLauncher
            // Establish the request service startup
            RequestService requestService = kernelImpl.getServiceManager().getServiceByName(
                RequestService.class.getName(), RequestService.class);
            if (requestService == null) {
                throw new IllegalStateException(
                    "Could not get the DSpace RequestService to start the request transaction");
            }

            // Establish a request related to the current session
@@ -213,26 +195,23 @@ public class ScriptLauncher
            requestService.startRequest();

            // Run the main() method
            try {
                Object[] arguments = {useargs};

                // Useful for debugging, so left in the code...
                /**System.out.print("About to execute: " + className);
                for (String param : useargs)
                {
                    System.out.print(" " + param);
                }
                System.out.println("");**/

                Method main = target.getMethod("main", argTypes);
                main.invoke(null, arguments);

                // ensure we close out the request (happy request)
                requestService.endRequest(null);
            } catch (Exception e) {
                // Failure occurred in the request so we destroy it
                requestService.endRequest(e);
@@ -253,20 +232,20 @@ public class ScriptLauncher
     *
     * @return The XML configuration file Document
     */
    protected static Document getConfig() {
        return getConfig(kernelImpl);
    }

    public static Document getConfig(DSpaceKernelImpl kernelImpl) {
        // Load the launcher configuration file
        String config = kernelImpl.getConfigurationService().getProperty("dspace.dir") +
            System.getProperty("file.separator") + "config" +
            System.getProperty("file.separator") + "launcher.xml";
        SAXBuilder saxBuilder = new SAXBuilder();
        Document doc = null;
        try {
            doc = saxBuilder.build(config);
        } catch (Exception e) {
            System.err.println("Unable to load the launcher configuration file: [dspace]/config/launcher.xml");
            System.err.println(e.getMessage());
            e.printStackTrace();
@@ -277,10 +256,10 @@ public class ScriptLauncher
    /**
     * Display the commands that the current launcher config file knows about
     *
     * @param commandConfigs configs as Document
     */
    private static void display(Document commandConfigs) {
        // List all command elements
        List<Element> commands = commandConfigs.getRootElement().getChildren("command");
@@ -288,17 +267,15 @@ public class ScriptLauncher
        // We cannot just use commands.sort() because it tries to remove and
        // reinsert Elements within other Elements, and that doesn't work.
        TreeMap<String, Element> sortedCommands = new TreeMap<>();
        for (Element command : commands) {
            sortedCommands.put(command.getChild("name").getValue(), command);
        }

        // Display the sorted list
        System.out.println("Usage: dspace [command-name] {parameters}");
        for (Element command : sortedCommands.values()) {
            System.out.println(" - " + command.getChild("name").getValue() +
                                   ": " + command.getChild("description").getValue());
        }
    }
}


@@ -7,12 +7,12 @@
 */
package org.dspace.app.mediafilter;

import java.awt.Color;
import java.awt.Font;
import java.awt.FontMetrics;
import java.awt.Graphics2D;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;

/**
 * Class to attach a footer to an image using ImageMagick.
@@ -20,143 +20,117 @@ import java.awt.Rectangle;
 * This version of the code is basically Ninh's but reorganised a little. Used with permission.
 */
public class Brand {
    private int brandWidth;
    private int brandHeight;
    private Font font;
    private int xOffset;

    /**
     * Constructor to set up footer image attributes.
     *
     * @param brandWidth length of the footer in pixels
     * @param brandHeight height of the footer in pixels
     * @param font font to use for text on the footer
     * @param xOffset number of pixels text should be indented from left-hand side of footer
     */
    public Brand(int brandWidth,
                 int brandHeight,
                 Font font,
                 int xOffset) {
        this.brandWidth = brandWidth;
        this.brandHeight = brandHeight;
        this.font = font;
        this.xOffset = xOffset;
    }

    /**
     * Create the brand image
     *
     * @param brandLeftText text that should appear in the bottom left of the image
     * @param shortLeftText abbreviated form of brandLeftText that will be substituted if
     * the image is resized such that brandLeftText will not fit. <code>null</code> if not
     * required
     * @param brandRightText text that should appear in the bottom right of the image
     * @return BufferedImage a BufferedImage object describing the brand image file
     */
    public BufferedImage create(String brandLeftText,
                                String shortLeftText,
                                String brandRightText) {
        BrandText[] allBrandText = null;

        BufferedImage brandImage =
            new BufferedImage(brandWidth, brandHeight, BufferedImage.TYPE_INT_RGB);

        if (brandWidth >= 350) {
            allBrandText = new BrandText[] {
                new BrandText(BrandText.BL, brandLeftText),
                new BrandText(BrandText.BR, brandRightText)
            };
        } else if (brandWidth >= 190) {
            allBrandText = new BrandText[] {
                new BrandText(BrandText.BL, shortLeftText),
                new BrandText(BrandText.BR, brandRightText)
            };
        } else {
            allBrandText = new BrandText[] {
                new BrandText(BrandText.BR, brandRightText)
            };
        }

        if (allBrandText != null && allBrandText.length > 0) {
            for (int i = 0; i < allBrandText.length; ++i) {
                drawImage(brandImage, allBrandText[i]);
            }
        }

        return brandImage;
    }

    /**
     * do the text placements and preparatory work for the brand image generation
* *
* @param brandImage a BufferedImage object where the image is created * @param brandImage a BufferedImage object where the image is created
* @param identifier and Identifier object describing what text is to be placed in what * @param identifier and Identifier object describing what text is to be placed in what
* position within the brand * position within the brand
*/ */
private void drawImage(BufferedImage brandImage, private void drawImage(BufferedImage brandImage,
BrandText brandText) BrandText brandText) {
{ int imgWidth = brandImage.getWidth();
int imgWidth = brandImage.getWidth(); int imgHeight = brandImage.getHeight();
int imgHeight = brandImage.getHeight();
int bx, by, tx, ty, bWidth, bHeight; Graphics2D g2 = brandImage.createGraphics();
g2.setFont(font);
FontMetrics fm = g2.getFontMetrics();
Graphics2D g2 = brandImage.createGraphics(); int bWidth = fm.stringWidth(brandText.getText()) + xOffset * 2 + 1;
g2.setFont(font); int bHeight = fm.getHeight();
FontMetrics fm = g2.getFontMetrics();
int bx = 0;
int by = 0;
bWidth = fm.stringWidth(brandText.getText()) + xOffset * 2 + 1; if (brandText.getLocation().equals(BrandText.TL)) {
bHeight = fm.getHeight(); bx = 0;
by = 0;
} else if (brandText.getLocation().equals(BrandText.TR)) {
bx = imgWidth - bWidth;
by = 0;
} else if (brandText.getLocation().equals(BrandText.BL)) {
bx = 0;
by = imgHeight - bHeight;
} else if (brandText.getLocation().equals(BrandText.BR)) {
bx = imgWidth - bWidth;
by = imgHeight - bHeight;
}
bx = 0; Rectangle box = new Rectangle(bx, by, bWidth, bHeight);
by = 0; int tx = bx + xOffset;
int ty = by + fm.getAscent();
if (brandText.getLocation().equals(BrandText.TL)) g2.setColor(Color.black);
{ g2.fill(box);
bx = 0; g2.setColor(Color.white);
by = 0; g2.drawString(brandText.getText(), tx, ty);
} }
else if (brandText.getLocation().equals(BrandText.TR))
{
bx = imgWidth - bWidth;
by = 0;
}
else if (brandText.getLocation().equals(BrandText.BL))
{
bx = 0;
by = imgHeight - bHeight;
}
else if (brandText.getLocation().equals(BrandText.BR))
{
bx = imgWidth - bWidth;
by = imgHeight - bHeight;
}
Rectangle box = new Rectangle(bx, by, bWidth, bHeight);
tx = bx + xOffset;
ty = by + fm.getAscent();
g2.setColor(Color.black);
g2.fill(box);
g2.setColor(Color.white);
g2.drawString(brandText.getText(), tx, ty);
}
} }
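The placement logic in Brand.drawImage can be exercised without any DSpace classes. The sketch below is illustrative only (the class name FooterDemo and all values are made up for the example): it reproduces the bottom-left case, measuring the text with FontMetrics, filling a black box, then drawing the string one ascent below the box top.

```java
import java.awt.Color;
import java.awt.Font;
import java.awt.FontMetrics;
import java.awt.Graphics2D;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;

public class FooterDemo {
    /** Draws a black footer box with white text in the bottom-left corner, as Brand does for BrandText.BL. */
    static BufferedImage brand(BufferedImage img, String text, Font font, int xOffset) {
        Graphics2D g2 = img.createGraphics();
        g2.setFont(font);
        FontMetrics fm = g2.getFontMetrics();
        int bWidth = fm.stringWidth(text) + xOffset * 2 + 1;    // box width follows the text
        int bHeight = fm.getHeight();
        int bx = 0;                                             // bottom-left placement
        int by = img.getHeight() - bHeight;
        g2.setColor(Color.black);
        g2.fill(new Rectangle(bx, by, bWidth, bHeight));
        g2.setColor(Color.white);
        g2.drawString(text, bx + xOffset, by + fm.getAscent()); // baseline = box top + ascent
        g2.dispose();
        return img;
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(400, 300, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.red);                                  // background, so the footer is visible
        g.fillRect(0, 0, 400, 300);
        g.dispose();
        brand(img, "my-repository.example.org", new Font(Font.SANS_SERIF, Font.PLAIN, 12), 5);
        System.out.println(new Color(img.getRGB(0, 299)).equals(Color.black)); // inside footer box
        System.out.println(new Color(img.getRGB(200, 10)).equals(Color.red));  // untouched area
    }
}
```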


@@ -13,73 +13,75 @@ package org.dspace.app.mediafilter;
 * This is a copy of Picture Australia's PiObj class re-organised with methods.
 * Thanks to Ninh Nguyen at the National Library for providing the original source.
 */
class BrandText {
    /**
     * Bottom Left
     */
    public static final String BL = "bl";
    /**
     * Bottom Right
     */
    public static final String BR = "br";
    /**
     * Top Left
     */
    public static final String TL = "tl";
    /**
     * Top Right
     */
    public static final String TR = "tr";

    private String location;
    private String text;

    /**
     * Constructor for an Identifier object containing a text string and
     * its location within a rectangular area.
     *
     * @param location one of the class location constants e.g. <code>Identifier.BL</code>
     * @param text     the text associated with the location
     */
    public BrandText(String location, String text) {
        this.location = location;
        this.text = text;
    }

    /**
     * Get the location the text of the Identifier object is associated with.
     *
     * @return String one of the class location constants e.g. <code>Identifier.BL</code>
     */
    public String getLocation() {
        return location;
    }

    /**
     * Get the text associated with the Identifier object.
     *
     * @return String the text associated with the Identifier object
     */
    public String getText() {
        return text;
    }

    /**
     * Set the location associated with the Identifier object.
     *
     * @param location one of the class location constants
     */
    public void setLocation(String location) {
        this.location = location;
    }

    /**
     * Set the text associated with the Identifier object.
     *
     * @param text any text string (typically a branding or identifier)
     */
    public void setText(String text) {
        this.text = text;
    }
}


@@ -7,16 +7,13 @@
 */
package org.dspace.app.mediafilter;

import java.awt.image.BufferedImage;
import java.io.InputStream;
import javax.imageio.ImageIO;

import org.dspace.content.Item;
import org.dspace.core.ConfigurationManager;

/**
 * Filter image bitstreams, scaling the image to be within the bounds of
 * thumbnail.maxwidth, thumbnail.maxheight, the size we want our thumbnail to be
@@ -24,21 +21,17 @@ import org.dspace.app.mediafilter.JPEGFilter;
 *
 * @author Jason Sherman jsherman@usao.edu
 */
public class BrandedPreviewJPEGFilter extends MediaFilter {
    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".preview.jpg";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "BRANDED_PREVIEW";
    }

@@ -46,8 +39,7 @@ public class BrandedPreviewJPEGFilter extends MediaFilter
     * @return String bitstream format
     */
    @Override
    public String getFormatString() {
        return "JPEG";
    }

@@ -55,42 +47,40 @@ public class BrandedPreviewJPEGFilter extends MediaFilter
     * @return String description
     */
    @Override
    public String getDescription() {
        return "Generated Branded Preview";
    }

    /**
     * @param currentItem item
     * @param source      source input stream
     * @param verbose     verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        // read in bitstream's image
        BufferedImage buf = ImageIO.read(source);

        // get config params
        float xmax = (float) ConfigurationManager
            .getIntProperty("webui.preview.maxwidth");
        float ymax = (float) ConfigurationManager
            .getIntProperty("webui.preview.maxheight");
        boolean blurring = (boolean) ConfigurationManager
            .getBooleanProperty("webui.preview.blurring");
        boolean hqscaling = (boolean) ConfigurationManager
            .getBooleanProperty("webui.preview.hqscaling");
        int brandHeight = ConfigurationManager.getIntProperty("webui.preview.brand.height");
        String brandFont = ConfigurationManager.getProperty("webui.preview.brand.font");
        int brandFontPoint = ConfigurationManager.getIntProperty("webui.preview.brand.fontpoint");

        JPEGFilter jpegFilter = new JPEGFilter();
        return jpegFilter
            .getThumbDim(currentItem, buf, verbose, xmax, ymax, blurring, hqscaling, brandHeight, brandFontPoint,
                         brandFont);
    }
}
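JPEGFilter.getThumbDim is not shown in this diff, but the core computation it implies - scale the image so it fits within the configured xmax/ymax bounds while preserving aspect ratio - can be sketched on its own. The helper name fitWithin and the "never enlarge" clamp are assumptions for the example, not DSpace's exact implementation.

```java
public class ScaleDemo {
    /** Returns {width, height} scaled to fit within the given bounds, preserving aspect ratio. */
    static int[] fitWithin(int w, int h, float xmax, float ymax) {
        // take the tighter of the two constraints; clamp at 1 so small images are not enlarged
        float scale = Math.min(1f, Math.min(xmax / w, ymax / h));
        return new int[] {Math.round(w * scale), Math.round(h * scale)};
    }

    public static void main(String[] args) {
        int[] d = fitWithin(1200, 800, 600f, 600f);
        System.out.println(d[0] + "x" + d[1]);
    }
}
```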


@@ -11,12 +11,11 @@ import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import org.apache.commons.io.IOUtils;
import org.apache.log4j.Logger;
import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
import org.apache.poi.hssf.extractor.ExcelExtractor;
import org.apache.poi.xssf.extractor.XSSFExcelExtractor;
import org.dspace.content.Item;

/*
@@ -35,79 +34,62 @@ import org.dspace.content.Item;
 * filter.org.dspace.app.mediafilter.ExcelFilter.inputFormats = Microsoft Excel, Microsoft Excel XML
 *
 */
public class ExcelFilter extends MediaFilter {

    private static Logger log = Logger.getLogger(ExcelFilter.class);

    public String getFilteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    public String getBundleName() {
        return "TEXT";
    }

    /**
     * @return String bitstream format
     */
    public String getFormatString() {
        return "Text";
    }

    /**
     * @return String description
     */
    public String getDescription() {
        return "Extracted text";
    }

    /**
     * @param item    item
     * @param source  source input stream
     * @param verbose verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item item, InputStream source, boolean verbose)
        throws Exception {
        String extractedText = null;

        try {
            POITextExtractor theExtractor = ExtractorFactory.createExtractor(source);
            if (theExtractor instanceof ExcelExtractor) {
                // for xls file
                extractedText = (theExtractor).getText();
            } else if (theExtractor instanceof XSSFExcelExtractor) {
                // for xlsx file
                extractedText = (theExtractor).getText();
            }
        } catch (Exception e) {
            log.error("Error filtering bitstream: " + e.getMessage(), e);
            throw e;
        }

        if (extractedText != null) {
            // generate an input stream with the extracted text
            return IOUtils.toInputStream(extractedText, StandardCharsets.UTF_8);
        }
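The final step above, IOUtils.toInputStream(extractedText, StandardCharsets.UTF_8), is a commons-io convenience over plain JDK classes. For illustration, a dependency-free equivalent (the class and helper names here are made up):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class TextToStreamDemo {
    /** Pure-JDK equivalent of IOUtils.toInputStream(text, StandardCharsets.UTF_8). */
    static InputStream toInputStream(String text) {
        return new ByteArrayInputStream(text.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        InputStream in = toInputStream("cell A1 | cell B1");
        // round-trip the bytes back to a string to show nothing was lost
        System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
    }
}
```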


@@ -14,74 +14,70 @@ import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * Public interface for any class which transforms or converts content/bitstreams
 * from one format to another. This interface should be implemented by any class
 * which defines a "filter" to be run by the MediaFilterManager.
 */
public interface FormatFilter {
    /**
     * Get a filename for a newly created filtered bitstream.
     *
     * @param sourceName name of source bitstream
     * @return filename generated by the filter - for example, document.pdf
     * becomes document.pdf.txt
     */
    public String getFilteredName(String sourceName);

    /**
     * @return name of the bundle this filter will stick its generated
     * Bitstreams in
     */
    public String getBundleName();

    /**
     * @return name of the bitstream format (say "HTML" or "Microsoft Word")
     * returned by this filter; look in the bitstream format registry or
     * mediafilter.cfg for valid format strings.
     */
    public String getFormatString();

    /**
     * @return a string describing the newly-generated Bitstream - noting how it
     * was produced is a good idea
     */
    public String getDescription();

    /**
     * Read the source stream and produce the filtered content.
     *
     * @param item    Item
     * @param source  input stream
     * @param verbose verbosity flag
     * @return result of filter's transformation as a byte stream.
     * @throws Exception if error
     */
    public InputStream getDestinationStream(Item item, InputStream source, boolean verbose)
        throws Exception;

    /**
     * Perform any pre-processing of the source bitstream *before* the actual
     * filtering takes place in MediaFilterManager.processBitstream().
     * <p>
     * Return true if pre-processing is successful (or no pre-processing
     * is necessary). Return false if bitstream should be skipped
     * for any reason.
     *
     * @param c       context
     * @param item    item containing bitstream to process
     * @param source  source bitstream to be processed
     * @param verbose verbose mode
     * @return true if bitstream processing should continue,
     * false if this bitstream should be skipped
     * @throws Exception if error
     */
    public boolean preProcessBitstream(Context c, Item item, Bitstream source, boolean verbose)
        throws Exception;

    /**
     * Perform any post-processing of the generated bitstream *after* this
     * filter has already been run.
@@ -89,18 +85,14 @@ public interface FormatFilter
     *
     * @param c                  context
     * @param item               item containing bitstream to process
     * @param generatedBitstream the bitstream which was generated by
     *                           this filter.
     * @throws Exception if error
     */
    public void postProcessBitstream(Context c, Item item, Bitstream generatedBitstream)
        throws Exception;
}
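The FormatFilter contract - derive an output filename, then transform one input stream into another - can be demonstrated without DSpace's Item, Bitstream, and Context types. The interface and classes below (SimpleFilter, UpperCaseFilter, FilterDemo) are a simplified stand-in invented for this sketch, not part of DSpace.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

/** Simplified stand-in for FormatFilter; the DSpace-specific parameters are omitted. */
interface SimpleFilter {
    String getFilteredName(String sourceName);

    InputStream getDestinationStream(InputStream source) throws Exception;
}

/** Toy filter: "transforms" text to upper case, as ExcelFilter transforms spreadsheets to text. */
class UpperCaseFilter implements SimpleFilter {
    public String getFilteredName(String sourceName) {
        return sourceName + ".txt";                  // convention: append the derived extension
    }

    public InputStream getDestinationStream(InputStream source) throws Exception {
        String text = new Scanner(source, StandardCharsets.UTF_8.name()).useDelimiter("\\A").next();
        return new ByteArrayInputStream(text.toUpperCase().getBytes(StandardCharsets.UTF_8));
    }
}

public class FilterDemo {
    public static void main(String[] args) throws Exception {
        SimpleFilter f = new UpperCaseFilter();
        System.out.println(f.getFilteredName("document.pdf"));
        InputStream out = f.getDestinationStream(
            new ByteArrayInputStream("hello".getBytes(StandardCharsets.UTF_8)));
        System.out.println(new Scanner(out, StandardCharsets.UTF_8.name()).next());
    }
}
```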


@@ -7,36 +7,31 @@
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.swing.text.Document;
import javax.swing.text.html.HTMLEditorKit;

import org.dspace.content.Item;

/*
 *
 * to do: helpful error messages - can't find mediafilter.cfg - can't
 * instantiate filter - bitstream format doesn't exist
 *
 */
public class HTMLFilter extends MediaFilter {

    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "TEXT";
    }

@@ -44,8 +39,7 @@ public class HTMLFilter extends MediaFilter
     * @return String bitstream format
     */
    @Override
    public String getFormatString() {
        return "Text";
    }

@@ -53,23 +47,20 @@ public class HTMLFilter extends MediaFilter
     * @return String description
     */
    @Override
    public String getDescription() {
        return "Extracted text";
    }

    /**
     * @param currentItem item
     * @param source      source input stream
     * @param verbose     verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        // try and read the document - set to ignore character set directive,
        // assuming that the input stream is already set properly (I hope)
        HTMLEditorKit kit = new HTMLEditorKit();


@@ -7,53 +7,46 @@
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;

import org.dspace.content.Item;

/**
 * Filter image bitstreams, scaling the image to be within the bounds of
 * thumbnail.maxwidth, thumbnail.maxheight, the size we want our thumbnail to be
 * no bigger than. Creates only JPEGs.
 */
public class ImageMagickImageThumbnailFilter extends ImageMagickThumbnailFilter {
    /**
     * @param currentItem item
     * @param source      source input stream
     * @param verbose     verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        File f = inputStreamToTempFile(source, "imthumb", ".tmp");
        File f2 = null;
        try {
            f2 = getThumbnailFile(f, verbose);
            byte[] bytes = Files.readAllBytes(f2.toPath());
            return new ByteArrayInputStream(bytes);
        } finally {
            //noinspection ResultOfMethodCallIgnored
            f.delete();
            if (f2 != null) {
                //noinspection ResultOfMethodCallIgnored
                f2.delete();
            }
        }
    }
}
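The temp-file pattern used by both ImageMagick filters - spill the stream to a temp file, read the result back into memory, and delete the files in a finally block - works with plain JDK I/O. A self-contained sketch (the names TempFileDemo and roundTrip are invented, and the ImageMagick conversion step is replaced by a straight read-back):

```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.nio.file.Files;

public class TempFileDemo {
    /** Copies a stream through a temp file and returns its bytes, deleting the file afterwards. */
    static byte[] roundTrip(InputStream source) throws Exception {
        File f = File.createTempFile("imthumb", ".tmp");
        try (FileOutputStream fos = new FileOutputStream(f)) {
            source.transferTo(fos);                 // Java 9+ stream copy
        }
        try {
            return Files.readAllBytes(f.toPath());
        } finally {
            //noinspection ResultOfMethodCallIgnored
            f.delete();                             // clean up even if reading fails
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] out = roundTrip(new ByteArrayInputStream("pixels".getBytes()));
        System.out.println(new String(out));
    }
}
```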


@@ -7,50 +7,37 @@
 */
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.InputStream;
import java.nio.file.Files;

import org.dspace.content.Item;

public class ImageMagickPdfThumbnailFilter extends ImageMagickThumbnailFilter {
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        File f = inputStreamToTempFile(source, "impdfthumb", ".pdf");
        File f2 = null;
        File f3 = null;
        try {
            f2 = getImageFile(f, 0, verbose);
            f3 = getThumbnailFile(f2, verbose);
            byte[] bytes = Files.readAllBytes(f3.toPath());
            return new ByteArrayInputStream(bytes);
        } finally {
            //noinspection ResultOfMethodCallIgnored
            f.delete();
            if (f2 != null) {
                //noinspection ResultOfMethodCallIgnored
                f2.delete();
            }
            if (f3 != null) {
                //noinspection ResultOfMethodCallIgnored
                f3.delete();
            }
        }
    }
}


@@ -14,72 +14,70 @@ import java.io.InputStream;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

import org.dspace.content.Bitstream;
import org.dspace.content.Bundle;
import org.dspace.content.Item;
import org.dspace.content.factory.ContentServiceFactory;
import org.dspace.content.service.ItemService;
import org.dspace.core.ConfigurationManager;
import org.dspace.core.Context;
import org.im4java.core.ConvertCmd;
import org.im4java.core.IM4JavaException;
import org.im4java.core.IMOperation;
import org.im4java.process.ProcessStarter;

/**
 * Filter image bitstreams, scaling the image to be within the bounds of
 * thumbnail.maxwidth, thumbnail.maxheight, the size we want our thumbnail to be
 * no bigger than. Creates only JPEGs.
 */
public abstract class ImageMagickThumbnailFilter extends MediaFilter {
    protected static int width = 180;
    protected static int height = 120;
    private static boolean flatten = true;
    static String bitstreamDescription = "IM Thumbnail";
    static final String defaultPattern = "Generated Thumbnail";
    static Pattern replaceRegex = Pattern.compile(defaultPattern);
    protected final ItemService itemService = ContentServiceFactory.getInstance().getItemService();

    static String cmyk_profile;
    static String srgb_profile;

    static {
        String pre = ImageMagickThumbnailFilter.class.getName();
        String s = ConfigurationManager.getProperty(pre + ".ProcessStarter");
        ProcessStarter.setGlobalSearchPath(s);
        width = ConfigurationManager.getIntProperty("thumbnail.maxwidth", width);
        height = ConfigurationManager.getIntProperty("thumbnail.maxheight", height);
        flatten = ConfigurationManager.getBooleanProperty(pre + ".flatten", flatten);
        String description = ConfigurationManager.getProperty(pre + ".bitstreamDescription");
        cmyk_profile = ConfigurationManager.getProperty(pre + ".cmyk_profile");
        srgb_profile = ConfigurationManager.getProperty(pre + ".srgb_profile");
        if (description != null) {
            bitstreamDescription = description;
        }
        try {
            String patt = ConfigurationManager.getProperty(pre + ".replaceRegex");
            replaceRegex = Pattern.compile(patt == null ? defaultPattern : patt);
        } catch (PatternSyntaxException e) {
            System.err.println("Invalid thumbnail replacement pattern: " + e.getMessage());
        }
    }

    public ImageMagickThumbnailFilter() {
    }

    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".jpg";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "THUMBNAIL";
    }
@@ -87,8 +85,7 @@ public abstract class ImageMagickThumbnailFilter extends MediaFilter implements
* @return String bitstreamformat * @return String bitstreamformat
*/ */
@Override @Override
public String getFormatString() public String getFormatString() {
{
return "JPEG"; return "JPEG";
} }
@@ -96,113 +93,110 @@ public abstract class ImageMagickThumbnailFilter extends MediaFilter implements
* @return String bitstreamDescription * @return String bitstreamDescription
*/ */
@Override @Override
public String getDescription() public String getDescription() {
{
return bitstreamDescription; return bitstreamDescription;
} }
public File inputStreamToTempFile(InputStream source, String prefix, String suffix) throws IOException { public File inputStreamToTempFile(InputStream source, String prefix, String suffix) throws IOException {
File f = File.createTempFile(prefix, suffix); File f = File.createTempFile(prefix, suffix);
f.deleteOnExit(); f.deleteOnExit();
FileOutputStream fos = new FileOutputStream(f); FileOutputStream fos = new FileOutputStream(f);
byte[] buffer = new byte[1024]; byte[] buffer = new byte[1024];
int len = source.read(buffer); int len = source.read(buffer);
while (len != -1) { while (len != -1) {
fos.write(buffer, 0, len); fos.write(buffer, 0, len);
len = source.read(buffer); len = source.read(buffer);
}
fos.close();
return f;
}
public File getThumbnailFile(File f, boolean verbose) throws IOException, InterruptedException, IM4JavaException {
File f2 = new File(f.getParentFile(), f.getName() + ".jpg");
f2.deleteOnExit();
ConvertCmd cmd = new ConvertCmd();
IMOperation op = new IMOperation();
op.addImage(f.getAbsolutePath());
op.thumbnail(width, height);
op.addImage(f2.getAbsolutePath());
if (verbose) {
System.out.println("IM Thumbnail Param: "+op);
} }
cmd.run(op); fos.close();
return f2; return f;
} }
public File getImageFile(File f, int page, boolean verbose) throws IOException, InterruptedException, IM4JavaException { public File getThumbnailFile(File f, boolean verbose)
File f2 = new File(f.getParentFile(), f.getName() + ".jpg"); throws IOException, InterruptedException, IM4JavaException {
f2.deleteOnExit(); File f2 = new File(f.getParentFile(), f.getName() + ".jpg");
ConvertCmd cmd = new ConvertCmd(); f2.deleteOnExit();
IMOperation op = new IMOperation(); ConvertCmd cmd = new ConvertCmd();
String s = "[" + page + "]"; IMOperation op = new IMOperation();
op.addImage(f.getAbsolutePath()+s); op.autoOrient();
if (flatten) op.addImage(f.getAbsolutePath());
{ op.thumbnail(width, height);
op.flatten(); op.addImage(f2.getAbsolutePath());
}
op.addImage(f2.getAbsolutePath());
if (verbose) { if (verbose) {
System.out.println("IM Image Param: "+op); System.out.println("IM Thumbnail Param: " + op);
} }
cmd.run(op); cmd.run(op);
return f2; return f2;
} }
public File getImageFile(File f, int page, boolean verbose)
throws IOException, InterruptedException, IM4JavaException {
File f2 = new File(f.getParentFile(), f.getName() + ".jpg");
f2.deleteOnExit();
ConvertCmd cmd = new ConvertCmd();
IMOperation op = new IMOperation();
String s = "[" + page + "]";
op.addImage(f.getAbsolutePath() + s);
if (flatten) {
op.flatten();
}
// PDFs using the CMYK color system can be handled specially if
// profiles are defined
if (cmyk_profile != null && srgb_profile != null) {
Info imageInfo = new Info(f.getAbsolutePath(), true);
String imageClass = imageInfo.getImageClass();
if (imageClass.contains("CMYK")) {
op.profile(cmyk_profile);
op.profile(srgb_profile);
}
}
op.addImage(f2.getAbsolutePath());
if (verbose) {
System.out.println("IM Image Param: " + op);
}
cmd.run(op);
return f2;
}
@Override @Override
public boolean preProcessBitstream(Context c, Item item, Bitstream source, boolean verbose) public boolean preProcessBitstream(Context c, Item item, Bitstream source, boolean verbose) throws Exception {
throws Exception String nsrc = source.getName();
{ for (Bundle b : itemService.getBundles(item, "THUMBNAIL")) {
String nsrc = source.getName(); for (Bitstream bit : b.getBitstreams()) {
for(Bundle b: itemService.getBundles(item, "THUMBNAIL")) {
for(Bitstream bit: b.getBitstreams()) {
String n = bit.getName(); String n = bit.getName();
if (n != null) { if (n != null) {
if (nsrc != null) { if (nsrc != null) {
if (!n.startsWith(nsrc)) continue; if (!n.startsWith(nsrc)) {
} continue;
} }
String description = bit.getDescription(); }
//If anything other than a generated thumbnail is found, halt processing }
if (description != null) { String description = bit.getDescription();
if (replaceRegex.matcher(description).matches()) { // If anything other than a generated thumbnail
if (verbose) { // is found, halt processing
System.out.println(description + " " + nsrc + " matches pattern and is replacable."); if (description != null) {
} if (replaceRegex.matcher(description).matches()) {
continue; if (verbose) {
} System.out.println(description + " " + nsrc
if (description.equals(bitstreamDescription)) { + " matches pattern and is replacable.");
if (verbose) { }
System.out.println(bitstreamDescription + " " + nsrc + " is replacable."); continue;
} }
continue; if (description.equals(bitstreamDescription)) {
} if (verbose) {
} System.out.println(bitstreamDescription + " " + nsrc
System.out.println("Custom Thumbnail exists for " + nsrc + " for item " + item.getHandle() + ". Thumbnail will not be generated. "); + " is replacable.");
return false; }
} continue;
} }
}
System.out.println("Custom Thumbnail exists for " + nsrc + " for item "
return true; //assume that the thumbnail is a custom one + item.getHandle() + ". Thumbnail will not be generated. ");
return false;
}
}
return true; // assume that the thumbnail is a custom one
} }
@Override
public String[] getInputMIMETypes()
{
return ImageIO.getReaderMIMETypes();
}
@Override
public String[] getInputDescriptions()
{
return null;
}
@Override
public String[] getInputExtensions()
{
return ImageIO.getReaderFileSuffixes();
}
} }
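The replacement logic in preProcessBitstream hinges on one decision: an existing thumbnail may be overwritten only if its description fully matches the configured replaceRegex (or equals the generated-thumbnail description). A minimal, self-contained sketch of that decision, outside DSpace — the class name ReplaceableCheck and its method names are hypothetical, introduced here only for illustration:

```java
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

public class ReplaceableCheck {
    static final String DEFAULT_PATTERN = "Generated Thumbnail";

    /** Compile the configured pattern; fall back to the default on null or bad syntax. */
    static Pattern compileOrDefault(String configured) {
        try {
            return Pattern.compile(configured == null ? DEFAULT_PATTERN : configured);
        } catch (PatternSyntaxException e) {
            return Pattern.compile(DEFAULT_PATTERN);
        }
    }

    /** A thumbnail is replaceable when its description fully matches the pattern. */
    static boolean isReplaceable(Pattern p, String description) {
        // matches() anchors at both ends, so the pattern must cover the whole description
        return description != null && p.matcher(description).matches();
    }
}
```

Note that Matcher.matches() requires a full-string match, so a configured pattern such as "IM .*" is needed to cover variable suffixes; a bare substring would not match.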


@@ -7,16 +7,18 @@
 */
package org.dspace.app.mediafilter;

import java.awt.Color;
import java.awt.Font;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.Transparency;
import java.awt.image.BufferedImage;
import java.awt.image.BufferedImageOp;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import javax.imageio.ImageIO;

import org.dspace.content.Item;

@@ -29,21 +31,17 @@ import org.dspace.core.ConfigurationManager;
 *
 * @author Jason Sherman jsherman@usao.edu
 */
public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats {
    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".jpg";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "THUMBNAIL";
    }

@@ -51,8 +49,7 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
     * @return String bitstreamformat
     */
    @Override
    public String getFormatString() {
        return "JPEG";
    }

@@ -60,23 +57,20 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
     * @return String description
     */
    @Override
    public String getDescription() {
        return "Generated Thumbnail";
    }

    /**
     * @param currentItem item
     * @param source source input stream
     * @param verbose verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        // read in bitstream's image
        BufferedImage buf = ImageIO.read(source);

@@ -84,45 +78,42 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
    }

    public InputStream getThumb(Item currentItem, BufferedImage buf, boolean verbose)
        throws Exception {
        // get config params
        float xmax = (float) ConfigurationManager
            .getIntProperty("thumbnail.maxwidth");
        float ymax = (float) ConfigurationManager
            .getIntProperty("thumbnail.maxheight");
        boolean blurring = (boolean) ConfigurationManager
            .getBooleanProperty("thumbnail.blurring");
        boolean hqscaling = (boolean) ConfigurationManager
            .getBooleanProperty("thumbnail.hqscaling");

        return getThumbDim(currentItem, buf, verbose, xmax, ymax, blurring, hqscaling, 0, 0, null);
    }

    public InputStream getThumbDim(Item currentItem, BufferedImage buf, boolean verbose, float xmax, float ymax,
                                   boolean blurring, boolean hqscaling, int brandHeight, int brandFontPoint,
                                   String brandFont)
        throws Exception {
        // now get the image dimensions
        float xsize = (float) buf.getWidth(null);
        float ysize = (float) buf.getHeight(null);

        // if verbose flag is set, print out dimensions
        // to STDOUT
        if (verbose) {
            System.out.println("original size: " + xsize + "," + ysize);
        }

        // scale by x first if needed
        if (xsize > xmax) {
            // calculate scaling factor so that xsize * scale = new size (max)
            float scale_factor = xmax / xsize;

            // if verbose flag is set, print out extracted text
            // to STDOUT
            if (verbose) {
                System.out.println("x scale factor: " + scale_factor);
            }

@@ -133,15 +124,13 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
            // if verbose flag is set, print out extracted text
            // to STDOUT
            if (verbose) {
                System.out.println("size after fitting to maximum width: " + xsize + "," + ysize);
            }
        }

        // scale by y if needed
        if (ysize > ymax) {
            float scale_factor = ymax / ysize;

            // now reduce x size
@@ -151,31 +140,28 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
        }

        // if verbose flag is set, print details to STDOUT
        if (verbose) {
            System.out.println("size after fitting to maximum height: " + xsize + ", "
                + ysize);
        }

        // create an image buffer for the thumbnail with the new xsize, ysize
        BufferedImage thumbnail = new BufferedImage((int) xsize, (int) ysize,
            BufferedImage.TYPE_INT_RGB);

        // Use blurring if selected in config.
        // a little blur before scaling does wonders for keeping moire in check.
        if (blurring) {
            // send the buffered image off to get blurred.
            buf = getBlurredInstance((BufferedImage) buf);
        }

        // Use high quality scaling method if selected in config.
        // this has a definite performance penalty.
        if (hqscaling) {
            // send the buffered image off to get an HQ downscale.
            buf = getScaledInstance((BufferedImage) buf, (int) xsize, (int) ysize,
                (Object) RenderingHints.VALUE_INTERPOLATION_BICUBIC, (boolean) true);
        }

        // now render the image into the thumbnail buffer
@@ -188,7 +174,7 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
            ConfigurationManager.getProperty("webui.preview.brand.abbrev"),
            currentItem == null ? "" : "hdl:" + currentItem.getHandle());
        g2d.drawImage(brandImage, (int) 0, (int) ysize, (int) xsize, (int) 20, null);
    }

    // now create an input stream for the thumbnail buffer and return it
@@ -204,37 +190,32 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
    @Override
    public String[] getInputMIMETypes() {
        return ImageIO.getReaderMIMETypes();
    }

    @Override
    public String[] getInputDescriptions() {
        return null;
    }

    @Override
    public String[] getInputExtensions() {
        // Temporarily disabled as JDK 1.6 only
        // return ImageIO.getReaderFileSuffixes();
        return null;
    }

    public BufferedImage getNormalizedInstance(BufferedImage buf) {
        int type = (buf.getTransparency() == Transparency.OPAQUE) ?
            BufferedImage.TYPE_INT_RGB : BufferedImage.TYPE_INT_ARGB_PRE;
        int w = buf.getWidth();
        int h = buf.getHeight();
        BufferedImage normal = new BufferedImage(w, h, type);
        Graphics2D g2d = normal.createGraphics();
        g2d.drawImage(buf, 0, 0, w, h, Color.WHITE, null);
        g2d.dispose();
        return normal;
    }

    /**
@@ -244,55 +225,54 @@ public class JPEGFilter extends MediaFilter implements SelfRegisterInputFormats
     * @param buf buffered image
     * @return updated BufferedImage
     */
    public BufferedImage getBlurredInstance(BufferedImage buf) {
        buf = getNormalizedInstance(buf);

        // kernel for blur op
        float[] matrix = {
            0.111f, 0.111f, 0.111f,
            0.111f, 0.111f, 0.111f,
            0.111f, 0.111f, 0.111f,
        };

        // perform the blur and return the blurred version.
        BufferedImageOp blur = new ConvolveOp(new Kernel(3, 3, matrix));
        BufferedImage blurbuf = blur.filter(buf, null);
        return blurbuf;
    }

    /**
     * Convenience method that returns a scaled instance of the
     * provided {@code BufferedImage}.
     *
     * @param buf the original image to be scaled
     * @param targetWidth the desired width of the scaled instance, in pixels
     * @param targetHeight the desired height of the scaled instance, in pixels
     * @param hint one of the rendering hints that corresponds to
     *            {@code RenderingHints.KEY_INTERPOLATION} (e.g.
     *            {@code RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR},
     *            {@code RenderingHints.VALUE_INTERPOLATION_BILINEAR},
     *            {@code RenderingHints.VALUE_INTERPOLATION_BICUBIC})
     * @param higherQuality if true, this method will use a multi-step
     *            scaling technique that provides higher quality than the usual
     *            one-step technique (only useful in downscaling cases, where
     *            {@code targetWidth} or {@code targetHeight} is
     *            smaller than the original dimensions, and generally only when
     *            the {@code BILINEAR} hint is specified)
     * @return a scaled version of the original {@code BufferedImage}
     */
    public BufferedImage getScaledInstance(BufferedImage buf,
                                           int targetWidth,
                                           int targetHeight,
                                           Object hint,
                                           boolean higherQuality) {
        int type = (buf.getTransparency() == Transparency.OPAQUE) ?
            BufferedImage.TYPE_INT_RGB : BufferedImage.TYPE_INT_ARGB;
        BufferedImage scalebuf = (BufferedImage) buf;

        int w;
        int h;
        if (higherQuality) {
            // Use multi-step technique: start with original size, then
            // scale down in multiple passes with drawImage()
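getThumbDim scales by width first, then by height, so the final dimensions always fit inside the configured maximums while preserving aspect ratio. The arithmetic in isolation, as a sketch independent of the AWT pipeline (the class name ThumbDims is hypothetical, introduced only for illustration):

```java
public class ThumbDims {
    /**
     * Fit (xsize, ysize) inside (xmax, ymax), shrinking only,
     * preserving aspect ratio: scale by x first, then by y,
     * mirroring the order used in getThumbDim.
     */
    static int[] fit(float xsize, float ysize, float xmax, float ymax) {
        if (xsize > xmax) {
            // scale factor chosen so that xsize * scale == xmax
            float scale = xmax / xsize;
            xsize *= scale;
            ysize *= scale;
        }
        if (ysize > ymax) {
            float scale = ymax / ysize;
            xsize *= scale;
            ysize *= scale;
        }
        // the real code truncates to int when allocating the BufferedImage
        return new int[] {(int) xsize, (int) ysize};
    }
}
```

For example, an 800x600 source with 400x400 maximums is scaled by 0.5 in the first pass to 400x300, and the second pass is a no-op because the height already fits.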


@@ -13,38 +13,34 @@ import org.dspace.core.Context;
/**
 * Abstract class which defines the default settings for a *simple* Media or Format Filter.
 * This class may be extended by any class which wishes to define a simple filter to be run
 * by the MediaFilterManager. More complex filters should likely implement the FormatFilter
 * interface directly, so that they can define their own pre/postProcessing methods.
 */
public abstract class MediaFilter implements FormatFilter {
    /**
     * Perform any pre-processing of the source bitstream *before* the actual
     * filtering takes place in MediaFilterManager.processBitstream().
     * <p>
     * Return true if pre-processing is successful (or no pre-processing
     * is necessary). Return false if bitstream should be skipped
     * for any reason.
     *
     * @param c context
     * @param item item containing bitstream to process
     * @param source source bitstream to be processed
     * @param verbose verbose mode
     * @return true if bitstream processing should continue,
     *         false if this bitstream should be skipped
     * @throws Exception if error
     */
    @Override
    public boolean preProcessBitstream(Context c, Item item, Bitstream source, boolean verbose)
        throws Exception {
        return true; //default to no pre-processing
    }

    /**
     * Perform any post-processing of the generated bitstream *after* this
     * filter has already been run.
@@ -52,21 +48,16 @@ public abstract class MediaFilter implements FormatFilter
     * Return true if pre-processing is successful (or no pre-processing
     * is necessary). Return false if bitstream should be skipped
     * for some reason.
     *
     * @param c context
     * @param item item containing bitstream to process
     * @param generatedBitstream the bitstream which was generated by
     *                           this filter.
     * @throws Exception if error
     */
    @Override
    public void postProcessBitstream(Context c, Item item, Bitstream generatedBitstream)
        throws Exception {
        //default to no post-processing necessary
    }
}


@@ -7,7 +7,22 @@
*/ */
package org.dspace.app.mediafilter; package org.dspace.app.mediafilter;
import org.apache.commons.cli.*; import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.CommandLineParser;
import org.apache.commons.cli.HelpFormatter;
import org.apache.commons.cli.MissingArgumentException;
import org.apache.commons.cli.Option;
import org.apache.commons.cli.OptionBuilder;
import org.apache.commons.cli.Options;
import org.apache.commons.cli.PosixParser;
import org.apache.commons.lang.ArrayUtils;
import org.dspace.app.mediafilter.factory.MediaFilterServiceFactory; import org.dspace.app.mediafilter.factory.MediaFilterServiceFactory;
import org.dspace.app.mediafilter.service.MediaFilterService; import org.dspace.app.mediafilter.service.MediaFilterService;
import org.dspace.content.Collection; import org.dspace.content.Collection;
@@ -19,9 +34,6 @@ import org.dspace.core.Context;
import org.dspace.core.SelfNamedPlugin; import org.dspace.core.SelfNamedPlugin;
import org.dspace.core.factory.CoreServiceFactory; import org.dspace.core.factory.CoreServiceFactory;
import org.dspace.handle.factory.HandleServiceFactory; import org.dspace.handle.factory.HandleServiceFactory;
import java.util.*;
import org.apache.commons.lang.ArrayUtils;
import org.dspace.services.factory.DSpaceServicesFactory; import org.dspace.services.factory.DSpaceServicesFactory;
/** /**
@@ -44,8 +56,12 @@ public class MediaFilterCLITool {
//suffix (in dspace.cfg) for input formats supported by each filter //suffix (in dspace.cfg) for input formats supported by each filter
private static final String INPUT_FORMATS_SUFFIX = "inputFormats"; private static final String INPUT_FORMATS_SUFFIX = "inputFormats";
public static void main(String[] argv) throws Exception /**
{ * Default constructor
*/
private MediaFilterCLITool() { }
public static void main(String[] argv) throws Exception {
// set headless for non-gui workstations // set headless for non-gui workstations
System.setProperty("java.awt.headless", "true"); System.setProperty("java.awt.headless", "true");
@@ -57,25 +73,25 @@ public class MediaFilterCLITool {
Options options = new Options(); Options options = new Options();
options.addOption("v", "verbose", false, options.addOption("v", "verbose", false,
"print all extracted text and other details to STDOUT"); "print all extracted text and other details to STDOUT");
options.addOption("q", "quiet", false, options.addOption("q", "quiet", false,
"do not print anything except in the event of errors."); "do not print anything except in the event of errors.");
options.addOption("f", "force", false, options.addOption("f", "force", false,
"force all bitstreams to be processed"); "force all bitstreams to be processed");
options.addOption("i", "identifier", true, options.addOption("i", "identifier", true,
"ONLY process bitstreams belonging to identifier"); "ONLY process bitstreams belonging to identifier");
options.addOption("m", "maximum", true, options.addOption("m", "maximum", true,
"process no more than maximum items"); "process no more than maximum items");
options.addOption("h", "help", false, "help"); options.addOption("h", "help", false, "help");
//create a "plugin" option (to specify specific MediaFilter plugins to run) //create a "plugin" option (to specify specific MediaFilter plugins to run)
OptionBuilder.withLongOpt("plugins"); OptionBuilder.withLongOpt("plugins");
OptionBuilder.withValueSeparator(','); OptionBuilder.withValueSeparator(',');
OptionBuilder.withDescription( OptionBuilder.withDescription(
"ONLY run the specified Media Filter plugin(s)\n" + "ONLY run the specified Media Filter plugin(s)\n" +
"listed from '" + MEDIA_FILTER_PLUGINS_KEY + "' in dspace.cfg.\n" + "listed from '" + MEDIA_FILTER_PLUGINS_KEY + "' in dspace.cfg.\n" +
"Separate multiple with a comma (,)\n" + "Separate multiple with a comma (,)\n" +
"(e.g. MediaFilterManager -p \n\"Word Text Extractor\",\"PDF Text Extractor\")"); "(e.g. MediaFilterManager -p \n\"Word Text Extractor\",\"PDF Text Extractor\")");
Option pluginOption = OptionBuilder.create('p'); Option pluginOption = OptionBuilder.create('p');
pluginOption.setArgs(Option.UNLIMITED_VALUES); //unlimited number of args pluginOption.setArgs(Option.UNLIMITED_VALUES); //unlimited number of args
options.addOption(pluginOption); options.addOption(pluginOption);
@@ -84,9 +100,9 @@ public class MediaFilterCLITool {
OptionBuilder.withLongOpt("skip"); OptionBuilder.withLongOpt("skip");
OptionBuilder.withValueSeparator(','); OptionBuilder.withValueSeparator(',');
OptionBuilder.withDescription( OptionBuilder.withDescription(
"SKIP the bitstreams belonging to identifier\n" + "SKIP the bitstreams belonging to identifier\n" +
"Separate multiple identifiers with a comma (,)\n" + "Separate multiple identifiers with a comma (,)\n" +
"(e.g. MediaFilterManager -s \n 123456789/34,123456789/323)"); "(e.g. MediaFilterManager -s \n 123456789/34,123456789/323)");
Option skipOption = OptionBuilder.create('s'); Option skipOption = OptionBuilder.create('s');
skipOption.setArgs(Option.UNLIMITED_VALUES); //unlimited number of args skipOption.setArgs(Option.UNLIMITED_VALUES); //unlimited number of args
options.addOption(skipOption); options.addOption(skipOption);
@@ -99,73 +115,61 @@ public class MediaFilterCLITool {
        Map<String, List<String>> filterFormats = new HashMap<>();
        CommandLine line = null;
        try {
            line = parser.parse(options, argv);
        } catch (MissingArgumentException e) {
            System.out.println("ERROR: " + e.getMessage());
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("MediaFilterManager\n", options);
            System.exit(1);
        }

        if (line.hasOption('h')) {
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("MediaFilterManager\n", options);
            System.exit(0);
        }

        if (line.hasOption('v')) {
            isVerbose = true;
        }

        isQuiet = line.hasOption('q');

        if (line.hasOption('f')) {
            isForce = true;
        }

        if (line.hasOption('i')) {
            identifier = line.getOptionValue('i');
        }

        if (line.hasOption('m')) {
            max2Process = Integer.parseInt(line.getOptionValue('m'));
            if (max2Process <= 1) {
                System.out.println("Invalid maximum value '" +
                                       line.getOptionValue('m') + "' - ignoring");
                max2Process = Integer.MAX_VALUE;
            }
        }

        String filterNames[] = null;
        if (line.hasOption('p')) {
            //specified which media filter plugins we are using
            filterNames = line.getOptionValues('p');

            if (filterNames == null || filterNames.length == 0) { //display error, since no plugins specified
                System.err.println("\nERROR: -p (-plugin) option requires at least one plugin to be specified.\n" +
                                       "(e.g. MediaFilterManager -p \"Word Text Extractor\",\"PDF Text Extractor\")\n");
                HelpFormatter myhelp = new HelpFormatter();
                myhelp.printHelp("MediaFilterManager\n", options);
                System.exit(1);
            }
        } else {
            //retrieve list of all enabled media filter plugins!
            filterNames = DSpaceServicesFactory.getInstance().getConfigurationService()
                                               .getArrayProperty(MEDIA_FILTER_PLUGINS_KEY);
        }

        MediaFilterService mediaFilterService = MediaFilterServiceFactory.getInstance().getMediaFilterService();
@@ -178,17 +182,16 @@ public class MediaFilterCLITool {
        List<FormatFilter> filterList = new ArrayList<FormatFilter>();

        //set up each filter
        for (int i = 0; i < filterNames.length; i++) {
            //get filter of this name & add to list of filters
            FormatFilter filter = (FormatFilter) CoreServiceFactory.getInstance().getPluginService()
                                                                   .getNamedPlugin(FormatFilter.class, filterNames[i]);
            if (filter == null) {
                System.err.println(
                    "\nERROR: Unknown MediaFilter specified (either from command-line or in dspace.cfg): '" +
                        filterNames[i] + "'");
                System.exit(1);
            } else {
                filterList.add(filter);

                String filterClassName = filter.getClass().getName();
@@ -200,8 +203,7 @@ public class MediaFilterCLITool {
                //each "named" plugin that it defines.
                //So, we have to look for every key that fits the
                //following format: filter.<class-name>.<plugin-name>.inputFormats
                if (SelfNamedPlugin.class.isAssignableFrom(filter.getClass())) {
                    //Get the plugin instance name for this class
                    pluginName = ((SelfNamedPlugin) filter).getPluginInstanceName();
                }
@@ -212,45 +214,42 @@ public class MediaFilterCLITool {
                // filter.<class-name>.<plugin-name>.inputFormats
                //For other MediaFilters, format of key is:
                // filter.<class-name>.inputFormats
                String[] formats =
                    DSpaceServicesFactory.getInstance().getConfigurationService().getArrayProperty(
                        FILTER_PREFIX + "." + filterClassName +
                            (pluginName != null ? "." + pluginName : "") +
                            "." + INPUT_FORMATS_SUFFIX);

                //add to internal map of filters to supported formats
                if (ArrayUtils.isNotEmpty(formats)) {
                    //For SelfNamedPlugins, map key is:
                    // <class-name><separator><plugin-name>
                    //For other MediaFilters, map key is just:
                    // <class-name>
                    filterFormats.put(filterClassName +
                                          (pluginName != null ? MediaFilterService.FILTER_PLUGIN_SEPARATOR +
                                              pluginName : ""),
                                      Arrays.asList(formats));
                }
            } //end if filter!=null
        } //end for

        //If verbose, print out loaded mediafilter info
        if (isVerbose) {
            System.out.println("The following MediaFilters are enabled: ");
            Iterator<String> i = filterFormats.keySet().iterator();
            while (i.hasNext()) {
                String filterName = i.next();
                System.out.println("Full Filter Name: " + filterName);
                String pluginName = null;
                if (filterName.contains(MediaFilterService.FILTER_PLUGIN_SEPARATOR)) {
                    String[] fields = filterName.split(MediaFilterService.FILTER_PLUGIN_SEPARATOR);
                    filterName = fields[0];
                    pluginName = fields[1];
                }

                System.out.println(filterName +
                                       (pluginName != null ? " (Plugin: " + pluginName + ")" : ""));
            }
        }
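The two key formats described in the comments above (a dotted configuration key and a separator-joined map key) can be sketched standalone. This is a minimal illustration, not DSpace's code: the class name, the example filter names, and the ":" separator value are assumptions made for the demo.

```java
// Hypothetical sketch of the key scheme used by the media-filter setup above.
// The ":" separator and the example filter class names are assumed values.
public class FilterKeyDemo {
    static final String FILTER_PREFIX = "filter";
    static final String INPUT_FORMATS_SUFFIX = "inputFormats";
    static final String FILTER_PLUGIN_SEPARATOR = ":"; // assumed separator value

    // Configuration key: filter.<class-name>[.<plugin-name>].inputFormats
    static String configKey(String filterClassName, String pluginName) {
        return FILTER_PREFIX + "." + filterClassName +
            (pluginName != null ? "." + pluginName : "") +
            "." + INPUT_FORMATS_SUFFIX;
    }

    // Internal map key: <class-name>[<separator><plugin-name>]
    static String mapKey(String filterClassName, String pluginName) {
        return filterClassName +
            (pluginName != null ? FILTER_PLUGIN_SEPARATOR + pluginName : "");
    }

    public static void main(String[] args) {
        // a plain filter has no plugin name; a SelfNamedPlugin carries one
        System.out.println(configKey("org.example.WordFilter", null));
        System.out.println(configKey("org.example.ThumbnailFilter", "JPEG"));
        System.out.println(mapKey("org.example.ThumbnailFilter", "JPEG"));
    }
}
```

Splitting the map key back apart (as the verbose listing above does with `split`) only works because the separator never appears inside a Java class name.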
@@ -261,16 +260,14 @@ public class MediaFilterCLITool {
        //Retrieve list of identifiers to skip (if any)
        String skipIds[] = null;
        if (line.hasOption('s')) {
            //specified which identifiers to skip when processing
            skipIds = line.getOptionValues('s');

            if (skipIds == null || skipIds.length == 0) { //display error, since no identifiers specified to skip
                System.err.println("\nERROR: -s (-skip) option requires at least one identifier to SKIP.\n" +
                                       "Make sure to separate multiple identifiers with a comma!\n" +
                                       "(e.g. MediaFilterManager -s 123456789/34,123456789/323)\n");
                HelpFormatter myhelp = new HelpFormatter();
                myhelp.printHelp("MediaFilterManager\n", options);
                System.exit(0);
@@ -282,29 +279,24 @@ public class MediaFilterCLITool {
        Context c = null;
        try {
            c = new Context();

            // have to be super-user to do the filtering
            c.turnOffAuthorisationSystem();

            // now apply the filters
            if (identifier == null) {
                mediaFilterService.applyFiltersAllItems(c);
            } else {
                // restrict application scope to identifier
                DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService().resolveToObject(c, identifier);
                if (dso == null) {
                    throw new IllegalArgumentException("Cannot resolve "
                                                           + identifier + " to a DSpace object");
                }

                switch (dso.getType()) {
                    case Constants.COMMUNITY:
                        mediaFilterService.applyFiltersCommunity(c, (Community) dso);
                        break;
@@ -314,20 +306,17 @@ public class MediaFilterCLITool {
                    case Constants.ITEM:
                        mediaFilterService.applyFiltersItem(c, (Item) dso);
                        break;
                    default:
                        break;
                }
            }

            c.complete();
            c = null;
        } catch (Exception e) {
            status = 1;
        } finally {
            if (c != null) {
                c.abort();
            }
        }
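The hunk above adds an explicit `default: break;` arm to the type switch, which checkstyle requires. A minimal standalone sketch of the same dispatch-on-type-constant pattern (the class and constant values here are illustrative assumptions, not DSpace's actual `Constants`):

```java
// Hypothetical sketch of dispatching on an object-type constant, mirroring the
// switch above. Constant values are assumed for illustration.
public class TypeDispatchDemo {
    static final int ITEM = 2;        // assumed value
    static final int COLLECTION = 3;  // assumed value
    static final int COMMUNITY = 4;   // assumed value

    static String scopeFor(int type) {
        switch (type) {
            case COMMUNITY:
                return "community";
            case COLLECTION:
                return "collection";
            case ITEM:
                return "item";
            default:
                // explicit default arm: unknown types are ignored rather than
                // falling through silently
                return "unsupported";
        }
    }

    public static void main(String[] args) {
        System.out.println(scopeFor(ITEM)); // prints "item"
    }
}
```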


@@ -8,13 +8,27 @@
package org.dspace.app.mediafilter;

import java.io.InputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

import org.dspace.app.mediafilter.service.MediaFilterService;
import org.dspace.authorize.service.AuthorizeService;
import org.dspace.content.Bitstream;
import org.dspace.content.BitstreamFormat;
import org.dspace.content.Bundle;
import org.dspace.content.Collection;
import org.dspace.content.Community;
import org.dspace.content.DCDate;
import org.dspace.content.Item;
import org.dspace.content.service.BitstreamFormatService;
import org.dspace.content.service.BitstreamService;
import org.dspace.content.service.BundleService;
import org.dspace.content.service.CollectionService;
import org.dspace.content.service.CommunityService;
import org.dspace.content.service.ItemService;
import org.dspace.core.Constants;
import org.dspace.core.Context;
import org.dspace.core.SelfNamedPlugin;
@@ -29,12 +43,11 @@ import org.springframework.beans.factory.annotation.Autowired;
 * repository's content. A few command line flags affect the operation of the
 * MFM: -v verbose outputs all extracted text to STDOUT; -f force forces all
 * bitstreams to be processed, even if they have been before; -n noindex does not
 * recreate index after processing bitstreams; -i [identifier] limits processing
 * scope to a community, collection or item; and -m [max] limits processing to a
 * maximum number of items.
 */
public class MediaFilterServiceImpl implements MediaFilterService, InitializingBean {
    @Autowired(required = true)
    protected AuthorizeService authorizeService;
    @Autowired(required = true)
@@ -55,13 +68,13 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
    protected ConfigurationService configurationService;

    protected int max2Process = Integer.MAX_VALUE;  // maximum number items to process
    protected int processed = 0;   // number items processed
    protected Item currentItem = null;   // current item being processed

    protected List<FormatFilter> filterClasses = null;

    protected Map<String, List<String>> filterFormats = new HashMap<>();

    protected List<String> skipList = null; //list of identifiers to skip during processing
@@ -72,27 +85,25 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
    protected boolean isQuiet = false;
    protected boolean isForce = false; // default to not forced

    protected MediaFilterServiceImpl() {
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        String[] publicPermissionFilters = configurationService
            .getArrayProperty("filter.org.dspace.app.mediafilter.publicPermission");
        if (publicPermissionFilters != null) {
            for (String filter : publicPermissionFilters) {
                publicFiltersClasses.add(filter.trim());
            }
        }
    }

    @Override
    public void applyFiltersAllItems(Context context) throws Exception {
        if (skipList != null) {
            //if a skip-list exists, we need to filter community-by-community
            //so we can respect what is in the skip-list
            List<Community> topLevelCommunities = communityService.findAllTop(context);
@@ -100,13 +111,10 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
            for (Community topLevelCommunity : topLevelCommunities) {
                applyFiltersCommunity(context, topLevelCommunity);
            }
        } else {
            //otherwise, just find every item and process
            Iterator<Item> itemIterator = itemService.findAll(context);
            while (itemIterator.hasNext() && processed < max2Process) {
                applyFiltersItem(context, itemIterator.next());
            }
        }
@@ -114,16 +122,14 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
    @Override
    public void applyFiltersCommunity(Context context, Community community)
        throws Exception { //only apply filters if community not in skip-list
        if (!inSkipList(community.getHandle())) {
            List<Community> subcommunities = community.getSubcommunities();
            for (Community subcommunity : subcommunities) {
                applyFiltersCommunity(context, subcommunity);
            }
            List<Collection> collections = community.getCollections();
            for (Collection collection : collections) {
                applyFiltersCollection(context, collection);
            }
@@ -132,42 +138,36 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
    @Override
    public void applyFiltersCollection(Context context, Collection collection)
        throws Exception {
        //only apply filters if collection not in skip-list
        if (!inSkipList(collection.getHandle())) {
            Iterator<Item> itemIterator = itemService.findAllByCollection(context, collection);
            while (itemIterator.hasNext() && processed < max2Process) {
                applyFiltersItem(context, itemIterator.next());
            }
        }
    }

    @Override
    public void applyFiltersItem(Context c, Item item) throws Exception {
        //only apply filters if item not in skip-list
        if (!inSkipList(item.getHandle())) {
            //cache this item in MediaFilterManager
            //so it can be accessed by MediaFilters as necessary
            currentItem = item;

            if (filterItem(c, item)) {
                // increment processed count
                ++processed;
            }
            // clear item objects from context cache and internal cache
            c.uncacheEntity(currentItem);
            currentItem = null;
        }
    }

    @Override
    public boolean filterItem(Context context, Item myItem) throws Exception {
        // get 'original' bundles
        List<Bundle> myBundles = itemService.getBundles(myItem, "ORIGINAL");
        boolean done = false;
@@ -184,12 +184,11 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
    @Override
    public boolean filterBitstream(Context context, Item myItem,
                                   Bitstream myBitstream) throws Exception {
        boolean filtered = false;

        // iterate through filter classes. A single format may be actioned
        // by more than one filter
        for (FormatFilter filterClass : filterClasses) {
            //List fmts = (List)filterFormats.get(filterClasses[i].getClass().getName());
            String pluginName = null;
@@ -208,7 +207,7 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
            //For other MediaFilters, map key is just:
            // <class-name>
            List<String> fmts = filterFormats.get(filterClass.getClass().getName() +
                                                      (pluginName != null ? FILTER_PLUGIN_SEPARATOR + pluginName : ""));
            if (fmts.contains(myBitstream.getFormat(context).getShortDescription())) {
                try {
@@ -289,7 +288,7 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
                    }
                } catch (Exception e) {
                    System.out.println("ERROR filtering, skipping bitstream #"
                                           + myBitstream.getID() + " " + e);
                    e.printStackTrace();
                }
            }
@@ -297,19 +296,17 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        }

        return filtered;
    }

    @Override
    public boolean processBitstream(Context context, Item item, Bitstream source, FormatFilter formatFilter)
        throws Exception {
        //do pre-processing of this bitstream, and if it fails, skip this bitstream!
        if (!formatFilter.preProcessBitstream(context, item, source, isVerbose)) {
            return false;
        }

        boolean overWrite = isForce;

        // get bitstream filename, calculate destination filename
        String newName = formatFilter.getFilteredName(source.getName());
@@ -319,14 +316,13 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        List<Bundle> bundles = itemService.getBundles(item, formatFilter.getBundleName());

        // check if destination bitstream exists
        if (bundles.size() > 0) {
            // only finds the last match (FIXME?)
            for (Bundle bundle : bundles) {
                List<Bitstream> bitstreams = bundle.getBitstreams();

                for (Bitstream bitstream : bitstreams) {
                    if (bitstream.getName().trim().equals(newName.trim())) {
                        targetBundle = bundle;
                        existingBitstream = bitstream;
                    }
@@ -335,20 +331,18 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        }

        // if exists and overwrite = false, exit
        if (!overWrite && (existingBitstream != null)) {
            if (!isQuiet) {
                System.out.println("SKIPPED: bitstream " + source.getID()
                                       + " (item: " + item.getHandle() + ") because '" + newName + "' already exists");
            }

            return false;
        }

        if (isVerbose) {
            System.out.println("PROCESSING: bitstream " + source.getID()
                                   + " (item: " + item.getHandle() + ")");
        }

        InputStream destStream;
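Note that the existing-bitstream check in the hunk above now compares trimmed names (`bitstream.getName().trim().equals(newName.trim())`), so a derivative whose stored name carries stray whitespace is still detected and skipped. A standalone sketch of that behavior, with the `.txt` suffix and helper names assumed purely for illustration:

```java
// Hypothetical sketch of the trimmed-name collision check above: a derived
// name is the source name plus a filter suffix, and an existing derivative is
// found even if either stored name has stray surrounding whitespace.
import java.util.Arrays;
import java.util.List;

public class DerivedNameDemo {
    // e.g. the text-extraction filter appends ".txt" to the source name
    static String filteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    static boolean alreadyFiltered(List<String> existingNames, String newName) {
        return existingNames.stream()
                            .anyMatch(n -> n.trim().equals(newName.trim()));
    }

    public static void main(String[] args) {
        // the stored derivative has a trailing space, but is still matched
        List<String> names = Arrays.asList("report.pdf", "report.pdf.txt ");
        System.out.println(alreadyFiltered(names, filteredName("report.pdf"))); // true
    }
}
```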
@@ -358,7 +352,7 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        if (destStream == null) {
            if (!isQuiet) {
                System.out.println("SKIPPED: bitstream " + source.getID()
                                       + " (item: " + item.getHandle() + ") because filtering was unsuccessful");
            }

            return false;
@@ -369,12 +363,9 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        }

        // create new bundle if needed
        if (bundles.size() < 1) {
            targetBundle = bundleService.create(context, item, formatFilter.getBundleName());
        } else {
            // take the first match
            targetBundle = bundles.get(0);
        }
@@ -384,21 +375,21 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        // Now set the format and name of the bitstream
        b.setName(context, newName);
        b.setSource(context, "Written by FormatFilter " + formatFilter.getClass().getName() +
            " on " + DCDate.getCurrent() + " (GMT).");
        b.setDescription(context, formatFilter.getDescription());

        // Find the proper format
        BitstreamFormat bf = bitstreamFormatService.findByShortDescription(context,
                                                                           formatFilter.getFormatString());
        bitstreamService.setFormat(context, b, bf);
        bitstreamService.update(context, b);

        //Set permissions on the derivative bitstream
        //- First remove any existing policies
        authorizeService.removeAllPolicies(context, b);

        //- Determine if this is a public-derivative format
        if (publicFiltersClasses.contains(formatFilter.getClass().getSimpleName())) {
            //- Set derivative bitstream to be publicly accessible
            Group anonymous = groupService.findByName(context, Group.ANONYMOUS);
            authorizeService.addPolicy(context, b, Constants.READ, anonymous);
@@ -409,42 +400,34 @@ public class MediaFilterServiceImpl implements MediaFilterService, InitializingB
        // fixme - set date?
        // we are overwriting, so remove old bitstream
        if (existingBitstream != null) {
            bundleService.removeBitstream(context, targetBundle, existingBitstream);
        }

        if (!isQuiet) {
            System.out.println("FILTERED: bitstream " + source.getID()
                                   + " (item: " + item.getHandle() + ") and created '" + newName + "'");
        }

        //do post-processing of the generated bitstream
        formatFilter.postProcessBitstream(context, item, b);

        return true;
    }

    @Override
    public Item getCurrentItem() {
        return currentItem;
    }

    @Override
    public boolean inSkipList(String identifier) {
        if (skipList != null && skipList.contains(identifier)) {
            if (!isQuiet) {
                System.out.println("SKIP-LIST: skipped bitstreams within identifier " + identifier);
            }
            return true;
        } else {
            return false;
        }
    }
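The `inSkipList` method above is consulted at every level of the traversal (community, collection, item), so one exact handle match prunes that whole subtree. Its core semantics can be sketched standalone; the class and handle values here are illustrative assumptions:

```java
// Hypothetical standalone sketch of the skip-list check above: exact handle
// matches are skipped, a null skip-list means nothing is skipped.
import java.util.Arrays;
import java.util.List;

public class SkipListDemo {
    private final List<String> skipList;

    SkipListDemo(List<String> skipList) {
        this.skipList = skipList;
    }

    boolean inSkipList(String identifier) {
        // null skip-list: process everything
        return skipList != null && skipList.contains(identifier);
    }

    public static void main(String[] args) {
        SkipListDemo demo = new SkipListDemo(Arrays.asList("123456789/34", "123456789/323"));
        System.out.println(demo.inSkipList("123456789/34"));  // true
        System.out.println(demo.inSkipList("123456789/99"));  // false
    }
}
```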


@@ -7,18 +7,14 @@
 */
package org.dspace.app.mediafilter;

import java.awt.image.BufferedImage;
import java.io.InputStream;
import javax.imageio.ImageIO;

import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.rendering.PDFRenderer;
import org.dspace.content.Item;

/**
 * Create JPEG thumbnails from PDF cover page using PDFBox.
 * Based on JPEGFilter:
@@ -29,21 +25,17 @@ import org.dspace.app.mediafilter.JPEGFilter;
 * @author Ivan Masár helix84@centrum.sk
 * @author Jason Sherman jsherman@usao.edu
 */
public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFormats {
    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".jpg";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "THUMBNAIL";
    }
@@ -51,8 +43,7 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
     * @return String bitstreamformat
     */
    @Override
    public String getFormatString() {
        return "JPEG";
    }
@@ -60,23 +51,20 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
     * @return String description
     */
    @Override
    public String getDescription() {
        return "Generated Thumbnail";
    }

    /**
     * @param currentItem item
     * @param source      source input stream
     * @param verbose     verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        PDDocument doc = PDDocument.load(source);
        PDFRenderer renderer = new PDFRenderer(doc);
        BufferedImage buf = renderer.renderImage(0);
@@ -88,20 +76,17 @@ public class PDFBoxThumbnail extends MediaFilter implements SelfRegisterInputFor
} }
@Override @Override
public String[] getInputMIMETypes() public String[] getInputMIMETypes() {
{
return ImageIO.getReaderMIMETypes(); return ImageIO.getReaderMIMETypes();
} }
@Override @Override
public String[] getInputDescriptions() public String[] getInputDescriptions() {
{
return null; return null;
} }
@Override @Override
public String[] getInputExtensions() public String[] getInputExtensions() {
{
// Temporarily disabled as JDK 1.6 only // Temporarily disabled as JDK 1.6 only
// return ImageIO.getReaderFileSuffixes(); // return ImageIO.getReaderFileSuffixes();
return null; return null;
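`getInputMIMETypes()` above defers entirely to `javax.imageio.ImageIO`, so the set of accepted input formats depends on which image readers the running JRE provides. A quick standalone look at what a stock JDK reports (the exact list varies by runtime, so the only safe assumption is that it is non-empty; the class name here is mine):

```java
// Prints the image MIME types the current JRE can read, as ImageIO reports
// them. On a standard JDK this typically includes image/png, image/jpeg,
// image/gif, but the exact list is runtime-dependent.
import java.util.Arrays;
import javax.imageio.ImageIO;

public class ImageIoMimeDemo {
    public static void main(String[] args) {
        String[] mimeTypes = ImageIO.getReaderMIMETypes();
        Arrays.stream(mimeTypes).sorted().forEach(System.out::println);
    }
}
```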


@@ -28,24 +28,20 @@ import org.dspace.core.ConfigurationManager;
 * instantiate filter - bitstream format doesn't exist
 *
 */
public class PDFFilter extends MediaFilter {

    private static Logger log = Logger.getLogger(PDFFilter.class);

    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "TEXT";
    }
@@ -53,8 +49,7 @@ public class PDFFilter extends MediaFilter
* @return String bitstreamformat * @return String bitstreamformat
*/ */
@Override @Override
public String getFormatString() public String getFormatString() {
{
return "Text"; return "Text";
} }
@@ -62,25 +57,21 @@ public class PDFFilter extends MediaFilter
* @return String description * @return String description
*/ */
@Override @Override
public String getDescription() public String getDescription() {
{
return "Extracted text"; return "Extracted text";
} }
/** /**
* @param currentItem item * @param currentItem item
* @param source source input stream * @param source source input stream
* @param verbose verbose mode * @param verbose verbose mode
*
* @return InputStream the resulting input stream * @return InputStream the resulting input stream
* @throws Exception if error * @throws Exception if error
*/ */
@Override @Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose) public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception throws Exception {
{ try {
try
{
boolean useTemporaryFile = ConfigurationManager.getBooleanProperty("pdffilter.largepdfs", false); boolean useTemporaryFile = ConfigurationManager.getBooleanProperty("pdffilter.largepdfs", false);
// get input stream from bitstream // get input stream from bitstream
@@ -92,62 +83,43 @@ public class PDFFilter extends MediaFilter
File tempTextFile = null; File tempTextFile = null;
ByteArrayOutputStream byteStream = null; ByteArrayOutputStream byteStream = null;
if (useTemporaryFile) if (useTemporaryFile) {
{
tempTextFile = File.createTempFile("dspacepdfextract" + source.hashCode(), ".txt"); tempTextFile = File.createTempFile("dspacepdfextract" + source.hashCode(), ".txt");
tempTextFile.deleteOnExit(); tempTextFile.deleteOnExit();
writer = new OutputStreamWriter(new FileOutputStream(tempTextFile)); writer = new OutputStreamWriter(new FileOutputStream(tempTextFile));
} } else {
else
{
byteStream = new ByteArrayOutputStream(); byteStream = new ByteArrayOutputStream();
writer = new OutputStreamWriter(byteStream); writer = new OutputStreamWriter(byteStream);
} }
try try {
{
pdfDoc = PDDocument.load(source); pdfDoc = PDDocument.load(source);
pts.writeText(pdfDoc, writer); pts.writeText(pdfDoc, writer);
} } finally {
finally try {
{ if (pdfDoc != null) {
try
{
if (pdfDoc != null)
{
pdfDoc.close(); pdfDoc.close();
} }
} } catch (Exception e) {
catch(Exception e) log.error("Error closing PDF file: " + e.getMessage(), e);
{
log.error("Error closing PDF file: " + e.getMessage(), e);
} }
try try {
{
writer.close(); writer.close();
} } catch (Exception e) {
catch(Exception e) log.error("Error closing temporary extract file: " + e.getMessage(), e);
{
log.error("Error closing temporary extract file: " + e.getMessage(), e);
} }
} }
if (useTemporaryFile) if (useTemporaryFile) {
{
return new FileInputStream(tempTextFile); return new FileInputStream(tempTextFile);
} } else {
else
{
byte[] bytes = byteStream.toByteArray(); byte[] bytes = byteStream.toByteArray();
return new ByteArrayInputStream(bytes); return new ByteArrayInputStream(bytes);
} }
} } catch (OutOfMemoryError oome) {
catch (OutOfMemoryError oome)
{
log.error("Error parsing PDF document " + oome.getMessage(), oome); log.error("Error parsing PDF document " + oome.getMessage(), oome);
if (!ConfigurationManager.getBooleanProperty("pdffilter.skiponmemoryexception", false)) if (!ConfigurationManager.getBooleanProperty("pdffilter.skiponmemoryexception", false)) {
{
throw oome; throw oome;
} }
} }
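The `pdffilter.largepdfs` branch in PDFFilter chooses between spooling extracted text to a temporary file and keeping it in a `ByteArrayOutputStream`. A minimal, stdlib-only sketch of that buffering choice follows; the class and method names here are illustrative, not part of DSpace, and the PDF extraction step is replaced by an already-extracted string.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;

public class ExtractBufferSketch {
    // Mirrors PDFFilter's strategy: large extractions spool to a temp file,
    // small ones stay in memory; either way the caller gets an InputStream.
    static InputStream buffer(String extractedText, boolean useTemporaryFile) throws IOException {
        if (useTemporaryFile) {
            File tmp = File.createTempFile("extract", ".txt");
            tmp.deleteOnExit();
            try (Writer w = new OutputStreamWriter(new FileOutputStream(tmp), StandardCharsets.UTF_8)) {
                w.write(extractedText);
            }
            return new FileInputStream(tmp);
        } else {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (Writer w = new OutputStreamWriter(bytes, StandardCharsets.UTF_8)) {
                w.write(extractedText);
            }
            return new ByteArrayInputStream(bytes.toByteArray());
        }
    }

    public static void main(String[] args) throws IOException {
        // Both paths round-trip the text unchanged.
        for (boolean viaFile : new boolean[] {true, false}) {
            try (InputStream in = buffer("some text", viaFile)) {
                System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8)); // some text
            }
        }
    }
}
```

Either branch returns a stream over the same bytes; the temp-file path just avoids holding a very large extraction in heap.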

@@ -0,0 +1,71 @@
/**
* The contents of this file are subject to the license and copyright
* detailed in the LICENSE and NOTICE files at the root of the source
* tree and available online at
*
* http://www.dspace.org/license/
*/
package org.dspace.app.mediafilter;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
import org.apache.poi.openxml4j.exceptions.OpenXML4JException;
import org.apache.xmlbeans.XmlException;
import org.dspace.content.Item;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* Extract flat text from Microsoft Word documents (.doc, .docx).
*/
public class PoiWordFilter
extends MediaFilter {
private static final Logger LOG = LoggerFactory.getLogger(PoiWordFilter.class);
@Override
public String getFilteredName(String oldFilename) {
return oldFilename + ".txt";
}
@Override
public String getBundleName() {
return "TEXT";
}
@Override
public String getFormatString() {
return "Text";
}
@Override
public String getDescription() {
return "Extracted text";
}
@Override
public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
throws Exception {
String text;
try {
// get input stream from bitstream, pass to filter, get string back
POITextExtractor extractor = ExtractorFactory.createExtractor(source);
text = extractor.getText();
} catch (IOException | OpenXML4JException | XmlException e) {
System.err.format("Invalid File Format: %s%n", e.getMessage());
LOG.error("Unable to parse the bitstream: ", e);
throw e;
}
// if verbose flag is set, print out extracted text to STDOUT
if (verbose) {
System.out.println(text);
}
// return the extracted text as a stream.
return new ByteArrayInputStream(text.getBytes());
}
}
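PoiWordFilter above hands its extracted text back to DSpace as an `InputStream` over the text's bytes, and names the derived bitstream by appending `.txt`. A stdlib-only sketch of that output contract (the class name is hypothetical; the real filter gets its text from POI's `ExtractorFactory`):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class TextStreamSketch {
    // Mirrors getFilteredName: the derived bitstream keeps the old name plus ".txt".
    static String filteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    // Mirrors the tail of getDestinationStream: wrap extracted text in a byte stream.
    // (The original uses the platform default charset; UTF-8 is pinned here for determinism.)
    static InputStream toStream(String text) {
        return new ByteArrayInputStream(text.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        System.out.println(filteredName("thesis.docx")); // thesis.docx.txt
        System.out.println(new String(toStream("hello").readAllBytes(), StandardCharsets.UTF_8)); // hello
    }
}
```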

@@ -10,47 +10,41 @@ package org.dspace.app.mediafilter;
import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.apache.log4j.Logger;
import org.apache.poi.POITextExtractor;
import org.apache.poi.extractor.ExtractorFactory;
import org.apache.poi.hslf.extractor.PowerPointExtractor;
import org.apache.poi.xslf.extractor.XSLFPowerPointExtractor;
import org.dspace.content.Item;

/*
 * TODO: Allow user to configure extraction of only text or only notes
 *
 */
public class PowerPointFilter extends MediaFilter {
    private static Logger log = Logger.getLogger(PowerPointFilter.class);

    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "TEXT";
    }

    /**
     * @return String bitstream format
     *
     * TODO: Check that this is correct
     */
    @Override
    public String getFormatString() {
        return "Text";
    }
@@ -58,59 +52,48 @@ public class PowerPointFilter extends MediaFilter
     * @return String description
     */
    @Override
    public String getDescription() {
        return "Extracted text";
    }

    /**
     * @param currentItem item
     * @param source source input stream
     * @param verbose verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        try {
            String extractedText = null;
            new ExtractorFactory();
            POITextExtractor pptExtractor = ExtractorFactory
                .createExtractor(source);

            // PowerPoint XML files and legacy format PowerPoint files
            // require different classes and APIs for text extraction

            // If this is a PowerPoint XML file, extract accordingly
            if (pptExtractor instanceof XSLFPowerPointExtractor) {
                // The true method arguments indicate that text from
                // the slides and the notes is desired
                extractedText = ((XSLFPowerPointExtractor) pptExtractor)
                    .getText(true, true);
            } else if (pptExtractor instanceof PowerPointExtractor) { // Legacy PowerPoint files
                extractedText = ((PowerPointExtractor) pptExtractor).getText()
                    + " " + ((PowerPointExtractor) pptExtractor).getNotes();
            }

            if (extractedText != null) {
                // if verbose flag is set, print out extracted text
                // to STDOUT
                if (verbose) {
                    System.out.println(extractedText);
                }
@@ -120,9 +103,7 @@ public class PowerPointFilter extends MediaFilter
                return bais;
            }
        } catch (Exception e) {
            log.error("Error filtering bitstream: " + e.getMessage(), e);
            throw e;
        }

@@ -11,8 +11,7 @@ package org.dspace.app.mediafilter;
 * Interface to allow filters to register the input formats they handle
 * (useful for exposing underlying capabilities of libraries used)
 */
public interface SelfRegisterInputFormats {
    public String[] getInputMIMETypes();

    public String[] getInputDescriptions();

@@ -8,39 +8,34 @@
package org.dspace.app.mediafilter;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.log4j.Logger;
import org.dspace.content.Item;
import org.textmining.extraction.TextExtractor;
import org.textmining.extraction.word.WordTextExtractorFactory;

/*
 *
 * to do: helpful error messages - can't find mediafilter.cfg - can't
 * instantiate filter - bitstream format doesn't exist.
 *
 */
public class WordFilter extends MediaFilter {
    private static Logger log = Logger.getLogger(WordFilter.class);

    @Override
    public String getFilteredName(String oldFilename) {
        return oldFilename + ".txt";
    }

    /**
     * @return String bundle name
     */
    @Override
    public String getBundleName() {
        return "TEXT";
    }
@@ -48,8 +43,7 @@ public class WordFilter extends MediaFilter
     * @return String bitstreamformat
     */
    @Override
    public String getFormatString() {
        return "Text";
    }
@@ -57,35 +51,30 @@ public class WordFilter extends MediaFilter
     * @return String description
     */
    @Override
    public String getDescription() {
        return "Extracted text";
    }

    /**
     * @param currentItem item
     * @param source source input stream
     * @param verbose verbose mode
     * @return InputStream the resulting input stream
     * @throws Exception if error
     */
    @Override
    public InputStream getDestinationStream(Item currentItem, InputStream source, boolean verbose)
        throws Exception {
        // get input stream from bitstream
        // pass to filter, get string back
        try {
            WordTextExtractorFactory factory = new WordTextExtractorFactory();
            TextExtractor e = factory.textExtractor(source);
            String extractedText = e.getText();

            // if verbose flag is set, print out extracted text
            // to STDOUT
            if (verbose) {
                System.out.println(extractedText);
            }
@@ -94,12 +83,10 @@ public class WordFilter extends MediaFilter
            ByteArrayInputStream bais = new ByteArrayInputStream(textBytes);

            return bais; // will this work? or will the byte array be out of scope?
        } catch (IOException ioe) {
            System.out.println("Invalid Word Format");
            log.error("Error detected - Word File format not recognized: "
                + ioe.getMessage(), ioe);
            throw ioe;
        }
    }

@@ -11,7 +11,8 @@ import org.dspace.app.mediafilter.service.MediaFilterService;
import org.dspace.services.factory.DSpaceServicesFactory;

/**
 * Abstract factory to get services for the mediafilter package, use MediaFilterServiceFactory.getInstance() to
 * retrieve an implementation
 *
 * @author kevinvandevelde at atmire.com
 */
@@ -19,7 +20,8 @@ public abstract class MediaFilterServiceFactory {
    public abstract MediaFilterService getMediaFilterService();

    public static MediaFilterServiceFactory getInstance() {
        return DSpaceServicesFactory.getInstance().getServiceManager()
            .getServiceByName("mediaFilterServiceFactory", MediaFilterServiceFactory.class);
    }
}
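The `getInstance()` method above is a named lookup against the DSpace service manager. A stripped-down, stdlib-only sketch of that lookup pattern follows; the `Map` registry here is a hypothetical stand-in for the Spring-backed service manager, and the nested interface is only for demonstration.

```java
import java.util.HashMap;
import java.util.Map;

public class ServiceLocatorSketch {
    // Stand-in for the DSpace service manager: named beans in a plain map.
    static final Map<String, Object> REGISTRY = new HashMap<>();

    // Mirrors getServiceByName(name, type): look the bean up, then cast to the requested type.
    static <T> T getServiceByName(String name, Class<T> type) {
        return type.cast(REGISTRY.get(name));
    }

    // Toy service interface, standing in for MediaFilterService.
    interface MediaFilterService {
        String describe();
    }

    public static void main(String[] args) {
        REGISTRY.put("mediaFilterServiceFactory", (MediaFilterService) () -> "filters items");
        MediaFilterService svc = getServiceByName("mediaFilterServiceFactory", MediaFilterService.class);
        System.out.println(svc.describe()); // filters items
    }
}
```

The `Class.cast` call gives a type-safe handle back from an untyped registry, which is essentially what the checked `getServiceByName(name, type)` signature buys callers.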

@@ -11,7 +11,8 @@ import org.dspace.app.mediafilter.service.MediaFilterService;
import org.springframework.beans.factory.annotation.Autowired;

/**
 * Factory implementation to get services for the mediafilter package, use MediaFilterServiceFactory.getInstance() to
 * retrieve an implementation
 *
 * @author kevinvandevelde at atmire.com
 */

@@ -7,6 +7,9 @@
 */
package org.dspace.app.mediafilter.service;

import java.util.List;
import java.util.Map;

import org.dspace.app.mediafilter.FormatFilter;
import org.dspace.content.Bitstream;
import org.dspace.content.Collection;
@@ -14,9 +17,6 @@ import org.dspace.content.Community;
import org.dspace.content.Item;
import org.dspace.core.Context;

/**
 * MediaFilterManager is the class that invokes the media/format filters over the
 * repository's content. A few command line flags affect the operation of the
@@ -36,10 +36,10 @@ public interface MediaFilterService {
    public void applyFiltersAllItems(Context context) throws Exception;

    public void applyFiltersCommunity(Context context, Community community)
        throws Exception;

    public void applyFiltersCollection(Context context, Collection collection)
        throws Exception;

    public void applyFiltersItem(Context c, Item item) throws Exception;
@@ -49,9 +49,9 @@ public interface MediaFilterService {
     * filters if possible.
     *
     * @param context context
     * @param myItem item
     * @return true if any bitstreams processed,
     *         false if none
     * @throws Exception if error
     */
    public boolean filterItem(Context context, Item myItem) throws Exception;
@@ -63,11 +63,11 @@ public interface MediaFilterService {
     * instantiated. Exceptions from filtering will be logged to STDOUT and
     * swallowed.
     *
     * @param c context
     * @param myItem item
     * @param myBitstream bitstream
     * @return true if bitstream processed,
     *         false if no applicable filter or already processed
     * @throws Exception if error
     */
    public boolean filterBitstream(Context c, Item myItem, Bitstream myBitstream) throws Exception;
@@ -79,20 +79,16 @@ public interface MediaFilterService {
     * already been filtered, and if not or if overWrite is set, invokes the
     * filter.
     *
     * @param context context
     * @param item item containing bitstream to process
     * @param source source bitstream to process
     * @param formatFilter FormatFilter to perform filtering
     * @return true if new rendition is created, false if rendition already
     *         exists and overWrite is not set
     * @throws Exception if error occurs
     */
    public boolean processBitstream(Context context, Item item, Bitstream source, FormatFilter formatFilter)
        throws Exception;

    /**
     * Return the item that is currently being processed/filtered
@@ -109,11 +105,9 @@ public interface MediaFilterService {
    /**
     * Check whether or not to skip processing the given identifier.
     *
     * @param identifier identifier (handle) of a community, collection or item
     * @return true if this community, collection or item should be skipped
     *         during processing. Otherwise, return false.
     */
    public boolean inSkipList(String identifier);
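`inSkipList` in the interface above is a membership test over identifiers (handles). A minimal sketch with a stdlib `Set` follows; the class name and constructor are hypothetical, and the real implementation populates its skip list from configuration rather than a constructor argument.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class SkipListSketch {
    private final Set<String> skipIds;

    SkipListSketch(String... identifiers) {
        this.skipIds = new HashSet<>(Arrays.asList(identifiers));
    }

    // Mirrors MediaFilterService.inSkipList: true if this handle should be skipped.
    boolean inSkipList(String identifier) {
        return skipIds.contains(identifier);
    }

    public static void main(String[] args) {
        SkipListSketch skip = new SkipListSketch("123456789/2", "123456789/7");
        System.out.println(skip.inSkipList("123456789/2")); // true
        System.out.println(skip.inSkipList("123456789/3")); // false
    }
}
```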

@@ -25,8 +25,8 @@ import org.dspace.content.DSpaceObject;
import org.dspace.content.crosswalk.CrosswalkException; import org.dspace.content.crosswalk.CrosswalkException;
import org.dspace.content.packager.PackageDisseminator; import org.dspace.content.packager.PackageDisseminator;
import org.dspace.content.packager.PackageException; import org.dspace.content.packager.PackageException;
import org.dspace.content.packager.PackageParameters;
import org.dspace.content.packager.PackageIngester; import org.dspace.content.packager.PackageIngester;
import org.dspace.content.packager.PackageParameters;
import org.dspace.core.Constants; import org.dspace.core.Constants;
import org.dspace.core.Context; import org.dspace.core.Context;
import org.dspace.core.factory.CoreServiceFactory; import org.dspace.core.factory.CoreServiceFactory;
@@ -51,7 +51,8 @@ import org.dspace.workflow.WorkflowException;
* (Add the -h option to get the command to show its own help) * (Add the -h option to get the command to show its own help)
* *
* <pre> * <pre>
* 1. To submit a SIP (submissions tend to create a *new* object, with a new handle. If you want to restore an object, see -r option below) * 1. To submit a SIP (submissions tend to create a *new* object, with a new handle. If you want to restore an
* object, see -r option below)
* dspace packager * dspace packager
* -e {ePerson} * -e {ePerson}
* -t {PackagerType} * -t {PackagerType}
@@ -71,7 +72,8 @@ import org.dspace.workflow.WorkflowException;
* *
* 2. To restore an AIP (similar to submit mode, but attempts to restore with the handles/parents specified in AIP): * 2. To restore an AIP (similar to submit mode, but attempts to restore with the handles/parents specified in AIP):
* dspace packager * dspace packager
* -r --- restores a object from a package info, including the specified handle (will throw an error if handle is already in use) * -r --- restores a object from a package info, including the specified handle (will throw an error if
* handle is already in use)
* -e {ePerson} * -e {ePerson}
* -t {PackagerType} * -t {PackagerType}
* [-o {name}={value} [ -o {name}={value} ..]] * [-o {name}={value} [ -o {name}={value} ..]]
@@ -81,14 +83,19 @@ import org.dspace.workflow.WorkflowException;
* Use with -r to only restore objects which do not already exist. By default, -r will throw an error * Use with -r to only restore objects which do not already exist. By default, -r will throw an error
* and rollback all changes when an object is found that already exists. * and rollback all changes when an object is found that already exists.
* [-f] --- Force a restore (even if object already exists). * [-f] --- Force a restore (even if object already exists).
* Use with -r to replace an existing object with one from a package (essentially a delete and restore). * Use with -r to replace an existing object with one from a package (essentially a delete and
* By default, -r will throw an error and rollback all changes when an object is found that already exists. * restore).
* [-i {identifier-handle-of-object}] -- Optional when -f is specified. When replacing an object, you can specify the * By default, -r will throw an error and rollback all changes when an object is found that already
* exists.
* [-i {identifier-handle-of-object}] -- Optional when -f is specified. When replacing an object, you can
* specify the
* object to replace if it cannot be easily determined from the package itself. * object to replace if it cannot be easily determined from the package itself.
* {package-filename} * {package-filename}
* *
* Restoring is very similar to submitting, except that you are recreating pre-existing objects. So, in a restore, the object(s) are * Restoring is very similar to submitting, except that you are recreating pre-existing objects. So, in a restore,
* being recreated based on the details in the AIP. This means that the object is recreated with the same handle and same parent/children * the object(s) are
* being recreated based on the details in the AIP. This means that the object is recreated with the same handle
* and same parent/children
* objects. Not all {PackagerTypes} may support a "restore". * objects. Not all {PackagerTypes} may support a "restore".
* *
* 3. To write out a DIP: * 3. To write out a DIP:
@@ -113,49 +120,60 @@ import org.dspace.workflow.WorkflowException;
* @author Tim Donohue * @author Tim Donohue
* @version $Revision$ * @version $Revision$
*/ */
public class Packager public class Packager {
{
/* Various private global settings/options */ /* Various private global settings/options */
protected String packageType = null; protected String packageType = null;
protected boolean submit = true; protected boolean submit = true;
protected boolean userInteractionEnabled = true; protected boolean userInteractionEnabled = true;
// die from illegal command line // die from illegal command line
protected static void usageError(String msg) protected static void usageError(String msg) {
{
System.out.println(msg); System.out.println(msg);
System.out.println(" (run with -h flag for details)"); System.out.println(" (run with -h flag for details)");
System.exit(1); System.exit(1);
} }
public static void main(String[] argv) throws Exception public static void main(String[] argv) throws Exception {
{
Options options = new Options(); Options options = new Options();
options.addOption("p", "parent", true, options.addOption("p", "parent", true,
"Handle(s) of parent Community or Collection into which to ingest object (repeatable)"); "Handle(s) of parent Community or Collection into which to ingest object (repeatable)");
options.addOption("e", "eperson", true, options.addOption("e", "eperson", true,
"email address of eperson doing importing"); "email address of eperson doing importing");
options options
.addOption( .addOption(
"w", "w",
"install", "install",
false, false,
"disable workflow; install immediately without going through collection's workflow"); "disable workflow; install immediately without going through collection's workflow");
options.addOption("r", "restore", false, "ingest in \"restore\" mode. Restores a missing object based on the contents in a package."); options.addOption("r", "restore", false,
options.addOption("k", "keep-existing", false, "if an object is found to already exist during a restore (-r), then keep the existing object and continue processing. Can only be used with '-r'. This avoids object-exists errors which are thrown by -r by default."); "ingest in \"restore\" mode. Restores a missing object based on the contents in a package.");
options.addOption("f", "force-replace", false, "if an object is found to already exist during a restore (-r), then remove it and replace it with the contents of the package. Can only be used with '-r'. This REPLACES the object(s) in the repository with the contents from the package(s)."); options.addOption("k", "keep-existing", false,
"if an object is found to already exist during a restore (-r), then keep the existing " +
"object and continue processing. Can only be used with '-r'. This avoids " +
"object-exists errors which are thrown by -r by default.");
options.addOption("f", "force-replace", false,
"if an object is found to already exist during a restore (-r), then remove it and replace " +
"it with the contents of the package. Can only be used with '-r'. This REPLACES the " +
"object(s) in the repository with the contents from the package(s).");
options.addOption("t", "type", true, "package type or MIMEtype"); options.addOption("t", "type", true, "package type or MIMEtype");
options options
.addOption("o", "option", true, .addOption("o", "option", true,
"Packager option to pass to plugin, \"name=value\" (repeatable)"); "Packager option to pass to plugin, \"name=value\" (repeatable)");
options.addOption("d", "disseminate", false, options.addOption("d", "disseminate", false,
"Disseminate package (output); default is to submit."); "Disseminate package (output); default is to submit.");
options.addOption("s", "submit", false, options.addOption("s", "submit", false,
"Submission package (Input); this is the default. "); "Submission package (Input); this is the default. ");
options.addOption("i", "identifier", true, "Handle of object to disseminate."); options.addOption("i", "identifier", true, "Handle of object to disseminate.");
options.addOption("a", "all", false, "also recursively ingest/disseminate any child packages, e.g. all Items within a Collection (not all packagers may support this option!)"); options.addOption("a", "all", false,
options.addOption("h", "help", false, "help (you may also specify '-h -t [type]' for additional help with a specific type of packager)"); "also recursively ingest/disseminate any child packages, e.g. all Items within a Collection" +
options.addOption("u", "no-user-interaction", false, "Skips over all user interaction (i.e. [y/n] question prompts) within this script. This flag can be used if you want to save (pipe) a report of all changes to a file, and therefore need to bypass all user interaction."); " (not all packagers may support this option!)");
options.addOption("h", "help", false,
"help (you may also specify '-h -t [type]' for additional help with a specific type of " +
"packager)");
options.addOption("u", "no-user-interaction", false,
"Skips over all user interaction (i.e. [y/n] question prompts) within this script. This " +
"flag can be used if you want to save (pipe) a report of all changes to a file, and " +
"therefore need to bypass all user interaction.");
CommandLineParser parser = new PosixParser(); CommandLineParser parser = new PosixParser();
CommandLine line = parser.parse(options, argv); CommandLine line = parser.parse(options, argv);
@@ -170,15 +188,13 @@ public class Packager
        //initialize a new packager -- we'll add all our current params as settings
        Packager myPackager = new Packager();

        if (line.hasOption('h')) {
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("Packager [options] package-file|-\n",
                             options);

            //If user specified a type, also print out the SIP and DIP options
            // that are specific to that type of packager
            if (line.hasOption('t')) {
                System.out.println("\n--------------------------------------------------------------");
                System.out.println("Additional options for the " + line.getOptionValue('t') + " packager:");
                System.out.println("--------------------------------------------------------------");
@@ -187,44 +203,36 @@ public class Packager
                PackageIngester sip = (PackageIngester) pluginService
                    .getNamedPlugin(PackageIngester.class, line.getOptionValue('t'));
                if (sip != null) {
                    System.out.println("\n\n" + line.getOptionValue('t') + " Submission (SIP) plugin options:\n");
                    System.out.println(sip.getParameterHelp());
                } else {
                    System.out.println("\nNo valid Submission plugin found for " + line.getOptionValue('t') + " type.");
                }

                PackageDisseminator dip = (PackageDisseminator) pluginService
                    .getNamedPlugin(PackageDisseminator.class, line.getOptionValue('t'));
                if (dip != null) {
                    System.out.println("\n\n" + line.getOptionValue('t') + " Dissemination (DIP) plugin options:\n");
                    System.out.println(dip.getParameterHelp());
                } else {
                    System.out
                        .println("\nNo valid Dissemination plugin found for " + line.getOptionValue('t') + " type.");
                }
            } else {
                //otherwise, display list of valid packager types
                System.out.println("\nAvailable Submission Package (SIP) types:");
                String pn[] = pluginService
                    .getAllPluginNames(PackageIngester.class);
                for (int i = 0; i < pn.length; ++i) {
                    System.out.println("  " + pn[i]);
                }
                System.out
                    .println("\nAvailable Dissemination Package (DIP) types:");
                pn = pluginService.getAllPluginNames(PackageDisseminator.class);
                for (int i = 0; i < pn.length; ++i) {
                    System.out.println("  " + pn[i]);
                }
            }
@@ -232,85 +240,66 @@ public class Packager
        }

        //look for flag to disable all user interaction
        if (line.hasOption('u')) {
            myPackager.userInteractionEnabled = false;
        }
        if (line.hasOption('w')) {
            pkgParams.setWorkflowEnabled(false);
        }
        if (line.hasOption('r')) {
            pkgParams.setRestoreModeEnabled(true);
        }
        //keep-existing is only valid in restoreMode (-r) -- otherwise ignore -k option.
        if (line.hasOption('k') && pkgParams.restoreModeEnabled()) {
            pkgParams.setKeepExistingModeEnabled(true);
        }
        //force-replace is only valid in restoreMode (-r) -- otherwise ignore -f option.
        if (line.hasOption('f') && pkgParams.restoreModeEnabled()) {
            pkgParams.setReplaceModeEnabled(true);
        }
        if (line.hasOption('e')) {
            eperson = line.getOptionValue('e');
        }
        if (line.hasOption('p')) {
            parents = line.getOptionValues('p');
        }
        if (line.hasOption('t')) {
            myPackager.packageType = line.getOptionValue('t');
        }
        if (line.hasOption('i')) {
            identifier = line.getOptionValue('i');
        }
        if (line.hasOption('a')) {
            //enable 'recursiveMode' param to packager implementations, in case it helps with packaging or ingestion
            // process
            pkgParams.setRecursiveModeEnabled(true);
        }
        String files[] = line.getArgs();
        if (files.length > 0) {
            sourceFile = files[0];
        }
        if (line.hasOption('d')) {
            myPackager.submit = false;
        }
        if (line.hasOption('o')) {
            String popt[] = line.getOptionValues('o');
            for (int i = 0; i < popt.length; ++i) {
                String pair[] = popt[i].split("\\=", 2);
                if (pair.length == 2) {
                    pkgParams.addProperty(pair[0].trim(), pair[1].trim());
                } else if (pair.length == 1) {
                    pkgParams.addProperty(pair[0].trim(), "");
                } else {
                    System.err
                        .println("Warning: Illegal package option format: \""
                                     + popt[i] + "\"");
                }
            }
        }

        // Sanity checks on arg list: required args
        // REQUIRED: sourceFile, ePerson (-e), packageType (-t)
        if (sourceFile == null || eperson == null || myPackager.packageType == null) {
            System.err.println("Error - missing a REQUIRED argument or option.\n");
            HelpFormatter myhelp = new HelpFormatter();
            myhelp.printHelp("PackageManager [options] package-file|-\n", options);
@@ -321,67 +310,60 @@ public class Packager
        Context context = new Context();

        EPerson myEPerson = null;
        myEPerson = EPersonServiceFactory.getInstance().getEPersonService().findByEmail(context, eperson);
        if (myEPerson == null) {
            usageError("Error, eperson cannot be found: " + eperson);
        }

        context.setCurrentUser(myEPerson);

        //If we are in REPLACE mode
        if (pkgParams.replaceModeEnabled()) {
            context.setMode(Context.Mode.BATCH_EDIT);
            PackageIngester sip = (PackageIngester) pluginService
                .getNamedPlugin(PackageIngester.class, myPackager.packageType);
            if (sip == null) {
                usageError("Error, Unknown package type: " + myPackager.packageType);
            }

            DSpaceObject objToReplace = null;

            //if a specific identifier was specified, make sure it is valid
            if (identifier != null && identifier.length() > 0) {
                objToReplace = HandleServiceFactory.getInstance().getHandleService()
                                                   .resolveToObject(context, identifier);
                if (objToReplace == null) {
                    throw new IllegalArgumentException("Bad identifier/handle -- "
                                                           + "Cannot resolve handle \"" + identifier + "\"");
                }
            }

            String choiceString = null;
            if (myPackager.userInteractionEnabled) {
                BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
                System.out.println("\n\nWARNING -- You are running the packager in REPLACE mode.");
                System.out.println(
                    "\nREPLACE mode may be potentially dangerous as it will automatically remove and replace contents" +
                        " within DSpace.");
                System.out.println(
                    "We highly recommend backing up all your DSpace contents (files & database) before continuing.");
                System.out.print("\nWould you like to continue? [y/n]: ");
                choiceString = input.readLine();
            } else {
                //user interaction disabled -- default answer to 'yes', otherwise script won't continue
                choiceString = "y";
            }

            if (choiceString.equalsIgnoreCase("y")) {
                System.out.println("Beginning replacement process...");

                try {
                    //replace the object from the source file
                    myPackager.replace(context, sip, pkgParams, sourceFile, objToReplace);

                    //commit all changes & exit successfully
                    context.complete();
                    System.exit(0);
                } catch (Exception e) {
                    // abort all operations
                    e.printStackTrace();
                    context.abort();
@@ -390,74 +372,67 @@ public class Packager
                }
            }
        } else if (myPackager.submit || pkgParams.restoreModeEnabled()) {
            //else if normal SUBMIT mode (or basic RESTORE mode -- which is a special type of submission)
            context.setMode(Context.Mode.BATCH_EDIT);
            PackageIngester sip = (PackageIngester) pluginService
                .getNamedPlugin(PackageIngester.class, myPackager.packageType);
            if (sip == null) {
                usageError("Error, Unknown package type: " + myPackager.packageType);
            }

            // validate each parent arg (if any)
            DSpaceObject parentObjs[] = null;
            if (parents != null) {
                System.out.println("Destination parents:");

                parentObjs = new DSpaceObject[parents.length];
                for (int i = 0; i < parents.length; i++) {
                    // sanity check: did handle resolve?
                    parentObjs[i] = HandleServiceFactory.getInstance().getHandleService().resolveToObject(context,
                        parents[i]);
                    if (parentObjs[i] == null) {
                        throw new IllegalArgumentException(
                            "Bad parent list -- "
                                + "Cannot resolve parent handle \""
                                + parents[i] + "\"");
                    }
                    System.out.println((i == 0 ? "Owner: " : "Parent: ")
                                           + parentObjs[i].getHandle());
                }
            }

            try {
                //ingest the object from the source file
                myPackager.ingest(context, sip, pkgParams, sourceFile, parentObjs);

                //commit all changes & exit successfully
                context.complete();
                System.exit(0);
            } catch (Exception e) {
                // abort all operations
                e.printStackTrace();
                context.abort();
                System.out.println(e);
                System.exit(1);
            }
        } else {
            // else, if DISSEMINATE mode
            context.setMode(Context.Mode.READ_ONLY);
            //retrieve specified package disseminator
            PackageDisseminator dip = (PackageDisseminator) pluginService
                .getNamedPlugin(PackageDisseminator.class, myPackager.packageType);
            if (dip == null) {
                usageError("Error, Unknown package type: " + myPackager.packageType);
            }

            DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService()
                                                   .resolveToObject(context, identifier);
            if (dso == null) {
                throw new IllegalArgumentException("Bad identifier/handle -- "
                                                       + "Cannot resolve handle \"" + identifier + "\"");
            }

            //disseminate the requested object
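The branches above also pick a `Context.Mode` for the run: REPLACE and SUBMIT/RESTORE mutate the repository under `BATCH_EDIT`, while DISSEMINATE only reads under `READ_ONLY`. A compact sketch of that dispatch (the `ModeDemo` class and its local enum are illustrative stand-ins, not DSpace's actual `Context.Mode`):

```java
public class ModeDemo {
    // Local stand-in for the DSpace Context.Mode values used above.
    enum CtxMode { BATCH_EDIT, READ_ONLY }

    // Mirrors the dispatch in main(): REPLACE and SUBMIT/RESTORE write to
    // the repository (batch edit), while DISSEMINATE is read-only.
    static CtxMode pickMode(boolean replaceMode, boolean submit, boolean restoreMode) {
        if (replaceMode || submit || restoreMode) {
            return CtxMode.BATCH_EDIT;
        }
        return CtxMode.READ_ONLY;
    }
}
```

Selecting the mode up front lets the Context skip per-object caching and authorization work that only matters for writes.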
@@ -473,26 +448,26 @@ public class Packager
     * <p>
     * Please note that replace (-r -f) mode calls the replace() method instead.
     *
     * @param context    DSpace Context
     * @param sip        PackageIngester which will actually ingest the package
     * @param pkgParams  Parameters to pass to individual packager instances
     * @param sourceFile location of the source package to ingest
     * @param parentObjs Parent DSpace object(s) to attach new object to
     * @throws IOException           if IO error
     * @throws SQLException          if database error
     * @throws FileNotFoundException if file doesn't exist
     * @throws AuthorizeException    if authorization error
     * @throws CrosswalkException    if crosswalk error
     * @throws PackageException      if packaging error
     */
    protected void ingest(Context context, PackageIngester sip, PackageParameters pkgParams, String sourceFile,
                          DSpaceObject parentObjs[])
        throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException,
        PackageException {
        // make sure we have an input file
        File pkgFile = new File(sourceFile);
        if (!pkgFile.exists()) {
            System.out.println("\nERROR: Package located at " + sourceFile + " does not exist!");
            System.exit(1);
        }
@@ -501,108 +476,92 @@ public class Packager
        //find first parent (if specified) -- this will be the "owner" of the object
        DSpaceObject parent = null;
        if (parentObjs != null && parentObjs.length > 0) {
            parent = parentObjs[0];
        }

        //NOTE: at this point, Parent may be null -- in which case it is up to the PackageIngester
        // to either determine the Parent (from package contents) or throw an error.
        try {
            //If we are doing a recursive ingest, call ingestAll()
            if (pkgParams.recursiveModeEnabled()) {
                System.out.println("\nAlso ingesting all referenced packages (recursive mode)..");
                System.out.println(
                    "This may take a while, please check your logs for ongoing status while we process each package.");

                //ingest first package & recursively ingest anything else that package references (child packages, etc)
                List<String> hdlResults = sip.ingestAll(context, parent, pkgFile, pkgParams, null);

                if (hdlResults != null) {
                    //Report total objects created
                    System.out.println("\nCREATED a total of " + hdlResults.size() + " DSpace Objects.");

                    String choiceString = null;
                    //Ask if user wants full list printed to command line, as this may be rather long.
                    if (this.userInteractionEnabled) {
                        BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
                        System.out.print("\nWould you like to view a list of all objects that were created? [y/n]: ");
                        choiceString = input.readLine();
                    } else {
                        // user interaction disabled -- default answer to 'yes', as
                        // we want to provide user with as detailed a report as possible.
                        choiceString = "y";
                    }

                    // Provide detailed report if user answered 'yes'
                    if (choiceString.equalsIgnoreCase("y")) {
                        System.out.println("\n\n");
                        for (String result : hdlResults) {
                            DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService()
                                                                   .resolveToObject(context, result);
                            if (dso != null) {
                                if (pkgParams.restoreModeEnabled()) {
                                    System.out.println("RESTORED DSpace " + Constants.typeText[dso.getType()] +
                                                           " [ hdl=" + dso.getHandle() + ", dbID=" + dso
                                        .getID() + " ] ");
                                } else {
                                    System.out.println("CREATED new DSpace " + Constants.typeText[dso.getType()] +
                                                           " [ hdl=" + dso.getHandle() + ", dbID=" + dso
                                        .getID() + " ] ");
                                }
                            }
                        }
                    }
                }
            } else {
                //otherwise, just one package to ingest
                try {
                    DSpaceObject dso = sip.ingest(context, parent, pkgFile, pkgParams, null);

                    if (dso != null) {
                        if (pkgParams.restoreModeEnabled()) {
                            System.out.println("RESTORED DSpace " + Constants.typeText[dso.getType()] +
                                                   " [ hdl=" + dso.getHandle() + ", dbID=" + dso.getID() + " ] ");
                        } else {
                            System.out.println("CREATED new DSpace " + Constants.typeText[dso.getType()] +
                                                   " [ hdl=" + dso.getHandle() + ", dbID=" + dso.getID() + " ] ");
                        }
                    }
                } catch (IllegalStateException ie) {
                    // NOTE: if we encounter an IllegalStateException, this means the
                    // handle is already in use and this object already exists.

                    //if we are skipping over (i.e. keeping) existing objects
                    if (pkgParams.keepExistingModeEnabled()) {
                        System.out.println(
                            "\nSKIPPED processing package '" + pkgFile + "', as an Object already exists with this " +
                                "handle.");
                    } else {
                        // Pass this exception on -- which essentially causes a full rollback of all changes (this
                        // is the default)
                        throw ie;
                    }
                }
            }
        } catch (WorkflowException e) {
            throw new PackageException(e);
        }
    }
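ingest() prompts `[y/n]` only when user interaction is enabled, and otherwise falls back to "y" so scripted (piped) runs never block on stdin. A hedged sketch of that pattern (the `PromptDemo` class and `askYesNo` helper are hypothetical, not DSpace API):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;

public class PromptDemo {
    // Generalizes the report prompt in ingest()/disseminate(): ask the user
    // when interactive, otherwise return the supplied default answer
    // ("y" in the Packager, so non-interactive runs get the full report).
    static String askYesNo(boolean interactive, Reader in, String defaultAnswer) throws IOException {
        if (!interactive) {
            return defaultAnswer;
        }
        BufferedReader input = new BufferedReader(in);
        String line = input.readLine();
        return (line == null || line.isEmpty()) ? defaultAnswer : line;
    }
}
```

Taking a `Reader` rather than reading `System.in` directly keeps the prompt testable; the Packager itself wraps `System.in` at each call site.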
@@ -612,128 +571,116 @@ public class Packager
* Disseminate one or more DSpace objects into package(s) based on the * Disseminate one or more DSpace objects into package(s) based on the
* options passed to the 'packager' script * options passed to the 'packager' script
* *
* @param context DSpace context * @param context DSpace context
* @param dip PackageDisseminator which will actually create the package * @param dip PackageDisseminator which will actually create the package
* @param dso DSpace Object to disseminate as a package * @param dso DSpace Object to disseminate as a package
* @param pkgParams Parameters to pass to individual packager instances * @param pkgParams Parameters to pass to individual packager instances
* @param outputFile File where final package should be saved * @param outputFile File where final package should be saved
* @throws IOException if IO error * @throws IOException if IO error
* @throws SQLException if database error * @throws SQLException if database error
* @throws FileNotFoundException if file doesn't exist * @throws FileNotFoundException if file doesn't exist
* @throws AuthorizeException if authorization error * @throws AuthorizeException if authorization error
* @throws CrosswalkException if crosswalk error * @throws CrosswalkException if crosswalk error
* @throws PackageException if packaging error * @throws PackageException if packaging error
*/ */
protected void disseminate(Context context, PackageDisseminator dip, protected void disseminate(Context context, PackageDisseminator dip,
DSpaceObject dso, PackageParameters pkgParams, DSpaceObject dso, PackageParameters pkgParams,
String outputFile) String outputFile)
throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException, PackageException throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException,
{ PackageException {
// initialize output file // initialize output file
File pkgFile = new File(outputFile); File pkgFile = new File(outputFile);
System.out.println("\nDisseminating DSpace " + Constants.typeText[dso.getType()] + System.out.println("\nDisseminating DSpace " + Constants.typeText[dso.getType()] +
" [ hdl=" + dso.getHandle() + " ] to " + outputFile); " [ hdl=" + dso.getHandle() + " ] to " + outputFile);
//If we are doing a recursive dissemination of this object & all its child objects, call disseminateAll() //If we are doing a recursive dissemination of this object & all its child objects, call disseminateAll()
if(pkgParams.recursiveModeEnabled()) if (pkgParams.recursiveModeEnabled()) {
{
System.out.println("\nAlso disseminating all child objects (recursive mode).."); System.out.println("\nAlso disseminating all child objects (recursive mode)..");
System.out.println("This may take a while, please check your logs for ongoing status while we process each package."); System.out.println(
"This may take a while, please check your logs for ongoing status while we process each package.");
//disseminate initial object & recursively disseminate all child objects as well //disseminate initial object & recursively disseminate all child objects as well
List<File> fileResults = dip.disseminateAll(context, dso, pkgParams, pkgFile); List<File> fileResults = dip.disseminateAll(context, dso, pkgParams, pkgFile);
if(fileResults!=null) if (fileResults != null) {
{
//Report total files created //Report total files created
System.out.println("\nCREATED a total of " + fileResults.size() + " dissemination package files."); System.out.println("\nCREATED a total of " + fileResults.size() + " dissemination package files.");
String choiceString = null; String choiceString = null;
//Ask if user wants full list printed to command line, as this may be rather long. //Ask if user wants full list printed to command line, as this may be rather long.
if(this.userInteractionEnabled) if (this.userInteractionEnabled) {
{
BufferedReader input = new BufferedReader(new InputStreamReader(System.in)); BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
System.out.print("\nWould you like to view a list of all files that were created? [y/n]: "); System.out.print("\nWould you like to view a list of all files that were created? [y/n]: ");
choiceString = input.readLine(); choiceString = input.readLine();
} } else {
else
{
// user interaction disabled -- default answer to 'yes', as // user interaction disabled -- default answer to 'yes', as
// we want to provide user with as detailed a report as possible. // we want to provide user with as detailed a report as possible.
choiceString = "y"; choiceString = "y";
} }
// Provide detailed report if user answered 'yes' // Provide detailed report if user answered 'yes'
if (choiceString.equalsIgnoreCase("y")) if (choiceString.equalsIgnoreCase("y")) {
{
System.out.println("\n\n"); System.out.println("\n\n");
for(File result : fileResults) for (File result : fileResults) {
{
System.out.println("CREATED package file: " + result.getCanonicalPath()); System.out.println("CREATED package file: " + result.getCanonicalPath());
} }
} }
} }
} } else {
else
{
//otherwise, just disseminate a single object to a single package file //otherwise, just disseminate a single object to a single package file
dip.disseminate(context, dso, pkgParams, pkgFile); dip.disseminate(context, dso, pkgParams, pkgFile);
if(pkgFile!=null && pkgFile.exists()) if (pkgFile != null && pkgFile.exists()) {
{
System.out.println("\nCREATED package file: " + pkgFile.getCanonicalPath()); System.out.println("\nCREATED package file: " + pkgFile.getCanonicalPath());
} }
} }
} }
/** /**
* Replace an one or more existing DSpace objects with the contents of * Replace an one or more existing DSpace objects with the contents of
* specified package(s) based on the options passed to the 'packager' script. * specified package(s) based on the options passed to the 'packager' script.
* This method is only called for full replaces ('-r -f' options specified) * This method is only called for full replaces ('-r -f' options specified)
* *
* @param context DSpace Context * @param context DSpace Context
* @param sip PackageIngester which will actually replace the object with the package * @param sip PackageIngester which will actually replace the object with the package
* @param pkgParams Parameters to pass to individual packager instances * @param pkgParams Parameters to pass to individual packager instances
* @param sourceFile location of the source package to ingest as the replacement * @param sourceFile location of the source package to ingest as the replacement
* @param objToReplace DSpace object to replace (may be null if it will be specified in the package itself) * @param objToReplace DSpace object to replace (may be null if it will be specified in the package itself)
* @throws IOException if IO error * @throws IOException if IO error
* @throws SQLException if database error * @throws SQLException if database error
* @throws FileNotFoundException if file doesn't exist * @throws FileNotFoundException if file doesn't exist
* @throws AuthorizeException if authorization error * @throws AuthorizeException if authorization error
* @throws CrosswalkException if crosswalk error * @throws CrosswalkException if crosswalk error
* @throws PackageException if packaging error * @throws PackageException if packaging error
*/ */
    protected void replace(Context context, PackageIngester sip, PackageParameters pkgParams, String sourceFile,
                           DSpaceObject objToReplace)
        throws IOException, SQLException, FileNotFoundException, AuthorizeException, CrosswalkException,
        PackageException {
        // make sure we have an input file
        File pkgFile = new File(sourceFile);
        if (!pkgFile.exists()) {
            System.out.println("\nPackage located at " + sourceFile + " does not exist!");
            System.exit(1);
        }

        System.out.println("\nReplacing DSpace object(s) with package located at " + sourceFile);
        if (objToReplace != null) {
            System.out.println("Will replace existing DSpace " + Constants.typeText[objToReplace.getType()] +
                                   " [ hdl=" + objToReplace.getHandle() + " ]");
        }
        // NOTE: At this point, objToReplace may be null. If it is null, it is up to the PackageIngester
        // to determine which Object needs to be replaced (based on the handle specified in the pkg, etc.)

        try {
            //If we are doing a recursive replace, call replaceAll()
            if (pkgParams.recursiveModeEnabled()) {
                //ingest first object using package & recursively replace anything else that package references
                // (child objects, etc)
                List<String> hdlResults = sip.replaceAll(context, objToReplace, pkgFile, pkgParams);

                if (hdlResults != null) {
@@ -742,52 +689,42 @@ public class Packager
                    String choiceString = null;
                    //Ask if user wants full list printed to command line, as this may be rather long.
                    if (this.userInteractionEnabled) {
                        BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
                        System.out.print("\nWould you like to view a list of all objects that were replaced? [y/n]: ");
                        choiceString = input.readLine();
                    } else {
                        // user interaction disabled -- default answer to 'yes', as
                        // we want to provide user with as detailed a report as possible.
                        choiceString = "y";
                    }

                    // Provide detailed report if user answered 'yes'
                    if (choiceString.equalsIgnoreCase("y")) {
                        System.out.println("\n\n");
                        for (String result : hdlResults) {
                            DSpaceObject dso = HandleServiceFactory.getInstance().getHandleService()
                                                                   .resolveToObject(context, result);
                            if (dso != null) {
                                System.out.println("REPLACED DSpace " + Constants.typeText[dso.getType()] +
                                                       " [ hdl=" + dso.getHandle() + " ] ");
                            }
                        }
                    }
                }
            } else {
                //otherwise, just one object to replace
                DSpaceObject dso = sip.replace(context, objToReplace, pkgFile, pkgParams);
                if (dso != null) {
                    System.out.println("REPLACED DSpace " + Constants.typeText[dso.getType()] +
                                           " [ hdl=" + dso.getHandle() + " ] ");
                }
            }
        } catch (WorkflowException e) {
            throw new PackageException(e);
        }
    }

View File
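The replace path above branches on recursive mode: `replaceAll()` returns the handles of every object the package replaced, while the single-object path replaces exactly one. A minimal sketch of that dispatch, using hypothetical simplified types rather than the real `PackageIngester` API:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of the recursive-vs-single replace dispatch; the real
// code delegates to PackageIngester.replaceAll() / replace() instead.
public class ReplaceDispatchSketch {
    static List<String> replace(boolean recursiveMode, List<String> handlesInPackage) {
        if (recursiveMode) {
            // replaceAll(): the first object plus everything the package references
            return handlesInPackage;
        }
        // replace(): exactly one object
        return handlesInPackage.subList(0, 1);
    }

    public static void main(String[] args) {
        List<String> pkg = Arrays.asList("123456789/1", "123456789/2");
        System.out.println(replace(true, pkg));  // [123456789/1, 123456789/2]
        System.out.println(replace(false, pkg)); // [123456789/1]
    }
}
```

This also mirrors why only the recursive branch needs the interactive "view a list of all objects" report: it is the only branch that can touch more than one object.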

@@ -7,26 +7,37 @@
 */
package org.dspace.app.requestitem;

import java.util.Date;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.SequenceGenerator;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

import org.dspace.content.Bitstream;
import org.dspace.content.Item;
import org.dspace.core.Context;
import org.dspace.core.ReloadableEntity;

/**
 * Object representing an Item Request
 */
@Entity
@Table(name = "requestitem")
public class RequestItem implements ReloadableEntity<Integer> {

    @Id
    @Column(name = "requestitem_id")
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "requestitem_seq")
    @SequenceGenerator(name = "requestitem_seq", sequenceName = "requestitem_seq", allocationSize = 1)
    private int requestitem_id;

    @ManyToOne(fetch = FetchType.LAZY)
@@ -43,9 +54,9 @@ public class RequestItem implements ReloadableEntity<Integer> {
    @Column(name = "request_name", length = 64)
    private String reqName;

    // @Column(name = "request_message")
    // @Lob
    @Column(name = "request_message", columnDefinition = "text")
    private String reqMessage;

    @Column(name = "token", unique = true, length = 48)
@@ -71,10 +82,10 @@ public class RequestItem implements ReloadableEntity<Integer> {
    /**
     * Protected constructor, create object using:
     * {@link org.dspace.app.requestitem.service.RequestItemService#createRequest(Context, Bitstream, Item,
     * boolean, String, String, String)}
     */
    protected RequestItem() {
    }

    public Integer getID() {

View File

@@ -12,19 +12,18 @@ import org.dspace.eperson.EPerson;
/**
 * Simple DTO to transfer data about the corresponding author for the Request
 * Copy feature
 *
 * @author Andrea Bollini
 */
public class RequestItemAuthor {
    private String fullName;
    private String email;

    public RequestItemAuthor(String fullName, String email) {
        super();
        this.fullName = fullName;
        this.email = email;
    }

    public RequestItemAuthor(EPerson ePerson) {
        super();
@@ -32,11 +31,11 @@ public class RequestItemAuthor {
        this.email = ePerson.getEmail();
    }

    public String getEmail() {
        return email;
    }

    public String getFullName() {
        return fullName;
    }
}

View File

@@ -15,11 +15,10 @@ import org.dspace.core.Context;
/**
 * Interface to abstract the strategy for selecting the author to contact for
 * request copy
 *
 * @author Andrea Bollini
 */
public interface RequestItemAuthorExtractor {
    public RequestItemAuthor getRequestItemAuthor(Context context, Item item)
        throws SQLException;
}

View File

@@ -7,6 +7,8 @@
 */
package org.dspace.app.requestitem;

import java.sql.SQLException;

import org.apache.commons.lang.StringUtils;
import org.apache.log4j.Logger;
import org.dspace.content.Item;
@@ -17,16 +19,15 @@ import org.dspace.eperson.EPerson;
import org.dspace.eperson.service.EPersonService;
import org.springframework.beans.factory.annotation.Autowired;

/**
 * RequestItem strategy to allow DSpace support team's helpdesk to receive requestItem request.
 * With this enabled, the Item author/submitter doesn't receive the request; the helpdesk does instead.
 *
 * Failover to the RequestItemSubmitterStrategy, which means the submitter would get the request if there is no
 * specified helpdesk email.
 *
 * @author Sam Ottenhoff
 * @author Peter Dietz
 */
public class RequestItemHelpdeskStrategy extends RequestItemSubmitterStrategy {
@@ -35,11 +36,13 @@ public class RequestItemHelpdeskStrategy extends RequestItemSubmitterStrategy {
    @Autowired(required = true)
    protected EPersonService ePersonService;

    public RequestItemHelpdeskStrategy() {
    }

    @Override
    public RequestItemAuthor getRequestItemAuthor(Context context, Item item) throws SQLException {
        boolean helpdeskOverridesSubmitter = ConfigurationManager
            .getBooleanProperty("request.item.helpdesk.override", false);
        String helpDeskEmail = ConfigurationManager.getProperty("mail.helpdesk");

        if (helpdeskOverridesSubmitter && StringUtils.isNotBlank(helpDeskEmail)) {
@@ -54,19 +57,20 @@ public class RequestItemHelpdeskStrategy extends RequestItemSubmitterStrategy {
     * Return a RequestItemAuthor object for the specified helpdesk email address.
     * It attempts to find a matching eperson for the helpdesk address, to use its name;
     * otherwise it falls back to a helpdeskname key in the Messages.props.
     *
     * @param context       context
     * @param helpDeskEmail email
     * @return RequestItemAuthor
     * @throws SQLException if database error
     */
    public RequestItemAuthor getHelpDeskPerson(Context context, String helpDeskEmail) throws SQLException {
        EPerson helpdeskEPerson = null;

        context.turnOffAuthorisationSystem();
        helpdeskEPerson = ePersonService.findByEmail(context, helpDeskEmail);
        context.restoreAuthSystemState();

        if (helpdeskEPerson != null) {
            return new RequestItemAuthor(helpdeskEPerson);
        } else {
            String helpdeskName = I18nUtil.getMessage(

View File
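The helpdesk strategy above answers only when `request.item.helpdesk.override` is true and `mail.helpdesk` is non-blank; otherwise the submitter receives the request. A self-contained sketch of that decision (plain Java; the `chooseRecipient` helper and its parameters are illustrative stand-ins, not the DSpace API):

```java
// Illustrative sketch of the helpdesk-override decision; the real class reads
// "request.item.helpdesk.override" and "mail.helpdesk" via ConfigurationManager.
public class HelpdeskFallbackSketch {
    static String chooseRecipient(boolean overrideEnabled, String helpdeskEmail, String submitterEmail) {
        // helpdesk wins only when the override flag is set AND an address is configured
        if (overrideEnabled && helpdeskEmail != null && !helpdeskEmail.trim().isEmpty()) {
            return helpdeskEmail;
        }
        // failover: the original submitter receives the request
        return submitterEmail;
    }

    public static void main(String[] args) {
        System.out.println(chooseRecipient(true, "help@example.org", "sub@example.org"));  // help@example.org
        System.out.println(chooseRecipient(false, "help@example.org", "sub@example.org")); // sub@example.org
        System.out.println(chooseRecipient(true, "   ", "sub@example.org"));               // sub@example.org
    }
}
```

The blank-check mirrors the `StringUtils.isNotBlank(helpDeskEmail)` guard in the hunk: a configured-but-empty helpdesk address must not swallow the request.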

@@ -11,8 +11,8 @@ import java.sql.SQLException;
import java.util.List;

import org.apache.commons.lang.StringUtils;
import org.dspace.content.Item;
import org.dspace.content.MetadataValue;
import org.dspace.content.service.ItemService;
import org.dspace.core.Context;
import org.dspace.core.I18nUtil;
@@ -21,9 +21,8 @@ import org.springframework.beans.factory.annotation.Autowired;
/**
 * Tries to look up the corresponding author name and email in item metadata.
 * Failover to the RequestItemSubmitterStrategy
 *
 * @author Andrea Bollini
 */
public class RequestItemMetadataStrategy extends RequestItemSubmitterStrategy {
@@ -33,49 +32,44 @@ public class RequestItemMetadataStrategy extends RequestItemSubmitterStrategy {
    @Autowired(required = true)
    protected ItemService itemService;

    public RequestItemMetadataStrategy() {
    }

    @Override
    public RequestItemAuthor getRequestItemAuthor(Context context, Item item)
        throws SQLException {
        if (emailMetadata != null) {
            List<MetadataValue> vals = itemService.getMetadataByMetadataString(item, emailMetadata);
            if (vals.size() > 0) {
                String email = vals.iterator().next().getValue();
                String fullname = null;
                if (fullNameMetadata != null) {
                    List<MetadataValue> nameVals = itemService.getMetadataByMetadataString(item, fullNameMetadata);
                    if (nameVals.size() > 0) {
                        fullname = nameVals.iterator().next().getValue();
                    }
                }

                if (StringUtils.isBlank(fullname)) {
                    fullname = I18nUtil
                        .getMessage(
                            "org.dspace.app.requestitem.RequestItemMetadataStrategy.unnamed",
                            context);
                }
                RequestItemAuthor author = new RequestItemAuthor(
                    fullname, email);
                return author;
            }
        }
        return super.getRequestItemAuthor(context, item);
    }

    public void setEmailMetadata(String emailMetadata) {
        this.emailMetadata = emailMetadata;
    }

    public void setFullNameMetadata(String fullNameMetadata) {
        this.fullNameMetadata = fullNameMetadata;
    }
}

View File
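RequestItemMetadataStrategy answers from configured item metadata and falls through to its parent via `super.getRequestItemAuthor(...)` when the email field yields nothing. The fallback chain can be sketched with hypothetical simplified types (a `Map` stands in for item metadata; class and field names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified stand-ins for the real extractor classes.
class SubmitterStrategySketch {
    String author(Map<String, String> metadata, String submitterEmail) {
        return submitterEmail; // last resort: the original submitter
    }
}

class MetadataStrategySketch extends SubmitterStrategySketch {
    private final String emailField;

    MetadataStrategySketch(String emailField) {
        this.emailField = emailField;
    }

    @Override
    String author(Map<String, String> metadata, String submitterEmail) {
        String email = metadata.get(emailField);
        if (email != null && !email.isEmpty()) {
            return email; // metadata wins when present
        }
        return super.author(metadata, submitterEmail); // failover to the parent strategy
    }
}

public class StrategyChainDemo {
    public static void main(String[] args) {
        MetadataStrategySketch strategy = new MetadataStrategySketch("person.email");
        Map<String, String> md = new HashMap<>();
        md.put("person.email", "author@example.org");
        System.out.println(strategy.author(md, "submitter@example.org"));              // author@example.org
        System.out.println(strategy.author(new HashMap<>(), "submitter@example.org")); // submitter@example.org
    }
}
```

Expressing the failover as an inheritance chain is what lets the helpdesk and metadata strategies share the submitter fallback without duplicating it.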

@@ -7,6 +7,9 @@
 */
package org.dspace.app.requestitem;

import java.sql.SQLException;
import java.util.Date;

import org.apache.log4j.Logger;
import org.dspace.app.requestitem.dao.RequestItemDAO;
import org.dspace.app.requestitem.service.RequestItemService;
@@ -16,9 +19,6 @@ import org.dspace.core.Context;
import org.dspace.core.Utils;
import org.springframework.beans.factory.annotation.Autowired;

/**
 * Service implementation for the RequestItem object.
 * This class is responsible for all business logic calls for the RequestItem object and is autowired by spring.
@@ -33,13 +33,13 @@ public class RequestItemServiceImpl implements RequestItemService {
    @Autowired(required = true)
    protected RequestItemDAO requestItemDAO;

    protected RequestItemServiceImpl() {
    }

    @Override
    public String createRequest(Context context, Bitstream bitstream, Item item, boolean allFiles, String reqEmail,
                                String reqName, String reqMessage) throws SQLException {

        RequestItem requestItem = requestItemDAO.create(context, new RequestItem());

        requestItem.setToken(Utils.generateHexKey());
@@ -53,10 +53,9 @@ public class RequestItemServiceImpl implements RequestItemService {
        requestItemDAO.save(context, requestItem);

        if (log.isDebugEnabled()) {
            log.debug("Created requestitem_token " + requestItem.getID()
                          + " with token " + requestItem.getToken() + "\"");
        }
        return requestItem.getToken();
    }

View File

@@ -15,22 +15,21 @@ import org.dspace.eperson.EPerson;
/**
 * Basic strategy that looks to the original submitter.
 *
 * @author Andrea Bollini
 */
public class RequestItemSubmitterStrategy implements RequestItemAuthorExtractor {

    public RequestItemSubmitterStrategy() {
    }

    @Override
    public RequestItemAuthor getRequestItemAuthor(Context context, Item item)
        throws SQLException {
        EPerson submitter = item.getSubmitter();
        RequestItemAuthor author = new RequestItemAuthor(
            submitter.getFullName(), submitter.getEmail());
        return author;
    }
}

View File

@@ -7,15 +7,16 @@
 */
package org.dspace.app.requestitem.dao;

import java.sql.SQLException;

import org.dspace.app.requestitem.RequestItem;
import org.dspace.core.Context;
import org.dspace.core.GenericDAO;

/**
 * Database Access Object interface class for the RequestItem object.
 * The implementation of this class is responsible for all database calls for the RequestItem object and is autowired
 * by spring.
 * This class should only be accessed from a single service and should never be exposed outside of the API
 *
 * @author kevinvandevelde at atmire.com

Some files were not shown because too many files have changed in this diff.