Compare commits

...

446 Commits

Author SHA1 Message Date
Emmanuel Cazenave 733cc1104f litterlias: accept json response in annexes (#90254)
2024-05-02 14:58:37 +02:00
Emmanuel Cazenave 022d3f2f37 translation update
2024-05-02 10:54:12 +02:00
Emmanuel Cazenave 9d8ed0ae1d r2p: add hide_spi parameter (#90217)
2024-05-02 10:39:52 +02:00
Frédéric Péters 35201c953f translation update
2024-04-30 16:23:53 +02:00
Corentin Sechet 48eeaf01e4 translations update
2024-04-26 12:05:27 +02:00
Corentin Sechet a3862e80fe translations update
2024-04-26 12:02:32 +02:00
Frédéric Péters 09632084c0 misc: allow an "empty" value (single dash) for trace emails (#88841)
2024-04-26 11:38:15 +02:00
Frédéric Péters feb6fd1428 photon: handle "name" and pack other extra properties (#89845)
2024-04-26 09:02:56 +02:00
Frédéric Péters db9740da15 tcl: use id_data in status check (#89823)
2024-04-22 09:28:53 +02:00
Emmanuel Cazenave 38d62092ef jsonresponse: add x-error-code header in case of error (#89784)
2024-04-19 14:45:58 +02:00
Emmanuel Cazenave c25a371c10 litteralis: receive and forward file as an attachment (#89739)
2024-04-18 16:14:58 +02:00
Benjamin Dauvergne c10b2327dd rsa13: keep null integer values in CSV export (#89677)
2024-04-17 11:10:01 +02:00
Corentin Sechet 3538d06f6f toulouse-foederis: increase requests timeout (#89287)
2024-04-09 11:27:54 +02:00
Nicolas Roche abafd19553 translations update
2024-04-04 12:03:13 +02:00
Nicolas Roche a565716db2 toulouse-maelis: change activities sort on global catalog (#89096)
2024-04-04 11:45:13 +02:00
Nicolas Roche d4bb8059e0 toulouse-maelis: [tests] count soap requests (#88873)
2024-04-03 17:17:37 +02:00
Benjamin Dauvergne 10590eb9d9 toulouse_maelis: check q is a DUI in search-family-dui (#88873)
A DUI is an xs:int.
2024-04-03 17:17:37 +02:00
Yann Weber 23a59e3f0f caldav: add EGW SQL log cleaning (#88974)
2024-04-03 11:05:54 +02:00
Yann Weber 71d40f1230 caldav: add a test with datetime (#88977)
During tests with SEQUENCE field, we found that no tests were testing
an event with date + time
2024-04-02 18:28:25 +02:00
Yann Weber d740651fcd caldav: fix SEQUENCE value when creating an event (#88977) 2024-04-02 18:28:07 +02:00
Yann Weber 1fa2c0f9a7 caldav: remove caldav lib mocks from tests (#88463)
2024-04-02 16:03:31 +02:00
Corentin Sechet 96c1e49e23 qrcode: add retry button to refetch metadata on error (#86091)
2024-03-29 09:35:00 +01:00
Thomas NOËL 3729c4605d filr_rest: precise folder title when search (#88491)
2024-03-22 12:29:06 +01:00
Yann Weber 923125e786 caldav: add SEQUENCE field when missing on event update (#88417)
2024-03-21 16:11:27 +01:00
Valentin Deniaud 4a243f72e5 translation update
2024-03-21 11:39:59 +01:00
Yann Weber b5ccfb60ff carl: add check_status method checking if Carl is up (#88345)
2024-03-21 10:33:05 +01:00
Yann Weber a36a0f9e9c translations update
2024-03-21 09:30:12 +01:00
Yann Weber fd7a2d487d caldav: add rrule/until convertion to date (#88393)
2024-03-20 18:18:44 +01:00
Yann Weber 1e13344dc0 caldav: pin bookworm's version of caldav lib & deps (#88393) 2024-03-20 18:18:36 +01:00
Yann Weber ed4a4b629e carl: add check on empty id (#87989)
2024-03-19 18:15:31 +01:00
Yann Weber 82e4d424c7 caldav: remove DTSTART & DTEND from required arguments (#88313)
2024-03-18 17:28:00 +01:00
Serghei Mihai 9ccf11fbd2 mgdis: add initial connector (#88018)
2024-03-15 09:39:04 +01:00
Yann Weber 764a1997f9 caldav: fix event update (#88140)
2024-03-14 14:04:55 +01:00
Yann Weber 04a9840744 caldav: add attribute to avoid logging of "normal" 401 (#88120)
2024-03-14 11:29:59 +01:00
Yann Weber 7b58db7ea7 caldav: make caldav use our requests in order to log DAV request (#88089)
2024-03-13 11:59:07 +01:00
Yann Weber 8986d35915 caldav: change CATEGORIES parameter handling (#88089)
Allowing only a single category
2024-03-13 11:59:00 +01:00
Nicolas Roche 5d26332646 toulouse-maelis: sort activities on global catalog (#87953)
2024-03-08 16:00:12 +01:00
Yann Weber 0f75026c9a update translation
Carl connector implementation (#86683)
2024-03-07 18:34:15 +01:00
Yann Weber f32d06b474 carl: add carl connector (#86683)
2024-03-05 17:27:44 +01:00
Yann Weber 08b82d398d update translation
CalDAV connector
2024-03-05 16:59:46 +01:00
Yann Weber 5e3a2d23d9 Revert "update translations"
This reverts commit c729600cfc.
2024-03-05 15:37:26 +01:00
Yann Weber c729600cfc update translations
CalDAV connector
2024-03-05 15:00:04 +01:00
Yann Weber 665c16bca2 caldav: add event creation/update/deletion (#87227)
2024-03-05 14:30:06 +01:00
Nicolas Roche 5f86102711 toulouse-maelis: add get-family-id endpoint (#87561)
2024-03-01 17:58:37 +01:00
Corentin Sechet 47f925d62c update translation
2024-03-01 15:46:10 +01:00
Corentin Sechet 6369875b12 qrcode: center popup and make it full width (#86853) 2024-03-01 15:46:10 +01:00
Corentin Sechet 7330fce543 qrcode: move tally to a new element (#86853) 2024-03-01 15:46:10 +01:00
Corentin Sechet 6fc97253ff qrcode: wait for UI update in tests event when not calling fetch (#86853) 2024-03-01 15:46:10 +01:00
Corentin Sechet 631fd54f30 qrcode: move data items to a new element (#86853) 2024-03-01 15:46:10 +01:00
Corentin Sechet 8e7056f4de qrcode: move validity code to a new element (#86853) 2024-03-01 15:46:10 +01:00
Corentin Sechet 99662694e6 qrcode: remove title and show errors in dedicated div (#86853) 2024-03-01 15:46:10 +01:00
Frédéric Péters f7f848af82 litteralis: do not send microseconds (#87648)
2024-03-01 15:44:07 +01:00
Emmanuel Cazenave b35b0fccd3 atal_rest: expand data returned by worksrequest-status (#87630)
2024-03-01 10:38:42 +01:00
Emmanuel Cazenave 1570ba1f3f atal_rest: change a parameter name (#87489)
2024-03-01 10:37:07 +01:00
Emmanuel Cazenave ea1a5a87d6 atal_rest: accept string and float for geoloc (#87484)
2024-03-01 10:36:33 +01:00
Corentin Sechet 96f992e11d js: pin happy-dom version (#87584)
2024-02-29 17:50:10 +01:00
Nicolas Roche fd91664b1e toulouse-maelis: [tests] empty activity referential into a test (#87625)
2024-02-29 16:04:28 +01:00
Corentin Sechet 3f584e13a7 js: fix timers mocking in QRCode tests (#87365)
2024-02-28 18:27:39 +01:00
Benjamin Dauvergne 668ddf08e5 toulouse_maelis: trigger wcs after paid notification (#87168)
2024-02-23 20:20:02 +01:00
Benjamin Dauvergne b9cc850b7f toulouse_maelis: add an error status to invoice (#87168) 2024-02-23 20:20:02 +01:00
Benjamin Dauvergne b2aa4c72f1 misc: add count parameter to BaseResource.jobs() (#87168)
It limits the number of jobs to run; it is useful for tests.
2024-02-23 20:20:02 +01:00
Frédéric Péters d416793d64 litteralis: accept tz-aware datetimes (#87415)
2024-02-23 20:01:38 +01:00
Frédéric Péters 51d2b4b314 toulouse_smart: accept tz-aware datetimes (#87415)
2024-02-23 18:43:26 +01:00
Corentin Sechet c9001cdda4 qrcode: don't refresh tallying when reader is not visible (#86856)
2024-02-23 11:53:34 +01:00
Corentin Sechet 2f50e4b207 qrcode: fix tally-url present even when tallying is disabled for a reader (#87343)
2024-02-22 18:58:50 +01:00
Benjamin Dauvergne 91b92aeb44 toulouse_maelis: restart paid notification job every 5 minutes on failure (#87030)
2024-02-20 07:47:53 +01:00
Benjamin Dauvergne 628b38fe5f toulouse_maelis: ignore all errors when retrieving invoices (#87028)
2024-02-16 16:49:51 +01:00
Benjamin Dauvergne 8d6f202b16 utils/soap: use SOAPServiceUnreachable for all service down errors (#87028) 2024-02-16 16:49:44 +01:00
Benjamin Dauvergne 68764cd9c2 translation update
2024-02-14 16:02:37 +01:00
Benjamin Dauvergne 33b39c52f1 base_adresse: add indexes on on geographic models names (#66694)
2024-02-14 15:29:10 +01:00
Corentin Sechet 0fa387d6d0 qrcode: don't refetch metadata & tally status if scanning the same certificate twice (#86854)
2024-02-14 13:23:15 +01:00
Corentin Sechet 9ed59bc94d qrcode: wait for metadata & tally to be loaded before returning from scan (#86854) 2024-02-14 12:00:11 +01:00
Benjamin Dauvergne 7c9e487482 rsa13: rename type-evenement endpoint to typeevenement (#86918)
2024-02-13 15:55:44 +01:00
Corentin Sechet 2842439ce1 qrcode: add tally service worker (#86092)
2024-02-13 13:17:30 +01:00
Corentin Sechet 4738850fcc qrcode: add tallying support (#86092) 2024-02-13 13:17:28 +01:00
Serghei Mihai 38d3fbbf4e arcgis: add endpoint to add/edit/delete features (#86679)
2024-02-12 15:42:54 +01:00
Corentin Sechet 36dfa9508e qrcode: add metadata support (#82653)
2024-02-12 10:47:38 +01:00
Serghei Mihai cd0f441d3b mdel: remove job call to non-existent method (#86794)
2024-02-09 11:44:31 +01:00
Benjamin Dauvergne 97e77b55c9 translation update
2024-02-09 10:49:37 +01:00
Nicolas Roche c3b5707ea6 toulouse-maelis: add DUI to already linked error (#86594)
2024-02-09 10:46:49 +01:00
Benjamin Dauvergne d05217cdc0 utils/api: do not clobber endpoint.methods when endpoint.post exists (#84252)
2024-02-07 19:00:52 +01:00
Benjamin Dauvergne 5b2f2d5b97 rsa13: add beneficiaire evenement endpoints (#84252) 2024-02-07 19:00:52 +01:00
Benjamin Dauvergne 4465d64903 rsa13: add type-evenement endpoint (#84252)
Cf. [[bouches-du-rhone-cd13:Spec_Beta#EN-RECETTE-v17-apitype-evenement-GET-liste-des-types-d%C3%A9v%C3%A8nements]]
2024-02-07 16:15:05 +01:00
Benjamin Dauvergne 0b4de669fd tests: remove references to InMemoryCache (#85832)
2024-02-02 20:49:28 +01:00
Benjamin Dauvergne a15a11ec4a utils/soap: use http cache for wsdl and xsd files (#85832)
We should not use another cache since the requests cache has a better
behaviour.
2024-02-02 20:49:28 +01:00
Benjamin Dauvergne 9a487dde91 utils: return cached response if refresh fails (#85832) 2024-02-02 20:49:28 +01:00
Nicolas Roche 9a158a66d1 toulouse-maelis: always return err=0 on dlnuf endpoint (#86503)
2024-02-02 16:37:23 +01:00
Benjamin Dauvergne a1d4c44ac4 utils: normalize schema before validation (#86422)
To allow using lazy strings in JSON schemas we cast lazy strings to str
before validating the schema itself. The resulting validator is cached.
2024-02-01 19:00:16 +01:00
Benjamin Dauvergne f2b64b6ebf utils: factorize json schema validation (#86422)
It was already used in two places (views.py and photon connector), and
wrongly (photon connector was ignoring some errors for no reason).

META_SCHEMA manipulation is removed and will be replaced by a
normalization of the schema to remove lazy strings in a later commit.

A new JSONValidationError subclass of APIError is introduced.
2024-02-01 19:00:16 +01:00
Benjamin Dauvergne 45f6ee9e8d utils: simplify subclassing APIError (#86422)
By making err, http_status and log_error class variables with their
default values.
2024-02-01 18:53:22 +01:00
Benjamin Dauvergne 3c30a76f3e ci: use jsonschema version from debian bookworm (#86422) 2024-02-01 12:47:39 +01:00
Benjamin Dauvergne 3ca8e98485 packaging: relax version constraint on jsonschema (#86422) 2024-02-01 12:43:44 +01:00
Emmanuel Cazenave 8f21df1dc7 r2p: adapt some labels (#86420)
2024-02-01 12:10:40 +01:00
Emmanuel Cazenave 4258520491 translation update
2024-02-01 11:49:55 +01:00
Emmanuel Cazenave 0f1117f483 r2p: start connector (#86000)
2024-02-01 11:08:04 +01:00
Nicolas Roche eac67cb852 toulouse-maelis: don't validate a basket unrelated to new subscriptions (#86374)
2024-02-01 09:42:08 +01:00
Nicolas Roche f841e98049 toulouse-maelis: do not link invoice to removed subscriptions (#86373)
2024-02-01 09:34:12 +01:00
Nicolas Roche ad543177f1 toulouse-maelis: use a basket id variable (#86373) 2024-02-01 09:34:12 +01:00
Nicolas Roche 1a30653192 toulouse-maelis: [tests] do subscribe before basket validation on tests (#86364)
2024-02-01 09:28:04 +01:00
Nicolas Roche 33c756e76a toulouse-maelis: [tests] use same basket id in all tests (#86354)
2024-02-01 09:27:19 +01:00
Benjamin Dauvergne 94e60a35b2 mdel: intercept exceptions during response retrieval (#86379)
2024-01-31 15:52:44 +01:00
Benjamin Dauvergne 3ad6c89068 mdel: keep sending demands (#86379)
- add Demand.sent attribute
- set Demand.sent to True after pushing the demand
- retry after one hour in case of failure
2024-01-31 15:52:29 +01:00
Benjamin Dauvergne df0084d202 utils: raise sftp timeout to 20 secondes (#86379) 2024-01-31 12:37:44 +01:00
Nicolas Roche 13d30a8049 toulouse-maelis: normalize rl indicator referential (#86068)
2024-01-31 11:47:57 +01:00
Nicolas Roche b5aa2bb2b2 toulouse-maelis: normalize organ referential (#86068) 2024-01-31 11:47:57 +01:00
Nicolas Roche 509ae33314 toulouse-maelis: normalize direction referential (#86068) 2024-01-31 11:47:57 +01:00
Nicolas Roche c3557f628a toulouse-maelis: normalize service referential (#86068) 2024-01-31 11:47:57 +01:00
Nicolas Roche 93c6224357 toulouse-maelis: add service to nursery data (#86068) 2024-01-31 11:47:57 +01:00
Nicolas Roche e2e591afeb toulouse-maelis: add service to personnal activity data (#86068) 2024-01-31 11:47:57 +01:00
Nicolas Roche 94057f7a0f toulouse-maelis: add service to activity referential data (#86068) 2024-01-31 11:47:57 +01:00
Nicolas Roche 86ba7d6a55 toulouse-maelis: add direction to service referential data (#86068) 2024-01-31 11:47:57 +01:00
Frédéric Péters 129e2a5af3 csv: keep on failing on nul bytes (#86355)
2024-01-31 09:44:13 +01:00
Frédéric Péters 1c7f6c2557 tests: keep on using deprecated cgi module in tests (#86355)
2024-01-31 09:28:43 +01:00
Nicolas Roche 8459ef11a3 litteralis: raise API error on bad coordinates (#85310)
2024-01-19 16:51:45 +01:00
Nicolas Roche 8c9edcd332 toulouse-maelis: notify all invoices paid (#85807)
2024-01-19 10:39:56 +01:00
Lauréline Guérin f69f6281ab base_adresse: endpoint cities, compatibility with wcs geolocation (#85863)
2024-01-19 08:40:55 +01:00
Nicolas Roche 72b3315ae4 toulouse-maelis: detail no week calendar error message (#85852)
2024-01-18 17:45:36 +01:00
Nicolas Roche 08f347edf3 toulouse-maelis: ignore invoice without number (#85704)
2024-01-18 11:28:46 +01:00
Nicolas Roche 28d2ea1ba9 toulouse-maelis: add family id to recurrent week call (#85389)
2024-01-17 10:31:56 +01:00
Frédéric Péters 9c695acf63 opengis: force projection for GeoJSON (#85670)
2024-01-16 16:43:23 +01:00
Benjamin Dauvergne bbf7aabb30 Revert base_adresse: add indexes on on geographic models names (#66694)
This reverts commit ee1baf8b50.
2024-01-15 18:34:09 +01:00
Benjamin Dauvergne ee1baf8b50 base_adresse: add indexes on on geographic models names (#66694)
2024-01-15 16:22:17 +01:00
Benjamin Dauvergne 37536df9d1 toulouse_maelis: report update errors after a delay (#84953)
Also try to update as soon as possible after an error during daily
updates.
2024-01-15 15:48:01 +01:00
Thomas NOËL 514c3f1995 scrib: always send numBac if present (#85548)
2024-01-13 00:49:28 +01:00
Thomas NOËL dd87030c50 scrib: return with err 1 on error codeErreur/codeRetour responses (#85447)
2024-01-09 16:37:43 +01:00
Nicolas Roche 104f96b61c toulouse-maelis: parse datetime on activity birth dates (#85263)
2024-01-04 11:49:14 +01:00
Thomas NOËL 6f3516297e settings: disable nantes_scrib app by default (#85087)
2023-12-22 15:31:48 +01:00
Thomas NOËL 43e2d9222a jenkinsfile: use parallel for pytest/vitest/pylint (#85071)
2023-12-22 14:30:02 +01:00
Thomas NOËL d0ecf8af77 update translations
2023-12-22 12:21:00 +01:00
Thomas NOËL 31a0828d6d scrib: use abstract SOAP model (#84895)
2023-12-22 11:07:17 +01:00
Thomas NOËL 020a402a96 soap: create an AbstractSOAPConnector (#84895) 2023-12-22 11:07:17 +01:00
Nicolas Roche 0010095146 toulouse-maelis: detail get-recurrent-week errors (#84975)
2023-12-21 09:53:08 +01:00
Corentin Sechet b377b87d5d js: fix vitest version used (#84940)
2023-12-19 17:54:41 +01:00
Thomas NOËL 52886c216c update translations
2023-12-19 09:54:06 +01:00
Thomas NOËL b8fc9716a4 add contrib.nantes_scrib connector (#83360)
2023-12-19 09:46:11 +01:00
Thomas NOËL 40c3c6affb matrix42: add a generic object/action POST endpoint (#84636)
2023-12-18 16:43:47 +01:00
Nicolas Roche ec8dd0a43c toulouse-maelis: hide standard unit from main catalog items (#84760)
2023-12-18 10:08:19 +01:00
Nicolas Roche 5ad847df95 misc: remove copyright line from footer (#84811)
2023-12-15 17:48:02 +01:00
Nicolas Roche 8e6f61ceb7 toulouse-maelis: allow to link providing accents (#84370)
2023-12-15 10:47:45 +01:00
Nicolas Roche 13fe6411eb toulouse-maelis: do not apply distinct on activity referential (#84576)
2023-12-15 10:47:13 +01:00
Thomas NOËL 6101988404 soap: fix initial migration (#84761)
2023-12-15 10:40:32 +01:00
Serghei Mihai 8b3b8edfda api_particulier: increase api_key size (#84671)
2023-12-13 13:17:24 +01:00
Pierre Ducroquet 52981183ff jenkins: double test processes (#84680)
2023-12-13 11:14:47 +01:00
Benjamin Dauvergne afab9d49a1 astregs: fix typo in find_tiers_by_rib (#84512)
2023-12-09 09:31:31 +01:00
Nicolas Roche e592c33021 toulouse-maelis: manage fiscal number on RL (#84372)
2023-12-07 15:57:27 +01:00
Nicolas Roche ad4b9de490 toulouse-maelis: update Family WSDL (#84372) 2023-12-07 11:28:24 +01:00
Benjamin Dauvergne b4d637249a cmis: disable logging of requests errors (#84348)
2023-12-06 18:05:17 +01:00
Benjamin Dauvergne f15d802b11 cmis: improve reporting of ObjectNotfound and PermissionDenied (#84348) 2023-12-06 17:42:02 +01:00
Emmanuel Cazenave 2e167a3466 esup_signature: ensure compatibility with requests < 2.27 (#84043)
2023-11-30 10:47:12 +01:00
Frédéric Péters f513f2451d opengis: add support for wfs:FeatureCollection responses (#84053)
2023-11-29 10:34:57 +01:00
Frédéric Péters b9939892b8 opengis: include data from namespaced elements (#79982)
2023-11-27 13:41:47 +01:00
Benjamin Dauvergne 7de7cd8b3f cmis: produce more precise APIError on cmislib.PermissionDeniedException (#83682)
2023-11-27 10:31:35 +01:00
Benjamin Dauvergne c247197c6e cmis: search first existing parent folder from the end of the path (#83682) 2023-11-27 10:31:35 +01:00
Benjamin Dauvergne bfd1fcc2f6 translation update
2023-11-22 21:25:20 +01:00
Nicolas Roche 905e3b141f toulouse-maelis: hide invoices no more returned by maelis (#83676)
2023-11-22 21:02:16 +01:00
Nicolas Roche e8122d29eb toulouse-maelis: allow to link with RL2 data (#83842)
2023-11-22 19:03:53 +01:00
Serghei Mihai 4c5204bd2f mdel: cleanup demands folders (#83570)
2023-11-16 11:54:35 +01:00
Corentin Sechet 07619bc012 qrcode: fix fullscreen size on desktop & icons (#83517)
2023-11-15 13:19:18 +01:00
Corentin Sechet b516b7b66c qrcode: fix warnings on the reader page (#83525)
2023-11-15 13:19:09 +01:00
Corentin Sechet 92768f5852 qrcode: don't show error in console if camera torch isn't supported (#83516)
2023-11-15 13:18:58 +01:00
Corentin Sechet d69e4df328 toulouse-foederis: add degree level parameter to attach_degree endpoint (#83507)
2023-11-15 11:39:05 +01:00
Corentin Sechet 2711f5c615 toulouse-foederis: add count parameters when updating announces (#83507) 2023-11-15 11:39:05 +01:00
Benjamin Dauvergne cd08a2068c qrcode: make validity optional (#83464)
2023-11-14 20:29:46 +01:00
Benjamin Dauvergne 66e99362ef qrcode: factorize generation of b45 signed data (#83464) 2023-11-14 20:29:46 +01:00
Benjamin Dauvergne 00443f8629 qrcode: change lost signature key in JS tests (#83464) 2023-11-14 20:29:46 +01:00
Corentin Sechet 230d424571 isere-esrh: add job-types endpoint (#82880)
2023-11-14 16:36:56 +01:00
Corentin Sechet a8e2223c50 isere-esrh: add entities endpoint (#82880) 2023-11-14 16:36:56 +01:00
Corentin Sechet 7951510aa1 isere-esrh: add connector and official endpoint (#82880) 2023-11-14 16:36:56 +01:00
Thomas NOËL 08fa0fad21 soap: complete check status system (#83473)
2023-11-14 14:35:54 +01:00
Benjamin Dauvergne 320013ac68 qrcode: add creation and last modification datetime to data models (#83463)
2023-11-13 22:08:07 +01:00
Benjamin Dauvergne 4338ee9cd7 translation update
2023-11-13 18:42:38 +01:00
Corentin Sechet 32d3dd01bc qrcode: add frontend qrcode reader (#82652)
2023-11-13 16:57:06 +01:00
Corentin Sechet 7314fa224c js: add js unit tests support (#82651)
2023-11-13 16:19:51 +01:00
Corentin Sechet 82e9018865 qrcode: add qrcode reader management endpoints (#82650)
2023-11-13 15:31:35 +01:00
Corentin Sechet a51c49a865 qrcode: add get-qrcode endpoint (#82649)
2023-11-13 14:55:32 +01:00
Corentin Sechet 1e12dae71b qrcode: create qrcode connector & certificate management endpoints (#82648)
2023-11-13 14:34:31 +01:00
Thomas NOËL e506facfd6 debian: add back memory-report to uwsgi default configuration (#80451)
2023-11-13 11:34:46 +01:00
Thomas NOËL 0a28034137 astech: don't use bytes in APIError content (#83342)
2023-11-10 15:32:21 +01:00
Nicolas Roche 81f58cad59 toulouse-maelis: filter subscribable school years (#83262)
2023-11-10 12:11:28 +01:00
Emmanuel Cazenave 2a73e4dfb3 general: log Publik-Caller header (#83111)
2023-11-10 11:00:14 +01:00
Nicolas Roche 979e531b3a toulouse-maelis: correct for-payment parameter usage (#77110)
2023-11-10 10:41:34 +01:00
Nicolas Roche 5cd1e3aacc toulouse-maelis: get users linked to a family (#77295)
2023-11-10 10:38:06 +01:00
Nicolas Roche aa9585071a toulouse-maelis: manage online_payment without cache (#83257)
2023-11-10 10:37:35 +01:00
Nicolas Roche 923427783c toulouse-maelis: rename activity type label (#83165)
2023-11-10 10:12:43 +01:00
Nicolas Roche 34ac701200 toulouse-maelis: display default template values in manager (#83054)
2023-11-10 09:54:35 +01:00
Nicolas Roche 1b0c842d48 toulouse-maelis: manage applicative error on school registration (#82705)
2023-11-10 09:51:20 +01:00
Nicolas Roche d0f4b9ecf9 toulouse-maelis: [tests] add test on school registration error (#82705) 2023-11-10 09:51:20 +01:00
Nicolas Roche a7ff9bbc4a toulouse-maelis: put global catalog in cache (#82379)
2023-11-10 09:29:05 +01:00
Nicolas Roche b7b50717ca toulouse-maelis: disable connecteur by default (#83025)
2023-11-10 09:27:34 +01:00
Nicolas Roche 6c4fc4152d toulouse-maelis: remove unused ref_date on catalog endpoint (#82966)
2023-11-10 08:59:40 +01:00
Nicolas Roche e59765eaf7 toulouse-maelis: prevent creating invoice in concurency (#82706)
2023-11-10 08:43:17 +01:00
Thomas NOËL 8bb8f2c1df translation update
2023-11-03 15:33:33 +01:00
Thomas NOËL e2a45ea01b matrix42: rename search_filter to filter (#83112)
2023-11-03 15:06:41 +01:00
Thomas NOËL 11d3bd5a9b translation update
2023-11-03 13:39:27 +01:00
Thomas NOËL 2162e9d08d matrix42: use a pattern for ddname on fragment endpoint (#83105)
2023-11-03 13:12:47 +01:00
Thomas NOËL 264550e363 matrix42: add filter possibility on fragment endpoint (#83103)
2023-11-03 12:10:00 +01:00
Thomas NOËL ba58f183ed matrix42: do not log requests errors (#83101)
2023-11-03 11:38:02 +01:00
Thomas NOËL f564e71d5d translation update
2023-11-02 17:00:01 +01:00
Thomas NOËL 6f7acc1489 add matrix42 connector (#81490)
gitea/passerelle/pipeline/head This commit looks good Details
2023-11-02 15:59:03 +01:00
Serghei Mihai 62c0b91ac4 astech: don't log requests errors (#83034) 2023-11-02 12:16:23 +01:00
Emmanuel Cazenave 2b0842eb03 atal_rest: accept empty list in worksrequest-intervention-status (#83029) 2023-11-02 10:57:25 +01:00
Emmanuel Cazenave d315580294 translation update 2023-11-02 10:43:59 +01:00
Serghei Mihai 140863373f translation update 2023-10-31 16:03:12 +01:00
Serghei Mihai fa50ff9129 astech: add filters to column results (#82963) 2023-10-31 15:55:16 +01:00
Emmanuel Cazenave 0154defcce translation update 2023-10-31 15:04:24 +01:00
Emmanuel Cazenave f336d7a952 atal_rest: add worksrequest-intervention-status endpoint (#82948) 2023-10-31 14:58:31 +01:00
Thomas NOËL c148f6ae03 debian: add uwsgi/passerelle SyslogIdentifier in service (#82977) 2023-10-31 13:21:13 +01:00
Nicolas Roche bda1eba253 cron: add an every5min entry (#82961) 2023-10-30 18:02:50 +01:00
Emmanuel Cazenave 8892a97435 setup: compute pep440 compliant dirty version number (#81731) 2023-10-30 17:36:44 +01:00
Serghei Mihai 14a6fb1aed translations update 2023-10-19 10:11:49 +02:00
Benjamin Dauvergne f63e250e0d utils: add tools to execute actions outside of transactions (#31204)
First use of it is to create ResourceLog objects after the current
transaction commit or abort.
2023-10-19 09:59:10 +02:00
Serghei Mihai 4789f1e1ff astech: add endpoint to get view data (#82416) 2023-10-17 16:41:42 +02:00
Serghei Mihai 2bbc835787 astech: add endpoint to get view columns (#82416) 2023-10-16 15:34:15 +02:00
Serghei Mihai 94184d9c5e astech: add endpoint to list available views (#82416) 2023-10-16 15:33:38 +02:00
Corentin Sechet 76f3860ad2 toulouse-foederis: allow empty degree files (#82390) 2023-10-16 11:29:49 +02:00
Corentin Sechet 3d5ec0268c toulouse-foederis: don't send filename in Content-Disposition of form-data when posting attachments (#82346) 2023-10-16 10:59:18 +02:00
Nicolas Roche f2652bac36 toulouse-smart: do not crash on receiving string in place of block field (#79816) 2023-10-12 15:08:55 +02:00
Corentin Sechet c598673e3d translation update 2023-10-12 15:05:52 +02:00
Corentin Sechet fe1f40cc7d toulouse-foederis: remove type_emploi data source (#82294) 2023-10-12 14:56:48 +02:00
Corentin Sechet a3db9b1e35 toulouse-foederis: use multipart/form-data to attach files & diplomas (#82291) 2023-10-12 11:18:56 +02:00
Thomas NOËL 92f5b5f26b update translations 2023-10-03 10:19:19 +02:00
Frédéric Péters d6b87039cb ci: keep on using pylint 2 while pylint-django is not ready (#81905) 2023-10-03 08:15:23 +02:00
Corentin Sechet 9d67f8587a toulouse-foederis: allow empty phone in create-application (#81728) 2023-10-02 10:28:48 +02:00
Benjamin Dauvergne a9f2956db7 templatetags: rendering of $id/$ref in jsonschema (#81643) 2023-10-02 10:08:31 +02:00
Benjamin Dauvergne 8266740b52 soap: handle recursive complexType (#81643)
References to already converted complexTypes are converted to JSON schema
references.
2023-10-02 10:08:31 +02:00
Lauréline Guérin 117743e0a6 caluire-axel: fix family_date with a None first name (#81673) 2023-10-01 16:54:03 +02:00
Nicolas Roche 49226aca44 toulouse-maelis: add service filter on get_nursery_geojson (#81538) 2023-09-29 09:09:12 +02:00
Nicolas Roche bc62bdc3fd toulouse-maelis: [tests] return more nurseries on get_nursery_geojson (#81538) 2023-09-29 09:09:12 +02:00
Nicolas Roche 4bd7032998 toulouse-maelis: add service filter on read_nursery_list (#81538) 2023-09-29 09:09:12 +02:00
Nicolas Roche e1b3ab7646 toulouse-maelis: [tests] get last wsdls (#81538) 2023-09-29 09:09:12 +02:00
Frédéric Péters 7a671f7e74 misc: introduce setting to disable https checks for given hostnames (#81541) 2023-09-29 07:39:26 +02:00
Nicolas Roche 441ac49c58 toulouse-maelis: [tools] connector benchmark script (#81399) 2023-09-25 16:43:02 +02:00
Nicolas Roche bac28e933c toulouse-maelis: [tools] add tests to soap benchmark (#81399) 2023-09-25 16:43:02 +02:00
Benjamin Dauvergne b497988bf5 toulouse-maelis: [tools] read_family benchmark script (#81399) 2023-09-25 16:43:02 +02:00
Emmanuel Cazenave 898a14f821 atal_rest: accept empty files (#81518) 2023-09-25 14:38:33 +02:00
Emmanuel Cazenave ef0b518aba sne: fail silently on common sne errors (#81452) 2023-09-25 11:56:48 +02:00
Emmanuel Cazenave bf2610b4c5 sne: fail silently on incorrectly formatted demand_id (#81452) 2023-09-25 11:56:48 +02:00
Paul Marillonnet c56c0676de tests: provide tox4 compatibility (#81548) 2023-09-25 09:29:19 +02:00
Corentin Sechet 649c1c05a8 toulouse-foederis: send offer id in the correct field when applying (#81476) 2023-09-22 10:08:56 +02:00
Benjamin Dauvergne a192a953b9 toulouse_maelis: cache soap_client for 5 minutes (#81418) 2023-09-20 18:47:37 +02:00
Benjamin Dauvergne faf3e4692e base: add cache to soap_client method (#81418) 2023-09-20 18:47:37 +02:00
Nicolas Roche 60bcc9d82e toulouse-maelis: [tools] add pay_invoice script (#81398) 2023-09-20 14:39:41 +02:00
Corentin Sechet 0d6e180fef translation update 2023-09-19 18:02:09 +02:00
Corentin Sechet 02300e612e toulouse-foederis: make current_situation field of create-application optional (#81346) 2023-09-19 17:22:17 +02:00
Corentin Sechet 520e6a818b toulouse-foederis: send offer id and application type when creating application (#81346) 2023-09-19 17:22:17 +02:00
Corentin Sechet 199075ed80 sendethic: add send ethic sms connector (#81143) 2023-09-19 14:45:50 +02:00
Corentin Sechet 5fe9b9a1e6 sms: factorize OVH & SMS Factor low credit alert tests (#81143) 2023-09-19 14:43:38 +02:00
Corentin Sechet aed4c44107 sms: factorize OVH and SMSFactor connectors (#81143) 2023-09-19 14:43:38 +02:00
Corentin Sechet 62e452c31e toulouse-foederis: normalize phone number (#77528) 2023-09-19 13:44:44 +02:00
Nicolas Roche fd1c591ab3 toulouse-maelis: do not crash linking RL1 with no birth data (#80922) 2023-09-18 16:32:35 +02:00
Nicolas Roche 40287181cc toulouse-maelis: [functests] update only doctor (#2770) 2023-09-15 17:59:57 +02:00
Serghei Mihai 4ccaad6d35 adullact_pastell: do not log http errors (#80837) 2023-09-06 11:53:28 +02:00
Nicolas Roche 96b0777324 toulouse-maelis: restore clean-log on daily (#80885) 2023-09-05 11:52:27 +02:00
Serghei Mihai 9e64fa5c9b translations update 2023-09-04 16:21:07 +02:00
Serghei Mihai 2916fb7c32 adullact_pastell: add endpoint to run an action on document (#80835) 2023-09-04 16:20:41 +02:00
Emmanuel Cazenave e549db488f translation update 2023-08-29 18:24:49 +02:00
Emmanuel Cazenave 16fc487119 atal_rest: add state label (#80653) 2023-08-29 17:40:27 +02:00
Corentin Sechet 9d78d8fcf3 toulouse-foederis: fix querying 'emploi' datasource with parent (#79938) 2023-08-28 11:08:13 +02:00
Corentin Sechet 8cdf3dcae2 toulouse-foederis: add attach-degree endpoint (#80565) 2023-08-25 17:29:06 +02:00
Nicolas Roche 659ba18a00 toulouse-maelis: do not update basket removal date (#79253) 2023-08-25 15:21:15 +02:00
Nicolas Roche 7cbd27afd3 toulouse-maelis: check read-schools-for-address-and-level parameters (#79211) 2023-08-25 15:07:09 +02:00
Nicolas Roche 1a37984298 toulouse-maelis: log we got no invoice on basket validation (#79254) 2023-08-25 14:55:54 +02:00
Nicolas Roche a4805681a1 toulouse-maelis: accept indicator on subscription (#74978) 2023-08-25 14:45:57 +02:00
Nicolas Roche eacfb506d6 toulouse-maelis: [tests] rename a test (#74978) 2023-08-25 14:27:55 +02:00
Frédéric Péters 306cba2423 lille urban card: allow path in base URL (#80444) 2023-08-18 12:04:34 +02:00
Valentin Deniaud bcfc02d94a misc: update git-blame-ignore-revs to ignore quote changes (#79788) 2023-08-16 10:08:30 +02:00
Valentin Deniaud 40142de8d2 misc: apply double-quote-string-fixer (#79788) 2023-08-16 10:08:30 +02:00
Valentin Deniaud 6e7ac8c145 misc: add pre commit hook to force single quotes (#79788) 2023-08-16 10:08:27 +02:00
Lauréline Guérin 0a46addb73 api_particulier: add a message in known_errors (#80127) 2023-08-04 11:30:43 +02:00
Frédéric Péters a19073c7e9 general: add a timestamp to static URLs, to avoid caching issues (#80228) 2023-08-03 09:27:14 +02:00
Frédéric Péters 7178f7c4d0 misc: mark some basic data sources endpoint with datasource parameter (#79706) 2023-08-02 11:46:07 +02:00
Nicolas Roche fac61176a3 toulouse-maelis: [functests] update vaccins referential (#79979) 2023-07-28 18:25:27 +02:00
Lauréline Guérin 4f136ee898 signal_arretes: fix RequestException not raised by raise_for_status (#79958) 2023-07-24 15:22:04 +02:00
Nicolas Roche 06f22a03f8 toulouse-maelis: explicit RL can use perisco booking endpoints (#78890) 2023-07-21 09:59:35 +02:00
Frédéric Péters ab2f8a847b tcl: reorder schedules (#79548) 2023-07-21 09:59:29 +02:00
Thomas NOËL c38ee2913c debian: remove memory-report from uwsgi default configuration (#79890) 2023-07-20 17:59:45 +02:00
Corentin Sechet 100064eba8 toulouse-foederis: add count parameter to data source endpoints (#77526) 2023-07-20 12:12:18 +02:00
Corentin Sechet 5157fde445 toulouse-foederis: use a different endpoint for emploi data source (#79737) 2023-07-20 12:10:23 +02:00
Serghei Mihai 8fa7d79b1e translations update 2023-07-17 17:44:29 +02:00
Serghei Mihai c83228e375 adullact_pastell: add initial connector (#79105) 2023-07-17 17:44:29 +02:00
Nicolas Roche 3cee8e4350 toulouse-maelis: return an array with place ids (#79411) 2023-07-17 11:01:43 +02:00
Lauréline Guérin 2775202bd8 opengis: fix reverse endpoint with bad params (#79136) 2023-07-17 10:41:55 +02:00
Nicolas Roche 64b25c7c73 fixup! wcs: explicit wcs response on WcsApiError (#78967) 2023-07-17 09:58:54 +02:00
Nicolas Roche a8bab7fa01 wcs: explicit wcs response on WcsApiError (#78967) 2023-07-17 09:58:54 +02:00
Nicolas Roche 16fd6aae41 toulouse-maelis: stop triggering a removed wcs demand (#78967) 2023-07-17 09:58:54 +02:00
Nicolas Roche 56e2a4b1d9 toulouse-maelis: add building letters to complement referential (#79386) 2023-07-10 10:56:56 +02:00
Nicolas Roche 122a0f6c22 toulouse-maelis: [functests] book providing APE indicators (#79484) 2023-07-07 14:09:46 +02:00
Nicolas Roche 569159a95f toulouse-maelis: do not crash on tomcat unavailability (#74621) 2023-07-07 13:58:37 +02:00
Nicolas Roche 2b0612d5ef toulouse-maelis: do not warn about unknown axel indicator (#79338) 2023-07-07 13:45:02 +02:00
Nicolas Roche 25420ca260 toulouse-maelis: do not crash on unbounded subscribed activity (#79088) 2023-07-07 13:02:18 +02:00
Nicolas Roche bd388e42a6 solis-afi-mss: provide quantity to demand help WS (#79327) 2023-07-07 08:01:31 +02:00
Valentin Deniaud 4ce087b6ef translation update 2023-07-06 18:09:29 +02:00
Valentin Deniaud 75db45cdfa Revert "pastell: add initial connector (#79105)" (#79494)
This reverts commit ccb53be16e.
2023-07-06 18:09:00 +02:00
Valentin Deniaud 5cda735517 sms: update credit left in check_status (#79444) 2023-07-06 11:46:20 +02:00
Serghei Mihai ccb53be16e pastell: add initial connector (#79105) 2023-07-06 11:44:59 +02:00
Serghei Mihai 9626f03f34 arpege_ecp: allow filtering demands by status (#79400) 2023-07-05 11:01:02 +02:00
Emmanuel Cazenave de3d69e2d2 atal_rest: do not expect json when sending attachments (#79332) 2023-07-04 10:53:41 +02:00
Emmanuel Cazenave 84f1b2e728 atal_rest: ensure compatibility with requests < 2.27 (#79332) 2023-07-03 17:34:36 +02:00
Emmanuel Cazenave d199eb9de7 greco: use stored application code as default (#79320) 2023-07-03 15:36:25 +02:00
Nicolas Roche 9386398863 toulouse-maelis: do not crash reading a family with no birth data (#78027) 2023-06-30 11:12:51 +02:00
Lauréline Guérin f928a10fc5 toulouse-axel: fix possible days endpoint (#78903) 2023-06-29 18:59:11 +02:00
Nicolas Roche 0616f216bf toulouse-maelis: allow activity booking on civil year (#78579) 2023-06-29 14:00:55 +02:00
Benjamin Dauvergne 5d05b38653 pdf: make thumbnail url vary with PDF form file content (#78879) 2023-06-29 12:45:02 +02:00
Thomas NOËL 1afe1a8649 filr_rest: accept invalid email in share-folder endpoint (#79207) 2023-06-29 10:55:52 +02:00
Emmanuel Cazenave 34e0b6f8d8 translation update 2023-06-28 15:18:02 +02:00
Emmanuel Cazenave a16dc0c83a atal_rest: add multiple attachment endpoint (#79140) 2023-06-28 15:10:28 +02:00
Lauréline Guérin de81517bb4 ovh: try/except APIError on hourly job (#78939) 2023-06-28 12:03:25 +02:00
Emmanuel Cazenave 5ce8d34fa5 atal_rest: make referential endpoints compliant with data source format (#79166) 2023-06-28 10:49:53 +02:00
Nicolas Roche 0b8730b9ba toulouse-maelis: do not crash on cron when family is removed (#78929) 2023-06-27 17:58:05 +02:00
Emmanuel Cazenave 60d2277b55 translation update 2023-06-27 17:20:45 +02:00
Emmanuel Cazenave faebc78066 esup_signature: add workflows endpoint (#79131) 2023-06-27 17:14:05 +02:00
Emmanuel Cazenave 7212c9056d atal_rest: start connector (#78904) 2023-06-27 16:15:34 +02:00
Nicolas Roche 0e07b8fca7 tests: remove wcs service from tests (#78935) 2023-06-26 15:15:35 +02:00
Frédéric Péters f498f8f32a translation update (typography) 2023-06-25 10:56:17 +02:00
Frédéric Péters d59eaa8ab3 translation update (french orthography rectifications of 1990) 2023-06-25 09:52:24 +02:00
Frédéric Péters 85f6e24aab ci: build deb package for bookworm (#78968) 2023-06-23 18:06:44 +02:00
Nicolas Roche 9dee19e493 toulouse-maelis: accept only one subscription per demand (#78928) 2023-06-23 17:16:19 +02:00
Nicolas Roche 6238b21727 toulouse-maelis: refuse additional parameters on add-supplied-document (#78915) 2023-06-23 17:02:31 +02:00
Nicolas Roche 451cf508ce toulouse-maelis: change interval on catalog request (#78311) 2023-06-23 16:30:48 +02:00
Nicolas Roche 5a1046c7d2 toulouse-maelis: [functests] correct dates on catalog script (#78311) 2023-06-23 16:30:48 +02:00
Nicolas Roche 5e94eb86a3 toulouse-maelis: remove cache on activity catalog (#78141) 2023-06-23 16:01:41 +02:00
Nicolas Roche ad752230d6 toulouse-maelis: [tests] simplify test on activities without nature (#78141) 2023-06-23 16:01:41 +02:00
Nicolas Roche 16ab7c0a77 toulouse-maelis: correct typos on person descriptions (#78749) 2023-06-23 15:58:54 +02:00
Nicolas Roche 4d58d2c80b toulouse-maelis: [functests] test document validity (#78539) 2023-06-23 15:44:27 +02:00
Nicolas Roche 6147e497bc toulouse-maelis: add read-supplied-document-validity endpoint (#78539) 2023-06-23 15:43:15 +02:00
Nicolas Roche c22981630f toulouse-maelis: [tools] add readSuppliedDocumentValidity call (#78539) 2023-06-23 15:43:15 +02:00
Nicolas Roche 16c3bbd120 toulouse-maelis: [wsdl] get last family wsdl (#78539) 2023-06-23 15:43:15 +02:00
Nicolas Roche 37fcc2d65a toulouse-maelis: [functests] adding tests on capacities (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 1257eea8d2 toulouse-maelis: [functests] update invoice test on extrasco (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche f5dc0f4fb2 toulouse-maelis: [functests] update loisirs basket tests (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 3946028e53 toulouse-maelis: [functests] include ban zipcode into test (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 998e1c1208 toulouse-maelis: [functests] add subscriptions out from Toulouse (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche aa99a0d826 toulouse-maelis: [functests] re-enabling tests on extra-sco (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche e2047aa318 toulouse-maelis: [functests] re-enabling tests on loisirs (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 6546c7ac63 toulouse-maelis: [functests] add test for adultes on perisco (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 2083b70610 toulouse-maelis: [functests] re-enable test on pericso (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche a15c80765b toulouse-maelis: [functests] improve tests on scolaire (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche bcbe12679e toulouse-maelis: [functests] re-enabling tests on ape (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche ae5681b0e7 toulouse-maelis: [functests] flagCom correction (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 03cbdf578f toulouse-maelis: [functests] locate test family into Toulouse (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche e873bfaaa8 toulouse-maelis: [functests] update referentials (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 59989d562b toulouse-maelis: [functests] rename test families (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche aa250342da toulouse-maelis: [functests] complete tests on school subscription (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 59955c60db toulouse-maelis: [functests] add visa date to supplied documents (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche 72b3cc8a87 toulouse-maelis: [tools] correct tools (#77634) 2023-06-23 15:15:56 +02:00
Nicolas Roche ea1c2b34bb toulouse-maelis: [tools] set quantity on booking (#77634) 2023-06-23 15:15:56 +02:00
Lauréline Guérin 95ba4d5f0e api_particulier: add a message in known_errors (#78942) 2023-06-23 10:42:09 +02:00
Lauréline Guérin fb01b9a9ec mdel: fix status endpoint with incorrect zipfile (#78917) 2023-06-23 08:02:23 +02:00
Lauréline Guérin 468e5309a9 opengis: fix features endpoint with bad json response (#78901) 2023-06-22 15:55:31 +02:00
Thomas NOËL 002af7c243 update translations 2023-06-20 11:59:31 +02:00
Thomas NOËL f7739d1aa2 base_adresse: inform that zipcode parameter is rarely useful (#78719) 2023-06-19 14:06:42 +02:00
Nicolas Roche 4e5ec54b26 toulouse-maelis: automatically select required recurrent-week (#78028) 2023-06-15 10:41:08 +02:00
Nicolas Roche 61308407de toulouse-maelis: correct calendarGeneration values in tests (#78028) 2023-06-15 10:41:08 +02:00
Nicolas Roche ec1c4886fd toulouse-maelis: use text as key for service criteria in catalog (#78308) 2023-06-15 10:31:46 +02:00
Frédéric Péters 2618463abb plone rest api: do not send emails on token request errors (#78535) 2023-06-15 09:37:02 +02:00
Nicolas Roche c8fb63fe3e toulouse-maelis: do not crash on null recurrent_week (#78517) 2023-06-14 15:35:31 +02:00
Thomas NOËL 7a21a3e50c sne: use Debian version of cryptography (#78475) 2023-06-13 19:29:58 +02:00
Emmanuel Cazenave d479819f50 esup_signature: switch eppn and create_by_eppn parameters (#78405) 2023-06-12 22:37:05 +02:00
Emmanuel Cazenave 415e9f8a9e greco: send motifsrejet as None (#78264) 2023-06-12 15:21:01 +02:00
Serghei Mihai 166d58591a grenoble_gru: cleanup phone numbers (#78187) 2023-06-12 09:44:24 +02:00
Nicolas Roche 2cac256517 photon: validate geojson content received (#68414) 2023-06-09 09:36:25 +02:00
Benjamin Dauvergne 62ed945d62 soap: do not check wsdl url if certificates are changed (#77923)
The request wrapper needs a filepath to the certificates and Django
cannot provide one during the clean method.
2023-06-03 08:59:38 +02:00
Benjamin Dauvergne dcb772fdbd api_impot_particulier: make fields required (#77899)
API impot particulier cannot be called without values for oauth_scopes
and id_teleservice.
2023-06-03 08:26:49 +02:00
Benjamin Dauvergne 7959ec9a3c api_impot_particulier: fix api_url's default value in migrations (#77899) 2023-06-03 08:26:49 +02:00
Frédéric Péters 65d3f390f3 translation update 2023-06-01 19:17:07 +02:00
Emmanuel Cazenave 65409f2070 sne: start connector (#77933) 2023-06-01 16:39:57 +02:00
Nicolas Roche 3f69bdb447 toulouse-maelis: add new date parameters supply documents (#78074) 2023-06-01 15:13:19 +02:00
Nicolas Roche 9d0fc45957 toulouse-maelis: [tests] complete supplied document test (#78074) 2023-06-01 15:09:53 +02:00
Nicolas Roche 29ce646989 toulouse-maelis: [functest] add add_supplied_document script (#78074) 2023-06-01 15:09:49 +02:00
Frédéric Péters 9a892a0e77 general: remove no longer necessary perm='can_access' (#78041) 2023-05-31 16:22:00 +02:00
Frédéric Péters e4a9d16719 general: be explicit about open endpoints (#78041) 2023-05-31 16:21:20 +02:00
Corentin Sechet 29b8775a16 translation update 2023-05-31 14:36:39 +02:00
Nicolas Roche d176d9fc4b toulouse-maelis: add service criteria to catalog (#77084) 2023-05-31 10:35:59 +02:00
Nicolas Roche 8df0c9ec11 toulouse-maelis: accept an empty string as recurrent week (#78009) 2023-05-30 20:43:18 +02:00
Nicolas Roche d21669a250 toulouse-maelis: use dedicated soap input on recurrent week tests (#78009) 2023-05-30 20:43:18 +02:00
Emmanuel Cazenave b74e848dbd esup_signature: add a field to define HTTP headers (#78003) 2023-05-30 16:31:10 +02:00
Emmanuel Cazenave 0c06086585 esup_signature: add parameters to the 'new' endpoint (#77670) 2023-05-30 13:39:30 +02:00
Emmanuel Cazenave 6b74e9a632 esup_signature: send standard parameters through the query string (#77670) 2023-05-30 13:29:50 +02:00
Emmanuel Cazenave 7102c3150a esup_signature: add new-with-workflow endpoint (#77670) 2023-05-30 13:29:23 +02:00
Frédéric Péters 9ff69633a3 misc: fix display of connector kebab menus (#77979) 2023-05-29 09:59:44 +02:00
Frédéric Péters 6194728fb3 debian: apply new pre-commit-debian (#77727) 2023-05-27 21:49:08 +02:00
Frédéric Péters 1ab81c200b ci: upgrade pre-commit-debian (#77727) 2023-05-27 21:49:08 +02:00
Thomas NOËL 2277fcdd23 api_particulier: normalize birth dates (#77306) 2023-05-26 12:00:38 +02:00
Corentin Sechet 54dbbc3148 toulouse-foederis: send job_type, job_realm, job_family and job fields as list (#77776) 2023-05-26 08:51:30 +02:00
Frédéric Péters 6b432122d3 solis-apa: do not call gettext on French strings (#13108) 2023-05-26 07:40:30 +02:00
Thomas NOËL 3ba866a275 proxy: do not log requests errors (#77845) 2023-05-23 12:14:10 +02:00
Thomas Jund 3934030677 templates: move appbar extra-actions-menu inside actions block (#77797) 2023-05-22 17:27:11 +02:00
Corentin Sechet e2ce17f701 toulouse-foederis: add attach-file endpoint (#77527) 2023-05-19 12:22:08 +02:00
Corentin Sechet 7395fa5560 toulouse-foederis: add candidature endpoint (#77524) 2023-05-19 11:40:09 +02:00
Nicolas Roche 816da0f6b6 plone_restapi: use ok webservice to get service status (#77695) 2023-05-17 12:12:13 +02:00
Nicolas Roche be4a65b6be toulouse-maelis: trigger wcs when basket subscription is paid (#76398) 2023-05-17 11:31:32 +02:00
Nicolas Roche 347d09db89 toulouse-maelis: trigger wcs when basket subscription is cancelled (#76398) 2023-05-17 11:31:32 +02:00
Nicolas Roche 0276de78c2 toulouse-maelis: trigger wcs when basket subscription is removed (#76398) 2023-05-17 11:31:32 +02:00
Nicolas Roche 80fc536e2b toulouse-maelis: detect removed basket subscriptions (#76398) 2023-05-17 11:31:32 +02:00
Nicolas Roche 3dab63a0f3 toulouse-maelis: record wcs-demand with subscription ids (#76398) 2023-05-17 11:31:32 +02:00
Nicolas Roche bd9270a8ad toulouse-maelis: [tests] update soap content (#76398) 2023-05-17 11:31:32 +02:00
Emmanuel Cazenave 31efc19163 greco: send an empty string when a param is missing (#77681) 2023-05-17 10:51:30 +02:00
Lauréline Guérin 9a4f57612e astech: fix labels endpoint (#76998) 2023-05-16 22:28:28 +02:00
Lauréline Guérin e0ed5cc1c9 astech: fix companies endpoint (#76963) 2023-05-16 17:07:43 +02:00
Emmanuel Cazenave ab46f17856 translation update 2023-05-16 11:26:19 +02:00
Nicolas Roche 0b81087341 toulouse-maelis: remove flagCom input from connector (#77547) 2023-05-15 15:17:41 +02:00
Nicolas Roche f71891abd7 toulouse-maelis: [tests] get a libelle2 on subscription from readFamily (#77547) 2023-05-15 15:17:41 +02:00
Nicolas Roche 731148917e toulouse-maelis: [tests] get more data from town referential (#77547) 2023-05-15 15:17:41 +02:00
Nicolas Roche c413b5738f toulouse-maelis: [tests] update wsdls (#77547) 2023-05-15 15:17:41 +02:00
Emmanuel Cazenave fd09fb2fd7 esup_signature: start connector (#76994) 2023-05-15 14:38:33 +02:00
Benjamin Dauvergne 84cd51957e base_adresse: add parameter type=housenumber to prevent users from picking street addresses (#76376) 2023-05-15 10:36:37 +02:00
Nicolas Roche 8e215185ec toulouse-maelis: [functests] adapt tests to last build (#77362) 2023-05-15 09:40:15 +02:00
Nicolas Roche 4cdfeb47b0 toulouse-maelis: [functests] work on last test activities (#77362) 2023-05-15 09:40:15 +02:00
Nicolas Roche 896391c718 toulouse-maelis: [tools] add school referentials (#77362) 2023-05-15 09:40:15 +02:00
Nicolas Roche 86ac566bbb toulouse-maelis: return a quotients dict on RL (#77257) 2023-05-15 09:18:26 +02:00
Nicolas Roche 135cdbf46a toulouse-maelis: use libelle2 to display activities (#77291) 2023-05-12 18:06:16 +02:00
Thomas NOËL d87dd7b107 ldap: fix unclosed a tag (#77386) 2023-05-09 14:01:04 +02:00
Frédéric Péters 23480ce819 build: declare django 3.2 requirement (#77368) 2023-05-09 09:55:15 +02:00
Frédéric Péters 1bc79d7312 build: limit urllib3 version (#77360) 2023-05-09 07:50:19 +02:00
Nicolas Roche bdd68dc6e8 toulouse-maelis: do not crash on agenda if we get no type (#77116)
2023-05-03 12:31:46 +02:00
Serghei Mihai f3a7f7f460 translation update (#76631)
gitea/passerelle/pipeline/head This commit looks good Details
2023-05-03 11:38:27 +02:00
Serghei Mihai f704763565 cityweb: add expired files deletion (#76631) 2023-05-03 11:38:27 +02:00
Emmanuel Cazenave 822a0d83b4 atal: handle error when uploading big file (#76884)
gitea/passerelle/pipeline/head This commit looks good Details
2023-05-03 11:37:32 +02:00
Benjamin Dauvergne 0a6733f070 translation update
gitea/passerelle/pipeline/head This commit looks good Details
2023-05-03 11:19:17 +02:00
Benjamin Dauvergne 4d89c476bb api-impot-particulier: add housing tax endpoint (#77236)
gitea/passerelle/pipeline/head This commit looks good Details
2023-05-03 10:45:45 +02:00
Benjamin Dauvergne 96413bd5d9 api-impot-particulier: parametrize Accept header (#77236) 2023-05-03 10:45:20 +02:00
Benjamin Dauvergne 7defa59ccc translation update
gitea/passerelle/pipeline/head There was a failure building this commit Details
2023-05-03 08:33:42 +02:00
Benjamin Dauvergne 9f5927daa5 add new connector "api-impot-particulier" (#76668)
gitea/passerelle/pipeline/head This commit looks good Details
2023-05-02 16:38:22 +02:00
Benjamin Dauvergne a20835e118 dpark: convert soap errors to APIError (#76903)
gitea/passerelle/pipeline/head This commit looks good Details
2023-05-02 07:59:47 +02:00
Nicolas Roche fc9444cd98 toulouse-maelis: [functests] add test on school pre-booking (#76379)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-29 01:14:03 +02:00
Nicolas Roche 6f562d6e10 toujouse-maelis: [functests] add functests on ape module (#76381)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-28 18:26:40 +02:00
Nicolas Roche acd0ba843c toulouse-maelis: [functests] add test to pay an invoice (#76726)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-28 17:11:56 +02:00
Nicolas Roche 5662aa069d toulouse-maelis: accept None on conveyance selected place (#76736)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-28 07:54:21 +02:00
Nicolas Roche 06b640731f toulouse-maelis: detail cancelled status and display for_payment invoice (#76856)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-26 11:43:27 +02:00
Nicolas Roche 5243e328a4 toulouse-maelis: add a for_payment parameter to reduce cancellation delay (#76856) 2023-04-26 11:43:27 +02:00
Nicolas Roche bbc8e1cb5b toulouse-maelis: add a max_payment_delay connector parameter (#76855)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-21 14:41:15 +02:00
Nicolas Roche 1be01a198e toulouse-maelis: correct typo on cancel_invoice_delay verbose name (#76855) 2023-04-21 14:40:18 +02:00
Nicolas Roche c7d33287c5 toulouse-maelis: sent invoice cancellation order on get-baskets endpoint too (#76395)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-21 13:52:19 +02:00
Nicolas Roche 483268c636 toulouse-maelis: lock invoice while sending cancellation order (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche be6d3df42e toulouse-maelis: sent invoice cancellation order 20 minutes after expiration (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche db11bfcbb7 toulouse-maelis: pay invoice if cancellation order is not sent (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche 1d21ac784c toulouse-maelis: do not return cancelled invoice on invoice endpoint (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche 87f982e8a4 toulouse-maelis: [tests] use same family_id on basket tests (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche 18825c057e toulouse-maelis: [tests] assert cancelled invoice are not displayed (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche 36940933d9 toulouse-maelis: cancel invoices (#76395) 2023-04-21 13:52:19 +02:00
Nicolas Roche 9535c6c68f toulouse-maelis: [tests] get new basket data (#76836)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-21 13:02:02 +02:00
Nicolas Roche 2d208a9b96 toulouse-maelis: [tests] update regie referential (#76836) 2023-04-21 13:02:02 +02:00
Nicolas Roche d4c0214ac7 toulouse-maelis: [tests] correct regie code use for invoice payment (#76836) 2023-04-21 13:02:02 +02:00
Nicolas Roche 87b97a417b toulouse-maelis: correct invoice log diff order (#76828)
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-21 12:10:38 +02:00
Frédéric Péters 8e095b9dfa translation update
gitea/passerelle/pipeline/head This commit looks good Details
2023-04-20 18:40:24 +02:00
Thomas NOËL a75835584b solis_afi_mss: allow indexIndividus in declare-tax endpoint (#76820)
gitea/passerelle/pipeline/head There was a failure building this commit Details
2023-04-20 14:32:50 +02:00
479 changed files with 66271 additions and 8218 deletions


@@ -8,3 +8,5 @@ d2c0be039649febded68d9d04f745cd18b2b2e03
989fb5271967e8e87fd57837dd6d8cfe932e7ebe
# misc: apply djhtml (#69422)
6da81964bd91b5656364357ec06776fed3529c8a
# misc: apply double-quote-string-fixer (#79788)
40142de8d2d9885f7a57f4b0f5ab1a593e13aaca

.gitignore vendored

@@ -12,5 +12,7 @@ passerelle.egg-info/
coverage.xml
junit-py*.xml
.sass-cache/
passerelle/static/css/style.css
passerelle/static/css/style.css.map
passerelle/**/static/**/css/style.css
passerelle/**/static/**/css/style.css.map
node_modules/
coverage/


@@ -1,6 +1,10 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
hooks:
- id: double-quote-string-fixer
- repo: https://github.com/asottile/pyupgrade
rev: v3.3.1
hooks:
@@ -27,6 +31,6 @@ repos:
- id: djhtml
args: ['--tabwidth', '2']
- repo: https://git.entrouvert.org/pre-commit-debian.git
rev: v0.1
rev: v0.3
hooks:
- id: pre-commit-debian
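
Many of the quote-only changes in the test-file diffs further down are the mechanical output of the `double-quote-string-fixer` hook enabled above, which rewrites double-quoted string literals to single quotes when no escaping is needed. A minimal illustrative sketch of that rewrite (simplified: the real hook tokenizes Python source rather than pattern-matching, and this helper name is ours):

```python
import re

def prefer_single_quotes(token: str) -> str:
    # Rewrite "..." to '...' only when the contents contain no quotes or
    # backslashes; strings that would need escaping are left untouched.
    m = re.fullmatch(r'"([^"\'\\]*)"', token)
    return f"'{m.group(1)}'" if m else token

print(prefer_single_quotes('"--url"'))    # prints '--url'
print(prefer_single_quotes('"don\'t"'))   # left unchanged: "don't"
```

Both spellings denote the same string at runtime, which is why these diffs are behavior-neutral.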

Jenkinsfile vendored

@@ -11,19 +11,34 @@ pipeline {
RAND_TEST = "${Math.abs(new Random().nextInt(max+1))}"
}
stages {
stage('Unit Tests') {
steps {
sh "NUMPROCESSES=6 RAND_TEST=${env.RAND_TEST} tox -rv"
}
post {
always {
script {
utils = new Utils()
utils.publish_coverage('coverage.xml')
utils.publish_coverage_native('index.html')
utils.publish_pylint('pylint.out')
stage('Tests (in parallel)') {
failFast true
parallel {
stage('Unit Tests (pytest)') {
steps {
sh "NUMPROCESSES=12 RAND_TEST=${env.RAND_TEST} tox -rv"
}
post {
always {
script {
utils = new Utils()
utils.publish_coverage('coverage.xml')
utils.publish_coverage_native('index.html')
utils.publish_pylint('pylint.out')
}
mergeJunitResults()
}
}
}
stage('Unit Tests (vitest)') {
steps {
sh "NUMPROCESSES=12 RAND_TEST=${env.RAND_TEST} tox -rv -e vitest"
}
}
stage('Linter (pylint)') {
steps {
sh "NUMPROCESSES=12 RAND_TEST=${env.RAND_TEST} tox -rv -e pylint"
}
mergeJunitResults()
}
}
}
@@ -39,9 +54,9 @@ pipeline {
'''
).trim()
if (env.GIT_BRANCH == 'main' || env.GIT_BRANCH == 'origin/main') {
sh "sudo -H -u eobuilder /usr/local/bin/eobuilder -d bullseye ${SHORT_JOB_NAME}"
sh "sudo -H -u eobuilder /usr/local/bin/eobuilder -d bullseye,bookworm ${SHORT_JOB_NAME}"
} else if (env.GIT_BRANCH.startsWith('hotfix/')) {
sh "sudo -H -u eobuilder /usr/local/bin/eobuilder -d bullseye --branch ${env.GIT_BRANCH} --hotfix ${SHORT_JOB_NAME}"
sh "sudo -H -u eobuilder /usr/local/bin/eobuilder -d bullseye,bookworm --branch ${env.GIT_BRANCH} --hotfix ${SHORT_JOB_NAME}"
}
}
}

README

@@ -126,3 +126,18 @@ django-jsonresponse (https://github.com/jjay/django-jsonresponse)
# Files: passerelle/utils/jsonresponse.py
# Copyright (c) 2012 Yasha Borevich <j.borevich@gmail.com>
# Licensed under the BSD license
tweetnacl-js (https://github.com/dchest/tweetnacl-js)
# Files: passerelle/apps/qrcode/static/qrcode/js/nacl.min.js
# Copyright: https://github.com/dchest/tweetnacl-js/blob/master/AUTHORS.md
# Licensed under the Unlicense license (public domain)
zxing-browser (https://github.com/zxing-js/browser/)
# Files: passerelle/apps/qrcode/static/qrcode/js/zxing-browser.min.js
# Copyright: (c) 2018 ZXing for JS
# Licensed under the MIT license.
RemixIcon (https://github.com/Remix-Design/RemixIcon)
# Files: passerelle/apps/qrcode/static/qrcode/img/favicon.ico
# Copyright (c) 2020 RemixIcon.com
# Licensed under the Apache License Version 2.0

debian/control vendored

@@ -16,7 +16,9 @@ Architecture: all
Depends: ghostscript,
pdftk,
poppler-utils,
python3-caldav,
python3-cmislib,
python3-cryptography,
python3-dateutil,
python3-distutils,
python3-django (>= 2:3.2),
@@ -43,6 +45,7 @@ Depends: ghostscript,
python3-uwsgidecorators,
python3-vobject,
python3-xmlschema,
python3-xmltodict,
python3-zeep (>= 3.2),
${misc:Depends},
${python3:Depends},
@@ -60,8 +63,9 @@ Depends: adduser,
uwsgi,
uwsgi-plugin-python3,
${misc:Depends},
Recommends: memcached, nginx
Suggests: postgresql
Breaks: python-passerelle (<<5.75.post9)
Replaces: python-passerelle (<<5.75.post9)
Recommends: memcached,
nginx,
Suggests: postgresql,
Breaks: python-passerelle (<<5.75.post9),
Replaces: python-passerelle (<<5.75.post9),
Description: Uniform access to multiple data sources and services


@@ -4,6 +4,7 @@ After=network.target postgresql.service
Wants=postgresql.service
[Service]
SyslogIdentifier=uwsgi/%p
Environment=PASSERELLE_SETTINGS_FILE=/usr/lib/%p/debian_config.py
Environment=PASSERELLE_WSGI_TIMEOUT=120
Environment=PASSERELLE_WSGI_WORKERS=5

debian/uwsgi.ini vendored

@@ -18,6 +18,7 @@ spooler-python-import = passerelle.utils.spooler
spooler-max-tasks = 20
# every five minutes
unique-cron = -5 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants every5min
unique-cron = -5 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants availability
unique-cron = -5 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants jobs
# hourly

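For context on the cron lines above: uWSGI's `unique-cron` takes five schedule fields (minute, hour, day, month, weekday) before the command, where `-1` matches any value and `-N` means "every N", so `-5 -1 -1 -1 -1` fires every five minutes. A sketch of an hourly entry like the ones the `# hourly` comment introduces (the `hourly` command name here is hypothetical, not from the diff):

```ini
; minute hour day month weekday command
; 0 -1 -1 -1 -1 = at minute 0 of every hour
unique-cron = 0 -1 -1 -1 -1 /usr/bin/passerelle-manage tenant_command cron --all-tenants hourly
```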

@@ -2,23 +2,23 @@ import pytest
def pytest_addoption(parser):
parser.addoption("--url", help="Url of a passerelle Caluire Axel connector instance")
parser.addoption("--nameid", help="Publik Name ID")
parser.addoption("--firstname", help="first name of a user")
parser.addoption("--lastname", help="Last name of a user")
parser.addoption("--family", help="Family ID")
parser.addoption('--url', help='Url of a passerelle Caluire Axel connector instance')
parser.addoption('--nameid', help='Publik Name ID')
parser.addoption('--firstname', help='first name of a user')
parser.addoption('--lastname', help='Last name of a user')
parser.addoption('--family', help='Family ID')
@pytest.fixture(scope='session')
def conn(request):
return request.config.getoption("--url")
return request.config.getoption('--url')
@pytest.fixture(scope='session')
def user(request):
return {
'name_id': request.config.getoption("--nameid"),
'first_name': request.config.getoption("--firstname"),
'last_name': request.config.getoption("--lastname"),
'family': request.config.getoption("--family"),
'name_id': request.config.getoption('--nameid'),
'first_name': request.config.getoption('--firstname'),
'last_name': request.config.getoption('--lastname'),
'family': request.config.getoption('--family'),
}


@@ -12,7 +12,7 @@ def test_link(conn, user):
'NOM': user['last_name'],
'PRENOM': user['first_name'],
}
print("Creating link with the following payload:")
print('Creating link with the following payload:')
pprint.pprint(payload)
resp = requests.post(url, json=payload)
resp.raise_for_status()
@@ -21,7 +21,7 @@ def test_link(conn, user):
assert res['err'] == 0
print('\n')
print("GET family info")
print('GET family info')
url = conn + '/family_info?NameID=%s' % name_id
resp = requests.get(url)
resp.raise_for_status()
@@ -30,7 +30,7 @@ def test_link(conn, user):
assert data['err'] == 0
print('\n')
print("GET children info")
print('GET children info')
url = conn + '/children_info?NameID=%s' % (name_id)
resp = requests.get(url)
resp.raise_for_status()
@@ -40,7 +40,7 @@ def test_link(conn, user):
print('\n')
for child in data['data']['MEMBRE']:
print("GET child info")
print('GET child info')
url = conn + '/child_info?NameID=%s&idpersonne=%s' % (name_id, child['IDENT'])
resp = requests.get(url)
resp.raise_for_status()
@@ -49,7 +49,7 @@ def test_link(conn, user):
assert res['err'] == 0
print('\n')
print("and GET school info")
print('and GET school info')
url = conn + '/child_schooling_info?NameID=%s&idpersonne=%s&schooling_date=%s' % (
name_id,
child['IDENT'],
@@ -62,7 +62,7 @@ def test_link(conn, user):
assert res['err'] == 0
print('\n')
print("and GET activities info")
print('and GET activities info')
url = conn + '/child_activities_info?NameID=%s&idpersonne=%s&schooling_date=%s' % (
name_id,
child['IDENT'],
@@ -75,7 +75,7 @@ def test_link(conn, user):
assert res['err'] == 0
print('\n')
print("GET school list")
print('GET school list')
url = conn + '/school_list'
payload = {
'num': data['data']['RESPONSABLE1']['ADRESSE']['NORUE'],
@@ -92,7 +92,7 @@ def test_link(conn, user):
print('\n')
return
print("Deleting link")
print('Deleting link')
url = conn + '/unlink?NameID=%s' % name_id
resp = requests.post(url)
resp.raise_for_status()


@@ -5,25 +5,25 @@ import pytest
def pytest_addoption(parser):
parser.addoption("--cmis-connector-url", help="Url of a passerelle CMIS connector instance")
parser.addoption("--cmis-endpoint", help="Url of a passerelle CMIS endpoint")
parser.addoption("--cmis-username", help="Username for the CMIS endpoint")
parser.addoption("--cmis-password", help="Password for the CMIS endpoint")
parser.addoption("--preserve-tree", action="store_true", default=False, help="Preserve test directory")
parser.addoption('--cmis-connector-url', help='Url of a passerelle CMIS connector instance')
parser.addoption('--cmis-endpoint', help='Url of a passerelle CMIS endpoint')
parser.addoption('--cmis-username', help='Username for the CMIS endpoint')
parser.addoption('--cmis-password', help='Password for the CMIS endpoint')
parser.addoption('--preserve-tree', action='store_true', default=False, help='Preserve test directory')
@pytest.fixture(scope='session')
def cmisclient(request):
return cmislib.CmisClient(
request.config.getoption("--cmis-endpoint"),
request.config.getoption("--cmis-username"),
request.config.getoption("--cmis-password"),
request.config.getoption('--cmis-endpoint'),
request.config.getoption('--cmis-username'),
request.config.getoption('--cmis-password'),
)
@pytest.fixture(scope='session')
def cmis_connector(request):
return request.config.getoption("--cmis-connector-url")
return request.config.getoption('--cmis-connector-url')
@pytest.fixture(scope='session')
@@ -31,6 +31,6 @@ def cmis_tmpdir(cmisclient, request):
path = 'test-%s' % random.randint(0, 10000)
folder = cmisclient.defaultRepository.rootFolder.createFolder(path)
yield folder.properties['cmis:path']
preserve_tree = request.config.getoption("--preserve-tree")
preserve_tree = request.config.getoption('--preserve-tree')
if not preserve_tree:
folder.deleteTree()


@@ -10,7 +10,7 @@ SPECIAL_CHARS = '!#$%&+-^_`;[]{}+='
@pytest.mark.parametrize(
"path,file_name",
'path,file_name',
[
('', 'some.file'),
('/toto', 'some.file'),
@@ -31,8 +31,8 @@ def test_uploadfile(cmisclient, cmis_connector, cmis_tmpdir, tmpdir, monkeypatch
response = requests.post(
url,
json={
"path": cmis_tmpdir + path,
"file": {"content": file_b64_content, "filename": file_name, "content_type": "image/jpeg"},
'path': cmis_tmpdir + path,
'file': {'content': file_b64_content, 'filename': file_name, 'content_type': 'image/jpeg'},
},
)
assert response.status_code == 200
@@ -59,8 +59,8 @@ def test_uploadfile_conflict(cmisclient, cmis_connector, cmis_tmpdir, tmpdir, mo
response = requests.post(
url,
json={
"path": cmis_tmpdir + '/uploadconflict',
"file": {"content": file_b64_content, "filename": 'some.file', "content_type": "image/jpeg"},
'path': cmis_tmpdir + '/uploadconflict',
'file': {'content': file_b64_content, 'filename': 'some.file', 'content_type': 'image/jpeg'},
},
)
assert response.status_code == 200
@@ -70,11 +70,11 @@ def test_uploadfile_conflict(cmisclient, cmis_connector, cmis_tmpdir, tmpdir, mo
response = requests.post(
url,
json={
"path": cmis_tmpdir + '/uploadconflict',
"file": {"content": file_b64_content, "filename": 'some.file', "content_type": "image/jpeg"},
'path': cmis_tmpdir + '/uploadconflict',
'file': {'content': file_b64_content, 'filename': 'some.file', 'content_type': 'image/jpeg'},
},
)
assert response.status_code == 200
resp_data = response.json()
assert resp_data['err'] == 1
assert resp_data['err_desc'].startswith("update conflict")
assert resp_data['err_desc'].startswith('update conflict')
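
The uploadfile tests above POST the file as base64 text inside a JSON body. A sketch of building that payload outside the test suite (the helper name is ours, not passerelle's):

```python
import base64

def build_upload_payload(path, filename, raw_bytes, content_type='image/jpeg'):
    # Mirrors the JSON body the functests POST to the CMIS connector's
    # uploadfile endpoint: file content travels base64-encoded as text.
    return {
        'path': path,
        'file': {
            'content': base64.b64encode(raw_bytes).decode('ascii'),
            'filename': filename,
            'content_type': content_type,
        },
    }

payload = build_upload_payload('/test-1234/toto', 'some.file', b'hello')
print(payload['file']['content'])  # prints aGVsbG8=
```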


@@ -2,9 +2,9 @@ import pytest
def pytest_addoption(parser):
parser.addoption("--url", help="Url of a passerelle Planitech connector instance")
parser.addoption('--url', help='Url of a passerelle Planitech connector instance')
@pytest.fixture(scope='session')
def conn(request):
return request.config.getoption("--url")
return request.config.getoption('--url')


@@ -113,7 +113,7 @@ def test_main(conn):
def call_generic(conn, endpoint):
print("%s \n" % endpoint)
print('%s \n' % endpoint)
url = conn + '/%s' % endpoint
resp = requests.get(url)
resp.raise_for_status()


@@ -2,25 +2,25 @@ import pytest
def pytest_addoption(parser):
parser.addoption("--url", help="Url of a passerelle Toulouse Axel connector instance")
parser.addoption("--nameid", help="Publik Name ID")
parser.addoption("--firstname", help="first name of a user")
parser.addoption("--lastname", help="Last name of a user")
parser.addoption("--dob", help="Date of birth of a user")
parser.addoption("--dui", help="DUI number")
parser.addoption('--url', help='Url of a passerelle Toulouse Axel connector instance')
parser.addoption('--nameid', help='Publik Name ID')
parser.addoption('--firstname', help='first name of a user')
parser.addoption('--lastname', help='Last name of a user')
parser.addoption('--dob', help='Date of birth of a user')
parser.addoption('--dui', help='DUI number')
@pytest.fixture(scope='session')
def conn(request):
return request.config.getoption("--url")
return request.config.getoption('--url')
@pytest.fixture(scope='session')
def user(request):
return {
'name_id': request.config.getoption("--nameid"),
'first_name': request.config.getoption("--firstname"),
'last_name': request.config.getoption("--lastname"),
'dob': request.config.getoption("--dob"),
'dui': request.config.getoption("--dui"),
'name_id': request.config.getoption('--nameid'),
'first_name': request.config.getoption('--firstname'),
'last_name': request.config.getoption('--lastname'),
'dob': request.config.getoption('--dob'),
'dui': request.config.getoption('--dui'),
}


@@ -4,7 +4,7 @@ import requests
def test_link(conn, user):
print("Get update management dates")
print('Get update management dates')
url = conn + '/management_dates'
resp = requests.get(url)
resp.raise_for_status()
@@ -21,7 +21,7 @@ def test_link(conn, user):
'PRENOM': user['first_name'],
'NAISSANCE': user['dob'],
}
print("Creating link with the following payload:")
print('Creating link with the following payload:')
pprint.pprint(payload)
resp = requests.post(url, json=payload)
resp.raise_for_status()
@@ -30,7 +30,7 @@ def test_link(conn, user):
pprint.pprint(res)
print('\n')
print("GET family info")
print('GET family info')
url = conn + '/family_info?NameID=%s' % name_id
resp = requests.get(url)
resp.raise_for_status()
@@ -158,7 +158,7 @@ def test_link(conn, user):
for key in flags:
payload[key] = True
print("Update family info with the following payload:")
print('Update family info with the following payload:')
pprint.pprint(payload)
url = conn + '/update_family_info?NameID=%s' % name_id
resp = requests.post(url, json=payload)
@@ -168,7 +168,7 @@ def test_link(conn, user):
pprint.pprint(res)
print('\n')
print("GET children info")
print('GET children info')
url = conn + '/children_info?NameID=%s' % (name_id)
resp = requests.get(url)
resp.raise_for_status()
@@ -178,7 +178,7 @@ def test_link(conn, user):
print('\n')
for child in data['data']['ENFANT']:
print("GET child info")
print('GET child info')
url = conn + '/child_info?NameID=%s&idpersonne=%s' % (name_id, child['IDPERSONNE'])
resp = requests.get(url)
resp.raise_for_status()
@@ -187,7 +187,7 @@ def test_link(conn, user):
pprint.pprint(res)
print('\n')
print("GET child contact info")
print('GET child contact info')
url = conn + '/child_contacts_info?NameID=%s&idpersonne=%s' % (name_id, child['IDPERSONNE'])
resp = requests.get(url)
resp.raise_for_status()
@@ -196,7 +196,7 @@ def test_link(conn, user):
pprint.pprint(res)
print('\n')
print("Deleting link")
print('Deleting link')
url = conn + '/unlink?NameID=%s' % name_id
resp = requests.post(url)
resp.raise_for_status()


@@ -21,7 +21,7 @@ FAMILY_PAYLOAD = {
'rl1': {
'civility': 'MME',
'firstname': 'Marge',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'maidenName': 'Bouvier',
'quality': 'MERE',
'birth': {
@@ -32,14 +32,14 @@ FAMILY_PAYLOAD = {
'idStreet': '2317',
'num': '4',
'street1': 'requeried having idStreet provided',
'town': 'Springfield',
'zipcode': '62701',
'town': 'Toulouse',
'zipcode': '31400',
},
},
'rl2': {
'civility': 'MR',
'firstname': 'Homer',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'quality': 'PERE',
'birth': {
'dateBirth': '1956-05-12',
@@ -96,7 +96,7 @@ FAMILY_PAYLOAD = {
{
'sexe': 'M',
'firstname': 'Bart',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'birth': {
'dateBirth': '2014-04-01',
'place': 'Brive-la-Gaillarde',
@@ -133,11 +133,11 @@ FAMILY_PAYLOAD = {
'hospital': 'Springfield General Hospital',
'vaccinList': [
{
'code': '45',
'code': '8',
'vaccinationDate': '2011-01-11',
},
{
'code': '24',
'code': '1',
'vaccinationDate': '2022-02-22',
},
],
@@ -158,7 +158,7 @@ FAMILY_PAYLOAD = {
'personInfo': {
'civility': 'MR',
'firstname': 'Abraham Jebediah',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'dateBirth': '1927-05-24',
'sexe': 'M',
'contact': {
@@ -175,7 +175,7 @@ FAMILY_PAYLOAD = {
'personInfo': {
'civility': 'MME',
'firstname': 'Mona Penelope',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'dateBirth': '1929-03-15',
'sexe': 'F',
'contact': {
@@ -193,7 +193,7 @@ FAMILY_PAYLOAD = {
{
'sexe': 'F',
'firstname': 'Lisa',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'birth': {'dateBirth': '2016-05-09'},
'dietcode': 'MENU_SV',
'paiInfoBean': {
@@ -203,7 +203,7 @@ FAMILY_PAYLOAD = {
{
'sexe': 'F',
'firstname': 'Maggie',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'birth': {'dateBirth': '2018-12-17'},
'dietcode': 'MENU_PAI',
'paiInfoBean': {
@@ -213,7 +213,7 @@ FAMILY_PAYLOAD = {
{
'sexe': 'M',
'firstname': 'Hugo',
'lastname': 'Simpson',
'lastname': 'Test_Simpson',
'birth': {'dateBirth': '2018-04-01'},
'dietcode': 'MENU_AV',
'paiInfoBean': {
@@ -261,7 +261,10 @@ def pytest_addoption(parser):
parser.addoption('--nameid', help='Publik Name ID', default='functest')
parser.addoption('--dui', help='DUI number', default='')
parser.addoption(
'--lastname', help='override lastname to create a new "update" family', default='Simpson'
'--lastname', help='override lastname to create a new "update" family', default='Test_Simpson'
)
parser.addoption(
'--quick', action='store_true', help='do not reload referentials to speed-up tests', default=False
)
@@ -348,6 +351,7 @@ def remove_id_on_rlg(conn, rlg):
rlg['indicatorList'].sort(key=lambda x: x['code'])
rlg['quotientList'].sort(key=lambda x: (x['yearRev'], x['dateStart']))
del rlg['indicators'] # order may change
del rlg['quotients'] # order may change
rlg['subscribeActivityList'] = [] # not managed by test yet
del rlg['subscribe_natures'] # order may change
@@ -402,7 +406,10 @@ def conn(request):
@pytest.fixture(scope='session')
def referentials(conn):
def referentials(request, conn):
quick = request.config.getoption('--quick')
if quick:
return
url = urlparse.urlparse(conn)
slug = url.path.split('/')[2]
cmd = (
@@ -415,10 +422,10 @@ def referentials(conn):
@pytest.fixture(scope='session')
def create_data(request, conn):
def create_data(request, conn, reference_year):
name_id = request.config.getoption('--nameid')
unlink(conn, name_id)
lastname = 'EO_' + uuid4().hex[0:27]
lastname = 'TEST_' + uuid4().hex[0:25]
# create family
create_family_payload = copy.deepcopy(FAMILY_PAYLOAD)
@@ -442,6 +449,21 @@ def create_data(request, conn):
resp.raise_for_status()
create_result = resp.json()
assert create_result['err'] == 0
# add requiered quotient for subscriptions
data = read_family(conn, name_id)
url = conn + '/update-quotient?NameID=%s&rl_id=%s' % (name_id, data['RL1']['num'])
payload = {
'yearRev': str(reference_year),
'dateStart': '%s-09-01' % (reference_year),
'dateEnd': '3000-08-31',
'mtt': '5000.0',
'cdquo': '1',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
print('\ncreate DUI: %s' % str(create_result['data']['number']))
data = diff_family(conn, name_id, 'test_create_family.json')
@@ -458,6 +480,58 @@ def create_data(request, conn):
}
@pytest.fixture(scope='session')
def create_data2(request, conn, reference_year):
name_id = request.config.getoption('--nameid')
unlink(conn, name_id)
lastname = 'TEST_' + uuid4().hex[0:25]
# create family that is not located into Toulouse
create_family_payload = copy.deepcopy(FAMILY_PAYLOAD)
create_family_payload['rl1']['lastname'] = lastname
create_family_payload['rl1']['adresse'] = create_family_payload['rl2']['adresse']
create_family_payload['rl2']['adresse'] = copy.deepcopy(FAMILY_PAYLOAD['rl1']['adresse'])
for child in create_family_payload['childList']:
child['lastname'] = lastname
url = conn + '/create-family?NameID=%s' % name_id
resp = requests.post(url, json=create_family_payload)
resp.raise_for_status()
create_result = resp.json()
assert create_result['err'] == 0
# add requiered quotient for subscriptions
data = read_family(conn, name_id)
url = conn + '/update-quotient?NameID=%s&rl_id=%s' % (name_id, data['RL1']['num'])
payload = {
'yearRev': str(reference_year),
'dateStart': '2023-05-15',
'dateEnd': '3000-12-31',
'mtt': '5000.0',
'cdquo': '1',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
print('\ncreate DUI again: %s' % str(create_result['data']['number']))
data = diff_family(conn, name_id, 'test_create_family_out_town.json')
return {
'name_id': name_id, # linked
'family_id': str(create_result['data']['number']),
'family_payload': create_family_payload,
'lastname': lastname,
'rl1_num': data['RL1']['num'],
'rl2_num': data['RL2']['num'],
'bart_num': data['childList'][0]['num'],
'lisa_num': data['childList'][1]['num'],
'maggie_num': data['childList'][2]['num'],
'hugo_num': data['childList'][3]['num'],
'data': data,
}
@pytest.fixture(scope='session')
def update_data(request, conn):
name_id = request.config.getoption('--nameid')
@@ -616,22 +690,102 @@ def get_subscription_info(nature, activity_text, unit_text, place_text, con, nam
}
def get_loisirs_subscribe_info(con, data, year):
return get_subscription_info(
'LOISIRS',
# Sigec made this loisirs activity available for functests
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES',
'MERCREDI - 15h30/17h - 8/15Ans',
'ARGOULETS',
con,
data['name_id'],
data['bart_num'],
year,
)
def get_loisirs_subscribe_info3(con, data, year):
return get_subscription_info(
'LOISIRS',
# Sigec made this loisirs activity available for functests
'Vitrail Fusing 1/2 Je Adultes',
'Inscription annuelle',
'Centre Culturel ALBAN MINVILLE',
con,
data['name_id'],
data['bart_num'],
year,
)
def get_extrasco_subscribe_info(con, data, year):
return get_subscription_info(
'EXTRASCO',
# Sigec made this extra-sco activity available for functests
'ADL ELEMENTAIRE Maourine Juin',
'PUBLIK ADL ELEMENTAIRE Maourine JUIN 22/23(NE PAS UTILISER)',
'MAOURINE (la) ELEMENTAIRE',
con,
data['name_id'],
data['bart_num'],
year,
)
def get_extrasco_subscribe_info2(con, data, year):
return get_subscription_info(
'EXTRASCO',
# Sigec made this extra-sco activity available for functests
'ADL MATERNELLE Lardenne Juin',
'PUBLIK ADL MATER JOURNEE AVEC REPAS',
'LARDENNE MATERNELLE',
con,
data['name_id'],
data['bart_num'],
year,
)
@pytest.fixture(scope='session')
def loisirs_subscribe_info(conn, create_data, reference_year):
unlink(conn, create_data['name_id'])
link(conn, create_data)
return get_loisirs_subscribe_info(conn, create_data, reference_year)
@pytest.fixture(scope='session')
def loisirs_subscribe_info2(conn, create_data2, reference_year):
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
return get_loisirs_subscribe_info(conn, create_data2, reference_year)
@pytest.fixture(scope='session')
def loisirs_subscribe_info3(conn, create_data2, reference_year):
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
return get_loisirs_subscribe_info3(conn, create_data2, reference_year)
@pytest.fixture(scope='session')
def extrasco_subscribe_info(conn, create_data, reference_year):
unlink(conn, create_data['name_id'])
link(conn, create_data)
return get_extrasco_subscribe_info(conn, create_data, reference_year)
return get_subscription_info(
'EXTRASCO',
# Sigec made this extra-sco activity available for functests
'ADL ELEMENTAIRE Maourine Avril 2023',
'ADL ELEMENTAIRE Maourine Avril 2023',
'MAOURINE (la) ELEMENTAIRE',
conn,
create_data['name_id'],
create_data['bart_num'],
reference_year,
)
@pytest.fixture(scope='session')
def extrasco_subscribe_info2(conn, create_data, reference_year):
unlink(conn, create_data['name_id'])
link(conn, create_data)
return get_extrasco_subscribe_info2(conn, create_data, reference_year)
@pytest.fixture(scope='session')
def extrasco_subscribe_info3(conn, create_data2, reference_year):
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
return get_extrasco_subscribe_info2(conn, create_data2, reference_year)
@pytest.fixture(scope='session')
@@ -645,11 +799,32 @@ def perisco_subscribe_info(conn, create_data, reference_year):
return get_subscription_info(
None,
# Sigec made this peri-sco activity available for functests
'TEMPS DU MIDI 22/23',
'TEMPS DU MIDI 22/23',
'DOLTO FRANCOISE MATERNELLE',
'Temps du midi',
'TEST TEMPS DU MIDI 22/23',
'AMIDONNIERS ELEMENTAIRE',
conn,
create_data['name_id'],
create_data['bart_num'],
reference_year,
)
@pytest.fixture(scope='session')
def perisco_subscribe_adulte_info(conn, create_data2, reference_year):
'''This fixture relies on a configuration trick from Sigec,
as peri-sco activities should not be available for subscription
and, as a consequence, should not be displayed in catalogs'''
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
return get_subscription_info(
None,
# Sigec made this peri-sco activity available for functests
'RESTAURATION ADULTE',
'TEST RESTAURATION ADULTE 22/23',
'DOLTO FRANCOISE MATERNELLE',
conn,
create_data2['name_id'],
create_data2['bart_num'],
reference_year,
)
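Every fixture in this hunk repeats the same reset dance: unlink the Publik NameID from whatever family dossier it currently points at, relink it to the reference test family, then fetch the subscription info once per session. A minimal runnable sketch of that pattern, with `conn`, `unlink` and `link` stubbed out (the real helpers call the connector's HTTP endpoints against a live Maelis backend):

```python
# Stub stand-ins for the functest helpers: here conn is a plain dict so the
# unlink/link pattern itself is runnable without a backend.
def unlink(conn, name_id):
    """Detach the Publik NameID from any family dossier it is linked to."""
    conn.setdefault('links', {}).pop(name_id, None)


def link(conn, create_data):
    """Re-link the NameID to the reference test family."""
    conn.setdefault('links', {})[create_data['name_id']] = create_data['bart_num']


def fresh_link(conn, create_data):
    """The shared fixture body: always reset before fetching, so a leftover
    link from an earlier run cannot poison the session-scoped result."""
    unlink(conn, create_data['name_id'])
    link(conn, create_data)
    return conn['links'][create_data['name_id']]
```

With `scope='session'`, pytest caches the returned payload, so the unlink/link round-trip against the backend happens once per fixture rather than once per test.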

View File

@@ -7,6 +7,14 @@
"typeDesc": "NONE",
"isActive": false
},
{
"id": "AUTO_OUT",
"code": "AUTO_OUT",
"text": "Autorisation de sortie - CLAE",
"libelle": "Autorisation de sortie - CLAE",
"typeDesc": "NONE",
"isActive": false
},
{
"id": "AUTRE",
"code": "AUTRE",
@@ -16,6 +24,30 @@
"isActive": true,
"note": "rebellious"
},
{
"id": "AUT_OUTADL",
"code": "AUT_OUTADL",
"text": "Autorisation de sortie - ADL",
"libelle": "Autorisation de sortie - ADL",
"typeDesc": "NONE",
"isActive": false
},
{
"id": "AUT_SANT",
"code": "AUT_SANT",
"text": "J'autorise le responsable d'\u00e9tablissement \u00e0 prendre, en cas d'urgence des mesures rendues n\u00e9cessaires par l'\u00e9tat de sant\u00e9 de mon enfant",
"libelle": "J'autorise le responsable d'\u00e9tablissement \u00e0 prendre, en cas d'urgence des mesures rendues n\u00e9cessaires par l'\u00e9tat de sant\u00e9 de mon enfant",
"typeDesc": "NONE",
"isActive": false
},
{
"id": "AUT_TRANS",
"code": "AUT_TRANS",
"text": "J'autorise mon enfant \u00e0 prendre les transports de la collectivit\u00e9",
"libelle": "J'autorise mon enfant \u00e0 prendre les transports de la collectivit\u00e9",
"typeDesc": "NONE",
"isActive": false
},
{
"id": "AVL",
"code": "AVL",
@@ -27,8 +59,8 @@
{
"id": "AVS",
"code": "AVS",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"typeDesc": "NONE",
"isActive": false
},
@@ -41,6 +73,14 @@
"isActive": false,
"note": null
},
{
"id": "HPURG",
"code": "HPURG",
"text": "Hospitalisation / musures d'urgence",
"libelle": "Hospitalisation / musures d'urgence",
"typeDesc": "NONE",
"isActive": false
},
{
"id": "LENTILLE",
"code": "LENTILLE",

View File

@@ -27,8 +27,8 @@
"numComp": null,
"street1": "RUE ACHILLE VIADIEU",
"street2": null,
"town": "Springfield",
"zipcode": "62701",
"town": "Toulouse",
"zipcode": "31400",
"idStreet_text": "RUE ACHILLE VIADIEU"
},
"contact": {
@@ -42,7 +42,17 @@
"profession": null,
"CAFInfo": null,
"indicatorList": [],
"quotientList": [],
"quotientList": [
{
"yearRev": 2022,
"dateStart": "2022-09-01T00:00:00+02:00",
"dateEnd": "3000-08-31T00:00:00+02:00",
"mtt": 5000.0,
"cdquo": "1",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
}
],
"subscribeActivityList": [],
"civility_text": "MADAME",
"quality_text": "M\u00e8re"
@@ -62,7 +72,8 @@
"countryCode": null,
"cdDepartment": "19",
"communeCode_text": "BRIVE-LA-GAILLARDE",
"cdDepartment_text": "CORREZE"
"cdDepartment_text": "CORREZE",
"zipCode": "19100"
},
"dietcode": "MENU_AV",
"bPhoto": true,
@@ -71,7 +82,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "ABRAHAM JEBEDIAH",
"dateBirth": "1927-05-24T00:00:00+01:00",
"civility": "MR",
@@ -93,7 +104,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "MONA PENELOPE",
"dateBirth": "1929-03-15T00:00:00Z",
"civility": "MME",
@@ -149,13 +160,13 @@
"hospital": null,
"vaccinList": [
{
"code": "24",
"libelle": "IMOVAX OREILLONS",
"code": "1",
"libelle": "TETANOS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "45",
"libelle": "DT TETANOS COQ",
"code": "8",
"libelle": "DTPOLIO",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]

View File

@@ -0,0 +1,408 @@
{
"number": "N/A",
"category": "BI",
"situation": "MARI",
"flagCom": false,
"nbChild": 3,
"nbTotalChild": 4,
"nbAES": "1",
"RL1": {
"num": "N/A",
"firstname": "MARGE",
"lastname": "N/A",
"maidenName": "BOUVIER",
"quality": "MERE",
"civility": "MME",
"birth": {
"dateBirth": "1950-10-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": "404",
"cdDepartment": null,
"countryCode_text": "USA"
},
"adresse": {
"idStreet": null,
"num": 742,
"numComp": null,
"street1": "Evergreen Terrace",
"street2": null,
"town": "Springfield",
"zipcode": "90701"
},
"contact": {
"phone": null,
"mobile": null,
"mail": null,
"isContactMail": false,
"isContactSms": false,
"isInvoicePdf": false
},
"profession": null,
"CAFInfo": null,
"indicatorList": [],
"quotientList": [
{
"yearRev": 2022,
"dateStart": "2023-05-15T00:00:00+02:00",
"dateEnd": "3000-12-31T00:00:00+01:00",
"mtt": 5000.0,
"cdquo": "1",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
}
],
"subscribeActivityList": [],
"civility_text": "MADAME",
"quality_text": "M\u00e8re"
},
"RL2": {
"num": "N/A",
"firstname": "HOMER",
"lastname": "N/A",
"maidenName": null,
"quality": "PERE",
"civility": "MR",
"birth": {
"dateBirth": "1956-05-12T00:00:00+01:00",
"place": "Brive-la-Gaillarde",
"communeCode": "19031",
"countryCode": null,
"cdDepartment": "19",
"communeCode_text": "BRIVE-LA-GAILLARDE",
"cdDepartment_text": "CORREZE",
"zipCode": "19100"
},
"adresse": {
"idStreet": "2317",
"num": 4,
"numComp": null,
"street1": "RUE ACHILLE VIADIEU",
"street2": null,
"town": "Toulouse",
"zipcode": "31400",
"idStreet_text": "RUE ACHILLE VIADIEU"
},
"contact": {
"phone": "0122222222",
"mobile": "0622222222",
"mail": "homer.simpson@example.org.com",
"isContactMail": true,
"isContactSms": true,
"isInvoicePdf": true
},
"profession": {
"codeCSP": "46",
"profession": "Inspecteur de s\u00e9curit\u00e9",
"employerName": "Burns",
"phone": "0133333333",
"addressPro": {
"num": null,
"street": null,
"zipcode": "90701",
"town": "Springfield"
},
"situation": null,
"weeklyHours": null,
"codeCSP_text": "EMPLOYES"
},
"CAFInfo": {
"number": "123",
"organ": "GENE",
"organ_text": "CAF 31"
},
"indicatorList": [
{
"code": "AVL",
"libelle": "Auxiliaire de Vie loisirs",
"note": null,
"choice": null,
"code_text": "Auxiliaire de Vie loisirs"
},
{
"code": "ETABSPEC",
"libelle": "Etablissement sp\u00e9cialis\u00e9",
"note": "SNPP",
"choice": null,
"code_text": "Etablissement sp\u00e9cialis\u00e9"
}
],
"quotientList": [],
"subscribeActivityList": [],
"civility_text": "MONSIEUR",
"quality_text": "P\u00e8re"
},
"quotientList": [],
"childList": [
{
"num": "N/A",
"lastname": "N/A",
"firstname": "BART",
"sexe": "M",
"birth": {
"dateBirth": "2014-04-01T00:00:00+02:00",
"place": "Brive-la-Gaillarde",
"communeCode": "19031",
"countryCode": null,
"cdDepartment": "19",
"communeCode_text": "BRIVE-LA-GAILLARDE",
"cdDepartment_text": "CORREZE",
"zipCode": "19100"
},
"dietcode": "MENU_AV",
"bPhoto": true,
"bLeaveAlone": true,
"authorizedPersonList": [
{
"personInfo": {
"num": "N/A",
"lastname": "TEST_SIMPSON",
"firstname": "ABRAHAM JEBEDIAH",
"dateBirth": "1927-05-24T00:00:00+01:00",
"civility": "MR",
"sexe": "M",
"contact": {
"phone": "0312345678",
"mobile": null,
"mail": "abe.simpson@example.org"
},
"civility_text": "MONSIEUR",
"sexe_text": "Masculin"
},
"personQuality": {
"code": "13",
"libelle": "Famille",
"code_text": "Famille"
}
},
{
"personInfo": {
"num": "N/A",
"lastname": "TEST_SIMPSON",
"firstname": "MONA PENELOPE",
"dateBirth": "1929-03-15T00:00:00Z",
"civility": "MME",
"sexe": "F",
"contact": {
"phone": "0412345678",
"mobile": "0612345678",
"mail": "mona.simpson@example.org"
},
"civility_text": "MADAME",
"sexe_text": "F\u00e9minin"
},
"personQuality": {
"code": "13",
"libelle": "Famille",
"code_text": "Famille"
}
}
],
"indicatorList": [
{
"code": "AUTRE",
"libelle": "Autre",
"note": "rebellious",
"choice": null,
"code_text": "Autre"
},
{
"code": "LUNETTE",
"libelle": "Port de lunettes",
"note": null,
"choice": null,
"code_text": "Port de lunettes"
}
],
"medicalRecord": {
"familyDoctor": {
"name": "MONROE",
"phone": "0612341234",
"address": {
"street1": "Alameda",
"zipcode": "90701",
"town": "Springfield"
}
},
"allergy1": "butterscotch, imitation butterscotch, glow-in-the-dark monster make-up",
"allergy2": "shrimp and cauliflower",
"comment1": "the shrimp allergy isn't fully identified",
"comment2": null,
"observ1": "Ay Caramba!",
"observ2": "Eat my shorts!",
"isAuthHospital": false,
"hospital": null,
"vaccinList": [
{
"code": "24",
"libelle": "IMOVAX OREILLONS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "45",
"libelle": "DT TETANOS COQ",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]
},
"insurance": null,
"paiInfoBean": {
"code": "PAI_01",
"dateDeb": "2022-09-01T00:00:00+02:00",
"dateFin": "2023-07-01T00:00:00+02:00",
"description": "mischievous, rebellious, misunderstood, disruptive",
"code_text": "PAI Alimentaire Int\u00e9gral"
},
"mother": "N/A",
"father": "N/A",
"rl": null,
"subscribeSchoolList": [],
"subscribeActivityList": [],
"sexe_text": "Masculin",
"dietcode_text": "Avec viande"
},
{
"num": "N/A",
"lastname": "N/A",
"firstname": "LISA",
"sexe": "F",
"birth": {
"dateBirth": "2016-05-09T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_SV",
"bPhoto": false,
"bLeaveAlone": false,
"authorizedPersonList": [],
"indicatorList": [],
"medicalRecord": null,
"insurance": null,
"paiInfoBean": {
"code": "PAI_02",
"dateDeb": null,
"dateFin": null,
"description": null,
"code_text": "PAI Alimentaire Partiel"
},
"mother": "N/A",
"father": "N/A",
"rl": null,
"subscribeSchoolList": [],
"subscribeActivityList": [],
"sexe_text": "F\u00e9minin",
"dietcode_text": "Sans viande"
},
{
"num": "N/A",
"lastname": "N/A",
"firstname": "MAGGIE",
"sexe": "F",
"birth": {
"dateBirth": "2018-12-17T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_PAI",
"bPhoto": false,
"bLeaveAlone": false,
"authorizedPersonList": [],
"indicatorList": [],
"medicalRecord": null,
"insurance": null,
"paiInfoBean": {
"code": "PAI_02",
"dateDeb": null,
"dateFin": null,
"description": null,
"code_text": "PAI Alimentaire Partiel"
},
"mother": "N/A",
"father": "N/A",
"rl": null,
"subscribeSchoolList": [],
"subscribeActivityList": [],
"sexe_text": "F\u00e9minin",
"dietcode_text": "Panier PAI"
},
{
"num": "N/A",
"lastname": "N/A",
"firstname": "HUGO",
"sexe": "M",
"birth": {
"dateBirth": "2018-04-01T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": false,
"bLeaveAlone": false,
"authorizedPersonList": [],
"indicatorList": [],
"medicalRecord": null,
"insurance": null,
"paiInfoBean": {
"code": "PAI_01",
"dateDeb": null,
"dateFin": null,
"description": null,
"code_text": "PAI Alimentaire Int\u00e9gral"
},
"mother": "N/A",
"father": "N/A",
"rl": null,
"subscribeSchoolList": [],
"subscribeActivityList": [],
"sexe_text": "Masculin",
"dietcode_text": "Avec viande"
}
],
"emergencyPersonList": [
{
"numPerson": "N/A",
"civility": "MME",
"firstname": "PATTY",
"lastname": "BOUVIER",
"dateBirth": "1948-08-30T00:00:00+01:00",
"sexe": "F",
"quality": "13",
"contact": {
"phone": "0112345678",
"mobile": "0612345678",
"mail": "patty.bouvier@example.org"
},
"civility_text": "MADAME",
"quality_text": "Famille",
"sexe_text": "F\u00e9minin"
},
{
"numPerson": "N/A",
"civility": "MME",
"firstname": "SELMA",
"lastname": "BOUVIER",
"dateBirth": "1946-04-29T00:00:00+01:00",
"sexe": "F",
"quality": "13",
"contact": {
"phone": "0112345678",
"mobile": "0612345678",
"mail": "selma.bouvier@example.org"
},
"civility_text": "MADAME",
"quality_text": "Famille",
"sexe_text": "F\u00e9minin"
}
],
"indicatorList": [],
"childErrorList": [],
"category_text": "BIPARENTALE",
"situation_text": "MARIE(E)",
"family_id": "N/A"
}

View File

@@ -12,7 +12,8 @@
"countryCode": null,
"cdDepartment": "19",
"communeCode_text": "BRIVE-LA-GAILLARDE",
"cdDepartment_text": "CORREZE"
"cdDepartment_text": "CORREZE",
"zipCode": "19100"
},
"adresse": {
"idStreet": null,

View File

@@ -0,0 +1,125 @@
[
{
"id": "INDI_APE_ENF",
"text": "INDI_APE_ENF",
"level": "INDI_APE_ENF",
"indicatorList": [
{
"code": "APE_COMPO3",
"libelle": "CF-0/1 actif",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_HBOTH",
"libelle": "SP-handicap parent et fratrie",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_HPAR",
"libelle": "SP-handicap parents",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_MULTIACC",
"libelle": "CF-2 enfants \u00e0 accueillir",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_SITUP",
"libelle": "SP-situation particuli\u00e8re personne",
"typeDesc": "NONE",
"choiceList": []
}
]
},
{
"id": "INDI_APE_FAM",
"text": "INDI_APE_FAM",
"level": "INDI_APE_FAM",
"indicatorList": [
{
"code": "APE_COMPO2",
"libelle": "CF-1/2 actif",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_COMPO4",
"libelle": "CF-0/2 actif",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_FIRSTC",
"libelle": "CF-premier enfant",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_HAND",
"libelle": "H-handicap ou maladie chronique",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_NAIM",
"libelle": "CF-naissance multiple",
"typeDesc": "NONE",
"choiceList": []
}
]
},
{
"id": "INDI_APE_RES",
"text": "INDI_APE_RES",
"level": "INDI_APE_RES",
"indicatorList": [
{
"code": "APE_COMPO1",
"libelle": "CF-100% actif",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_FRAT",
"libelle": "CF-Fratrie d\u00e9j\u00e0 en accueil",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_KOFRAT",
"libelle": "CF-sans proposition pour une partie de la fratrie",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_HFRAT",
"libelle": "SP-handicap fratrie",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_SPLOG",
"libelle": "SP-situation particuli\u00e8re logement",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE_ALLO",
"libelle": "SP-accompagnement enfant allophone",
"typeDesc": "NONE",
"choiceList": []
},
{
"code": "APE-MINE",
"libelle": "SP-parent mineur",
"typeDesc": "NONE",
"choiceList": []
}
]
}
]

View File

@@ -1,9 +1,17 @@
[
{
"id": "AVS",
"code": "AVS",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"id": "AUT_OUTADL",
"code": "AUT_OUTADL",
"text": "Autorisation de sortie - ADL",
"libelle": "Autorisation de sortie - ADL",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AUTO_OUT",
"code": "AUTO_OUT",
"text": "Autorisation de sortie - CLAE",
"libelle": "Autorisation de sortie - CLAE",
"typeDesc": "NONE",
"choiceList": []
},
@@ -23,6 +31,14 @@
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AVS",
"code": "AVS",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "ETABSPEC",
"code": "ETABSPEC",
@@ -31,6 +47,30 @@
"typeDesc": "NOTE",
"choiceList": []
},
{
"id": "HPURG",
"code": "HPURG",
"text": "Hospitalisation / musures d'urgence",
"libelle": "Hospitalisation / musures d'urgence",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AUT_SANT",
"code": "AUT_SANT",
"text": "J'autorise le responsable d'\u00e9tablissement \u00e0 prendre, en cas d'urgence des mesures rendues n\u00e9cessaires par l'\u00e9tat de sant\u00e9 de mon enfant",
"libelle": "J'autorise le responsable d'\u00e9tablissement \u00e0 prendre, en cas d'urgence des mesures rendues n\u00e9cessaires par l'\u00e9tat de sant\u00e9 de mon enfant",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AUT_TRANS",
"code": "AUT_TRANS",
"text": "J'autorise mon enfant \u00e0 prendre les transports de la collectivit\u00e9",
"libelle": "J'autorise mon enfant \u00e0 prendre les transports de la collectivit\u00e9",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "MDPH",
"code": "MDPH",

View File

@@ -1,4 +1,10 @@
[
{
"id": "MORAL",
"code": "MORAL",
"text": "",
"libelle": null
},
{
"id": "MME",
"code": "MME",
@@ -10,11 +16,5 @@
"code": "MR",
"text": "MONSIEUR",
"libelle": "MONSIEUR"
},
{
"id": "MORAL",
"code": "MORAL",
"text": "MORAL",
"libelle": "MORAL"
}
]

View File

@@ -1,4 +1,11 @@
[
{
"id": "87",
"code": "87",
"rang": "PERSON",
"text": "Acte de d\u00e9c\u00e8s",
"libelle": "Acte de d\u00e9c\u00e8s"
},
{
"id": "43",
"code": "43",
@@ -188,6 +195,13 @@
"text": "Certificat de scolarit\u00e9",
"libelle": "Certificat de scolarit\u00e9"
},
{
"id": "93",
"code": "93",
"rang": "PERSON",
"text": "Certificat de travail",
"libelle": "Certificat de travail"
},
{
"id": "74",
"code": "74",

View File

@@ -0,0 +1,26 @@
[
{
"id": "05DERO-8",
"code": "05DERO-8",
"text": "DERO05 - SANTE",
"libelle": "DERO05 - SANTE"
},
{
"id": "05DERO-6",
"code": "05DERO-6",
"text": "DERO05 - SANTE : SANTE / ORGANISATION",
"libelle": "DERO05 - SANTE : SANTE / ORGANISATION"
},
{
"id": "10DERO-2",
"code": "10DERO-2",
"text": "DERO10 - ORGANISATION",
"libelle": "DERO10 - ORGANISATION"
},
{
"id": "11DERO-1",
"code": "11DERO-1",
"text": "DERO11 - AUTRE",
"libelle": "DERO11 - AUTRE"
}
]

View File

@@ -0,0 +1,56 @@
[
{
"id": 102,
"code": 102,
"text": "CANTINE / CLAE",
"libelle": "CANTINE / CLAE"
},
{
"id": 103,
"code": 103,
"text": "CCAS",
"libelle": "CCAS"
},
{
"id": 101,
"code": 101,
"text": "DASC",
"libelle": "DASC"
},
{
"id": 104,
"code": 104,
"text": "DSCS",
"libelle": "DSCS"
},
{
"id": 105,
"code": 105,
"text": "ENFANCE LOISIRS",
"libelle": "ENFANCE LOISIRS"
},
{
"id": 106,
"code": 106,
"text": "PARCOURS EDUCATIFS",
"libelle": "PARCOURS EDUCATIFS"
},
{
"id": 107,
"code": 107,
"text": "REMBOURSEMENT",
"libelle": "REMBOURSEMENT"
},
{
"id": 108,
"code": 108,
"text": "SENIORS",
"libelle": "SENIORS"
},
{
"id": 109,
"code": 109,
"text": "SPORT",
"libelle": "SPORT"
}
]

View File

@@ -1,12 +1,4 @@
[
{
"id": "AVS",
"code": "AVS",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AVL",
"code": "AVL",
@@ -15,6 +7,14 @@
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AVS",
"code": "AVS",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "ETABSPEC",
"code": "ETABSPEC",

View File

@@ -0,0 +1,92 @@
[
{
"id": "CE1",
"age": 7,
"code": "CE1",
"text": "Cours \u00e9l\u00e9mentaire 1",
"nature": null,
"libelle": "Cours \u00e9l\u00e9mentaire 1",
"numOrder": "6",
"nextLevelCode": "CE2"
},
{
"id": "CE2",
"age": 8,
"code": "CE2",
"text": "Cours \u00e9l\u00e9mentaire 2",
"nature": null,
"libelle": "Cours \u00e9l\u00e9mentaire 2",
"numOrder": "7",
"nextLevelCode": "CM1"
},
{
"id": "CM1",
"age": 9,
"code": "CM1",
"text": "Cours moyen 1",
"nature": null,
"libelle": "Cours moyen 1",
"numOrder": "8",
"nextLevelCode": "CM2"
},
{
"id": "CM2",
"age": 10,
"code": "CM2",
"text": "Cours moyen 2",
"nature": null,
"libelle": "Cours moyen 2",
"numOrder": "9",
"nextLevelCode": null
},
{
"id": "CP",
"age": 6,
"code": "CP",
"text": "Cours pr\u00e9paratoire",
"nature": null,
"libelle": "Cours pr\u00e9paratoire",
"numOrder": "5",
"nextLevelCode": "CE1"
},
{
"id": "GS",
"age": 5,
"code": "GS",
"text": "Section grand",
"nature": null,
"libelle": "Section grand",
"numOrder": "4",
"nextLevelCode": "CP"
},
{
"id": "MS",
"age": 4,
"code": "MS",
"text": "Section moyen",
"nature": null,
"libelle": "Section moyen",
"numOrder": "3",
"nextLevelCode": "GS"
},
{
"id": "PS",
"age": 3,
"code": "PS",
"text": "Section petit",
"nature": null,
"libelle": "Section petit",
"numOrder": "2",
"nextLevelCode": "MS"
},
{
"id": "TPS",
"age": 2,
"code": "TPS",
"text": "Section tout petit",
"nature": null,
"libelle": "Section tout petit",
"numOrder": "1",
"nextLevelCode": "PS"
}
]
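The `nextLevelCode` fields above chain the school levels from TPS up to CM2, where the link is `null`; a consumer can recover the progression order by walking that chain. A sketch, assuming only the `code`/`nextLevelCode` fields shown in the referential above:

```python
def level_chain(levels, start='TPS'):
    """Walk the nextLevelCode links of a school-levels referential to
    recover the ordered progression, stopping at the level whose next
    code is None."""
    by_code = {lvl['code']: lvl for lvl in levels}
    chain, code = [], start
    while code is not None:
        chain.append(code)
        code = by_code[code]['nextLevelCode']
    return chain
```

On the full referential this yields TPS, PS, MS, GS, CP, CE1, CE2, CM1, CM2, matching the `numOrder` values 1 through 9.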

View File

@@ -0,0 +1,20 @@
[
{
"id": 2022,
"text": "2022",
"schoolYear": 2022,
"dateEndYearSchool": "2023-07-07T00:00:00+02:00",
"dateStartYearSchool": "2022-09-01T00:00:00+02:00",
"dateEndSubscribeSchool": "2023-09-01T00:00:00+02:00",
"dateStartSubscribeSchool": "2022-09-01T00:00:00+02:00"
},
{
"id": 2023,
"text": "2023",
"schoolYear": 2023,
"dateEndYearSchool": "2024-07-07T00:00:00+02:00",
"dateStartYearSchool": "2023-09-04T00:00:00+02:00",
"dateEndSubscribeSchool": "2023-09-01T00:00:00+02:00",
"dateStartSubscribeSchool": "2022-09-01T00:00:00+02:00"
}
]
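Note that both school-year entries above share the same subscription window even though the year bounds differ; whether subscription is open at a given instant can be derived from the window fields alone. A sketch (field names as in the JSON above, which carries timezone-aware ISO dates):

```python
from datetime import datetime


def subscription_open(entry, when):
    """True if `when` falls inside the [dateStartSubscribeSchool,
    dateEndSubscribeSchool] window of a school-year referential entry."""
    start = datetime.fromisoformat(entry['dateStartSubscribeSchool'])
    end = datetime.fromisoformat(entry['dateEndSubscribeSchool'])
    return start <= when <= end
```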

View File

@@ -1,33 +1,9 @@
[
{
"id": "105",
"code": "105",
"text": "AUTRE",
"libelle": "AUTRE"
},
{
"id": "30",
"code": "30",
"text": "B.C.G.",
"libelle": "B.C.G."
},
{
"id": "56",
"code": "56",
"text": "BOOSTRIX",
"libelle": "BOOSTRIX"
},
{
"id": "27",
"code": "27",
"text": "CHOLERA",
"libelle": "CHOLERA"
},
{
"id": "48",
"code": "48",
"text": "Contr\u00f4le B.C.G.",
"libelle": "Contr\u00f4le B.C.G."
"text": "BCG",
"libelle": "BCG"
},
{
"id": "3",
@@ -41,107 +17,17 @@
"text": "DIPHTERIE",
"libelle": "DIPHTERIE"
},
{
"id": "6",
"code": "6",
"text": "DIPHTERIE TETANOS",
"libelle": "DIPHTERIE TETANOS"
},
{
"id": "9",
"code": "9",
"text": "DIPHT TETANOS COQ",
"libelle": "DIPHT TETANOS COQ"
},
{
"id": "19",
"code": "19",
"text": "DT BISRUDIVAX",
"libelle": "DT BISRUDIVAX"
},
{
"id": "10",
"code": "10",
"text": "DT COQ POLIO",
"libelle": "DT COQ POLIO"
},
{
"id": "13",
"code": "13",
"text": "DT COQ POLIO IPAD",
"libelle": "DT COQ POLIO IPAD"
},
{
"id": "8",
"code": "8",
"text": "DT POLIO",
"libelle": "DT POLIO"
},
{
"id": "45",
"code": "45",
"text": "DT TETANOS COQ",
"libelle": "DT TETANOS COQ"
},
{
"id": "11",
"code": "11",
"text": "DT TYPHOIDE",
"libelle": "DT TYPHOIDE"
},
{
"id": "129",
"code": "129",
"text": "ENGERIX",
"libelle": "ENGERIX"
},
{
"id": "26",
"code": "26",
"text": "FIEVRE JAUNE",
"libelle": "FIEVRE JAUNE"
},
{
"id": "4",
"code": "4",
"text": "F.TYPHOIDES",
"libelle": "F.TYPHOIDES"
},
{
"id": "144",
"code": "144",
"text": "GRIPPE",
"libelle": "GRIPPE"
},
{
"id": "143",
"code": "143",
"text": "HAEMOPHILUS HIB",
"libelle": "HAEMOPHILUS HIB"
},
{
"id": "17",
"code": "17",
"text": "HAVRIX",
"libelle": "HAVRIX"
"text": "DTPOLIO",
"libelle": "DTPOLIO"
},
{
"id": "29",
"code": "29",
"text": "HEPATITE B",
"libelle": "HEPATITE B"
},
{
"id": "146",
"code": "146",
"text": "HEXAXIM",
"libelle": "HEXAXIM"
},
{
"id": "59",
"code": "59",
"text": "HEXYON",
"libelle": "HEXYON"
"text": "HEPATITEB",
"libelle": "HEPATITEB"
},
{
"id": "16",
@@ -150,226 +36,28 @@
"libelle": "HIB"
},
{
"id": "24",
"code": "24",
"text": "IMOVAX OREILLONS",
"libelle": "IMOVAX OREILLONS"
"id": "152",
"code": "152",
"text": "IIP",
"libelle": "IIP"
},
{
"id": "121",
"code": "121",
"text": "INFANRIX",
"libelle": "INFANRIX"
"id": "151",
"code": "151",
"text": "MENINGOCOQUE",
"libelle": "MENINGOCOQUE"
},
{
"id": "52",
"code": "52",
"text": "INFANRIX HEXA",
"libelle": "INFANRIX HEXA"
},
{
"id": "32",
"code": "32",
"text": "INFANRIX POLIO",
"libelle": "INFANRIX POLIO"
},
{
"id": "33",
"code": "33",
"text": "INFANRIX POLIO HIB",
"libelle": "INFANRIX POLIO HIB"
},
{
"id": "51",
"code": "51",
"text": "INFANRIX QUINTA",
"libelle": "INFANRIX QUINTA"
},
{
"id": "55",
"code": "55",
"text": "INFANRIX TETRA",
"libelle": "INFANRIX TETRA"
},
{
"id": "147",
"code": "147",
"text": "INFLUVAC TETRA",
"libelle": "INFLUVAC TETRA"
},
{
"id": "137",
"code": "137",
"text": "INNUGRIP",
"libelle": "INNUGRIP"
},
{
"id": "18",
"code": "18",
"text": "LEPTOSPIROSE",
"libelle": "LEPTOSPIROSE"
},
{
"id": "22",
"code": "22",
"text": "MENINGITE",
"libelle": "MENINGITE"
},
{
"id": "130",
"code": "130",
"text": "MENINGITEC",
"libelle": "MENINGITEC"
},
{
"id": "123",
"code": "123",
"text": "MENINVAC",
"libelle": "MENINVAC"
},
{
"id": "120",
"code": "120",
"text": "MENINVACT",
"libelle": "MENINVACT"
},
{
"id": "139",
"code": "139",
"text": "MENJUGATE",
"libelle": "MENJUGATE"
},
{
"id": "149",
"code": "149",
"text": "M-M RVAX PRO",
"libelle": "M-M RVAX PRO"
},
{
"id": "133",
"code": "133",
"text": "MONOTEST",
"libelle": "MONOTEST"
},
{
"id": "124",
"code": "124",
"text": "MONOVAX",
"libelle": "MONOVAX"
},
{
"id": "132",
"code": "132",
"text": "NEISVAC",
"libelle": "NEISVAC"
},
{
"id": "110",
"code": "110",
"text": "OTITE",
"libelle": "OTITE"
},
{
"id": "134",
"code": "134",
"text": "PANENZA",
"libelle": "PANENZA"
},
{
"id": "31",
"code": "31",
"text": "PENTACOQ",
"libelle": "PENTACOQ"
},
{
"id": "53",
"code": "53",
"text": "PENTAVAC",
"libelle": "PENTAVAC"
},
{
"id": "2",
"code": "2",
"text": "POLIOMYELITE",
"libelle": "POLIOMYELITE"
},
{
"id": "128",
"code": "128",
"text": "PREVENAR",
"libelle": "PREVENAR"
},
{
"id": "125",
"code": "125",
"text": "PRIORIX",
"libelle": "PRIORIX"
},
{
"id": "54",
"code": "54",
"text": "REPEVAX",
"libelle": "REPEVAX"
},
{
"id": "47",
"code": "47",
"text": "REVAXIS",
"libelle": "REVAXIS"
"id": "150",
"code": "150",
"text": "POLIO",
"libelle": "POLIO"
},
{
"id": "28",
"code": "28",
"text": "R O R",
"libelle": "R O R"
},
{
"id": "127",
"code": "127",
"text": "ROR VAX",
"libelle": "ROR VAX"
},
{
"id": "135",
"code": "135",
"text": "ROTARIX",
"libelle": "ROTARIX"
},
{
"id": "20",
"code": "20",
"text": "ROUVAX",
"libelle": "ROUVAX"
},
{
"id": "23",
"code": "23",
"text": "RUDI ROUVAX",
"libelle": "RUDI ROUVAX"
},
{
"id": "21",
"code": "21",
"text": "RUDIVAX",
"libelle": "RUDIVAX"
},
{
"id": "113",
"code": "113",
"text": "SCARLATINE",
"libelle": "SCARLATINE"
},
{
"id": "14",
"code": "14",
"text": "SERUM ANTI-TETANIQUE",
"libelle": "SERUM ANTI-TETANIQUE"
},
{
"id": "141",
"code": "141",
"text": "SYNAGIS",
"libelle": "SYNAGIS"
"text": "ROR",
"libelle": "ROR"
},
{
"id": "1",
@@ -377,46 +65,10 @@
"text": "TETANOS",
"libelle": "TETANOS"
},
{
"id": "7",
"code": "7",
"text": "TETANOS POLIO",
"libelle": "TETANOS POLIO"
},
{
"id": "12",
"code": "12",
"text": "TETRA COQ",
"libelle": "TETRA COQ"
},
{
"id": "46",
"code": "46",
"text": "TETRAVAC ACELLULAIRE",
"libelle": "TETRAVAC ACELLULAIRE"
},
{
"id": "107",
"code": "107",
"text": "VARICELLE",
"libelle": "VARICELLE"
},
{
"id": "15",
"code": "15",
"text": "VARIOLE",
"libelle": "VARIOLE"
},
{
"id": "34",
"code": "34",
"text": "VAXELIS",
"libelle": "VAXELIS"
},
{
"id": "148",
"code": "148",
"text": "VAXIGRIP",
"libelle": "VAXIGRIP"
"text": "TETRACOQ",
"libelle": "TETRACOQ"
}
]

View File

@@ -10,8 +10,8 @@
{
"id": "AVS",
"code": "AVS",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"typeDesc": "NONE",
"isActive": false
},

View File

@@ -18,7 +18,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "ABRAHAM JEBEDIAH",
"dateBirth": "1927-05-24T00:00:00+01:00",
"civility": "MR",
@@ -40,7 +40,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "MONA PENELOPE",
"dateBirth": "1929-03-15T00:00:00Z",
"civility": "MME",
@@ -96,13 +96,13 @@
"hospital": "Springfield General Hospital",
"vaccinList": [
{
"code": "24",
"libelle": "IMOVAX OREILLONS",
"code": "1",
"libelle": "TETANOS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "45",
"libelle": "DT TETANOS COQ",
"code": "8",
"libelle": "DTPOLIO",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]

View File

@@ -0,0 +1,31 @@
{
"familyDoctor": {
"name": "HIBBERT",
"phone": "0656785678",
"address": {
"street1": "General Hospital",
"zipcode": "90701",
"town": "Springfield"
}
},
"allergy1": null,
"allergy2": null,
"comment1": null,
"comment2": null,
"observ1": null,
"observ2": null,
"isAuthHospital": true,
"hospital": "Springfield General Hospital",
"vaccinList": [
{
"code": "1",
"libelle": "TETANOS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "8",
"libelle": "DTPOLIO",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]
}

View File

@@ -10,13 +10,13 @@
"hospital": null,
"vaccinList": [
{
"code": "24",
"libelle": "IMOVAX OREILLONS",
"code": "1",
"libelle": "TETANOS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "45",
"libelle": "DT TETANOS COQ",
"code": "8",
"libelle": "DTPOLIO",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]

View File

@@ -19,7 +19,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "MONA PENELOPE",
"dateBirth": "1929-03-15T00:00:00Z",
"civility": "MME",

View File

@@ -27,8 +27,8 @@
"numComp": null,
"street1": "RUE ACHILLE VIADIEU",
"street2": null,
"town": "Springfield",
"zipcode": "62701",
"town": "Toulouse",
"zipcode": "31400",
"idStreet_text": "RUE ACHILLE VIADIEU"
},
"contact": {
@@ -61,7 +61,8 @@
"countryCode": null,
"cdDepartment": "19",
"communeCode_text": "BRIVE-LA-GAILLARDE",
"cdDepartment_text": "CORREZE"
"cdDepartment_text": "CORREZE",
"zipCode": "19100"
},
"adresse": {
"idStreet": null,
@@ -135,7 +136,8 @@
"countryCode": null,
"cdDepartment": "19",
"communeCode_text": "BRIVE-LA-GAILLARDE",
"cdDepartment_text": "CORREZE"
"cdDepartment_text": "CORREZE",
"zipCode": "19100"
},
"dietcode": "MENU_AV",
"bPhoto": true,
@@ -144,7 +146,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "ABRAHAM JEBEDIAH",
"dateBirth": "1927-05-24T00:00:00+01:00",
"civility": "MR",
@@ -166,7 +168,7 @@
{
"personInfo": {
"num": "N/A",
"lastname": "SIMPSON",
"lastname": "TEST_SIMPSON",
"firstname": "MONA PENELOPE",
"dateBirth": "1929-03-15T00:00:00Z",
"civility": "MME",
@@ -222,13 +224,13 @@
"hospital": "Springfield General Hospital",
"vaccinList": [
{
"code": "24",
"libelle": "IMOVAX OREILLONS",
"code": "1",
"libelle": "TETANOS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "45",
"libelle": "DT TETANOS COQ",
"code": "8",
"libelle": "DTPOLIO",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]

View File

@@ -2,7 +2,7 @@
"number": "N/A",
"category": "AUTR",
"situation": "AUTR",
"flagCom": true,
"flagCom": false,
"nbChild": 0,
"nbTotalChild": 0,
"nbAES": "0",
@@ -153,13 +153,13 @@
"hospital": null,
"vaccinList": [
{
"code": "24",
"libelle": "IMOVAX OREILLONS",
"code": "1",
"libelle": "TETANOS",
"vaccinationDate": "2022-02-22T00:00:00+01:00"
},
{
"code": "45",
"libelle": "DT TETANOS COQ",
"code": "8",
"libelle": "DTPOLIO",
"vaccinationDate": "2011-01-11T00:00:00+01:00"
}
]

View File

@@ -4,24 +4,33 @@
"dateStart": "2022-01-02T00:00:00+01:00",
"dateEnd": "2022-12-31T00:00:00+01:00",
"mtt": 1500.33,
"cdquo": "1",
"cdquo": "2",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
"cdquo_text": "Revenus Petite enfance"
},
{
"yearRev": 2021,
"dateStart": "2022-01-01T00:00:00+01:00",
"dateEnd": "2022-01-01T00:00:00+01:00",
"mtt": 1500.33,
"cdquo": "1",
"cdquo": "2",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
"cdquo_text": "Revenus Petite enfance"
},
{
"yearRev": 2021,
"dateStart": "2022-01-02T00:00:00+01:00",
"dateEnd": "2022-12-31T00:00:00+01:00",
"mtt": 1500.33,
"cdquo": "2",
"codeUti": null,
"cdquo_text": "Revenus Petite enfance"
},
{
"yearRev": 2022,
"dateStart": "2022-09-01T00:00:00+02:00",
"dateEnd": "3000-08-31T00:00:00+02:00",
"mtt": 5000.0,
"cdquo": "1",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"

View File

@@ -18,8 +18,8 @@
"numComp": null,
"street1": "RUE ACHILLE VIADIEU",
"street2": null,
"town": "Springfield",
"zipcode": "62701",
"town": "Toulouse",
"zipcode": "31400",
"idStreet_text": "RUE ACHILLE VIADIEU"
},
"contact": {

View File

@@ -5,8 +5,9 @@ from .conftest import diff
@pytest.mark.parametrize(
"ref",
'ref',
[
'ape-indicators',
'category',
'child-indicator',
'civility',
@@ -15,11 +16,16 @@ from .conftest import diff
'csp',
'dietcode',
'document',
'exemption-reasons',
#'nursery',
'organ',
'pai',
'quality',
'quotient',
#'regie',
'rl-indicator',
'school-levels',
'school-years',
'situation',
'street',
'vaccin',
@@ -35,5 +41,5 @@ def test_referentials(conn, referentials, ref):
for item in res['data']:
assert 'id' in item
assert 'text' in item
if ref not in ['street', 'county']:
if ref not in ['street', 'county', 'nursery']:
assert diff(res['data'], 'test_read_%s_list.json' % ref)
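The assertions above amount to a shape check on every referential item plus, for non-volatile referentials (everything except street, county and nursery), an exact comparison against a stored JSON dump. A hedged sketch of that logic — the real suite uses the `diff()` helper from `conftest`; `check_referential` and its `stored` parameter are illustrative names:

```python
def check_referential(items, stored=None):
    """Every referential entry must expose 'id' and 'text'; stable
    referentials are additionally compared against a previously
    recorded dump of the expected payload."""
    for item in items:
        assert 'id' in item, item
        assert 'text' in item, item
    if stored is not None:
        assert items == stored
    return True
```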

View File

@@ -36,7 +36,7 @@ def test_link(conn, update_data):
res = resp.json()
assert res['err'] == 1
assert res['err_class'] == 'passerelle.utils.soap.SOAPFault'
assert "E02 : Le dossier numéro [999999] ne correspond à aucune famille" in res['err_desc']
assert 'E02 : Le dossier numéro [999999] ne correspond à aucune famille' in res['err_desc']
# wrong DUI firstname
payload = {


@@ -15,7 +15,7 @@ FAMILY_RESET_PAYLOAD = {
'rl1': {
'civility': 'MR', # no effect
'firstname': 'Marge', # must be
'lastname': 'Simpson', # must be
'lastname': 'Test_Simpson', # must be
'maidenName': 'reset', # no effect
'quality': 'AU',
'birth': {
@@ -27,7 +27,7 @@ FAMILY_RESET_PAYLOAD = {
'rl2': {
'civility': 'MME', # no effect
'firstname': 'Homer', # must be
'lastname': 'Simpson', # must be
'lastname': 'Test_Simpson', # must be
'quality': 'AU',
'birth': {
'dateBirth': '1956-05-12', # must be
@@ -236,13 +236,18 @@ def test_update_family(conn, update_data):
def test_create_family(conn, create_data, update_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
# search the 'Test_Simpson' default test family
resp = requests.get(conn + '/search-family?q=Test_Simpson')
resp.raise_for_status()
assert len(resp.json()['data']) >= 1
assert any(data['RL1']['lastname'] == 'TEST_SIMPSON' for data in resp.json()['data'])
url = conn + '/create-family?NameID=%s' % create_data['name_id']
# RL1 already exists (on update_data) error
unlink(conn, create_data['name_id'])
payload = copy.deepcopy(create_data['family_payload'])
payload['rl1']['lastname'] = 'Simpson'
payload['rl1']['lastname'] = 'Test_Simpson'
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
@@ -263,7 +268,7 @@ def test_create_family(conn, create_data, update_data):
def test_is_rl_exists(conn, update_data):
url = conn + '/is-rl-exists'
payload = {'firstname': 'Marge', 'lastname': 'Simpson', 'dateBirth': '1950-10-01'}
payload = {'firstname': 'Marge', 'lastname': 'Test_Simpson', 'dateBirth': '1950-10-01'}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {'err': 0, 'data': True}
@@ -280,7 +285,7 @@ def test_is_rl_exists(conn, update_data):
assert resp.json() == {'err': 0, 'data': False}
# test on rl2
payload = {'firstname': 'Homer', 'lastname': 'Simpson', 'dateBirth': '1956-05-12'}
payload = {'firstname': 'Homer', 'lastname': 'Test_Simpson', 'dateBirth': '1956-05-12'}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {'err': 0, 'data': True}
@@ -304,7 +309,7 @@ def test_create_rl2(conn, create_data, update_data):
assert diff_rlg(conn, create_data['name_id'], 2, 'test_create_rl2.json')
@pytest.mark.parametrize("rl", ['1', '2'])
@pytest.mark.parametrize('rl', ['1', '2'])
def test_update_rlg(conn, update_data, rl):
rlg = 'rl' + rl
RLG = 'RL' + rl
@@ -365,7 +370,7 @@ def test_update_rlg(conn, update_data, rl):
in res['err_desc']
)
else:
assert "La date de naissance ne peut pas être modifiée" in res['err_desc']
assert 'La date de naissance ne peut pas être modifiée' in res['err_desc']
# restore RL1
payload = copy.deepcopy(update_data['family_payload'][rlg])
@@ -454,7 +459,7 @@ def test_create_child(conn, create_data, update_data):
assert 'E65 : Il existe déjà un enfant correspondant' in res['err_desc']
# child already exists error (Lisa from update_data)
payload['lastname'] = 'Simpson'
payload['lastname'] = 'Test_Simpson'
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
@@ -613,6 +618,24 @@ def test_update_child_medical_record(conn, update_data):
update_data['bart_num'],
)
# update only doctor
# #2720: allergies, comments, and observations are erased
payload = {
'familyDoctor': {
'name': 'Hibbert',
'phone': '0656785678',
'address': {
'street1': 'General Hospital',
'zipcode': '90701',
'town': 'Springfield',
},
},
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert diff_child(conn, update_data['name_id'], 0, 'test_update_child_doctor.json', key='medicalRecord')
# reset medical record
payload = FAMILY_RESET_PAYLOAD['childList'][0]['medicalRecord']
resp = requests.post(url, json=payload)
@@ -776,21 +799,22 @@ def test_update_quotient(conn, create_data):
'dateStart': '2022-01-01',
'dateEnd': '2022-12-31',
'mtt': '1500.33',
'cdquo': '1',
'cdquo': '2',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = read_family(conn, create_data['name_id'])
assert data['RL1']['quotientList'] == [
assert len(data['RL1']['quotientList']) == 2
assert data['RL1']['quotients']['2'] == [
{
'yearRev': 2021,
'dateStart': '2022-01-01T00:00:00+01:00',
'dateEnd': '2022-12-31T00:00:00+01:00',
'mtt': 1500.33,
'cdquo': '1',
'cdquo': '2',
'codeUti': None,
'cdquo_text': 'Revenus fiscaux',
'cdquo_text': 'Revenus Petite enfance',
}
]
@@ -800,7 +824,7 @@ def test_update_quotient(conn, create_data):
resp.raise_for_status()
assert resp.json()['err'] == 0
data = read_family(conn, create_data['name_id'])
assert len(data['RL1']['quotientList']) == 2
assert len(data['RL1']['quotients']['2']) == 2
# add quotient on another income year
payload['yearRev'] = '2020'
@@ -808,7 +832,7 @@ def test_update_quotient(conn, create_data):
resp.raise_for_status()
assert resp.json()['err'] == 0
data = diff_rlg(conn, create_data['name_id'], 1, 'test_update_quotient.json', 'quotientList')
assert len(data['RL1']['quotientList']) == 3
assert len(data['RL1']['quotients']['2']) == 3
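The assertions above move from a flat `quotientList` to a `quotients` mapping keyed by the `cdquo` type code. A minimal sketch of such a grouping, assuming the flat list shape shown in the fixture at the top of this diff (the helper name is hypothetical, not part of the connector):

```python
from collections import defaultdict

def group_quotients(family_data):
    # Group a flat quotientList by quotient type code (cdquo), so that
    # quotients['2'] is the list of 'Revenus Petite enfance' entries.
    quotients = defaultdict(list)
    for quotient in family_data.get('quotientList', []):
        quotients[quotient['cdquo']].append(quotient)
    return dict(quotients)

family = {
    'quotientList': [
        {'cdquo': '1', 'yearRev': 2021, 'mtt': 1500.33},
        {'cdquo': '2', 'yearRev': 2021, 'mtt': 1500.33},
        {'cdquo': '2', 'yearRev': 2020, 'mtt': 1500.33},
    ]
}
grouped = group_quotients(family)
```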
# test read-family with reference year
url = conn + '/read-family?NameID=%s&income_year=%s' % (create_data['name_id'], '2020')
@@ -908,7 +932,7 @@ def test_read_family_members(conn, update_data):
assert res['data']['personInfo']['firstname'] == 'ABRAHAM JEBEDIAH'
def test_add_supplied_document(conn, create_data):
def test_supplied_document(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
@@ -916,6 +940,8 @@ def test_add_supplied_document(conn, create_data):
payload = {
'documentList/0/code': '46',
'documentList/0/depositDate': '2022-12-20',
'documentList/0/visaDate': '2022-12-21',
'documentList/0/validityDate': '2022-12-22',
'documentList/0/file': { # w.c.s. file field
'filename': '201x201.jpg',
'content_type': 'image/jpeg',
@@ -929,6 +955,7 @@ def test_add_supplied_document(conn, create_data):
assert res['err'] == 0
# push on RL
payload['documentList/0/code'] = '85'
payload['numPerson'] = create_data['rl1_num']
url = conn + '/add-supplied-document?NameID=%s' % create_data['name_id']
resp = requests.post(url, json=payload)
@@ -937,9 +964,43 @@ def test_add_supplied_document(conn, create_data):
assert res['err'] == 0
# push on child
payload['documentList/0/code'] = '69'
payload['numPerson'] = create_data['bart_num']
url = conn + '/add-supplied-document?NameID=%s' % create_data['name_id']
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
# check validity on family
params = {
'code': '46',
'ref_date': '2022-12-22',
}
url = conn + '/read-supplied-document-validity?NameID=%s' % create_data['name_id']
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
# check validity on RL
params = {
'code': '85',
'person_id': create_data['rl1_num'],
'ref_date': '2022-12-22',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
# check validity on child
params = {
'code': '69',
'person_id': create_data['bart_num'],
'ref_date': '2022-12-22',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
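The three validity checks above differ only in `code` and the optional `person_id`. A small sketch of the parameter-building step, assuming the query parameters shown in the test (omitting `person_id` means a family-level check; the helper and the sample person number are illustrative):

```python
def build_validity_params(code, ref_date, person_id=None):
    # Build query params for a read-supplied-document-validity call:
    # person_id stays absent for the family-level check and is set to
    # an RL or child number for the other two checks.
    params = {'code': code, 'ref_date': ref_date}
    if person_id is not None:
        params['person_id'] = person_id
    return params

# the three checks from the test, expressed through the helper
family_check = build_validity_params('46', '2022-12-22')
rl_check = build_validity_params('85', '2022-12-22', person_id='4242')  # hypothetical RL number
```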


@@ -1,174 +0,0 @@
import datetime
import requests
def test_perisco(perisco_subscribe_info):
assert perisco_subscribe_info['info']['activity']['libelle1'] == 'TEMPS DU MIDI 22/23'
def test_perisco_agenda(conn, create_data, perisco_subscribe_info):
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'unit_id': perisco_subscribe_info['unit']['id'],
'place_id': perisco_subscribe_info['place']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# find first available booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
booking = None
for booking in resp.json()['data']:
if booking['disabled'] is False:
break
else:
raise Exception("no booking available")
assert booking['details']['activity_id'] == perisco_subscribe_info['activity']['id']
assert booking['prefill'] is False
# book activity
url = conn + '/update-child-agenda?NameID=%s' % create_data['name_id']
payload = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [booking['id']],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {
'updated': True,
'count': 1,
'changes': [
{
'booked': True,
'activity_id': booking['details']['activity_id'],
'activity_label': 'Restauration scolaire',
'day': booking['details']['day_str'],
}
],
'err': 0,
}
# check booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [x['prefill'] for x in resp.json()['data'] if x['id'] == booking['id']][0] is True
def test_perisco_recurrent_week(conn, create_data, perisco_subscribe_info, reference_year):
# no subscribed activity
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 0
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'unit_id': perisco_subscribe_info['unit']['id'],
'place_id': perisco_subscribe_info['place']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 1
assert resp.json()['data'][0]['id'] == perisco_subscribe_info['activity']['id']
# get recurrent-week template
url = conn + '/get-recurrent-week?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'ref_date': datetime.date.today().strftime('%Y-%m-%d'),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [(x['id'], x['day']) for x in resp.json()['data']] == [
('1-X', 'Lundi'),
('2-X', 'Mardi'),
('4-X', 'Jeudi'),
('5-X', 'Vendredi'),
]
# no booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['maggie_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert not any(x['prefill'] for x in resp.json()['data'])
# set recurrent-week template
url = conn + '/update-recurrent-week?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
'recurrent_week': ['1-X', '2-X'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
# there are now some bookings
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['maggie_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert any(x['prefill'] for x in resp.json()['data'])


@@ -0,0 +1,192 @@
import datetime
import pytest
import requests
from .conftest import link, unlink
def test_create_nursery_demand_on_existing_child(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/get-nursery-geojson'
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
nurseries = resp.json()['features']
assert len(nurseries) >= 2
url = conn + '/read-family?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
nb_childs = len(res['data']['childList'])
assert sorted(x['code'] for x in res['data']['indicatorList']) == []
url = conn + '/read-child?NameID=%s&child_id=%s' % (create_data['name_id'], create_data['maggie_num'])
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
assert sorted(x['code'] for x in res['data']['indicatorList']) == []
url = conn + '/create-nursery-demand'
payload = {
'family_id': create_data['family_id'],
'family_indicators/0/code': 'APE_FIRSTC',
'family_indicators/0/isActive': True,
'child_id': create_data['maggie_num'],
'demand_indicators/0/code': 'APE_COMPO1',
'demand_indicators/0/isActive': True,
'start_date': datetime.date.today().strftime('%Y-%m-%d'),
'number_of_days': '2',
'start_hour_Mon': '08:00',
'end_hour_Mon': '',
'comment': 'bla',
'accept_other_nurseries': True,
'nursery1/idActivity': nurseries[0]['properties']['activity_id'],
'nursery1/idUnit': nurseries[0]['properties']['unit_id'],
'nursery1/idPlace': nurseries[0]['properties']['place_id'],
'nursery2/idActivity': nurseries[1]['properties']['activity_id'],
'nursery2/idUnit': nurseries[1]['properties']['unit_id'],
'nursery2/idPlace': nurseries[1]['properties']['place_id'],
'nursery3/idActivity': '',
'nursery3/idUnit': '',
'nursery3/idPlace': '',
# indicators
'child_indicators/0/code': 'APE_HBOTH',
'child_indicators/0/isActive': True,
'child_indicators/1/code': 'APE_HPAR',
'child_indicators/1/isActive': True,
'child_indicators/2/code': 'APE_COMPO3',
'child_indicators/2/isActive': True,
'child_indicators/3/code': 'APE_MULTIACC',
'child_indicators/3/isActive': True,
'family_indicators/0/code': 'APE_COMPO4',
'family_indicators/0/isActive': True,
'family_indicators/1/code': 'APE_NAIM',
'family_indicators/1/isActive': True,
'family_indicators/2/code': 'APE_FIRSTC',
'family_indicators/2/isActive': True,
'family_indicators/3/code': 'APE_COMPO2',
'family_indicators/3/isActive': True,
'family_indicators/4/code': 'APE_HAND',
'family_indicators/4/isActive': True,
'demand_indicators/0/code': 'APE_FRAT',
'demand_indicators/0/isActive': True,
'demand_indicators/1/code': 'APE_COMPO1',
'demand_indicators/1/isActive': True,
'demand_indicators/2/code': 'APE_HFRAT',
'demand_indicators/2/isActive': True,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {'data': None, 'err': 0}
# no child added
url = conn + '/read-family?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
assert len(res['data']['childList']) == nb_childs
# check indicators
assert sorted(x['code'] for x in res['data']['indicatorList']) == [
'APE_COMPO2',
'APE_COMPO4',
'APE_FIRSTC',
'APE_HAND',
'APE_NAIM',
]
url = conn + '/read-child?NameID=%s&child_id=%s' % (create_data['name_id'], create_data['maggie_num'])
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
assert sorted(x['code'] for x in res['data']['indicatorList']) == [
'APE_COMPO3',
'APE_HBOTH',
'APE_HPAR',
'APE_MULTIACC',
]
def test_create_nursery_demand_adding_new_child(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/get-nursery-geojson'
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
nurseries = resp.json()['features']
assert len(nurseries) >= 2
url = conn + '/read-family?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
nb_childs = len(res['data']['childList'])
assert 'NELSON' not in [x['firstname'] for x in res['data']['childList']]
url = conn + '/create-nursery-demand'
payload = {
'family_id': create_data['family_id'],
'child_first_name': 'Nelson',
'child_last_name': 'Muntz',
'child_birthdate': '2013-10-31',
'child_gender': 'G',
'start_date': datetime.date.today().strftime('%Y-%m-%d'),
'nursery1/idActivity': nurseries[0]['properties']['activity_id'],
'nursery1/idUnit': nurseries[0]['properties']['unit_id'],
'nursery1/idPlace': nurseries[0]['properties']['place_id'],
'nursery2/idActivity': nurseries[1]['properties']['activity_id'],
'nursery2/idUnit': nurseries[1]['properties']['unit_id'],
'nursery2/idPlace': nurseries[1]['properties']['place_id'],
'nursery3/idActivity': '',
'nursery3/idUnit': '',
'nursery3/idPlace': '',
# indicators
'child_indicators/0/code': 'APE_HBOTH',
'child_indicators/0/isActive': True,
'child_indicators/1/code': 'APE_HPAR',
'child_indicators/1/isActive': True,
'child_indicators/2/code': 'APE_COMPO3',
'child_indicators/2/isActive': True,
'child_indicators/3/code': 'APE_MULTIACC',
'child_indicators/3/isActive': True,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
child_id = resp.json()['data']
assert child_id is not None
# a new child is created on family
url = conn + '/read-family?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
assert len(res['data']['childList']) == nb_childs + 1
assert 'NELSON' in [x['firstname'] for x in res['data']['childList']]
assert res['data']['childList'][nb_childs]['num'] == child_id
# check child indicators
url = conn + '/read-child?NameID=%s&child_id=%s' % (create_data['name_id'], child_id)
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
assert res['data']['firstname'] == 'NELSON'
assert sorted(x['code'] for x in res['data']['indicatorList']) == [
'APE_COMPO3',
'APE_HBOTH',
'APE_HPAR',
'APE_MULTIACC',
]
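The nursery-demand payloads above spell out flattened keys such as `child_indicators/0/code` and `child_indicators/0/isActive` by hand. A hedged sketch of building those entries programmatically (the helper is hypothetical and only reproduces the key layout visible in the test payloads):

```python
def flatten_indicators(prefix, codes):
    # Expand a list of indicator codes into the 'prefix/N/code' and
    # 'prefix/N/isActive' flattened keys used by create-nursery-demand.
    payload = {}
    for i, code in enumerate(codes):
        payload['%s/%s/code' % (prefix, i)] = code
        payload['%s/%s/isActive' % (prefix, i)] = True
    return payload

demand = flatten_indicators('child_indicators', ['APE_HBOTH', 'APE_HPAR', 'APE_COMPO3'])
```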


@@ -1,24 +0,0 @@
import datetime
import requests
from .conftest import link, unlink
# LOISIR is a subset of EXTRACO, we only test the general catalog cell here
def test_catalog_general_loisirs(conn, update_data):
unlink(conn, update_data['name_id'])
link(conn, update_data)
url = conn + '/read-activity-list'
params = {'ref_date': datetime.date.today().strftime('%Y-%m-%d')}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [x['text'] for x in resp.json()['data']] == [
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 13h45/17h - 8/15Ans, ARGOULETS',
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 14h/16h30 - 10/15Ans, LA RAMEE',
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 15h30/17h - 8/15Ans, ARGOULETS',
]


@@ -0,0 +1,308 @@
import datetime
import pytest
import requests
@pytest.fixture(scope='session')
def school_year(conn):
url = conn + '/read-school-years-list'
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
year = res['data'][0]['text']
return year
@pytest.fixture(scope='session')
def exemption(conn):
# get an exemption code
url = conn + '/read-exemption-reasons-list'
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
return res['data'][0]['id']
def test_displaying_school_subscribed(conn, create_data, school_year, exemption):
"""
read-family returns registrations according to the display dates
configured on the YearSchool referential
"""
school_year = str(int(school_year) + 1)
# create a 7 year-old child
url = conn + '/create-child?NameID=%s' % create_data['name_id']
payload = {
'sexe': 'F',
'firstname': 'Claris',
'lastname': create_data['lastname'],
'birth': {'dateBirth': '2016-09-12'},
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
claris_id = str(resp.json()['data']['child_id'])
# book
url = conn + '/create-child-school-pre-registration'
payload = {
'numPerson': claris_id,
'schoolYear': school_year,
'levelCode': 'CE1',
'dateSubscribe': school_year + '-01-01',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data']['returnMessage'] is None
assert resp.json()['data']['subscribeSchoolBean']['schoolName'] == 'DUPONT PIERRE ELEMENTAIRE'
assert resp.json()['data']['subscribeSchoolBean']['adresse'] == '101 GRANDE-RUE SAINT MICHEL'
# get Claris school from read-family
url = conn + '/read-school-years-list'
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()['data']
date_start = [x['dateStartYearSchool'] for x in res if x['text'] == school_year][0]
assert date_start[:10] > datetime.datetime.now().strftime('%Y-%m-%d')
url = conn + '/read-family?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
schools = [x['subscribeSchoolList'] for x in res['data']['childList'] if x['num'] == claris_id][0]
assert len(schools) == 0  # school is filtered, but it is related to a hidden school year
# field, not dateStartYearSchool, checked before: #2425
def test_school_pre_registration_by_sector(conn, create_data, school_year, exemption):
"""
Pre-registration of the 7-year-old child in their sector
"""
# create a 7 year-old child
url = conn + '/create-child?NameID=%s' % create_data['name_id']
payload = {
'sexe': 'F',
'firstname': 'Sego',
'lastname': create_data['lastname'],
'birth': {'dateBirth': '2016-05-09'},
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
sego_id = str(resp.json()['data']['child_id'])
# assert there is a school at this address
url = conn + '/read-schools-for-address-and-level'
params = {
'id_street': '2317',
'num': '4',
'year': school_year,
'level': 'CE1',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 1
assert resp.json()['data'][0]['text'] == 'DUPONT PIERRE ELEMENTAIRE'
# assert there is a school at child address
url = conn + '/read-schools-for-child-and-level'
params = {
'child_id': sego_id,
'year': school_year,
'level': 'CE1',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 1
assert resp.json()['data'][0]['text'] == 'DUPONT PIERRE ELEMENTAIRE'
school_id = resp.json()['data'][0]['idSchool']
assert school_id == '2435'
# book
url = conn + '/create-child-school-pre-registration'
payload = {
'numPerson': sego_id,
'schoolYear': school_year,
'levelCode': 'CE1',
'dateSubscribe': school_year + '-01-01',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data']['returnMessage'] is None
assert resp.json()['data']['subscribeSchoolBean']['schoolName'] == 'DUPONT PIERRE ELEMENTAIRE'
assert resp.json()['data']['subscribeSchoolBean']['adresse'] == '101 GRANDE-RUE SAINT MICHEL'
# get Sego school from read-family
url = conn + '/read-school-years-list'
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()['data']
date_start = [x['dateStartYearSchool'] for x in res if x['text'] == school_year][0]
assert date_start[:10] > datetime.datetime.now().strftime('%Y-%m-%d')
# school is filtered, but it is related to a hidden school year
# field, not dateStartYearSchool, see #2425
url = conn + '/read-family?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
res = resp.json()
assert res['err'] == 0
schools = [x['subscribeSchoolList'] for x in res['data']['childList'] if x['num'] == sego_id][0]
assert len(schools) == 1
assert schools[0]['schoolName'] == 'DUPONT PIERRE ELEMENTAIRE'
"""
Pre-registration of a 5-year-old child in CP with sibling grouping with the 7-year-old:
grouping within the child's sector.
"""
# get Sego school
url = conn + '/read-child-school-informations?NameID=%s' % create_data['name_id']
params = {
'child_id': sego_id,
'year': school_year,
'level': 'CE1',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
schools = data['childSubscribeSchoolInformation']['subscribeSchoolYearList']
assert len(schools) == 1
assert schools[0]['subscribeSchool']['school']['idSchool'] == school_id
assert schools[0]['subscribeSchool']['perim']['idPerim'] == '2707'
url = conn + '/create-child-school-pre-registration-with-sibling'
payload = {
'numPerson': create_data['maggie_num'],
'schoolYear': school_year,
'levelCode': 'GS',
'datePresubscribe': school_year + '-01-01',
'idSchoolRequested': school_id,
'numPersonSibling': sego_id,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert 'returnMessage' not in resp.json()
assert resp.json()['data']['schoolName'] == 'CALAS MATERNELLE'
assert resp.json()['data']['adresse'] == '47 RUE ACHILLE VIADIEU' # same sector
def test_school_pre_registration_by_exemption(conn, create_data, school_year, exemption):
"""
Pre-registration of the 9-year-old child with an exemption:
an exemption with a reason selected for a school outside the sector
"""
# school list
url = conn + '/read-child-school-informations?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'year': school_year,
'level': 'CM1',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
schools = data['childSubscribeSchoolInformation']['subscribeSchoolInformation']['derogSchoolList']
assert len(schools) > 1
school_id = schools[0]['id']
# book
url = conn + '/create-child-school-pre-registration-with-exemption'
payload = {
'numPerson': create_data['bart_num'],
'schoolYear': school_year,
'levelCode': 'CM1',
'datePresubscribe': school_year + '-01-01',
'idRequestSchool1': school_id,
'derogReasonCode': exemption,
'derogComment': 'bla',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert 'returnMessage' not in resp.json()
assert resp.json()['data']['schoolName'] == 'AMIDONNIERS ELEMENTAIRE'
assert resp.json()['data']['adresse'] == '123 ALL DE BRIENNE'
"""
Pre-registration of the other 5-year-old child in CP
with sibling grouping with the 9-year-old:
grouping outside the child's sector.
"""
# check E124 error
# get a school that does not provide a level in its sector
url = conn + '/read-child-school-informations?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['hugo_num'],
'year': school_year,
'level': 'GS',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert [
x['idSchool']
for x in data['childSubscribeSchoolInformation']['subscribeSchoolInformation']['derogSchoolList']
if x['text'] == 'DIEUZAIDE JEAN MATERNELLE'
] == ['2437']
# try to book on a sector that does not provide the requested level
url = conn + '/create-child-school-pre-registration-with-sibling'
payload = {
'numPerson': create_data['hugo_num'],
'schoolYear': school_year,
'levelCode': 'CP',
'datePresubscribe': school_year + '-01-01',
'idSchoolRequested': '2437',
'numPersonSibling': create_data['bart_num'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 1
assert resp.json()['err_class'] == 'passerelle.utils.soap.SOAPFault'
assert 'E124' in resp.json()['err_desc']
# get Bart school
url = conn + '/read-child-school-informations?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'year': school_year,
'level': 'CM1',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
schools = data['childSubscribeSchoolInformation']['subscribeSchoolYearList']
assert len(schools) == 1
assert schools[0]['subscribeSchool']['school']['idSchool'] == school_id
assert schools[0]['subscribeSchool']['perim']['idPerim'] == '2663'
# book
url = conn + '/create-child-school-pre-registration-with-sibling'
payload = {
'numPerson': create_data['hugo_num'],
'schoolYear': school_year,
'levelCode': 'GS',
'datePresubscribe': school_year + '-01-01',
'idSchoolRequested': school_id,
'numPersonSibling': create_data['bart_num'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert 'returnMessage' not in resp.json()
assert resp.json()['data']['schoolName'] == 'AMIDONNIERS MATERNELLE'
assert resp.json()['data']['adresse'] == '125 ALL DE BRIENNE'


@@ -0,0 +1,369 @@
import datetime
import pytest
import requests
from .conftest import link, unlink
def test_perisco(perisco_subscribe_info):
assert perisco_subscribe_info['info']['activity']['libelle1'] == 'TEST TEMPS DU MIDI 22/23'
def test_perisco_adulte(perisco_subscribe_adulte_info):
assert perisco_subscribe_adulte_info['info']['activity']['libelle1'] == 'TEST RESTAURATION ADULTE 22/23'
def test_perisco_agenda(conn, create_data, perisco_subscribe_info):
unlink(conn, create_data['name_id'])
link(conn, create_data)
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'unit_id': perisco_subscribe_info['unit']['id'],
'place_id': perisco_subscribe_info['place']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# find first available booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
booking = None
for booking in resp.json()['data']:
if booking['disabled'] is False:
break
else:
raise Exception('no booking available')
assert booking['details']['activity_id'] == perisco_subscribe_info['activity']['id']
assert booking['details']['activity_label'] == 'Temps du midi'
assert booking['prefill'] is False
# book activity
url = conn + '/update-child-agenda?NameID=%s' % create_data['name_id']
payload = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [booking['id']],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {
'updated': True,
'count': 1,
'changes': [
{
'booked': True,
'activity_id': booking['details']['activity_id'],
'activity_label': 'Temps du midi',
'day': booking['details']['day_str'],
}
],
'err': 0,
}
# check booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [x['prefill'] for x in resp.json()['data'] if x['id'] == booking['id']][0] is True
def test_perisco_agenda_adulte(conn, create_data2, perisco_subscribe_adulte_info):
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['rl1_num'],
'activity_id': perisco_subscribe_adulte_info['activity']['id'],
'unit_id': perisco_subscribe_adulte_info['unit']['id'],
'place_id': perisco_subscribe_adulte_info['place']['id'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# find first available booking
url = conn + '/read-child-agenda?NameID=%s' % create_data2['name_id']
params = {
'child_id': create_data2['rl1_num'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
booking = None
for booking in resp.json()['data']:
if booking['disabled'] is False:
break
else:
raise Exception('no booking available')
assert booking['details']['activity_id'] == perisco_subscribe_adulte_info['activity']['id']
assert booking['details']['activity_label'] == 'RESTAURATION ADULTE'
assert booking['prefill'] is False
# book activity
url = conn + '/update-child-agenda?NameID=%s' % create_data2['name_id']
payload = {
'child_id': create_data2['rl1_num'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
'booking_list': [booking['id']],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {
'updated': True,
'count': 1,
'changes': [
{
'booked': True,
'activity_id': booking['details']['activity_id'],
'activity_label': 'RESTAURATION ADULTE',
'day': booking['details']['day_str'],
}
],
'err': 0,
}
# check booking
url = conn + '/read-child-agenda?NameID=%s' % create_data2['name_id']
params = {
'child_id': create_data2['rl1_num'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [x['prefill'] for x in resp.json()['data'] if x['id'] == booking['id']][0] is True
def test_perisco_recurrent_week(conn, create_data, perisco_subscribe_info, reference_year):
unlink(conn, create_data['name_id'])
link(conn, create_data)
# no subscribed activity
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 0
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'unit_id': perisco_subscribe_info['unit']['id'],
'place_id': perisco_subscribe_info['place']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 1
assert resp.json()['data'][0]['id'] == perisco_subscribe_info['activity']['id']
assert [(x['text'], x['libelle'], x['libelle2']) for x in resp.json()['data']] == [
('Temps du midi', 'TEST TEMPS DU MIDI 22/23', 'Temps du midi'),
]
# get recurrent-week template
url = conn + '/get-recurrent-week?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'ref_date': datetime.date.today().strftime('%Y-%m-%d'),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [(x['id'], x['day']) for x in resp.json()['data']] == [
('1-X', 'Lundi'),
('2-X', 'Mardi'),
('4-X', 'Jeudi'),
('5-X', 'Vendredi'),
]
# no booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['maggie_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert not any(x['prefill'] for x in resp.json()['data'])
# set recurrent-week template
url = conn + '/update-recurrent-week?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
'recurrent_week': ['1-X', '2-X'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
# there are now some bookings
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['maggie_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert any(x['prefill'] for x in resp.json()['data'])
def test_perisco_recurrent_week_adulte(conn, create_data2, perisco_subscribe_adulte_info, reference_year):
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
# no subscribed activity
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data2['name_id']
params = {
'person_id': create_data2['rl2_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 0
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['rl2_num'],
'activity_id': perisco_subscribe_adulte_info['activity']['id'],
'unit_id': perisco_subscribe_adulte_info['unit']['id'],
'place_id': perisco_subscribe_adulte_info['place']['id'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data2['name_id']
params = {
'person_id': create_data2['rl2_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 1
assert resp.json()['data'][0]['id'] == perisco_subscribe_adulte_info['activity']['id']
assert [(x['text'], x['libelle'], x['libelle2']) for x in resp.json()['data']] == [
('RESTAURATION ADULTE', 'TEST RESTAURATION ADULTE 22/23', 'RESTAURATION ADULTE')
]
# get recurrent-week template
url = conn + '/get-recurrent-week?NameID=%s' % create_data2['name_id']
params = {
'person_id': create_data2['rl2_num'],
'activity_id': perisco_subscribe_adulte_info['activity']['id'],
'ref_date': datetime.date.today().strftime('%Y-%m-%d'),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [(x['id'], x['day']) for x in resp.json()['data']] == [
('1-X', 'Lundi'),
('2-X', 'Mardi'),
('3-X', 'Mercredi'),
('4-X', 'Jeudi'),
('5-X', 'Vendredi'),
]
# no booking
url = conn + '/read-child-agenda?NameID=%s' % create_data2['name_id']
params = {
'child_id': create_data2['rl2_num'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert not any(x['prefill'] for x in resp.json()['data'])
# set recurrent-week template
url = conn + '/update-recurrent-week?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['rl2_num'],
'activity_id': perisco_subscribe_adulte_info['activity']['id'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
'recurrent_week': ['1-X', '2-X'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
# there are now some bookings
url = conn + '/read-child-agenda?NameID=%s' % create_data2['name_id']
params = {
'child_id': create_data2['rl2_num'],
'start_date': perisco_subscribe_adulte_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_adulte_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert any(x['prefill'] for x in resp.json()['data'])

@@ -1,205 +0,0 @@
import requests
def test_basket_subscribe(conn, create_data, extrasco_subscribe_info, reference_year):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
def subscriptions(person_id):
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'nature': 'EXTRASCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def get_bookings(person_id):
url = conn + '/read-activity-agenda?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
# no subscription
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 3
assert len(subscriptions(create_data['bart_num'])) == 1
assert subscriptions(create_data['maggie_num']) == []
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 105
assert data[0]['text'] == 'ENFANCE LOISIRS ET PE'
assert len(data[0]['lignes']) == 3
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
# get 3 idIns because we subscribed to a generic unit
assert len({x['idIns'] for x in data[0]['lignes']}) == 3
basket_id = data[0]['id']
# cannot subscribe Bart twice
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 1
assert 'E1019' in resp.json()['err_desc']
assert len(get_baskets()) == 1
# delete basket
# should be called by the user or by a cron job
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['bart_num']) == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len(subscriptions(create_data['bart_num'])) == 1
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['maggie_num'])) == 1
# delete (generic) basket line for Bart
data = get_baskets()
assert len(data) == 1
assert len(data[0]['lignes']) == 6
basket_id = data[0]['id']
# idIns for the generic unit
line_id = [
y['id']
for x in data
for y in x['lignes']
if y['personneInfo']['numPerson'] == int(create_data['bart_num'])
if y['inscription']['idUnit'] == extrasco_subscribe_info['unit']['id']
][0]
url = conn + '/delete-basket-line?NameID=%s' % create_data['name_id']
payload = {
'basket_id': basket_id,
'line_id': line_id,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['lignes']}) == 1
assert len({x['idIns'] for x in data['lignes']}) == 3
data = get_baskets()
assert len(data) == 1
assert len(get_baskets()) == 1
assert len(data[0]['lignes']) == 3
assert subscriptions(create_data['bart_num']) == []
assert len(subscriptions(create_data['maggie_num'])) == 1
# re-subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['bart_num'])) == 1
# add bookings to Bart
slots = [x['id'] for x in extrasco_subscribe_info['info']['agenda'] if x['disabled'] is False]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
assert len([x['prefill'] for x in get_bookings(create_data['bart_num']) if x['prefill'] is True]) == 2
# add bookings to Maggie
slots = [':'.join([create_data['maggie_num']] + x.split(':')[1:]) for x in slots]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
assert len([x['prefill'] for x in get_bookings(create_data['maggie_num']) if x['prefill'] is True]) == 2
# validate basket
url = conn + '/validate-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data['idInsLst']) == 6
assert len(data['factureLst']) == 0 # No invoice #2187
assert get_baskets() == []
assert len(subscriptions(create_data['bart_num'])) == 1
assert len(subscriptions(create_data['maggie_num'])) == 1
# call cancelInvoiceAndDeleteSubscribeList to remove subscriptions

@@ -0,0 +1,261 @@
import datetime
import pytest
import requests
from .conftest import get_subscription_info, link, unlink
# LOISIRS is like EXTRASCO (most tests are redundant) but:
# * there is no calendar (days) to provide.
# * there is a general catalog to display
def test_catalog_general_loisirs(conn, update_data):
unlink(conn, update_data['name_id'])
link(conn, update_data)
url = conn + '/read-activity-list'
params = {'ref_date': datetime.date.today().strftime('%Y-%m-%d')}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
labels = [x['text'] for x in resp.json()['data']]
assert (
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 13h45/17h - 8/15Ans, ARGOULETS'
in labels
)
assert (
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 14h/16h30 - 10/15Ans, LA RAMEE'
in labels
)
assert (
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 15h30/17h - 8/15Ans, ARGOULETS'
in labels
)
assert 'Promenade forêt enchantée, TEST promenade forêt enchantée, TERRITOIRE OUEST' in labels
assert 'Vitrail Fusing 1/2 Je Adultes, Inscription annuelle, Centre Culturel ALBAN MINVILLE' in labels
for item in resp.json()['data']:
if (
item['text']
== 'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES, MERCREDI - 13h45/17h - 8/15Ans, ARGOULETS'
):
assert item['criterias'] == {
'service': {'text': 'Service', 'data': {'sports': 'Sports'}, 'order': ['sports']},
'nature': {
'text': "Nature de l'activité",
'data': {'1': 'Activités Régulières'},
'order': ['1'],
},
'type': {
'text': "Type de l'activité",
'data': {'activites-aquatiques': 'Activités Aquatiques'},
'order': ['activites-aquatiques'],
},
'public': {
'text': 'Public',
'data': {'1': 'Enfant (3-11 ans)', '2': 'Ado (12-17 ans)'},
'order': ['1', '2'],
},
'day': {'text': 'Jours', 'data': {'3': 'Mercredi'}, 'order': ['3']},
'place': {'text': 'Lieu', 'data': {'A10053179757': 'ARGOULETS'}, 'order': ['A10053179757']},
}
assert item['activity']['activityPortail']['blocNoteList'] == [
{
'note': "Activité ayant lieu le Mercredi, merci de choisir votre tranche horraire en fonction de l'âge de votre enfant.",
'numIndex': 1,
}
]
if item['text'] == 'Promenade forêt enchantée, TEST promenade forêt enchantée, TERRITOIRE OUEST':
assert item['criterias'] == {
'service': {'text': 'Service', 'data': {'sports': 'Sports'}, 'order': ['sports']},
'nature': {
'text': "Nature de l'activité",
'data': {'1': 'Activités Régulières'},
'order': ['1'],
},
'type': {
'text': "Type de l'activité",
'data': {'activite-pedestre': 'Activité Pédestre'},
'order': ['activite-pedestre'],
},
'public': {'text': 'Public', 'data': {'5': 'Sénior (60 ans et plus)'}, 'order': ['5']},
'day': {
'text': 'Jours',
'data': {'1': 'Lundi', '2': 'Mardi', '3': 'Mercredi', '4': 'Jeudi', '5': 'Vendredi'},
'order': ['1', '2', '3', '4', '5'],
},
'place': {
'text': 'Lieu',
'data': {'A10056517597': 'TERRITOIRE OUEST'},
'order': ['A10056517597'],
},
}
assert item['activity']['activityPortail']['blocNoteList'] == [
{'note': 'Activité de promenade en forêt.', 'numIndex': 1}
]
def test_catalog_personnalise_loisirs(loisirs_subscribe_info):
assert (
loisirs_subscribe_info['info']['activity']['libelle1']
== 'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES'
)
assert loisirs_subscribe_info['info']['calendarGeneration']['code'] == 'REQUIRED'
assert [(x['id'], x['day']) for x in loisirs_subscribe_info['info']['recurrent_week']] == []
assert loisirs_subscribe_info['info']['billingInformation'] == {
'modeFact': 'FORFAIT',
'quantity': 1.0,
'unitPrice': 88.5,
}
def test_catalog_personnalise_loisirs_not_allowed(conn, create_data, reference_year):
unlink(conn, create_data['name_id'])
link(conn, create_data)
try:
get_subscription_info(
'LOISIRS',
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES',
'MERCREDI - 15h30/17h - 8/15Ans',
'ARGOULETS',
conn,
create_data['name_id'],
create_data['rl1_num'],
reference_year,
)
except Exception:
return
assert False, 'Adult can subscribe to child activity'
def test_direct_subscribe(conn, create_data, loisirs_subscribe_info, reference_year):
assert loisirs_subscribe_info['info']['controlResult']['controlOK'] is True
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['hugo_num'],
'activity_id': loisirs_subscribe_info['activity']['id'],
'unit_id': loisirs_subscribe_info['unit']['id'],
'place_id': loisirs_subscribe_info['place']['id'],
'start_date': loisirs_subscribe_info['unit']['dateStart'][:10],
'end_date': loisirs_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# no idIns provided that would allow removing the subscription later
assert resp.json()['data'] == {'controlOK': True, 'message': None}
def test_direct_subscribe_out_town(conn, create_data2, loisirs_subscribe_info2, reference_year):
assert loisirs_subscribe_info2['info']['controlResult']['controlOK'] is True
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
url = conn + '/add-person-subscription?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['hugo_num'],
'activity_id': loisirs_subscribe_info2['activity']['id'],
'unit_id': loisirs_subscribe_info2['unit']['id'],
'place_id': loisirs_subscribe_info2['place']['id'],
'start_date': loisirs_subscribe_info2['unit']['dateStart'][:10],
'end_date': loisirs_subscribe_info2['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# no idIns provided to remove subscription later
assert resp.json()['data'] == {'controlOK': True, 'message': None}
def test_subscribe_to_basket(conn, create_data, loisirs_subscribe_info, reference_year):
assert loisirs_subscribe_info['info']['controlResult']['controlOK'] is True
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': loisirs_subscribe_info['activity']['id'],
'unit_id': loisirs_subscribe_info['unit']['id'],
'place_id': loisirs_subscribe_info['place']['id'],
'start_date': loisirs_subscribe_info['unit']['dateStart'][:10],
'end_date': loisirs_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
basket_id = resp.json()['data']['basket']['id']
# remove subscription
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
@pytest.mark.xfail(run=False)
def test_global_capacity(conn, create_data2, loisirs_subscribe_info3, reference_year):
assert loisirs_subscribe_info3['info']['controlResult']['controlOK'] is True
unlink(conn, create_data2['name_id'])
link(conn, create_data2)
# subscribe Bart
url = conn + '/add-person-subscription?NameID=%s' % create_data2['name_id']
# url = conn + '/add-person-basket-subscription?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['bart_num'],
'activity_id': loisirs_subscribe_info3['activity']['id'],
'unit_id': loisirs_subscribe_info3['unit']['id'],
'place_id': loisirs_subscribe_info3['place']['id'],
'start_date': loisirs_subscribe_info3['unit']['dateStart'][:10],
'end_date': loisirs_subscribe_info3['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# basket_id = resp.json()['data']['basket']['id']
# subscribe Lisa
payload['person_id'] = create_data2['lisa_num']
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# subscribe Maggie
payload['person_id'] = create_data2['maggie_num']
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# can't subscribe Hugo
payload['person_id'] = create_data2['hugo_num']
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 1
assert resp.json()['err_desc'] == ''
# check capacity on main catalog
url = conn + '/read-activity-list'
params = {'ref_date': datetime.date.today().strftime('%Y-%m-%d')}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
for item in resp.json()['data']:
if item['activity']['libelle'] == 'PUBLIK Vitrail Fusing 1/2 Je Adultes 2022/2023 - Mardi 14h-1':
import pdb
pdb.set_trace()
# # remove subscriptions
# url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
# payload = {'basket_id': basket_id}
# resp = requests.post(url, json=payload)
# resp.raise_for_status()
# assert resp.json()['err'] == 0

@@ -1,14 +1,47 @@
import pytest
import requests
def test_catalog_personnalise_extrasco(extrasco_subscribe_info):
assert extrasco_subscribe_info['info']['activity']['libelle1'] == 'ADL ELEMENTAIRE Maourine Avril 2023'
assert extrasco_subscribe_info['info']['calendarGeneration']['code'] == 'REQUIRED'
assert (
extrasco_subscribe_info['info']['activity']['libelle1']
== 'PUBLIK ADL ELEMENTAIRE Maourine JUIN 22/23(NE PAS UTILISER)'
)
assert extrasco_subscribe_info['info']['calendarGeneration']['code'] == 'NOT_REQUIRED'
assert extrasco_subscribe_info['info']['billingInformation'] == {
'modeFact': 'PRESENCE',
'quantity': None,
'unitPrice': 43.0,
'unitPrice': 11.5,
}
assert extrasco_subscribe_info['info']['activity']['blocNoteList'] == [
{
'note': 'Lien vers le réglement intérieur :\r\nhttps://portail-parsifal.test.entrouvert.org/media/uploads/2023/03/23/flyer-sejour.pdf\r\nLien vers arrêté municipal :\r\nhttps://portail-parsifal.test.entrouvert.org/media/uploads/2023/04/05/arrete-municipal.pdf',
'numIndex': 1,
}
]
assert (
extrasco_subscribe_info['info']['agenda'][0]['details']['activity_label']
== 'ADL ELEMENTAIRE Maourine Juin'
)
def test_catalog_personnalise_extrasco2(extrasco_subscribe_info2):
assert (
extrasco_subscribe_info2['info']['activity']['libelle1']
== 'PUBLIK ADL MATERNELLE Lardenne JUIN 22/23 (NEPAS UTILISER)'
)
assert extrasco_subscribe_info2['info']['calendarGeneration']['code'] == 'FORBIDDEN'
assert extrasco_subscribe_info2['info']['billingInformation'] == {
'modeFact': 'PRESENCE',
'quantity': None,
'unitPrice': 11.5,
}
assert extrasco_subscribe_info2['info']['activity']['blocNoteList'] == [
{
'note': 'Lien vers le réglement intérieur :\r\nhttps://portail-parsifal.test.entrouvert.org/media/uploads/2023/03/23/flyer-sejour.pdf\r\nLien vers arrêté municipal :\r\nhttps://portail-parsifal.test.entrouvert.org/media/uploads/2023/04/05/arrete-municipal.pdf',
'numIndex': 1,
}
]
def test_direct_subscribe(conn, create_data, extrasco_subscribe_info, reference_year):
@@ -71,16 +104,11 @@ def test_subscribe_with_conveyance(conn, create_data, extrasco_subscribe_info):
def test_subscribe_with_recurrent_week(conn, create_data, extrasco_subscribe_info):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
assert [(x['id'], x['day']) for x in extrasco_subscribe_info['info']['recurrent_week']] == [
('1-C', 'Lundi'),
('1-B', 'Lundi'),
('2-C', 'Mardi'),
('2-B', 'Mardi'),
('3-C', 'Mercredi'),
('3-B', 'Mercredi'),
('4-C', 'Jeudi'),
('4-B', 'Jeudi'),
('5-C', 'Vendredi'),
('5-B', 'Vendredi'),
('1-X', 'Lundi'),
('2-X', 'Mardi'),
('3-X', 'Mercredi'),
('4-X', 'Jeudi'),
('5-X', 'Vendredi'),
]
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
@@ -91,7 +119,7 @@ def test_subscribe_with_recurrent_week(conn, create_data, extrasco_subscribe_inf
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'recurrent_week': ['1-B', '2-C'],
'recurrent_week': ['1-X', '2-X'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
@@ -111,6 +139,16 @@ def test_subscribe_with_recurrent_week(conn, create_data, extrasco_subscribe_inf
assert resp.json()['err'] == 0
assert any(x['prefill'] for x in resp.json()['data'])
# check quantity into basket
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
line = resp.json()['data'][0]['lignes'][0]
assert line['prixUnit'] == 11.5
assert line['qte'] > 0
assert line['montant'] == line['prixUnit'] * line['qte']
# remove subscription
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
@@ -135,7 +173,23 @@ def test_subscribe_with_agenda(conn, create_data, extrasco_subscribe_info):
assert resp.json()['err'] == 0
return resp.json()['data']
# subscribe witout providing calandar
def get_perisco_bookings():
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return [
item
for item in resp.json()['data']
if item['details']['activity_id'] == extrasco_subscribe_info['activity']['id']
]
# subscribe without providing calendar
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
@@ -152,6 +206,7 @@ def test_subscribe_with_agenda(conn, create_data, extrasco_subscribe_info):
# no booking
assert not any(x['prefill'] for x in get_bookings())
assert not any(x['prefill'] for x in get_perisco_bookings())
# book using the info calendar template (booking registered from the w.c.s. form)
assert len(extrasco_subscribe_info['info']['agenda']) > 0
@@ -173,6 +228,17 @@ def test_subscribe_with_agenda(conn, create_data, extrasco_subscribe_info):
# there is now 2 bookings
assert len([x['prefill'] for x in get_bookings() if x['prefill'] is True]) == 2
perisco_bookings = get_perisco_bookings()
assert len([x['prefill'] for x in perisco_bookings if x['prefill'] is True]) == 2
assert perisco_bookings[0]['details']['activity_label'] == 'ADL ELEMENTAIRE Maourine Juin'
# check quantity into basket
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
line = resp.json()['data'][0]['lignes'][0]
assert (line['prixUnit'], line['qte'], line['montant']) == (11.5, 0.0, 0.0)
# unbook slots
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
@ -196,3 +262,61 @@ def test_subscribe_with_agenda(conn, create_data, extrasco_subscribe_info):
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
@pytest.mark.xfail(run=False)
def test_daily_capacity(conn, create_data2, extrasco_subscribe_info3):
assert extrasco_subscribe_info3['info']['controlResult']['controlOK'] is True
def subscribe(child):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['%s_num' % child],
'activity_id': extrasco_subscribe_info3['activity']['id'],
'unit_id': extrasco_subscribe_info3['unit']['id'],
'place_id': extrasco_subscribe_info3['place']['id'],
'start_date': extrasco_subscribe_info3['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info3['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']['basket']['id']
def book(child, slot):
url = conn + '/update-activity-agenda/?NameID=%s' % create_data2['name_id']
payload = {
'person_id': create_data2['%s_num' % child],
'activity_id': extrasco_subscribe_info3['activity']['id'],
'start_date': extrasco_subscribe_info3['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info3['unit']['dateEnd'][:10],
'booking_list': [slot],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
# subscribe all the family's children
basket_id = subscribe('bart')
for child in 'lisa', 'maggie', 'hugo':
assert subscribe(child) == basket_id
# book all children on the same day
assert len(extrasco_subscribe_info3['info']['agenda']) > 0
assert not any(x['prefill'] for x in extrasco_subscribe_info3['info']['agenda'])
slots = [x['id'] for x in extrasco_subscribe_info3['info']['agenda'] if x['disabled'] is False]
for child in 'bart', 'lisa', 'maggie':
resp = book(child, slots[-1])
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True]
resp = book('hugo', slots[-1])
assert resp.json()['err'] == 1
assert resp.json()['err_desc']  # an error description is returned
# # remove subscriptions
# url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
# payload = {'basket_id': basket_id}
# resp = requests.post(url, json=payload)
# resp.raise_for_status()
# assert resp.json()['err'] == 0

View File

@ -1,109 +0,0 @@
import pytest
import requests
from .conftest import diff, link, unlink
def test_direct_debit_order(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/add-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
payload = {
'codeRegie': '102',
'bank/bankBIC': 'BDFEFR2T',
'bank/bankIBAN': 'FR7630001007941234567890185',
'bank/bankRUM': 'xxx',
'bank/dateStart': '2023-01-01',
'bank/bankAddress': '75049 PARIS cedex 01',
'bank/civility': 'x',
'bank/lastName': 'Ewing',
'bank/firstName': 'John Ross',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
assert res['data'] == 'ok'
url = conn + '/get-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
params = {
'codeRegie': '102',
'dateRef': '2023-01-01',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
res['data']['numPerson'] = 'N/A'
assert diff(res['data'], 'test_get_rl1_direct_debit_order.json')
@pytest.mark.xfail(run=False)
def test_basket_subscribe(conn, create_data, extrasco_subscribe_info, reference_year):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 3
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 105
assert data[0]['text'] == 'ENFANCE LOISIRS ET PE'
assert len(data[0]['lignes']) == 3
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
# we get 3 idIns because we subscribed to a generic unit
assert len({x['idIns'] for x in data[0]['lignes']}) == 3
basket_id = data[0]['id']
# validate basket
url = conn + '/validate-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data['idInsLst']) == 6
assert len(data['factureLst']) == 0
assert get_baskets() == []
# still to cover:
# cancelInvoiceAndDeleteSubscribeList
# payInvoice

View File

@ -0,0 +1,557 @@
import pytest
import requests
def test_basket_subscribe_extrasco(conn, create_data, extrasco_subscribe_info, reference_year):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
def subscriptions(person_id):
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'nature': 'EXTRASCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def get_bookings(person_id):
url = conn + '/read-activity-agenda?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
# no subscription
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 1  # 3 on Larden
subs = subscriptions(create_data['bart_num'])
assert len(subs) == 1
assert len(subs[0]['subscribesUnit']) == 1
assert subscriptions(create_data['maggie_num']) == []
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 105
assert data[0]['text'] == 'ENFANCE LOISIRS'
assert len(data[0]['lignes']) == 1  # 3 on Larden
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
# we should get 3 idIns because we subscribed to a generic unit
assert len({x['idIns'] for x in data[0]['lignes']}) == 1  # 3 on Larden
basket_id = data[0]['id']
# cannot subscribe Bart twice
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 1
assert 'E1019' in resp.json()['err_desc']
assert len(get_baskets()) == 1
# delete basket
# should be called by the user or by a cron job
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['bart_num']) == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len(subscriptions(create_data['bart_num'])) == 1
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
subs = subscriptions(create_data['maggie_num'])
assert len(subs) == 1
assert len(subs[0]['subscribesUnit']) == 1
# delete (generic) basket line for Bart
data = get_baskets()
assert len(data) == 1
assert len(data[0]['lignes']) == 2  # 6 on Larden
basket_id = data[0]['id']
# line for the generic unit for Bart
line_id = [
y['id']
for x in data
for y in x['lignes']
if y['personneInfo']['numPerson'] == int(create_data['bart_num'])
if y['inscription']['idUnit'] == extrasco_subscribe_info['unit']['id']
][0]
url = conn + '/delete-basket-line?NameID=%s' % create_data['name_id']
payload = {
'basket_id': basket_id,
'line_id': line_id,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['lignes']}) == 1
assert len({x['idIns'] for x in data['lignes']}) == 1  # 3 on Larden
data = get_baskets()
assert len(data) == 1
assert len(get_baskets()) == 1
assert len(data[0]['lignes']) == 1  # 3 on Larden
assert subscriptions(create_data['bart_num']) == []
assert len(subscriptions(create_data['maggie_num'])) == 1
# re-subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['bart_num'])) == 1
# add bookings to Bart
slots = [x['id'] for x in extrasco_subscribe_info['info']['agenda'] if x['disabled'] is False]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
assert len([x['prefill'] for x in get_bookings(create_data['bart_num']) if x['prefill'] is True]) == 2
# add bookings to Maggie
slots = [':'.join([create_data['maggie_num']] + x.split(':')[1:]) for x in slots]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
assert len([x['prefill'] for x in get_bookings(create_data['maggie_num']) if x['prefill'] is True]) == 2
# delete basket
# should be called by the user or by a cron job
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
@pytest.mark.xfail(run=False)
def test_basket_subscribe_extrasco2(conn, create_data, extrasco_subscribe_info2, reference_year):
"""Subscribing to a generic unit"""
assert extrasco_subscribe_info2['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info2['activity']['id'],
'unit_id': extrasco_subscribe_info2['unit']['id'],
'place_id': extrasco_subscribe_info2['place']['id'],
'start_date': extrasco_subscribe_info2['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info2['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
def subscriptions(person_id):
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'nature': 'EXTRASCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def get_bookings(person_id):
url = conn + '/read-activity-agenda?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info2['activity']['id'],
'start_date': extrasco_subscribe_info2['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info2['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
# no subscription
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 1 # 3 expected
subs = subscriptions(create_data['bart_num'])
assert len(subs) == 1
assert len(subs[0]['subscribesUnit']) == 2
assert [x['libelle'] for x in subs[0]['subscribesUnit']] == [
'PUBLIK ADL MATERNELLE Lardenne JUIN 22/23 (NEPAS UTILISER)',
'PUBLIK ADL MATER JOURNEE AVEC REPAS',
]
assert subscriptions(create_data['maggie_num']) == []
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 105
assert data[0]['text'] == 'ENFANCE LOISIRS'
assert len(data[0]['lignes']) == 1 # 3 expected
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
# we should get 3 idIns because we subscribed to a generic unit
assert len({x['idIns'] for x in data[0]['lignes']}) == 1  # 3 expected
basket_id = data[0]['id']
# cannot subscribe Bart twice
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 1
assert 'E1019' in resp.json()['err_desc']
assert len(get_baskets()) == 1
# delete basket
# should be called by the user or by a cron job
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['bart_num']) == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len(subscriptions(create_data['bart_num'])) == 1
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['maggie_num'])) == 1
# delete (generic) basket line for Bart
data = get_baskets()
assert len(data) == 1
assert len(data[0]['lignes']) == 2  # 6 on Larden
basket_id = data[0]['id']
# line for the generic unit for Bart
line_id = [
y['id']
for x in data
for y in x['lignes']
if y['personneInfo']['numPerson'] == int(create_data['bart_num'])
if y['inscription']['idUnit'] == extrasco_subscribe_info2['unit']['id']
][0]
url = conn + '/delete-basket-line?NameID=%s' % create_data['name_id']
payload = {
'basket_id': basket_id,
'line_id': line_id,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['lignes']}) == 1
assert len({x['idIns'] for x in data['lignes']}) == 1  # 3 on Larden
data = get_baskets()
assert len(data) == 1
assert len(get_baskets()) == 1
assert len(data[0]['lignes']) == 1  # 3 on Larden
assert subscriptions(create_data['bart_num']) == []
assert len(subscriptions(create_data['maggie_num'])) == 1
# re-subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['bart_num'])) == 1
# add bookings to Bart
slots = [x['id'] for x in extrasco_subscribe_info2['info']['agenda'] if x['disabled'] is False]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info2['activity']['id'],
'start_date': extrasco_subscribe_info2['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info2['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
assert len([x['prefill'] for x in get_bookings(create_data['bart_num']) if x['prefill'] is True]) == 2
# add bookings to Maggie
slots = [':'.join([create_data['maggie_num']] + x.split(':')[1:]) for x in slots]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': extrasco_subscribe_info2['activity']['id'],
'start_date': extrasco_subscribe_info2['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info2['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
assert len([x['prefill'] for x in get_bookings(create_data['maggie_num']) if x['prefill'] is True]) == 2
# delete basket
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
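The tests above rebase Bart's slot ids onto Maggie by splicing her person number into each colon-separated id. A small pure helper could factor that out; this is a hypothetical sketch (the function name is ours, not part of the connector API), assuming slot ids keep the form `<person_num>:<rest-of-id>`:

```python
def rebase_slot_ids(slots, person_id):
    """Rebase booking slot ids onto another person.

    Assumes each slot id is colon-separated with the person number
    as its first field, e.g. '1234:ACT:2023-06-07'.
    """
    return [':'.join([person_id] + slot.split(':')[1:]) for slot in slots]
```

With it, the duplicated comprehension becomes `slots = rebase_slot_ids(slots, create_data['maggie_num'])`.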
def test_basket_subscribe_loisirs(conn, create_data, loisirs_subscribe_info, reference_year):
assert loisirs_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': loisirs_subscribe_info['activity']['id'],
'unit_id': loisirs_subscribe_info['unit']['id'],
'place_id': loisirs_subscribe_info['place']['id'],
'start_date': loisirs_subscribe_info['unit']['dateStart'][:10],
'end_date': loisirs_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
def subscriptions(person_id):
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'nature': 'LOISIRS',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return [
x
for x in resp.json()['data']
if x['libelle'] == 'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES'
]
# no subscription
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 109
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 1
subs = subscriptions(create_data['bart_num'])
assert len(subs) == 1
assert len(subs[0]['subscribesUnit']) == 2
assert [x['libelle'] for x in subs[0]['subscribesUnit']] == [
'TEST ECOLE DES SPORTS 22/23 SEMESTRE 2 - MULTIACTIVITES',
'MERCREDI - 15h30/17h - 8/15Ans',
]
assert subscriptions(create_data['maggie_num']) == []
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 109
assert data[0]['text'] == 'SPORT'
assert len(data[0]['lignes']) == 1
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
assert len({x['idIns'] for x in data[0]['lignes']}) == 1
assert data[0]['lignes'][0]['montant'] == 88.5
basket_id = data[0]['id']
# cannot subscribe Bart twice
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 1
assert 'E1019' in resp.json()['err_desc']
assert len(get_baskets()) == 1
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
subs = subscriptions(create_data['maggie_num'])
assert len(subs) == 1
assert len(subs[0]['subscribesUnit']) == 2
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['id'] == basket_id
assert data[0]['codeRegie'] == 109
assert data[0]['text'] == 'SPORT'
assert len(data[0]['lignes']) == 2
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 2
assert len({x['idIns'] for x in data[0]['lignes']}) == 2
assert all(x['montant'] == 88.5 for x in data[0]['lignes'])
# delete basket line for Bart
data = get_baskets()
assert len(data) == 1
assert len(data[0]['lignes']) == 2
basket_id = data[0]['id']
# line for Bart
line_id = [
y['id']
for x in data
for y in x['lignes']
if y['personneInfo']['numPerson'] == int(create_data['bart_num'])
][0]
url = conn + '/delete-basket-line?NameID=%s' % create_data['name_id']
payload = {
'basket_id': basket_id,
'line_id': line_id,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['codeRegie'] == 109
assert len({x['personneInfo']['numPerson'] for x in data['lignes']}) == 1
assert len({x['idIns'] for x in data['lignes']}) == 1
data = get_baskets()
assert len(data) == 1
assert len(get_baskets()) == 1
assert len(data[0]['lignes']) == 1
assert subscriptions(create_data['bart_num']) == []
assert len(subscriptions(create_data['maggie_num'])) == 1
# delete basket
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['maggie_num']) == []
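Several tests above locate a basket line with a nested list comprehension over persons and units. A shared pure helper could make that lookup explicit; a sketch under the same payload shape returned by get-baskets (helper name is ours, not part of the connector API):

```python
def find_line_id(baskets, person_num, unit_id=None):
    """Return the id of the first basket line matching a person (and unit).

    Expects the get-baskets payload: a list of baskets, each with a
    'lignes' list whose items carry personneInfo/numPerson and
    inscription/idUnit.
    """
    for basket in baskets:
        for line in basket['lignes']:
            if line['personneInfo']['numPerson'] != person_num:
                continue
            if unit_id is not None and line['inscription']['idUnit'] != unit_id:
                continue
            return line['id']
    return None
```

Returning `None` instead of raising `IndexError` also gives a clearer failure message when the expected line is missing.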

View File

@ -0,0 +1,346 @@
import datetime
import pytest
import requests
from .conftest import diff, link, unlink
def test_direct_debit_order(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/add-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
payload = {
'codeRegie': '102',
'bank/bankBIC': 'BDFEFR2T',
'bank/bankIBAN': 'FR7630001007941234567890185',
'bank/bankRUM': 'xxx',
'bank/dateStart': '2023-01-01',
'bank/bankAddress': '75049 PARIS cedex 01',
'bank/civility': 'x',
'bank/lastName': 'Ewing',
'bank/firstName': 'John Ross',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
assert res['data'] == 'ok'
url = conn + '/get-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
params = {
'codeRegie': '102',
'dateRef': '2023-01-01',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
res['data']['numPerson'] = 'N/A'
assert diff(res['data'], 'test_get_rl1_direct_debit_order.json')
def test_pay_invoice_loisirs(conn, create_data, loisirs_subscribe_info, reference_year):
assert loisirs_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': loisirs_subscribe_info['activity']['id'],
'unit_id': loisirs_subscribe_info['unit']['id'],
'place_id': loisirs_subscribe_info['place']['id'],
'start_date': loisirs_subscribe_info['unit']['dateStart'][:10],
'end_date': loisirs_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 109
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 1
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 109
assert data[0]['text'] == 'SPORT'
assert len(data[0]['lignes']) == 2
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 2
assert len({x['idIns'] for x in data[0]['lignes']}) == 2
basket_id = data[0]['id']
# validate basket to generate an invoice
url = conn + '/validate-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data['idInsLst']) == 2
assert len(data['factureLst']) == 1
assert len(data['factureLst'][0]['lineInvoiceList']) == 2
assert data['factureLst'][0]['regie']['code'] == 109
invoice_num = data['factureLst'][0]['numInvoice']
invoice_id = data['factureLst'][0]['idInvoice']
assert get_baskets() == []
# get invoices paid
url = conn + '/regie/109/invoices/history?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json() == {'data': [], 'err': 0}
# get invoices to be paid
url = conn + '/regie/109/invoices?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data) == 1
assert data[0]['amount'] == '177'  # or just > 0?
assert data[0]['online_payment'] is True
assert data[0]['paid'] is False
assert len({x['idIns'] for x in data[0]['maelis_item']['lineInvoiceList']}) == 2
assert data[0]['maelis_item']['idInvoice'] == invoice_id
assert data[0]['maelis_item']['numInvoice'] == invoice_num
# payInvoice
url = conn + '/regie/109/invoice/%s-%s/pay/' % (create_data['family_id'], invoice_num)
payload = {
'transaction_date': datetime.datetime.now().strftime('%Y-%m-%dT%H:%M:%S'),
'transaction_id': 'xxx',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
assert res['data'] == 'ok'
# get invoices to be paid
url = conn + '/regie/109/invoices?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json() == {'has_invoice_for_payment': True, 'data': [], 'err': 0}
# get invoices paid
url = conn + '/regie/109/invoices/history?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data) == 1
assert data[0]['amount'] == '0'
assert data[0]['total_amount'] == '177'  # or just > 0?
assert data[0]['online_payment'] is False
assert data[0]['paid'] is True
assert len({x['idIns'] for x in data[0]['maelis_item']['lineInvoiceList']}) == 2
assert data[0]['maelis_item']['idInvoice'] == invoice_id
assert data[0]['maelis_item']['numInvoice'] == invoice_num
def test_payinvoice_extrasco(conn, create_data, extrasco_subscribe_info, reference_year):
    assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True

    def get_baskets():
        url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
        resp = requests.get(url)
        resp.raise_for_status()
        assert resp.json()['err'] == 0
        return resp.json()['data']

    def subscribe(person_id):
        url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
        payload = {
            'person_id': person_id,
            'activity_id': extrasco_subscribe_info['activity']['id'],
            'unit_id': extrasco_subscribe_info['unit']['id'],
            'place_id': extrasco_subscribe_info['place']['id'],
            'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
            'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
        }
        resp = requests.post(url, json=payload)
        resp.raise_for_status()
        return resp

    def subscriptions(person_id):
        url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
        params = {
            'person_id': person_id,
            'nature': 'EXTRASCO',
            'school_year': '%s-%s' % (reference_year, reference_year + 1),
        }
        resp = requests.get(url, params=params)
        resp.raise_for_status()
        assert resp.json()['err'] == 0
        return resp.json()['data']

    def get_bookings(person_id):
        url = conn + '/read-activity-agenda?NameID=%s' % create_data['name_id']
        params = {
            'person_id': person_id,
            'activity_id': extrasco_subscribe_info['activity']['id'],
            'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
            'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
        }
        resp = requests.get(url, params=params)
        resp.raise_for_status()
        assert resp.json()['err'] == 0
        return resp.json()['data']

    # no subscription
    assert subscriptions(create_data['bart_num']) == []
    assert subscriptions(create_data['maggie_num']) == []
    # empty basket
    assert get_baskets() == []
    # subscribe Bart
    resp = subscribe(create_data['bart_num'])
    assert resp.json()['err'] == 0
    data = resp.json()['data']
    assert data['controlResult'] == {'controlOK': True, 'message': None}
    assert data['basket']['codeRegie'] == 105
    assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
    assert len({x['idIns'] for x in data['basket']['lignes']}) == 1
    assert len(subscriptions(create_data['bart_num'])) == 1
    assert subscriptions(create_data['maggie_num']) == []
    # basket
    data = get_baskets()
    assert len(data) == 1
    assert data[0]['codeRegie'] == 105
    assert len(data[0]['lignes']) == 1
    assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
    assert len({x['idIns'] for x in data[0]['lignes']}) == 1
    basket_id = data[0]['id']
    # subscribe Maggie
    resp = subscribe(create_data['maggie_num'])
    assert resp.json()['err'] == 0
    data = resp.json()['data']
    assert data['controlResult'] == {'controlOK': True, 'message': None}
    assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
    assert len(subscriptions(create_data['maggie_num'])) == 1
    # add bookings to Bart
    slots = [x['id'] for x in extrasco_subscribe_info['info']['agenda'] if x['disabled'] is False]
    url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
    payload = {
        'person_id': create_data['bart_num'],
        'activity_id': extrasco_subscribe_info['activity']['id'],
        'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
        'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
        'booking_list': [slots[0], slots[-1]],
    }
    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    assert resp.json()['err'] == 0
    assert resp.json()['updated'] is True
    assert len([x['prefill'] for x in get_bookings(create_data['bart_num']) if x['prefill'] is True]) > 0
    # add bookings to Maggie
    slots = [':'.join([create_data['maggie_num']] + x.split(':')[1:]) for x in slots]
    url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
    payload = {
        'person_id': create_data['maggie_num'],
        'activity_id': extrasco_subscribe_info['activity']['id'],
        'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
        'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
        'booking_list': [slots[0], slots[-1]],
    }
    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    assert resp.json()['err'] == 0
    assert resp.json()['updated'] is True
    assert len([x['prefill'] for x in get_bookings(create_data['maggie_num']) if x['prefill'] is True]) > 0
    # validate basket
    url = conn + '/validate-basket?NameID=%s' % create_data['name_id']
    payload = {'basket_id': basket_id}
    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    assert resp.json()['err'] == 0
    data = resp.json()['data']
    assert len(data['idInsLst']) == 2
    assert len(data['factureLst']) == 1
    assert get_baskets() == []
    assert len(data['factureLst'][0]['lineInvoiceList']) == 2
    assert data['factureLst'][0]['regie']['code'] == 105
    invoice_num = data['factureLst'][0]['numInvoice']
    invoice_id = data['factureLst'][0]['idInvoice']
    # get invoices paid
    url = conn + '/regie/105/invoices/history?NameID=%s' % create_data['name_id']
    resp = requests.get(url)
    resp.raise_for_status()
    assert resp.json() == {'data': [], 'err': 0}
    # get invoices to be paid
    url = conn + '/regie/105/invoices?NameID=%s' % create_data['name_id']
    resp = requests.get(url)
    resp.raise_for_status()
    assert resp.json()['err'] == 0
    data = resp.json()['data']
    assert len(data) == 1
    assert int(data[0]['amount']) > 0
    assert data[0]['online_payment'] is True
    assert data[0]['paid'] is False
    assert len({x['idIns'] for x in data[0]['maelis_item']['lineInvoiceList']}) == 2
    assert data[0]['maelis_item']['idInvoice'] == invoice_id
    assert data[0]['maelis_item']['numInvoice'] == invoice_num
    # payInvoice
    url = conn + '/regie/105/invoice/%s-%s/pay/' % (create_data['family_id'], invoice_num)
    payload = {
        'transaction_date': datetime.datetime.now().strftime('%Y-%m-%dT%H:%M:%S'),
        'transaction_id': 'xxx',
    }
    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    res = resp.json()
    assert res['data'] == 'ok'
    # get invoices to be paid
    url = conn + '/regie/105/invoices?NameID=%s' % create_data['name_id']
    resp = requests.get(url)
    resp.raise_for_status()
    assert resp.json() == {'has_invoice_for_payment': True, 'data': [], 'err': 0}
    # get invoices history
    url = conn + '/regie/105/invoices/history?NameID=%s' % create_data['name_id']
    resp = requests.get(url)
    resp.raise_for_status()
    assert resp.json()['err'] == 0
    data = resp.json()['data']
    assert len(data) == 1
    assert data[0]['amount'] == '0'
    assert int(data[0]['total_amount']) > 0
    assert data[0]['online_payment'] is False
    assert data[0]['paid'] is True
    assert len({x['idIns'] for x in data[0]['maelis_item']['lineInvoiceList']}) == 2
    assert data[0]['maelis_item']['idInvoice'] == invoice_id
    assert data[0]['maelis_item']['numInvoice'] == invoice_num


@@ -2,9 +2,9 @@ import pytest
def pytest_addoption(parser):
-parser.addoption("--url", help="Url of a passerelle Vivaticket connector instance")
+parser.addoption('--url', help='Url of a passerelle Vivaticket connector instance')
@pytest.fixture(scope='session')
def conn(request):
-return request.config.getoption("--url")
+return request.config.getoption('--url')


@@ -6,7 +6,7 @@ import requests
def call_generic(conn, endpoint):
-print("%s \n" % endpoint)
+print('%s \n' % endpoint)
url = conn + '/%s' % endpoint
resp = requests.get(url)
resp.raise_for_status()
@@ -50,7 +50,7 @@ def test_book_event(conn):
themes = call_generic(conn, 'themes')
random.shuffle(themes)
payload['theme'] = themes[0]['id']
-print("Creating booking with the following payload:\n%s" % payload)
+print('Creating booking with the following payload:\n%s' % payload)
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()


@@ -1,4 +0,0 @@
#!/bin/sh -ue
test -d wcs || git clone https://git.entrouvert.org/entrouvert/wcs.git
(cd wcs && git pull)


@@ -2,8 +2,8 @@
import os
import sys
-if __name__ == "__main__":
-os.environ.setdefault("DJANGO_SETTINGS_MODULE", "passerelle.settings")
+if __name__ == '__main__':
+os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'passerelle.settings')
from django.core.management import execute_from_command_line


@@ -102,6 +102,7 @@ class AddressResource(BaseResource):
@endpoint(
name='sectors',
description=_('List related Sectorizations'),
+perm='OPEN',
parameters={
'id': {'description': _('Sector Identifier (slug)')},
'q': {'description': _('Filter by Sector Title or Identifier')},


@@ -48,7 +48,7 @@ class ActesWeb(BaseResource):
def basepath(self):
return os.path.join(default_storage.path('actesweb'), self.slug)
-@endpoint(perm='can_access', methods=['post'], description=_('Create demand'))
+@endpoint(methods=['post'], description=_('Create demand'))
def create(self, request, *args, **kwargs):
try:
payload = json.loads(request.body)


@@ -0,0 +1,77 @@
# Generated by Django 3.2.18 on 2023-07-07 10:10
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='AdullactPastell',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'basic_auth_username',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication username'
),
),
(
'basic_auth_password',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication password'
),
),
(
'client_certificate',
models.FileField(
blank=True, null=True, upload_to='', verbose_name='TLS client certificate'
),
),
(
'trusted_certificate_authorities',
models.FileField(blank=True, null=True, upload_to='', verbose_name='TLS trusted CAs'),
),
(
'verify_cert',
models.BooleanField(blank=True, default=True, verbose_name='TLS verify certificates'),
),
(
'http_proxy',
models.CharField(blank=True, max_length=128, verbose_name='HTTP and HTTPS proxy'),
),
(
'api_base_url',
models.URLField(
help_text='Example: https://pastell.example.com/api/v2/',
max_length=128,
verbose_name='API base URL',
),
),
('token', models.CharField(blank=True, max_length=128, verbose_name='API token')),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_adullact_pastell_adullactpastell_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'Adullact Pastell',
},
),
]


@@ -0,0 +1,265 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import base64
from urllib import parse as urlparse
import requests
from django.core.exceptions import ValidationError
from django.db import models
from django.http import HttpResponse
from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource, HTTPResource
from passerelle.utils.api import endpoint
from passerelle.utils.jsonresponse import APIError
FILE_OBJECT_PROPERTIES = {
'title': _('File object'),
'type': 'object',
'properties': {
'filename': {
'type': 'string',
'description': _('Filename'),
},
'content': {
'type': 'string',
'description': _('Content'),
},
'content_type': {
'type': 'string',
'description': _('Content type'),
},
},
'required': ['filename', 'content'],
}
DOCUMENT_CREATION_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['type'],
'additionalProperties': True,
'properties': {
'type': {'type': 'string', 'description': _('Document type')},
'file_field_name': {
'type': 'string',
'description': _('Document file\'s field name'),
},
'file': FILE_OBJECT_PROPERTIES,
'filename': {
'type': 'string',
'description': _('Filename (takes precedence over filename in "file" object)'),
},
},
}
DOCUMENT_FILE_UPLOAD_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['file', 'file_field_name'],
'additionalProperties': False,
'properties': {
'filename': {
'type': 'string',
'description': _('Filename (takes precedence over filename in "file" object)'),
},
'file': FILE_OBJECT_PROPERTIES,
'file_field_name': {
'type': 'string',
'description': _('Document file\'s field name'),
},
},
}
class AdullactPastell(BaseResource, HTTPResource):
api_base_url = models.URLField(
max_length=128,
verbose_name=_('API base URL'),
help_text=_('Example: https://pastell.example.com/api/v2/'),
)
token = models.CharField(max_length=128, blank=True, verbose_name=_('API token'))
category = _('Business Process Connectors')
log_requests_errors = False
class Meta:
verbose_name = _('Adullact Pastell')
def clean(self, *args, **kwargs):
if not self.token and not self.basic_auth_username:
raise ValidationError(_('API token or authentication username and password should be defined.'))
return super().clean(*args, **kwargs)
def call(self, path, method='get', params=None, **kwargs):
url = urlparse.urljoin(self.api_base_url, path)
if self.token:
kwargs.update({'headers': {'Authorization': 'Bearer: %s' % self.token}, 'auth': None})
try:
response = self.requests.request(url=url, method=method, params=params, **kwargs)
response.raise_for_status()
except (requests.Timeout, requests.RequestException) as e:
raise APIError(str(e))
return response
def check_status(self):
try:
response = self.call('version')
except APIError as e:
raise Exception('Pastell server is down: %s' % e)
return {'data': response.json()}
def upload_file(self, entity_id, document_id, file_field_name, data, **kwargs):
filename = kwargs.get('filename') or data['filename']
file_data = {
'file_content': (
filename,
base64.b64decode(data['content']),
data.get('content_type'),
)
}
return self.call(
'entite/%s/document/%s/file/%s' % (entity_id, document_id, file_field_name),
'post',
files=file_data,
data={'file_name': filename},
)
@endpoint(
description=_('List entities'),
datasource=True,
)
def entities(self, request):
data = []
response = self.call('entite')
for item in response.json():
item['id'] = item['id_e']
item['text'] = item['denomination']
data.append(item)
return {'data': data}
@endpoint(
description=_('List entity documents'),
parameters={'entity_id': {'description': _('Entity ID'), 'example_value': '42'}},
datasource=True,
)
def documents(self, request, entity_id):
if request.GET.get('id'):
response = self.call('entite/%s/document/%s' % (entity_id, request.GET['id']))
return {'data': response.json()}
data = []
response = self.call('entite/%s/document' % entity_id)
for item in response.json():
item['id'] = item['id_d']
item['text'] = item['titre']
data.append(item)
return {'data': data}
@endpoint(
post={
'description': _('Create a document for an entity'),
'request_body': {'schema': {'application/json': DOCUMENT_CREATION_SCHEMA}},
},
name='create-document',
parameters={
'entity_id': {'description': _('Entity ID'), 'example_value': '42'},
},
)
def create_document(self, request, entity_id, post_data):
file_data = post_data.pop('file', None)
file_field_name = post_data.pop('file_field_name', None)
# create document
response = self.call('entite/%s/document' % entity_id, 'post', params=post_data)
document_id = response.json()['id_d']
# update it with other attributes
response = self.call('entite/%s/document/%s' % (entity_id, document_id), 'patch', params=post_data)
# upload file if it's filled
if file_field_name and file_data:
self.upload_file(entity_id, document_id, file_field_name, file_data, **post_data)
return {'data': response.json()}
@endpoint(
post={
'description': _('Upload a file to a document'),
'request_body': {'schema': {'application/json': DOCUMENT_FILE_UPLOAD_SCHEMA}},
},
name='upload-document-file',
parameters={
'entity_id': {'description': _('Entity ID'), 'example_value': '42'},
'document_id': {'description': _('Document ID'), 'example_value': 'hDWtdSC'},
},
)
def upload_document_file(self, request, entity_id, document_id, post_data):
file_field_name = post_data.pop('file_field_name')
file_data = post_data.pop('file')
response = self.upload_file(entity_id, document_id, file_field_name, file_data, **post_data)
return {'data': response.json()}
@endpoint(
description=_('Get document\'s file'),
name='get-document-file',
parameters={
'entity_id': {'description': _('Entity ID'), 'example_value': '42'},
'document_id': {'description': _('Document ID'), 'example_value': 'hDWtdSC'},
'field_name': {
'description': _('Document file\'s field name'),
'example_value': 'document',
},
},
)
def get_document_file(self, request, entity_id, document_id, field_name):
document = self.call('entite/%s/document/%s/file/%s' % (entity_id, document_id, field_name))
response = HttpResponse(document.content, content_type=document.headers['Content-Type'])
response['Content-Disposition'] = document.headers['Content-disposition']
return response
@endpoint(
post={
'description': _('Run action on document'),
'request_body': {
'schema': {
'application/json': {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['action_name'],
'additionalProperties': False,
'properties': {
'action_name': {'type': 'string', 'description': _('Action name')},
},
}
}
},
},
name='run-document-action',
parameters={
'entity_id': {'description': _('Entity ID'), 'example_value': '42'},
'document_id': {'description': _('Document ID'), 'example_value': 'hDWtdSC'},
},
)
def run_document_action(self, request, entity_id, document_id, post_data):
response = self.call(
'entite/%s/document/%s/action/%s' % (entity_id, document_id, post_data['action_name']), 'post'
)
return {'data': response.json()}
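The `create-document` and `upload-document-file` endpoints above expect a `file` object matching `FILE_OBJECT_PROPERTIES`, with the content base64-encoded. A minimal client-side sketch of building such a payload (the helper name `make_file_object` and the sample values are mine, not part of the connector):

```python
import base64

def make_file_object(filename, raw_bytes, content_type=None):
    # Build a dict matching FILE_OBJECT_PROPERTIES: 'content' is base64-encoded text
    obj = {'filename': filename, 'content': base64.b64encode(raw_bytes).decode('ascii')}
    if content_type:
        obj['content_type'] = content_type
    return obj

# Hypothetical document-creation payload per DOCUMENT_CREATION_SCHEMA
payload = {
    'type': 'dossier-type',        # required; an actual Pastell document type
    'file_field_name': 'document', # field name the file is uploaded under
    'file': make_file_object('demand.pdf', b'%PDF-1.4 minimal', 'application/pdf'),
}

# upload_file() decodes 'content' back to bytes before posting it
assert base64.b64decode(payload['file']['content']) == b'%PDF-1.4 minimal'
```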


@@ -44,6 +44,7 @@ class AirQuality(BaseResource):
@endpoint(
pattern=r'^(?P<country>\w+)/(?P<city>\w+)/$',
example_pattern='{country}/{city}/',
+perm='OPEN',
parameters={
'country': {'description': _('Country Code'), 'example_value': 'fr'},
'city': {'description': _('City Name'), 'example_value': 'lyon'},


@@ -185,7 +185,6 @@ class APIEntreprise(BaseResource):
METHOD_PARAM = {'description': _('method used for user identity matching'), 'example_value': 'simple'}
@endpoint(
-perm='can_access',
pattern=r'(?P<association_id>\w+)/$',
example_pattern='{association_id}/',
description=_('Get association\'s documents'),
@@ -289,7 +288,6 @@ class APIEntreprise(BaseResource):
return {'data': document}
@endpoint(
-perm='can_access',
pattern=r'(?P<siren>\w+)/$',
example_pattern='{siren}/',
description=_('Get firm\'s data from Infogreffe'),
@@ -305,7 +303,6 @@ class APIEntreprise(BaseResource):
return {'data': raw_data['data']}
@endpoint(
-perm='can_access',
pattern=r'(?P<association_id>\w+)/$',
example_pattern='{association_id}/',
description=_('Get association\'s related informations'),
@@ -324,7 +321,6 @@ class APIEntreprise(BaseResource):
return {'data': res}
@endpoint(
-perm='can_access',
pattern=r'(?P<siren>\w+)/$',
example_pattern='{siren}/',
description=_('Get firm\'s related informations'),
@@ -385,7 +381,6 @@ class APIEntreprise(BaseResource):
return {'data': {'entreprise': data, 'etablissement_siege': siege_data}}
@endpoint(
-perm='can_access',
methods=['get'],
pattern=r'(?P<siret>\w+)/$',
example_pattern='{siret}/',
@@ -420,7 +415,6 @@ class APIEntreprise(BaseResource):
return {'data': res}
@endpoint(
-perm='can_access',
methods=['get'],
pattern=r'(?P<siret>\w+)/$',
example_pattern='{siret}/',
@@ -436,7 +430,6 @@ class APIEntreprise(BaseResource):
return self.get('v3/dgfip/etablissements/%s/chiffres_affaires' % siret, raw=True, **kwargs)
@endpoint(
-perm='can_access',
pattern=r'(?P<siren>\w+)/$',
description=_(
'Match firm\'s society representative against local FranceConnect identity information'


@@ -0,0 +1,56 @@
# Generated by Django 3.2.18 on 2023-04-14 17:35
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='Resource',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'api_url',
models.URLField(
default='https://gw.dgfip.finances.gouv.fr/impotparticulier/1.0',
max_length=256,
verbose_name='DGFIP API base URL',
),
),
('oauth_username', models.CharField(max_length=128, verbose_name='DGFIP API Username')),
('oauth_password', models.CharField(max_length=128, verbose_name='DGFIP API Password')),
(
'oauth_scopes',
models.CharField(max_length=128, verbose_name='DGFIP API Scopes', blank=True),
),
(
'id_teleservice',
models.TextField(max_length=128, verbose_name='DGFIP API ID_Teleservice', blank=True),
),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_api_impot_particulier_resource_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'API Impot Particulier',
},
),
]

View File

@@ -0,0 +1,22 @@
# Generated by Django 3.2.18 on 2023-05-25 09:49
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api_impot_particulier', '0001_initial'),
]
operations = [
migrations.AlterField(
model_name='resource',
name='id_teleservice',
field=models.TextField(max_length=128, verbose_name='DGFIP API ID_Teleservice'),
),
migrations.AlterField(
model_name='resource',
name='oauth_scopes',
field=models.CharField(max_length=128, verbose_name='DGFIP API Scopes'),
),
]


@@ -0,0 +1,306 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import datetime
import hashlib
import uuid
from urllib.parse import urljoin
import requests
from django.core.cache import cache
from django.db import models
from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource
from passerelle.utils.api import endpoint
from passerelle.utils.jsonresponse import APIError
from passerelle.utils.timeout import Timeout
class ServiceIsDown(APIError):
def __init__(self):
super().__init__(_('API Impot Particulier service is unavailable'))
def __str__(self):
if self.__context__:
return f'{super().__str__()}: {self.__context__}'
return super().__str__()
class Resource(BaseResource):
api_url = models.URLField(
_('DGFIP API base URL'),
max_length=256,
default='https://gw.dgfip.finances.gouv.fr/impotparticulier/1.0',
)
oauth_username = models.CharField(_('DGFIP API Username'), max_length=128)
oauth_password = models.CharField(_('DGFIP API Password'), max_length=128)
oauth_scopes = models.CharField(_('DGFIP API Scopes'), max_length=128)
id_teleservice = models.TextField(_('DGFIP API ID_Teleservice'), max_length=128)
log_requests_errors = False
requests_timeout = 30
requests_max_retries = {
'total': 3,
'backoff_factor': 0.5,
'allowed_methods': ['GET', 'POST'],
# retry after: 0.5, 1.5 and 3.5 seconds
'status_forcelist': [413, 429, 503, 504],
}
category = _('Business Process Connectors')
class Meta:
verbose_name = _('API Impot Particulier')
@classmethod
def parse_numero_fiscal(cls, value):
value = value.strip().replace(' ', '')
if not (value and value.isascii() and value.isdigit()):
raise APIError(_('invalid numero_fiscal'))
return value
@classmethod
def parse_annee_de_revenu(cls, value):
try:
value = int(value)
except (TypeError, ValueError):
raise APIError(_('invalid annee_de_revenu'))
today = datetime.date.today()
if not (0 < today.year - value < 10):
raise APIError(_('invalid annee_de_revenu'))
return value
@endpoint(
name='spi-situations-ir-assiettes-annrev',
description=_('Provides revenue tax situation for a specific year.'),
parameters={
'numero_fiscal': {
'description': _('Tax number of the person'),
},
'annee_de_revenu': {
'description': _('Income year'),
},
},
)
def spi_situations_ir_assiettes_annrev(self, request, numero_fiscal, annee_de_revenu):
numero_fiscal = self.parse_numero_fiscal(numero_fiscal)
annee_de_revenu = self.parse_annee_de_revenu(annee_de_revenu)
return {
'data': self.get_spi_situations_ir_assiettes_annrev(
numero_fiscal=numero_fiscal, annee_de_revenu=annee_de_revenu, timeout=Timeout(20)
)
}
def get_spi_situations_ir_assiettes_annrev(self, numero_fiscal, annee_de_revenu, timeout=None):
return self.call(
name='spi-situations-ir-assiettes-deuxans',
endpoint_template='spi/{spi}/situations/ir/assiettes/annrev/{annrev}',
timeout=timeout,
spi=numero_fiscal,
annrev=annee_de_revenu,
accept='application/prs.dgfip.part.situations.ir.assiettes.v1+json',
)
@endpoint(
name='spi-situations-th-assiettes-principale-annrev',
description=_('Provides housing tax situation for a specific year.'),
parameters={
'numero_fiscal': {
'description': _('Tax number of the person'),
},
'annee_de_revenu': {
'description': _('Income year'),
},
},
)
def spi_situations_th_assiettes_principale_annrev(self, request, numero_fiscal, annee_de_revenu):
numero_fiscal = self.parse_numero_fiscal(numero_fiscal)
annee_de_revenu = self.parse_annee_de_revenu(annee_de_revenu)
return {
'data': self.get_spi_situations_th_assiettes_principale_annrev(
numero_fiscal=numero_fiscal, annee_de_revenu=annee_de_revenu, timeout=Timeout(20)
)
}
def get_spi_situations_th_assiettes_principale_annrev(self, numero_fiscal, annee_de_revenu, timeout=None):
return self.call(
name='spi-situations-th-assiettes-principale-deuxans',
endpoint_template='spi/{spi}/situations/th/assiettes/principale/annrev/{annrev}',
timeout=timeout,
spi=numero_fiscal,
annrev=annee_de_revenu,
accept='application/prs.dgfip.part.situations.th.assiettes.v1+json',
)
def call(self, name, endpoint_template, timeout=None, **kwargs):
correlation_id = str(uuid.uuid4().hex)
kwargs_formatted = ', '.join(f'{key}={value}' for key, value in kwargs.items())
try:
data = self.get_tax_data(
session=self.requests,
base_url=self.api_url,
access_token=self._get_access_token(timeout=timeout),
correlation_id=correlation_id,
endpoint_template=endpoint_template,
id_teleservice=self.id_teleservice,
timeout=timeout,
**kwargs,
)
except ServiceIsDown as e:
self.logger.warning(
'%s(%s) failed: %s',
name,
kwargs_formatted,
e,
extra={
'correlation_id': correlation_id,
'id_teleservice': self.id_teleservice,
'kwargs': kwargs,
},
)
raise
else:
self.logger.warning(
'%s(%s) success',
name,
kwargs_formatted,
extra={
'data': data,
'correlation_id': correlation_id,
'id_teleservice': self.id_teleservice,
'kwargs': kwargs,
},
)
return data
@classmethod
def get_tax_data(
cls,
session,
base_url,
access_token,
correlation_id,
endpoint_template,
accept,
id_teleservice=None,
headers=None,
timeout=None,
**kwargs,
):
headers = {
**(headers or {}),
'Authorization': f'Bearer {access_token}',
'X-Correlation-ID': correlation_id,
'Accept': accept,
}
if id_teleservice:
headers['ID_Teleservice'] = id_teleservice
endpoint = endpoint_template.format(**kwargs)
if not base_url.endswith('/'):
base_url += '/'
url = urljoin(base_url, endpoint)
if timeout is not None:
timeout = float(timeout)
# api-impot-particulier error reporting is byzantine, some errors are
# accompanied by a 4xx code, some others with a 20x code, some have a
# JSON content, other are only identified by a codeapp header on
# the response
try:
response = session.get(url, headers=headers, timeout=timeout)
response.raise_for_status()
except requests.HTTPError:
try:
content = response.json()['erreur']
except (ValueError, KeyError):
try:
raise APIError(
'api-impot-particulier error', data={'codeapp': response.headers['codeapp']}
)
except KeyError:
pass
raise ServiceIsDown
raise APIError('api-impot-particulier-error', data=content)
except requests.RequestException:
raise ServiceIsDown
if response.status_code != 200:
try:
content = response.json()['erreur']
except (ValueError, KeyError):
try:
raise APIError(
'api-impot-particulier-error', data={'codeapp': response.headers['codeapp']}
)
except KeyError:
raise ServiceIsDown
raise APIError('api-impot-particulier error', data=content)
try:
response_data = response.json()
except ValueError:
raise ServiceIsDown
return response_data
def _get_access_token(self, timeout=None):
key = (
'dgfip-at-'
+ hashlib.sha256(
f'{self.oauth_username}-{self.oauth_password}-{self.api_url}'.encode()
).hexdigest()
)
access_token = cache.get(key)
if not access_token:
access_token = self.get_access_token(
session=self.requests,
base_url=self.api_url,
username=self.oauth_username,
password=self.oauth_password,
scope=self.oauth_scopes,
timeout=timeout,
)
cache.set(key, access_token, 300)
return access_token
@classmethod
def get_access_token(cls, session, base_url, username, password, scope, timeout=None):
data = {
'grant_type': 'client_credentials',
}
if scope:
data['scope'] = scope
url = urljoin(base_url, '/token')
if timeout is not None:
timeout = float(timeout)
try:
response = session.post(url, data=data, auth=(username, password), timeout=timeout)
response.raise_for_status()
except requests.RequestException:
raise ServiceIsDown
try:
response_data = response.json()
access_token = response_data['access_token']
except (ValueError, KeyError, TypeError):
raise ServiceIsDown
return access_token
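`_get_access_token()` above caches the OAuth token under a key derived from the credentials and the API URL, so changing any of them fetches a fresh token instead of reusing a stale cached one. A standalone sketch of that key derivation (the function name is mine; the connector computes this inline):

```python
import hashlib

def access_token_cache_key(username, password, api_url):
    # Same derivation as _get_access_token(): credentials and URL hashed together,
    # prefixed with 'dgfip-at-', so each credential set gets its own cache slot
    digest = hashlib.sha256(f'{username}-{password}-{api_url}'.encode()).hexdigest()
    return 'dgfip-at-' + digest

k1 = access_token_cache_key('user', 'secret', 'https://gw.example.invalid')
k2 = access_token_cache_key('user', 'other', 'https://gw.example.invalid')
assert k1.startswith('dgfip-at-')
assert k1 != k2  # a password change invalidates the cached token
```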


@@ -17,8 +17,9 @@ KNOWN_ERRORS = {
'Pas de droit sur la période demandée pour la prestation sélectionnée et le bénéficiaire choisi',
'Pas de droit sur la période demandée pour la prestation sélectionnée.',
"Votre quotient familial (Qf) sur cette période est non disponible. Pour plus d'information, contactez-nous.",
-# API particulier error message not from the source above
+# API particulier error messages not from the source above
'Les paramètres fournis sont incorrects ou ne correspondent pas à un avis',
+"L'identifiant indiqué n'existe pas, n'est pas connu ou ne comporte aucune information pour cet appel.",
},
400: {
'Absence de code confidentiel. Le document ne peut être édité.',
@@ -30,6 +31,8 @@ KNOWN_ERRORS = {
'Il existe des droits pour la prestation sélectionnée sur le dossier et/ou la période demandée',
'Il existe des droits pour la prestation sélectionnée sur le dossier et/ou la période demandée (après date du jour)',
'Lopérateurs téléphonique» ne propose pas de raccordement SMS avec un prestataire externe (raccordement avec un numéro court). ',
+# API particulier error messages not from the source above
+"La référence de l'avis n'est pas correctement formatée",
},
500: {
'Les informations souhaitées sont momentanément indisponibles. Merci de renouveler votre demande ultérieurement.',
@@ -39,7 +42,7 @@ KNOWN_ERRORS = {
"Votre demande n'a pu aboutir en raison d'un incident technique lié à l'appel au service IMC. Des paramètres manquent.",
(
"Votre demande n'a pu aboutir en raison d'un incident technique lié à l'appel au service IMC. "
-"La taille du message ne doit pas être supérieure à 160 caractères."
+'La taille du message ne doit pas être supérieure à 160 caractères.'
),
(
"Votre demande n'a pu aboutir en raison d'un incident technique lié à l'appel au service IMC. "
@@ -50,7 +53,7 @@ KNOWN_ERRORS = {
"Votre demande n'a pu aboutir en raison d'une erreur technique lié à l'appel au service IMC.",
(
"Votre demande na pu aboutir en raison d'un problème technique lié aux données entrantes du webservice. "
-"Merci de renouveler votre demande ultérieurement."
+'Merci de renouveler votre demande ultérieurement.'
),
},
}


@@ -0,0 +1,17 @@
# Generated by Django 3.2.18 on 2023-12-13 10:33
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api_particulier', '0006_api_key_length_1024'),
]
operations = [
migrations.AlterField(
model_name='apiparticulier',
name='api_key',
field=models.CharField(blank=True, default='', max_length=2048, verbose_name='API key'),
),
]


@@ -63,7 +63,7 @@ class APIParticulier(BaseResource):
choices=[(key, platform['label']) for key, platform in PLATFORMS.items()],
)
api_key = models.CharField(max_length=1024, default='', blank=True, verbose_name=_('API key'))
api_key = models.CharField(max_length=2048, default='', blank=True, verbose_name=_('API key'))
log_requests_errors = False
@@ -170,7 +170,6 @@ class APIParticulier(BaseResource):
self.save()
@endpoint(
perm='can_access',
description=_('Get scopes available'),
display_order=1,
)
@@ -184,7 +183,6 @@ class APIParticulier(BaseResource):
}
@endpoint(
perm='can_access',
show=False,
description=_('Get citizen\'s fiscal informations'),
parameters={
@@ -208,7 +206,6 @@ class APIParticulier(BaseResource):
@endpoint(
name='avis-imposition',
perm='can_access',
description=_('Get citizen\'s fiscal informations'),
parameters={
'numero_fiscal': {
@@ -303,7 +300,6 @@ class APIParticulier(BaseResource):
return data
@endpoint(
perm='can_access',
show=False,
description=_('Get family allowances recipient informations'),
parameters={
@@ -327,7 +323,6 @@ class APIParticulier(BaseResource):
@endpoint(
name='situation-familiale',
perm='can_access',
description=_('Get family allowances recipient informations'),
parameters={
'code_postal': {
@@ -363,6 +358,11 @@ class APIParticulier(BaseResource):
)
data['data']['numero_allocataire'] = numero_allocataire
data['data']['code_postal'] = code_postal
for kind in 'allocataires', 'enfants':
for person in data['data'].get(kind) or []:
if len(person.get('dateDeNaissance') or '') == 8:
birthdate = person['dateDeNaissance']
person['dateDeNaissance_iso'] = birthdate[4:] + '-' + birthdate[2:4] + '-' + birthdate[:2]
return data
category = _('Business Process Connectors')
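The loop above rewrites CAF `dateDeNaissance` values from `DDMMYYYY` into ISO 8601. The same slicing in isolation (the `to_iso` function name is illustrative, not from the source):

```python
def to_iso(birthdate):
    """Turn a DDMMYYYY date string into YYYY-MM-DD (ISO 8601)."""
    if len(birthdate) != 8:
        raise ValueError('expected an 8-digit DDMMYYYY string')
    # Same slicing as the connector: year, then month, then day.
    return birthdate[4:] + '-' + birthdate[2:4] + '-' + birthdate[:2]


print(to_iso('01072003'))  # 2003-07-01
```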

View File

@@ -14,6 +14,7 @@
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import json
import string
from urllib import parse as urlparse
@@ -32,6 +33,42 @@ from passerelle.utils.conversion import num2deg
from passerelle.utils.jsonresponse import APIError
from passerelle.utils.templates import render_to_string, validate_template
EDIT_ITEM_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'Item schema',
'description': '',
'type': 'object',
'properties': {
'geometry': {
'type': 'object',
'properties': {
'x': {'type': 'string'},
'y': {'type': 'string'},
},
},
'attributes': {'type': 'object'},
},
'required': ['attributes'],
}
EDIT_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'Edit payload',
'description': '',
'type': 'object',
'properties': {
'adds': {
'type': 'array',
'description': 'Adds object',
'items': EDIT_ITEM_SCHEMA,
},
'updates': {'type': 'array', 'description': 'Updates object', 'items': EDIT_ITEM_SCHEMA},
'deletes': {'type': 'array', 'description': 'Deletes object', 'items': {'type': 'string'}},
},
'minProperties': 1,
'unflatten': True,
}
class ArcGISError(APIError):
pass
@@ -177,7 +214,6 @@ class ArcGIS(BaseResource, HTTPResource):
@endpoint(
name='mapservice-query',
description=_('Map Service Query'),
perm='can_access',
parameters={
'folder': {
'description': _('Folder name'),
@@ -247,7 +283,6 @@ class ArcGIS(BaseResource, HTTPResource):
@endpoint(
name='featureservice-query',
description=_('Feature Service Query'),
perm='can_access',
parameters={
'folder': {
'description': _('Folder name'),
@@ -318,9 +353,49 @@ class ArcGIS(BaseResource, HTTPResource):
text_fieldname=text_fieldname,
)
@endpoint(
name='featureservice-applyedits',
description=_('Feature Service Apply Edits'),
parameters={
'folder': {
'description': _('Folder name'),
'example_value': 'Specialty',
},
'service': {
'description': _('Service name'),
'example_value': 'ESRI_StateCityHighway_USA',
},
'layer': {
'description': _('Layer or table name'),
'example_value': '1',
},
},
post={'request_body': {'schema': {'application/json': EDIT_SCHEMA}}},
)
def featureservice_applyedits(
self,
request,
post_data,
service,
layer='0',
folder='',
):
# implement "apply edits" feature service
# https://developers.arcgis.com/rest/services-reference/enterprise/apply-edits-feature-service-layer-.htm
uri = 'services/'
if folder:
uri += folder + '/'
uri = uri + service + '/FeatureServer/' + layer + '/applyEdits'
params = {'f': 'pjson'}
for key, value in post_data.items():
post_data[key] = json.dumps(value)
params.update(post_data)
return {'data': self.request(urlparse.urljoin(self.base_url, uri), data=params)}
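The applyEdits operation takes `adds`/`updates`/`deletes` as form fields whose values are JSON strings, which is why the endpoint above runs each top-level value through `json.dumps` before merging it into `params`. A standalone sketch of that serialization step (the payload values are made up for illustration):

```python
import json

# Hypothetical payload in the shape accepted by EDIT_SCHEMA.
post_data = {
    'updates': [{'attributes': {'OBJECTID': 1, 'STATE_NAME': 'California'}}],
    'deletes': ['2', '3'],
}

# Mirror the endpoint: serialize each top-level value, keep the pjson format flag.
params = {'f': 'pjson'}
params.update({key: json.dumps(value) for key, value in post_data.items()})

print(params['deletes'])  # ["2", "3"]
```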
@endpoint(
name='tile',
description=_('Tiles layer'),
perm='OPEN',
pattern=r'^(?P<layer>[\w/]+)/(?P<zoom>\d+)/(?P<tile_x>\d+)/(?P<tile_y>\d+)\.png$',
)
def tile(self, request, layer, zoom, tile_x, tile_y):
@@ -349,7 +424,6 @@ class ArcGIS(BaseResource, HTTPResource):
name='q',
description=_('Query'),
pattern=r'^(?P<query_slug>[\w:_-]+)/$',
perm='can_access',
show=False,
)
def q(self, request, query_slug, q=None, full=False, **kwargs):

View File

@@ -75,13 +75,21 @@ class ArpegeECP(BaseResource):
@endpoint(
name='api',
pattern=r'^users/(?P<nameid>\w+)/forms$',
perm='can_access',
description='Returns user forms',
example_pattern='users/{nameid}/forms',
description=_('Returns user forms'),
parameters={
'nameid': {'description': _('Publik ID'), 'example_value': 'nameid'},
'status': {'description': _('Demands status'), 'example_value': 'pending'},
},
)
def get_user_forms(self, request, nameid):
def get_user_forms(self, request, nameid, status='pending'):
access_token = self.get_access_token(nameid)
url = urlparse.urljoin(self.webservice_base_url, 'DemandesUsager')
params = {'scope': 'data_administratives'}
if status == 'pending':
params['EtatDemande'] = 'DEPOSEE, ENCRSINSTR' # value for filtering pending forms
elif status == 'done':
params['EtatDemande'] = 'TRAITEEPOS, TRAITEENEG, TRAITEE' # value for filtering done forms
auth = HawkAuth(self.hawk_auth_id, self.hawk_auth_key, ext=access_token)
try:
response = self.requests.get(url, params=params, auth=auth)
@@ -94,7 +102,7 @@ class ArpegeECP(BaseResource):
except ValueError:
raise APIError('No JSON content returned: %r' % response.content[:1000])
if not result.get('Data'):
raise APIError("%s (%s)" % (result.get('LibErreur'), result.get('CodErreur')))
raise APIError('%s (%s)' % (result.get('LibErreur'), result.get('CodErreur')))
for demand in result['Data']['results']:
try:
data_administratives = demand['data_administratives']
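The new `status` parameter is translated into Arpege's `EtatDemande` filter. The if/elif above is equivalent to a small mapping table; a sketch (the `build_params` helper is illustrative, not from the source):

```python
# Publik-side status -> Arpege EtatDemande filter values, as in get_user_forms.
STATUS_TO_ETAT = {
    'pending': 'DEPOSEE, ENCRSINSTR',
    'done': 'TRAITEEPOS, TRAITEENEG, TRAITEE',
}


def build_params(status='pending'):
    """Build the query parameters sent to the DemandesUsager webservice."""
    params = {'scope': 'data_administratives'}
    if status in STATUS_TO_ETAT:
        params['EtatDemande'] = STATUS_TO_ETAT[status]
    return params


print(build_params('done')['EtatDemande'])  # TRAITEEPOS, TRAITEENEG, TRAITEE
```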

View File

@@ -147,6 +147,8 @@ class ASTech(BaseResource, HTTPResource):
_category_ordering = [_('Parameters'), _('Rules'), _('Demand'), 'Tech & Debug']
log_requests_errors = False
class Meta:
verbose_name = _('AS-TECH')
@@ -159,7 +161,7 @@ class ASTech(BaseResource, HTTPResource):
try:
content = response.json()
except ValueError:
content = response.content[:1024]
content = '%r' % response.content[:1024]
raise APIError(
'AS-TECH response: %s %s' % (response.status_code, response.reason),
data={
@@ -220,10 +222,51 @@ class ASTech(BaseResource, HTTPResource):
json_response = self.call_json(method, url, params=params, **kwargs)
return json_response
def get_view_schema(self, view_code):
cache_key = 'astech-%s-%s-schema' % (self.id, view_code)
schema = cache.get(cache_key)
if schema:
return schema
endpoint = 'apicli/data/%s/columns' % view_code
columns = self.call(endpoint).get('columns', [])
schema = {}
for column in columns:
column.pop('des')
code = column.pop('code')
if column['type'] == 'NUM':
column['operator'] = '='
else:
column['operator'] = 'is_equal'
schema[code] = column
cache.set(cache_key, schema)
return schema
def build_view_filters(self, view_code, filters):
if not filters:
return []
schema = self.get_view_schema(view_code)
filters_expression = []
for expression in filters.split(';'):
try:
name, value = expression.split('=')
except ValueError:
continue
if value and schema[name]['length'] and len(value) > int(schema[name]['length']):
raise APIError(
_('Value of %s exceeds authorized length (%s)') % (name, schema[name]['length'])
)
filters_expression.append(
{
'field': name,
'type': schema[name]['type'],
'filter': {'value': value, 'operator': schema[name]['operator']},
}
)
return filters_expression
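`build_view_filters` turns a query string like `GENRE=SIT;SECTEUR=S1` into the filter expressions the AS-TECH data API expects, using the per-view column schema for each field's type and operator. A standalone sketch of the parsing, with a hypothetical two-column schema:

```python
def build_filters(filters, schema):
    """Parse 'NAME=value;NAME=value' into AS-TECH filter expressions."""
    expressions = []
    for expression in filters.split(';'):
        try:
            name, value = expression.split('=')
        except ValueError:
            continue  # silently skip malformed fragments, as the connector does
        expressions.append(
            {
                'field': name,
                'type': schema[name]['type'],
                'filter': {'value': value, 'operator': schema[name]['operator']},
            }
        )
    return expressions


# Hypothetical schema in the shape produced by get_view_schema().
schema = {
    'GENRE': {'type': 'CHAR', 'operator': 'is_equal', 'length': '10'},
    'SECTEUR': {'type': 'CHAR', 'operator': 'is_equal', 'length': '10'},
}
print(build_filters('GENRE=SIT;SECTEUR=S1', schema)[0]['filter']['value'])  # SIT
```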
@endpoint(
name='connections',
description=_('See all possible connections codes (see configuration)'),
perm='can_access',
display_category='Tech & Debug',
display_order=1,
)
@@ -233,7 +276,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='authorization',
description=_('See authorization tokens (testing only)'),
perm='can_access',
display_category='Tech & Debug',
display_order=2,
)
@@ -242,8 +284,7 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='services',
description=_("List authorized services for connected user"),
perm='can_access',
description=_('List authorized services for connected user'),
display_category=_('Rules'),
display_order=1,
)
@@ -256,7 +297,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='company',
description=_('Company code of the applicant'),
perm='can_access',
parameters={
'applicant': {
'description': _(
@@ -278,7 +318,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='companies',
description=_('List of authorized companies for an applicant'),
perm='can_access',
parameters={
'applicant': {
'description': _(
@@ -305,6 +344,8 @@ class ASTech(BaseResource, HTTPResource):
'designation': True,
}
companies = self.call('apicli/rule-call-by-alias/societes_demandeur/invoke', json=payload)
if not isinstance(companies, dict):
raise APIError('Invalid response: %s' % companies)
companies = [{'id': str(key), 'text': value} for key, value in companies.items()]
companies.sort(key=lambda item: item['id']) # "same as output" sort
return {'data': companies}
@@ -312,7 +353,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='labels',
description=_('List of predefined labels for a company'),
perm='can_access',
parameters={
'company': {
'description': _('Company code (societeDemandeur). If absent, use "company" endpoint result')
@@ -327,14 +367,15 @@ class ASTech(BaseResource, HTTPResource):
labels = self.call(
'apicli/rule-call-by-alias/libelles_predefinis/invoke', json={'societeDemandeur': company}
)
if not isinstance(labels, dict):
raise APIError('Invalid response: %s' % labels)
labels = [{'id': str(key), 'text': value} for key, value in labels.items()]
labels.sort(key=lambda item: item['id']) # "same as output" sort
return {'data': labels}
@endpoint(
name='parameter',
description=_("Value of a parameter"),
perm='can_access',
description=_('Value of a parameter'),
parameters={
'name': {'description': _('Name of the parameter'), 'example_value': 'LIBELDEMDEF'},
'company': {'description': _('Company code. If absent, use "company" endpoint result')},
@@ -354,7 +395,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='create-demand',
description=_('Create a demand'),
perm='can_access',
methods=['post'],
post={'request_body': {'schema': {'application/json': DEMAND_SCHEMA}}},
display_category=_('Demand'),
@@ -392,7 +432,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='add-document',
description=_('Add a document in a demand'),
perm='can_access',
methods=['post'],
post={'request_body': {'schema': {'application/json': ADD_DOCUMENT_SCHEMA}}},
display_category=_('Demand'),
@@ -414,7 +453,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='demand-position',
description=_('Get demand position'),
perm='can_access',
parameters={
'demand_id': {
'description': _('Demand id'),
@@ -436,7 +474,6 @@ class ASTech(BaseResource, HTTPResource):
@endpoint(
name='demand-all-positions',
description=_('List all demand possible positions'),
perm='can_access',
display_category=_('Demand'),
display_order=4,
)
@@ -447,3 +484,69 @@ class ASTech(BaseResource, HTTPResource):
position['id'] = position['position']
position['text'] = position['positionLib']
return {'data': positions}
@endpoint(
name='list-views',
display_order=1,
description=_('List available views'),
display_category=_('Referential'),
)
def list_views(self, request):
results = self.call('apicli/data/views')
astech_views = results.get('views', [])
for view in astech_views:
view['id'] = view['apivId']
view['text'] = view['apivNom']
return {'data': astech_views}
@endpoint(
name='get-view-columns',
display_order=2,
description=_('Get view columns'),
display_category=_('Referential'),
parameters={
'code': {
'description': _('View code'),
'example_value': 'ASTECH_BIENS',
},
},
)
def get_view_columns(self, request, code):
endpoint = 'apicli/data/%s/columns' % code
results = self.call(endpoint)
columns = results.get('columns', [])
for column in columns:
column['id'] = column['code']
column['text'] = column['des']
return {'data': columns}
@endpoint(
name='get-view-data',
display_order=3,
description=_('Get view data'),
display_category=_('Referential'),
datasource=True,
parameters={
'code': {
'description': _('View code'),
'example_value': 'ASTECH_BIENS',
},
'id_column': {'description': _('Name of column containing the id'), 'example_value': 'BIEN_ID'},
'text_column': {
'description': _('Name of column containing the label'),
'example_value': 'DESIGNATION',
},
'filters': {
'description': _('Semicolon separated filter expressions'),
'example_value': 'GENRE=SIT;SECTEUR=S1',
},
},
)
def get_view_data(self, request, code, id_column, text_column, filters=None):
endpoint = 'apicli/data/%s/results' % code
filters = self.build_view_filters(code, filters)
results = self.call(endpoint, json={'data': {'filters': filters}})
for result in results:
result['id'] = result[id_column]
result['text'] = result[text_column]
return {'data': results}
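The `Referential` endpoints all end by decorating each result row with the `id`/`text` keys Publik expects from a data source. That decoration step in isolation (`decorate` and the sample row are illustrative, not from the source):

```python
def decorate(rows, id_column, text_column):
    """Copy the configured columns into the id/text keys a data source needs."""
    for row in rows:
        row['id'] = row[id_column]
        row['text'] = row[text_column]
    return rows


rows = decorate([{'BIEN_ID': 7, 'DESIGNATION': 'Gymnase'}], 'BIEN_ID', 'DESIGNATION')
print(rows[0]['id'], rows[0]['text'])  # 7 Gymnase
```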

View File

@@ -282,7 +282,6 @@ class AstreREST(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
name='gf-documents-entites-getref',
parameters=GF_DOCUMENTS_ENTITIES_GETFREF_PARAMS,
)
@@ -294,7 +293,6 @@ class AstreREST(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
name='gf-documents-entites-list',
parameters=GF_DOCUMENTS_ENTITIES_LIST_PARAMS,
)
@@ -323,7 +321,6 @@ class AstreREST(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
name='gf-documents-entites-read',
parameters=GF_DOCUMENTS_ENTITIES_READ_PARAMS,
)
@@ -335,7 +332,6 @@ class AstreREST(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
name='gf-documents-entites-search',
parameters=GF_DOCUMENTS_ENTITIES_SEARCH_PARAMS,
)
@@ -364,7 +360,6 @@ class AstreREST(BaseResource):
@endpoint(
name='gf-documents-gedmanager-document-create',
description=_('Create document'),
perm='can_access',
post={
'request_body': {
'schema': {
@@ -387,7 +382,6 @@ class AstreREST(BaseResource):
@endpoint(
name='gf-documents-gedmanager-document-delete',
description=_('Delete document'),
perm='can_access',
post={
'request_body': {
'schema': {
@@ -416,7 +410,6 @@ class AstreREST(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
name='gf-documents-gedmanager-document-read',
parameters=GF_DOCUMENTS_DOCUMENT_READ_PARAMS,
)
@@ -433,7 +426,6 @@ class AstreREST(BaseResource):
@endpoint(
name='gf-documents-gedmanager-document-update',
description=_('Update document'),
perm='can_access',
post={
'request_body': {
'schema': {
@@ -453,7 +445,7 @@ class AstreREST(BaseResource):
)
}
@endpoint(methods=['get'], perm='can_access', name='gf-documents-referentiel-domainepj')
@endpoint(methods=['get'], name='gf-documents-referentiel-domainepj')
def gf_documents_referentiel_domainepj(self, request):
return {
'data': self._get_data_source(
@@ -461,7 +453,7 @@ class AstreREST(BaseResource):
)
}
@endpoint(methods=['get'], perm='can_access', name='gf-documents-referentiel-typedocument')
@endpoint(methods=['get'], name='gf-documents-referentiel-typedocument')
def gf_documents_referentiel_typedocument(self, request):
return {
'data': self._get_data_source(

View File

@@ -28,164 +28,164 @@ from passerelle.utils.jsonresponse import APIError
from passerelle.utils.validation import is_number
ASSOCIATION_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS association",
"description": "",
"type": "object",
"required": [
"Financier",
"CodeFamille",
"CatTiers",
"NomEnregistrement",
"StatutTiers",
"Type",
"AdresseTitre",
"AdresseIsAdresseDeCommande",
"AdresseIsAdresseDeFacturation",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS association',
'description': '',
'type': 'object',
'required': [
'Financier',
'CodeFamille',
'CatTiers',
'NomEnregistrement',
'StatutTiers',
'Type',
'AdresseTitre',
'AdresseIsAdresseDeCommande',
'AdresseIsAdresseDeFacturation',
],
"properties": {
"Financier": {"description": "financial association", "type": "string", "enum": ["true", "false"]},
"CodeFamille": {
"description": "association family code",
"type": "string",
'properties': {
'Financier': {'description': 'financial association', 'type': 'string', 'enum': ['true', 'false']},
'CodeFamille': {
'description': 'association family code',
'type': 'string',
},
"CatTiers": {
"description": "association category",
"type": "string",
'CatTiers': {
'description': 'association category',
'type': 'string',
},
"NomEnregistrement": {
"description": "association name",
"type": "string",
'NomEnregistrement': {
'description': 'association name',
'type': 'string',
},
"StatutTiers": {
"description": "association status",
"type": "string",
"enum": ["PROPOSE", "VALIDE", "REFUSE", "BLOQUE", "A COMPLETER"],
'StatutTiers': {
'description': 'association status',
'type': 'string',
'enum': ['PROPOSE', 'VALIDE', 'REFUSE', 'BLOQUE', 'A COMPLETER'],
},
"Type": {"description": "association type", "type": "string", "enum": ["D", "F", "*"]},
"NumeroSiret": {
"description": "SIREN number",
"type": "string",
'Type': {'description': 'association type', 'type': 'string', 'enum': ['D', 'F', '*']},
'NumeroSiret': {
'description': 'SIREN number',
'type': 'string',
},
"NumeroSiretFin": {
"description": "NIC number",
"type": "string",
'NumeroSiretFin': {
'description': 'NIC number',
'type': 'string',
},
"AdresseTitre": {
"type": "string",
'AdresseTitre': {
'type': 'string',
},
"AdresseIsAdresseDeCommande": {"type": "string", "enum": ["true", "false"]},
"AdresseIsAdresseDeFacturation": {"type": "string", "enum": ["true", "false"]},
"organism": {
"description": _('Organisme'),
"type": "string",
'AdresseIsAdresseDeCommande': {'type': 'string', 'enum': ['true', 'false']},
'AdresseIsAdresseDeFacturation': {'type': 'string', 'enum': ['true', 'false']},
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
CONTACT_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS contact",
"description": "",
"type": "object",
"required": [
"CodeContact",
"CodeTitreCivilite",
"Nom",
"AdresseDestinataire",
"CodePostal",
"Ville",
"EncodeKeyStatut",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS contact',
'description': '',
'type': 'object',
'required': [
'CodeContact',
'CodeTitreCivilite',
'Nom',
'AdresseDestinataire',
'CodePostal',
'Ville',
'EncodeKeyStatut',
],
"properties": {
"CodeContact": {
"type": "string",
'properties': {
'CodeContact': {
'type': 'string',
},
"CodeTitreCivilite": {
"type": "string",
'CodeTitreCivilite': {
'type': 'string',
},
"Nom": {
"type": "string",
'Nom': {
'type': 'string',
},
"AdresseDestinataire": {
"type": "string",
'AdresseDestinataire': {
'type': 'string',
},
"CodePostal": {
"type": "string",
'CodePostal': {
'type': 'string',
},
"Ville": {
"type": "string",
'Ville': {
'type': 'string',
},
"EncodeKeyStatut": {
"type": "string",
'EncodeKeyStatut': {
'type': 'string',
},
"organism": {
"description": _('Organisme'),
"type": "string",
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
DOCUMENT_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS association",
"description": "",
"type": "object",
"required": [
"Sujet",
"Entite",
"CodType",
"Type",
"hdnCodeTrt",
"EncodeKeyEntite",
"CodeDomaine",
"CodDom",
"document",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS association',
'description': '',
'type': 'object',
'required': [
'Sujet',
'Entite',
'CodType',
'Type',
'hdnCodeTrt',
'EncodeKeyEntite',
'CodeDomaine',
'CodDom',
'document',
],
"properties": {
"Sujet": {
"type": "string",
'properties': {
'Sujet': {
'type': 'string',
},
"Entite": {
"type": "string",
'Entite': {
'type': 'string',
},
"CodType": {
"type": "string",
'CodType': {
'type': 'string',
},
"Type": {
"type": "string",
'Type': {
'type': 'string',
},
"hdnCodeTrt": {
"type": "string",
'hdnCodeTrt': {
'type': 'string',
},
"EncodeKeyEntite": {
"type": "string",
'EncodeKeyEntite': {
'type': 'string',
},
"CodeDomaine": {
"type": "string",
'CodeDomaine': {
'type': 'string',
},
"CodDom": {
"type": "string",
'CodDom': {
'type': 'string',
},
"document": {
"type": "object",
"required": ['filename', 'content_type', 'content'],
'document': {
'type': 'object',
'required': ['filename', 'content_type', 'content'],
'properties': {
'filename': {
'type': 'string',
@@ -198,236 +198,236 @@ DOCUMENT_SCHEMA = {
},
},
},
"organism": {
"description": _('Organisme'),
"type": "string",
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
GRANT_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS grant",
"description": "",
"type": "object",
"required": [
"Libelle",
"LibelleCourt",
"ModGestion",
"TypeAide",
"Sens",
"CodeTiersDem",
"CodeServiceGestionnaire",
"CodeServiceUtilisateur",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS grant',
'description': '',
'type': 'object',
'required': [
'Libelle',
'LibelleCourt',
'ModGestion',
'TypeAide',
'Sens',
'CodeTiersDem',
'CodeServiceGestionnaire',
'CodeServiceUtilisateur',
],
"properties": {
"Libelle": {
"type": "string",
'properties': {
'Libelle': {
'type': 'string',
},
"LibelleCourt": {
"type": "string",
'LibelleCourt': {
'type': 'string',
},
"ModGestion": {"type": "string", "enum": ["1", "2", "3", "4"]},
"TypeAide": {
"type": "string",
'ModGestion': {'type': 'string', 'enum': ['1', '2', '3', '4']},
'TypeAide': {
'type': 'string',
},
"Sens": {
"type": "string",
'Sens': {
'type': 'string',
},
"CodeTiersDem": {
"type": "string",
'CodeTiersDem': {
'type': 'string',
},
"CodeServiceGestionnaire": {
"type": "string",
'CodeServiceGestionnaire': {
'type': 'string',
},
"CodeServiceUtilisateur": {
"type": "string",
'CodeServiceUtilisateur': {
'type': 'string',
},
"organism": {
"description": _('Organisme'),
"type": "string",
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
INDANA_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS INDANA indicator",
"description": "",
"type": "object",
"required": ["CodeDossier", "CodeInd_1", "AnneeInd_1", "ValInd_1"],
"properties": {
"CodeDossier": {
"type": "string",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS INDANA indicator',
'description': '',
'type': 'object',
'required': ['CodeDossier', 'CodeInd_1', 'AnneeInd_1', 'ValInd_1'],
'properties': {
'CodeDossier': {
'type': 'string',
},
"CodeInd_1": {
"type": "string",
'CodeInd_1': {
'type': 'string',
},
"AnneeInd_1": {
"type": "string",
'AnneeInd_1': {
'type': 'string',
},
"ValInd_1": {
"type": "string",
'ValInd_1': {
'type': 'string',
},
"IndAide": {
"type": "string",
'IndAide': {
'type': 'string',
},
"organism": {
"description": _('Organisme'),
"type": "string",
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
INDANA_KEY_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS INDANA indicator key",
"description": "",
"type": "object",
"required": ["CodeDossier", "CodeInd_1", "AnneeInd_1"],
"properties": {
"CodeDossier": {
"type": "string",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS INDANA indicator key',
'description': '',
'type': 'object',
'required': ['CodeDossier', 'CodeInd_1', 'AnneeInd_1'],
'properties': {
'CodeDossier': {
'type': 'string',
},
"CodeInd_1": {
"type": "string",
'CodeInd_1': {
'type': 'string',
},
"AnneeInd_1": {
"type": "string",
'AnneeInd_1': {
'type': 'string',
},
"organism": {
"description": _('Organisme'),
"type": "string",
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
TIERS_RIB_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS TiersRib",
"description": "TiersRib",
"type": "object",
"required": [
"CodeTiers",
"CodePaiement",
"LibelleCourt",
"NumeroIban",
"CleIban",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS TiersRib',
'description': 'TiersRib',
'type': 'object',
'required': [
'CodeTiers',
'CodePaiement',
'LibelleCourt',
'NumeroIban',
'CleIban',
'CodeBic',
"CodeDomiciliation",
"CodeStatut",
"CodeDevise",
"CodeIso2Pays",
"LibelleCompteEtranger",
'CodeDomiciliation',
'CodeStatut',
'CodeDevise',
'CodeIso2Pays',
'LibelleCompteEtranger',
],
"properties": {
"CodeDevise": {"type": "string"},
"CodeDomiciliation": {"type": "string"},
"CodeIso2Pays": {"type": "string"},
"CodePaiement": {"type": "string"},
"CodeStatut": {
"type": "string",
"enum": ["PROPOSE", "VALIDE", "REFUSE", "A COMPLETER", "BLOQUE", "EN MODIFICATION"],
'properties': {
'CodeDevise': {'type': 'string'},
'CodeDomiciliation': {'type': 'string'},
'CodeIso2Pays': {'type': 'string'},
'CodePaiement': {'type': 'string'},
'CodeStatut': {
'type': 'string',
'enum': ['PROPOSE', 'VALIDE', 'REFUSE', 'A COMPLETER', 'BLOQUE', 'EN MODIFICATION'],
},
"CodeTiers": {"type": "string"},
"IndicateurRibDefaut": {"type": "string"},
"LibelleCompteEtranger": {"type": "string"},
"LibelleCourt": {"type": "string"},
"NumeroIban": {"type": "string"},
"CleIban": {"type": "string"},
"CodeBic": {"type": "string"},
"IdRib": {"type": "string"},
"organism": {
"description": _('Organisme'),
"type": "string",
'CodeTiers': {'type': 'string'},
'IndicateurRibDefaut': {'type': 'string'},
'LibelleCompteEtranger': {'type': 'string'},
'LibelleCourt': {'type': 'string'},
'NumeroIban': {'type': 'string'},
'CleIban': {'type': 'string'},
'CodeBic': {'type': 'string'},
'IdRib': {'type': 'string'},
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
TIERS_RIB_UPDATE_SCHEMA = {
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "AstreGS TiersRib",
"description": "TiersRib Update",
"type": "object",
"required": [
"CodePaiement",
"LibelleCourt",
"NumeroIban",
"CleIban",
"CodeBic",
"CodeDomiciliation",
"CodeStatut",
"CodeDevise",
"CodeIso2Pays",
"LibelleCompteEtranger",
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'AstreGS TiersRib',
'description': 'TiersRib Update',
'type': 'object',
'required': [
'CodePaiement',
'LibelleCourt',
'NumeroIban',
'CleIban',
'CodeBic',
'CodeDomiciliation',
'CodeStatut',
'CodeDevise',
'CodeIso2Pays',
'LibelleCompteEtranger',
],
"properties": {
"CodeDevise": {"type": "string"},
"CodeDomiciliation": {"type": "string"},
"CodeIso2Pays": {"type": "string"},
"CodePaiement": {"type": "string"},
"CodeStatut": {
"type": "string",
"enum": ["PROPOSE", "VALIDE", "REFUSE", "A COMPLETER", "BLOQUE", "EN MODIFICATION"],
'properties': {
'CodeDevise': {'type': 'string'},
'CodeDomiciliation': {'type': 'string'},
'CodeIso2Pays': {'type': 'string'},
'CodePaiement': {'type': 'string'},
'CodeStatut': {
'type': 'string',
'enum': ['PROPOSE', 'VALIDE', 'REFUSE', 'A COMPLETER', 'BLOQUE', 'EN MODIFICATION'],
},
"IndicateurRibDefaut": {"type": "string"},
"LibelleCompteEtranger": {"type": "string"},
"LibelleCourt": {"type": "string"},
"NumeroIban": {"type": "string"},
"CleIban": {"type": "string"},
"CodeBic": {"type": "string"},
"organism": {
"description": _('Organisme'),
"type": "string",
'IndicateurRibDefaut': {'type': 'string'},
'LibelleCompteEtranger': {'type': 'string'},
'LibelleCourt': {'type': 'string'},
'NumeroIban': {'type': 'string'},
'CleIban': {'type': 'string'},
'CodeBic': {'type': 'string'},
'organism': {
'description': _('Organisme'),
'type': 'string',
},
"budget": {
"description": _('Budget'),
"type": "string",
'budget': {
'description': _('Budget'),
'type': 'string',
},
"exercice": {
"description": _('Exercice'),
"type": "string",
'exercice': {
'description': _('Exercice'),
'type': 'string',
},
},
}
@@ -492,7 +492,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('Find associations by SIREN number'),
perm='can_access',
parameters={
'siren': {'description': _('SIREN Number'), 'example_value': '77567227216096'},
'organism': {'description': _('Organisme'), 'example_value': 'NOMDEVILLE'},
@@ -517,7 +516,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('Check if association exists by its SIRET number'),
name='check-association-by-siret',
perm='can_access',
parameters={
'siret': {'description': _('SIRET Number'), 'example_value': '7756722721609600014'},
'organism': {'description': _('Organisme'), 'example_value': 'NOMDEVILLE'},
@@ -535,7 +533,6 @@ class AstreGS(BaseResource):
@endpoint(
name='get-association-link-means',
description=_('Get association linking means'),
perm='can_access',
parameters={
'association_id': {'description': _('Association ID'), 'example_value': '42435'},
'NameID': {'description': _('Publik ID'), 'example_value': 'xyz24d934'},
@@ -585,7 +582,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('Create link between user and association'),
perm='can_access',
parameters={
'NameID': {
'description': _('Publik NameID'),
@@ -619,7 +615,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('Remove link between user and association'),
perm='can_access',
parameters={
'NameID': {'description': _('Publik NameID'), 'example_value': 'xyz24d934'},
'association_id': {'description': _('Association ID'), 'example_value': '12345'},
@@ -635,7 +630,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('List user links'),
perm='can_access',
parameters={
'NameID': {
'description': _('Publik NameID'),
@@ -665,7 +659,6 @@ class AstreGS(BaseResource):
return {'data': data}
@endpoint(
perm='can_access',
name='create-association',
post={
'description': _('Creates an association'),
@@ -680,7 +673,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('Get association informations'),
name='get-association-by-id',
perm='can_access',
parameters={
'association_id': {'description': _('Association ID'), 'example_value': '42435'},
'NameID': {'description': _('Publik ID'), 'example_value': 'xyz24d934'},
@@ -701,7 +693,6 @@ class AstreGS(BaseResource):
@endpoint(
name='get-contact',
perm='can_access',
description=_('Get contact details'),
parameters={
'contact_id': {
@ -720,7 +711,6 @@ class AstreGS(BaseResource):
@endpoint(
name='create-contact',
perm='can_access',
post={
'description': _('Create contact'),
'request_body': {'schema': {'application/json': CONTACT_SCHEMA}},
@ -737,7 +727,6 @@ class AstreGS(BaseResource):
@endpoint(
description=_('Delete contact'),
name='delete-contact',
perm='can_access',
parameters={
'contact_id': {'description': _('Contact ID'), 'example_value': '4242'},
'organism': {'description': _('Organisme'), 'example_value': 'NOMDEVILLE'},
@ -752,7 +741,6 @@ class AstreGS(BaseResource):
@endpoint(
name='create-document',
perm='can_access',
post={
'description': _('Create document'),
'request_body': {'schema': {'application/json': DOCUMENT_SCHEMA}},
@ -769,7 +757,6 @@ class AstreGS(BaseResource):
@endpoint(
name='create-grant-demand',
perm='can_access',
post={
'description': _('Create grant demand'),
'request_body': {'schema': {'application/json': GRANT_SCHEMA}},
@ -782,7 +769,6 @@ class AstreGS(BaseResource):
@endpoint(
name='create-indana-indicator',
perm='can_access',
post={
'description': _('Create indana indicator'),
'request_body': {'schema': {'application/json': INDANA_SCHEMA}},
@ -795,7 +781,6 @@ class AstreGS(BaseResource):
@endpoint(
name='update-indana-indicator',
perm='can_access',
post={
'description': _('Update indana indicator'),
'request_body': {'schema': {'application/json': INDANA_SCHEMA}},
@ -808,7 +793,6 @@ class AstreGS(BaseResource):
@endpoint(
name='delete-indana-indicator',
perm='can_access',
post={
'description': _('Delete indana indicator'),
'request_body': {'schema': {'application/json': INDANA_KEY_SCHEMA}},
@ -821,7 +805,6 @@ class AstreGS(BaseResource):
@endpoint(
name='create-tiers-rib',
perm='can_access',
post={
'description': _('Create RIB'),
'request_body': {'schema': {'application/json': TIERS_RIB_SCHEMA}},
@ -834,7 +817,6 @@ class AstreGS(BaseResource):
@endpoint(
name='get-tiers-rib',
perm='can_access',
description=_('Get RIB'),
parameters={
'CodeTiers': {'example_value': '42435'},
@ -852,7 +834,6 @@ class AstreGS(BaseResource):
@endpoint(
name='update-tiers-rib',
perm='can_access',
post={
'description': _('Update RIB'),
'request_body': {'schema': {'application/json': TIERS_RIB_UPDATE_SCHEMA}},
@ -871,7 +852,6 @@ class AstreGS(BaseResource):
@endpoint(
name='delete-tiers-rib',
perm='can_access',
description=_('Delete RIB'),
parameters={
'CodeTiers': {'example_value': '42435'},
@ -889,7 +869,6 @@ class AstreGS(BaseResource):
@endpoint(
name='find-tiers-by-rib',
perm='can_access',
description=_('Find person by RIB'),
parameters={
'banque': {'example_value': '30001'},
@ -912,13 +891,12 @@ class AstreGS(BaseResource):
for item in r.liste.EnregRechercheTiersReturn:
tiers_data = serialize_object(item)
tiers_data['id'] = tiers_data['N']
tiers_data['text'] = '%{Nom_Enregistrement}s (%{N}s)'.format(**tiers_data)
tiers_data['text'] = '{Nom_Enregistrement} ({N})'.format(**tiers_data)
data.append(tiers_data)
return {'data': data}
@endpoint(
name='get-dossier',
perm='can_access',
description=_('Get Dossier'),
parameters={
'CodeDossier': {'example_value': '2021-0004933'},

View File

@ -23,6 +23,7 @@ from django.db import models
from django.utils import dateformat, dateparse
from django.utils.translation import gettext_lazy as _
from zeep import helpers
from zeep.exceptions import Fault
from passerelle.base.models import BaseResource
from passerelle.utils.api import endpoint
@ -80,24 +81,23 @@ class ATALConnector(BaseResource):
"""
self._soap_client(wsdl='DemandeService')
@endpoint(methods=['get'], perm='can_access', name='get-thematique')
@endpoint(methods=['get'], name='get-thematique')
def get_thematique(self, request):
return self._xml_ref('DemandeService', 'getThematiqueATAL', 'thematiques')
@endpoint(methods=['get'], perm='can_access', name='get-type-activite')
@endpoint(methods=['get'], name='get-type-activite')
def get_type_activite(self, request):
return self._basic_ref('VilleAgileService', 'getTypeActivite')
@endpoint(methods=['get'], perm='can_access', name='get-type-de-voie')
@endpoint(methods=['get'], name='get-type-de-voie')
def get_type_de_voie(self, request):
return self._basic_ref('VilleAgileService', 'getTypeDeVoie')
@endpoint(methods=['get'], perm='can_access', name='get-types-equipement')
@endpoint(methods=['get'], name='get-types-equipement')
def get_types_equipement(self, request):
return self._xml_ref('VilleAgileService', 'getTypesEquipement', 'types')
@endpoint(
perm='can_access',
name='insert-action-comment',
post={
'description': _('Insert action comment'),
@ -114,7 +114,6 @@ class ATALConnector(BaseResource):
return process_response(demande_number)
@endpoint(
perm='can_access',
name='insert-demande-complet-by-type',
post={
'description': _('Insert demande complet by type'),
@ -171,7 +170,6 @@ class ATALConnector(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
example_pattern='{demande_number}/',
pattern=r'^(?P<demande_number>\w+)/$',
name='retrieve-details-demande',
@ -185,7 +183,6 @@ class ATALConnector(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
example_pattern='{demande_number}/',
pattern=r'^(?P<demande_number>\w+)/$',
name='retrieve-etat-travaux',
@ -197,7 +194,6 @@ class ATALConnector(BaseResource):
@endpoint(
methods=['get'],
perm='can_access',
example_pattern='{demande_number}/',
pattern=r'^(?P<demande_number>\w+)/$',
parameters={
@ -265,7 +261,6 @@ class ATALConnector(BaseResource):
return {'data': data}
@endpoint(
perm='can_access',
post={
'description': _('Upload a file'),
'request_body': {'schema': {'application/json': schemas.UPLOAD}},
@ -286,12 +281,15 @@ class ATALConnector(BaseResource):
'numeroDemande': post_data['numero_demande'],
'nomFichier': filename,
}
self._soap_call(wsdl='ChargementPiecesJointesService', method='upload', **data)
try:
self._soap_call(wsdl='ChargementPiecesJointesService', method='upload', **data)
except Fault as e:
raise APIError(str(e))
return {}
@endpoint(
methods=['get'],
perm='can_access',
example_pattern='{demande_number}/',
pattern=r'^(?P<demande_number>\w+)/$',
name='new-comments',

View File

View File

@ -0,0 +1,59 @@
# Generated by Django 3.2.18 on 2023-06-26 15:06
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = []
operations = [
migrations.CreateModel(
name='AtalREST',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'basic_auth_username',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication username'
),
),
(
'basic_auth_password',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication password'
),
),
(
'client_certificate',
models.FileField(
blank=True, null=True, upload_to='', verbose_name='TLS client certificate'
),
),
(
'trusted_certificate_authorities',
models.FileField(blank=True, null=True, upload_to='', verbose_name='TLS trusted CAs'),
),
(
'verify_cert',
models.BooleanField(blank=True, default=True, verbose_name='TLS verify certificates'),
),
(
'http_proxy',
models.CharField(blank=True, max_length=128, verbose_name='HTTP and HTTPS proxy'),
),
('base_url', models.URLField(verbose_name='API URL')),
('api_key', models.CharField(max_length=1024, verbose_name='API key')),
],
options={
'verbose_name': 'Atal REST',
},
),
]

View File

@ -0,0 +1,539 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2023 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import base64
import binascii
import collections
import io
import json
import urllib
import requests
from django.db import models
from django.utils import dateparse
from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource, HTTPResource
from passerelle.utils.api import endpoint
from passerelle.utils.jsonresponse import APIError
FILE_OBJECT = {
'type': 'object',
'description': 'File object',
'required': ['content'],
'properties': {
'filename': {
'type': 'string',
'description': 'Filename',
},
'content': {
'type': 'string',
'description': 'Content',
},
'content_type': {
'type': 'string',
'description': 'Content type',
},
},
}
SINGLE_ATTACHMENT_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'additionalProperties': False,
'properties': {
'file': {
'oneOf': [
FILE_OBJECT,
{'type': 'string', 'description': 'empty file, do not consider', 'pattern': r'^$'},
{'type': 'null', 'description': 'empty file, do not consider'},
]
}
},
'required': ['file'],
}
ATTACHMENTS_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'additionalProperties': False,
'properties': {
'files': {
'type': 'array',
'items': {
'oneOf': [
FILE_OBJECT,
{'type': 'string', 'description': 'empty file, do not consider', 'pattern': r'^$'},
{'type': 'null', 'description': 'empty file, do not consider'},
]
},
},
'worksrequests_ids': {'type': 'array', 'items': {'type': 'string'}},
},
'required': ['files', 'worksrequests_ids'],
'unflatten': True,
}
WORKSREQUESTS_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': collections.OrderedDict(
{
'activity_nature_id': {'type': 'string'},
'comments': {'type': 'string'},
'contact': {
'type': 'object',
'properties': {
'adress1': {'type': 'string'},
'city': {'type': 'string'},
'email': {'type': 'string'},
'first_name': {'type': 'string'},
'last_name': {'type': 'string'},
'mobile': {'type': 'string'},
'phone': {'type': 'string'},
'zipcode': {'type': 'string'},
},
},
'description': {'type': 'string'},
'desired_date': {'type': 'string', 'description': 'format YYYY-MM-DD'},
'keywords': {'type': 'string'},
'latitude': {
'oneOf': [
{'type': 'number'},
{'type': 'string'},
]
},
'localization': {'type': 'string'},
'longitude': {
'oneOf': [
{'type': 'number'},
{'type': 'string'},
]
},
'object': {'type': 'string'},
'operator': {'type': 'string'},
'patrimony_id': {'type': 'string'},
'priority_id': {'type': 'string'},
'recipient_id': {'type': 'string'},
'request_date': {
'type': 'string',
'description': 'format YYYY-MM-DD',
},
'requester_id': {'type': 'string'},
'requesting_department_id': {'type': 'string'},
'request_type': {'type': 'string'},
'suggested_recipient_id': {'type': 'string'},
'thematic_ids': {'type': 'array', 'items': {'type': 'string'}},
}
),
'required': ['object', 'recipient_id', 'requester_id', 'requesting_department_id'],
'unflatten': True,
}
STATUS_MAP = {
0: 'En attente',
1: 'En analyse',
2: 'Acceptée',
3: 'Refusée',
4: 'Annulée',
5: 'Ajournée',
6: 'Brouillon',
7: 'Redirigée',
8: 'Prise en compte',
9: 'Clôturée',
13: 'Archivée',
14: 'À spécifier',
15: 'À valider',
}
INTERVENTION_STATUS_MAP = {
1: 'Pas commencé',
2: 'En cours',
4: 'Terminé',
5: 'Fermé',
}
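Both maps are used the same way further down: look up a human-readable label for the numeric state returned by ATAL, falling back to an empty string for unknown or missing states. A small illustration of that lookup:

```python
INTERVENTION_LABELS = {1: 'Pas commencé', 2: 'En cours', 4: 'Terminé', 5: 'Fermé'}


def add_state_label(resp_data):
    # unknown or missing WorkState values yield an empty label, not a KeyError
    resp_data['WorkStateLabel'] = INTERVENTION_LABELS.get(resp_data.get('WorkState'), '')
    return resp_data
```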
def to_ds(record):
record['id'] = record['Id']
record['text'] = record['Name']
return record
class AtalREST(BaseResource, HTTPResource):
base_url = models.URLField(_('API URL'))
api_key = models.CharField(max_length=1024, verbose_name=_('API key'))
category = _('Business Process Connectors')
class Meta:
verbose_name = _('Atal REST')
def _call(
self, path, method='get', params=None, data=None, json_data=None, files=None, return_response=False
):
url = urllib.parse.urljoin(self.base_url, path)
kwargs = {}
kwargs['headers'] = {'X-API-Key': self.api_key}
if params:
kwargs['params'] = params
if method == 'post':
if not json_data:
json_data = {}
kwargs['json'] = json_data
if files:
kwargs['files'] = files
if data:
kwargs['data'] = data
try:
resp = self.requests.request(url=url, method=method, **kwargs)
except (requests.Timeout, requests.RequestException) as e:
raise APIError(str(e))
try:
resp.raise_for_status()
except requests.RequestException as main_exc:
try:
err_data = resp.json()
except (json.JSONDecodeError, requests.exceptions.RequestException):
err_data = {'response_text': resp.text}
raise APIError(str(main_exc), data=err_data)
if return_response:
return resp
try:
return resp.json()
except (json.JSONDecodeError, requests.exceptions.RequestException) as e:
raise APIError(str(e))
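The error branch in `_call` prefers the JSON body of a failed response and falls back to the raw text when the body is not JSON. The same fallback in isolation (`FakeResponse` is a test double for this sketch, not part of the connector):

```python
import json


class FakeResponse:
    """Test double exposing the subset of requests.Response used here."""

    def __init__(self, text):
        self.text = text

    def json(self):
        return json.loads(self.text)


def error_payload(resp):
    # prefer structured error details, else keep the raw body for debugging
    try:
        return resp.json()
    except json.JSONDecodeError:
        return {'response_text': resp.text}
```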
def check_status(self):
return self._call('api/Test', return_response=True)
@endpoint(
methods=['get'],
name='thirdparties-requesting-departments',
description=_('Get the third parties requesting departments referential'),
parameters={
'request_type': {
'example_value': '1001',
}
},
)
def thirdparties_requesting_departments(self, request, request_type):
return {
'data': [
to_ds(record)
for record in self._call(
'api/ThirdParties/RequestingDepartments', params={'RequestType': request_type}
)
]
}
@endpoint(
methods=['get'],
description=_('Get the users referential'),
)
def users(self, request):
return {'data': [to_ds(record) for record in self._call('api/Users')]}
@endpoint(
description=_('Create a works request'),
post={
'request_body': {
'schema': {
'application/json': WORKSREQUESTS_SCHEMA,
}
},
'input_example': {
'activity_nature_id': '0',
'comments': 'some comment',
'contact/adress1': '1 rue des cinq diamants',
'contact/city': 'paris',
'contact/email': 'foo@bar.invalid',
'contact/first_name': 'john',
'contact/last_name': 'doe',
'contact/mobile': '0606060606',
'contact/phone': '0101010101',
'contact/zipcode': '75013',
'description': 'some description',
'desired_date': '2023-06-28',
'keywords': 'foo bar',
'latitude': '0',
'localization': 'somewhere',
'longitude': '0',
'object': 'some object',
'operator': 'some operator',
'patrimony_id': '0',
'priority_id': '0',
'recipient_id': '0',
'request_date': '2023-06-27',
'requester_id': '0',
'requesting_department_id': '0',
'request_type': '0',
'suggested_recipient_id': {'type': 'string'},
'thematic_ids/0': '1',
'thematic_ids/1': '2',
},
},
)
def worksrequests(self, request, post_data):
data = {}
int_params = {
'activity_nature_id': 'ActivityNatureId',
'patrimony_id': 'PatrimonyId',
'priority_id': 'PriorityId',
'recipient_id': 'RecipientId',
'requester_id': 'RequesterId',
'requesting_department_id': 'RequestingDepartmentId',
'request_type': 'RequestType',
'suggested_recipient_id': 'SuggestedRecipientId',
}
for param, atal_param in int_params.items():
if param in post_data:
try:
data[atal_param] = int(post_data[param])
except ValueError:
raise APIError('%s must be an integer' % param)
float_params = {
'latitude': 'Latitude',
'longitude': 'Longitude',
}
for param, atal_param in float_params.items():
param_value = post_data.get(param, '')
if param_value:
if isinstance(param_value, str):
param_value = param_value.replace(',', '.')
try:
data[atal_param] = float(param_value)
except ValueError:
raise APIError('%s must be a float' % param)
if 'thematic_ids' in post_data:
data['ThematicIds'] = []
for thematic_id in post_data['thematic_ids']:
try:
data['ThematicIds'].append(int(thematic_id))
except ValueError:
raise APIError('a thematic identifier must be an integer')
datetime_params = {
'desired_date': 'DesiredDate',
'request_date': 'RequestDate',
}
for param, atal_param in datetime_params.items():
if param in post_data:
try:
obj = dateparse.parse_date(post_data[param])
except ValueError:
obj = None
if obj is None:
raise APIError(
'%s must be a valid YYYY-MM-DD date (received: "%s")' % (param, post_data[param])
)
data[atal_param] = obj.isoformat()
contact_params = {
'adress1': 'Adress1',
'city': 'City',
'email': 'Email',
'first_name': 'FirstName',
'last_name': 'LastName',
'mobile': 'Mobile',
'phone': 'Phone',
'zipcode': 'ZipCode',
}
if 'contact' in post_data:
data['Contact'] = {}
for param, atal_param in contact_params.items():
if param in post_data['contact']:
data['Contact'][atal_param] = post_data['contact'][param]
string_params = {
'comments': 'Comments',
'description': 'Description',
'keywords': 'Keywords',
'localization': 'Localization',
'object': 'Object',
'operator': 'Operator',
}
for param, atal_param in string_params.items():
if param in post_data:
data[atal_param] = post_data[param]
resp_data = self._call('api/WorksRequests', method='post', json_data=data)
resp_data['RequestStateLabel'] = STATUS_MAP.get(resp_data.get('RequestState', ''), '')
return {'data': resp_data}
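The numeric coercions in `worksrequests` accept strings as well as numbers, notably French-style decimal commas for coordinates. The float case in isolation (`ValueError` stands in for `APIError` here):

```python
def to_atal_float(param, value):
    # coordinates may arrive as '48,85' (French decimal comma) or as numbers
    if isinstance(value, str):
        value = value.replace(',', '.')
    try:
        return float(value)
    except ValueError:
        raise ValueError('%s must be a float' % param)
```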
@endpoint(
description=_('Add an attachment to a works request'),
name='worksrequests-single-attachment',
post={
'request_body': {
'schema': {
'application/json': SINGLE_ATTACHMENT_SCHEMA,
}
},
'input_example': {
'file': {
'filename': 'example-1.pdf',
'content_type': 'application/pdf',
'content': 'JVBERi0xL...(base64 PDF)...',
},
},
},
parameters={
'worksrequests_id': {
'example_value': '1',
}
},
)
def worksrequests_single_attachment(self, request, worksrequests_id, post_data):
if not post_data['file']:
return {}
try:
content = base64.b64decode(post_data['file']['content'])
except (TypeError, binascii.Error):
raise APIError('Invalid file content')
files = {
'File': (
post_data['file'].get('filename', ''),
io.BytesIO(content).read(),
post_data['file'].get('content_type', ''),
)
}
# return nothing if successful
self._call(
'api/WorksRequests/%s/Attachments' % worksrequests_id,
method='post',
files=files,
return_response=True,
)
return {}
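Attachment content travels as base64, and decoding is validated up front so a malformed payload produces a clean error before anything is sent to ATAL. A sketch of that validation (`ValueError` stands in for `APIError`):

```python
import base64
import binascii


def decode_file_content(b64_content):
    # reject malformed base64 early rather than forwarding garbage
    try:
        return base64.b64decode(b64_content)
    except (TypeError, binascii.Error):
        raise ValueError('Invalid file content')
```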
@endpoint(
description=_('Add attachments to multiple works requests'),
name='worksrequests-attachments',
post={
'request_body': {
'schema': {
'application/json': ATTACHMENTS_SCHEMA,
}
},
'input_example': {
'files/0': {
'filename': 'example-1.pdf',
'content_type': 'application/pdf',
'content': 'JVBERi0xL...(base64 PDF)...',
},
'files/1': {
'filename': 'example-2.pdf',
'content_type': 'application/pdf',
'content': 'JVBERi0xL...(base64 PDF)...',
},
'worksrequests_ids/0': '1',
'worksrequests_ids/1': '2',
},
},
)
def worksrequests_attachments(self, request, post_data):
files = []
for file_ in post_data.get('files', []):
if not file_:
continue
try:
content = base64.b64decode(file_['content'])
except (TypeError, binascii.Error):
raise APIError('Invalid file content')
files.append(
(
'Files',
(
file_.get('filename', ''),
io.BytesIO(content).read(),
file_.get('content_type', ''),
),
)
)
if not files:
return {}
data = {'Ids': post_data['worksrequests_ids']}
# return nothing if successful
self._call(
'api/WorksRequests/Attachments',
method='post',
files=files,
data=data,
return_response=True,
)
return {}
@endpoint(
methods=['get'],
name='worksrequest-status',
description=_('Get the status of a works request'),
parameters={
'worksrequests_id': {
'example_value': '1',
},
'filter_responses': {
'example_value': '501,507',
},
},
)
def worksrequest_status(self, request, worksrequests_id, filter_responses=None):
filter_responses = (
[type_id.strip() for type_id in filter_responses.split(',') if type_id.strip()]
if filter_responses
else []
)
action_type_ids = []
for type_id in filter_responses:
try:
action_type_ids.append(int(type_id))
except ValueError:
raise APIError('filter_responses must be a list of integers')
resp_data = self._call('api/WorksRequests/%s' % worksrequests_id, params={'$expand': 'Responses'})
resp_data['RequestStateLabel'] = STATUS_MAP.get(resp_data.get('RequestState', ''), '')
if action_type_ids:
responses = resp_data.pop('Responses', [])
resp_data['Responses'] = [
resp for resp in responses if resp.get('ActionTypeId') in action_type_ids
] or []
return {'data': resp_data}
@endpoint(
methods=['get'],
name='worksrequest-intervention-status',
description=_('Get the status of a works request intervention'),
parameters={
'number': {
'example_value': 'DIT23070011',
}
},
)
def worksrequest_intervention_status(self, request, number):
resp_data = self._call('api/WorksRequests/GetInterventionStates', params={'number': number})
resp_data = resp_data[0] if resp_data else {}
resp_data['WorkStateLabel'] = INTERVENTION_STATUS_MAP.get(resp_data.get('WorkState', ''), '')
return {'data': resp_data}
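The `filter_responses` parameter used by `worksrequest_status` above is a comma-separated list of integer action type ids ('501,507'). A standalone version of that parsing, including the validation error (`ValueError` stands in for `APIError`):

```python
def parse_filter_responses(filter_responses):
    # '501, 507' -> [501, 507]; None or '' -> []
    action_type_ids = []
    for type_id in (filter_responses or '').split(','):
        type_id = type_id.strip()
        if not type_id:
            continue
        try:
            action_type_ids.append(int(type_id))
        except ValueError:
            raise ValueError('filter_responses must be a list of integers')
    return action_type_ids
```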

View File

@ -199,7 +199,6 @@ class Resource(BaseResource, HTTPResource):
name='link',
methods=['post'],
description=_('Create link with an extranet account'),
perm='can_access',
parameters={
'NameID': {
'description': _('Publik NameID'),
@ -233,7 +232,6 @@ class Resource(BaseResource, HTTPResource):
name='unlink',
methods=['post'],
description=_('Delete link with an extranet account'),
perm='can_access',
parameters={
'NameID': {
'description': _('Publik NameID'),
@ -292,7 +290,6 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
name='dossiers',
description=_('Get data for all links'),
perm='can_access',
parameters={
'NameID': {
'description': _('Publik NameID'),
@ -372,7 +369,6 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
name='search',
description=_('Search for beneficiaries'),
perm='can_access',
parameters={
'first_name': {
'description': _('Beneficiary first name'),
@ -506,7 +502,6 @@ class Resource(BaseResource, HTTPResource):
name='link-by-id-per',
methods=['post'],
description=_('Create link with an extranet account'),
perm='can_access',
parameters={
'NameID': {
'description': _('Publik NameID'),
@ -526,7 +521,6 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
name='dossier-by-pair',
description=_('Get dossier data with two integers'),
perm='can_access',
parameters={
'p1': {
'description': _('First integer'),

View File

@ -91,6 +91,7 @@ class Migration(migrations.Migration):
blank=True,
max_length=600,
verbose_name='Postal codes or department number to get streets, separated with commas',
help_text='This parameter is only useful for the /streets/ endpoint (very rarely used)',
),
),
migrations.AlterField(

View File

@ -6,19 +6,19 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("base_adresse", "0018_text_to_jsonb"),
('base_adresse', '0018_text_to_jsonb'),
]
operations = [
migrations.AddField(
model_name="streetmodel",
name="resource",
model_name='streetmodel',
name='resource',
field=models.ForeignKey(
default=None,
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="base_adresse.BaseAdresse",
verbose_name="BAN Connector",
to='base_adresse.BaseAdresse',
verbose_name='BAN Connector',
),
),
]

View File

@ -4,8 +4,8 @@ from django.db import migrations
def set_streetmodel_resource(apps, schema_editor):
BaseAdresse = apps.get_model("base_adresse", "BaseAdresse")
StreetModel = apps.get_model("base_adresse", "StreetModel")
BaseAdresse = apps.get_model('base_adresse', 'BaseAdresse')
StreetModel = apps.get_model('base_adresse', 'StreetModel')
if BaseAdresse.objects.exists():
StreetModel.objects.update(resource=BaseAdresse.objects.first())
else:
@ -14,7 +14,7 @@ def set_streetmodel_resource(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
("base_adresse", "0019_streetmodel_resource_add"),
('base_adresse', '0019_streetmodel_resource_add'),
]
operations = [

View File

@ -6,17 +6,17 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("base_adresse", "0020_streetmodel_resource_runpython"),
('base_adresse', '0020_streetmodel_resource_runpython'),
]
operations = [
migrations.AlterField(
model_name="streetmodel",
name="resource",
model_name='streetmodel',
name='resource',
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="base_adresse.BaseAdresse",
verbose_name="BAN Connector",
to='base_adresse.BaseAdresse',
verbose_name='BAN Connector',
),
),
]

View File

@ -4,11 +4,11 @@ from django.db import migrations
def set_resource(apps, schema_editor):
BaseAdresse = apps.get_model("base_adresse", "BaseAdresse")
RegionModel = apps.get_model("base_adresse", "RegionModel")
DepartmentModel = apps.get_model("base_adresse", "DepartmentModel")
CityModel = apps.get_model("base_adresse", "CityModel")
AddressCacheModel = apps.get_model("base_adresse", "AddressCacheModel")
BaseAdresse = apps.get_model('base_adresse', 'BaseAdresse')
RegionModel = apps.get_model('base_adresse', 'RegionModel')
DepartmentModel = apps.get_model('base_adresse', 'DepartmentModel')
CityModel = apps.get_model('base_adresse', 'CityModel')
AddressCacheModel = apps.get_model('base_adresse', 'AddressCacheModel')
if BaseAdresse.objects.exists():
resource = BaseAdresse.objects.first()
RegionModel.objects.update(resource=resource)

View File

@ -0,0 +1,67 @@
# Generated by Django 3.2.18 on 2023-11-29 18:06
import django.contrib.postgres.indexes
import django.db.models.functions.text
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0030_auto_20220627_1511'),
]
operations = [
migrations.RunSQL(
[
'CREATE EXTENSION IF NOT EXISTS pg_trgm WITH SCHEMA public',
],
reverse_sql=['DROP EXTENSION IF EXISTS pg_trgm'],
),
migrations.AddIndex(
model_name='citymodel',
index=models.Index(fields=['code'], name='base_adress_code_e169d0_idx'),
),
migrations.AddIndex(
model_name='citymodel',
index=models.Index(fields=['zipcode'], name='base_adress_zipcode_79aa6f_idx'),
),
migrations.AddIndex(
model_name='citymodel',
index=models.Index(
django.db.models.functions.text.Upper('unaccent_name'), name='base_adresse_city_name_idx'
),
),
migrations.AddIndex(
model_name='departmentmodel',
index=models.Index(
django.db.models.functions.text.Upper('unaccent_name'), name='base_adresse_dept_name_idx'
),
),
migrations.AddIndex(
model_name='regionmodel',
index=models.Index(
django.db.models.functions.text.Upper('unaccent_name'), name='base_adresse_region_name_idx'
),
),
migrations.AddIndex(
model_name='streetmodel',
index=models.Index(fields=['ban_id'], name='base_adress_ban_id_2c35ab_idx'),
),
migrations.AddIndex(
model_name='streetmodel',
index=models.Index(fields=['zipcode'], name='base_adress_zipcode_bf7091_idx'),
),
migrations.AddIndex(
model_name='streetmodel',
index=models.Index(fields=['citycode'], name='base_adress_citycod_428b79_idx'),
),
migrations.AddIndex(
model_name='streetmodel',
index=django.contrib.postgres.indexes.GinIndex(
django.contrib.postgres.indexes.OpClass(
django.db.models.functions.text.Upper('unaccent_name'), 'public.gin_trgm_ops'
),
name='base_adresse_street_name_idx',
),
),
]
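These indexes target case- and accent-insensitive searches: the GIN `pg_trgm` index on `Upper('unaccent_name')` lets substring matches skip a sequential scan. A rough Python equivalent of the unaccent normalization the indexed column is assumed to hold (the real column is populated by the connector, not by this helper):

```python
import unicodedata


def unaccent(name):
    # strip diacritics so 'Châtelet' normalizes to 'Chatelet'
    nfkd = unicodedata.normalize('NFKD', name)
    return ''.join(c for c in nfkd if not unicodedata.combining(c))
```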

View File

@ -5,9 +5,11 @@ import json
from io import StringIO
from urllib import parse as urlparse
from django.contrib.postgres import indexes as postgresql_indexes
from django.core.exceptions import FieldError
from django.db import connection, models
from django.db.models import JSONField, Q
from django.db.models.functions import Upper
from django.utils import timezone
from django.utils.http import urlencode
from django.utils.translation import gettext_lazy as _
@ -44,12 +46,6 @@ class BaseAdresse(AddressResource):
'<a href="https://api.gouv.fr/api/api-geo.html">API Geo</a>.'
)
zipcode = models.CharField(
max_length=600,
blank=True,
verbose_name=_('Postal codes or department number to get streets, separated with commas'),
)
latitude = models.FloatField(
null=True,
blank=True,
@ -63,6 +59,13 @@ class BaseAdresse(AddressResource):
help_text=_('Geographic priority for /addresses/ endpoint.'),
)
zipcode = models.CharField(
max_length=600,
blank=True,
verbose_name=_('Postal codes or department number to get streets, separated with commas'),
help_text=_('This parameter is only useful for the /streets/ endpoint (very rarely used)'),
)
class Meta:
verbose_name = _('Base Adresse Web Service')
@ -109,6 +112,7 @@ class BaseAdresse(AddressResource):
@endpoint(
pattern='(?P<q>.+)?$',
description=_('Addresses list'),
perm='OPEN',
parameters={
'id': {'description': _('Address identifier')},
'q': {'description': _('Address'), 'example_value': '169 rue du chateau, paris'},
@ -125,10 +129,24 @@ class BaseAdresse(AddressResource):
'Prioritize results according to coordinates. "lat" parameter must also be present.'
)
},
'type': {
'description': _(
'Type of address to return, housenumber, street, locality, municipality or all. Default is all.'
)
},
},
)
def addresses(
self, request, id=None, q=None, zipcode='', citycode=None, lat=None, lon=None, page_limit=5
self,
request,
id=None,
q=None,
zipcode='',
citycode=None,
lat=None,
lon=None,
page_limit=5,
type=None,
):
if id is not None:
return self.get_by_id(request, id=id, citycode=citycode)
@ -156,6 +174,8 @@ class BaseAdresse(AddressResource):
if self.latitude and self.longitude or lat and lon:
query_args['lat'] = lat or self.latitude
query_args['lon'] = lon or self.longitude
if type in ('housenumber', 'street', 'locality', 'municipality'):
query_args['type'] = type
query = urlencode(query_args)
url = urlparse.urlunparse((scheme, netloc, path, params, query, fragment))
@ -167,7 +187,8 @@ class BaseAdresse(AddressResource):
result = []
for feature in result_response.json().get('features'):
features = result_response.json().get('features')
for feature in features:
if not feature['geometry']['type'] == 'Point':
continue # skip unknown
data = self.format_address_data(feature)
@ -177,7 +198,6 @@ class BaseAdresse(AddressResource):
)
if not created:
address.update_timestamp()
return {'data': result}
def get_by_id(self, request, id, citycode=None):
@ -222,6 +242,7 @@ class BaseAdresse(AddressResource):
@endpoint(
pattern='(?P<q>.+)?$',
description=_('Geocoding (Nominatim API)'),
perm='OPEN',
parameters={
'q': {'description': _('Address'), 'example_value': '169 rue du chateau, paris'},
'zipcode': {'description': _('Zipcode')},
@ -236,30 +257,51 @@ class BaseAdresse(AddressResource):
'Prioritize results according to coordinates. "lon" parameter must be present.'
)
},
'type': {
'description': _(
'Type of address to return, housenumber, street, locality, municipality or all. Default is all.'
)
},
},
)
def search(self, request, q, zipcode='', citycode=None, lat=None, lon=None, **kwargs):
def search(self, request, q, zipcode='', citycode=None, lat=None, lon=None, type=None, **kwargs):
if kwargs.get('format', 'json') != 'json':
raise NotImplementedError()
result = self.addresses(
request, q=q, zipcode=zipcode, citycode=citycode, lat=lat, lon=lon, page_limit=1
request,
q=q,
zipcode=zipcode,
citycode=citycode,
lat=lat,
lon=lon,
page_limit=1,
type=type,
)
return result['data']
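The new `type` parameter is only forwarded to the BAN API when it matches one of the documented values; anything else is silently dropped. The guard in isolation:

```python
from urllib.parse import urlencode

BAN_TYPES = ('housenumber', 'street', 'locality', 'municipality')


def build_ban_query(lat, lon, type=None):
    # unknown type values are dropped instead of being passed through
    query_args = {'lat': lat, 'lon': lon}
    if type in BAN_TYPES:
        query_args['type'] = type
    return urlencode(query_args)
```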
@endpoint(
description=_('Reverse geocoding'),
perm='OPEN',
parameters={
'lat': {'description': _('Latitude'), 'example_value': 48.833708},
'lon': {'description': _('Longitude'), 'example_value': 2.323349},
'type': {
'description': _(
'Type of address to return, housenumber, street, locality, municipality or all. Default is all.'
)
},
},
)
def reverse(self, request, lat, lon, **kwargs):
def reverse(self, request, lat, lon, type=None, **kwargs):
if kwargs.get('format', 'json') != 'json':
raise NotImplementedError()
scheme, netloc, path, params, query, fragment = urlparse.urlparse(self.service_url)
path = urlparse.urljoin(path, 'reverse/')
query = urlencode({'lat': lat, 'lon': lon})
query_dict = {'lat': lat, 'lon': lon}
if type in ('housenumber', 'street', 'locality', 'municipality'):
query_dict['type'] = type
query = urlencode(query_dict)
url = urlparse.urlunparse((scheme, netloc, path, params, query, fragment))
try:
@ -283,9 +325,10 @@ class BaseAdresse(AddressResource):
@endpoint(
description=_('Streets from zipcode'),
perm='OPEN',
parameters={
'id': {'description': _('Street identifier')},
'q': {'description': _("Street name")},
'q': {'description': _('Street name')},
'zipcode': {'description': _('Zipcode')},
'citycode': {'description': _('INSEE City code')},
'page_limit': {'description': _('Maximum number of results to return'), 'example_value': 30},
@@ -336,12 +379,13 @@ class BaseAdresse(AddressResource):
@endpoint(
description=_('Cities list'),
perm='OPEN',
parameters={
'id': {
'description': _('Get exactly one city using its code and postal code separated with a dot'),
'example_value': '75056.75014',
},
-'q': {'description': _("Search text in name or postal code"), 'example_value': 'Paris'},
+'q': {'description': _('Search text in name or postal code'), 'example_value': 'Paris'},
'code': {
'description': _('INSEE code (or multiple codes separated with commas)'),
'example_value': '75056',
@@ -397,6 +441,7 @@ class BaseAdresse(AddressResource):
@endpoint(
description=_('Departments list'),
perm='OPEN',
parameters={
'id': {'description': _('Get exactly one department using its code'), 'example_value': '59'},
'q': {'description': _('Search text in name or code'), 'example_value': 'Nord'},
@@ -421,6 +466,7 @@ class BaseAdresse(AddressResource):
@endpoint(
description=_('Regions list'),
perm='OPEN',
parameters={
'id': {'description': _('Get exactly one region using its code'), 'example_value': '32'},
'q': {'description': _('Search text in name or code'), 'example_value': 'Hauts-de-France'},
@@ -691,6 +737,15 @@ class StreetModel(UnaccentNameMixin, models.Model):
class Meta:
ordering = ['unaccent_name', 'name']
indexes = [
models.Index(fields=['ban_id']),
models.Index(fields=['zipcode']),
models.Index(fields=['citycode']),
postgresql_indexes.GinIndex(
postgresql_indexes.OpClass(Upper('unaccent_name'), 'public.gin_trgm_ops'),
name='%(app_label)s_street_name_idx',
),
]
def __str__(self):
return self.name
@@ -715,6 +770,9 @@ class RegionModel(UnaccentNameMixin, models.Model):
class Meta:
ordering = ['code']
unique_together = ('resource', 'code')
indexes = [
models.Index(Upper('unaccent_name'), name='%(app_label)s_region_name_idx'),
]
def __str__(self):
return '%s %s' % (self.code, self.name)
@@ -742,6 +800,9 @@ class DepartmentModel(UnaccentNameMixin, models.Model):
class Meta:
ordering = ['code']
unique_together = ('resource', 'code')
indexes = [
models.Index(Upper('unaccent_name'), name='%(app_label)s_dept_name_idx'),
]
def __str__(self):
return '%s %s' % (self.code, self.name)
@@ -765,7 +826,9 @@ class CityModel(UnaccentNameMixin, models.Model):
'id': '%s.%s' % (self.code, self.zipcode),
'code': self.code,
'name': self.name,
'city': self.name,
'zipcode': self.zipcode,
'postcode': self.zipcode,
'population': self.population,
'department_code': self.department.code if self.department else None,
'department_name': self.department.name if self.department else None,
@@ -777,6 +840,11 @@ class CityModel(UnaccentNameMixin, models.Model):
class Meta:
ordering = ['-population', 'zipcode', 'unaccent_name', 'name']
unique_together = ('resource', 'code', 'zipcode')
indexes = [
models.Index(fields=['code']),
models.Index(fields=['zipcode']),
models.Index(Upper('unaccent_name'), name='%(app_label)s_city_name_idx'),
]
def __str__(self):
return '%s %s' % (self.zipcode, self.name)
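The reverse-geocoding change above only forwards the new `type` parameter when it is one of the four accepted values. A minimal standalone sketch of that URL-building logic (the `build_reverse_url` name and the example `service_url` are hypothetical, not from this diff):

```python
import urllib.parse as urlparse
from urllib.parse import urlencode

def build_reverse_url(service_url, lat, lon, type=None):
    # Split the configured service URL, append the reverse/ path, and
    # only forward "type" when it is one of the accepted values.
    scheme, netloc, path, params, query, fragment = urlparse.urlparse(service_url)
    path = urlparse.urljoin(path, 'reverse/')
    query_dict = {'lat': lat, 'lon': lon}
    if type in ('housenumber', 'street', 'locality', 'municipality'):
        query_dict['type'] = type
    return urlparse.urlunparse((scheme, netloc, path, params, urlencode(query_dict), fragment))
```

Any other value for `type` is silently dropped rather than rejected, mirroring the filter in the patched `reverse` endpoint.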


@@ -68,7 +68,6 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
methods=['post'],
name='meeting',
-perm='can_access',
description_post=_('Create a meeting'),
post={
'request_body': {
@@ -146,7 +145,6 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
methods=['get', 'delete'],
name='meeting',
-perm='can_access',
pattern=r'^(?P<guid>[0-9a-f]{32})/?$',
example_pattern='{guid}/',
description_post=_('Get a meeting'),
@@ -174,6 +172,7 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
methods=['get'],
name='meeting',
perm='OPEN',
pattern=r'^(?P<guid>[0-9a-f]{32})/is-running/?$',
example_pattern='{guid}/is-running/',
description_post=_('Report if meeting is running'),
@@ -196,6 +195,7 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
methods=['get'],
name='meeting',
perm='OPEN',
pattern=r'^(?P<guid>[0-9a-f]{32})/join/agent/(?P<key>[^/]*)/?$',
example_pattern='{guid}/join/agent/',
description_post=_('Get a meeting'),
@@ -223,6 +223,7 @@ class Resource(BaseResource, HTTPResource):
@endpoint(
methods=['get'],
name='meeting',
perm='OPEN',
pattern=r'^(?P<guid>[0-9a-f]{32})/join/user/(?P<key>[^/]*)/?$',
example_pattern='{guid}/join/user/',
description_post=_('Get a meeting'),

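The meeting endpoints above route on regex patterns where the meeting identifier must be exactly 32 lowercase hex characters. A standalone check of the user-join pattern quoted in the diff (the `JOIN_USER` name and example values are hypothetical):

```python
import re

# Same pattern string as the user-join endpoint above: a 32-char hex
# guid, then /join/user/, then a key that may not contain slashes.
JOIN_USER = re.compile(r'^(?P<guid>[0-9a-f]{32})/join/user/(?P<key>[^/]*)/?$')

match = JOIN_USER.match('0123456789abcdef0123456789abcdef/join/user/secret-key/')
```

The trailing `/?` makes the final slash optional, and the named groups `guid` and `key` become keyword arguments of the endpoint method.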


@@ -0,0 +1,47 @@
# Generated by Django 3.2.18 on 2024-02-20 15:41
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='CalDAV',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'dav_url',
models.URLField(
help_text='DAV root URL (such as https://test.egw/groupdav.php/)',
verbose_name='DAV root URL',
),
),
('dav_login', models.CharField(max_length=128, verbose_name='DAV username')),
('dav_password', models.CharField(max_length=512, verbose_name='DAV password')),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_caldav_caldav_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'CalDAV',
},
),
]


@@ -0,0 +1,367 @@
# passerelle - uniform access to multiple data sources and services
# Copyright (C) 2024 Entr'ouvert
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import functools
import urllib.parse
import caldav
import requests
from django.db import models
from django.utils.dateparse import parse_date, parse_datetime
from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource
from passerelle.utils.api import endpoint
from passerelle.utils.conversion import exception_to_text
from passerelle.utils.jsonresponse import APIError
EVENT_SCHEMA_PART = {
'type': 'object',
'description': _('iCal event properties (VEVENT, RFC 5545 3.6.1)'),
'properties': {
'DTSTART': {
'type': 'string',
'description': _('Event start (included) ISO-8601 date-time or date (for allday event)'),
},
'DTEND': {
'type': 'string',
'description': _('Event end (excluded) ISO-8601 date-time or date (for allday event)'),
},
'SUMMARY': {
'type': 'string',
'description': 'RFC 5545 3.8.1.12',
},
'DESCRIPTION': {
'type': 'string',
'description': 'RFC 5545 3.8.2.5',
},
'LOCATION': {
'type': 'string',
'description': 'RFC 5545 3.8.1.7',
},
'CATEGORY': {'type': 'string'},
'TRANSP': {
'type': 'boolean',
'description': _('Transparent if true else opaque (RFC 5545 3.8.2.7)'),
},
'RRULE': {
'description': _('Recurrence rule (RFC 5545 3.8.5.3)'),
'type': 'object',
'properties': {
'FREQ': {
'type': 'string',
'enum': ['WEEKLY', 'MONTHLY', 'YEARLY'],
},
'BYDAY': {
'type': 'array',
'items': {
'type': 'string',
'enum': ['MO', 'TU', 'WE', 'TH', 'FR', 'SA', 'SU'],
},
},
'BYMONTH': {
'type': 'array',
'items': {
'type': 'integer',
'minimum': 1,
'maximum': 12,
},
},
'COUNT': {
'type': 'integer',
'minimum': 1,
},
'UNTIL': {
'type': 'string',
'description': _('Date or date and time indicating the end of recurrence'),
},
},
},
},
}
USERNAME_PARAM = {
'description': _('The calendar\'s owner username'),
'type': 'string',
}
EVENT_UID_PARAM = {
'description': _('An event UID'),
'type': 'string',
}
# Action's request body schema
EVENT_SCHEMA = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': _('Event description schema'),
'unflatten': True,
**EVENT_SCHEMA_PART,
}
def clean_egw_response(response, *args, **kwargs):
'''requests response hook that strips EGW's SQL log lines from the
response body, when present.
SQL log lines are matched by checking that they:
- start with "==> SQL =>"
- end with "<br>"
'''
response._content = b'\n'.join(
line
for line in response.content.split(b'\n')
if not line.startswith(b'==> SQL =>') or not line.endswith(b'<br>')
)
return response
class CalDAV(BaseResource):
dav_url = models.URLField(
blank=False,
verbose_name=_('DAV root URL'),
help_text=_('DAV root URL (such as https://test.egw/groupdav.php/)'),
)
dav_login = models.CharField(max_length=128, verbose_name=_('DAV username'), blank=False)
dav_password = models.CharField(max_length=512, verbose_name=_('DAV password'), blank=False)
category = _('Misc')
log_requests_errors = False
class Meta:
verbose_name = _('CalDAV')
@functools.cached_property
def dav_client(self):
'''Instantiate a caldav.DAVClient and return the instance'''
client = caldav.DAVClient(self.dav_url, username=self.dav_login, password=self.dav_password)
# Replace DAVClient.session's requests.Session instance with our
# own requests session in order to log DAV interactions
client.session = self.requests
# adds EGW response cleaning hook
self.requests.hooks['response'] = clean_egw_response
return client
def check_status(self):
'''Attempt a propfind on DAV root URL'''
try:
rep = self.dav_client.propfind()
rep.find_objects_and_props()
except caldav.lib.error.AuthorizationError:
raise Exception(_('Not authorized: bad login/password?'))
@endpoint(
name='event',
pattern='^create$',
example_pattern='create',
methods=['post'],
post={'request_body': {'schema': {'application/json': EVENT_SCHEMA}}},
parameters={
'username': USERNAME_PARAM,
},
)
def create_event(self, request, username, post_data):
'''Event creation endpoint'''
cal = self.get_calendar(username)
self._process_event_properties(post_data)
# Sequence is auto-incremented when saved, -1 will lead to the
# expected SEQUENCE:0 when an event is created
post_data['SEQUENCE'] = -1
try:
evt = cal.save_event(**post_data)
except requests.exceptions.RequestException as expt:
raise APIError(
_('Error sending creation request to caldav server'),
data={
'expt_class': str(type(expt)),
'expt': str(expt),
'username': username,
},
)
except caldav.lib.error.DAVError as expt:
raise APIError(
_('Error creating event'),
data={'expt_class': str(type(expt)), 'expt': exception_to_text(expt), 'username': username},
)
return {'data': {'event_id': evt.id}}
# Patch do not support request_body validation, using post instead
@endpoint(
name='event',
pattern='^update$',
example_pattern='update',
methods=['post'],
post={'request_body': {'schema': {'application/json': EVENT_SCHEMA}}},
parameters={
'username': USERNAME_PARAM,
'event_id': EVENT_UID_PARAM,
},
)
def update_event(self, request, username, event_id, post_data):
'''Event update endpoint'''
self._process_event_properties(post_data)
ical = self.get_event(username, event_id)
vevent = ical.icalendar_instance.walk('VEVENT')
if not len(vevent) == 1:
raise APIError(
_('Given event (user:%r uid:%r) does not contain a VEVENT component') % (username, event_id),
data={
'username': username,
'event_uid': event_id,
'VEVENT': str(vevent),
},
)
vevent = vevent[0]
# vevent.update(post_data) does not convert values as expected
for k, v in post_data.items():
vevent.pop(k)
vevent.add(k, v)
if 'SEQUENCE' not in vevent:
# SEQUENCE is auto-incremented when present
# here after a 1st modification the SEQUENCE will be 1 (not 0)
vevent['SEQUENCE'] = 0
try:
# do not use ical.save(no_create=True): no_create fails on some CalDAV servers
ical.save()
except requests.exceptions.RequestException as expt:
raise APIError(
_('Error sending update request to caldav server'),
data={
'expt_class': str(type(expt)),
'expt': str(expt),
'username': username,
'event_id': event_id,
},
)
return {'data': {'event_id': ical.id}}
@endpoint(
name='event',
pattern='^delete$',
example_pattern='delete',
methods=['delete'],
parameters={
'username': USERNAME_PARAM,
'event_id': EVENT_UID_PARAM,
},
)
def delete_event(self, request, username, event_id):
ical = self.get_event(username, event_id)
try:
ical.delete()
except requests.exceptions.RequestException as expt:
raise APIError(
_('Error sending deletion request to caldav server'),
data={
'expt_class': str(type(expt)),
'expt': str(expt),
'username': username,
'event_id': event_id,
},
)
return {}
def get_event(self, username, event_uid):
'''Fetch an event given a username and an event_uid
Arguments:
- username: Calendar owner's username
- event_uid: The event's UID
Returns a caldav.Event instance
'''
event_path = '%s/calendar/%s.ics' % (urllib.parse.quote(username), urllib.parse.quote(str(event_uid)))
cal = self.get_calendar(username)
try:
ical = cal.event_by_url(event_path)
except caldav.lib.error.DAVError as expt:
raise APIError(
_('Unable to get event %r in calendar owned by %r') % (event_uid, username),
data={
'expt': exception_to_text(expt),
'expt_cls': str(type(expt)),
'username': username,
'event_uid': event_uid,
},
)
except requests.exceptions.RequestException as expt:
raise APIError(
_('Unable to communicate with caldav server while fetching event'),
data={
'expt': exception_to_text(expt),
'expt_class': str(type(expt)),
'username': username,
'event_uid': event_uid,
},
)
return ical
def get_calendar(self, username):
'''Given a username, return the associated calendar set
Arguments:
- username: Calendar owner's username
Returns a caldav.Calendar instance
Note: does not raise any caldav exception until a method that makes
a request is called
'''
path = '%s/calendar' % urllib.parse.quote(username)
calendar = caldav.Calendar(client=self.dav_client, url=path)
return calendar
def _process_event_properties(self, data):
'''Handles verification & conversion of event properties
@note Modifies the given data dict in place
'''
if 'TRANSP' in data:
data['TRANSP'] = 'TRANSPARENT' if data['TRANSP'] else 'OPAQUE'
if 'CATEGORY' in data:
data['CATEGORIES'] = [data.pop('CATEGORY')]
if 'RRULE' in data and 'UNTIL' in data['RRULE']:
try:
data['RRULE']['UNTIL'] = self._parse_date_or_datetime(data['RRULE']['UNTIL'])
except ValueError:
raise APIError(
_('Unable to convert field %(name)s=%(value)r: not a valid date nor date-time')
% {'name': 'RRULE/UNTIL', 'value': data['RRULE']['UNTIL']},
http_status=400,
)
for dt_field in ('DTSTART', 'DTEND'):
if dt_field not in data:
continue
try:
data[dt_field] = self._parse_date_or_datetime(data[dt_field])
except ValueError:
raise APIError(
_('Unable to convert field %(name)s=%(value)r: not a valid date nor date-time')
% {'name': dt_field, 'value': data[dt_field]},
http_status=400,
)
def _parse_date_or_datetime(self, value):
try:
ret = parse_date(value) or parse_datetime(value)
except ValueError:
ret = None
if not ret:
raise ValueError('Invalid value')
return ret
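The `clean_egw_response` hook above rewrites `response._content` to drop EGW's SQL trace lines. The same filter can be sketched on a raw byte payload, without a requests object (the `strip_egw_sql_lines` name is hypothetical):

```python
def strip_egw_sql_lines(content: bytes) -> bytes:
    # Keep every line except those that look like an EGW SQL trace:
    # starting with "==> SQL =>" and ending with "<br>".
    return b'\n'.join(
        line
        for line in content.split(b'\n')
        if not line.startswith(b'==> SQL =>') or not line.endswith(b'<br>')
    )
```

A line must match both the prefix and the suffix to be removed, so ordinary lines that merely start with `==> SQL =>` survive.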



@@ -0,0 +1,87 @@
# Generated by Django 3.2.18 on 2024-02-28 09:13
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('base', '0030_resourcelog_base_resour_appname_298cbc_idx'),
]
operations = [
migrations.CreateModel(
name='Carl',
fields=[
(
'id',
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('title', models.CharField(max_length=50, verbose_name='Title')),
('slug', models.SlugField(unique=True, verbose_name='Identifier')),
('description', models.TextField(verbose_name='Description')),
(
'basic_auth_username',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication username'
),
),
(
'basic_auth_password',
models.CharField(
blank=True, max_length=128, verbose_name='Basic authentication password'
),
),
(
'client_certificate',
models.FileField(
blank=True, null=True, upload_to='', verbose_name='TLS client certificate'
),
),
(
'trusted_certificate_authorities',
models.FileField(blank=True, null=True, upload_to='', verbose_name='TLS trusted CAs'),
),
(
'verify_cert',
models.BooleanField(blank=True, default=True, verbose_name='TLS verify certificates'),
),
(
'http_proxy',
models.CharField(blank=True, max_length=128, verbose_name='HTTP and HTTPS proxy'),
),
(
'service_url',
models.URLField(
help_text='Base webservice URL (such as https://carlsource.server.com/gmaoCS02/',
verbose_name='Service URL',
),
),
(
'carl_username',
models.CharField(
blank=True, max_length=128, verbose_name='Carl token authentication username'
),
),
(
'carl_password',
models.CharField(
blank=True, max_length=128, verbose_name='Carl token authentication password'
),
),
(
'users',
models.ManyToManyField(
blank=True,
related_name='_carl_carl_users_+',
related_query_name='+',
to='base.ApiUser',
),
),
],
options={
'verbose_name': 'Carl',
},
),
]

Some files were not shown because too many files have changed in this diff.