Compare commits


87 Commits

Author SHA1 Message Date
Nicolas Roche 3d99dd40d5 toulouse-maelis: cache invoices on link call (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche e4089b33a0 toulouse-maelis: store invoice in database (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche bb5e12809f toulouse-maelis: add an endpoint to get a PDF invoice (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche 332ad30f1b toulouse-maelis: add an endpoint to pay an invoice (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche b5a19ee8b7 toulouse-maelis: add an endpoint to get an invoice (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche 75de5eb36e toulouse-maelis: add an endpoint to get historical invoices (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche bd7b1566f2 toulouse-maelis: add an endpoint to get current invoices (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche 3226fd0cf5 toulouse-maelis: get last invoice WSDL for tests (#74790) 2023-04-07 14:08:50 +02:00
Nicolas Roche b425a2ac80 toulouse-maelis: [functests] add tests on activity service (#76089) 2023-04-07 12:47:14 +02:00
Nicolas Roche 5b53826063 toulouse_maelis: [functests] rename tests to fix run order (#76089) 2023-04-07 12:39:08 +02:00
Nicolas Roche 34aaf879cb toulouse-maelis: [functests] prefix new family name with EO_ (#76089) 2023-04-07 12:39:04 +02:00
Nicolas Roche d1fea2f13f toulouse-maelis: [tools] do no crash on empty date (#76089) 2023-04-06 10:15:12 +02:00
Frédéric Péters ac4842c64d gdema: really do not log requests (#76309) 2023-04-06 08:26:15 +02:00
Nicolas Roche 289f7701b9 toulouse-maelis: [tests] rename 2 tests (#76280) 2023-04-05 15:41:08 +02:00
Nicolas Roche c4094d5933 toulouse-maelis: allow no calendarGeneration field on getPersonUnitInfo reply (#76280) 2023-04-05 15:41:08 +02:00
Nicolas Roche 4b6ceac39e toulouse-maelis: check person_id on subscription endpoints (#76218) 2023-04-04 15:33:05 +02:00
Thomas NOËL 5cb6187702 translation update 2023-04-04 12:10:56 +02:00
Thomas NOËL 2ac7a1c99d pdf: add watermark endpoint (#74796) 2023-04-04 11:56:50 +02:00
Thomas NOËL d0c271ac41 phonecalls: remove X-Frame-Options on newtab (#76162) 2023-04-04 11:56:21 +02:00
Frédéric Péters 8f707e90cb photon: return api error on invalid photon responses (#76191) 2023-04-03 21:31:12 +02:00
Serghei Mihai aa24d04fbb pdf: do not display form's fill page if no pdf file associated (#75715) 2023-04-03 14:19:44 +02:00
Nicolas Roche 6a3d1a1866 toulouse-maelis: [tests] move WSDL into tools directory (#75927) 2023-04-03 12:02:24 +02:00
Nicolas Roche 5fd5e84900 toulouse-maelis: [tools] add raw SOAP scripts (#75927) 2023-04-03 12:02:18 +02:00
Nicolas Roche c28b2ccb10 toulouse-maelis: correct read-subscribe-activity-list parameters (#76136) 2023-04-03 11:04:49 +02:00
Frédéric Péters b0466e9468 tox: limit Pillow version to avoid failure in test_fill_form_ok (#76117) 2023-04-02 21:21:48 +02:00
Frédéric Péters 906adb0e5d tests: remove erroneous check "up" string in availability test (#76116)
It would catch the old "We recommend you to update your web browser"
from gadjo.
2023-04-02 21:01:42 +02:00
Nicolas Roche d9b53e1ccc translation update 2023-03-30 11:48:03 +02:00
Nicolas Roche c9dd0f86f9 toulouse-maelis: add endpoints to update week-calendar (#75425) 2023-03-30 11:26:22 +02:00
Nicolas Roche 3f81f5efa7 toulouse-maelis: correct type on person_id description (#75425) 2023-03-30 11:26:22 +02:00
Nicolas Roche f0be7e0aba toulouse-maelis: add a school year parameter on read_subscribe_activity_list (#75552) 2023-03-30 10:00:33 +02:00
Nicolas Roche 50c725d393 toulouse-maelis: add natures subscribed to read-family response (#75549) 2023-03-29 18:34:26 +02:00
Nicolas Roche 3a558b0ba4 toulouse-maelis: add direction and service referentials (#75812) 2023-03-29 18:20:47 +02:00
Nicolas Roche 685014df6b toulouse-maelis: remove no more required freezer on catalog test (#75652) 2023-03-29 18:03:32 +02:00
Nicolas Roche 791b341b83 toulouse-maelis: factorize updating activity referential (#75652) 2023-03-29 18:03:32 +02:00
Nicolas Roche b78f698616 toulouse_maelis: bypass cache on activity catalog (#75652) 2023-03-29 18:03:32 +02:00
Nicolas Roche 0c8f408266 toulouse-maelis: correct school year period on catalog (#75652) 2023-03-29 18:03:32 +02:00
Nicolas Roche ac4eef819e toulouse-maelis: correct bug loading next year catalog (#75652) 2023-03-29 18:03:32 +02:00
Nicolas Roche 54783a392f toulouse-maelis: factorize payload processing on subscriptions (#74763) 2023-03-29 17:34:30 +02:00
Nicolas Roche 6ba66d06ff toulouse-maelis: add direct subscription endpoint (#74763) 2023-03-29 17:34:30 +02:00
Valentin Deniaud 39c0fab26b misc: use new JSONField location in migrations (#75442) 2023-03-29 17:01:40 +02:00
Nicolas Roche c54f56633a toulouse-maelis: update family WSDL (#75810) 2023-03-29 15:58:04 +02:00
Nicolas Roche 03b4088aff toulouse-maelis: update activity WSDL (#75810) 2023-03-29 15:58:03 +02:00
Nicolas Roche 1d56451789 toulouse-maelis: update invoice WSDL (#75810) 2023-03-29 15:58:03 +02:00
Nicolas Roche b64b6f5233 toulouse-maelis: update ape WSDL (#75810) 2023-03-29 15:58:03 +02:00
Valentin Deniaud 4a4ecadd4e misc: fix Django 3.2 default auto field warning (#75442) 2023-03-29 14:51:08 +02:00
Valentin Deniaud 8c88ec414d misc: bump djhtml version (#75442) 2023-03-29 14:51:08 +02:00
Valentin Deniaud 3a63f02dae misc: bump black version (#75442) 2023-03-29 14:51:08 +02:00
Valentin Deniaud 164433b269 misc: change pyupgrade target version to 3.9 (#75442) 2023-03-29 14:50:27 +02:00
Valentin Deniaud afcaed5061 misc: change django-upgrade target version to 3.2 (#75442) 2023-03-29 14:50:27 +02:00
Valentin Deniaud a134eabcd3 misc: require django 3.2 (#75442) 2023-03-29 14:50:27 +02:00
Valentin Deniaud 9dc1482d37 ci: remove django22 unused target (#75442) 2023-03-29 14:50:27 +02:00
Nicolas Roche dff430d1d5 toulouse-maelis: add endpoint to list RL or child subscribed activities (#75472) 2023-03-29 12:34:19 +02:00
Nicolas Roche b763a34662 toulouse-maelis: add RL and child subscriptions to test data (#75472) 2023-03-29 12:34:19 +02:00
Nicolas Roche f2a66ff67b toulouse-maelis: rewrite filtering in personnal catalog (#75752) 2023-03-29 12:05:39 +02:00
Nicolas Roche bd485a77df toulouse-maelis: get personnal catalogs for extra-sco and loisir (#75752) 2023-03-29 12:05:37 +02:00
Nicolas Roche d0b5c579b9 toulouse-maelis: remove nature codes parameter on loisir catalog (#75752) 2023-03-29 11:36:20 +02:00
Nicolas Roche 940979b2f3 toulouse-maelis: remove hard coded activity natures (#75752) 2023-03-29 11:36:20 +02:00
Nicolas Roche 9198bb87ce toulouse-maelis: add activity natures codes fields on connector (#75752) 2023-03-29 11:36:20 +02:00
Nicolas Roche 276794b487 toulouse-maelis: adapt migration to add encoder to JSON field (#75752) 2023-03-29 11:36:20 +02:00
Nicolas Roche 9676943a0d toulouse-maelis: factorize childs and RLs data (#75455) 2023-03-29 11:07:18 +02:00
Nicolas Roche 6210cf3fa3 toulouse-maelis: add endpoint to get RLs and childs data (#75455) 2023-03-29 11:07:18 +02:00
Nicolas Roche 4b64f39337 toulouse-maelis: add tests on agent access providing DUI (#75463) 2023-03-29 10:45:13 +02:00
Nicolas Roche a61f858ebe toulouse_maelis: ease retrieving maelis ids on nursery (#74446) 2023-03-29 10:30:21 +02:00
Nicolas Roche f467587386 toulouse-maelis: add content on read_nursery_list test (#74446) 2023-03-29 10:30:21 +02:00
Nicolas Roche 761391c484 toulouse-maelis: manage indicators on create_nursery_demand (#74445) 2023-03-29 10:13:09 +02:00
Benjamin Dauvergne c8c54b50cb teamnet_axel: do not log HTTP errors (#75790) 2023-03-28 17:26:26 +02:00
Nicolas Roche 6322b75699 Revert "toulouse-maelis: [tools] add raw SOAP scripts (#75927)"
This reverts commit 5b043a4030.
2023-03-28 15:42:30 +02:00
Nicolas Roche 5b043a4030 toulouse-maelis: [tools] add raw SOAP scripts (#75927) 2023-03-28 15:36:34 +02:00
Emmanuel Cazenave d656b273c9 api_entreprise: use v3 in match_mandataire_social endpoint (#75641) 2023-03-27 18:09:44 +02:00
Emmanuel Cazenave 5a44fdcdeb api_entreprise: delete effectifs/covid endpoints (#75640) 2023-03-27 18:08:51 +02:00
Emmanuel Cazenave 255933f904 api_entreprise: use v3 in exercices endpoint (#75621) 2023-03-27 18:06:48 +02:00
Benjamin Dauvergne 57d2bb06a9 translation update (#75649)
Remove references to maelis connector's messages.
2023-03-24 16:47:53 +01:00
Benjamin Dauvergne c7dcc2a510 misc: remove maelis connector (#75649) 2023-03-24 16:47:53 +01:00
Benjamin Dauvergne 2de6325c6d misc: replace interception of zeep.exceptions.Fault (#75649)
* replaced by no interception at all in some places, as SOAPError inherits
  from APIError
* replaced by SOAPFault in cartads_cs, as there was custom handling of
  the SOAP faults there.
* a new SOAPValidationError is added to handle translation to an APIError
  with a 400 status, without logging it as an error of the connector.
2023-03-24 16:47:53 +01:00
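For illustration only, a minimal, self-contained sketch of the interception pattern described in the commit message above, assuming a zeep call is wrapped so that SOAP faults surface as APIError subclasses. APIError and SOAPError below are stand-ins for the passerelle.utils classes, and call_soap is a hypothetical helper, not code from this changeset:

import zeep


class APIError(Exception):
    # stand-in for passerelle.utils.jsonresponse.APIError
    pass


class SOAPError(APIError):
    # stand-in for the SOAPError mentioned above; it inherits from APIError
    pass


def call_soap(operation, **kwargs):
    # hypothetical helper: call a zeep service operation and convert a
    # low-level zeep Fault into a SOAPError, so that the generic APIError
    # handling can turn it into a JSON error response
    try:
        return operation(**kwargs)
    except zeep.exceptions.Fault as fault:
        raise SOAPError(str(fault)) from fault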
Benjamin Dauvergne 1f93f5506d utils/soap: handle case where Fault.detail is a byte string (#75649) 2023-03-24 16:47:53 +01:00
Benjamin Dauvergne 9c85b556f2 misc: let django.views.static.serve do its job (#75700)
It already checks the security of the given path through
django.utils._os.safe_join() and checks if the file exists.
2023-03-24 16:47:23 +01:00
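As a point of reference for the commit above, a minimal sketch of delegating file serving to django.views.static.serve from a URL configuration; the view itself is standard Django, but the media/ prefix and document_root value here are illustrative only:

from django.conf import settings
from django.urls import re_path
from django.views.static import serve

urlpatterns = [
    # serve() validates the requested path (via django.utils._os.safe_join())
    # and raises Http404 when the file does not exist, so no extra checks
    # are needed before handing the path over
    re_path(r'^media/(?P<path>.*)$', serve, {'document_root': settings.MEDIA_ROOT}),
]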
Benjamin Dauvergne 2fbfeedb14 utils/pdf: do not crash on field without a /Rect (#75698) 2023-03-24 15:42:50 +01:00
Benjamin Dauvergne 64456cde9f utils/pdf: check radio buttons have kids (#75698) 2023-03-24 15:42:50 +01:00
Nicolas Roche 1fa564c70d toulouse-maelis: adapt functests with last changes (#74645) 2023-03-24 14:42:57 +01:00
Nicolas Roche 57b9c49afb toulouse_maelis: add a basket id parameter to basket endpoints (#75535) 2023-03-24 09:39:34 +01:00
Nicolas Roche 2b5eb786dc toulouse-maelis: rename get-basket endpoint that now handle multiple regies (#75535) 2023-03-24 09:10:50 +01:00
Nicolas Roche a0ef9e293e toulouse-maelis: add regie referential data (#75475) 2023-03-24 08:58:29 +01:00
Nicolas Roche 2e85f7e013 toulouse-maelis: display read-only extra-sco activities in child calendar (#75144) 2023-03-24 08:36:09 +01:00
Nicolas Roche 3cddb32080 toulouse-maelis: update activity calendar (#75144) 2023-03-24 08:36:09 +01:00
Nicolas Roche 479cfbb7be toulouse-maelis: get activity calendar after subscription (#75144) 2023-03-24 08:36:09 +01:00
Nicolas Roche d533a91ad4 toulouse-maelis: get activity calendar before subscription (#75144) 2023-03-24 08:36:09 +01:00
Nicolas Roche 7e66af136c toulouse-maelis: get last activity WSDL (#75584) 2023-03-24 08:14:10 +01:00
597 changed files with 9423 additions and 41789 deletions

.pre-commit-config.yaml
View File

@ -2,27 +2,27 @@
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/asottile/pyupgrade
rev: v3.1.0
rev: v3.3.1
hooks:
- id: pyupgrade
args: ['--keep-percent-format', '--py37-plus']
args: ['--keep-percent-format', '--py39-plus']
- repo: https://github.com/adamchainz/django-upgrade
rev: 1.10.0
rev: 1.13.0
hooks:
- id: django-upgrade
args: ['--target-version', '2.2']
args: ['--target-version', '3.2']
- repo: https://github.com/psf/black
rev: 22.3.0
rev: 23.3.0
hooks:
- id: black
args: ['--target-version', 'py37', '--skip-string-normalization', '--line-length', '110']
args: ['--target-version', 'py39', '--skip-string-normalization', '--line-length', '110']
- repo: https://github.com/PyCQA/isort
rev: 5.12.0
hooks:
- id: isort
args: ['--profile', 'black', '--line-length', '110']
- repo: https://github.com/rtts/djhtml
rev: 'v1.5.2'
rev: '3.0.6'
hooks:
- id: djhtml
args: ['--tabwidth', '2']

README
View File

@ -76,7 +76,7 @@ djhtml is used to automatically indent html files, using those parameters:
django-upgrade is used to automatically upgrade Django syntax, using those parameters:
django-upgrade --target-version 2.2
django-upgrade --target-version 3.2
There is .pre-commit-config.yaml to use pre-commit to automatically run these tools
before commits. (execute `pre-commit install` to install the git hook.)

debian/control
View File

@ -19,7 +19,7 @@ Depends: ghostscript,
python3-cmislib,
python3-dateutil,
python3-distutils,
python3-django (>= 2:2.2),
python3-django (>= 2:3.2),
python3-django-model-utils,
python3-feedparser,
python3-gadjo,

View File

@ -15,6 +15,9 @@ from zeep.helpers import serialize_object
FAMILY_PAYLOAD = {
'category': 'BI',
'situation': 'MARI',
'nbChild': '3',
'nbTotalChild': '4',
'nbAES': '1',
'rl1': {
'civility': 'MME',
'firstname': 'Marge',
@ -319,6 +322,7 @@ def remove_id_on_child(conn, child):
del child['indicators'] # order may change
child['subscribeSchoolList'] = [] # not managed by test yet
child['subscribeActivityList'] = [] # not managed by test yet
del child['subscribe_natures'] # order may change
def remove_id_on_rlg(conn, rlg):
@ -327,7 +331,10 @@ def remove_id_on_rlg(conn, rlg):
rlg['lastname'] = 'N/A'
remove_extra_indicators(conn, rlg['indicatorList'], 'rl-indicator')
rlg['indicatorList'].sort(key=lambda x: x['code'])
rlg['quotientList'].sort(key=lambda x: (x['yearRev'], x['dateStart']))
del rlg['indicators'] # order may change
rlg['subscribeActivityList'] = [] # not managed by test yet
del rlg['subscribe_natures'] # order may change
def remove_id_on_family(conn, family):
@ -396,7 +403,7 @@ def referentials(conn):
def create_data(request, conn):
name_id = request.config.getoption('--nameid')
unlink(conn, name_id)
lastname = uuid4().hex[0:30]
lastname = 'EO_' + uuid4().hex[0:27]
# create family
create_family_payload = copy.deepcopy(FAMILY_PAYLOAD)
@ -430,6 +437,8 @@ def create_data(request, conn):
'lastname': lastname,
'rl1_num': data['RL1']['num'],
'bart_num': data['childList'][0]['num'],
'maggie_num': data['childList'][1]['num'],
'hugo_num': data['childList'][2]['num'],
'data': data,
}
@ -501,3 +510,131 @@ def update_data(request, conn):
'maggie_num': data['childList'][2]['num'],
'data': data,
}
@pytest.fixture(scope='session')
def reference_year():
some_date = datetime.date.today()
if some_date.month <= 8:
# between january and august, reference year is the year just before
return some_date.year - 1
return some_date.year
def get_subscription_info(nature, activity_text, unit_text, place_text, con, name_id, person_id, year):
def select_item(resp, text):
item = None
for item in resp.json()['data']:
if item['text'] == text:
break
else:
raise Exception("do not find '%s'" % text)
return item
# select activity
url = con + '/get-person-activity-list'
params = {
'NameID': name_id,
'person_id': person_id,
'start_date': '%s-09-01' % year,
'end_date': '%s-08-31' % (year + 1),
'nature': nature,
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
activity = select_item(resp, activity_text)
# select unit
url = con + '/get-person-unit-list'
params = {
'NameID': name_id,
'person_id': person_id,
'start_date': '%s-09-01' % year,
'end_date': '%s-08-31' % (year + 1),
'activity_id': activity['id'],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
unit = select_item(resp, unit_text)
# select place
url = con + '/get-person-place-list'
params = {
'NameID': name_id,
'person_id': person_id,
'start_date': '%s-09-01' % year,
'end_date': '%s-08-31' % (year + 1),
'activity_id': activity['id'],
'unit_id': unit['id'],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
place = select_item(resp, place_text)
assert place['capacityInfo']['controlOK'] is True
# check subscription info
url = con + '/get-person-subscription-info'
params = {
'NameID': name_id,
'person_id': person_id,
'activity_id': activity['id'],
'unit_id': unit['id'],
'place_id': place['id'],
'ref_date': datetime.date.today().strftime('%Y-%m-%d'),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
info = resp.json()['data']
assert info['controlResult']['controlOK'] is True
return {
'activity': activity,
'unit': unit,
'place': place,
'info': info,
}
@pytest.fixture(scope='session')
def extrasco_subscribe_info(conn, create_data, reference_year):
unlink(conn, create_data['name_id'])
link(conn, create_data)
return get_subscription_info(
'EXTRASCO',
# Sigec made this extra-sco activity available for functests
'ADL ELEMENTAIRE Maourine Avril 2023',
'ADL ELEMENTAIRE Maourine Avril 2023',
'MAOURINE (la) ELEMENTAIRE',
conn,
create_data['name_id'],
create_data['bart_num'],
reference_year,
)
@pytest.fixture(scope='session')
def perisco_subscribe_info(conn, create_data, reference_year):
'''This fixture relies on a configuration trick from Sigec:
peri-sco activities should not be available for subscription
and, as a consequence, should not be displayed in catalogs'''
unlink(conn, create_data['name_id'])
link(conn, create_data)
return get_subscription_info(
None,
# Sigec made this peri-sco activity available for functests
'TEMPS DU MIDI 22/23',
'TEMPS DU MIDI 22/23',
'DOLTO FRANCOISE MATERNELLE',
conn,
create_data['name_id'],
create_data['bart_num'],
reference_year,
)

View File

@ -27,8 +27,8 @@
{
"id": "AVS",
"code": "AVS",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"typeDesc": "NONE",
"isActive": false
},

View File

@ -7,7 +7,8 @@
"dateBirth": "2016-05-09T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": null,
"bPhoto": false,

View File

@ -3,9 +3,9 @@
"category": "BI",
"situation": "MARI",
"flagCom": true,
"nbChild": null,
"nbTotalChild": null,
"nbAES": null,
"nbChild": 3,
"nbTotalChild": 4,
"nbAES": "1",
"RL1": {
"num": "N/A",
"firstname": "MARGE",
@ -17,7 +17,8 @@
"dateBirth": "1950-10-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": "2317",
@ -57,7 +58,8 @@
"dateBirth": "2014-04-01T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": true,
@ -186,7 +188,8 @@
"dateBirth": "2018-12-17T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_PAI",
"bPhoto": false,
@ -219,7 +222,8 @@
"dateBirth": "2018-04-01T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": false,

View File

@ -9,7 +9,8 @@
"dateBirth": "1956-05-12T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": null,

View File

@ -1,4 +1,12 @@
[
{
"id": "AVS",
"code": "AVS",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AUTRE",
"code": "AUTRE",
@ -15,14 +23,6 @@
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AVS",
"code": "AVS",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "ETABSPEC",
"code": "ETABSPEC",

View File

@ -219,7 +219,7 @@
{
"id": "37",
"code": "37",
"rang": "FAMILY",
"rang": "PERSON",
"text": "D\u00e9claration conjointe sign\u00e9e des parents",
"libelle": "D\u00e9claration conjointe sign\u00e9e des parents"
},
@ -261,14 +261,14 @@
{
"id": "64",
"code": "64",
"rang": "FAMILY",
"rang": "PERSON",
"text": "Jugement des affaires familiales",
"libelle": "Jugement des affaires familiales"
},
{
"id": "65",
"code": "65",
"rang": "FAMILY",
"rang": "PERSON",
"text": "Jugement mise sous tutelle",
"libelle": "Jugement mise sous tutelle"
},

View File

@ -1,4 +1,12 @@
[
{
"id": "AVS",
"code": "AVS",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AVL",
"code": "AVL",
@ -7,14 +15,6 @@
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "AVS",
"code": "AVS",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"typeDesc": "NONE",
"choiceList": []
},
{
"id": "ETABSPEC",
"code": "ETABSPEC",

File diff suppressed because it is too large

View File

@ -10,8 +10,8 @@
{
"id": "AVS",
"code": "AVS",
"text": "Auxiliaire de Vie scolaire",
"libelle": "Auxiliaire de Vie scolaire ",
"text": "Assistant de Vie scolaire",
"libelle": "Assistant de Vie scolaire ",
"typeDesc": "NONE",
"isActive": false
},

View File

@ -7,7 +7,8 @@
"dateBirth": "1970-01-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": false,

View File

@ -3,9 +3,9 @@
"category": "BI",
"situation": "MARI",
"flagCom": true,
"nbChild": null,
"nbTotalChild": null,
"nbAES": null,
"nbChild": 3,
"nbTotalChild": 4,
"nbAES": "1",
"RL1": {
"num": "N/A",
"firstname": "MARGE",
@ -17,7 +17,8 @@
"dateBirth": "1950-10-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": "2317",
@ -56,7 +57,8 @@
"dateBirth": "1956-05-12T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": null,
@ -127,7 +129,8 @@
"dateBirth": "2014-04-01T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": true,
@ -250,7 +253,8 @@
"dateBirth": "2016-05-09T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_SV",
"bPhoto": false,
@ -283,7 +287,8 @@
"dateBirth": "2018-12-17T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_PAI",
"bPhoto": false,
@ -316,7 +321,8 @@
"dateBirth": "2018-04-01T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": false,

View File

@ -3,9 +3,9 @@
"category": "AUTR",
"situation": "AUTR",
"flagCom": true,
"nbChild": null,
"nbTotalChild": null,
"nbAES": null,
"nbChild": 0,
"nbTotalChild": 0,
"nbAES": "0",
"RL1": {
"num": "N/A",
"firstname": "MARGE",
@ -17,7 +17,8 @@
"dateBirth": "1950-10-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": null,
@ -55,7 +56,8 @@
"dateBirth": "1956-05-12T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": null,
@ -111,7 +113,8 @@
"dateBirth": "1970-01-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": null,
"bPhoto": false,
@ -183,7 +186,8 @@
"dateBirth": "2016-05-09T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_SV",
"bPhoto": false,
@ -216,7 +220,8 @@
"dateBirth": "2018-12-17T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_PAI",
"bPhoto": false,
@ -249,7 +254,8 @@
"dateBirth": "2018-04-01T00:00:00+02:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"dietcode": "MENU_AV",
"bPhoto": false,

View File

@ -1,13 +1,4 @@
[
{
"yearRev": 2021,
"dateStart": "2022-01-02T00:00:00+01:00",
"dateEnd": "2022-12-31T00:00:00+01:00",
"mtt": 1500.33,
"cdquo": "1",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
},
{
"yearRev": 2020,
"dateStart": "2022-01-02T00:00:00+01:00",
@ -25,5 +16,14 @@
"cdquo": "1",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
},
{
"yearRev": 2021,
"dateStart": "2022-01-02T00:00:00+01:00",
"dateEnd": "2022-12-31T00:00:00+01:00",
"mtt": 1500.33,
"cdquo": "1",
"codeUti": null,
"cdquo_text": "Revenus fiscaux"
}
]

View File

@ -9,7 +9,8 @@
"dateBirth": "1950-10-01T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": "2317",

View File

@ -9,7 +9,8 @@
"dateBirth": "1956-05-12T00:00:00+01:00",
"place": null,
"communeCode": null,
"countryCode": null
"countryCode": null,
"cdDepartment": null
},
"adresse": {
"idStreet": null,

View File

@ -1,35 +0,0 @@
#!/usr/bin/python3
import argparse
import requests
import zeep
from zeep.transports import Transport
from zeep.wsse.username import UsernameToken
WSSE = UsernameToken('maelis-webservice', 'maelis-password')
WSDL_URL = 'https://demo-toulouse.sigec.fr/maelisws-toulouse-recette/services/FamilyService?wsdl'
def read_family(family_id, verbose):
session = requests.Session()
session.verify = False
transport = Transport(session=session)
settings = zeep.Settings(strict=False, xsd_ignore_sequence_order=True)
client = zeep.Client(WSDL_URL, transport=transport, wsse=WSSE, settings=settings)
result = client.service.readFamily(
dossierNumber=family_id,
# schoolYear=
# incomeYear=2022, # <-- pour filtrer les quotients sur cette année
# referenceYear=2020,
)
print(result)
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument('--verbose', '-v', type=int, default=2, help='display errors')
parser.add_argument('family_id', help='196544', nargs='?', default='196544')
args = parser.parse_args()
read_family(args.family_id, verbose=args.verbose)

View File

@ -34,4 +34,5 @@ def test_referentials(conn, referentials, ref):
for item in res['data']:
assert 'id' in item
assert 'text' in item
assert diff(res['data'], 'test_read_%s_list.json' % ref)
if ref not in ['street']:
assert diff(res['data'], 'test_read_%s_list.json' % ref)

View File

@ -9,6 +9,9 @@ from .conftest import diff, diff_child, diff_family, diff_rlg, link, read_family
FAMILY_RESET_PAYLOAD = {
'category': 'AUTR',
'situation': 'AUTR',
'nbChild': '',
'nbTotalChild': '',
'nbAES': '',
'rl1': {
'civility': 'MR', # no effect
'firstname': 'Marge', # must be
@ -230,7 +233,7 @@ def test_create_family(conn, create_data, update_data):
res = resp.json()
assert res['err'] == 1
assert 'Il existe déjà un Responsable Légal correspondant' in res['err_desc']
assert res['err_class'] == 'passerelle.utils.soap.SOAPFault'
assert res['err_class'] == 'passerelle.utils.jsonresponse.APIError'
# RL1 already exists (on update_data, as RL2) error
payload['rl1']['firstname'] = 'Homer'
@ -240,7 +243,7 @@ def test_create_family(conn, create_data, update_data):
res = resp.json()
assert res['err'] == 1
assert 'Il existe déjà un Responsable Légal correspondant' in res['err_desc']
assert res['err_class'] == 'passerelle.utils.soap.SOAPFault'
assert res['err_class'] == 'passerelle.utils.jsonresponse.APIError'
def test_is_rl_exists(conn, update_data):
@ -918,7 +921,7 @@ def test_add_supplied_document(conn, create_data):
res = resp.json()
assert res['err'] == 0
# push on childe
# push on child
payload['numPerson'] = create_data['bart_num']
url = conn + '/add-supplied-document?NameID=%s' % create_data['name_id']
resp = requests.post(url, json=payload)

View File

@ -0,0 +1,174 @@
import datetime
import requests
def test_perisco(perisco_subscribe_info):
assert perisco_subscribe_info['info']['activity']['libelle1'] == 'TEMPS DU MIDI 22/23'
def test_perisco_agenda(conn, create_data, perisco_subscribe_info):
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'unit_id': perisco_subscribe_info['unit']['id'],
'place_id': perisco_subscribe_info['place']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# find first available booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) > 0
booking = None
for booking in resp.json()['data']:
if booking['disabled'] is False:
break
else:
raise Exception("no booking available")
assert booking['details']['activity_id'] == perisco_subscribe_info['activity']['id']
assert booking['prefill'] is False
# book activity
url = conn + '/update-child-agenda?NameID=%s' % create_data['name_id']
payload = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [booking['id']],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json() == {
'updated': True,
'count': 1,
'changes': [
{
'booked': True,
'activity_id': booking['details']['activity_id'],
'activity_label': 'Restauration scolaire',
'day': booking['details']['day_str'],
}
],
'err': 0,
}
# check booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['bart_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [x['prefill'] for x in resp.json()['data'] if x['id'] == booking['id']][0] is True
def test_perisco_recurrent_week(conn, create_data, perisco_subscribe_info, reference_year):
# no subscribed activity
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 0
# subscription
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'unit_id': perisco_subscribe_info['unit']['id'],
'place_id': perisco_subscribe_info['place']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'nature': 'PERISCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert len(resp.json()['data']) == 1
assert resp.json()['data'][0]['id'] == perisco_subscribe_info['activity']['id']
# get the recurrent-week template
url = conn + '/get-recurrent-week?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'ref_date': datetime.date.today().strftime('%Y-%m-%d'),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert [(x['id'], x['day']) for x in resp.json()['data']] == [
('1-X', 'Lundi'),
('2-X', 'Mardi'),
('4-X', 'Jeudi'),
('5-X', 'Vendredi'),
]
# no booking
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['maggie_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert not any(x['prefill'] for x in resp.json()['data'])
# set the recurrent-week template
url = conn + '/update-recurrent-week?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': perisco_subscribe_info['activity']['id'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
'recurrent_week': ['1-X', '2-X'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
# there are now some bookings
url = conn + '/read-child-agenda?NameID=%s' % create_data['name_id']
params = {
'child_id': create_data['maggie_num'],
'start_date': perisco_subscribe_info['unit']['dateStart'][:10],
'end_date': perisco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert any(x['prefill'] for x in resp.json()['data'])

View File

@ -0,0 +1,21 @@
import datetime
import requests
from .conftest import link, unlink
# LOISIR is a subset of EXTRASCO, we only test the general catalog cell here
def test_catalog_general_loisirs(conn, update_data):
unlink(conn, update_data['name_id'])
link(conn, update_data)
url = conn + '/read-activity-list'
params = {'ref_date': datetime.date.today().strftime('%Y-%m-%d')}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
# still no nature code defined on LOISIR activities
assert len(resp.json()['data']) == 0

View File

@ -0,0 +1,198 @@
import requests
def test_catalog_personnalise_extrasco(extrasco_subscribe_info):
assert extrasco_subscribe_info['info']['activity']['libelle1'] == 'ADL ELEMENTAIRE Maourine Avril 2023'
assert extrasco_subscribe_info['info']['calendarGeneration']['code'] == 'REQUIRED'
assert extrasco_subscribe_info['info']['billingInformation'] == {
'modeFact': 'PRESENCE',
'quantity': None,
'unitPrice': 43.0,
}
def test_direct_subscribe(conn, create_data, extrasco_subscribe_info, reference_year):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
url = conn + '/add-person-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['hugo_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
# no idIns provided to remove subscription later
assert resp.json()['data'] == {'controlOK': True, 'message': None}
def test_subscribe_with_conveyance(conn, create_data, extrasco_subscribe_info):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
assert extrasco_subscribe_info['info']['conveyance'] is not None
morning = [
x['id'] for x in extrasco_subscribe_info['info']['conveyance']['morningJourney']['depositPlaceList']
]
afternoon = [
x['id'] for x in extrasco_subscribe_info['info']['conveyance']['afternoonJourney']['depositPlaceList']
]
assert len(morning) > 0
assert len(afternoon) > 0
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'conveyanceSubscribe/idPlaceMorning': morning[0],
'conveyanceSubscribe/idPlaceAfternoon': afternoon[0],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
basket_id = resp.json()['data']['basket']['id']
# remove subscription
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
def test_subscribe_with_recurrent_week(conn, create_data, extrasco_subscribe_info):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
assert [(x['id'], x['day']) for x in extrasco_subscribe_info['info']['recurrent_week']] == [
('1-C', 'Lundi'),
('1-B', 'Lundi'),
('2-C', 'Mardi'),
('2-B', 'Mardi'),
('3-C', 'Mercredi'),
('3-B', 'Mercredi'),
('4-C', 'Jeudi'),
('4-B', 'Jeudi'),
('5-C', 'Vendredi'),
('5-B', 'Vendredi'),
]
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'recurrent_week': ['1-B', '2-C'],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
basket_id = resp.json()['data']['basket']['id']
# there are now some bookings
url = conn + '/read-activity-agenda?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert any(x['prefill'] for x in resp.json()['data'])
# remove subscription
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
def test_subscribe_with_agenda(conn, create_data, extrasco_subscribe_info):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
def get_bookings():
url = conn + '/read-activity-agenda?NameID=%s' % create_data['name_id']
params = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
# subscribe without providing a calendar
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
basket_id = resp.json()['data']['basket']['id']
# no booking
assert not any(x['prefill'] for x in get_bookings())
# book using the info calendar template (booking registered from a w.c.s. form)
assert len(extrasco_subscribe_info['info']['agenda']) > 0
assert not any(x['prefill'] for x in extrasco_subscribe_info['info']['agenda'])
slots = [x['id'] for x in extrasco_subscribe_info['info']['agenda'] if x['disabled'] is False]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
# there are now 2 bookings
assert len([x['prefill'] for x in get_bookings() if x['prefill'] is True]) == 2
# unbook slots
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [False, False]
assert not any(x['prefill'] for x in get_bookings())
# remove subscription
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0

View File

@ -0,0 +1,190 @@
import requests
def test_basket_subscribe(conn, create_data, extrasco_subscribe_info, reference_year):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
def subscriptions(person_id):
url = conn + '/read-subscribe-activity-list?NameID=%s' % create_data['name_id']
params = {
'person_id': person_id,
'nature': 'EXTRASCO',
'school_year': '%s-%s' % (reference_year, reference_year + 1),
}
resp = requests.get(url, params=params)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
# no subscription
assert subscriptions(create_data['bart_num']) == []
assert subscriptions(create_data['maggie_num']) == []
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 3
assert len(subscriptions(create_data['bart_num'])) == 1
assert subscriptions(create_data['maggie_num']) == []
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 105
assert data[0]['text'] == 'ENFANCE LOISIRS ET PE'
assert len(data[0]['lignes']) == 3
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
# get 3 idIns because we subscribed to a generic unit
assert len({x['idIns'] for x in data[0]['lignes']}) == 3
basket_id = data[0]['id']
# cannot subscribe Bart twice
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 1
# assert 'E1019' in resp.json()['err_desc'] #2206
assert len(get_baskets()) == 1
# delete basket
# should be called by the user or by a cron job
url = conn + '/delete-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['data'] == 'ok'
assert get_baskets() == []
assert subscriptions(create_data['bart_num']) == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len(subscriptions(create_data['bart_num'])) == 1
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['maggie_num'])) == 1
# delete (generic) basket line for Bart
data = get_baskets()
assert len(data) == 1
assert len(data[0]['lignes']) == 6
basket_id = data[0]['id']
# idIns for the generic unit
line_id = [
y['id']
for x in data
for y in x['lignes']
if y['personneInfo']['numPerson'] == int(create_data['bart_num'])
if y['inscription']['idUnit'] == extrasco_subscribe_info['unit']['id']
][0]
url = conn + '/delete-basket-line?NameID=%s' % create_data['name_id']
payload = {
'basket_id': basket_id,
'line_id': line_id,
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['codeRegie'] == None # #2207
assert len({x['personneInfo']['numPerson'] for x in data['lignes']}) == 1
assert len({x['idIns'] for x in data['lignes']}) == 3
data = get_baskets()
assert len(data) == 1
assert len(get_baskets()) == 1
assert len(data[0]['lignes']) == 3
assert subscriptions(create_data['bart_num']) == []
assert len(subscriptions(create_data['maggie_num'])) == 1
# re-subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
assert len(subscriptions(create_data['bart_num'])) == 1
# add bookings to Bart
slots = [x['id'] for x in extrasco_subscribe_info['info']['agenda'] if x['disabled'] is False]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['bart_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
# add bookings to Maggie
slots = [':'.join([create_data['maggie_num']] + x.split(':')[1:]) for x in slots]
url = conn + '/update-activity-agenda/?NameID=%s' % create_data['name_id']
payload = {
'person_id': create_data['maggie_num'],
'activity_id': extrasco_subscribe_info['activity']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
'booking_list': [slots[0], slots[-1]],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
assert resp.json()['updated'] is True
assert [x['booked'] for x in resp.json()['changes']] == [True, True]
# validate basket
url = conn + '/validate-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data['idInsLst']) == 6
assert len(data['factureLst']) == 0 # No invoice #2187
assert get_baskets() == []
assert len(subscriptions(create_data['bart_num'])) == 1
assert len(subscriptions(create_data['maggie_num'])) == 1
# call cancelInvoiceAndDeleteSubscribeList to remove subscriptions

View File

@ -0,0 +1,109 @@
import pytest
import requests
from .conftest import diff, link, unlink
def test_direct_debit_order(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/add-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
payload = {
'codeRegie': '102',
'bank/bankBIC': 'BDFEFR2T',
'bank/bankIBAN': 'FR7630001007941234567890185',
'bank/bankRUM': 'xxx',
'bank/dateStart': '2023-01-01',
'bank/bankAddress': '75049 PARIS cedex 01',
'bank/civility': 'x',
'bank/lastName': 'Ewing',
'bank/firstName': 'John Ross',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
assert res['data'] == 'ok'
url = conn + '/get-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
params = {
'codeRegie': '102',
'dateRef': '2023-01-01',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
res['data']['numPerson'] = 'N/A'
assert diff(res['data'], 'test_get_rl1_direct_debit_order.json')
@pytest.mark.xfail(run=False)
def test_basket_subscribe(conn, create_data, extrasco_subscribe_info, reference_year):
assert extrasco_subscribe_info['info']['controlResult']['controlOK'] is True
def get_baskets():
url = conn + '/get-baskets?NameID=%s' % create_data['name_id']
resp = requests.get(url)
resp.raise_for_status()
assert resp.json()['err'] == 0
return resp.json()['data']
def subscribe(person_id):
url = conn + '/add-person-basket-subscription?NameID=%s' % create_data['name_id']
payload = {
'person_id': person_id,
'activity_id': extrasco_subscribe_info['activity']['id'],
'unit_id': extrasco_subscribe_info['unit']['id'],
'place_id': extrasco_subscribe_info['place']['id'],
'start_date': extrasco_subscribe_info['unit']['dateStart'][:10],
'end_date': extrasco_subscribe_info['unit']['dateEnd'][:10],
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
return resp
# empty basket
assert get_baskets() == []
# subscribe Bart
resp = subscribe(create_data['bart_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert data['basket']['codeRegie'] == 105
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 1
assert len({x['idIns'] for x in data['basket']['lignes']}) == 3
# subscribe Maggie
resp = subscribe(create_data['maggie_num'])
assert resp.json()['err'] == 0
data = resp.json()['data']
assert data['controlResult'] == {'controlOK': True, 'message': None}
assert len({x['personneInfo']['numPerson'] for x in data['basket']['lignes']}) == 2
# basket
data = get_baskets()
assert len(data) == 1
assert data[0]['codeRegie'] == 105
assert data[0]['text'] == 'ENFANCE LOISIRS ET PE'
assert len(data[0]['lignes']) == 3
assert len({x['personneInfo']['numPerson'] for x in data[0]['lignes']}) == 1
# get 3 idIns because we subscribed to a generic unit
assert len({x['idIns'] for x in data[0]['lignes']}) == 3
basket_id = data[0]['id']
# validate basket
url = conn + '/validate-basket?NameID=%s' % create_data['name_id']
payload = {'basket_id': basket_id}
resp = requests.post(url, json=payload)
resp.raise_for_status()
assert resp.json()['err'] == 0
data = resp.json()['data']
assert len(data['idInsLst']) == 6
assert len(data['factureLst']) == 0
assert get_baskets() == []
# to continue:
# cancelInvoiceAndDeleteSubscribeList
# payInvoice

View File

@ -1,36 +0,0 @@
import requests
from .conftest import diff, link, unlink
def test_direct_debit_order(conn, create_data):
unlink(conn, create_data['name_id'])
link(conn, create_data)
url = conn + '/add-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
payload = {
'codeRegie': '1',
'bank/bankBIC': 'BDFEFR2T',
'bank/bankIBAN': 'FR7630001007941234567890185',
'bank/bankRUM': 'xxx',
'bank/dateStart': '2023-01-01',
'bank/bankAddress': '75049 PARIS cedex 01',
'bank/civility': 'x',
'bank/lastName': 'Ewing',
'bank/firstName': 'John Ross',
}
resp = requests.post(url, json=payload)
resp.raise_for_status()
res = resp.json()
assert res['data'] == 'ok'
url = conn + '/get-rl1-direct-debit-order?NameID=%s' % create_data['name_id']
params = {
'codeRegie': '1',
'dateRef': '2023-01-01',
}
resp = requests.get(url, params=params)
resp.raise_for_status()
res = resp.json()
res['data']['numPerson'] = 'N/A'
assert diff(res['data'], 'test_get_rl1_direct_debit_order.json')

View File

@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base', '0006_resourcestatus'),
]

View File

@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('actesweb', '0001_initial'),
]

View File

@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base', '0005_resourcelog'),
]

View File

@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('airquality', '0001_initial'),
]

View File

@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('airquality', '0002_auto_20170920_0951'),
]

View File

@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('airquality', '0003_remove_airquality_log_level'),
]

View File

@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [

View File

@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api_entreprise', '0001_initial'),
]

View File

@ -14,7 +14,6 @@ def remove_url_path(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
('api_entreprise', '0002_auto_20190701_1357'),
]

View File

@ -432,64 +432,7 @@ class APIEntreprise(BaseResource):
},
)
def exercices(self, request, siret, **kwargs):
return self.get('v2/exercices/%s/' % siret, **kwargs)
@endpoint(
perm='can_access',
pattern=r'(?P<siren>\w+)/$',
example_pattern='{siren}/',
description=_('Get firm\'s annual workforce data'),
parameters={
'siren': SIREN_PARAM,
'object': OBJECT_PARAM,
'context': CONTEXT_PARAM,
'recipient': RECIPIENT_PARAM,
},
)
def effectifs_annuels_acoss_covid(self, request, siren, **kwargs):
if len(siren) != 9:
raise APIError(_('invalid SIREN length (must be 9 characters)'))
return self.get('v2/effectifs_annuels_acoss_covid/%s/' % siren, **kwargs)
@endpoint(
perm='can_access',
pattern=r'(?P<year>\w+)/(?P<month>\w+)/(?P<siren>\w+)/$',
description=_('Get firm\'s monthly workforce data, by SIREN'),
parameters={
'year': YEAR_PARAM,
'month': MONTH_PARAM,
'siren': SIREN_PARAM,
'object': OBJECT_PARAM,
'context': CONTEXT_PARAM,
'recipient': RECIPIENT_PARAM,
},
)
def entreprise_effectifs_mensuels_acoss_covid(self, request, year, month, siren, **kwargs):
if len(siren) != 9:
raise APIError(_('invalid SIREN length (must be 9 characters)'))
month = month.zfill(2)
return self.get(
'v2/effectifs_mensuels_acoss_covid/%s/%s/entreprise/%s/' % (year, month, siren), **kwargs
)
@endpoint(
perm='can_access',
pattern=r'(?P<year>\w+)/(?P<month>\w+)/(?P<siret>\w+)/$',
description=_('Get firm\'s monthly workforce data, by SIRET'),
parameters={
'year': YEAR_PARAM,
'month': MONTH_PARAM,
'siret': SIRET_PARAM,
'object': OBJECT_PARAM,
'context': CONTEXT_PARAM,
'recipient': RECIPIENT_PARAM,
},
)
def etablissement_effectifs_mensuels_acoss_covid(self, request, year, month, siret, **kwargs):
month = month.zfill(2)
return self.get(
'v2/effectifs_mensuels_acoss_covid/%s/%s/etablissement/%s/' % (year, month, siret), **kwargs
)
return self.get('v3/dgfip/etablissements/%s/chiffres_affaires' % siret, raw=True, **kwargs)
@endpoint(
perm='can_access',
@ -511,18 +454,17 @@ class APIEntreprise(BaseResource):
def match_mandataire_social(
self, request, siren, first_name, last_name, birthdate, method='simple', **kwargs
):
entreprise = self.get(
'v2/entreprises/%s/' % siren,
raw=True,
**kwargs,
)
mandataires = self.get(
'v3/infogreffe/rcs/unites_legales/%s/mandataires_sociaux' % siren, raw=True, **kwargs
).get('data', [])
methods = {
'simple': simple_match,
'levenshtein': levenshtein_match,
}
if method not in methods:
return {'err': 1, 'err_desc': 'method %s not implemented' % method}
for mandataire in entreprise.get('entreprise', {}).get('mandataires_sociaux', []):
for mandataire in mandataires:
if methods[method](mandataire, first_name, last_name, birthdate):
return {'err': 0, 'data': mandataire}
return {'err': 0, 'data': {}}
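The rewritten match_mandataire_social appears to fetch the mandataires through the v3 Infogreffe endpoint instead of reading them off the v2 entreprises payload, then delegates the comparison to simple_match or levenshtein_match, neither of which is shown in this hunk. A hedged guess at the kind of check simple_match performs; the field names on the mandataire record are assumptions:

def simple_match(mandataire, first_name, last_name, birthdate):
    # Illustrative only: case-insensitive equality on both names plus an exact
    # birthdate match; the real helper may normalise accents or be stricter.
    return (
        mandataire.get('prenom', '').strip().lower() == first_name.strip().lower()
        and mandataire.get('nom', '').strip().lower() == last_name.strip().lower()
        and mandataire.get('date_naissance', '') == birthdate
    )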


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base', '0002_auto_20151009_0326'),
]


@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('api_particulier', '0001_initial'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api_particulier', '0002_auto_20181118_0807'),
]


@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('api_particulier', '0003_auto_20190212_0426'),
]


@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api_particulier', '0004_auto_20190215_0807'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api_particulier', '0005_auto_20210610_1508'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base', '0002_auto_20151009_0326'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('arcgis', '0001_initial'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('arcgis', '0002_auto_20170920_0951'),
]


@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('arcgis', '0003_auto_20181102_1550'),
]


@ -8,7 +8,6 @@ import passerelle.utils.templates
class Migration(migrations.Migration):
dependencies = [
('arcgis', '0004_remove_arcgis_log_level'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('arcgis', '0005_auto_20200310_1517'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('arcgis', '0006_auto_20200401_1025'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base', '0006_resourcestatus'),
]


@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('arpege_ecp', '0001_initial'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('astre_rest', '0001_initial'),
]


@ -172,7 +172,6 @@ AUTH_CHOICES = [
class AstreREST(BaseResource):
base_url = models.URLField(_('API URL'))
username = models.CharField(_('Username'), max_length=32)
password = models.CharField(_('Password'), max_length=32)


@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('astregs', '0001_initial'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -21,10 +21,8 @@ import urllib
import lxml.etree
from django.db import models
from django.utils import dateformat, dateparse
from django.utils.encoding import force_str
from django.utils.translation import gettext_lazy as _
from zeep import helpers
from zeep.exceptions import Fault
from passerelle.base.models import BaseResource
from passerelle.utils.api import endpoint
@ -58,10 +56,7 @@ class ATALConnector(BaseResource):
def _soap_call(self, wsdl, method, **kwargs):
client = self._soap_client(wsdl=wsdl)
try:
return getattr(client.service, method)(**kwargs)
except Fault as e:
raise APIError(force_str(e))
return getattr(client.service, method)(**kwargs)
def _basic_ref(self, wsdl, method):
soap_res = self._soap_call(wsdl=wsdl, method=method)

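Read together with the import hunk above, this change appears to drop the per-call wrapping of zeep Fault exceptions. The pattern visible in the removed lines is sketched below for context, with a stand-in APIError class so the snippet is self-contained; whether Passerelle now converts SOAP faults somewhere more central is not visible in this diff.

from django.utils.encoding import force_str
from zeep.exceptions import Fault


class APIError(Exception):
    """Stand-in for the connector's APIError, to keep this sketch self-contained."""


def soap_call(client, method, **kwargs):
    # Same shape as the removed code: surface zeep faults as an API error.
    try:
        return getattr(client.service, method)(**kwargs)
    except Fault as exc:
        raise APIError(force_str(exc))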

@ -1,12 +1,10 @@
# Generated by Django 1.11.10 on 2018-09-18 09:42
import django.contrib.postgres.fields.jsonb
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
@ -24,7 +22,7 @@ class Migration(migrations.Migration):
('name_id', models.CharField(max_length=256, verbose_name='NameID')),
('id_per', models.CharField(max_length=64, verbose_name='ID Per')),
('created', models.DateTimeField(auto_now_add=True, verbose_name='Creation date')),
('extra', django.contrib.postgres.fields.jsonb.JSONField(null=True, verbose_name='Anything')),
('extra', models.JSONField(null=True, verbose_name='Anything')),
],
options={
'ordering': ['created'],

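This hunk, and several later ones, apply the same mechanical change: django.contrib.postgres.fields.jsonb.JSONField is replaced by models.JSONField, which since Django 3.1 is the cross-database JSON field (the postgres-specific one is deprecated and later removed). A minimal sketch of the resulting field declaration; the model name is illustrative, only the two fields are taken from the hunk:

from django.db import models


class Link(models.Model):
    # Mirrors the fields visible in the migration hunk above.
    name_id = models.CharField(max_length=256, verbose_name='NameID')
    extra = models.JSONField(null=True, verbose_name='Anything')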

@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('atos_genesys', '0001_initial'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('atos_genesys', '0002_remove_resource_log_level'),
]


@ -6,7 +6,6 @@ from passerelle.utils.db import EnsureJsonbType
class Migration(migrations.Migration):
dependencies = [
('atos_genesys', '0003_auto_20200504_1402'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('atos_genesys', '0004_text_to_jsonb'),
]


@ -19,8 +19,8 @@ import xml.etree.ElementTree as ET
from urllib import parse as urlparse
import requests
from django.contrib.postgres.fields import JSONField
from django.db import models
from django.db.models import JSONField
from django.utils.translation import gettext_lazy as _
from passerelle.base.models import BaseResource, HTTPResource


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base', '0001_initial'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0001_initial'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0002_auto_20150705_0330'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0003_baseaddresse_log_level'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0004_auto_20160316_0910'),
]


@ -2,7 +2,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0005_auto_20160407_0456'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0006_rename_model'),
]


@ -2,7 +2,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0007_auto_20160729_1540'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0008_delete_updatestreetmodel'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0009_streetmodel_simple_name'),
]


@ -2,7 +2,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0010_auto_20160914_0826'),
]


@ -2,7 +2,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0011_auto_20160919_0949'),
]


@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0012_auto_20170920_0951'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0013_remove_baseadresse_log_level'),
]


@ -7,7 +7,6 @@ import passerelle.apps.base_adresse.models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0014_auto_20190207_0456'),
]


@ -1,11 +1,9 @@
# Generated by Django 1.11.18 on 2020-01-30 15:04
import django.contrib.postgres.fields.jsonb
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0015_auto_20191206_1244'),
]
@ -19,7 +17,7 @@ class Migration(migrations.Migration):
models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
('api_id', models.CharField(max_length=30, unique=True)),
('data', django.contrib.postgres.fields.jsonb.JSONField(default=dict)),
('data', models.JSONField(default=dict)),
('timestamp', models.DateTimeField(auto_now=True)),
],
),


@ -1,11 +1,9 @@
# Generated by Django 1.11.18 on 2020-05-04 12:02
import django.contrib.postgres.fields.jsonb
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0016_auto_20200130_1604'),
]
@ -14,7 +12,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='addresscachemodel',
name='data',
field=django.contrib.postgres.fields.jsonb.JSONField(),
field=models.JSONField(),
),
migrations.AlterField(
model_name='baseadresse',


@ -6,7 +6,6 @@ from passerelle.utils.db import EnsureJsonbType
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0017_auto_20200504_1402'),
]


@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("base_adresse", "0018_text_to_jsonb"),
]


@ -13,7 +13,6 @@ def set_streetmodel_resource(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
("base_adresse", "0019_streetmodel_resource_add"),
]


@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("base_adresse", "0020_streetmodel_resource_runpython"),
]


@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0021_streetmodel_resource_alter'),
]


@ -23,7 +23,6 @@ def set_resource(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0022_resource_in_models_add'),
]


@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0023_resource_in_models_runpython'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('sector', '0001_initial'),
('base_adresse', '0024_resource_in_models_alter'),


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0025_baseadresse_sectors'),
]


@ -9,7 +9,6 @@ def forwards(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0026_streetmodel_ban_id'),
]


@ -9,7 +9,6 @@ def forwards(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0027_auto_20220603_0456'),
]


@ -2,7 +2,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0028_alter_streetmodel_ban_id'),
]


@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('base_adresse', '0029_auto_20220624_0827'),
]


@ -5,10 +5,9 @@ import json
from io import StringIO
from urllib import parse as urlparse
from django.contrib.postgres.fields import JSONField
from django.core.exceptions import FieldError
from django.db import connection, models
from django.db.models import Q
from django.db.models import JSONField, Q
from django.utils import timezone
from django.utils.http import urlencode
from django.utils.translation import gettext_lazy as _
@ -679,7 +678,6 @@ class UnaccentNameMixin:
class StreetModel(UnaccentNameMixin, models.Model):
ban_id = models.CharField(_('BAN Identifier'), max_length=32, null=True)
city = models.CharField(_('City'), max_length=150)
name = models.CharField(_('Street name'), max_length=150)
@ -699,7 +697,6 @@ class StreetModel(UnaccentNameMixin, models.Model):
class RegionModel(UnaccentNameMixin, models.Model):
name = models.CharField(_('Region name'), max_length=150)
unaccent_name = models.CharField(_('Region name ascii char'), max_length=150, null=True)
code = models.CharField(_('Region code'), max_length=3)
@ -724,7 +721,6 @@ class RegionModel(UnaccentNameMixin, models.Model):
class DepartmentModel(UnaccentNameMixin, models.Model):
name = models.CharField(_('Department name'), max_length=100)
unaccent_name = models.CharField(_('Department name ascii char'), max_length=150, null=True)
code = models.CharField(_('Department code'), max_length=3)
@ -752,7 +748,6 @@ class DepartmentModel(UnaccentNameMixin, models.Model):
class CityModel(UnaccentNameMixin, models.Model):
name = models.CharField(_('City name'), max_length=150)
unaccent_name = models.CharField(_('City name ascii char'), max_length=150, null=True)
code = models.CharField(_('INSEE code'), max_length=5)

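Several of these address models pair a name field with an unaccent_name field through UnaccentNameMixin, whose body is not part of this diff. A heavily hedged guess at the kind of behaviour such a mixin usually provides, purely for illustration (the real one might instead rely on PostgreSQL's unaccent machinery at query time):

import unicodedata


class UnaccentNameMixin:
    """Illustrative guess: keep an accent-free copy of name for searching."""

    def save(self, *args, **kwargs):
        self.unaccent_name = (
            unicodedata.normalize('NFKD', self.name).encode('ascii', 'ignore').decode()
        )
        super().save(*args, **kwargs)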

@ -2,13 +2,11 @@
import uuid
import django.contrib.postgres.fields.jsonb
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
@ -102,27 +100,21 @@ class Migration(migrations.Migration):
('last_time_running', models.DateTimeField(null=True, verbose_name='Last time running')),
(
'create_parameters',
django.contrib.postgres.fields.jsonb.JSONField(
null=True, verbose_name='Create parameters'
),
models.JSONField(null=True, verbose_name='Create parameters'),
),
('user_full_name', models.TextField(null=True, verbose_name='User full name')),
('agent_full_name', models.TextField(null=True, verbose_name='Agent full name')),
(
'join_user_parameters',
django.contrib.postgres.fields.jsonb.JSONField(
null=True, verbose_name='Join user parameters'
),
models.JSONField(null=True, verbose_name='Join user parameters'),
),
(
'join_agent_parameters',
django.contrib.postgres.fields.jsonb.JSONField(
null=True, verbose_name='Join agent parameters'
),
models.JSONField(null=True, verbose_name='Join agent parameters'),
),
(
'metadata',
django.contrib.postgres.fields.jsonb.JSONField(null=True, verbose_name='Metadata'),
models.JSONField(null=True, verbose_name='Metadata'),
),
(
'resource',

Some files were not shown because too many files have changed in this diff.