Consultation dates
The `eds.consultation_dates` matcher consists of two main parts:
- A matcher that finds mentions of consultation events (more details below)
- A date parser (see the corresponding pipe) that links a date to those events
Examples
Note
The matcher has been built to run on consultation notes (CR-CONS at AP-HP), so please filter accordingly before proceeding.
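As a minimal sketch of such filtering, assuming the notes live in a pandas DataFrame with an OMOP-style `note_class_source_value` column (the column name is an assumption; adapt it to your own data model):

```python
import pandas as pd

# Hypothetical notes table; `note_class_source_value` (assumed column name)
# holds the document type.
notes = pd.DataFrame(
    {
        "note_id": [1, 2],
        "note_class_source_value": ["CR-CONS", "CR-HOSP"],
        "note_text": [
            "Objet : Compte-Rendu de Consultation du 03/10/2018.",
            "Compte-rendu d'hospitalisation...",
        ],
    }
)

# Keep only consultation notes before applying the pipeline.
consultation_notes = notes[notes["note_class_source_value"] == "CR-CONS"]
```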
```python
import edsnlp

nlp = edsnlp.blank("eds")
nlp.add_pipe("eds.sentences")
nlp.add_pipe(
    "eds.normalizer",
    config=dict(
        lowercase=True,
        accents=True,
        quotes=True,
        pollution=False,
    ),
)
nlp.add_pipe("eds.consultation_dates")

text = """
XXX
Objet : Compte-Rendu de Consultation du 03/10/2018.
XXX
"""

doc = nlp(text)

doc.spans["consultation_dates"]
# Out: [Consultation du 03/10/2018]

doc.spans["consultation_dates"][0]._.consultation_date.to_datetime()
# Out: DateTime(2018, 10, 3, 0, 0, 0, tzinfo=Timezone('Europe/Paris'))
```
Extensions
The `eds.consultation_dates` pipeline declares one extension on the `Span` object: the `consultation_date` attribute, which is a Python datetime object.
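As a minimal sketch, reusing the `doc` produced in the example above, the extension can be read directly from the matched spans:

```python
# Each consultation mention carries the parsed date in its extension.
for span in doc.spans["consultation_dates"]:
    print(span.text, span._.consultation_date.to_datetime())
```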
Parameters
| Parameter | Description |
|---|---|
| `nlp` | The language pipeline object. |
| `consultation_mention` | List of RegEx for consultation mentions. This list contains terms directly referring to consultations, such as "Consultation du..." or "Compte rendu du...". It is the only list enabled by default, since it is fairly precise and not error-prone. |
| `town_mention` | List of RegEx for mentions of the towns of AP-HP hospitals. Its goal is to fetch dates written as "Paris, le 13 décembre 2015". It has high recall but poor precision, since those dates are often the date the letter was written rather than the consultation date. |
| `document_date_mention` | List of RegEx for the document date. This list contains expressions mentioning the date of creation/edition of a document, such as "Date du rapport: 13/12/2015" or "Signé le 13/12/2015". Like `town_mention`, it is disabled by default. |
Authors and citation
The `eds.consultation_dates` pipeline was developed by AP-HP's Data Science team.