<h2 class="wp-block-heading">The Doctoral School: Ecole Doctorale de l'Institut Polytechnique de Paris<br>and the SAMOVAR research laboratory – Services répartis, Architectures, MOdélisation, Validation, Administration des Réseaux</h2>

<p>present</p>

<h2 class="wp-block-heading">the DEFENCE ANNOUNCEMENT of Mr Paul GUELORGET</h2>

<p>Authorised to present his work for the degree of Doctor of the Institut Polytechnique de Paris, prepared at Télécom SudParis in:</p>

<h2 class="wp-block-heading">Signal, Images, Automatic Control and Robotics</h2>

<h1 class="wp-block-heading">« Active learning for the detection of objects of operational interest in multimedia content »</h1>

<p>Friday 9 December 2022 at 2:00 pm</p>

<p><strong>Room 4A467</strong><br>Télécom SudParis – 19 place Marguerite Perey, 91120 Palaiseau, France</p>

<p>Or via the Zoom link to the virtual defence room:<br><a rel="noreferrer noopener" href="https://telecom-paris.zoom.us/j/97907486366?pwd=MUNvcXlKK0FhWVNMNzFZRGQ0M3Y1QT09" target="_blank">https://telecom-paris.zoom.us/j/97907486366?pwd=MUNvcXlKK0FhWVNMNzFZRGQ0M3Y1QT09</a></p>

<p>Meeting ID: 979 0748 6366<br>Passcode: 848029<br>One-tap mobile:<br>+33186995831,,97907486366#,,,,*848029# France<br>+33170372246,,97907486366#,,,,*848029# France</p>

<p>Dial by your location:<br>+33 1 8699 5831 France<br>+33 1 7037 2246 France<br>+33 1 7037 9729 France<br>+33 1 7095 0103 France<br>+33 1 7095 0350 France<br>Find your local number: <a target="_blank" href="https://telecom-paris.zoom.us/u/aeiq2fm6ax" rel="noreferrer noopener">https://telecom-paris.zoom.us/u/aeiq2fm6ax</a></p>

<p>Join using a SIP protocol:<br>97907486366@zoomcrc.com</p>

<p>Join using an H.323 protocol:<br>162.255.37.11 (United States, West)<br>162.255.36.11 (United States, East)<br>115.114.131.7 (Mumbai, India)<br>115.114.115.7 (Hyderabad, India)<br>213.19.144.110 (Amsterdam, Netherlands)<br>213.244.140.110 (Germany)<br>103.122.166.55 (Sydney, Australia)<br>103.122.167.55 (Melbourne, Australia)<br>149.137.40.110 (Singapore)<br>64.211.144.160 (Brazil)<br>149.137.68.253 (Mexico)<br>69.174.57.160 (Toronto, Canada)<br>65.39.152.160 (Vancouver, Canada)<br>207.226.132.110 (Tokyo, Japan)<br>149.137.24.110 (Osaka, Japan)</p>

<p><strong>Members of the jury:</strong></p>

<p><strong>Mr Titus ZAHARIA</strong>, Professor, Télécom SudParis, FRANCE – Thesis supervisor<br><strong>Mr Alexis JOLY</strong>, Research Director, Université de Montpellier, FRANCE – Reviewer<br><strong>Ms Jenny BENOIS-PINEAU</strong>, Professor, Université de Bordeaux, FRANCE – Reviewer<br><strong>Ms Anne VERROUST-BLONDET</strong>, Research Director, INRIA Paris, FRANCE – Examiner<br><strong>Mr Azeddine BEGHDADI</strong>, Professor, Université Sorbonne Paris Nord, FRANCE – Examiner<br><strong>Mr Bruno GRILHERES</strong>, Scientific executive (EPIC), Airbus Defence and Space, FRANCE – Thesis co-supervisor</p>

<p><strong>Summary:</strong></p>

<p>A profusion of open-source content, actors and interactions is targeted by analysts for commercial, political or intelligence purposes. Analysing the immensity of these data requires automated assistance. Although recent neural network architectures have demonstrated strong capabilities for the image and text modalities, their training relies on massive datasets, which do not exist for the majority of classes of operational interest. To address this problem, active learning takes advantage of the large quantity of unannotated documents by asking a human oracle for the labels of the presumably most informative documents, in order to improve accuracy. However, the rationales behind the model's decisions are opaque and unrelated to those of the oracle. Moreover, because of its long successive steps, the active learning workflow is detrimental to its real-time performance. Our contributions in this thesis aim to analyse and address these problems at four levels. First, we observe the rationales behind a neural network's decisions. Second, we put these rationales into perspective with those produced by humans. Third, we encourage a neural network to align its rationales with those of a teacher model that simulates a human oracle's, and improve its accuracy. Finally, we design and exploit an active learning system to overcome its usual limitations. These studies were conducted on uni-modal text or image data, or on multi-modal text/image pairs, mainly press articles in English and French. Throughout the chapters of this thesis, we address several use cases, including the recognition of vagueness and fake news, the detection of a lack of contradictory viewpoints in articles, and the classification of articles as addressing arbitrarily chosen topics such as demonstrations or violence.</p>

<p><strong>Abstract: « Active learning for the detection of objects of operational interest in open-source multimedia content »</strong></p>

<p>A profusion of openly accessible content, actors and interactions is targeted by analysts for intelligence, marketing or political purposes. Analysing the immensity of open-source data requires automated assistance. Although recent neural network architectures have demonstrated strong capabilities for the image and text modalities, their training harnesses massive datasets, which do not exist for the majority of operational classes of interest. To address this issue, active learning takes advantage of the large amounts of unlabelled documents by soliciting from a human oracle the ground-truth labels of the presumably most informative documents, in order to improve accuracy. Yet the model's decision-making rationales are opaque and may be unrelated to those of the oracle. Furthermore, with its time-consuming iterative steps, the active learning workflow is detrimental to real-time performance. Our contributions in this thesis aim to analyse and address these issues at four levels. First, we observe the rationales behind a neural network's decisions. Second, we put these rationales into perspective with human rationales. Third, we make the neural network align its decision-making rationales with those of a teacher model that simulates the rationales of a human oracle, improving accuracy in what is called active learning with rationales. Finally, we design and exploit an active learning framework to overcome its usual limitations. These studies were conducted on uni-modal text and image data and on multi-modal text/image associations, principally press articles in English and French. Throughout this work's chapters, we address several use cases, including fake news classification, vagueness classification, the detection of a lack of contradiction in articles, and the detection of arbitrarily chosen topics such as demonstrations and violence.</p>
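<p>The pool-based active-learning loop the abstract describes (train on a small annotated set, ask a human oracle to label the presumably most informative documents, retrain) can be sketched as a minimal uncertainty-sampling loop. This is a generic illustration on synthetic data with a plain logistic-regression classifier, not the thesis's actual system:</p>

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for a large pool of documents: 1000 points, 2 classes.
X = np.vstack([rng.normal(-1.0, 1.0, (500, 5)),
               rng.normal(1.0, 1.0, (500, 5))])
y = np.array([0] * 500 + [1] * 500)  # ground truth, known only to the "oracle"

labelled = list(range(0, 5)) + list(range(500, 505))  # seed set: 5 per class
pool = [i for i in range(1000) if i not in set(labelled)]

def oracle(i):
    """Simulates the human annotator returning a ground-truth label."""
    return y[i]

for _ in range(5):  # five annotation rounds of 20 queries each
    clf = LogisticRegression().fit(X[labelled],
                                   [oracle(i) for i in labelled])
    proba = clf.predict_proba(X[pool])
    # Uncertainty sampling: query the documents the model is least sure about.
    uncertainty = 1.0 - proba.max(axis=1)
    # Pop queried indices from the pool in descending position order
    # so earlier pops do not shift the later ones.
    for j in sorted(np.argsort(-uncertainty)[:20], reverse=True):
        labelled.append(pool.pop(j))

print(len(labelled))  # 110 labelled documents after 5 rounds
```

<p>Uncertainty sampling (lowest maximum class probability) is only one acquisition strategy; the same loop accepts any scoring of the pool, which is where rationale-aware criteria like those studied in the thesis would plug in.</p>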