<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://c4d.lias-lab.fr/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Enac</id>
	<title>COMP4DRONES - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://c4d.lias-lab.fr/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Enac"/>
	<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php/Special:Contributions/Enac"/>
	<updated>2026-04-07T00:45:44Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.37.1</generator>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=WP3-03&amp;diff=84</id>
		<title>WP3-03</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=WP3-03&amp;diff=84"/>
		<updated>2022-03-10T10:03:20Z</updated>

		<summary type="html">&lt;p&gt;Enac: Add WP3 page.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Sensor information algorithms=&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|  ID|| WP3-03&lt;br /&gt;
|-&lt;br /&gt;
|   Contributor	|| BUT&lt;br /&gt;
|-&lt;br /&gt;
|   Levels	|| Functional&lt;br /&gt;
|-&lt;br /&gt;
|   Require	|| Payload data (drone images)&lt;br /&gt;
|-&lt;br /&gt;
|   Provide		|| Tone mapped HDR video as an AXI4 video stream&lt;br /&gt;
|-&lt;br /&gt;
|   Input		|| RGB images captured by drone (Payload data)&lt;br /&gt;
|-&lt;br /&gt;
|   Output		|| RGB images processed by component (HDR tone mapped images)&lt;br /&gt;
|-&lt;br /&gt;
|   C4D building block		|| (Video) Data Analytics&lt;br /&gt;
|-&lt;br /&gt;
|   TRL		|| 4&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Detailed Description==&lt;br /&gt;
HDR is the part of image processing that focuses on capturing, processing, and displaying images with a high dynamic range. Its primary goal is a faithful reproduction of the captured scene on digital devices. HDR for single images is a deeply studied topic (although still an open one). HDR can be obtained using specialized HDR sensors, or using standard sensors by acquiring a sequence of images with different exposure times. These images can be captured simultaneously, e.g. through a beam splitter onto several CCD/CMOS sensors, or, more often, sequentially (merging/fusion of exposures). With the advent of HDR video capture, new problems arose. Applying a tone mapping operator (TMO) without careful consideration of the temporal coherence between consecutive frames may lead to adverse effects. Video tone mapping methods need to carefully preserve this temporal character, for example during fast luminance changes. In the C4D project, BUT is improving an HDR video tone mapping FPGA IP core. The design is based on programmable hardware tightly connected to an &amp;quot;embedded&amp;quot; processor (FPGA SoC Xilinx Zynq, but it may also be implemented on other platforms with an FPGA). It covers all the functionality: reading data from a camera sensor, merging multiple images with alternating exposures into HDR images/HDR video, and applying HDR tone mapping. The system can be extended with other functions (software, hardware or FPGA IP cores) such as HDR video compression, image pre-processing, exposure control, and a “ghost-free” function that removes, in the programmable hardware, possible artifacts caused by the movement of objects. 
This block provides the acquired data in HDR or tone mapped format and can be extended with other data analytics tools/algorithms (e.g., detectors). In the context of the C4D reference architecture, this block supports data acquisition for further processing either in the FPGA or in subsequent systems. The sensor information algorithms block is part of the payload management – data acquisition block, and the component provides inputs to the data management block (i.e., payload data analytics).&lt;br /&gt;
&lt;br /&gt;
==Contribution and Improvements==&lt;br /&gt;
BUT is implementing and improving sensor data processing algorithms, which include software and firmware for the FPGA. This involves video processing algorithms (for example HDR algorithms). The HDR multi-exposure fusion algorithm is to be implemented in the drone, possibly together with tone mapping and/or ghost removal, in order to &amp;quot;feed&amp;quot; further image and video processing subsystems in the drone with image information of high dynamic range. BUT has increased the performance of the algorithms, reducing latency and increasing throughput (currently the IP core can process up to 200 megapixels per second). Robustness of the controller with respect to environmental disturbances has also been improved, together with resiliency. This improvement is based on the increased robustness of the video processing with respect to HDR, while keeping the processing means and the extent of video processing &amp;quot;unchanged&amp;quot; thanks to the tone mapping, which virtually provides the &amp;quot;same image format&amp;quot; as in usual processing.&lt;br /&gt;
&lt;br /&gt;
==Design and Implementation==&lt;br /&gt;
The component is divided into four main blocks (each block can be used independently):&lt;br /&gt;
&lt;br /&gt;
# Sensor data acquisition&lt;br /&gt;
# Buffering&lt;br /&gt;
# HDR Merging and deghosting&lt;br /&gt;
# HDR Tone Mapping&lt;br /&gt;
&lt;br /&gt;
'''Sensor data acquisition'''&lt;br /&gt;
The architecture is based on the Xilinx Zynq platform, which is connected to a Python 2000 CMOS sensor using an LVDS (Low-Voltage Differential Signaling) interface. The CMOS output consists of raw CFA (Color Filter Array) image data with a Bayer filter mosaic.&lt;br /&gt;
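As a reminder of what the CFA data looks like, the colour channel of each raw pixel is determined only by its position in the mosaic. A minimal sketch, assuming an RGGB Bayer layout (the actual sensor layout may differ):

```python
# Channel lookup for an assumed RGGB Bayer mosaic: even rows alternate
# R, G; odd rows alternate G, B. Demosaicing later interpolates the two
# missing channels of every pixel from its neighbours.
def bayer_channel(row, col):
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# The repeating 2x2 tile of the pattern:
tile = [[bayer_channel(r, c) for c in range(2)] for r in range(2)]
```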
&lt;br /&gt;
'''Buffering'''&lt;br /&gt;
The raw image is stored in DDR memory using DMA with double buffering to avoid overwriting of the data. One DMA is used for writing to DDR; three DMAs are used for reading the image data.&lt;br /&gt;
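The double-buffering scheme can be illustrated with a small sketch (plain Python, not the actual firmware; the class and method names are illustrative):

```python
# Illustrative double buffering: the write DMA fills one buffer while
# readers consume the other, so a frame being read is never overwritten
# mid-transfer.
class DoubleBuffer:
    def __init__(self, size):
        self.buffers = [[0] * size, [0] * size]
        self.write_idx = 0  # buffer currently owned by the write DMA

    def write_frame(self, frame):
        # The write DMA fills the back buffer...
        self.buffers[self.write_idx][:] = frame
        # ...and the roles are swapped once the frame is complete.
        self.write_idx = 1 - self.write_idx

    def read_frame(self):
        # Readers always see the last completed frame.
        return self.buffers[1 - self.write_idx]

buf = DoubleBuffer(4)
buf.write_frame([1, 2, 3, 4])
assert buf.read_frame() == [1, 2, 3, 4]
```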
&lt;br /&gt;
'''HDR Merging and deghosting'''&lt;br /&gt;
The HDR merge block reads three image streams simultaneously through the DMAs. First, it applies the inverse camera response function to obtain images with a linear response, and then merges them into an HDR image. The merging algorithm performs per-pixel processing and requires a relatively small number of per-pixel operations. Some of its functionality is computationally demanding (e.g. division and Gaussian function evaluation); however, it can be optimised and/or tabulated. Tabulating the Gaussian function used for ghosting suppression is convenient because the pixel values are discrete and only a finite number of pixel-value combinations is possible.&lt;br /&gt;
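The per-pixel merge can be sketched as follows (the gamma-2.2 inverse response, the Gaussian parameters and all function names are illustrative assumptions, not BUT's implementation):

```python
import math

# Illustrative per-pixel HDR merge of three exposures: pixel values are
# linearized with an assumed inverse camera response (a plain gamma of
# 2.2 here), scaled by exposure time, and blended with Gaussian
# well-exposedness weights. In hardware both the inverse response and
# the Gaussian can be tabulated, since pixel values are discrete.

def inverse_crf(v):
    # Assumed inverse camera response for a pixel value in [0, 1].
    return v ** 2.2

def gauss_weight(v, mu=0.5, sigma=0.2):
    # Gaussian weight favouring well-exposed (mid-range) pixels.
    return math.exp(-((v - mu) ** 2) / (2.0 * sigma ** 2))

def merge_pixel(values, exposure_times):
    # Weighted average of the radiance estimates from each exposure.
    num, den = 0.0, 0.0
    for v, t in zip(values, exposure_times):
        w = gauss_weight(v)
        num += w * inverse_crf(v) / t
        den += w
    return num / max(den, 1e-9)

# The same scene point seen through short, medium and long exposures:
radiance = merge_pixel([0.1, 0.4, 0.9], [1.0, 4.0, 16.0])
```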
&lt;br /&gt;
'''HDR Tone Mapping'''&lt;br /&gt;
The HDR pipeline is implemented in the FPGA and pipelined at 200 MHz while processing one pixel per clock. The input of the tone mapping block is an 18-bit CFA pixel in 10.8 fixed-point representation (10 integer and 8 fractional bits) and the output is an RGB pixel in the &amp;lt;0,1&amp;gt; interval. The algorithm is based on the Durand and Dorsey tone mapping operator. The Durand operator is originally a two-pass algorithm, because it requires the extreme values of the base layer. Implementing a multi-pass image processing algorithm in an FPGA is problematic because of the limited memory size: typically, there is not enough space to store the whole image directly in the FPGA. In our case we only need the minimum and maximum values (or percentiles) of the base layer, so we selected an approach where the minimum and maximum values from the previous frame are used.&lt;br /&gt;
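The previous-frame trick can be sketched in a few lines (illustrative Python, not the IP core; the bilateral base/detail decomposition of the real Durand operator and output clipping are omitted):

```python
import math

# Single-pass sketch: the log-luminance range compression needs the
# extremes of the data, which would normally require a second pass over
# the frame. Instead, the min/max computed on the previous frame is
# reused, so each frame is processed in one pass.

def tone_map_frame(lum, prev_min, prev_max, contrast=50.0):
    # Compress the log-luminance range to log10(contrast), anchored at
    # the previous frame's maximum so the brightest pixel maps near 1.0.
    scale = math.log10(contrast) / max(prev_max - prev_min, 1e-9)
    logs = [math.log10(max(v, 1e-9)) for v in lum]
    out = [10.0 ** ((lv - prev_max) * scale) for lv in logs]
    # Return this frame's extremes for use on the next frame.
    return out, min(logs), max(logs)

# Previous extremes of -2.0 and 1.0 (log10 units) map this frame into
# the unit interval:
frame = [0.01, 0.1, 1.0, 10.0]
mapped, lo_log, hi_log = tone_map_frame(frame, -2.0, 1.0)
```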
&lt;br /&gt;
[[File:Wp3-03_01.png|frame|Example of HDR processing during a test flight. The top images show the tone mapped results; the bottom images show the LDR images used for merging and tone mapping]]&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Wp3-03_01.png&amp;diff=83</id>
		<title>File:Wp3-03 01.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Wp3-03_01.png&amp;diff=83"/>
		<updated>2022-03-10T10:00:05Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=WP3-13&amp;diff=82</id>
		<title>WP3-13</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=WP3-13&amp;diff=82"/>
		<updated>2022-03-07T16:24:53Z</updated>

		<summary type="html">&lt;p&gt;Enac: pprz update.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Paparazzi UAV=&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|  ID|| WP3-13&lt;br /&gt;
|-&lt;br /&gt;
|   Contributor	|| ENAC&lt;br /&gt;
|-&lt;br /&gt;
|   Levels	|| Platform&lt;br /&gt;
|-&lt;br /&gt;
|   Require	|| C2LINK or Flight Plan definition&lt;br /&gt;
|-&lt;br /&gt;
|   Provide		|| Modular and flexible UAV autopilot system for testing and operation&lt;br /&gt;
|-&lt;br /&gt;
|   Input		&lt;br /&gt;
|&lt;br /&gt;
Depending on the flight mode:&lt;br /&gt;
* flight plan definition,&lt;br /&gt;
* mission sequence from datalink,&lt;br /&gt;
* position/speed from datalink,&lt;br /&gt;
* low-level remote control&lt;br /&gt;
|-&lt;br /&gt;
|   Output		|| Stabilized aircraft (fixedwing, multicopter, hybrid) on a given trajectory or attitude&lt;br /&gt;
|-&lt;br /&gt;
|   C4D building block		|| Flight control, Actuation, + parts of Perception and Communication&lt;br /&gt;
|-&lt;br /&gt;
|   TRL		|| 4 to 6&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Current status==&lt;br /&gt;
&lt;br /&gt;
Paparazzi is a complete open-source hardware and software system for Unmanned Aircraft Systems (UAS), including both the airborne autopilot and complete ground station mission planning and monitoring software, utilizing a bi-directional datalink for telemetry and control. Paparazzi was created at ENAC in 2003 and is now supported by other institutes such as the MAVLab of TU Delft, individual developers, and some private UAV companies from several countries.&lt;br /&gt;
&lt;br /&gt;
The Paparazzi system was initially designed for robust small fixed-wing aircraft, but it now supports several other configurations and concepts such as high-aspect-ratio gliders, multi-rotors, transitioning vehicles, and rovers. The communication between the software blocks running on the ground and the airborne autopilot is based on the PPRZLINK library, which provides APIs in C/C++, Python and OCaml.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:pprz_01_com_arch.png|Paparazzi communication architecture&lt;br /&gt;
File:pprz_02_control_options.png|Control options for Paparazzi UAVs&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the [https://wiki.paparazziuav.org/wiki/Main_Page Wiki] and the [https://github.com/paparazzi/paparazzi/ Github] repository. &lt;br /&gt;
&lt;br /&gt;
==Contribution and Improvements==&lt;br /&gt;
&lt;br /&gt;
Several improvements are intended within the scope of the C4D project.&lt;br /&gt;
&lt;br /&gt;
* Improvement of the internal airborne code organization: a detailed analysis of the timing between the elements of the embedded software has led to a new definition of the internal task groups. Particular attention has been given to the timing between the tasks and functions to provide a stable and reliable end-to-end execution time.&lt;br /&gt;
* Explicit definition of the dependencies between modules, both at the functional and logic levels: this simplifies the configuration of an aircraft (only the relevant high-level modules need to be specified; the dependency solver will include all the required modules). The overall system is also more robust, as it can detect circular dependencies, conflicts and missing functionalities.&lt;br /&gt;
* Improvement of the static scheduling of the telemetry messages to spread the link load over time and avoid buffer saturation and delays. The same approach can be applied to the scheduling of some of the function calls inside the different tasks of the autopilot, linked to the new code organization mentioned in the first point.&lt;br /&gt;
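The static telemetry scheduling idea can be sketched as follows (illustrative Python, not the actual Paparazzi generator; message names, periods and the phase-assignment rule are made-up examples):

```python
# Sketch of static telemetry scheduling: each message has a period in
# scheduler ticks and gets a phase offset, so that messages with the
# same period do not all fire in the same tick and saturate the link
# buffer.

def build_schedule(periods, horizon):
    # Assign each message a distinct phase offset inside its period.
    phases = {}
    next_phase = 0
    for name, period in periods.items():
        phases[name] = next_phase % period
        next_phase += 1
    # Expand into the list of messages sent at each tick.
    return [
        [n for n, p in periods.items() if t % p == phases[n]]
        for t in range(horizon)
    ]

# With zero phases, all three messages would fire together at tick 0;
# the offsets spread the load over the ticks instead.
schedule = build_schedule({"attitude": 2, "gps": 4, "status": 4}, 8)
```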
&lt;br /&gt;
Performance analyses have been carried out. Some results are presented in the following figures. The first two show the different tasks and timings of the legacy architecture for the fixed-wing and rotorcraft firmwares. It can be seen that the groups of tasks are not harmonized and do not reflect the actual functional blocks of the system. While most of the timings are respected, the ‘event’ polling function that is expected to run at 10 kHz runs a bit slower than expected.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:Pprz_03.png|Performance analysis of fixed-wing legacy firmware at 100Hz&lt;br /&gt;
File:Pprz_04.png|Performance analysis of rotorcraft legacy firmware at 1000Hz&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The same work with the new architecture shows that the tasks are (almost) harmonized and reflect the reference architecture presented in the next section. The timings are all aligned with the base frequency and the ‘event’ polling function is called at the expected rate. What is not visible, but is a result of this work, is that the time between the sensor task (starting to read and getting data from the digital sensors) and the rest of the guidance and control loop is fixed, providing a correct sequencing of the data flow regardless of the user configuration.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:Pprz_05.png|Performance analysis of the new fixed-wing firmware at 100Hz&lt;br /&gt;
File:Pprz_06.png|Performance analysis of the new rotorcraft firmware at 1000Hz&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Design and Implementation==&lt;br /&gt;
&lt;br /&gt;
The new architecture corresponds to the architecture shown below. The grey boxes are tasks grouping related functionalities: sensors, estimation, control, actuators, payload, communication, etc. The components of the system, known as modules, are described and configured through an XML file providing the relevant information:&lt;br /&gt;
&lt;br /&gt;
* module name, task group&lt;br /&gt;
* documentation&lt;br /&gt;
* dependencies (required modules and functionalities, provided functionalities, conflicts)&lt;br /&gt;
* initialization, periodic and event-based functions to be called&lt;br /&gt;
* source files and compilation flags&lt;br /&gt;
* testing flags for unit tests&lt;br /&gt;
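A module description carrying the fields above might look roughly like this (a hypothetical example; element and attribute names are illustrative and may not match the exact Paparazzi schema):

```xml
&lt;!-- Hypothetical module description; names are illustrative only. --&gt;
&lt;module name="demo_sensor" task="sensors"&gt;
  &lt;doc&gt;
    &lt;description&gt;Example sensor driver module.&lt;/description&gt;
  &lt;/doc&gt;
  &lt;dep&gt;
    &lt;depends&gt;spi_master&lt;/depends&gt;
    &lt;provides&gt;imu&lt;/provides&gt;
    &lt;conflicts&gt;other_imu&lt;/conflicts&gt;
  &lt;/dep&gt;
  &lt;init fun="demo_sensor_init()"/&gt;
  &lt;periodic fun="demo_sensor_periodic()" freq="100"/&gt;
  &lt;makefile&gt;
    &lt;file name="demo_sensor.c"/&gt;
  &lt;/makefile&gt;
&lt;/module&gt;
```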
&lt;br /&gt;
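The dependency solving between modules, including the automatic inclusion of required modules and the detection of circular dependencies, can be sketched with a depth-first topological sort (illustrative Python, not the actual generator):

```python
# Illustrative dependency solver: given each module's direct
# dependencies, emit an initialization order via depth-first
# topological sort, and detect circular dependencies.

def solve_modules(selected, deps):
    order, done, in_progress = [], set(), set()

    def visit(mod):
        if mod in done:
            return
        if mod in in_progress:
            raise ValueError("circular dependency involving " + mod)
        in_progress.add(mod)
        for d in deps.get(mod, []):
            visit(d)  # pull in required modules automatically
        in_progress.discard(mod)
        done.add(mod)
        order.append(mod)

    for mod in selected:
        visit(mod)
    return order

# Selecting only the high-level module pulls in everything it needs;
# every dependency appears before the module that requires it:
deps = {"ins": ["imu", "gps"], "imu": ["spi"], "gps": ["uart"]}
order = solve_modules(["ins"], deps)
```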
The module generator uses a topological tree search algorithm to solve the module dependencies, and then generates C code for calling the functions, including the static schedulers for the periodic functions according to the system base frequency. The final sequence is controlled by the static dispatcher to guarantee that each task is called in a predefined order.&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Pprz_06.png&amp;diff=81</id>
		<title>File:Pprz 06.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Pprz_06.png&amp;diff=81"/>
		<updated>2022-03-07T16:18:15Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Pprz_05.png&amp;diff=80</id>
		<title>File:Pprz 05.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Pprz_05.png&amp;diff=80"/>
		<updated>2022-03-07T16:18:03Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Pprz_04.png&amp;diff=79</id>
		<title>File:Pprz 04.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Pprz_04.png&amp;diff=79"/>
		<updated>2022-03-07T16:17:54Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Pprz_03.png&amp;diff=78</id>
		<title>File:Pprz 03.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Pprz_03.png&amp;diff=78"/>
		<updated>2022-03-07T16:17:44Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=WP3-13&amp;diff=77</id>
		<title>WP3-13</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=WP3-13&amp;diff=77"/>
		<updated>2022-03-07T16:07:07Z</updated>

		<summary type="html">&lt;p&gt;Enac: Start porting ENAC page from github.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Paparazzi UAV=&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|  ID|| WP3-13&lt;br /&gt;
|-&lt;br /&gt;
|   Contributor	|| ENAC&lt;br /&gt;
|-&lt;br /&gt;
|   Levels	|| Platform&lt;br /&gt;
|-&lt;br /&gt;
|   Require	|| C2LINK or Flight Plan definition&lt;br /&gt;
|-&lt;br /&gt;
|   Provide		|| Modular and flexible UAV autopilot system for testing and operation&lt;br /&gt;
|-&lt;br /&gt;
|   Input		&lt;br /&gt;
|&lt;br /&gt;
Depending on the flight mode:&lt;br /&gt;
* flight plan definition,&lt;br /&gt;
* mission sequence from datalink,&lt;br /&gt;
* position/speed from datalink,&lt;br /&gt;
* low-level remote control&lt;br /&gt;
|-&lt;br /&gt;
|   Output		|| Stabilized aircraft (fixedwing, multicopter, hybrid) on a given trajectory or attitude&lt;br /&gt;
|-&lt;br /&gt;
|   C4D building block		|| Flight control, Actuation, + parts of Perception and Communication&lt;br /&gt;
|-&lt;br /&gt;
|   TRL		|| 4 to 6&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==Current status==&lt;br /&gt;
&lt;br /&gt;
Paparazzi is a complete open-source hardware and software system for Unmanned Aircraft Systems (UAS), including both the airborne autopilot and complete ground station mission planning and monitoring software, utilizing a bi-directional datalink for telemetry and control. Paparazzi was created at ENAC in 2003 and is now supported by other institutes such as the MAVLab of TU Delft, individual developers, and some private UAV companies from several countries.&lt;br /&gt;
&lt;br /&gt;
The Paparazzi system was initially designed for robust small fixed-wing aircraft, but it now supports several other configurations and concepts such as high-aspect-ratio gliders, multi-rotors, transitioning vehicles, and rovers. The communication between the software blocks running on the ground and the airborne autopilot is based on the PPRZLINK library, which provides APIs in C/C++, Python and OCaml.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;gallery&amp;gt;&lt;br /&gt;
File:pprz_01_com_arch.png|Paparazzi communication architecture&lt;br /&gt;
File:pprz_02_control_options.png|Control options for Paparazzi UAVs&lt;br /&gt;
&amp;lt;/gallery&amp;gt;&lt;br /&gt;
&lt;br /&gt;
See the [https://wiki.paparazziuav.org/wiki/Main_Page Wiki] and the [https://github.com/paparazzi/paparazzi/ Github] repository. &lt;br /&gt;
&lt;br /&gt;
==Contribution and Improvements==&lt;br /&gt;
&lt;br /&gt;
Several improvements are intended within the scope of the C4D project.&lt;br /&gt;
&lt;br /&gt;
* Improvement of the internal airborne code organization: a detailed analysis of the timing between the elements of the embedded software has led to a new definition of the internal task groups. Particular attention has been given to the timing between the tasks and functions to provide a stable and reliable end-to-end execution time.&lt;br /&gt;
* Explicit definition of the dependencies between modules, both at the functional and logic levels: this simplifies the configuration of an aircraft (only the relevant high-level modules need to be specified; the dependency solver will include all the required modules). The overall system is also more robust, as it can detect circular dependencies, conflicts and missing functionalities.&lt;br /&gt;
* Improvement of the static scheduling of the telemetry messages to spread the link load over time and avoid buffer saturation and delays. The same approach can be applied to the scheduling of some of the function calls inside the different tasks of the autopilot, linked to the new code organization mentioned in the first point.&lt;br /&gt;
&lt;br /&gt;
Performance analyses have been carried out. Some results are presented in the following figures. The first two show the different tasks and timings of the legacy architecture for the fixed-wing and rotorcraft firmwares. It can be seen that the groups of tasks are not harmonized and do not reflect the actual functional blocks of the system. While most of the timings are respected, the ‘event’ polling function that is expected to run at 10 kHz runs a bit slower than expected.&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Pprz_02_control_options.png&amp;diff=76</id>
		<title>File:Pprz 02 control options.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Pprz_02_control_options.png&amp;diff=76"/>
		<updated>2022-03-07T15:54:52Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=File:Pprz_01_com_arch.png&amp;diff=75</id>
		<title>File:Pprz 01 com arch.png</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=File:Pprz_01_com_arch.png&amp;diff=75"/>
		<updated>2022-03-07T15:51:37Z</updated>

		<summary type="html">&lt;p&gt;Enac: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
	<entry>
		<id>https://c4d.lias-lab.fr/index.php?title=WP3-13&amp;diff=43</id>
		<title>WP3-13</title>
		<link rel="alternate" type="text/html" href="https://c4d.lias-lab.fr/index.php?title=WP3-13&amp;diff=43"/>
		<updated>2022-02-23T14:35:54Z</updated>

		<summary type="html">&lt;p&gt;Enac: Created page with &amp;quot;{|class=&amp;quot;wikitable&amp;quot; |  ID|| WP3-13 |- |   Contributor	|| ENAC |- |   Levels	|| xxx |- |   Require	|| xxx |- |   Provide		|| xxx |- |   Input		 | * xxx * yyy |- |   Output		|| xxx |- |   C4D building block		|| xxx |- |   TRL		|| xxx |}&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|  ID|| WP3-13&lt;br /&gt;
|-&lt;br /&gt;
|   Contributor	|| ENAC&lt;br /&gt;
|-&lt;br /&gt;
|   Levels	|| xxx&lt;br /&gt;
|-&lt;br /&gt;
|   Require	|| xxx&lt;br /&gt;
|-&lt;br /&gt;
|   Provide		|| xxx&lt;br /&gt;
|-&lt;br /&gt;
|   Input		&lt;br /&gt;
|&lt;br /&gt;
* xxx&lt;br /&gt;
* yyy&lt;br /&gt;
|-&lt;br /&gt;
|   Output		|| xxx&lt;br /&gt;
|-&lt;br /&gt;
|   C4D building block		|| xxx&lt;br /&gt;
|-&lt;br /&gt;
|   TRL		|| xxx&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Enac</name></author>
	</entry>
</feed>