Research topics

MEMA

The research objective of the Applied Mechanics Division is the theoretical prediction of the behavior of solids and fluids, with the help of new mathematical modeling and computer simulation techniques. Today's main research topics are: geophysical fluid dynamics (ocean modeling, coastal flows), solid mechanics (composite materials, micro-mechanical modeling) and numerical methods (discontinuous Galerkin methods, radial basis functions, mesh generation). Some current research projects developed in MEMA are briefly presented in what follows.

 


  • Understanding mechanical properties of metallic alloys, such as strength, toughness and ductility, requires proper modeling of how plastic deformation is achieved within individual “grains” forming the polycrystalline microstructure. At the micron scale, i.e. inside the crystal lattice, the micromechanics of metals is governed by the competition between dislocation glide, twinning and phase transformation. All such processes imply significant local anisotropy, which in turn gives rise to highly heterogeneous microscopic deformation patterns as well as large internal stresses.

    By addressing fundamental questions about microstructure evolution in plastically deforming, single- and multiphase aggregates, we aim at an improved prediction of the mechanical behavior of advanced metallic alloys. The main research axes are:

    • The development of advanced single-crystal constitutive laws, e.g. [1-2], which are implemented as numerically efficient user-defined material laws in finite element codes and are assessed thoroughly against experimental observations at various length scales: lattice strains (neutron diffraction), lattice curvature and twins (EBSD), texture (X-rays), anisotropy and hardening (mechanical tests). A generic example of such a slip law is sketched after this list.
    • The definition of statistically representative, polycrystalline model microstructures and their reproduction as conforming, periodic and selectively refined finite element meshes in 2D and 3D [3-4].
    • The formulation of simplified laws for the interaction of adjacent grains and their use in real-scale simulations of forming processes [5].
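
    As an illustration, a generic rate-dependent slip law of the kind widely used in single-crystal plasticity reads as follows; the reference slip rate, rate sensitivity and hardening variables are generic placeholders and do not necessarily match the exact formulations of [1-2]:

        \dot{\gamma}^{\alpha} = \dot{\gamma}_0 \, \left| \tau^{\alpha} / \tau_c^{\alpha} \right|^{1/m} \operatorname{sign}(\tau^{\alpha}), \qquad \tau^{\alpha} = \boldsymbol{\sigma} : (\mathbf{s}^{\alpha} \otimes \mathbf{n}^{\alpha}), \qquad \mathbf{L}^p = \sum_{\alpha} \dot{\gamma}^{\alpha} \, \mathbf{s}^{\alpha} \otimes \mathbf{n}^{\alpha},

    where \mathbf{s}^{\alpha} and \mathbf{n}^{\alpha} are the slip direction and slip-plane normal of slip system \alpha, \tau^{\alpha} is the resolved shear stress and the critical stress \tau_c^{\alpha} evolves with accumulated slip to model hardening.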

    Related publications:

    1. Delannay L., Barnett M.R. "Modeling the combined effect of grain size and grain shape on plastic anisotropy of metals". International Journal of Plasticity, 32-33 (2012), 70-84
    2. Dancette S., Delannay L., Renard K., Melchior M.A., Jacques P.J. "Crystal plasticity modeling of texture development and hardening in TWIP steels". Acta Materialia, 60 (2012), 2135-2145
    3. Dobrzynski C., Melchior M.A., Delannay L., Remacle J.-F. "A mesh adaptation procedure for periodic domains". International Journal for Numerical Methods in Engineering, 86 (2011), 1396-1412
    4. Delannay L., Yan P., Payne J.F.B., Tzelepi N. (2014) “Predictions of inter-granular cracking and dimensional changes of irradiated polycrystalline graphite under plane strain” Computational Materials Science, 87, 129-137
    5. Delannay L., Melchior M.A., Signorelli J.W., Remacle J.-F., Kuwabara T. (2009) “Influence of grain shape on the planar anisotropy of rolled steel sheets - Evaluation of three models”, Computational Materials Science, 45, 739-743

  • The objective of the Second-generation Louvain-la-Neuve Ice-ocean Model (SLIM) project is to develop a multi-scale model of the marine part of the hydrosphere (World Ocean, shelf areas, estuaries, tidal rivers) for carrying out climatic and environmental studies; it is currently being extended to the atmosphere. Traditional ocean models are based on finite difference methods on structured grids. These models often work on uniform grids, so that the resolution can only be increased globally: halving the mesh size increases the computational cost by a factor of about 16 (a worked estimate is given below).
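
    One way to arrive at this factor, assuming the three spatial directions and the time step (through the CFL constraint) are all refined together with the mesh size h, is

        cost \propto N_x \, N_y \, N_z \, N_t \propto h^{-1} \cdot h^{-1} \cdot h^{-1} \cdot h^{-1} = h^{-4},

    so that halving h multiplies the cost by 2^4 = 16.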

    In our model the governing equations are solved on an unstructured mesh. Unstructured meshes enable an accurate representation of coastlines and islands and allow one to avoid singularities associated with the poles in geographic coordinates [1]. However, their main advantage is their ability to adjust the resolution when and where it is actually needed to increase the range of resolved scales. They can be refined in the areas of interest, or where the more demanding dynamics requires a finer resolution, allowing for modeling contrasting physical phenomena.
    The SLIM model relies on the discontinuous Galerkin (DG) method [2]. Among the methods based upon unstructured grids, the DG method offers several favorable features such as high-order accuracy, excellent parallel scaling and an efficient treatment of convective terms [3,4]. A toy illustration of a DG discretization is given below.
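
    As a toy illustration of the DG idea (element-wise polynomials that are discontinuous across element boundaries and coupled only through upwind interface fluxes), the sketch below solves 1D linear advection with P1 elements. It is a didactic example only, not the SLIM discretization, which deals with the 3D equations of the ocean.

        # Minimal 1D discontinuous Galerkin (P1) solver for linear advection,
        # u_t + a u_x = 0, on a periodic domain: an illustrative sketch only.
        import numpy as np

        a = 1.0                  # advection speed
        K = 50                   # number of elements
        h = 1.0 / K              # element size (uniform here for simplicity)
        x = np.linspace(0.0, 1.0, K + 1)

        # Nodal values at the two endpoints of each element: u[e, 0], u[e, 1]
        u = np.column_stack([np.exp(-100 * (x[:-1] - 0.5) ** 2),
                             np.exp(-100 * (x[1:] - 0.5) ** 2)])

        Minv = (2.0 / h) * np.array([[2.0, -1.0], [-1.0, 2.0]])  # inverse P1 mass matrix
        Kmat = np.array([[-0.5, -0.5], [0.5, 0.5]])              # integrals of phi_i' phi_j

        def rhs(u):
            """Element-wise DG residual with upwind fluxes (a > 0: take the left state)."""
            r = a * u @ Kmat.T
            flux_left = a * np.roll(u[:, 1], 1)  # upwind value at each element's left face
            flux_right = a * u[:, 1]             # upwind value at each element's right face
            r[:, 0] += flux_left
            r[:, 1] -= flux_right
            return r @ Minv.T

        dt = 0.1 * h / a
        for n in range(int(round(1.0 / dt))):
            u1 = u + dt * rhs(u)                 # SSP-RK2 time stepping
            u = 0.5 * (u + u1 + dt * rhs(u1))

        print("max value after one revolution:", u.max())

    The only coupling between elements is the interface flux, which is what makes the method so amenable to parallelization and to convection-dominated flows.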

    Developing and using ocean models based on unstructured grids is a recent topic, although numerical tools such as unstructured meshes and finite elements have been used for a while in the world of mechanical engineering. Therefore, a transfer of knowledge from the latter domain to geophysical fluid flow modeling is needed. The SLIM team, with members in both the mechanical engineering and geophysics communities, offers an excellent environment to achieve this knowledge transfer.

    The SLIM model makes it possible to simulate previously inaccessible ranges of scales in the presence of complex geometry [5], from regional processes such as tides and pollutant transport in shelf seas to the global-scale prediction of the evolution of oceanic currents. In addition, it is believed that SLIM will be ideal for modeling phenomena taking place in the land-sea continuum.

    Discover more: http://www.climate.be/slim

    Related publications:

    1. J. Lambrechts, R. Comblen, V. Legat, C. Geuzaine and J.-F. Remacle, "Multiscale mesh generation on the sphere", Ocean Dyn., vol. 58, no. 5-6, pp. 461-473, Dec. 2008
    2. R. Comblen, J. Lambrechts, J.-F. Remacle and V. Legat, "Practical evaluation of five partly discontinuous finite element pairs for the non-conservative shallow water equations", Int. J. Numer. Methods Fluids, vol. 63, no. 6, pp. 701-724, 2010
    3. S. Blaise, R. Comblen, V. Legat, J.-F. Remacle, E. Deleersnijder and J. Lambrechts, "A discontinuous finite element baroclinic marine model on unstructured prismatic meshes", Ocean Dyn., vol. 60, no. 6, pp. 1371-1393, Dec. 2010
    4. T. Kärnä, V. Legat and E. Deleersnijder, "A baroclinic discontinuous Galerkin finite element model for coastal flows", Ocean Model., vol. 61, pp. 1-20, Jan. 2013
    5. C. J. Thomas, J. Lambrechts, E. Wolanski, V. A. Traag, V. D. Blondel, E. Deleersnijder and E. Hanert, "Numerical modelling and graph theory tools to study ecological connectivity in the Great Barrier Reef", Ecol. Model., vol. 272, pp. 160-174, Jan. 2014

  • The MEMA team started adapting and applying the Second-generation Louvain-la-Neuve Ice-ocean Model (SLIM) to the Scheldt river-estuary-sea continuum in 2007. Thanks to the unstructured mesh, the multi-scale domain – covering the upstream tidal river network, the Scheldt Estuary, the Belgian-Dutch coastal zone and the continental shelf (cf. Figure) – could be discretized by a single mesh (1D for the river and 2D for the estuary and sea).
    After having validated the hydrodynamics, most of the effort was devoted to environmental tracer applications. These applications were acknowledged by two scientific prizes: the B-IWA Research Award 2011-2012 and the VLIZ North Sea Award 2013.

    Timescales for water renewal in the Scheldt Estuary

    Following the formal framework of the Constituent-oriented Age and Residence time Theory (CART), several timescales – both well-established and newly defined – were computed for the Scheldt Estuary, characterizing its water renewal dynamics: residence time, exposure time, connectivity, age and influence time. With so many timescales computed at high spatio-temporal resolution, the Scheldt Estuary is becoming a real benchmark application.

    Sediment transport

    Most contaminants in surface waters are present under (at least) two distinct forms: either the molecules are dissolved in the water phase, or they are attached to suspended particles. The two forms not only have very different transport dynamics, they are generally also associated with different ecological impacts (toxicity, bio-availability, reactivity). For realistic environmental applications, it is therefore crucial to describe these two phases explicitly, hence the need for a reliable model of the suspended particles in the Scheldt. A first suspended sediment module was recently developed within the SLIM architecture; its generic structure is sketched below.
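
    Schematically, and leaving aside the precise closures implemented in SLIM [9], such a depth-averaged module solves an advection-diffusion equation for the suspended sediment concentration with erosion and deposition source terms of the classical Partheniades-Krone type:

        \partial_t (HC) + \nabla \cdot (H \mathbf{u} C) = \nabla \cdot (H K \nabla C) + E - D, \qquad D = w_s C, \qquad E = M \, (\tau_b / \tau_{cr} - 1) \ \text{if } \tau_b > \tau_{cr}, \ 0 \text{ otherwise},

    where H is the water depth, C the depth-averaged concentration, w_s the settling velocity, \tau_b the bottom shear stress, and \tau_{cr} and M empirical erosion parameters.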

    Escherichia coli concentration as an indicator for microbiological water quality

    Due to its intensive agriculture, high concentration of industry and large population density, the Scheldt Watershed is subject to significant pollution pressure. Discharge of wastewaters or run-off loaded with animal or human fecal material degrades the microbiological water quality, i.e. increases the risk of having microbial pathogens present in the natural surface water. Because these pathogens are too diverse, and often present only at very low concentrations, it is not feasible to systematically monitor all their individual concentrations. Instead, it has been decided to measure the concentration of a small number of so-called “fecal indicator bacteria”, of which Escherichia coli (E. coli) is one of the most important.

    Colleagues at the ULB monitored its concentration in the Scheldt Watershed in 2007-2008, but the results were so variable in time that they were difficult to interpret. It is well known that E. coli concentrations can vary over orders of magnitude in natural systems, but the possible causes in the case of the Scheldt were too numerous (water discharge, upstream inputs, wastewater treatment plant inputs, interaction with sediment, tides, temperature effects) to disentangle them empirically. Only the simultaneous representation of these effects in a single model makes it possible to quantify their relative importance. With this goal, two increasingly complex E. coli modules were built in close collaboration with the ULB (their core structure is sketched below).
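
    In their simplest form (the modules of [3] and [4] account for more processes than this sketch), such modules treat E. coli as a tracer undergoing advection, diffusion and a first-order die-off whose rate depends on the environmental conditions, for instance through a standard temperature correction:

        \partial_t C + \nabla \cdot (\mathbf{u} C - K \nabla C) = S - k(T) \, C, \qquad k(T) = k_{20} \, \theta^{T-20},

    where S gathers the upstream and wastewater inputs, k_{20} is the die-off rate at 20°C and \theta (typically of the order of 1.07 in the water-quality literature) accounts for the temperature effect; these values are given for illustration only.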

    Metal concentrations

    The latest environmental application started on the Scheldt domain deals with the simulation of toxic metal levels. This study is carried out together with the VUB. First results, combining SLIM with empirical functions, were surprisingly satisfactory.

    Belgian Committee of the International Water Association: http://www.b-iwa.be/awardslist/2013
    Flemish Marine Institute: http://www.vliz.be/en/north-sea-award
    CART: http://www.climate.be/cart

    Related publications:

    1. Blaise, S., de Brye, B., de Brauwere, A., Deleersnijder, E., Delhez, E.J.M. and Comblen, R. (2010) Capturing the residence time boundary layer - Application to the Scheldt Estuary. Ocean Dynamics 60, 535-554
    2. de Brauwere, A., de Brye, B., Blaise, S. and Deleersnijder, E. (2011a) Residence time, exposure time and connectivity in the Scheldt Estuary. Journal of Marine Systems 84(3-4), 85-95, doi:10.1016/j.jmarsys.2010.10.001
    3. de Brauwere, A., de Brye, B., Servais, P., Passerat, J. and Deleersnijder, E. (2011b) Modelling Escherichia coli concentrations in the tidal Scheldt river and estuary. Water Research 45, 2724-2738, doi:10.1016/j.watres.2011.02.003
    4. de Brauwere, A., Gourgue, O., de Brye, B., Servais, P., Ouattara, N.K. and Deleersnijder, E. (2014) Integrated modelling of faecal contamination in a densely populated river-sea continuum (Scheldt River and Estuary). Science of the Total Environment 468-469, 31-45, doi:10.1016/j.scitotenv.2013.08.019
    5. de Brye, B., de Brauwere, A., Gourgue, O., Kärnä, T., Lambrechts, J., Comblen, R. and Deleersnijder, E. (2010) A finite-element, multi-scale model of the Scheldt tributaries, River, Estuary and ROFI. Coastal Engineering 57, 850-863, doi:10.1016/j.coastaleng.2010.04.001
    6. de Brye, B., de Brauwere, A., Gourgue, O., Delhez, E.J.M. and Deleersnijder, E. (2012) Water renewal timescales in the Scheldt Estuary. Journal of Marine Systems 94, 74-86, doi:10.1016/j.jmarsys.2011.10.013
    7. Delhez, E.J.M., De Brye, B., de Brauwere, A. and Deleersnijder, E. (in press) Residence time vs influence time. Journal of Marine Systems
    8. Elskens, M., Gourgue, O., Baeyens, W., Chou, L., Deleersnijder, E., Leermakers, M. and de Brauwere, A. (2014) Modelling metal speciation in the Scheldt Estuary: combining a flexible-resolution transport model with empirical functions. Science of the Total Environment 476-477, 346-358, doi:10.1016/j.scitotenv.2013.12.047
    9. Gourgue, O., Baeyens, W., Chen, M.S., de Brauwere, A., de Brye, B., Deleersnijder, E., Elskens, M. and Legat, V. (2013) A depth-averaged two-dimensional sediment transport model for environmental studies in the Scheldt Estuary and tidal river network. Journal of Marine Systems 128, 27-39, doi:10.1016/j.jmarsys.2013.03.014

  • The Congo River discharge is the second largest in the world, and the area concentrates important near-shore oil resources. Total is interested in extracting this oil and, to do so, needs accurate knowledge of the hydrodynamics of the region. The UCL team developing the Second-generation Louvain-la-Neuve Ice-ocean Model (SLIM) has been chosen to build a numerical model of the area.

    Since this area lacks infrastructure, gathering accurate data is quite challenging. Concerning bathymetry, we have used the GEBCO 2008 global digital bathymetry, completed with data extracted from paper nautical charts. We have developed a new tool, GeoDesk, to collect and digitize these data (from paper to numerical values). This tool also allowed us to extract coastlines from these charts, which we use in combination with the global coastline database GSHHG. Using these data, we have created a 2D mesh covering the whole region of interest with the Gmsh software.
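
    A minimal version of that meshing step, written with the Gmsh Python API, is sketched below; the shoreline coordinates and target mesh sizes are invented for the example, whereas the actual workflow feeds Gmsh with the coastlines digitized from the nautical charts and GSHHG.

        # Hedged sketch: mesh a 2D coastal domain bounded by a polygonal shoreline
        # with the Gmsh Python API (coordinates and sizes are made up).
        import gmsh

        shoreline = [(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (5.0, 4.0), (0.0, 6.0)]
        lc_open, lc_coast = 1.0, 0.2          # target element sizes offshore / near the coast
        sizes = [lc_open, lc_open, lc_coast, lc_coast, lc_coast]

        gmsh.initialize()
        gmsh.model.add("estuary")

        # One point per boundary vertex; the last argument is the local mesh size
        pts = [gmsh.model.geo.addPoint(xc, yc, 0.0, lc)
               for (xc, yc), lc in zip(shoreline, sizes)]
        lines = [gmsh.model.geo.addLine(pts[i], pts[(i + 1) % len(pts)])
                 for i in range(len(pts))]

        loop = gmsh.model.geo.addCurveLoop(lines)
        gmsh.model.geo.addPlaneSurface([loop])

        gmsh.model.geo.synchronize()
        gmsh.model.mesh.generate(2)           # 2D triangulation of the domain
        gmsh.write("estuary.msh")
        gmsh.finalize()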

    Given the complexity of the area, we have created tools to correct the automatically generated meshes. To run our model, we use wind data from NCAR and tidal forcing from TPXO – some development is currently in progress to use the more accurate FES2012 database. We are still gathering current and river discharge data. Nevertheless, the first runs are promising. In particular, we have performed a harmonic analysis of the modeled tides (see the sketch below) and compared it with tidal measurements from satellites; the comparison shows good agreement between the model and the data.
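
    The harmonic analysis itself boils down to a linear least-squares fit of a few tidal constituents to the elevation signal; a self-contained sketch on a synthetic record is given below (the constituent periods are the standard ones, everything else is made up).

        # Hedged sketch of a tidal harmonic analysis by linear least squares.
        import numpy as np

        freqs = {"M2": 1.0 / 12.4206, "S2": 1.0 / 12.0, "O1": 1.0 / 25.8193}  # cycles/hour

        t = np.arange(0.0, 30 * 24.0, 1.0)     # 30 days of hourly samples
        eta = (0.8 * np.cos(2 * np.pi * freqs["M2"] * t - 0.4)
               + 0.3 * np.cos(2 * np.pi * freqs["S2"] * t - 1.1)
               + 0.1 * np.cos(2 * np.pi * freqs["O1"] * t + 0.7)
               + 0.05 * np.random.randn(t.size))          # synthetic "observations"

        # Design matrix: a mean plus a cosine and a sine column per constituent
        cols = [np.ones_like(t)]
        for f in freqs.values():
            cols += [np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)]
        A = np.column_stack(cols)

        coef, *_ = np.linalg.lstsq(A, eta, rcond=None)
        for i, name in enumerate(freqs):
            c, s = coef[1 + 2 * i], coef[2 + 2 * i]
            print(name, "amplitude:", round(float(np.hypot(c, s)), 3),
                  "phase (deg):", round(float(np.degrees(np.arctan2(s, c))), 1))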

    The next steps will be to build a 3D model and to use it for dynamical investigations, such as the estimation of residence times in the river mouth.


  • In a 2004 paper, "Consider a Spherical Cow", Prof. Thomas J.R. Hughes claimed that about one million finite element analyses were performed in engineering offices around the world. His study shows that about 80% of the human time spent on FE computations is devoted to mesh generation. Prof. Hughes's conclusions were confirmed by a later survey, "Meeting the Challenge for Automated Hexahedral Meshing", supervised by Dr. Ted Blacker from Sandia.

    Mesh generation is considered a hard problem. Surprisingly, only about 20 research teams are actively working on the subject (ours is clearly one of them). Fundamental work is performed by MEMA on different topics related to mesh generation: quad/hex meshing, parallel meshing, curvilinear meshing and computational geometry. Moreover, the team is heavily involved in the most important open source project devoted to mesh generation [1].
    Gmsh's developers have recently received a free software award (Les Trophées du Libre). The team is presently addressing several open issues in mesh generation, among which curvilinear meshing and hex meshing.

    Curvilinear meshing

    There is a growing consensus in the computational mechanics community that state-of-the-art solver technology requires, and will continue to require, computational resources too extensive to provide the necessary resolution for a broad range of demanding applications, even at the rate at which computational power increases. The requirement for high resolution naturally leads to methods with a higher order of grid convergence than the classical (formal) second order provided by most industrial-grade codes.

    Many contributions have shown that the accuracy of high-order methods strongly depends on the accuracy of the geometrical discretization. Consequently, it is necessary to address the problem of generating the high-order meshes that are needed to fully benefit from these methods. The MEMA team is actively involved in fundamental work on curvilinear meshing: provable estimates of high-order element validity have been proposed [2] (the key criterion is recalled below) and robust untangling procedures have been developed [3]. This work was partly funded by the EU through the IDIHOM FP7 project.
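
    The validity criterion at stake is, schematically, the positivity of the Jacobian determinant of the mapping \mathbf{x}(\boldsymbol{\xi}) from the reference element to the curved element,

        \det J(\boldsymbol{\xi}) = \det \left( \partial \mathbf{x} / \partial \boldsymbol{\xi} \right) > 0 \quad \forall \boldsymbol{\xi} \in \hat{\Omega}.

    In [2], \det J is expanded in a Bézier (Bernstein) basis; since the Bézier coefficients bound the function they represent, this yields provable lower and upper bounds on its minimum, which can be sharpened by adaptive subdivision.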

    Quad/Hex meshing

    Automatic hexahedral mesh generation is considered the Holy Grail of mesh generation. The generation of full-hex conforming meshes on arbitrary domains is one of the most exciting topics in computational geometry. Our team devotes considerable effort to quad and hex meshing [4,5]; the pairing idea underlying Blossom-Quad is illustrated below. This work was partly funded by the WIST3 project DOMHEX and by the Walloon "Plan Marshall" project HPC4WR.
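
    The pairing idea behind Blossom-Quad [4] can be illustrated with a toy matching problem: two triangles sharing an edge form a candidate quadrilateral, and recombining as many triangles as possible into good quads is a perfect-matching problem on the triangle adjacency graph. The sketch below uses networkx and random qualities purely for illustration; the actual algorithm relies on a geometric quality measure and on a dedicated minimum-cost perfect-matching (Blossom) implementation.

        # Toy triangle-pairing problem solved as a maximum-weight matching.
        import random
        import networkx as nx

        # Made-up triangulation: a strip of 6 triangles where triangle i shares
        # an edge with triangle i+1.
        adjacent = [(i, i + 1) for i in range(5)]

        G = nx.Graph()
        G.add_nodes_from(range(6))
        for t1, t2 in adjacent:
            quality = random.uniform(0.2, 1.0)   # quality of the quad formed by t1 and t2
            G.add_edge(t1, t2, weight=quality)

        # Pair as many triangles as possible while maximizing the summed quad quality
        matching = nx.max_weight_matching(G, maxcardinality=True)
        print("triangle pairs recombined into quads:", sorted(map(sorted, matching)))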

    Related publications:

    1. C. Geuzaine, J.-F. Remacle, "Gmsh: A 3-D finite element mesh generator with built-in pre- and post-processing facilities", International Journal for Numerical Methods in Engineering, 79 (11), 1309-1331, 2009
    2. A. Johnen, J.-F. Remacle, C. Geuzaine, "Geometrical validity of curvilinear finite elements", Journal of Computational Physics, 233, 359-372, 2013
    3. T. Toulorge, C. Geuzaine, J.-F. Remacle, J. Lambrechts, "Robust untangling of curvilinear meshes", Journal of Computational Physics, 254, 8-26, 2012
    4. J.-F. Remacle, J. Lambrechts, B. Seny, E. Marchandise, A. Johnen, "Blossom-Quad: A non-uniform quadrilateral mesh generator using a minimum-cost perfect-matching algorithm", International Journal for Numerical Methods in Engineering, 89 (9), 1102-1119, 2012
    5. T. Carrier Baudouin, J.-F. Remacle, E. Marchandise, F. Henrotte, C. Geuzaine, "A frontal approach to hex-dominant mesh generation", Advanced Modeling and Simulation in Engineering Sciences, 1 (1), 8, 2014
  • MigFlow Project

  • Ever since its introduction in the 1970s, the RBF method has proved to be a powerful tool for interpolation and, more recently, for solving PDEs. Its strengths lie not only in its inherent ability to handle scattered nodes, but also in its potential to achieve spectral accuracy. Furthermore, it leads to an unconditionally non-singular system, and its algorithm is remarkably simple in any number of dimensions. It is a young method with many aspects still to be explored and weaknesses to be bypassed. This research aims to establish the RBF method as a serious competitor to the most commonly used methods, such as the finite element or finite difference methods (a minimal interpolation example is sketched below).
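
    As an illustration of the basic building block, the sketch below performs global RBF interpolation on scattered 2D nodes with a multiquadric kernel; the node set, test function and shape parameter are arbitrary choices, not those used in the publications listed below.

        # Global RBF interpolation on scattered nodes (multiquadric kernel).
        import numpy as np

        rng = np.random.default_rng(0)
        nodes = rng.uniform(-1.0, 1.0, size=(100, 2))          # scattered nodes
        f = np.sin(np.pi * nodes[:, 0]) * np.cos(np.pi * nodes[:, 1])

        eps = 3.0                                              # shape parameter
        def kernel(r):
            return np.sqrt(1.0 + (eps * r) ** 2)               # multiquadric

        r = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
        lam = np.linalg.solve(kernel(r), f)                    # expansion coefficients

        # Evaluate the interpolant at a new point and compare with the exact value
        xe = np.array([0.3, -0.2])
        s = kernel(np.linalg.norm(nodes - xe, axis=-1)) @ lam
        print("interpolant:", s, " exact:", np.sin(np.pi * xe[0]) * np.cos(np.pi * xe[1]))

    Solving a PDE follows the same pattern, except that the differential operator is applied to the kernel when assembling the system.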

    A fast RBF method for solving partial differential equations on arbitrary surfaces

    Although much work has been done on using RBFs for reconstructing arbitrary surfaces, using RBFs to solve PDEs on arbitrary manifolds is only now being considered and is the subject of this project. In [3], we introduce a new technique that is loosely inspired by the Closest Point Method. This new technique, the Orthogonal Gradients Method (OGr), allows us to compute operators restricted to general surfaces just by means of point clouds.

    It also benefits from the RBFs' strengths: simplicity, high accuracy and a meshfree character, which provides the flexibility to represent the most complex geometries. We were able to use this work to generate and repair meshes on complicated geometries [4]. In the upcoming study [5], we will introduce a fast version of the OGr algorithm, which makes use of RBF-generated finite difference methods to discretize the differential operators, allowing us to bypass both stability and high-complexity issues.

    An RBF method for solving fractional diffusion equations in one and two spatial dimensions

    Diffusion processes in complex systems are often observed to deviate from standard laws and are not well represented by second-order diffusion models. Such deviations have been observed in many different contexts, such as the dispersion of tracers in an aquifer, stock market volatility or the random displacements of living species in their search for food. For all of these examples, fractional-order diffusion models have proved superior to classical second-order models, as they are able to represent anomalous diffusion processes.

    One of the ongoing issues with fractional diffusion models, however, is the design of an efficient, high-order numerical discretization. In [1], we introduced an RBF discretization technique in one spatial dimension; the two-dimensional case will be the subject of the upcoming work [2]. In both cases, our numerical results suggest that the global and meshfree features of RBFs make them particularly well suited for solving fractional diffusion equations (FDEs) on arbitrary domains (the type of equation considered is recalled below).
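
    Schematically, the equations considered are space-fractional diffusion equations of the type

        \partial_t u = d \, \partial^{\alpha} u / \partial |x|^{\alpha}, \qquad 1 < \alpha \le 2,

    where the Riesz fractional derivative \partial^{\alpha} / \partial |x|^{\alpha} is a non-local (integral) operator that reduces to the classical second derivative when \alpha = 2; this non-locality is precisely why the global character of RBFs is an asset. See [1] for the exact formulation, which may differ from this sketch.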

    Related publications:

    1. Piret C. and Hanert E., "A Radial Basis Functions Method for Fractional Diffusion Equations", Journal of Computational Physics, 238, 71-81, 2013
    2. Piret C., Hanert E. and Larsson E., "A Radial Basis Functions Method for Fractional Diffusion Equations in Two Spatial Dimensions", to be submitted
    3. Piret C., "The Orthogonal Gradients Method: a Radial Basis Functions Method for Solving Partial Differential Equations on Arbitrary Surfaces", Journal of Computational Physics, 231(14), 4662-4675, 2012
    4. Marchandise E., Piret C. and Remacle J.-F., "CAD and mesh repair with Radial Basis Functions", Journal of Computational Physics, 231(5), 2376-2387, 2012
    5. Piret C., "The Fast Orthogonal Gradients Method: a Radial Basis Functions Method for Solving Partial Differential Equations on Arbitrary Surfaces", to be submitted

  • Nowadays, geophysical and environmental fluid flow models routinely produce large amounts of results. Making sense of all these numbers is not a trivial task, which is why specific interpretation methods are needed, such as, among others, the estimation of timescales. In this respect, a comprehensive theory (CART) is being developed that allows timescales, chiefly the age and the residence time, to be estimated from the solution of partial differential problems.

    The development and use of CART is coordinated by Prof Eric Delhez (Université de Liège, Belgium) and Eric Deleersnijder (UCL/IMMC/MEMA). Several partners, in Belgium and abroad, contribute regularly to this endeavour.

    At any time and position, the age (a measure of the time elapsed since a given origin) of every constituent, or group of constituents, of seawater can be estimated in such a way that advection, diffusion and production/destruction are properly taken into account (the core equations are recalled below). This method has been used to diagnose mass transfers in ecosystem models of various degrees of complexity, to help understand flow in estuaries, shallow seas and the World Ocean, and to develop reduced-dimension models, such as the leaky funnel, a metaphor of the ventilation of the World Ocean.
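
    In its simplest setting (a single constituent, no production or destruction), the CART age is obtained by solving a transport equation for the concentration C and one for the age concentration \alpha, whose ratio yields the mean age a:

        \partial_t C + \nabla \cdot (\mathbf{u} C - \mathbf{K} \nabla C) = 0, \qquad \partial_t \alpha + \nabla \cdot (\mathbf{u} \alpha - \mathbf{K} \nabla \alpha) = C, \qquad a = \alpha / C.

    The source term C in the age-concentration equation makes the age of every water or tracer parcel increase at unit rate, while advection and diffusion mix ages in a consistent way.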

    The residence time is usually defined as the time taken by a water/tracer parcel to leave the domain of interest. In the framework of CART, a rigorous generic method is suggested for evaluating the residence time at any time and position. An alternative version of the latter, the exposure time, is also considered so as to account for the fact that constituent particles can re-enter the domain of interest after leaving it for the first time. This concept was applied in numerous situations, including a contribution to the study of a long-standing problem of marine biology, namely the question of how sinking phytoplankton species manage to survive.

    Water renewal refers to the processes by which water initially in a semi-enclosed domain is progressively replaced by water originating from its environment. The rate at which water is renewed may be characterised by having recourse to the aforementioned timescales. This was exemplified by several publications focussing on the water renewal of the Scheldt Estuary (Belgium / The Netherlands). The relationship between water renewal, connectivity and the concept of exposure is also being explored, by means of theoretical developments, idealised models and realistic numerical simulations.

    CART has been designed in such a way that it can be applied to reactive transport problems of any nature. Accordingly, new timescales can more or less easily be cast into the CART framework. The next challenge to be taken up is to use CART outside the realm of geophysical and environmental fluid flows.

    Discover more: http://sites.uclouvain.be/cart


  • The development of open-source scientific software is a strong trend in science in general. Several empirical studies have investigated the key factors determining the success of an open-source project. The four most important success factors are:
    - the size and vigour of the user community;
    - the innovative character of the tool made available online;
    - the technical maturity of the project;
    - a flawless coordination of the developments.

    The Gmsh software developed by the MEMA team meets these four conditions rather well. The Gmsh community, several thousand regular users strong, is an active one, on blogs as well as on the mailing list and through the submission of patches. Gmsh is shipped with the most common Linux distributions (Debian, Ubuntu) and is available on multiple platforms (Windows, OSX, Linux, iOS, Android).

    There is a great need for innovation in the field of mesh generation, and the team developing Gmsh is widely recognized for its research in this area; for instance, we are organizing the most important conference of the field in 2014. The current success of Gmsh is the result of more than 10 years of persevering work. The software is stable (user scripts written 10 years ago still run), robust and constantly improving. It is used in academia as well as in industry, where its level of technological maturity is considered very high (TRL8).

    In scientific computing in general, the free-software community produces turnkey numerical simulation software of professional quality in various fields of engineering: OpenFOAM in fluid mechanics, Code_Aster in structural analysis, GetDP in electromagnetism, SLIM in ocean modeling, LMGC90 for non-standard mechanics, to name a few of the most emblematic. These free packages are competitive with the equivalent commercial solutions, in terms of both capabilities and performance, but they nevertheless remain relatively little used compared with their commercial counterparts.

    The lack of a standardized interface for pre- and post-processing is largely responsible for this state of affairs. With its CAD and post-processing tools, Gmsh is able to take charge of these pre- and post-processing stages, that is, the beginning and the end of a simulation chain whose central part remains handled by an external solver. The next, almost natural, step in the development of the Gmsh project therefore concerns the integration of scientific computing codes. Gmsh has the potential to become the entry point to a very broad range of free scientific computing software. The idea of the ONELAB research project is to offer, from within the Gmsh environment, a level of integration comparable to that offered by commercial alternatives such as COMSOL, but based exclusively on open-source software of high scientific quality.

    Related publications:

    1. C. Geuzaine, F. Henrotte, E. Marchandise, J.-F. Remacle, P. Dular, R. Vazquez Sabariego, ONELAB: Open Numerical Engineering LABoratory, Proceedings of the 7th European Conference on Numerical Methods in Electromagnetism (NUMELEC2012) 01/2012
    2. E. Marchandise, A. Mouraux, L. Plaghki, F. Henrotte, Finite element analysis of thermal laser skin stimulation for a finer characterization of the nociceptive system, Journal of Neuroscience Methods, Volume 223, p 1-10, 2014, ISSN 0165-0270, http://dx.doi.org/10.1016/j.jneumeth.2013.11.010