Documents found
-
501.
Abstract: This article presents a discussion of interactivity from three very different viewpoints. In the first section, the author re-examines the definition of interactivity as it relates to simulation and to human characteristics. A second section presents educational issues related to interactivity and attempts to show both why and how to promote a central objective: facilitating "becoming an author". The third section describes interactive narrative and shows, through questions about narrative sequences, aspects of temporality, and the presence of the reader or of the program in the author-reader relationship, that "fictional theories" about life are always present in the background.
-
502.
Reinert Max - "Lexical worlds" and their "logic" through the statistical study of a corpus of nightmare narratives. The French school of data analysis, strongly influenced during the seventies by the work of J.P. Benzécri, owes much to his interest in language data. The development of desktop publishing and the circulation of texts made possible by computers have revitalized this line of research under what is usually called the "statistical analysis of textual data" (Lebart, Salem, 1988). In this article, the author presents his work and the open questions in this field: the Alceste method and the notion of lexical worlds, which are central to the proposed strategy. The presentation is backed up by a specific application: the analysis of a corpus of 212 nightmare accounts.
-
503.
Context
The St-Lawrence Center, part of Environment Canada, undertook a few years ago the very ambitious project of studying the toxic contamination of the St-Lawrence River. In collaboration with the Institut National de la Recherche Scientifique - Eau, a sub-project based on numerical modelling was defined in order to analyze contaminant propagation from industrial and municipal effluents into the river system.

Goals
The specific goals of the project were the following:
1) to provide a precise quantification of contaminant concentrations in the effluent plume at a convenient scale;
2) to analyze the areas influenced by the main tributaries and by the different water masses entering the river reach;
3) to map and quantify areas as compared to water quality criteria;
4) to provide a method to select relevant hydrological events as a significant part of the analysis framework.

Methodology
Some basic choices were made at the beginning of the project:
1) the analysis framework emphasizes instream water quality rather than effluent water quality;
2) numerical modelling was the main tool used to evaluate water quality;
3) as far as possible, references to public regulations were incorporated;
4) a strong complementarity of different computer tools was favoured: geographical information systems, database management systems, simulation models;
5) the numerical solution method for the transport-diffusion model is typically Lagrangian: the Random Walk Method;
6) the contamination analysis uses the so-called « Weighted Unusable Area » method to quantify the areas that do not respect certain water quality criteria.

A typical contamination analysis project based on numerical modelling includes the following steps (fig. 2):
1) a preliminary study to determine the main characteristics of the problem and to choose the best strategy to analyze it;
2) field measurements essential to the calibration and validation of the computer model;
3) hydrodynamic modelling, which provides the basic data on the flow field; this step includes the calibration and validation of the model, as well as the prediction of the flow fields corresponding to well-defined and contamination-relevant hydrological events;
4) hydrological analysis, which identifies the relevant flow events that will further be used in the model prediction; this approach allows standardization of this very important input data set and avoids arbitrary choices of flow field;
5) transport-diffusion modelling, which constitutes the main step; it provides the chemical species concentrations downstream from the effluent discharge and affords an estimate of the overall water quality of the reach, as influenced by the main tributaries; this step includes the calibration and validation of the model, which precedes the prediction exercise;
6) contamination analysis, which requires the choice of appropriate and relevant water quality criteria; to implement this step, we propose a new approach inspired by the Instream Flow Incremental Methodology often used to define the quality and availability of fish habitat in river reaches.

Numerical methods
As previously mentioned, the project included the development of a Lagrangian model to simulate the transport of solutes in a two-dimensional steady-state river flow. We will emphasize this point. The main objective of the software development was to provide an efficient and user-friendly management tool for the public agencies.
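For reference, the two-dimensional transport-diffusion (advection-dispersion) equation solved by such a model, referred to below as equation (1), is not reproduced in this abstract; its standard depth-averaged form, given here as an assumed textbook reference rather than the project's exact formulation, reads

$$\frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x} + v\,\frac{\partial C}{\partial y} = \frac{\partial}{\partial x}\!\left(D_x\,\frac{\partial C}{\partial x}\right) + \frac{\partial}{\partial y}\!\left(D_y\,\frac{\partial C}{\partial y}\right) + S,$$

where $C$ is the depth-averaged concentration, $(u, v)$ the depth-averaged velocity components, $D_x$ and $D_y$ the dispersion coefficients, and $S$ a source term representing the effluent loads. In the Lagrangian (Random Walk) approach, this equation is not discretized directly: the plume is represented by particles whose mean displacement follows $(u, v)$ and whose random displacements reproduce the dispersion terms.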
Many analytical test cases helped in the choice of the best numerical algorithms and of the non-physical parameters, as well as in the validation of the computer code. Furthermore, the results of two dye-tracing experiments performed in conjunction with airborne remote sensing techniques provided data to validate the model on the St-Lawrence River (figs. 5, 6, 7 and 8 illustrate different simulation results corresponding to the different tasks mentioned previously). In the next paragraphs, we summarize the basic mathematical and numerical concepts implemented in the simulations.

To simulate solute transport in water media (porous or free surface), one usually uses Eulerian methods, which lead directly to concentration values. The solution algorithm presented here is instead based on a Lagrangian method, which offers explicit control over the additional numerical diffusion associated with every discretization method. This approach, also called the Random Walk Method (illustrated in fig. 3), or Particle Tracking Method, is more and more often used to solve hyperbolic equations. So far, the literature does not provide many applications of this method to solute transport in free surface flow; oil spill modelling is a domain where many applications have been reported.

The propagation of solute matter in free surface flow is mathematically described by momentum, mass and solute conservation equations. Since the Random Walk solution of the transport-diffusion equation (equ. 1) requires hydrodynamic data to calculate the mean transport along streamlines together with dispersion, independent simulations providing the necessary flow field data (velocities, diffusivities, depths) have to be performed before undertaking the transport-diffusion tasks. For this purpose, the shallow-water (Navier-Stokes) equations have become a well-known tool to represent flow fields in shallow waters. However, one should be aware of some often neglected but important aspects of such models, such as moving boundaries and turbulence closure.

Solution techniques
Two main goals were kept in mind during the implementation of the various algorithms: precision of results and fast computation. The following choices were made to achieve these objectives:
1) A finite element discretization and solution method provides and carries the hydrodynamic information, but particles are tracked on a finite-difference grid (mixed discretization principle).
2) The convective component of the movement is realized by moving the grid instead of the particles (shifted grid principle).
3) Computation of concentrations optimizes smoothing while minimizing artificial diffusion (controlled diffusive smoothing principle).
4) When a section of the plume has reached a steady-state « regime », it is not necessary to continue the simulation on that section to proceed downstream; the simulation is divided into almost independent sections (convolution principle).
5) The particles have an a priori nondimensional weight and a unit concentration is calculated from these (unit plume principle).
6) The real concentration is linearly dependent on the pollutant loads introduced into the milieu (linearity principle).
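To make the Random Walk principle concrete, the following minimal Python sketch advances a cloud of unit-weight particles by one convective step along the local flow field plus a zero-mean Gaussian jump whose standard deviation is sqrt(2·D·Δt), matching the dispersion terms above. It is an illustrative sketch only: the constant velocity, diffusivity, time-step and particle-count values are assumptions rather than values from the study, and the actual model interpolates the flow-field data from the finite element hydrodynamic solution onto the tracking grid.

import numpy as np

def random_walk_step(x, y, u, v, Dx, Dy, dt, rng):
    """One Random Walk (particle tracking) step.

    x, y   : particle coordinates (1-D arrays)
    u, v   : local velocity components at the particle positions
    Dx, Dy : local longitudinal/transverse dispersion coefficients
    dt     : time step
    """
    # Convective (deterministic) displacement along the local flow field
    x = x + u * dt
    y = y + v * dt
    # Diffusive (random) displacement: zero mean, variance 2*D*dt
    x = x + rng.normal(0.0, np.sqrt(2.0 * Dx * dt), size=x.shape)
    y = y + rng.normal(0.0, np.sqrt(2.0 * Dy * dt), size=y.shape)
    return x, y

# Illustrative release of 10 000 unit-weight particles at an outfall at (0, 0)
rng = np.random.default_rng(0)
n = 10_000
x, y = np.zeros(n), np.zeros(n)
for _ in range(500):  # 500 steps of 10 s in a uniform, steady flow field
    x, y = random_walk_step(x, y, u=0.5, v=0.0, Dx=1.0, Dy=0.1, dt=10.0, rng=rng)

# Concentrations are then obtained by summing particle weights per grid cell and
# dividing by cell volume; the resulting "unit plume" is scaled linearly to the
# actual pollutant load (unit plume and linearity principles).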
The Weighted Unusable Area method
The Weighted Unusable Area method provides a convenient means to compare effluent plume water quality to water quality criteria, as well as to quantify the areas that do not comply with them. A comparable method is widely used to define the quality and availability of fish habitat downstream from regulation reservoirs, with the purpose of establishing a minimum guaranteed flow discharge to protect target species (the Instream Flow Incremental Methodology: IFIM). The method consists essentially of computing areas within the analysis domain weighted by a factor that represents the exceedence of certain water quality criteria. Among the different options for defining the weighting factor, all incorporating the effective contaminant concentration, we defined the following:
1) the ratio of the concentration to the water quality criterion, without consideration of exceedence or compliance;
2) a weighting factor equal to 1 only if the concentration exceeds the criterion (non-compliance);
3) option #1, but using the concentration results corresponding only to the effluent plumes, excluding the ambient water quality of the reach; this emphasizes individual corporate responsibility (proposed for implementation);
4) option #1, but with the ratio raised to a power « n », a procedure that emphasizes the non-linear increase of toxicity related to the exceedence of the criterion (could be useful for academic purposes).

We also propose a Global Weighted Unusable Area concept to combine all the different chemical species present in an effluent plume. The combination is made possible by using the specific criterion corresponding to each species. This procedure leads to a new state variable that represents Contamination Standard Units.
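As a rough computational illustration of the weighting options listed above, the sketch below (in Python, with hypothetical function and variable names and made-up example numbers, not the study's actual implementation) evaluates a Weighted Unusable Area over a set of computational cells from their surface areas, the simulated concentrations and a water quality criterion.

import numpy as np

def weighted_unusable_area(area, conc, criterion, option=1, n=2.0, conc_plume=None):
    """Weighted Unusable Area for one chemical species.

    area       : surface area of each computational cell
    conc       : simulated concentration in each cell (ambient + plumes)
    criterion  : water quality criterion for the species
    conc_plume : plume-only concentration, excluding ambient water (option 3)
    """
    if option == 1:    # ratio to the criterion, whether exceeded or not
        w = conc / criterion
    elif option == 2:  # weight of 1 only where the criterion is exceeded
        w = (conc > criterion).astype(float)
    elif option == 3:  # option 1 on the plume-only concentration (corporate responsibility)
        w = conc_plume / criterion
    elif option == 4:  # option 1 with the ratio raised to a power n (non-linear toxicity)
        w = (conc / criterion) ** n
    else:
        raise ValueError("unknown weighting option")
    return float(np.sum(w * area))

# Example with made-up numbers: four 100 m2 cells, criterion of 1.0 mg/L
area = np.full(4, 100.0)
conc = np.array([0.5, 1.5, 3.0, 0.2])
print(weighted_unusable_area(area, conc, criterion=1.0, option=2))  # -> 200.0 m2

# A Global WUA (in Contamination Standard Units) would sum such values over all
# species, each normalized by its own criterion.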
Keywords: Qualité de l'eau, modèle hydrodynamique, contamination industrielle, analyse spatiale, modélisation numérique, marche au hasard, aires pondérées inutilisables (API), fleuve Saint-Laurent, lac Saint-Pierre, Water quality, hydrodynamic model, industrial contamination, spatial analysis, numerical modelling, Random Walk Method, Weighted Unusable Area, Saint Lawrence River, Lake Saint-Pierre
-
504.
In initial teacher training, studies on reflective practice often adopt the point of view of the future teacher. Nevertheless, the action of the supervisor is very important during a feedback (retroaction) activity. Within one of the core activities, the know-how, ten interviews are analyzed to determine the differences between the practices of two supervisors and their impact on the reflective practice of the future teachers.
Keywords: Superviseur, rétroaction, réflexivité, futurs enseignants, Supervisor, retroaction, reflexivity, future teachers
-
-
506.
As part of a study linked to future repair work on water control structures along the St. Lawrence River (Québec, Canada), drone technology was used to evaluate the number and species of freshwater turtles present in three areas of interest. The objectives were to compare the results of turtle surveys conducted using drones with those of visual surveys conducted on the ground; to identify the optimal weather conditions that maximize the detection of turtles by drone; to clarify certain methodological elements for future inventories; and to identify the limitations of the current approach. The drone survey method was more effective than visual surveys, mainly because it allowed the subsequent analysis of photographs taken in the field. However, the drone method was not as effective for the less abundant turtle species. Analysis of weather conditions indicated a small negative influence of air temperature. This study demonstrates that drone technology provides superior results in terms of turtle detection compared to visual surveys conducted on the ground. However, it is likely to be less efficient in environments where turtle density is low, land access is limited, canopy cover is extensive, or where there are regulatory flight restrictions. According to the present study, the average optimal altitude for detecting and identifying turtles using a drone is approximately 20 m.
Keywords: altitude, détection, milieu humide, système d'aéronef télépiloté, tortues, altitude, detection, turtles, unmanned aerial system, wetland
-
507.
In order to define the educational role of scientific and technological exhibitions intended for children, la Cité des enfants of la Cité des sciences et de l'industrie has developed a research policy in partnership with universities. The first results indicate the conditions favourable to the development of certain forms of learning in the exhibition, so as to foster cooperative learning between children and between adults and children, and also enable the definition of activity strategies and ways of exhibiting that will encourage all children in their discoveries, including those who have difficulties in their schooling.
-
508.
Do new information and communication technologies make work less hard for employees and give them more freedom? This article examines whether these information and communication technologies help to liberate workers or whether they have the opposite effect. To better understand the effects of these technologies on work relations, it is important to consider to what extent the 20th century conceptual model of an "employee" is gradually becoming outdated in firms which organize their internal work processes on the basis of this new equipment. Thus, robotics and telematics are combined in a complex production process, but their respective impacts on the individual employee can be quite different. These external forces affect the individual employee in a much more direct and personal way than the simple constraints arising from technical means of production.

The flexibility sought by the firm translates into a dual requirement for modern workers in terms of performance: rapid development of their professional autonomy (being able to work without a safety net) and continuous increases in their job mobility. While it is unquestionable that this digital technological "revolution" is increasing the supply of skilled jobs, it is also important to know the price that is, or can be, required from current workers.

It is true that these memorized and processed data help avoid long debates about who did what, when and how. Nobody can justifiably deny these material and external facts. Moreover, this implies a retrieval of only selected data and not of all the elements of the operations carried out by the employee. It goes without saying that a memory that retains such elements is worth ten times more than the memory of a manager, but the latter's valid and responsible judgement is still needed to provide an overall and human appreciation of the work performed by that same employee. Thus, can a firm's human resources manager, in all honesty, be satisfied with the data and rely on this computing "subcontractor," forgetting that, in this case, the data processed are neither complete nor perfect?

Owing to information and communication technologies, employees are always near at hand regardless of their geographical location on any given day. Thus, at least for employees, absence from and presence at work no longer have the same meaning nor the same impact. Teleworkers, in the broad sense of the word, might organize their own work schedule, taking into account their personal constraints and the times of the day when they are at their best physically and mentally and have the optimal environmental conditions to carry out the work. But is this really the case? In strict legal terms, are these teleworkers, who have no roots and no fixed location on the employer's premises, still employees? In such circumstances, have they acquired such a degree of autonomy that their legal status might be changed? Even though these teleworkers are judicially and legislatively described as employees, several current rules in employment-related laws would be hard to apply to them.

The magnetic badge allows for tight and effective management of staff circulation within the establishment. Entries and exits made using the magnetic badge are thus memorized and processed (time, length of time, place, etc.). Thus, there is data on who circulates where, when, for how long and how many times per day, etc.
This by-product of the electronic control of access routes thus makes it possible to track the entire staff and even to establish, at any precise time, an accurate accounting of the time spent near the work station and the time when the employee may be elsewhere... This gives rise to another question: should this employee not have a private "protective bubble" that guarantees him or her a degree of privacy, even inside the establishment? To what extent can an individual's actions be monitored so closely? The risk of a few blunders or the existence of a doubt about embezzlement by certain employees cannot justify a Kafkaesque surveillance of all employees, everywhere and all the time. Of course, there is no single, valid answer to any of these issues raised by the presence of information and communication technologies. Whatever the case may be, this "black eye" makes the employee highly visible to others, at the cost of abandoning his or her privacy, self-respect and perhaps dignity. Employees are made so visible and transparent that they are "laid bare," or at least this is how they may feel, which may be a source of distress. While not denying the efficiency of information and communication technologies, we should be no less vigilant in the face of their blinding effects. Who knows what logic and model the authors of this software have used to obtain these results? It seems that a responsible manager should be cautious and maintain a systematic doubt, so that this hidden technological subcontracting does not replace the necessary analysis of a set of qualitative and contextual data that are not dealt with at all by these new technologies.
-
509.
Polygenic neo-variables to the rescue of a winter scene. — This article seeks to demonstrate the potential of remote sensing, via a simulated mission of the Spot satellite, for elucidating the urban landscape. The case is Strasbourg; the date, February 1982. Certain faults in this simulation exercise unfortunately precluded the normal possibility of multi-spectral classification. Attempts were made to rectify these deficiencies by creating three new band categories: radiometric, structural, and textural. The morphological neo-variables of the two latter categories provided indispensable help in producing a reasonably satisfactory classification. They also raised a series of stimulating questions. There were other deficiencies besides those of the initial imagery which made it quite difficult to accomplish this particular case study: there was no specialized information which might facilitate corroborative analysis of the image. The fact that these unfavourable conditions were surmounted should make for optimism about the potential use of better Spot data, even in research centers which have only a modest level of graphic equipment.
Keywords: methodology, remote sensing, urban morphology, méthodologie, morphologie urbaine, télédétection
-
510.
Often considered a vector for major change, digital practices represent an opportunity to better understand literature and literary space by revealing, more explicitly than ever before, ontological elements that have, as such, timeless value. In particular, and as is the focus of this article, digital creation offers an opportunity to revive a debate on the status of literature that has persisted throughout the history of thought, dating back to Plato and Aristotle: that of the relationship between literature and reality, an opposition which Sartre and Derrida had already sought to deconstruct during the 20th century. We find that digital space highlights the absence of separation between the symbolic and the non-symbolic, thereby preventing us from positing a break between the imaginary and reality. To address this structure, we turn to the concept of editorialization, which refers to the set of devices that allow for content production in digital space while accounting for the fusion of digital and non-digital spaces. Through literary examples – Traques Traces by Cécile Portier and Laisse venir by Anne Savelli and Pierre Ménard – we will demonstrate how literature today participates in editorializing the world, thereby definitively burying the imaginary-reality binary in favour of an anamorphic structure.
Keywords: Littérature numérique, imaginaire, réel, mimesis, éditorialisation, Google Street View, Google Maps, Sartre, Derrida, Digital literature, imaginary, reality, mimesis, editorialization, Google Street View, Google Maps, Sartre, Derrida