LometMain.C

/**
   \file  Robots/LoBot/LometMain.C
   \brief Robolocust metrics log analyzer.

   This file defines the main function for a multithreaded analysis
   program that loads all the Robolocust metrics logs associated with an
   experiment and then combines the information contained in these logs
   to produce average trajectory information, exploiting parallelism
   wherever possible.

   Here's the background for this program: the Robolocust project aims to
   use locusts for robot navigation. Specifically, the locust sports a
   visual interneuron known as the Lobula Giant Movement Detector (LGMD)
   that spikes preferentially in response to objects moving toward the
   animal on collisional trajectories. Robolocust's goal is to use an
   array of locusts, each looking in a different direction. As the robot
   moves, we expect to receive greater spiking activity from the locusts
   looking in the direction in which obstacles are approaching (or being
   approached) and use this information to veer the robot away.

   Before we mount actual locusts on a robot, we would like to first
   simulate this LGMD-based navigation. Toward that end, we use a laser
   range finder (LRF) mounted on an iRobot Create driven by a quad-core
   mini-ITX computer. A computational model of the LGMD developed by
   Gabbiani et al. takes the LRF distance readings and the Create's
   odometry as input and produces artificial LGMD spikes based on the
   time-to-impact of approaching objects. We simulate multiple virtual
   locusts by using different angular portions of the LRF's field of
   view. To simulate reality a little better, we inject Gaussian noise
   into the artificial spikes.

   We have devised three different LGMD-based obstacle avoidance
   algorithms:

      1. EMD: pairs of adjacent LGMDs are fed into Reichardt motion
              detectors to determine the dominant direction of spiking
              activity and steer the robot away from that direction;

      2. VFF: a spike rate threshold is used to "convert" each virtual
              locust's spike rate into an attractive or repulsive virtual
              force; all the force vectors are combined to produce the
              final steering vector (see the sketch following this list);

      3. TTI: each locust's spike rate is fed into a Bayesian state
              estimator that computes the time-to-impact given a spike
              rate; these TTI estimates are then used to determine
              distances to approaching objects, thereby effecting the
              LGMD array's use as a kind of range sensor; the distances
              are compared against a threshold to produce attractive and
              repulsive forces, with the sum of the force field vectors
              determining the final steering direction.

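   To make the VFF combination step concrete, here is a minimal sketch.
   It is purely illustrative and is not code used by lomet or by the
   robot's controller; the names (Locust, SPIKE_THRESHOLD,
   vff_steering_direction) and the weighting scheme are made up for this
   example. Each virtual locust looks along a known direction; a locust
   spiking above the threshold contributes a force pushing away from its
   direction, one below the threshold contributes a force pulling toward
   it, and the direction of the vector sum is the steering direction.

   \code
   #include <cmath>
   #include <vector>

   struct Locust {
      float direction ;  // radians, relative to robot heading
      float spike_rate ; // spikes per second
   } ;

   const float SPIKE_THRESHOLD = 50 ; // hypothetical threshold (Hz)

   // Return the steering direction (radians) for the current LGMD array state.
   float vff_steering_direction(const std::vector<Locust>& locusts)
   {
      float fx = 0, fy = 0 ;
      for (unsigned int i = 0; i < locusts.size(); ++i)
      {
         const Locust& L = locusts[i] ;
         if (L.spike_rate > SPIKE_THRESHOLD) { // repulsive force
            fx -= L.spike_rate * std::cos(L.direction) ;
            fy -= L.spike_rate * std::sin(L.direction) ;
         }
         else { // attractive force
            fx += (SPIKE_THRESHOLD - L.spike_rate) * std::cos(L.direction) ;
            fy += (SPIKE_THRESHOLD - L.spike_rate) * std::sin(L.direction) ;
         }
      }
      return std::atan2(fy, fx) ; // direction of the summed force field
   }
   \endcode
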
   We also implemented a very simple algorithm that just steers the robot
   towards the direction of least spiking activity. Although it
   functioned reasonably well as an obstacle avoidance technique, it was
   found to be quite unsuitable for navigation tasks. Therefore, we did
   not pursue formal tests for this algorithm, focusing instead on the
   three algorithms mentioned above.

   To evaluate the relative merits of the above algorithms, we designed a
   slalom course in an approximately 12'x6' enclosure. One end of this
   obstacle course was designated the start and the other end the goal.
   The robot's task was to drive autonomously from start to goal,
   keeping track of itself using Monte Carlo Localization. As it drove,
   it would collect trajectory and other pertinent information in a
   metrics log.

   For each algorithm, we used four noise profiles: no noise, and
   Gaussian noise injected into the LGMD spikes at 25Hz, 50Hz, and
   100Hz. For each noise profile, we conducted 25 individual runs. We
   refer to an individual run from start to goal as an "experiment" and
   a set of 25 experiments as a "dataset."

   The objective of this program is to load an entire dataset and then
   perform the necessary computations to produce an "average" trajectory
   from start to goal. Other useful metrics are also computed, such as
   the average forward driving speed, the total number of collisions
   across all experiments, and so on.

   Since the above computations can take a while, this analysis program
   uses multiple threads to speed up various subtasks. Therefore, it is
   best run on a multiprocessor machine (as there are 25 experiments per
   dataset, something with 24 to 32 CPUs would provide maximal benefit).
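
   Usage: the program reads one configuration file (specified with the
   -c option and defaulting to ~/.lometrc) and treats every remaining
   command line argument as a directory holding one dataset. A typical
   invocation might therefore look like the following (the directory
   names are purely illustrative):

   \verbatim
   lomet -c ~/.lometrc lgmd/emd/noise25 lgmd/emd/noise50 lgmd/emd/noise100
   \endverbatim

   Each named directory will then get a results file (named "result" by
   default; see the result_file setting) summarizing the analysis of the
   dataset stored in that directory.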
*/

// //////////////////////////////////////////////////////////////////// //
// The iLab Neuromorphic Vision C++ Toolkit - Copyright (C) 2000-2005   //
// by the University of Southern California (USC) and the iLab at USC.  //
// See http://iLab.usc.edu for information about this project.          //
// //////////////////////////////////////////////////////////////////// //
// Major portions of the iLab Neuromorphic Vision Toolkit are protected //
// under the U.S. patent ``Computation of Intrinsic Perceptual Saliency //
// in Visual Environments, and Applications'' by Christof Koch and      //
// Laurent Itti, California Institute of Technology, 2001 (patent       //
// pending; application number 09/912,225 filed July 23, 2001; see      //
// http://pair.uspto.gov/cgi-bin/final/home.pl for current status).     //
// //////////////////////////////////////////////////////////////////// //
// This file is part of the iLab Neuromorphic Vision C++ Toolkit.       //
//                                                                      //
// The iLab Neuromorphic Vision C++ Toolkit is free software; you can   //
// redistribute it and/or modify it under the terms of the GNU General  //
// Public License as published by the Free Software Foundation; either  //
// version 2 of the License, or (at your option) any later version.     //
//                                                                      //
// The iLab Neuromorphic Vision C++ Toolkit is distributed in the hope  //
// that it will be useful, but WITHOUT ANY WARRANTY; without even the   //
// implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR      //
// PURPOSE.  See the GNU General Public License for more details.       //
//                                                                      //
// You should have received a copy of the GNU General Public License    //
// along with the iLab Neuromorphic Vision C++ Toolkit; if not, write   //
// to the Free Software Foundation, Inc., 59 Temple Place, Suite 330,   //
// Boston, MA 02111-1307 USA.                                           //
// //////////////////////////////////////////////////////////////////// //
//
// Primary maintainer for this file: mviswana usc edu
// $HeadURL: svn://isvn.usc.edu/software/invt/trunk/saliency/src/Robots/LoBot/LometMain.C $
// $Id: LometMain.C 14294 2010-12-02 06:28:57Z mviswana $
//

//--------------------------- LIBRARY CHECKS ----------------------------

#if !defined(INVT_HAVE_BOOST_PROGRAM_OPTIONS) || \
    !defined(INVT_HAVE_BOOST_FILESYSTEM)

#include <iostream>

int main()
{
   std::cerr << "Sorry, this program requires the following Boost libraries:\n"
             << "\tprogram_options filesystem\n\n" ;
   std::cerr << "Please ensure development packages for above libraries "
             << "are installed\n"
             << "and then rebuild this program to get it to work.\n" ;
   return 255 ;
}

#else // various required libraries available

//------------------------------ HEADERS --------------------------------

// lobot headers
#include "Robots/LoBot/metlog/LoCorrFinder.H"
#include "Robots/LoBot/metlog/LoPointMatrix.H"
#include "Robots/LoBot/metlog/LoMetlogLoader.H"
#include "Robots/LoBot/metlog/LoDataset.H"
#include "Robots/LoBot/metlog/LoExperiment.H"
#include "Robots/LoBot/metlog/LoMetlogList.H"
#include "Robots/LoBot/metlog/LoPointTypes.H"

#include "Robots/LoBot/thread/LoThread.H"
#include "Robots/LoBot/config/LoConfigHelpers.H"

#include "Robots/LoBot/util/LoFile.H"
#include "Robots/LoBot/util/LoStats.H"
#include "Robots/LoBot/util/LoMath.H"
#include "Robots/LoBot/util/LoSTL.H"
#include "Robots/LoBot/util/LoSysConf.H"
#include "Robots/LoBot/misc/LoExcept.H"
#include "Robots/LoBot/misc/singleton.hh"

// Boost headers
#include <boost/program_options.hpp>
#include <boost/lambda/lambda.hpp>

// Standard C++ headers
#include <iostream>
#include <algorithm>
#include <string>
#include <vector>
#include <functional>
#include <iterator>
#include <stdexcept>
#include <utility>

// Standard C headers
#include <stdlib.h>

// Standard Unix headers
#include <unistd.h>

//-------------------------- KNOB TWIDDLING -----------------------------

namespace {

// Retrieve settings from global section of config file
template<typename T>
inline T conf(const std::string& key, const T& default_value)
{
   return lobot::global_conf<T>(key, default_value) ;
}

/// This class encapsulates various parameters that can be used
/// to tweak different aspects of the trajectory metrics analysis.
class LometParams : public lobot::singleton<LometParams> {
   /// The lomet program expects to be passed a list of directories on
   /// the command line. Each directory is assumed to contain a dataset
   /// consisting of 25 (or more) metrics log files collected from
   /// experiments conducted to gauge the performance of an LGMD-based
   /// obstacle avoidance algorithm in a local navigation task.
   ///
   /// The program reads the metlogs in these directories, parses them to
   /// extract the relevant info, and performs the necessary analysis that
   /// highlights the algorithm's average-case behaviour. Since the
   /// directories may well contain files other than the metlogs, the
   /// analysis program needs some way to figure out which ones to load.
   ///
   /// This setting specifies a regular expression that matches the names
   /// of all the metrics logs.
   std::string m_log_name ;

   /// Once we have analyzed the log files stored in a directory, the
   /// results will be written to a file in that directory. This setting
   /// specifies the name of the results file.
   ///
   /// By default, the result file is named "result". Thus, if lomet is
   /// invoked with the command line argument "foo", the analysis will be
   /// written to the file "foo/result".
   std::string m_result ;

   /// To help with debugging, lomet can be configured to dump the
   /// datasets it loads once it's done parsing the metlog files making
   /// up the dataset. The dump will be written to the same directory
   /// as the one from which the logs were loaded and will be named
   /// "foo.dump", where "foo" is the original name of the metlog file.
   ///
   /// This setting turns dataset dumping on. By default, it is off.
   bool m_dump_dataset ;

   /// Private constructor because this is a singleton.
   LometParams() ;

   // Boilerplate code to make generic singleton design pattern work
   friend class lobot::singleton<LometParams> ;

public:
   /// Accessing the various parameters.
   //@{
   static const std::string& log_name() {return instance().m_log_name     ;}
   static const std::string& result()   {return instance().m_result       ;}
   static bool  dump_dataset()          {return instance().m_dump_dataset ;}
   //@}
} ;

// Parameters initialization
LometParams::LometParams()
   : m_log_name(conf<std::string>("log_file_name",
                                  "/(metlog-[[:digit:]]{8}-[[:digit:]]{6})$")),
     m_result(conf<std::string>("result_file", "result")),
     m_dump_dataset(conf("dump_dataset", false))
{}
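
// For reference: the default regular expression above matches file names
// of the form metlog-XXXXXXXX-YYYYYY, i.e., "metlog-" followed by eight
// digits, a dash and six more digits (presumably a date and time stamp).
// For example, a file named metlog-20101202-152745 would be picked up as
// a metrics log; that specific name is just a hypothetical illustration.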

// Shortcut
typedef LometParams Params ;

} // end of local anonymous namespace encapsulating above helpers

//-------------------------- PROGRAM OPTIONS ----------------------------

namespace {

// This program recognizes just one option, viz., -c or --config-file.
// All other command line arguments are interpreted as names of
// directories, each of which is assumed to contain the Robolocust
// metrics log files for one experiment, i.e., each directory specified
// on the command line should hold one dataset.
//
// This data type is used to store the list of directories specified on
// the command line.
typedef std::vector<std::string> DirList ;

// The following type stores the list of metlog directories as well as
// the name of the config file, i.e., it encapsulates all of the
// interesting stuff from the command line in one neat little packet.
typedef std::pair<std::string, DirList> CmdLine ;

// If the user does not supply the -c option on the command line, we will
// fall back to the default config file name returned by this function.
std::string default_config_file()
{
   const char* home = getenv("HOME") ; // guard against HOME being unset
   return std::string(home ? home : ".") + "/.lometrc" ;
}

// Helper function to take care of the annoying details of using
// Boost.program_options to get at the command line arguments.
//
// DEVNOTE: This code is lifted almost verbatim from the
// Boost.program_options tutorial and examples that come with the Boost
// documentation. There may be better (i.e., neater, more effective,
// clearer, more efficient, whatever) ways to use the library.
CmdLine parse(int argc, char* argv[])
{
   std::string config_file_name ;

   // Specify the command line options
   namespace po = boost::program_options ;
   po::options_description options("Command line options") ;
   options.add_options()
      // the -c option for specifying the config file
      ("config-file,c",
       po::value<std::string>(&config_file_name)->
          default_value(default_config_file()),
       "specify configuration settings file")

      // the -d option for specifying metlog directories; this option
      // need not actually be supplied on the command line as all
      // non-option arguments will be "converted" to multiple -d options
      ("metlog-dir,d",
       po::value<DirList>(),
       "directory containing Robolocust metrics log files for an experiment") ;

   // Convert all non-option arguments to a list of -d specs
   po::positional_options_description p ;
   p.add("metlog-dir", -1) ;

   // Setup done: now parse argc and argv...
   po::variables_map varmap ;
   po::store(po::command_line_parser(argc, argv).
             options(options).positional(p).run(), varmap) ;
   po::notify(varmap) ;

   // Stuff the metlog directories specified on command line into a
   // vector of strings and return that along with the config file name
   // to the caller...
   if (varmap.count("metlog-dir"))
      return CmdLine(config_file_name, varmap["metlog-dir"].as<DirList>()) ;

   // Great, user did not supply any directory names to work on; s/he's
   // gonna get a smackin'...
   return CmdLine(config_file_name, DirList()) ;
}

// Helper function to read the lomet program's config file. If the
// specified file doesn't exist, the program will rely on default
// settings. If the config file contains problematic constructs, an error
// will be reported but the program will continue on, simply ignoring the
// bad settings.
void load_config_file(const std::string& file_name)
{
   using namespace lobot ;
   try
   {
      Configuration::load(file_name) ;
      //Configuration::dump() ;
   }
   catch (customization_error& e)
   {
      if (e.code() != NO_SUCH_CONFIG_FILE)
         std::cerr << e.what() << '\n' ;
   }
}

} // end of local anonymous namespace encapsulating above helpers

//------------------------- METRICS ANALYSIS ----------------------------

namespace {

// Shortcuts
using lobot::Dataset ;
using lobot::PointList ;
using lobot::PointListName ;

// Forward declarations
PointList analyze(Dataset&, PointListName) ;
PointList analyze_bumps(Dataset&) ;
std::vector<float> analyze_speeds(Dataset&) ;
void analyze_events(Dataset&, lobot::Experiment* result) ;

// This function reads all the Robolocust metrics logs in the specified
// directory and combines them to produce the desired average trajectory
// and other pertinent info.
void process_experiment(const std::string& dir)
{
   using namespace lobot ;

   // Make sure we're dealing with a valid directory...
   if (! is_dir(dir)) {
      std::cerr << dir << ": no such directory\n" ;
      return ;
   }

   // Read names of all metrics logs available under directory of interest
   std::vector<std::string> log_list = find_file(dir, Params::log_name()) ;
   if (log_list.empty()) {
      std::cerr << dir << ": no Robolocust metrics logs found\n" ;
      return ;
   }
   MetlogList metlog_list(log_list) ;

   // Create the dataset object that will collect the individual parsed
   // metlogs and help with their analysis...
   Dataset dataset ;

   // Load and parse all the logs in parallel
   const int T = std::min(static_cast<int>(log_list.size()), num_cpu()) ;
   std::vector<MetlogLoader*> loader_threads ;
   loader_threads.reserve(T) ;
   for (int i = 0; i < T; ++i)
      loader_threads.push_back(MetlogLoader::create(metlog_list, &dataset)) ;

   // Now we wait for the loader threads to do their thing...
   //
   // DEVNOTE: If we don't pause this main thread briefly before invoking
   // the wait API, we could be in big trouble because the scheduler
   // might decide to go right on executing this thread before it begins
   // any of the loader threads, in which case the wait will fail as none
   // of the other threads would have started up just yet, i.e., the
   // thread count would be zero, causing this thread to mistakenly
   // conclude that all the loaders are done and that it is safe to
   // delete them. When those threads then start executing, they will try
   // to reference their associated loader objects, which, because they
   // no longer exist, will end up taking us on a scenic, albeit short
   // and tragic, bus ride to Segfault City.
   //
   // DEVNOTE 2: A better way to do this is to use another condition
   // variable rather than the one used implicitly by Thread::wait_all(),
   // over which we have no control. For example, we could wait on a
   // condition variable that tests a counter going all the way up to T,
   // the number of loader threads created. When each loader is done, it
   // will increment the counter. Since we would have explicit control
   // over this variable here, we can be assured that this thread
   // won't mistakenly assume all the loaders are done.
   //
   // Of course, the Right Thing is good and all. But why bother? A short
   // sleep works just as well for most practical purposes...
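   //
   // The fragment below sketches that counter-based approach using plain
   // pthreads primitives. It is only an illustration of DEVNOTE 2 and is
   // not part of this program; the names loaders_done, loaders_mutex and
   // loaders_cond are made up for the sketch, and the real fix would have
   // to live partly inside MetlogLoader itself.
#if 0
   // Hypothetical shared state (at namespace scope):
   //    static int             loaders_done  = 0 ;
   //    static pthread_mutex_t loaders_mutex = PTHREAD_MUTEX_INITIALIZER ;
   //    static pthread_cond_t  loaders_cond  = PTHREAD_COND_INITIALIZER ;

   // Each loader thread, just before it exits:
   pthread_mutex_lock(&loaders_mutex) ;
   ++loaders_done ;
   pthread_cond_signal(&loaders_cond) ;
   pthread_mutex_unlock(&loaders_mutex) ;

   // This thread, instead of the sleep() + Thread::wait_all() below:
   pthread_mutex_lock(&loaders_mutex) ;
   while (loaders_done < T)
      pthread_cond_wait(&loaders_cond, &loaders_mutex) ;
   pthread_mutex_unlock(&loaders_mutex) ;
#endif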
   sleep(1) ; // HACK! to prevent segfault; see comment above
   Thread::wait_all() ;
   purge_container(loader_threads) ;

   // Make sure the dataset actually has something in it that can be
   // analyzed...
   if (dataset.empty()) {
      std::cerr << dir << ": unable to load any metrics logs\n" ;
      return ;
   }
   if (Params::dump_dataset())
      dataset.dump() ;

   // Now that all the metlogs have been loaded into Experiment objects,
   // analyze the data to help gauge the robot's average case behaviour.
   std::string result_file = dir + "/" + Params::result() ;
   Experiment* result = Experiment::create(result_file) ;

   result->point_list(TRAJECTORY,     analyze(dataset, TRAJECTORY)) ;
   result->point_list(EMERGENCY_STOP, analyze(dataset, EMERGENCY_STOP)) ;
   result->point_list(EXTRICATE,      analyze(dataset, EXTRICATE)) ;
   result->point_list(LGMD_EXTRICATE, analyze(dataset, LGMD_EXTRICATE)) ;
   result->point_list(BUMP, analyze_bumps(dataset)) ;
   result->speed_list(analyze_speeds(dataset)) ;
   analyze_events(dataset, result) ;

   if (! result->save())
      std::cerr << result_file << ": will not overwrite\n" ;
   delete result ;
}

// This function finds the "reference experiment" for the specified point
// list and then launches multiple threads to "normalize" the remaining
// experiments so that they all have the same number of points in the
// point list of interest as the reference experiment. The average of all
// these transformed point lists is then used as the final result for
// that category.
//
// To make the above discussion a little more concrete and a little more
// clear, let us say we have 25 experiments in a dataset and want to find
// the robot's average trajectory from start to finish. Now, each of the
// 25 experiments would have recorded some points for the robot's
// trajectory. Unfortunately, we cannot simply take the centroids of the
// "corresponding" points across all experiments because each
// experiment's list of trajectory points will have a different
// cardinality.
//
// For example, the first experiment might record 100 points, the second
// one might record 120 points, the third one 92 points; so on and so
// forth. Therefore, we first need to decide which experiment to use as a
// reference. Once we have the reference experiment, we find point
// correspondences between all the other experiments and it using a
// simple Euclidean distance check.
//
// The above procedure will end up discarding points in experiments that
// have more trajectory points than the reference experiment and
// duplicating points in those experiments that have fewer trajectory
// points than the reference experiment. At the end of this
// transformation, we would have 25 trajectory point lists all with the
// same cardinality. Now we can go ahead and take the centroids of all
// the corresponding points to obtain the final average trajectory.
PointList analyze(Dataset& dataset, PointListName point_list)
{
   using namespace lobot ;

   dataset.rewind() ;
   const Experiment* refexp = dataset.find_refexp(point_list) ;

   PointMatrix point_matrix(refexp->size(point_list), dataset.size()) ;

   const int T = std::min(dataset.size() - 1, num_cpu()) ;
   std::vector<CorrFinder*> corr_threads ;
   corr_threads.reserve(T) ;
   for (int i = 0; i < T; ++i)
      corr_threads.push_back(CorrFinder::create(refexp, dataset,
                                                point_list, &point_matrix)) ;

   sleep(1) ; // prevent segfault; see comment in process_experiment()
   Thread::wait_all() ;
   purge_container(corr_threads) ;

   const PointList& refpl = refexp->point_list(point_list) ;
   if (! refpl.empty())
      point_matrix.add(refpl) ;
   return point_matrix.average() ;
}
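
// The final centroid step described above is what PointMatrix::average()
// is expected to perform (its actual implementation lives in
// LoPointMatrix, not here). The sketch below only illustrates that step
// and is not used by this program: it assumes the normalization has
// already happened, i.e., all point lists have the same cardinality, and
// it uses a hypothetical Point type rather than lobot's point classes.
#if 0
struct Point { float x, y ; } ;

// Given N point lists of equal length, return the list whose i-th entry
// is the centroid of the i-th points of all the input lists.
std::vector<Point>
average_point_lists(const std::vector< std::vector<Point> >& lists)
{
   std::vector<Point> avg(lists.empty() ? 0 : lists[0].size()) ;
   for (unsigned int i = 0; i < avg.size(); ++i)
   {
      float sx = 0, sy = 0 ;
      for (unsigned int j = 0; j < lists.size(); ++j) {
         sx += lists[j][i].x ;
         sy += lists[j][i].y ;
      }
      avg[i].x = sx/lists.size() ;
      avg[i].y = sy/lists.size() ;
   }
   return avg ;
}
#endif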

// This function "analyzes" the dataset's bump events, which simply
// involves appending each experiment's bump list and returning the
// resulting list as the final result to be saved. We don't bother with
// finding a reference experiment and computing an average w.r.t. that
// reference because the total number of bumps across all experiments
// making up a dataset ought to be fairly low (on the order of 2-5).
// Therefore, in the pretty pictures that all of this data processing
// will eventually lead up to, we simply show all the bump points,
// stating that they are the total across all experiments and not
// averaged like the other point lists.
PointList analyze_bumps(Dataset& dataset)
{
   PointList bumps(10) ;
   try
   {
      dataset.rewind() ;
      for(;;)
      {
         const lobot::Experiment* E = dataset.next() ;
         bumps += E->point_list(lobot::BUMP) ;
      }
   }
   catch (Dataset::eol&){}
   return bumps ;
}

// This function "analyzes" the speed readings recorded by each
// experiment, which involves simply appending all of the readings into
// one giant list. This list is then stored in the result object, which
// takes care of computing mean and standard deviation prior to saving
// the result file.
std::vector<float> analyze_speeds(Dataset& dataset)
{
   std::vector<float> speeds ;
   speeds.reserve(dataset.size() * 30) ;
   try
   {
      dataset.rewind() ;
      for(;;)
      {
         const lobot::Experiment*  E = dataset.next() ;
         const std::vector<float>& S = E->speed_list() ;
         std::copy(S.begin(), S.end(), std::back_inserter(speeds)) ;
      }
   }
   catch (Dataset::eol&){}

   // Convert all speeds from m/s to mm/s to get non-fractional speed
   // values. Fractional floating point numbers result in extreme
   // roundoff errors when we perform the two-way ANOVA computations for
   // the speed_stats.
   using namespace boost::lambda ;
   std::transform(speeds.begin(), speeds.end(), speeds.begin(), _1 * 1000.0f) ;

   return speeds ;
}

// Quick helper to return the mean and standard deviation of a vector of
// integers.
lobot::generic_stats<int> stats(const std::vector<int>& v)
{
   using namespace lobot ;
   generic_stats<float> s = compute_stats<float>(v.begin(), v.end()) ;
   return generic_stats<int>(s.n,
                             round(s.sum),  round(s.ssq),
                             round(s.mean), round(s.stdev)) ;
}

// Quick helper to return the mean and standard deviation of a vector of
// floats.
lobot::generic_stats<float> stats(const std::vector<float>& v)
{
   return lobot::compute_stats<float>(v.begin(), v.end()) ;
}

// This function analyzes the occurrences of emergency stop events and LRF
// and LGMD extrication events. It computes the means and standard
// deviations for these event types as well as the means and standard
// deviations for the total number of extrications and the LGMD success
// rate. Finally, it also computes the mean time-to-goal and its standard
// deviation (i.e., the "reached goal" event).
void analyze_events(Dataset& dataset, lobot::Experiment* result)
{
   std::vector<int>   em_stop, lrf_extr, lgmd_extr, total_extr ;
   std::vector<float> lgmd_success, extr_success, durations ;

   em_stop.reserve(dataset.size()) ;
   lrf_extr.reserve(dataset.size()) ;
   lgmd_extr.reserve(dataset.size()) ;
   total_extr.reserve(dataset.size()) ;
   lgmd_success.reserve(dataset.size()) ;
   extr_success.reserve(dataset.size()) ;
   durations.reserve(dataset.size()) ;

   try
   {
      dataset.rewind() ;
      for(;;)
      {
         const lobot::Experiment* E = dataset.next() ;

         int stops = E->emergency_stop_size() ;
         em_stop.push_back(stops) ;

         int lrf = E->extricate_size() ;
         lrf_extr.push_back(lrf) ;

         int lgmd = E->lgmd_extricate_size() ;
         lgmd_extr.push_back(lgmd) ;

         int total = lrf + lgmd ;
         total_extr.push_back(total) ;

         lgmd_success.push_back(lgmd  * 100.0f/total) ;
         extr_success.push_back(total * 100.0f/stops) ;

         durations.push_back(E->duration()/1000.0f) ;
      }
   }
   catch (Dataset::eol&){}

   result->emergency_stop_stats(stats(em_stop)) ;
   result->lrf_extricate_stats(stats(lrf_extr)) ;
   result->lgmd_extricate_stats(stats(lgmd_extr)) ;
   result->total_extricate_stats(stats(total_extr)) ;
   result->lgmd_success_stats(stats(lgmd_success)) ;
   result->extricate_success_stats(stats(extr_success)) ;
   result->duration_stats(stats(durations)) ;
}

// This function walks through the list of directories specified on the
// command line and processes the datasets in each of them one-by-one.
void process_metlogs(const DirList& dirs)
{
   std::for_each(dirs.begin(), dirs.end(), process_experiment) ;
}

} // end of local anonymous namespace encapsulating above helpers

//------------------------------- MAIN ----------------------------------

int main(int argc, char* argv[])
{
   int ret = 0 ;
   try
   {
      CmdLine args = parse(argc, argv) ;
      if (args.second.empty())
         throw lobot::misc_error(lobot::MISSING_CMDLINE_ARGS) ;

      load_config_file(args.first) ;
      process_metlogs(args.second) ;
   }
   catch (lobot::uhoh& e)
   {
      std::cerr << e.what() << '\n' ;
      ret = e.code() ;
   }
   catch (std::exception& e)
   {
      std::cerr << e.what() << '\n' ;
      ret = 127 ;
   }
   catch(...)
   {
      std::cerr << "unknown exception\n" ;
      ret = 255 ;
   }
   return ret ;
}

//-----------------------------------------------------------------------

#endif // library checks

/* So things look consistent in everyone's emacs... */
/* Local Variables: */
/* indent-tabs-mode: nil */
/* End: */