Frequently Asked Questions
- General
- What's the right way to report bugs and ask
questions on the mailing lists?
- Where can I find binary packages?
- Which versions of Player and Stage are compatible?
- What's the story of Player's creation?
- What other information about the Player/Stage Project is available online?
- What is a "local" or "user" installation?
- Why are Player and Stage (etc) so named?
- How can I make movies from screenshots?
- Player
- What is Player?
- How is Player different from other robot
interfaces?
- Where does Player run?
- What hardware and software does Player
support?
- How do I get/build/install Player?
- How do I cross-compile Player (e.g., for the iPAQ)?
- When I try to connect to Player, I get
"connection refused."
- How do I add a device driver to Player?
- When I run Player (possibly under Stage), it
exits with the message "unknown host; probably should quit." What's the
deal?
- I have a syntax error involving PKG_CHECK_MODULES. What's the fix?
- How can I get high data rates from my SICK
LMS laser rangefinder?
- How do I connect my (Sony or Canon) PTZ camera
to a standard serial port (instead of, e.g., the AUX port on my Pioneer)?
- How can I read image data from a camera?
- What is the purpose of the key in a
provides or requires field (e.g., "odometry" in
"odometry::position:0")?
- What is the difference between Player
and Stage and Gazebo? What is the difference between Player device
drivers and simulated device models in Stage or Gazebo?
- Stage
- What is Stage?
- How is Stage different from other robot
simulators?
- Where does Stage run?
- How do I get/build/install Stage?
- How can I make movies of my Stage simulations?
- What does it mean when Stage says "error
executing player"?
- Stage is running but there's no GUI!
- How do I add an object to Stage 1.3 and
later?
- Where is the "sonar" model in Stage?
- Gazebo
- What is Gazebo?
- Where does Gazebo run?
- How do I get/build/install Gazebo?
- How do I build Gazebo on OS X?
- How can I make movies of Gazebo?
- How can I read image data from a camera?
- librtk
- What is librtk?
- Where does librtk run?
- How do I get/build/install librtk?
- How do I dis/enable movie-making support in
librtk?
- Developers
- How do I get the latest code?
- How do I build from CVS?
-
-
Nobody actually asks this question,
but many should. Below are some tips for reporting
bugs and asking
questions. The idea is to use these resources as efficiently as possible
(specifically, to save time for the good people who will look into your
bug or answer your question). Please read this; if you don't, then in
answer to your query you'll likely be directed back here.
Before posting anything:
- First, check the bug list.
Known bugs, often with patches or workarounds, are generally found there.
If you have something to add to an existing bug, add it as a comment to
the bug, rather than posting to the mailing lists.
Be sure to check for bugs with Any status (i.e., not just Open); you're
likely to find your fix in a Closed bug.
- Next, check the mailing list archives.
Yes, I know that the SF.net archive interface is atrocious, but that's all
there is, and it is functional. Search the archives to see if someone else
has asked your question or reported your bug.
If you do post something:
- Don't contact the developers/maintainers directly.
Direct correspondence is not archived or otherwise publicly available,
which means that the larger user/developer community can't benefit from
your question or the ensuing answer(s). Also, by contacting a developer
directly, you're asking one person, whereas if you post to a
mailing list, you're asking hundreds of people. Open Source
development works best when the entire community participates in
discussions and helps to answer questions.
To be clear, send all questions to the appropriate mailing
list, and report all bugs to the bug
tracker.
- Be as specific as possible.
Describe exactly what you were
doing or are trying to do, and exactly what, if anything, went wrong.
If you say, "The camera in Gazebo is broken," we can't help you.
- Always provide the following information:
- Names and versions of our packages that you're using.
For
example, "I'm using Gazebo 0.5 with Player 1.6, as well as pmap 0.0.0."
- Your platform (architecture, OS & version/distro).
For
example, "I'm running OS X 10.3 on an iBook," or
"I'm running RedHat Fedora Core 3 on an x86, with kernel 2.6.12."
For Linux, always provide the distro and kernel versions.
- Any warnings or errors. Cut and paste them directly from the
terminal window to which they were printed. DO NOT re-type them yourself.
If you don't run a web browser or mail client in the same windowing
session as the program that generates the output, then save it to a file
and copy the file somewhere from which you can include or attach it.
When discussing any compiling/linking/installation issues, also
provide:
As appropriate, also include your:
- Player .cfg file
- Stage/Gazebo .world file
- Don't send your question more than once.
We all heard you the first time, and if you didn't get a response then
likely nobody has had time to answer you. Alternatively, it could be that
nobody knows the answer, or that nobody wants to help you. In any case,
sending it again is poor form and will only aggravate everybody else.
And don't tell us about your homework/project/paper deadlines; we
don't care.
If your mail client is broken in such a way that it sends multiple copies
when you hit "Send," then either fix your mail client or get a new one.
The world is full of mail clients, many of which work great. Life is
too short for broken mail software.
-
We don't maintain binary packages, but some users do. Look on the pre-compiled
packages page.
-
TODO: insert compatibility table here
-
The story of the Player/Stage project is at Dylan Shell's Robotics Wiki.
-
Some online sources in rough order of usefulness:
-
Some developers (myself included) prefer to install applications in
our user directory (e.g., /home/[username]/local) rather than in a
system directory; this avoids screwing up other users of the machine
if you have some funky experimental code you want to work with. It's
also useful if you don't have root access.
Naturally, local installs can make it a bit tricky for the various
packages to find the right headers, libs and so on. Historically, we
have worked around this by specifying "--with-foo=path" arguments to
the configure scripts; unfortunately, this method is fraught with
danger and is now being phased out. Here, then, is the recommended
way to do it:
- Pick a spot for "local" installs; for me it is "/home/ahoward/local".
The install scripts will create relevant subdirs under this, such as:
/home/ahoward/local/bin
/home/ahoward/local/include
/home/ahoward/local/lib
- Set up the necessary compiler paths in your .bashrc (or whatever) script; e.g.:
$ export PATH=~/local/bin:$PATH
$ export CPATH=~/local/include:$CPATH
$ export LIBRARY_PATH=~/local/lib:$LIBRARY_PATH
The first line sets the executable path; the second sets the path for
C and C++ header files; the third line sets the library search path.
- Set up some additional paths in your .bashrc (or whatever)
$ export PKG_CONFIG_PATH=~/local/lib/pkgconfig:$PKG_CONFIG_PATH
$ export PYTHONPATH=~/local/lib/python2.2/site-packages:$PYTHONPATH
The first line sets the pkg-config path (for applications using
pkg-config, which will be everything in the P/S/G project pretty
soon); the second line is for Python modules (e.g. Python bindings to
the libplayerc client lib).
- Check to see if you have set CFLAGS or LDFLAGS:
$ env | grep CFLAGS
$ env | grep LDFLAGS
These variables can change how the compiler and linker behave.
If either one is set, and if it points to a place where there is another
installation of Player/Stage/Gazebo/librtk (e.g.,
LDFLAGS="-L/usr/local/lib", CFLAGS="-I/usr/local/include"),
then you must unset it:
$ unset LDFLAGS
$ unset CFLAGS
Note that if these variables are set to something that doesn't change the
compiler/linker paths (e.g., CFLAGS="-g -Wall") then you can leave them
alone.
- Build applications using the "--prefix" argument; e.g.;
./configure --prefix=/home/ahoward/local
make install
Everything should now build seamlessly without any additional
frigging around, and your locally installed packages will be used in
preference to any system-wide installs.
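The CFLAGS/LDFLAGS check from the steps above can be scripted. This is just a sketch: the check_flag helper is made up for illustration, and /usr/local is only the most common location of a conflicting install.

```shell
# Hypothetical helper: flag CFLAGS/LDFLAGS values that point into another
# install tree (here /usr/local, a common spot for a second install).
check_flag() {
  case "$2" in
    *-I/usr/local*|*-L/usr/local*) echo "$1: points at /usr/local -- unset it" ;;
    "") echo "$1: not set" ;;
    *) echo "$1: set, but harmless" ;;
  esac
}

check_flag CFLAGS  "${CFLAGS-}"
check_flag LDFLAGS "${LDFLAGS-}"
```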
-
See Shakespeare's 'Seven Ages of Man' speech.
-
Many P/S/G programs (including playerv, playernav, stage, and gazebo)
can dump screenshots, and you might want to assemble these screenshots
into a movie, for example to include in a presentation. Unfortunately,
there is no good (universal) method for animating frames into a movie
that will play on all platforms. Some known methods, all of
which have pros and cons:
- On Linux, use mencoder (comes with mplayer). Works great, but the
movies it makes generally don't run on Windows machines (some kind of
DIVX problem). Sometimes Windows Media Player will play these movies,
but Powerpoint won't let you embed them in a slide (maddening, isn't it?).
Encoding with MPEG1 does work, but it looks terrible.
- On Windows, there is a nice freeware binary called BMP2AVI (google
it) that does the trick. Simple, but pretty effective.
- On Windows/OS X, you can pay $30 for the full version of QuickTime,
and use that to make your movies. You can generally tweak it so that the
movies play on all platforms (QuickTime on Windows and MPlayer on Linux).
- xvidcap:
Captures snapshots or movies of areas of the screen.
- wink:
Input formats: Capture screenshots from your PC, or use images in
BMP/JPG/PNG/TIFF/GIF formats.
Output formats: Macromedia Flash, Standalone EXE, PDF, PostScript,
HTML or any of the above image formats. Use Flash/html for the web,
EXE for distributing to PC users and PDF for printable manuals.
Please let us know if you can come up with a better solution.
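As a concrete example of the mencoder route above, a command along these lines assembles numbered PNG screenshots into an MPEG-1 movie (the most portable of mencoder's output formats). The frame directory, frame rate, and output name here are all examples:

```shell
# Assemble frames/*.png into an MPEG-1 movie at 10 fps.
# All paths and option values here are examples -- adjust to taste.
mencoder "mf://frames/*.png" -mf fps=10:type=png \
  -ovc lavc -lavcopts vcodec=mpeg1video \
  -o movie.mpg
```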
-
-
Player is a device server that provides a powerful, flexible interface to
a variety of sensors and actuators (e.g., robots). Because Player uses a
TCP socket-based client/server model, robot control programs can be written
in any programming language and can execute on any computer with network
connectivity to the robot. In addition, Player supports multiple concurrent
client connections to devices, creating new possibilities for distributed
and collaborative sensing and control.
-
Previous work in the area of robot programming interfaces has focused primarily
on providing a development environment that suits a particular control
philosophy. While such tools are very useful, we believe that implementing
them at such a low level imposes unnecessary restrictions on the programmer,
who should have the choice to build any kind of control system while still
enjoying device abstraction and encapsulation.
Thus in Player we make a clear distinction between the programming
interface and the control structure, opting for a maximally general
programming interface, with the belief that users will develop their own
tools for building control systems. Further, most robot interfaces confine
the programmer to a single language, providing a (generally closed-source)
language-specific library to which the user must link his programs.
In contrast, the TCP socket abstraction of Player allows for the use of
virtually any programming language. In this way, it is much more "minimal"
than other robot interfaces.
-
Player runs on pretty much any POSIX platform, including embedded systems
(Player has been cross-compiled to run on several ARM- and PPC-based
Linux systems). Specifically, Player's requirements are:
- POSIX development environment, with threads (pthreads)
- TCP stack
- a compiler with both C and C++ (we have only tested gcc, but other
compilers may work)
- a bash shell, to run the configure script; this implies
that Player will not build natively on Windows, though some users have it
running under Cygwin, and there are rumors of MinGW builds as well.
-
Look here
-
We don't make binary distributions, though some
users do. To get the source for Player, go to the
download
page. Then see the installation instructions.
-
Read this tutorial.
-
That's usually because either Player isn't running or you're connecting to
the wrong port. To check whether Player is running and to verify which
port(s) it is listening on, use netstat. On Linux, the following should
help (arguments will be different on other platforms):
$ netstat --inet --tcp -lp
You should see a list of all processes currently listening on TCP ports; look
for player.
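If netstat isn't convenient, bash can probe a port directly via its /dev/tcp feature. The helper below is a sketch (the function name is made up here; 6665 is Player's default port):

```shell
# Print "yes" if something accepts TCP connections at host:port.
# Uses bash's built-in /dev/tcp redirection (bash-only, not plain sh).
player_listening() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null && echo yes || echo no
}

player_listening localhost 6665   # Player's default port
```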
-
Read this tutorial.
-
(This seems to occur mostly on OS X)
Add an entry to your /etc/hosts for your machine's name. For
example, if your machine is called foobar:
127.0.0.1 localhost foobar
There's probably already a line for 127.0.0.1 (known as the "loopback
address"); you can just append your hostname to the end of that line.
-
If you get a syntax error involving PKG_CHECK_MODULES, it is likely
that aclocal can't find the pkg.m4 file, which defines this
macro. This is common on OS X with Fink, as the pkg-config package
puts this file in /sw/share/aclocal, while the standard OS X aclocal
program is looking in /usr/share/aclocal. Unfortunately, there is no
reliable search path mechanism for aclocal, so the best fix is just to
copy (or symlink) /sw/share/aclocal/pkg.m4 to /usr/share/aclocal. [this FAQ taken from the Autonomy Lab's P/S Wiki].
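For reference, the copy/symlink fix described above comes down to a single command. The paths below are the Fink default (/sw/share/aclocal) and the standard OS X location (/usr/share/aclocal) mentioned above; verify them on your machine first:

```shell
# Make the system aclocal find Fink's pkg.m4 (paths are the defaults
# described above; you will likely need root for /usr/share).
sudo ln -s /sw/share/aclocal/pkg.m4 /usr/share/aclocal/pkg.m4
```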
-
It's possible to get scans at 75Hz from the SICK LMS, if you use an
RS422 connection, which is a high-speed serial line. If you use a
run-of-the-mill RS232 connection, the best you can expect is about 10Hz,
depending on angular aperture and resolution.
Look
here
for one way to use a USB-RS422 converter to get high-speed laser scans.
For a detailed explanation of how the LMS works, timing considerations,
and what data rates can be expected, look
here.
More info and tips on using Player to get
high-speed laser scans can be found here.
-
ActivMedia robots that are equipped with a PTZ camera often have the camera
connected to the AUX port on the P2OS board in the robot. Player does not
support control of the camera through this connection, for reasons explained
here.
Instead, Player requires a standard, direct, serial line to the camera.
Documentation about the Sony PTZ units is available
here. In particular, page 15 of this manual has a wiring diagram.
Here are some detailed wiring instructions, courtesy of Jason L. Bryant
at the Navy Center for
Applied Research in Artificial Intelligence:
Instructions for rewiring a Pioneer robot so that the PTZ camera
device can be connected to a serial port (ttyS1) on the on-board computer
rather than to the robot's microcontroller.
Purchase a VISCA-to-DB9 conversion cable (item # 0002V448 on-line), as
well as a length of 20-wire ribbon cable (our cable is about 18
inches long). You will also need a 20-pin header connector.
Attach the 20-pin header to one end of the ribbon, taking note of the
location of pin 1 on both the ribbon and the header connector. At the
other end of the cable, split the ribbon into two 10-pin sections. Cut
about 1 inch off of the last pin from each section (pins 10 and 20) so
that you now have two 9-pin cable ends. Now attach two DB-9 serial
connectors (MALE) to the ends, being sure that pins 1 and 11 go into
the proper slots of the connectors. The serial connection with pin 1
will eventually go to the serial port on the microcontroller, and the
other connection will hook to the VISCA-to-DB9 conversion cable.
Remove the top plate and nose from your Pioneer robot. Next, locate
and remove the 20-pin header with a 9-wire rainbow-colored ribbon from
the serial port on the on-board computer. This header connects to
serial ports ttyS0 and ttyS1; however, in the default Pioneer
configuration, port ttyS1 is unused. The other end of this ribbon
connects to the serial port on the microcontroller (look in your
Pioneer manual for the location of this port, or just follow the
cable).
Now place the 20-pin header of the cable you just made into the now
free serial ports on the computer. Snake the wires under the robot's
control panel and to the back section of the chassis. Connect the
serial connection from ttyS0 (the one with pins 1-9) to
the now free serial port on the microcontroller. Connect the other
serial connection (pins 11-19) to the female DB-9 connector on the
VISCA-to-DB9 conversion cable and snake the rest of this cable up and
outside the robot cover. Replace the nose and top cover of your
robot. Once you connect the other end of the VISCA cable to the
camera, you will have a working PTZ camera on port /dev/ttyS1.
You can test that the connections work by running
/usr/local/Aria/bin/demo on the robot, selecting 'C' for camera
control, then the appropriate key for your particular camera (Sony or
Canon) connected to a serial port ('@' for a Canon), and finally '2'
for serial port /dev/ttyS1.
-
There are several options for accessing image data from a camera in Player:
- Write a (plugin) Player driver which reads the data directly from the
camera (through the camera interface).
- Use the socket interface to return the image data to the client side.
- Use an external streaming system, like Quicktime RTSP, gstreamer,
VideoLAN or OpenH323.
The first is the recommended way of accessing the
camera. By building a driver in Player, the need to transmit camera
information via a network is minimized. The Player server can process
the image, extract whatever information you require, and return that
if necessary. That is how the blobfinder and cmvision "virtual
sensors" work. For custom vision processing algorithms (that do not
belong in the Player source tree), users can create "plugin" drivers.
Raw image data can be read on the client side using an appropriate
proxy (e.g., CameraProxy in the C++ client, or
playerc_camera_t in the C client). Be aware that this
option will severely increase network traffic.
Setting up an external streaming server allows you to access the
"live" video feed using many other popular programs. Since the data is
not travelling via Player, there is less impact on the performance of
Player. Also, streaming servers typically compress the images before
sending, reducing the network load somewhat. That said, there are no
samples in Player/Stage to show you how to do this, as it is
completely outside of the project.
Searching the mailing lists for "camera" will bring up most of the previous
discussions of this matter.
Links:
http://developer.apple.com/darwin/projects/streaming/
http://gstreamer.freedesktop.org/
http://www.videolan.org/
http://www.openh323.org/
-
Look here.
-
The Player server provides an abstract interface to robotic devices,
including mobile robot bases, sensors, etc. Player communicates with these
specific devices using device drivers, but provides standard device
interfaces to its clients. For example, Player may use SICK LMS-200 and
Pioneer drivers, but simply provides "laser", "position" ("position" is a
movable mobile robot base), "sonar", and other interfaces to clients. This
allows clients to be portable to other robots. The same client could work
with a Player server running the "rflex" robot driver on, say, an RWI B21R,
or even a Roomba with a Hokuyo URG laser, or custom drivers you write for
your own robot and devices. Player drivers are kept as plug-in modules. The
Player config file determines which drivers to load and any parameters they
may require.
Stage and Gazebo, on the other hand, do not simulate specific devices (such
as a SICK LMS-200 or Pioneer robot); instead they have configurable
abstract device models (such as "laser" and "position"). These models are
defined and configured in the world file. Often a model is configured to
imitate a specific device -- for example, a "laser" model with a 180-degree
field of view and matching angular resolution approximates a typically
configured SICK LMS-200 -- but Player device drivers are never used with
Stage or Gazebo. Currently, Stage and Gazebo models are built-in; a new
model (for a completely new kind of device) can be added by modifying
Stage. To use Stage with a Player server instead of real devices, you
load the stage driver into Player, which then can provide any of a number
of standard interfaces ("position", "laser", etc.).
Stage and Gazebo are different simulators. They have different
configuration (world) files and implement different (though overlapping)
sets of simulated devices in different ways. See below for discussion of
the functional differences between Stage and Gazebo.
You can think of Stage, Player, and Player's clients as three separate
layers.
+--------------+
| Stage Driver |
+--------------+       +--------+       +-----------------------+
       OR        <---> | Player | <---> | Client (e.g. PlayerV) |
+---------------+      +--------+       +-----------------------+
| Gazebo Driver |
+---------------+
       OR
+---------------+
| Robot Drivers |
+---------------+
| Device Driver |
+---------------+
|      ...      |
+---------------+
Since Player provides standard interfaces to the client, the client
operates the same no matter what driver is loaded into Player, so
long as those drivers provide the required device interfaces.
-
-
Stage is a scalable multiple-robot simulator; it simulates a population of
mobile robots moving in and sensing a two-dimensional bitmapped environment,
controlled through Player. Stage provides virtual Player robots which
interact with simulated rather than physical devices. Various sensor models
are provided, including sonar, a scanning laser rangefinder, a pan-tilt-zoom
camera with color blob detection, and odometry.
-
-
-
We don't make binary distributions, though some
users do. To get the source for Stage, go to the
download
page. After unpacking the tarball, the following steps will suffice for
most users:
$ ./configure
$ make
$ make install
Stage will be installed by default in /usr/local/ (Note
the change; older versions were installed in $HOME). To select a
different location, pass --prefix={path} to configure.
Note that before building Stage, you need to install Player. If you select a
non-default install location for Player (and it's different from the
--prefix that you use with Stage), you must tell Stage by
passing --with-player={path} to configure.
Stage can be built either with or without its GUI frontend. The GUI is
built on librtk, which in turn depends on GTK.
To build the GUI, you must first install
librtk. If you select a non-default install location for librtk
(and it's different from the --prefix that you use with
Stage), you must tell Stage by passing --with-rtk={path}
to configure.
Note that it is possible to build Stage without any GUI support (e.g.,
if you don't have GTK), but it will be less fun to play with.
configure also accepts other options. To see a list, try:
$ configure --help
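Putting the options above together, a build of Stage against a Player and librtk installed under a non-default prefix might look like this (all paths are examples):

```shell
# Example: local install of Stage against Player/librtk in ~/local.
./configure --prefix=$HOME/local \
            --with-player=$HOME/local \
            --with-rtk=$HOME/local
make
make install
```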
-
For Stage 1.6 and up, use the File:Export menu in the GUI to dump
screenshots, then animate the screenshots.
For Stage 1.3.x, the following information applies:
librtk can now create movies directly, using the
ffmpeg library. If you
are using Stage with the librtk front-end, you can start/stop movie capture
by using the File/Capture Movie menu option. Options are provided
for both real-time and time-lapse capture.
Be aware of the following issues when exporting movies:
- Movie capture is both CPU- and IO-intensive, and is likely to slow
down Stage significantly. This may impact your control programs,
leading to unexpected robot behavior.
- For fast movie capture, keep the Stage window small (less than
640x480) and run X locally (i.e., don't run across the network).
- librtk captures MPEG video using the MPEG1VIDEO codec. This is an
old and somewhat clunky codec, but it has the advantage of being highly
portable (the movies should play in both QuickTime and Windows Media
Player). For playing movies under Linux, I recommend MPlayer.
-
This usually means that Stage was unable to execute the Player server
because it's not in your PATH. You must have the directory containing the
player server binary (e.g., $HOME/player-1.3/bin) in your PATH.
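For example, with Player installed under $HOME/player-1.3 (an example path), you would add its bin directory to your PATH and confirm the binary is found:

```shell
# Example only: adjust the path to wherever your player binary lives.
export PATH=$HOME/player-1.3/bin:$PATH
which player    # should print the full path if PATH is set correctly
```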
-
Well, that's not actually a question, but the problem is likely that you
didn't compile any GUI support into Stage. Read this.
-
(contributed by Alex Makarenko)
- write the code: myobject.hh/cc
- see a comment in file src/library.hh
- start with an existing object, e.g. src/models/puck.hh/cc
- add two lines in file src/library.cc
- #include models/mydevice.hh
- add an item to libitem_t library_items[] = {...}
- add files to the project and compile
- copy myobject.hh/cc to src/models directory
- add them to libstgmodels_a_SOURCES list in file src/models/Makefile.am
- make sure the files are actually added to the project, this
may require some coercing, such as:
- run autoconf (or possibly autoreconf ?)
- run automake
- run ./configure
- compile
-
Stage does not have a "sonar" model, but it does have a "ranger"
model. This model can be used to simulate any sensor which consists
of a group of individual range-sensing devices. Note, however, that
Stage's "ranger" simulates only a single "beam", rather than the
(potentially computationally expensive) cone shape of most sonar
sensors.
-
-
Gazebo is a 3D, dynamic, multi-robot simulator. Whereas Stage is intended to
simulate the behavior of very large populations of robots with moderate fidelity,
Gazebo simulates the behavior of small populations of robots (less than 10) with
high fidelity. Read the Gazebo page for more
information.
-
-
We don't make binary distributions, though some
users do. To get the source for Gazebo, go to the
download
page. After unpacking the tarball, the following steps will suffice for
most users:
$ ./configure
$ make
$ make install
Gazebo will be installed by default in $HOME/gazebo-{version}. To
select a different location, pass --prefix={path} to
configure.
configure also accepts other options. To see a list, try:
$ configure --help
Note that, unlike Stage, you should install Gazebo before
installing Player. When Player builds, it will look for an installed
version of Gazebo, and will only build the Gazebo drivers if it finds
one. If you install Gazebo in a non-standard location, you must tell Player
by passing --with-gazebo={path} to configure.
-
Directions for building Gazebo on OS X can be found in the Gazebo
manual, available from the documentation
page.
-
Gazebo will not make movies directly, but can be instructed to export
still frames, which you can then animate.
In versions up to and including 0.3.0, click on the window you wish
to export, then press the 'W' key; frames are saved in PPM format in
a directory named "frames-". Note that saving frames will
significantly affect Gazebo's performance.
-
For libgazebo users, raw image data is available through
the gz_camera_t interface.
For Player users, see the FAQ entry on reading
camera data; from Player's perspective, Gazebo cameras work just
like real cameras (which means you can develop image processing
algorithms using Gazebo-simulated images).
-
NOTE: librtk has been deprecated. As of Stage 1.6 and Player 1.6.3,
librtk is no longer used by any P/S/G package. You do NOT need to download
and install it.
-
librtk is a GTK-based graphics toolkit designed to help in building
robotics-related GUIs (RTK stands for Robot ToolKit). librtk was written by
Andrew Howard.
-
librtk should build anywhere that GTK is installed. However, movie-making
(i.e., ffmpeg) support will not necessarily build everywhere. See
below.
-
We don't make binary distributions. To get the source for librtk, go to the
download
page. After unpacking the tarball, the following steps will suffice for
most users:
$ ./configure
$ make
$ make install
librtk will be installed by default in /usr/local/ (Note
the change; older versions were installed in $HOME). To select a
different location, pass --prefix={path} to configure.
-
Included with librtk is the ffmpeg
library, which can be used to produce MPEG animations. By default, ffmpeg
support will be compiled into librtk. However, ffmpeg does not build
everywhere (notably, it seems to fail on Sparc/Solaris). If during the build,
ffmpeg/configure fails, then you should disable ffmpeg support.
To do so, pass --disable-libavcodec to configure before
building librtk.
-
-
All the code for the project is maintained in a CVS repository at SourceForge.
An excellent source of CVS documentation (besides the CVS
manual) is
here.
Project-specific instructions for CVS access, both anonymous and read/write,
are here.
We keep our code organized into CVS modules, and that is how you should
access it. You should not check out directories directly, because
you will bypass any module dependencies that we have set up. The following
modules will likely be of greatest interest to you:
-
Since we're using the GNU Autotools, it's a little different to build from CVS
instead of from a distribution. First, you need autoconf and
automake installed. They are already installed on any reasonable
UNIX-like machine, but you might need to upgrade them; you can download both
packages from any GNU mirror. We're currently using:
- autoconf 2.53
- automake 1.6
Newer versions will probably work, but older ones probably won't. If you do
use newer versions, keep in mind that you should not use any macros
that aren't available in the versions listed above, because that will likely
break the build for other developers.
Building librtk, player, and stage from CVS involves the same steps:
- autoreconf [-i -s]
OR
./bootstrap
- ./configure [options]
- make
- make install (optional)
The autoreconf tool runs the right Autotools in the right order
to generate necessary files, including a configure script.
You only need to supply the -i -s arguments the first time you
use autoreconf on a checked out copy. If autoreconf doesn't
work for you (older versions were pretty buggy), then you can run the
bootstrap script instead, which does the same thing.
You usually only need to run autoreconf when some part of the
build system, such as configure.in or acinclude.m4, has
changed; at other times, you can just run configure, or even just
make. However, it's safest to run autoreconf whenever
you update from CVS, in case something important changed.
The exact dependencies among the various files and tools are of course
deterministic but extremely complex and it's best not to think about them.
One more thing: since we're using automake, we don't write
Makefiles. Instead, we write Makefile.ams
(automake files), which are like
meta-Makefiles. Except in special cases, Makefiles (and
Makefile.ins) are auto-generated and should not be
checked in.