Syntax highlighting in the docs

This adds syntax highlighting to the Markdown files.
Fixes #33741
Guinness 2020-07-13 11:16:51 +02:00
parent 8ebbf62940
commit b119c55776
13 changed files with 491 additions and 327 deletions


@ -18,20 +18,20 @@ The quick and dirty plan is to:
## Clone and compile tor
```console
$ git clone https://git.torproject.org/tor.git
$ cd tor
$ git checkout tor-0.4.1.5
```
Above we use the tag for tor-0.4.1.5 where the circuit padding framework was
released. Note that this version of the framework is missing many features and
fixes that have since been merged to origin/master. If you need the newest
framework features, you should use master instead.
```console
$ sh autogen.sh
$ ./configure
$ make
```
When you run `./configure` you'll be told of missing dependencies and packages
to install on debian-based distributions. Important: if you plan to run `tor` on


@ -118,8 +118,9 @@ instance of the feature (--reverse).
For example, for #30224, we wanted to know when the bridge-distribution-request
feature was introduced into Tor:
```console
$ git log -S bridge-distribution-request --reverse
commit ebab521525
Author: Roger Dingledine <arma@torproject.org>
Date: Sun Nov 13 02:39:16 2016 -0500

$ git describe --contains ebab521525
tor-0.3.2.3-alpha~15^2~4
```
If you need to know all the Tor versions that contain a commit, use:
```console
$ git tag --contains 9f2efd02a1 | sort -V
tor-0.2.5.16
tor-0.2.8.17
tor-0.3.1.9
tor-0.3.1.10
...
```
If at all possible, try to create the changes file in the same commit where
you are making the change. Please give it a distinctive name that no other
@ -438,8 +443,10 @@ use `tor_assert_nonfatal()` in place of `tor_assert()`. If you'd like to
write a conditional that incorporates a nonfatal assertion, use the `BUG()`
macro, as in:
```c
if (BUG(ptr == NULL))
  return -1;
```
## Allocator conventions
@ -451,14 +458,17 @@ Also, a type named `abc_t` should be freed by a function named `abc_free_()`.
Don't call this `abc_free_()` function directly -- instead, wrap it in a
macro called `abc_free()`, using the `FREE_AND_NULL` macro:
```c
void abc_free_(abc_t *obj);
#define abc_free(obj) FREE_AND_NULL(abc_t, abc_free_, (obj))
```
This macro will free the underlying `abc_t` object, and will also set
the object pointer to NULL.
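As a rough illustration, a `FREE_AND_NULL`-style macro can be sketched like this (a simplified stand-in; the real definition in Tor's headers may differ):

```c
#include <assert.h>
#include <stdlib.h>

/* Simplified sketch of a FREE_AND_NULL-style macro; Tor's real definition
 * may differ. The `type` argument is unused in this toy version. */
#define FREE_AND_NULL(type, free_fn, ptr) \
  do {                                    \
    free_fn(ptr);                         \
    (ptr) = NULL;                         \
  } while (0)

typedef struct abc_t {
  int value;
} abc_t;

/* Free function that accepts NULL, per the convention described here. */
static void
abc_free_(abc_t *obj)
{
  if (!obj)
    return;
  free(obj);
}

#define abc_free(obj) FREE_AND_NULL(abc_t, abc_free_, (obj))
```

After `abc_free(x)`, `x` compares equal to NULL, so freeing it again is a harmless no-op.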
You should define all `abc_free_()` functions to accept NULL inputs:
```c
void
abc_free_(abc_t *obj)
{
  if (!obj)
    return;
  thing_free(obj->thing);
  tor_free(obj);
}
```
If you need a free function that takes a `void *` argument (for example,
to use it as a function callback), define it with a name like
`abc_free_void()`:
```c
static void
abc_free_void_(void *obj)
{
  abc_free_(obj);
}
```
When deallocating, don't say e.g. `if (x) tor_free(x)`. The convention is to
have deallocators do nothing when a NULL pointer is passed.
@ -488,6 +501,7 @@ Say what functions do as a series of one or more imperative sentences, as
though you were telling somebody how to be the function. In other words, DO
NOT say:
```c
/** The strtol function parses a number.
 *
 * nptr -- the string to parse. It can include whitespace.
 * endptr -- if non-NULL, a pointer to the first character after the
 *    number is stored here.
 * base -- the radix to parse in.
 *
 * returns: the resulting number.
 */
long strtol(const char *nptr, char **endptr, int base);
```
Instead, please DO say:
```c
/** Parse a number in radix <b>base</b> from the string <b>nptr</b>,
* and return the result. Skip all leading whitespace. If
* <b>endptr</b> is not NULL, set *<b>endptr</b> to the first character
* after the number parsed.
**/
long strtol(const char *nptr, char **endptr, int base);
```
Doxygen comments are the contract in our abstraction-by-contract world: if
the functions that call your function rely on it doing something, then your


@ -22,20 +22,26 @@ For example, in a hypothetical `tor_addition` Rust module:
In `src/rust/tor_addition/addition.rs`:
```rust
pub fn get_sum(a: i32, b: i32) -> i32 {
    a + b
}
```
In `src/rust/tor_addition/lib.rs`:
```rust
pub use addition::*;
```
In `src/rust/tor_addition/ffi.rs`:
```rust
#[no_mangle]
pub extern "C" fn tor_get_sum(a: c_int, b: c_int) -> c_int {
    get_sum(a, b)
}
```
If your Rust code must call out to parts of Tor's C code, you must
declare the functions you are calling in the `external` crate, located
@ -129,6 +135,7 @@ crate. Unittests SHOULD go into their own module inside the module
they are testing, e.g. in `src/rust/tor_addition/addition.rs` you
should put:
```rust
#[cfg(test)]
mod test {
    use super::*;

    #[test]
    fn test_get_sum() {
        let sum = get_sum(2, 3);
        assert_eq!(sum, 5);
    }
}
```
## Benchmarking
@ -151,13 +159,17 @@ benchmarks in the following manner.
If you wish to benchmark some of your Rust code, you MUST put the
following in the `[features]` section of your crate's `Cargo.toml`:
```toml
[features]
bench = []
```
Next, in your crate's `lib.rs` you MUST put:
```rust
#[cfg(all(test, feature = "bench"))]
extern crate test;
```
This ensures that the external crate `test`, which contains utilities
for basic benchmarks, is only used when running benchmarks via `cargo
@ -166,6 +178,7 @@ bench --features bench`.
Finally, to write your benchmark code, in
`src/rust/tor_addition/addition.rs` you SHOULD put:
```rust
#[cfg(all(test, feature = "bench"))]
mod bench {
    use test::Bencher;
    use super::*;

    #[bench]
    fn bench_get_sum(b: &mut Bencher) {
        b.iter(|| get_sum(5i32, 0i32));
    }
}
```
## Fuzzing
@ -247,30 +261,37 @@ Here are some additional bits of advice and rules:
potential error with the eel operator (`?`) or another non-panicking way.
For example, consider a function which parses a string into an integer:
```rust
fn parse_port_number(config_string: &str) -> u16 {
    u16::from_str_radix(config_string, 10).unwrap()
}
```
There are numerous ways this can fail, and the `unwrap()` will cause the
whole program to byte the dust! Instead, either you SHOULD use `ok()`
(or another equivalent function which will return an `Option` or a `Result`)
and change the return type to be compatible:
```rust
fn parse_port_number(config_string: &str) -> Option<u16> {
    u16::from_str_radix(config_string, 10).ok()
}
```
or you SHOULD use `or()` (or another similar method):
```rust
fn parse_port_number(config_string: &str) -> Result<u16, &'static str> {
    u16::from_str_radix(config_string, 10).or(Err("Couldn't parse port into a u16"))
}
```
Using methods like `or()` can be particularly handy when you must do
something afterwards with the data, for example, if we wanted to guarantee
that the port is high. Combining these methods with the eel operator (`?`)
makes this even easier:
```rust
fn parse_port_number(config_string: &str) -> Result<u16, &'static str> {
    let port = u16::from_str_radix(config_string, 10).or(Err("Couldn't parse port into a u16"))?;

    if port > 1024 {
        return Ok(port);
    } else {
        return Err("Low ports not allowed");
    }
}
```
2. `unsafe`
@ -292,6 +314,7 @@ Here are some additional bits of advice and rules:
When creating an FFI in Rust for C code to call, it is NOT REQUIRED
to declare the entire function `unsafe`. For example, rather than doing:
```rust
#[no_mangle]
pub unsafe extern "C" fn increment_and_combine_numbers(mut numbers: [u8; 4]) -> u32 {
    for number in &mut numbers {
        *number += 1;
    }
    std::mem::transmute::<[u8; 4], u32>(numbers)
}
```
You SHOULD instead do:
```rust
#[no_mangle]
pub extern "C" fn increment_and_combine_numbers(mut numbers: [u8; 4]) -> u32 {
    for index in 0..numbers.len() {
        numbers[index] += 1;
    }
    unsafe {
        std::mem::transmute::<[u8; 4], u32>(numbers)
    }
}
```
3. Pass only C-compatible primitive types and bytes over the boundary
@ -385,15 +411,18 @@ Here are some additional bits of advice and rules:
rather than using an untyped mapping between strings and integers
like so:
```rust
use std::collections::HashMap;

pub fn get_elements_with_over_9000_points(map: &HashMap<String, usize>) -> Vec<String> {
    ...
}
```
It would be safer to define a new type, such that some other usage
of `HashMap<String, usize>` cannot be confused for this type:
```rust
pub struct DragonBallZPowers(pub HashMap<String, usize>);

impl DragonBallZPowers {
    pub fn over_nine_thousand(&self) -> HashMap<&String, &usize> {
        let mut powerful_enough = HashMap::new();
        for (character, power) in &self.0 {
            if *power > 9000 {
                powerful_enough.insert(character, power);
            }
        }
        powerful_enough
    }
}
```
Note that the following code, which uses Rust's type aliasing, is valid
but it does NOT meet the desired type safety goals:
```rust
pub type Power = usize;

pub fn over_nine_thousand(power: &Power) -> bool {
    *power > 9000
}

// We can still do the following:
let his_power: usize = 9001;
over_nine_thousand(&his_power);
```
7. Unsafe mucking around with lifetimes
@ -431,6 +463,7 @@ Here are some additional bits of advice and rules:
family of types, individual lifetimes can be treated as types. For example,
one can arbitrarily extend and shorten lifetime using `std::mem::transmute`:
```rust
struct R<'a>(&'a i32);

unsafe fn extend_lifetime<'b>(r: R<'b>) -> R<'static> {
    std::mem::transmute::<R<'b>, R<'static>>(r)
}

unsafe fn shorten_invariant_lifetime<'b, 'c>(r: &'b mut R<'static>) -> &'b mut R<'c> {
    std::mem::transmute::<&'b mut R<'static>, &'b mut R<'c>>(r)
}
```
Calling `extend_lifetime()` would cause an `R` passed into it to live forever
for the life of the program (the `'static` lifetime). Similarly,
@ -460,12 +494,14 @@ Here are some additional bits of advice and rules:
For example, `std::mem::transmute` can be abused in ways where casting with
`as` would be both simpler and safer:
```rust
// Don't do this
let ptr = &0;
let ptr_num_transmute = unsafe { std::mem::transmute::<&i32, usize>(ptr) };

// Use an `as` cast instead
let ptr_num_cast = ptr as *const i32 as usize;
```
In fact, using `std::mem::transmute` for *any* reason is a code smell and as
such SHOULD be avoided.
@ -475,8 +511,10 @@ Here are some additional bits of advice and rules:
This is generally fine to do, but it has some behaviours which you should be
aware of. Casting down chops off the high bits, e.g.:
```rust
let x: u32 = 4294967295;
println!("{}", x as u16); // prints 65535
```
Some cases which you MUST NOT do include:
@ -487,14 +525,17 @@ Here are some additional bits of advice and rules:
* Casting between integers and floats when the thing being cast
cannot fit into the type it is being casted into, e.g.:
```rust
println!("{}", 42949.0f32 as u8); // prints 197 in debug mode and 0 in release
println!("{}", 1.04E+17 as u8); // prints 0 in both modes
println!("{}", (0.0/0.0) as i64); // prints whatever the heck LLVM wants
```
Because this behaviour is undefined, it can even produce segfaults in
safe Rust code. For example, the following program built in release
mode segfaults:
```rust
#[inline(never)]
pub fn trigger_ub(sl: &[u8; 666]) -> &[u8] {
    // Note that the float is out of the range of `usize`, invoking UB when casting.
    let idx = 1e99999f64 as usize;
    &sl[idx..]
}

fn main() {
    println!("{}", trigger_ub(&[1; 666])[999999]); // ~ out of bound
}
```
And in debug mode panics with:


@ -6,7 +6,10 @@ Check out fuzzing-corpora, and set TOR_FUZZ_CORPORA to point to the place
where you checked it out.
To run the fuzzing test cases in a deterministic fashion, use:
```console
$ make test-fuzz-corpora
```
This won't actually fuzz Tor! It will just run all the fuzz binaries
on our existing set of testcases for the fuzzer.
@ -58,11 +61,13 @@ machine you care about, anyway.
To Build:
Get AFL from http://lcamtuf.coredump.cx/afl/ and unpack it
```console
$ cd afl
$ make
$ cd ../tor
$ PATH=$PATH:../afl/ CC="../afl/afl-gcc" ./configure --enable-expensive-hardening
$ AFL_HARDEN=1 make clean fuzzers
```
To Find The ASAN Memory Limit: (64-bit only)
@ -75,10 +80,12 @@ Read afl/docs/notes_for_asan.txt for more details.
Download recidivm from http://jwilk.net/software/recidivm
Download the signature
Check the signature
```console
$ tar xvzf recidivm*.tar.gz
$ cd recidivm*
$ make
$ /path/to/recidivm -v src/test/fuzz/fuzz-http
```
Use the final "ok" figure as the input to -m when calling afl-fuzz
(Normally, recidivm would output a figure automatically, but in some cases,
the fuzzing harness will hang when the memory limit is too small.)
@ -88,9 +95,11 @@ don't care about memory limits.
To Run:
```console
$ mkdir -p src/test/fuzz/fuzz_http_findings
$ ../afl/afl-fuzz -i ${TOR_FUZZ_CORPORA}/http -o src/test/fuzz/fuzz_http_findings -m <asan-memory-limit> -- src/test/fuzz/fuzz-http
```
AFL has a multi-core mode, check the documentation for details.
You might find the included fuzz-multi.sh script useful for this.
@ -109,7 +118,10 @@ valid inputs may take a second or so, particularly with the fuzzer and
sanitizers enabled.
To see what fuzz-http is doing with a test case, call it like this:
```console
$ src/test/fuzz/fuzz-http --debug < /path/to/test.case
```
(Logging is disabled while fuzzing to increase fuzzing speed.)


@ -37,13 +37,17 @@ Once you've reached this point, here's what you need to know.
We keep our source under version control in Git. To get the latest
version, run:
```console
$ git clone https://git.torproject.org/git/tor
```
This will give you a checkout of the master branch. If you're
going to fix a bug that appears in a stable version, check out the
appropriate "maint" branch, as in:
```console
$ git checkout maint-0.4.3
```
2. Find your way around the source.


@ -54,7 +54,9 @@ fetching dependencies from Cargo or specifying a local directory.
**Fetch dependencies from Cargo**
```console
$ ./configure --enable-rust --enable-cargo-online-mode
```
**Using a local dependency cache**
@ -66,13 +68,17 @@ We vendor our Rust dependencies in a separate repo using
[cargo-vendor](https://github.com/alexcrichton/cargo-vendor). To use
them, do:
```console
$ git submodule init
$ git submodule update
```
To specify the local directory containing the dependencies (assuming you are
in the top level of the repository), configure tor with:
```console
$ TOR_RUST_DEPENDENCIES='path_to_dependencies_directory' ./configure --enable-rust
```
(Note that `TOR_RUST_DEPENDENCIES` must be the full path to the directory; it
cannot be relative.)
@ -80,7 +86,9 @@ cannot be relative.)
Assuming you used the above `git submodule` commands and you're in the
topmost directory of the repository, this would be:
```console
$ TOR_RUST_DEPENDENCIES=`pwd`/src/ext/rust/crates ./configure --enable-rust
```
## Identifying which modules to rewrite
@ -102,10 +110,12 @@ areas of responsibility.
A good first step is to build a module-level callgraph to understand how
interconnected your target module is.
```console
$ git clone https://git.torproject.org/user/nickm/calltool.git
$ cd tor
$ CFLAGS=0 ./configure
$ ../calltool/src/main.py module_callgraph
```
The output will tell you each module name, along with a set of every module that
the module calls. Modules which call fewer other modules are better targets.
@ -156,15 +166,21 @@ run on your crate.
Configure Tor's build system to build with Rust enabled:
```console
$ ./configure --enable-fatal-warnings --enable-rust --enable-cargo-online-mode
```
Tor's tests should be run by doing:
```console
$ make check
```
Tor's integration tests should also pass:
```console
$ make test-stem
```
## Submitting a patch


@ -43,7 +43,9 @@ Builds should show up on the web at jenkins.torproject.org and on IRC at
## Valgrind
```console
$ valgrind --leak-check=yes --error-limit=no --show-reachable=yes src/app/tor
```
(Note that if you get a zillion openssl warnings, you will also need to
pass `--undef-value-errors=no` to valgrind, or rebuild your openssl
@ -77,10 +79,12 @@ we wish to permit are also documented in the blacklist file.
Lcov is a utility that generates pretty HTML reports of test code coverage.
To generate such a report:
```console
$ ./configure --enable-coverage
$ make
$ make coverage-html
$ $BROWSER ./coverage_html/index.html
```
This will run the tor unit test suite `./src/test/test` and generate the HTML
coverage code report under the directory `./coverage_html/`. To change the
@ -93,36 +97,48 @@ investigated (as of July 2014).
To quickly run all the tests distributed with Tor:
```console
$ make check
```
To run the fast unit tests only:
```console
$ make test
```
To selectively run just some tests (the following can be combined
arbitrarily):
```console
$ ./src/test/test <name_of_test> [<name of test 2>] ...
$ ./src/test/test <prefix_of_name_of_test>.. [<prefix_of_name_of_test2>..] ...
$ ./src/test/test :<name_of_excluded_test> [:<name_of_excluded_test2>] ...
```
To run all tests, including those based on Stem or Chutney:
```console
$ make test-full
```
To run all tests, including those based on Stem or Chutney that require a
working connection to the internet:
```console
$ make test-full-online
```
## Running gcov for unit test coverage
```console
$ ./configure --enable-coverage
$ make
$ make check
$ # or: make test-full, or make test-full-online
$ mkdir coverage-output
$ ./scripts/test/coverage coverage-output
```
(On OSX, you'll need to start with `--enable-coverage CC=clang`.)
@ -145,7 +161,9 @@ you can run `make reset-gcov` to clear the intermediary gcov output.
If you have two different `coverage-output` directories, and you want to see
a meaningful diff between them, you can run:
```console
$ ./scripts/test/cov-diff coverage-output1 coverage-output2 | less
```
In this diff, any lines that were visited at least once will have coverage "1",
and line numbers are deleted. This lets you inspect what you (probably) really
@ -313,12 +331,16 @@ that you're using the emacs-specific version of `etags` (bundled under the
If you're using vim or emacs, you can also use Universal Ctags to build a tag
file using the syntax:
```console
$ ctags -R -D 'MOCK_IMPL(r,h,a)=r h a' .
```
If you're using an older version of Universal Ctags, you can use the following
instead:
```console
$ ctags -R --mline-regex-c='/MOCK_IMPL\([^,]+,\W*([a-zA-Z0-9_]+)\W*,/\1/f/{mgroup=1}' .
```
A vim-compatible tag file will be generated by default. If you use emacs, add
the `-e` flag to generate an emacs-compatible tag file.
@ -330,16 +352,19 @@ source code. Here's how to use it:
1. Begin every file that should be documented with
```
/**
* \file filename.c
* \brief Short description of the file.
*/
```
(Doxygen will recognize any comment beginning with /** as special.)
2. Before any function, structure, #define, or variable you want to
document, add a comment of the form:
```
/** Describe the function's actions in imperative sentences.
*
* Use blank lines for paragraph breaks
@ -355,22 +380,27 @@ source code. Here's how to use it:
* between_code_and_endcode_commands();
* \endcode
*/
```
3. Make sure to escape the characters `<`, `>`, `\`, `%` and `#` as `\<`,
`\>`, `\\`, `\%` and `\#`.
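For instance, a comment obeying rule 3 might look like this (the function is a made-up illustration, not Tor code):

```c
#include <assert.h>

/** Return nonzero if \<a\> \> \<b\>, where both values are percentages
 * (\%) such as the \#defined limits in the headers. */
static int
percent_greater(int a, int b)
{
  return a > b;
}
```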
4. To document structure members, you can use two forms:
```c
struct foo {
/** You can put the comment before an element; */
int a;
int b; /**< Or use the less-than symbol to put the comment
* after the element. */
};
```
5. To generate documentation from the Tor source code, type:
```console
$ doxygen -g
```
to generate a file called `Doxyfile`. Edit that file and run
`doxygen` to generate the API documentation.


@ -70,7 +70,7 @@ There are a couple of "rules" you want to follow:
base. Every entry point should have a second definition if the module is
disabled. For instance:
```c
#ifdef HAVE_MODULE_DIRAUTH
int sr_init(int save_to_disk);
@ -109,7 +109,9 @@ There are a couple of "rules" you want to follow:
* When you include headers from the module, **always** use the full module
path in your statement. Example:
```c
#include "feature/dirauth/dirvote.h"
```
The main reason is that we do **not** add the module include path by default
so it needs to be specified. But also, it helps our human brain understand


@ -32,7 +32,9 @@ For an explanation of how to change Tor's design to work differently, look at
For the latest version of the code, get a copy of git, and
```console
$ git clone https://git.torproject.org/git/tor
```
## Stay in touch


@ -38,9 +38,9 @@ new Tor release:
3. Run checks that aren't covered above, including:
* `clang scan-build`. (See the script in ./scripts/test/scan_build.sh)
* `make test-network` and `make test-network-all` (with
  `--enable-fragile-hardening`)
* Running Tor yourself and making sure that it actually works for you.
@ -57,8 +57,7 @@ new Tor release:
of them and reordering to focus on what users and funders would find
interesting and understandable.
To do this, run `./scripts/maint/sortChanges.py changes/* > changelog.in`
to combine headings and sort the entries. Copy the changelog.in file
into the ChangeLog. Run 'format_changelog.py' (see below) to clean
up the line breaks.
@ -164,9 +163,11 @@ new Tor release:
1. Sign the tarball, then sign and push the git tag:
```console
$ gpg -ba <the_tarball>
$ git tag -s tor-0.4.x.y-<status>
$ git push origin tag tor-0.4.x.y-<status>
```
(You must do this before you update the website: the website scripts
rely on finding the version by tag.)


@ -28,7 +28,7 @@ Tracing is separated in two different concepts. The tracing API and the
tracing probes.
The API is in `src/lib/trace/` which defines how to call tracepoints in the
tor code. Every C file should include `src/lib/trace/events.h` if it wants
to call a tracepoint.
The probes are what actually record the tracepoint data. Because they often
@ -43,7 +43,9 @@ subsystem and an event name.
A trace event in tor has the following standard format:
```c
tor_trace(subsystem, event_name, args...);
```
The `subsystem` parameter is the name of the subsystem the trace event is in.
For example, that could be "scheduler" or "vote" or "hs". The idea is to add
@ -57,7 +59,9 @@ The `args` can be any number of arguments we want to collect.
Here is an example of a possible tracepoint in main():
```c
tor_trace(main, init_phase, argc);
```
The above is a tracepoint in the `main` subsystem with `init_phase` as the
event name and the `int argc` is passed to the event as one argument.
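To make the mechanics concrete, here is a rough sketch of how Debug-style instrumentation could record such a tracepoint; the recording function and buffers are assumptions for illustration, not Tor's implementation:

```c
#include <assert.h>
#include <string.h>

static char last_subsystem[32];
static char last_event[32];
static int trace_hits = 0;

/* Record which tracepoint was hit; stands in for emitting a debug log line. */
static void
trace_record(const char *subsystem, const char *event_name)
{
  strncpy(last_subsystem, subsystem, sizeof(last_subsystem) - 1);
  strncpy(last_event, event_name, sizeof(last_event) - 1);
  ++trace_hits;
}

/* The subsystem and event name are stringified; extra arguments are
 * accepted but ignored, as Debug instrumentation does. */
#define tor_trace(subsystem, event_name, ...) \
  trace_record(#subsystem, #event_name)
```

With this sketch, `tor_trace(main, init_phase, argc)` records the subsystem `"main"` and event `"init_phase"` without evaluating the argument types.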
@ -80,7 +84,9 @@ Currently, we have 3 types of possible instrumentation:
arguments will be passed on because we don't know their type nor the string
format of the debug log. The output is standardized like this:
```
[debug] __FUNC__: Tracepoint <event_name> from subsystem <subsystem> hit.
```
2. USDT
@ -125,12 +131,16 @@ They can all be used together or independently. If one of them is set,
This is pretty easy. Let's say you want to add a trace event in
`src/feature/rend/rendcache.c`, you first need to include this file:
```c
#include "lib/trace/events.h"
```
Then, the `tor_trace()` macro can be used with the specific format detailed
before in a previous section. As an example:
```c
tor_trace(hs, store_desc_as_client, desc, desc_id);
```
For `Debug` instrumentation, you have nothing else to do.


@ -107,7 +107,9 @@ covered or uncovered.
To count new or modified uncovered lines in D2, you can run:
```console
$ ./scripts/test/cov-diff ${D1} ${D2} | grep '^+ *\#' | wc -l
```
## Marking lines as unreachable by tests
@ -163,6 +165,7 @@ I use the term "unit test" and "regression tests" very sloppily here.
Here's an example of a test function for a simple function in util.c:
```c
static void
test_util_writepid(void *arg)
{
  /* ... */
 done:
  tor_free(contents);
}
```
This should look pretty familiar to you if you've read the tinytest
manual. One thing to note here is that we use the testing-specific
@ -214,10 +218,12 @@ macro-protected declaration of the function in the module's header.
For example, `crypto_curve25519.h` contains:
```c
#ifdef CRYPTO_CURVE25519_PRIVATE
STATIC int curve25519_impl(uint8_t *output, const uint8_t *secret,
                           const uint8_t *basepoint);
#endif
```
The `crypto_curve25519.c` file and the `test_crypto.c` file both define
`CRYPTO_CURVE25519_PRIVATE`, so they can see this declaration.
@ -231,6 +237,7 @@ the test _really tests_ the code.
For example, here is a _bad_ test for the unlink() function (which is
supposed to remove a file).
```c
static void
test_unlink_badly(void *arg)
{
  /* ... */
 done:
  tor_free(contents);
}
```
This test might get very high coverage on unlink(). So why is it a
bad test? Because it doesn't check that unlink() *actually removes the
@ -273,20 +280,25 @@ To write tests for this case, you can replace the underlying functions
with testing stubs while your unit test is running. You need to declare
the underlying function as 'mockable', as follows:
```c
MOCK_DECL(returntype, functionname, (argument list));
```
and then later implement it as:
```c
MOCK_IMPL(returntype, functionname, (argument list))
{
/* implementation here */
}
```
For example, if you had a 'connect to remote server' function, you could
declare it as:
```c
MOCK_DECL(int, connect_to_remote, (const char *name, status_t *status));
```
When you declare a function this way, it will be declared as normal in
regular builds, but when the module is built for testing, it is declared
@ -295,11 +307,15 @@ as a function pointer initialized to the actual implementation.
In your tests, if you want to override the function with a temporary
replacement, you say:
```c
MOCK(functionname, replacement_function_name);
```
And later, you can restore the original function with:
```c
UNMOCK(functionname);
```
For more information, see the definitions of this mocking logic in
`testsupport.h`.
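As a rough sketch of how function-pointer-based mocking like this can work (illustrative macros only; see `testsupport.h` for Tor's actual definitions):

```c
#include <assert.h>

/* In a test build, a mockable function becomes a function pointer that
 * initially points at the real implementation. These macro definitions
 * are illustrative, not Tor's. */
#define MOCK_DECL(rv, name, args) \
  rv name##__real args;           \
  extern rv (*name) args
#define MOCK_IMPL(rv, name, args) \
  rv (*name) args = name##__real; \
  rv name##__real args
#define MOCK(name, replacement) (name) = (replacement)
#define UNMOCK(name) (name) = (name##__real)

MOCK_DECL(int, add_one, (int x));

MOCK_IMPL(int, add_one, (int x))
{
  return x + 1;
}

/* A temporary replacement a test might swap in. */
static int
fake_add_one(int x)
{
  (void)x;
  return 42;
}
```

Calls through `add_one` go to the real implementation until `MOCK()` swaps the pointer, and `UNMOCK()` restores it.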
@ -324,11 +340,13 @@ cases and failure cases.
For example, consider testing this function:
```c
/** Remove all elements E from sl such that E==element. Preserve
* the order of any elements before E, but elements after E can be
* rearranged.
*/
void smartlist_remove(smartlist_t *sl, const void *element);
```
In order to test it well, you should write tests for at least all of the
following cases. (These would be black-box tests, since we're only looking
@ -355,6 +373,7 @@ When you consider edge cases, you might try:
Now let's look at the implementation:
```c
void
smartlist_remove(smartlist_t *sl, const void *element)
{
  int i;
  if (element == NULL)
    return;
  for (i=0; i < sl->num_used; i++)
    if (sl->list[i] == element) {
      sl->list[i] = sl->list[--sl->num_used]; /* swap with the end */
      sl->list[sl->num_used] = NULL;
    }
}
```
Based on the implementation, we now see three more edge cases to test:


@ -29,7 +29,9 @@ the Android Software Development Kit (SDK) and Native Development Kit
3. Install the Android Package you generated in step 1:
```console
$ adb install /path/to/your/app-fullperm-debug.apk
```
4. Check on your device that the newly installed Orbot actually works
and behaves in the way you expect it to.
@ -76,10 +78,12 @@ was spent on the call.
To access binaries, `torrc` files, and other useful information on
the device do the following:
```console
$ adb shell
(device):/ $ run-as org.torproject.android
(device):/data/data/org.torproject.android $ ls
app_bin app_data cache databases files lib shared_prefs
```
Descriptors, control authentication cookie, state, and other files can be
found in the `app_data` directory. The `torrc` can be found in the `app_bin/`
@ -88,10 +92,14 @@ was spent on the call.
- You can enable logging in Tor via the syslog (or android) log
mechanism with:
```console
$ adb shell
(device):/ $ run-as org.torproject.android
(device):/data/data/org.torproject.android $ echo -e "\nLog info syslog" >> app_bin/torrc
```
Start Tor the normal way via Orbot and collect the logs from your computer using:
```console
$ adb logcat
```