Add serve_header.py for rapid testing on Compiler Explorer (#3456)

pull/3463/head
Florian Albrechtskirchinger 4 months ago committed by GitHub
parent b21c345179
commit 0c698b75cc
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
  1. 14   .gitignore
  2. 7    Makefile
  3. 91   tools/serve_header/README.md
  4. BIN  tools/serve_header/demo.png
  5. 2    tools/serve_header/requirements.txt
  6. 410  tools/serve_header/serve_header.py
  7. 15   tools/serve_header/serve_header.yml.example

.gitignore vendored (14)

@ -10,7 +10,14 @@
/.idea
/cmake-build-*
/.vs
/.vs/
/.vscode/
# clangd cache
/.cache/
# build directories (vscode-cmake-tools, user-defined, ...)
/build*/
/docs/mkdocs/docs/examples/
/docs/mkdocs/docs/__pycache__/
@ -19,3 +26,8 @@
/docs/docset/JSON_for_Modern_C++.docset/
/docs/docset/JSON_for_Modern_C++.tgz
/docs/mkdocs/docs/images/json.gif
# serve_header
/serve_header.yml
/localhost.pem
/localhost-key.pem

@ -251,3 +251,10 @@ update_hedley:
	$(SED) -i '1s/^/#pragma once\n\n/' include/nlohmann/thirdparty/hedley/hedley.hpp
	$(SED) -i '1s/^/#pragma once\n\n/' include/nlohmann/thirdparty/hedley/hedley_undef.hpp
	$(MAKE) amalgamate

##########################################################################
# serve_header.py
##########################################################################
serve_header:
	./tools/serve_header/serve_header.py --make $(MAKE)

@ -0,0 +1,91 @@
serve_header.py
===============
Serves the `single_include/nlohmann/json.hpp` header file over HTTP(S).
The header file is automatically amalgamated on demand.
![serve_header.py demo](demo.png)
## Prerequisites
1. Make sure these Python packages are installed.
```
PyYAML
watchdog
```
(see `tools/serve_header/requirements.txt`)
2. To serve the header over HTTPS (which is required by Compiler Explorer at this time), a certificate is needed.
The recommended method for creating a locally-trusted certificate is to use [`mkcert`](https://github.com/FiloSottile/mkcert).
- Install the `mkcert` certificate authority into your trust store(s):
```
$ mkcert -install
```
- Create a certificate for `localhost`:
```
$ mkcert localhost
```
The command will create two files, `localhost.pem` and `localhost-key.pem`, in the current working directory. It is recommended to create them in the top level or project root directory.
## Usage
`serve_header.py` has a built-in default configuration that serves the `single_include/nlohmann/json.hpp` header file relative to the top level or project root directory in which it resides.
The built-in configuration expects the certificate `localhost.pem` and the private key `localhost-key.pem` to be located in the top level or project root directory.
To start serving the `json.hpp` header file at `https://localhost:8443/json.hpp`, run this command from the top level or project root directory:
```
$ make serve_header
```
Open [Compiler Explorer](https://godbolt.org/) and try it out:
```cpp
#include <https://localhost:8443/json.hpp>
using namespace nlohmann;

#include <iostream>

int main() {
    // these macros are dynamically injected into the header file
    std::cout << JSON_BUILD_TIME << " (" << JSON_BUILD_COUNT << ")\n";
    return 0;
}
```
> `serve_header.py` dynamically injects the macros `JSON_BUILD_COUNT` and `JSON_BUILD_TIME` into the served header file. By comparing build count or time output from the compiled program with the output from `serve_header.py`, one can be reasonably sure the compiled code uses the expected revision of the header file.
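The injection happens while the header is streamed to the client. A minimal sketch of the same transformation (mirroring the `copyfile()` logic in `serve_header.py`, with a shortened stand-in for the real header):

```python
import re

# same pattern serve_header.py uses to locate the version define
JSON_VERSION_RE = re.compile(r'\s*#\s*define\s+NLOHMANN_JSON_VERSION_MAJOR\s+')

def inject_macros(header_text, build_count, build_time):
    # insert the two macros directly above the version define,
    # as copyfile() does while streaming the file
    out = []
    injected = False
    for line in header_text.splitlines(keepends=True):
        if not injected and JSON_VERSION_RE.match(line):
            out.append(f'#define JSON_BUILD_COUNT {build_count}\n')
            out.append(f'#define JSON_BUILD_TIME "{build_time}"\n\n')
            injected = True
        out.append(line)
    return ''.join(out)

# shortened stand-in for single_include/nlohmann/json.hpp
header = '#pragma once\n#define NLOHMANN_JSON_VERSION_MAJOR 3\n'
print(inject_macros(header, 7, '2022-01-01 12:00:00'))
```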
## Configuration
`serve_header.py` will try to read a configuration file `serve_header.yml` in the top level or project root directory, and will fall back on built-in defaults if the file cannot be read.
An annotated example configuration can be found in `tools/serve_header/serve_header.yml.example`.
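The configurable keys and their built-in defaults can be summarized as a sketch of the `config.get()` lookups performed by `serve_header.py`:

```python
# sketch: the config.get() fallbacks used throughout serve_header.py,
# i.e., the effective built-in defaults
config = {}  # a missing or unreadable serve_header.yml behaves like an empty mapping

root = config.get('root', '.')    # web server root, relative to the project root
bind = config.get('bind', None)   # None binds to all interfaces
port = config.get('port', 8443)

https = config.get('https', {})
enabled = https.get('enabled', True)
cert_file = https.get('cert_file', 'localhost.pem')
key_file = https.get('key_file', 'localhost-key.pem')

print(root, port, enabled, cert_file, key_file)
```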
## Serving `json.hpp` from multiple project directory instances or working trees
`serve_header.py` was designed with the goal of supporting multiple project roots or working trees at the same time.
The recommended directory structure is shown below but `serve_header.py` can work with other structures as well, including a nested hierarchy.
```
json/          ⮜ the parent or web server root directory
├── develop/   ⮜ the main git checkout
│   └── ...
├── feature1/
│   └── ...        any number of additional
├── feature2/      working trees (e.g., created
│   └── ...        with git worktree)
└── feature3/
    └── ...
```
To serve the header of each working tree at `https://localhost:8443/<worktree>/json.hpp`, a configuration file is needed.
1. Create the file `serve_header.yml` in the top level or project root directory of any working tree:
```yaml
root: ..
```
Shifting the web server root directory up one level makes `serve_header.py` serve the `single_include/nlohmann/json.hpp` header of each sibling directory or working tree.
2. Start `serve_header.py` by running this command from the same top level or project root directory the configuration file is located in:
```
$ make serve_header
```
`serve_header.py` will automatically detect the addition or removal of working trees anywhere within the configured web server root directory.
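The automatic detection relies on a simple directory test; a minimal sketch (slightly simplified from `is_project_root()` in `serve_header.py`, using `isdir` instead of `exists` for the include directories):

```python
import os
import tempfile

MAKEFILE = 'Makefile'
INCLUDE = 'include/nlohmann/'
SINGLE_INCLUDE = 'single_include/nlohmann/'

def is_project_root(test_dir='.'):
    # a directory counts as a working tree when it contains the Makefile
    # and both include directories
    return (os.path.isfile(os.path.join(test_dir, MAKEFILE))
            and os.path.isdir(os.path.join(test_dir, INCLUDE))
            and os.path.isdir(os.path.join(test_dir, SINGLE_INCLUDE)))

with tempfile.TemporaryDirectory() as tree:
    print(is_project_root(tree))  # False: an empty directory is not a working tree
    os.makedirs(os.path.join(tree, INCLUDE))
    os.makedirs(os.path.join(tree, SINGLE_INCLUDE))
    open(os.path.join(tree, MAKEFILE), 'w').close()
    print(is_project_root(tree))  # True
```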

tools/serve_header/demo.png — binary image (544 KiB), not shown

@ -0,0 +1,2 @@
PyYAML==6.0
watchdog==2.1.7

@ -0,0 +1,410 @@
#!/usr/bin/env python3

import contextlib
import logging
import os
import re
import shutil
import subprocess
import sys
from datetime import datetime, timedelta
from http import HTTPStatus
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler
from io import BytesIO
from threading import Lock, Timer

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

CONFIG_FILE = 'serve_header.yml'
MAKEFILE = 'Makefile'
INCLUDE = 'include/nlohmann/'
SINGLE_INCLUDE = 'single_include/nlohmann/'
HEADER = 'json.hpp'

DATETIME_FORMAT = '%Y-%m-%d %H:%M:%S'

JSON_VERSION_RE = re.compile(r'\s*#\s*define\s+NLOHMANN_JSON_VERSION_MAJOR\s+')
class ExitHandler(logging.StreamHandler):
    def __init__(self, level):
        """."""
        super().__init__()
        self.level = level

    def emit(self, record):
        if record.levelno >= self.level:
            sys.exit(1)

def is_project_root(test_dir='.'):
    makefile = os.path.join(test_dir, MAKEFILE)
    include = os.path.join(test_dir, INCLUDE)
    single_include = os.path.join(test_dir, SINGLE_INCLUDE)
    return (os.path.exists(makefile)
            and os.path.isfile(makefile)
            and os.path.exists(include)
            and os.path.exists(single_include))
class DirectoryEventBucket:
    def __init__(self, callback, delay=1.2, threshold=0.8):
        """."""
        self.delay = delay
        self.threshold = timedelta(seconds=threshold)
        self.callback = callback
        self.event_dirs = set()
        self.timer = None
        self.lock = Lock()

    def start_timer(self):
        if self.timer is None:
            self.timer = Timer(self.delay, self.process_dirs)
            self.timer.start()

    def process_dirs(self):
        result_dirs = []
        event_dirs = set()
        with self.lock:
            self.timer = None
            while self.event_dirs:
                time, event_dir = self.event_dirs.pop()
                delta = datetime.now() - time
                if delta < self.threshold:
                    event_dirs.add((time, event_dir))
                else:
                    result_dirs.append(event_dir)
            self.event_dirs = event_dirs

            if result_dirs:
                self.callback(os.path.commonpath(result_dirs))
            if self.event_dirs:
                self.start_timer()

    def add_dir(self, path):
        with self.lock:
            # add path to the set of event_dirs if it is not a sibling of
            # a directory already in the set
            if not any(os.path.commonpath([path, event_dir]) == event_dir
                       for (_, event_dir) in self.event_dirs):
                self.event_dirs.add((datetime.now(), path))
                if self.timer is None:
                    self.start_timer()
class WorkTree:
    make_command = 'make'

    def __init__(self, root_dir, tree_dir):
        """."""
        self.root_dir = root_dir
        self.tree_dir = tree_dir
        self.rel_dir = os.path.relpath(tree_dir, root_dir)
        self.name = os.path.basename(tree_dir)
        self.include_dir = os.path.abspath(os.path.join(tree_dir, INCLUDE))
        self.header = os.path.abspath(os.path.join(tree_dir, SINGLE_INCLUDE, HEADER))
        self.rel_header = os.path.relpath(self.header, root_dir)
        self.dirty = True
        self.build_count = 0
        t = os.path.getmtime(self.header)
        t = datetime.fromtimestamp(t)
        self.build_time = t.strftime(DATETIME_FORMAT)

    def __hash__(self):
        """."""
        return hash(self.tree_dir)

    def __eq__(self, other):
        """."""
        if not isinstance(other, type(self)):
            return NotImplemented
        return self.tree_dir == other.tree_dir

    def update_dirty(self, path):
        if self.dirty:
            return
        path = os.path.abspath(path)
        if os.path.commonpath([path, self.include_dir]) == self.include_dir:
            logging.info(f'{self.name}: working tree marked dirty')
            self.dirty = True

    def amalgamate_header(self):
        if not self.dirty:
            return
        mtime = os.path.getmtime(self.header)
        subprocess.run([WorkTree.make_command, 'amalgamate'], cwd=self.tree_dir,
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        if mtime == os.path.getmtime(self.header):
            logging.info(f'{self.name}: no changes')
        else:
            self.build_count += 1
            self.build_time = datetime.now().strftime(DATETIME_FORMAT)
            logging.info(f'{self.name}: header amalgamated (build count {self.build_count})')
        self.dirty = False
class WorkTrees(FileSystemEventHandler):
    def __init__(self, root_dir):
        """."""
        super().__init__()
        self.root_dir = root_dir
        self.trees = set()
        self.tree_lock = Lock()
        self.scan(root_dir)
        self.created_bucket = DirectoryEventBucket(self.scan)
        self.observer = Observer()
        self.observer.schedule(self, root_dir, recursive=True)
        self.observer.start()

    def scan(self, base_dir):
        scan_dirs = set([base_dir])
        # recursively scan base_dir for working trees
        while scan_dirs:
            scan_dir = os.path.abspath(scan_dirs.pop())
            self.scan_tree(scan_dir)
            try:
                with os.scandir(scan_dir) as dir_it:
                    for entry in dir_it:
                        if entry.is_dir():
                            scan_dirs.add(entry.path)
            except FileNotFoundError as e:
                logging.debug('path disappeared: %s', e)

    def scan_tree(self, scan_dir):
        if not is_project_root(scan_dir):
            return

        # skip source trees in build directories
        # this check could be enhanced
        if scan_dir.endswith('/_deps/json-src'):
            return

        tree = WorkTree(self.root_dir, scan_dir)
        with self.tree_lock:
            if tree not in self.trees:
                if tree.name == tree.rel_dir:
                    logging.info(f'adding working tree {tree.name}')
                else:
                    logging.info(f'adding working tree {tree.name} at {tree.rel_dir}')
                url = os.path.join('/', tree.rel_dir, HEADER)
                logging.info(f'{tree.name}: serving header at {url}')
                self.trees.add(tree)

    def rescan(self, path=None):
        if path is not None:
            path = os.path.abspath(path)
        trees = set()
        # check if any working trees have been removed
        with self.tree_lock:
            while self.trees:
                tree = self.trees.pop()
                if ((path is None
                     or os.path.commonpath([path, tree.tree_dir]) == tree.tree_dir)
                        and not is_project_root(tree.tree_dir)):
                    if tree.name == tree.rel_dir:
                        logging.info(f'removing working tree {tree.name}')
                    else:
                        logging.info(f'removing working tree {tree.name} at {tree.rel_dir}')
                else:
                    trees.add(tree)
            self.trees = trees

    def find(self, path):
        # find the working tree for a given header file path
        path = os.path.abspath(path)
        with self.tree_lock:
            for tree in self.trees:
                if path == tree.header:
                    return tree
        return None

    def on_any_event(self, event):
        logging.debug('%s (is_dir=%s): %s', event.event_type,
                      event.is_directory, event.src_path)
        path = os.path.abspath(event.src_path)
        if event.is_directory:
            if event.event_type == 'created':
                # check for new working trees
                self.created_bucket.add_dir(path)
            elif event.event_type == 'deleted':
                # check for deleted working trees
                self.rescan(path)
        elif event.event_type == 'closed':
            with self.tree_lock:
                for tree in self.trees:
                    tree.update_dirty(path)

    def stop(self):
        self.observer.stop()
        self.observer.join()
class HeaderRequestHandler(SimpleHTTPRequestHandler):
    def __init__(self, request, client_address, server):
        """."""
        self.worktrees = server.worktrees
        self.worktree = None
        try:
            super().__init__(request, client_address, server,
                             directory=server.worktrees.root_dir)
        except ConnectionResetError:
            logging.debug('connection reset by peer')

    def translate_path(self, path):
        path = os.path.abspath(super().translate_path(path))

        # add single_include/nlohmann into the path, if needed
        header = os.path.join('/', HEADER)
        header_path = os.path.join('/', SINGLE_INCLUDE, HEADER)
        if (path.endswith(header)
                and not path.endswith(header_path)):
            path = os.path.join(os.path.dirname(path), SINGLE_INCLUDE, HEADER)

        return path

    def send_head(self):
        # check if the translated path matches a working tree
        # and fulfill the request; otherwise, send 404
        path = self.translate_path(self.path)
        self.worktree = self.worktrees.find(path)
        if self.worktree is not None:
            self.worktree.amalgamate_header()
            logging.info(f'{self.worktree.name}: serving header (build count {self.worktree.build_count})')
            return super().send_head()
        logging.info(f'invalid request path: {self.path}')
        super().send_error(HTTPStatus.NOT_FOUND, 'Not Found')
        return None

    def send_header(self, keyword, value):
        # intercept the Content-Length header; it is sent in copyfile() later
        if keyword == 'Content-Length':
            return
        super().send_header(keyword, value)

    def end_headers(self):
        # intercept; called in copyfile() or indirectly
        # by send_head() via super().send_error()
        pass

    def copyfile(self, source, outputfile):
        injected = False
        content = BytesIO()
        length = 0
        # inject build count and time into the served header
        for line in source:
            line = line.decode('utf-8')
            if not injected and JSON_VERSION_RE.match(line):
                length += content.write(bytes('#define JSON_BUILD_COUNT '
                                              f'{self.worktree.build_count}\n', 'utf-8'))
                length += content.write(bytes('#define JSON_BUILD_TIME '
                                              f'"{self.worktree.build_time}"\n\n', 'utf-8'))
                injected = True
            length += content.write(bytes(line, 'utf-8'))

        # set the content length
        super().send_header('Content-Length', length)
        # CORS header
        self.send_header('Access-Control-Allow-Origin', '*')
        # prevent caching
        self.send_header('Cache-Control', 'no-cache, no-store, must-revalidate')
        self.send_header('Pragma', 'no-cache')
        self.send_header('Expires', '0')
        super().end_headers()

        # send the header
        content.seek(0)
        shutil.copyfileobj(content, outputfile)

    def log_message(self, format, *args):
        pass
class DualStackServer(ThreadingHTTPServer):
    def __init__(self, addr, worktrees):
        """."""
        self.worktrees = worktrees
        super().__init__(addr, HeaderRequestHandler)

    def server_bind(self):
        # suppress exception when protocol is IPv4
        with contextlib.suppress(Exception):
            self.socket.setsockopt(
                socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
        return super().server_bind()

if __name__ == '__main__':
    import argparse
    import ssl
    import socket
    import yaml

    # exit code
    ec = 0

    # setup logging
    logging.basicConfig(format='[%(asctime)s] %(levelname)s: %(message)s',
                        datefmt=DATETIME_FORMAT, level=logging.INFO)
    log = logging.getLogger()
    log.addHandler(ExitHandler(logging.ERROR))

    # parse command line arguments
    parser = argparse.ArgumentParser()
    parser.add_argument('--make', default='make',
                        help='the make command (default: make)')
    args = parser.parse_args()

    # propagate the make command to use for amalgamating headers
    WorkTree.make_command = args.make

    worktrees = None
    try:
        # change the working directory to the project root
        os.chdir(os.path.realpath(os.path.join(sys.path[0], '../../')))
        if not is_project_root():
            log.error('working directory does not look like the project root')

        # load the config
        config = {}
        config_file = os.path.abspath(CONFIG_FILE)
        try:
            with open(config_file, 'r') as f:
                config = yaml.safe_load(f)
        except FileNotFoundError:
            log.info(f'cannot find configuration file: {config_file}')
            log.info('using default configuration')

        # find and monitor working trees
        worktrees = WorkTrees(config.get('root', '.'))

        # start the web server
        infos = socket.getaddrinfo(config.get('bind', None), config.get('port', 8443),
                                   type=socket.SOCK_STREAM, flags=socket.AI_PASSIVE)
        DualStackServer.address_family = infos[0][0]
        HeaderRequestHandler.protocol_version = 'HTTP/1.0'
        with DualStackServer(infos[0][4], worktrees) as httpd:
            scheme = 'HTTP'
            https = config.get('https', {})
            if https.get('enabled', True):
                cert_file = https.get('cert_file', 'localhost.pem')
                key_file = https.get('key_file', 'localhost-key.pem')
                # configure TLS via an SSLContext (ssl.wrap_socket is deprecated)
                ssl_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
                ssl_ctx.minimum_version = ssl.TLSVersion.TLSv1_3
                ssl_ctx.load_cert_chain(cert_file, key_file)
                httpd.socket = ssl_ctx.wrap_socket(httpd.socket, server_side=True)
                scheme = 'HTTPS'
            host, port = httpd.socket.getsockname()[:2]
            log.info(f'serving {scheme} on {host} port {port}')
            log.info('press Ctrl+C to exit')
            httpd.serve_forever()
    except KeyboardInterrupt:
        log.info('exiting')
    except Exception:
        log.exception('an error occurred:')
        ec = 1
    finally:
        if worktrees is not None:
            worktrees.stop()
        sys.exit(ec)

@ -0,0 +1,15 @@
# all paths are relative to the project root
# the root directory for the web server
# root: .
# configure SSL
# https:
# enabled: true
# these filenames are listed in .gitignore
# cert_file: localhost.pem
# key_file: localhost-key.pem
# address and port for the server to listen on
# bind: null
# port: 8443