Compare commits

...

108 Commits

Author SHA1 Message Date
b9335a6247 Added command line tool and removed lib folder from gitignore. 2025-09-20 14:01:43 +02:00
79366c9098 Fixed CLion project generation for projects without executable targets. 2025-09-20 12:17:51 +02:00
5c17999cdf (WIP) Restructuring of the project, rework of addons. 2025-09-20 12:17:51 +02:00
7b2e5c7432 Removed S++ subfolder from CLion VCS configuration. 2025-09-20 12:17:51 +02:00
Patrick Wuttke 07c2496342 Fixed .sln generation (a little) and changed config to create one .pdb per target. 2025-09-12 09:56:26 +02:00
e19f6115be Disabled array-bounds warnings for GCC due to (probably?) false positives. 2025-07-25 00:50:28 +02:00
Patrick Wuttke 7fc8518db4 Merge branch 'master' of https://git.mewin.de/mewin/scons-plus-plus 2025-07-14 18:51:46 +02:00
Patrick Wuttke 8b5d66dbec Forward CCFLAGS to Visual Studio project for IntelliSense. 2025-07-14 18:51:41 +02:00
75c626c235 Added option to pass CPU features (e.g. vector extensions) to the compiler (GCC/Clang only for now). 2025-07-12 12:51:14 +02:00
202331ba60 Added TARGET_PLATFORM variable and fixed (hopefully) debug symbols on Windows. 2025-07-11 18:01:08 +02:00
9b82fb87c0 Don't create module configuration for dependencies. 2025-07-11 14:45:34 +02:00
45b4d164d0 Removed targets from module again to fix recursive references when trying to serialize them. 2025-07-09 00:58:32 +02:00
43503dfec6 Changed folder names for git worktrees to the name of the ref that is checked out for better readability. 2025-07-08 18:40:31 +02:00
7916566d47 Adjusted dynamic library extension for Windows (which is also .lib). 2025-07-08 18:40:31 +02:00
b47ceb81dc Added build_dir to cmake result. 2025-07-08 18:40:31 +02:00
Patrick Wuttke 6326454729 Fixed how C++ standard is passed to VS/IntelliSense. 2025-07-08 16:50:59 +02:00
Patrick Wuttke 18293fdcf7 Fixed target info dumping. 2025-07-08 16:50:06 +02:00
Patrick Wuttke 8371f96d4a Merge branch 'master' of https://git.mewin.de/mewin/scons-plus-plus 2025-07-08 14:34:23 +02:00
Patrick Wuttke af53bf6084 Added Visual Studio project generation. 2025-07-08 14:34:20 +02:00
4bae8d67a0 Just warn instead of failing when tags cannot be fetched. 2025-07-07 00:13:07 +02:00
0ac1621494 Added option for projects to add own variables. 2025-07-06 10:40:33 +02:00
8770bd97dc Made preprocessor with MSVC behave correctly. 2025-06-24 15:21:49 +02:00
68f20bcf2d Disabled parentheses-equality warning on Clang because it was producing false positives. 2025-06-23 00:25:10 +02:00
e583c5ef6c Added module type and module configuration. 2025-06-19 16:43:53 +02:00
c3b5244eac Replaced --dump_env with --dump and --dump_format that allow dumping environment and config, both as text or json. And added _info() and env.Info() function for printing that reacts to SCons' -s flag. 2025-06-19 13:33:21 +02:00
88844ee5da Added small script for starting scons via IDE for debugging. 2025-06-19 13:32:18 +02:00
161f2e52d8 Give dependencies the option to add to CMAKE_PREFIX_PATH for compiling CMake dependencies against each other. 2025-04-03 16:29:36 +02:00
9436d2c48d Added JINJA_FILE_SEARCHPATH for Jinja file functions to make it easier for library projects to find their files. 2025-03-28 14:52:14 +01:00
2769fd801f Added file_size and file_content_hex Jinja filters.
Added check if Jinja module exists and appropriate error message to project generation.
2025-03-28 11:35:59 +01:00
Patrick Wuttke 5f11d64744 Fixed generation of VSCode launch.json. 2025-03-18 09:45:21 +01:00
1401fdea90 Next attempt of adding build type to executable and library names. 2025-03-14 22:02:40 +01:00
Patrick Wuttke 3a3c79d240 Revert "Updated to include build type and variant in binary names, so they don't need to be rebuilt everytime the configuration is changed."
Modifying the suffix variables broke the library file detection, at least on Windows.

This reverts commit e6e7dbe6425e1b60cf6383242453d8fc8db8da1e.
2025-03-14 09:49:21 +01:00
e6e7dbe642 Updated to include build type and variant in binary names, so they don't need to be rebuilt everytime the configuration is changed. 2025-03-13 23:42:58 +01:00
1edf745b39 Clone recipe repositories into a separate subfolder. 2025-03-13 23:01:09 +01:00
9e5fc05f4f Introduced recipe repositories. 2025-03-13 22:58:49 +01:00
d03b7097f7 Fixed SQLite build on Linux. 2025-03-13 18:36:17 +01:00
ba5def01b0 Fixed imgui-node-editor build when using GCC. 2025-03-13 18:36:07 +01:00
Patrick Wuttke 4082f8cb22 Added build type to object suffix so switching between configs is a lot quicker. 2025-03-13 15:00:24 +01:00
Patrick Wuttke 0bec9f7062 Added project template for VSCode (only Windows for now). 2025-03-13 10:42:48 +01:00
283aa0f99c Added way for dependencies to use options and fixed compilation of ImGui with SDL3 backend. 2025-03-13 09:57:39 +01:00
71f8631e48 Added recipe for portable-file-dialogs. 2025-03-12 15:50:16 +01:00
ae739b9946 Fixed Release and Release_Debug build on Windows. 2025-03-11 17:46:57 +01:00
2cc3145669 Added recipe for ImGui Node Editor. 2025-03-11 17:46:39 +01:00
5159c80167 Added recipe for SQLite on Windows (still WIP). 2025-03-11 09:35:38 +01:00
748f8f229f (WIP) SQLite recipe. 2025-03-10 09:55:34 +01:00
b034e87aaf Properly disable exceptions in MSVC. 2025-03-07 11:00:17 +01:00
726d206f9a Disabled tests for fmt. 2025-03-07 10:59:41 +01:00
5debc6d334 Added recipe for icon font cpp headers. 2025-03-07 10:48:56 +01:00
5c6a8a1cc6 Fixed issue with dependency injection not working due to the 'add target via name in dependencies' shortcut :/. 2025-03-02 22:20:21 +01:00
9d3d52ffaf Added recipe for RAID. 2025-03-02 22:19:22 +01:00
c994752c32 Added option for ImGui recipe to build from a docking build and to provide backends to compile via options. 2025-03-02 16:00:48 +01:00
42dada23c9 Added CXX_NO_EXCEPTIONS config to disable exceptions. 2025-03-02 16:00:12 +01:00
0e22057fc4 Added recipe for sdlpp. 2025-02-14 22:37:27 +01:00
c883f3c1c7 Added DXC recipe. 2025-01-19 01:17:35 +01:00
bca2398828 Added recipes for some Linux system libraries. 2025-01-12 13:11:43 +01:00
10a5239b7f Fixed default value for enable_asan. 2025-01-09 17:56:28 +01:00
0b8d115907 Added option for address sanitization on MSVC. 2024-12-26 14:48:18 +01:00
134fb106a8 Added cxxflags to cmake version hash calculation. 2024-12-13 23:39:27 +01:00
b546931c09 Added libraries to automatic CLion project generation. 2024-12-13 00:52:10 +01:00
7c4e403747 Fixed up Clang warnings. 2024-12-08 22:38:52 +01:00
c24c864915 Disabled Clang 'missing-field-initializers' warning. 2024-12-06 20:36:37 +01:00
fe95a92cf6 Added GCC build script. 2024-12-06 09:46:54 +01:00
1bdc1ef7a0 Used latest compiler version. 2024-12-05 17:03:44 +01:00
ff219840ea Added chmod to make scripts executable. 2024-12-05 16:59:17 +01:00
ee55878b18 Added build script stuff. 2024-12-05 16:31:59 +01:00
55893909f0 Added requests to requirements. 2024-11-21 00:08:30 +01:00
349a08b084 Added force parameter when fixing git tags so stuff compiles even if tags are updated. Also added the (yet missing) nghttp2 lib as a dependency to CURL if a version >= 8.10.0 is compiled. 2024-11-20 20:23:14 +01:00
5490de84eb Fixed Lua compilation on Windows. 2024-11-14 10:02:14 +01:00
70fc36335a Added GSL recipe. 2024-11-09 21:36:32 +01:00
ca1466469f Fixed lua recipe. 2024-11-08 12:36:59 +01:00
42a433582c Added Lua recipe. 2024-11-05 10:08:42 +01:00
0dcd8b383c Added DX12 recipe. 2024-10-30 11:10:30 +01:00
946cfc57ce Fixed wrong project name used when generating CLion project. 2024-10-27 13:52:00 +01:00
28c1675b87 Fixed typo. 2024-10-25 09:46:09 +02:00
7345305a77 Fixed CURL and Boost compilation on Windows. 2024-10-25 09:42:03 +02:00
651c8770c3 Added recipe for json and updated yaml-cpp recipe. 2024-10-25 08:37:07 +02:00
071cd207a9 Removed some old code. 2024-10-23 23:55:59 +02:00
a479e90335 Fixed compilation with MSVC. 2024-10-23 23:48:46 +02:00
cdbec36b8f Added recipes for curl, libidn2, libpsl and libunistring. 2024-10-23 23:48:46 +02:00
f2dc9872f7 Automatically apply patch when compiling SDL3. 2024-10-13 13:36:01 +02:00
2b05834798 Allow specifying a library from the current project as a dependency. 2024-10-12 12:30:21 +02:00
e3b3fd8f7c Save and reuse UUIDs between project generations. 2024-10-11 16:57:31 +02:00
329278c5f5 Fixed generated CLion project for platforms that use PROGPREFIX or PROGSUFFIX (e.g. Windows). 2024-10-11 10:21:21 +02:00
a0bbb46e51 Cleaned up/updated CLion project generation code and templates. 2024-10-10 23:23:29 +02:00
c6bba0e440 Added generation of CLion projects. 2024-10-10 00:06:58 +02:00
0a29b41639 Fixed (hopefully) the depends functionality. 2024-09-20 00:14:17 +02:00
a7c736de56 Added recipe for rectpack2D. 2024-09-19 14:57:37 +02:00
b45fec7561 Added enable_hlsl option to glslang. 2024-09-16 09:50:38 +02:00
c74fbb8798 Added recipe for tinyobjloader. 2024-09-15 14:45:47 +02:00
fe8f329b38 Fixed library name for debug SDL3 on Windows. 2024-09-12 15:09:06 +02:00
ae870e798d Added SHOW_INCLUDES debug setting (only MSVC for now). 2024-09-12 15:09:06 +02:00
941f94a7b6 Adjusted SDL build script to work with SDL3 (kind of). 2024-09-11 09:00:26 +02:00
1b8d6e175b Added options to version specs and allow specifying a git ref to build instead of a version. 2024-09-11 09:00:03 +02:00
bdf063a16c Added openssl recipe. 2024-09-09 21:52:57 +02:00
6cd076394d Added recipe for winsock2 and target_os to dependency conditions. 2024-09-09 21:52:57 +02:00
0175b812a2 Fixed Catch2 recipe. 2024-09-09 21:52:57 +02:00
eb044b6d2f Fixed compilation with MSVC. 2024-09-09 21:52:56 +02:00
e6adeb3a20 Moved check of SYSTEM_CACHE_DIR accessibility to before it is used. 2024-09-09 21:52:13 +02:00
fa83ff4581 Adjusted error description to make more sense. 2024-09-09 21:51:45 +02:00
29791d2e9c Fixed zlib recipe on Linux. 2024-09-09 21:51:45 +02:00
4e579121db Fixed compilation with MSVC. 2024-09-09 21:51:36 +02:00
cd727e5a1d Update to new recipe system (S++ 2.0). 2024-09-09 21:50:47 +02:00
6f83b68788 Some more work on the new dependency resolution system. 2024-09-09 21:50:47 +02:00
17ee9777ed Some tests. 2024-09-09 21:50:47 +02:00
ecb8ea9a74 Allow settings COMPILATIONDB_FILTER_FILES via config. 2024-08-18 11:15:32 +02:00
a3426f1132 Enable experimental library features (jthread) for clang. 2024-08-18 09:58:30 +02:00
e9438455ea Moved check of SYSTEM_CACHE_DIR accessibility to before it is used. 2024-08-18 09:51:04 +02:00
cc5fd2d668 Added CXXFLAGS and CFLAGS to config variables. 2024-08-18 09:47:24 +02:00
79 changed files with 3377 additions and 637 deletions

.gitignore (vendored) · 220 changes

@@ -1 +1,219 @@
# Project files
/.idea/
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[codz]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
# lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py.cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
# Pipfile.lock
# UV
# Similar to Pipfile.lock, it is generally recommended to include uv.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# uv.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
# poetry.lock
# poetry.toml
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
# pdm recommends including project-wide configuration in pdm.toml, but excluding .pdm-python.
# https://pdm-project.org/en/latest/usage/project/#working-with-version-control
# pdm.lock
# pdm.toml
.pdm-python
.pdm-build/
# pixi
# Similar to Pipfile.lock, it is generally recommended to include pixi.lock in version control.
# pixi.lock
# Pixi creates a virtual environment in the .pixi directory, just like venv module creates one
# in the .venv directory. It is recommended not to include this directory in version control.
.pixi
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# Redis
*.rdb
*.aof
*.pid
# RabbitMQ
mnesia/
rabbitmq/
rabbitmq-data/
# ActiveMQ
activemq-data/
# SageMath parsed files
*.sage.py
# Environments
.env
.envrc
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
# .idea/
# Abstra
# Abstra is an AI-powered process automation framework.
# Ignore directories containing user credentials, local state, and settings.
# Learn more at https://abstra.io/docs
.abstra/
# Visual Studio Code
# Visual Studio Code specific template is maintained in a separate VisualStudioCode.gitignore
# that can be found at https://github.com/github/gitignore/blob/main/Global/VisualStudioCode.gitignore
# and can be added to the global gitignore or merged into this file. However, if you prefer,
# you could uncomment the following to ignore the entire vscode folder
# .vscode/
# Ruff stuff:
.ruff_cache/
# PyPI configuration file
.pypirc
# Marimo
marimo/_static/
marimo/_lsp/
__marimo__/
# Streamlit
.streamlit/secrets.toml

SConscript · 1160 changes

File diff suppressed because it is too large

addons/astgen.py (new file) · 209 changes

@@ -0,0 +1,209 @@
import gzip
import itertools
import json
import os.path
import pickle
import subprocess
from abc import ABC, abstractmethod
from typing import Callable, Any, Iterable, Self, Generator

from SCons.Script import *
from SCons.Node.FS import File

from spp import get_spp

spp = get_spp()

def post_environment(**kwargs) -> None:
    env: Environment = spp.globals['env']
    ast_json_builder = Builder(
        action=_gen_ast_json
    )
    env.Append(BUILDERS = {'AstJson': ast_json_builder})
    # env.SetDefault(ASTJSONCOM = '$ASTJSON -Xclang -ast-dump=json -fsyntax-only -Wno-unknown-warning-option -DSPP_AST_GEN $CXXFLAGS $SOURCES > $TARGET')
    env.AddMethod(_ast_jinja, 'AstJinja')

def _gen_ast_json(target: list[File], source: list[File], env: Environment):
    clang_exe = env.WhereIs('clang++')
    cmd = [clang_exe, '-Xclang', '-ast-dump=json', '-fsyntax-only', '-Wno-unknown-warning-option',
           '-DSPP_AST_GEN', f'-std={env["CXX_STANDARD"]}']
    for define in env['CPPDEFINES']:
        cmd.append(f'-D{define}')
    for path in env['CPPPATH']:
        cmd.append(f'-I{path}')
    cmd.append(source[0].abspath)
    # print(*cmd)
    try:
        proc = subprocess.Popen(cmd, text=True, stdout=subprocess.PIPE)
    except subprocess.CalledProcessError as e:
        env.Error(f'Clang exited with code {e.returncode}.')
        return
    parsed = json.load(proc.stdout)
    inner: list = parsed["inner"]
    # pos = 0
    # last_file = None
    # while pos < len(inner):
    #     last_file = inner[pos]["loc"].get("file", last_file)
    #     if last_file is None:  # or os.path.isabs(last_file):
    #         del inner[pos]
    #     else:
    #         pos += 1
    if target[0].suffix == '.bin':
        with gzip.open(target[0].abspath, 'wb') as f:
            pickle.dump(parsed, f)
    elif target[0].suffix == '.gz':
        with gzip.open(target[0].abspath, 'wt') as f:
            json.dump(parsed, f)
    else:
        with open(target[0].abspath, 'wt') as f:
            json.dump(parsed, f)

class ASTNode(ABC):
    @abstractmethod
    def _get_decls(self) -> Iterable[dict]: ...

    def inner(self) -> Iterable[dict]:
        return itertools.chain(*(decl['inner'] for decl in self._get_decls()))

    def inner_filtered(self, **kwargs) -> Iterable[dict]:
        def _applies(decl: dict) -> bool:
            for name, val in kwargs.items():
                if decl.get(name) != val:
                    return False
            return True
        return (decl for decl in self.inner() if _applies(decl))

class SimpleASTNode(ASTNode):
    def __init__(self, decl: dict) -> None:
        self._decl = decl

    def _get_decls(self) -> Iterable[dict]:
        return (self._decl,)

class Value(SimpleASTNode): ...

class Annotation(SimpleASTNode):
    @property
    def values(self) -> Iterable[Value]:
        return (Value(decl) for decl in self.inner())

class Param(SimpleASTNode):
    @property
    def name(self) -> str:
        return self._decl.get('name', '')

    @property
    def type(self) -> str:
        return self._decl['type']['qualType']

class Method(SimpleASTNode):
    def __init__(self, decl: dict, access: str) -> None:
        super().__init__(decl)
        self._access = access

    @property
    def access(self) -> str:
        return self._access

    @property
    def name(self) -> str:
        return self._decl['name']

    @property
    def mangled_name(self) -> str:
        return self._decl['mangledName']

    @property
    def type(self) -> str:
        return self._decl['type']['qualType']

    @property
    def return_type(self) -> str:
        return self.type.split('(', 1)[0].strip()

    @property
    def params(self) -> Iterable[Param]:
        return (Param(decl) for decl in self.inner_filtered(kind='ParmVarDecl'))

    @property
    def annotations(self) -> Iterable[Annotation]:
        return (Annotation(decl) for decl in self.inner_filtered(kind='AnnotateAttr'))

class Class(SimpleASTNode):
    @property
    def name(self) -> str:
        return self._decl['name']

    @property
    def tagUsed(self) -> str:
        return self._decl['tagUsed']

    @property
    def methods(self) -> Generator[Method]:
        access = 'private' if self.tagUsed == 'class' else 'public'
        for decl in self.inner():
            if decl['kind'] == 'AccessSpecDecl':
                access = decl['access']
            elif decl['kind'] == 'CXXMethodDecl' and not decl.get('isImplicit', False):
                yield Method(decl, access)

class Namespace(ASTNode, ABC):
    def get_namespace(self, ns_name: str) -> Self:
        return InnerNamespace(list(self.inner_filtered(kind='NamespaceDecl', name=ns_name)))

    @property
    def classes(self) -> Iterable[Class]:
        return (Class(decl) for decl in self.inner_filtered(kind='CXXRecordDecl', tagUsed='class', completeDefinition=True))

class InnerNamespace(Namespace):
    def __init__(self, decls: list[dict]) -> None:
        self._decls = decls

    def _get_decls(self) -> Iterable[dict]:
        return self._decls

class Ast(Namespace):
    def __init__(self, file: File) -> None:
        self._file = file
        self._data_dict: dict|None = None

    def _get_decls(self) -> tuple[dict]:
        if self._data_dict is None:
            if not self._file.exists():
                self._data_dict = {
                    'inner': []
                }
            elif self._file.suffix == '.bin':
                with gzip.open(self._file.abspath, 'rb') as f:
                    self._data_dict = pickle.load(f)
            elif self._file.suffix == '.gz':
                with gzip.open(self._file.abspath) as f:
                    self._data_dict = json.load(f)
            else:
                with open(self._file.abspath, 'r') as f:
                    self._data_dict = json.load(f)
        return (self._data_dict,)

def _ast_jinja(env: Environment, source: File, target: File, template: File, **kwargs):
    cache_dir = env['CACHE_DIR']
    rel_path = env.Dir('#').rel_path(source)
    json_file = env.File(os.path.join(cache_dir, 'ast_json', f'{rel_path}.bin'))
    ast_json = env.AstJson(target=json_file, source=source, **kwargs)
    ast_jinja = env.Jinja(
        target=target,
        source=template,
        JINJA_CONTEXT = {
            'ast': Ast(json_file)
        },
        **kwargs
    )
    env.Depends(ast_jinja, ast_json)
    # env.AlwaysBuild(ast_jinja)
    # env.Requires(ast_jinja, ast_json)
    # env.Requires(source, ast_jinja)
    env.Ignore(ast_json, ast_jinja)
    return ast_jinja
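The traversal helpers above boil down to filtering a Clang JSON AST's `inner` arrays by key/value pairs. A standalone sketch of that filtering logic, using a hypothetical hand-written stand-in for `clang++ -ast-dump=json` output (no SCons required):

```python
import itertools

# Toy stand-in for a Clang -ast-dump=json tree (hypothetical data).
ast = {
    'inner': [
        {'kind': 'CXXRecordDecl', 'tagUsed': 'class', 'name': 'Foo', 'completeDefinition': True,
         'inner': [
             {'kind': 'AccessSpecDecl', 'access': 'public'},
             {'kind': 'CXXMethodDecl', 'name': 'bar', 'mangledName': '_ZN3Foo3barEv'},
         ]},
        {'kind': 'NamespaceDecl', 'name': 'detail', 'inner': []},
    ]
}

def inner_filtered(decls, **kwargs):
    # Mirror of ASTNode.inner_filtered: chain the children of all decls
    # and keep those whose keys match every kwarg.
    children = itertools.chain(*(d['inner'] for d in decls))
    return [c for c in children if all(c.get(k) == v for k, v in kwargs.items())]

classes = inner_filtered([ast], kind='CXXRecordDecl', tagUsed='class', completeDefinition=True)
print([c['name'] for c in classes])  # → ['Foo']
```

This is the same query `Namespace.classes` runs, just detached from the `ASTNode` class hierarchy.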


@@ -1,43 +0,0 @@
import os
import pathlib
import subprocess
import sys

from SCons.Script import *

_BUILT_STAMPFILE = '.spp_built'

Import('env')

def _autotools_project(env: Environment, project_root: str, config_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = []) -> dict:
    config = env['BUILD_TYPE']
    build_dir = os.path.join(project_root, f'build_{config}')
    install_dir = os.path.join(project_root, f'install_{config}')
    is_built = os.path.exists(os.path.join(install_dir, _BUILT_STAMPFILE))
    if not is_built or env['UPDATE_REPOSITORIES']:
        print(f'Building {project_root}, config {config}')
        os.makedirs(build_dir, exist_ok=True)
        opt_level = {
            'debug': '-O0',
        }.get(env['BUILD_TYPE'], '-O2')
        debug_symbols = {
            'release': ''
        }.get(env['BUILD_TYPE'], '-g')
        cflags = f'{opt_level} {debug_symbols}'
        jobs = env.GetOption('num_jobs')
        env = os.environ.copy()
        env['CFLAGS'] = cflags
        subprocess.run((os.path.join(project_root, 'configure'), '--prefix', install_dir, *config_args), cwd=build_dir, env=env, stdout=sys.stdout, stderr=sys.stderr, check=True)
        subprocess.run(('make', f'-j{jobs}', *build_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
        subprocess.run(('make', 'install', *install_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
        pathlib.Path(install_dir, _BUILT_STAMPFILE).touch()
    return {
        'install_dir': install_dir,
        'LIBPATH': [os.path.join(install_dir, 'lib')],
        'CPPPATH': [os.path.join(install_dir, 'include')]
    }

env.AddMethod(_autotools_project, 'AutotoolsProject')
Return('env')
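The build-type-to-flags mapping in the removed addon relies on `dict.get` with a fallback: `-O0` only for debug builds, `-g` for everything except release. A minimal sketch of that mapping (the `flags_for` helper name and the trailing `strip()` are additions for illustration):

```python
def flags_for(build_type: str) -> str:
    # -O0 only for 'debug', -O2 otherwise; -g for everything except 'release'
    opt_level = {'debug': '-O0'}.get(build_type, '-O2')
    debug_symbols = {'release': ''}.get(build_type, '-g')
    return f'{opt_level} {debug_symbols}'.strip()

print(flags_for('debug'))    # → -O0 -g
print(flags_for('release'))  # → -O2
```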


@@ -1,51 +0,0 @@
import os
import pathlib

from SCons.Script import *

_BUILT_STAMPFILE = '.spp_built'

Import('env')

def cmd_quote(s: str) -> str:
    escaped = s.replace('\\', '\\\\')
    return f'"{escaped}"'

def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = []) -> dict:
    config = env['BUILD_TYPE']
    build_dir = os.path.join(project_root, f'build_{config}')
    install_dir = os.path.join(project_root, f'install_{config}')
    is_built = os.path.exists(os.path.join(install_dir, _BUILT_STAMPFILE))
    if not is_built or env['UPDATE_REPOSITORIES']:
        print(f'Building {project_root}, config {config}')
        os.makedirs(build_dir, exist_ok=True)
        build_type = {
            'debug': 'Debug',
            'release_debug': 'RelWithDebInfo',
            'release': 'Release',
            'profile': 'RelWithDebInfo'
        }.get(env['BUILD_TYPE'], 'RelWithDebInfo')

        def run_cmd(args):
            env.Execute(' '.join([str(s) for s in args]))

        # TODO: is this a problem?
        # environ = os.environ.copy()
        # environ['CXXFLAGS'] = ' '.join(f'-D{define}' for define in env['CPPDEFINES'])  # TODO: who cares about windows?
        run_cmd(['cmake', '-G', 'Ninja', '-B', build_dir, f'-DCMAKE_BUILD_TYPE={build_type}', f'-DCMAKE_INSTALL_PREFIX={cmd_quote(install_dir)}', '-DBUILD_TESTING=OFF', *generate_args, project_root])
        run_cmd(['cmake', '--build', *build_args, cmd_quote(build_dir)])
        run_cmd(['cmake', '--install', *install_args, cmd_quote(build_dir)])
        pathlib.Path(install_dir, _BUILT_STAMPFILE).touch()
    libpath = []
    for lib_folder in ('lib', 'lib64'):
        full_path = os.path.join(install_dir, lib_folder)
        if os.path.exists(full_path):
            libpath.append(full_path)
    return {
        'install_dir': install_dir,
        'LIBPATH': libpath,
        'CPPPATH': [os.path.join(install_dir, 'include')]
    }

env.AddMethod(_cmake_project, 'CMakeProject')
Return('env')
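The `cmd_quote` helper above exists because the CMake command line is joined into a single string before `env.Execute` runs it, so backslashes in Windows paths must be escaped and the whole path wrapped in quotes. Its behavior, isolated:

```python
def cmd_quote(s: str) -> str:
    # double every backslash, then wrap in double quotes
    escaped = s.replace('\\', '\\\\')
    return f'"{escaped}"'

print(cmd_quote(r'C:\deps\install'))  # → "C:\\deps\\install"
```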

addons/compat_v1_0.py (new file) · 13 changes

@@ -0,0 +1,13 @@
from spp import get_spp

spp = get_spp()

def available(**kwargs) -> bool:
    return spp.globals['config']['SPP_TARGET_VERSION'][0:2] == (1, 0)

def pre_environment(**kwargs) -> None:
    spp.globals['tools'].append('unity_build')  # S++ 1.0.0 had the unity_build enabled by default

def post_environment(**kwargs) -> None:
    spp.globals['env']['_SPP_FALLBACK_RECIPE_REPO'] = {'repo_name': 'mewin', 'remote_url': 'https://git.mewin.de/mewin/spp_recipes.git', 'git_ref': 'stable'}
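The `available()` gate compares only the first two components of the target version, so any 1.0.x project picks up the compatibility addon. A sketch of that tuple-slice check (the `compat_1_0` name is hypothetical; the addon reads the version from `spp.globals['config']`):

```python
def compat_1_0(config: dict) -> bool:
    # compare only (major, minor), ignoring the patch component
    return tuple(config['SPP_TARGET_VERSION'][0:2]) == (1, 0)

print(compat_1_0({'SPP_TARGET_VERSION': (1, 0, 3)}))  # → True
print(compat_1_0({'SPP_TARGET_VERSION': (2, 0, 0)}))  # → False
```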

addons/config_cache.py (new file) · 35 changes

@@ -0,0 +1,35 @@
import json
from pathlib import Path

from spp import get_spp, TargetType

spp = get_spp()

def _should_generate() -> bool:
    # check if any program or library target has been built
    for target in spp.targets:
        if target.target_type in (TargetType.PROGRAM, TargetType.STATIC_LIBRARY, TargetType.SHARED_LIBRARY):
            return True
    return False

def post_finalize(**kwargs) -> None:
    if not _should_generate():
        return
    cache_file = Path(spp.env['CACHE_DIR']) / 'config_cache.json'
    cache = {}
    if cache_file.exists():
        try:
            with cache_file.open('r') as f:
                cache = json.load(f)
        except Exception as e:
            spp.env.Warn(f'Error while loading config cache: {e}.')
    cache['build_type'] = spp.env['BUILD_TYPE']
    try:
        with cache_file.open('w') as f:
            json.dump(cache, f)
    except Exception as e:
        spp.env.Warn(f'Error while saving config cache: {e}.')
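The addon's core is a tolerant read-modify-write cycle on a JSON file: a missing or corrupt cache falls back to an empty dict rather than failing the build. A self-contained sketch of that pattern (the `update_cache` helper is an illustration, not the addon's API):

```python
import json
import tempfile
from pathlib import Path

def update_cache(cache_file: Path, **updates) -> dict:
    # read-modify-write, tolerating a missing or corrupt file as the addon does
    cache = {}
    if cache_file.exists():
        try:
            cache = json.loads(cache_file.read_text())
        except Exception:
            pass  # fall back to an empty cache
    cache.update(updates)
    cache_file.write_text(json.dumps(cache))
    return cache

with tempfile.TemporaryDirectory() as d:
    f = Path(d) / 'config_cache.json'
    update_cache(f, build_type='debug')
    print(update_cache(f, variant='x64'))  # → {'build_type': 'debug', 'variant': 'x64'}
```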


@@ -1,39 +0,0 @@
from git import Repo
from git.exc import GitError
import hashlib
import os

from SCons.Script import *

Import('env')

def _gitbranch(env: Environment, repo_name: str, remote_url: str, git_ref: str = "main") -> dict:
    repo_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, '_bare')
    try:
        repo = Repo(repo_dir)
        origin = repo.remotes['origin']
    except GitError:
        print(f'Initializing git repository at {repo_dir}.')
        repo = Repo.init(repo_dir, bare=True)
        origin = repo.create_remote('origin', remote_url)
    worktree_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, hashlib.shake_128(git_ref.encode('utf-8')).hexdigest(6))  # TODO: commit hash would be better, right? -> not if it's a branch!
    if not os.path.exists(worktree_dir):
        print(f'Checking out into {worktree_dir}.')
        origin.fetch(tags=True)
        os.makedirs(worktree_dir)
        repo.git.worktree('add', worktree_dir, git_ref)
    elif env['UPDATE_REPOSITORIES']:
        worktree_repo = Repo(worktree_dir)
        if not worktree_repo.head.is_detached:
            print(f'Updating git repository at {worktree_dir}')
            worktree_origin = worktree_repo.remotes['origin']
            worktree_origin.pull()
        else:
            print(f'Not updating git repository {worktree_dir} as it is not on a branch.')
    return {
        'checkout_root': worktree_dir
    }

env.AddMethod(_gitbranch, 'GitBranch')
Return('env')
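This removed addon named each worktree folder after a short SHAKE-128 digest of the git ref (commit 43503dfec6 later switched to the ref name itself for readability). The naming scheme in isolation (the `worktree_folder` name is an illustration):

```python
import hashlib

def worktree_folder(git_ref: str) -> str:
    # 6 bytes of SHAKE-128 → 12 hex chars, stable per ref
    return hashlib.shake_128(git_ref.encode('utf-8')).hexdigest(6)

print(worktree_folder('main'))  # 12 lowercase hex characters
```

Deterministic hashing keeps the folder name filesystem-safe for refs like `feature/foo`, at the cost of being opaque, which is what the later commit traded away.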


@@ -1,10 +1,178 @@
 # based on https://github.com/hgomersall/scons-jinja
 from SCons.Script import *
 import os
 import pathlib
-Import('env')
-if not hasattr(env, 'Jinja'):
-    Return('env')
+from spp import get_spp
+try:
+    import jinja2
+    from jinja2.utils import open_if_exists
+except ImportError:
+    jinja2 = None
+    print('No Jinja :(')
spp = get_spp()

def available(**kwargs) -> bool:
    return jinja2 is not None

def post_environment(**kwargs) -> None:
    env: Environment = spp.globals['env']
    env.SetDefault(JINJA_CONTEXT={})
    env.SetDefault(JINJA_ENVIRONMENT_VARS={})
    env.SetDefault(JINJA_FILTERS={'load_config': _jinja_load_config})
    env.SetDefault(JINJA_GLOBALS={
        'file_size': lambda *args: _file_size(env, *args),
        'file_content_hex': lambda *args: _file_content_hex(env, *args)
    })
    env.SetDefault(JINJA_TEMPLATE_SEARCHPATH=['data/jinja'])
    env.SetDefault(JINJA_CONFIG_SEARCHPATH=[env.Dir('#data/config')])
    env.SetDefault(JINJA_FILE_SEARCHPATH=[env.Dir('#')])
    env['BUILDERS']['Jinja'] = Builder(
        action=render_jinja_template
    )
    scanner = env.Scanner(function=jinja_scanner,
                          skeys=['.jinja'])
    env.Append(SCANNERS=scanner)
    env.AddMethod(_wrap_jinja(env.Jinja), 'Jinja')

class FileSystemLoaderRecorder(jinja2.FileSystemLoader):
    """ A wrapper around FileSystemLoader that records files as they are
    loaded. These are contained within loaded_filenames set attribute.
    """
    def __init__(self, searchpath, encoding='utf-8'):
        self.loaded_filenames = set()
        super(FileSystemLoaderRecorder, self).__init__(searchpath, encoding)

    def get_source(self, environment, template):
        """Overwritten FileSystemLoader.get_source method that extracts the
        filename that is used to load each filename and adds it to
        self.loaded_filenames.
        """
        for searchpath in self.searchpath:
            filename = os.path.join(searchpath, template)
            f = open_if_exists(filename)
            if f is None:
                continue
            try:
                contents = f.read().decode(self.encoding)
            finally:
                f.close()
            self.loaded_filenames.add(filename)
            return super(FileSystemLoaderRecorder, self).get_source(
                environment, template)
        # If the template isn't found, then we have to drop out.
        raise jinja2.TemplateNotFound(template)

def jinja_scanner(node, env, path):
    # Instantiate the file as necessary
    node.get_text_contents()
    template_dir, filename = os.path.split(str(node))
    template_search_path = ([template_dir] +
                            env.subst(env['JINJA_TEMPLATE_SEARCHPATH']))
    template_loader = FileSystemLoaderRecorder(template_search_path)
    jinja_env = jinja2.Environment(loader=template_loader,
                                   extensions=['jinja2.ext.do'], **env['JINJA_ENVIRONMENT_VARS'])
    jinja_env.filters.update(env['JINJA_FILTERS'])
    jinja_env.globals.update(env['JINJA_GLOBALS'])
    try:
        template = jinja_env.get_template(filename)
    except jinja2.TemplateNotFound as e:
        env.Error(f'Missing template: {os.path.join(template_dir, str(e))}')
    # We need to render the template to do all the necessary loading.
    #
    # It's necessary to respond to missing templates by grabbing
    # the content as the exception is raised. This makes sure of the
    # existence of the file upon which the current scanned node depends.
    #
    # I suspect that this is pretty inefficient, but it does
    # work reliably.
    context = env['JINJA_CONTEXT']
    last_missing_file = ''
    while True:
        try:
            template.render(**context)
        except jinja2.TemplateNotFound as e:
            if last_missing_file == str(e):
                # We've already been round once for this file,
                # so need to raise
                env.Error(f'Missing template: {os.path.join(template_dir, str(e))}')
            last_missing_file = str(e)
            # Find where the template came from (using the same ordering
            # as Jinja uses).
            for searchpath in template_search_path:
                filename = os.path.join(searchpath, last_missing_file)
                if os.path.exists(filename):
                    continue
                else:
                    env.File(filename).get_text_contents()
            continue
        break
    # Get all the files that were loaded. The set includes the current node,
    # so we remove that.
    found_nodes_names = list(template_loader.loaded_filenames)
    try:
        found_nodes_names.remove(str(node))
    except ValueError as e:
        env.Error(f'Missing template node: {str(node)}')
    return [env.File(f) for f in found_nodes_names]

def render_jinja_template(target, source, env):
    output_str = ''
    if not source:
        source = [f'{target}.jinja']
    for template_file in source:
        template_dir, filename = os.path.split(str(template_file))
        template_search_path = ([template_dir] +
                                env.subst(env['JINJA_TEMPLATE_SEARCHPATH']))
        template_loader = FileSystemLoaderRecorder(template_search_path)
        jinja_env = jinja2.Environment(loader=template_loader,
                                       extensions=['jinja2.ext.do'], **env['JINJA_ENVIRONMENT_VARS'])
        jinja_env.filters.update(env['JINJA_FILTERS'])
        jinja_env.globals.update(env['JINJA_GLOBALS'])
        template = jinja_env.get_template(filename)
        context = env['JINJA_CONTEXT']
        output_str += template.render(**context)
    with open(str(target[0]), 'w') as target_file:
        target_file.write(output_str)
    return None
def _jinja_load_config(env, config_name):
    searched_paths = []
@@ -23,16 +191,29 @@ def _wrap_jinja(orig_jinja):
    def _wrapped(env, target, **kwargs):
        if 'source' not in kwargs:
            kwargs['source'] = f'{target}.jinja'
-       target = orig_jinja(**kwargs)
+       target = orig_jinja(target=target, **kwargs)
        if 'depends' in kwargs:
            for dependency in kwargs['depends']:
                env.Depends(target, dependency)
-       # env.Depends(alias_prepare, target)
        return target
    return _wrapped
-env.AddMethod(_wrap_jinja(env.Jinja), 'Jinja')
-env.Append(JINJA_FILTERS = {'load_config': _jinja_load_config})
-env.Append(JINJA_TEMPLATE_SEARCHPATH = ['data/jinja'])
-env['JINJA_CONFIG_SEARCHPATH'] = [env.Dir('#data/config')]
-Return('env')
+def _find_file(env, fname):
+    for path in env['JINJA_FILE_SEARCHPATH']:
+        fullpath = os.path.join(path.abspath, fname)
+        if os.path.exists(fullpath):
+            return env.File(fullpath)
+    return None
+def _file_size(env, fname: str) -> int:
+    file = _find_file(env, fname)
+    if not file:
+        env.Error(f'File does not exist: {fname}. Searched in: {[d.abspath for d in env["JINJA_FILE_SEARCHPATH"]]}')
+    return file.get_size()
+def _file_content_hex(env, fname: str) -> str:
+    file = _find_file(env, fname)
+    if not file:
+        env.Error(f'File does not exist: {fname}. Searched in: {[d.abspath for d in env["JINJA_FILE_SEARCHPATH"]]}')
+    bytes = file.get_contents()
+    return ','.join([hex(byte) for byte in bytes])
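Outside of SCons, the `_file_content_hex` helper boils down to a few lines. This sketch (hypothetical `file_content_hex` name, without the `env` error reporting) shows the byte-to-hex conversion, e.g. for embedding binary data in a generated source file:

```python
def file_content_hex(path: str) -> str:
    """Read a file and return its bytes as comma-separated hex literals."""
    with open(path, 'rb') as f:
        data = f.read()
    return ','.join(hex(byte) for byte in data)
```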

@@ -0,0 +1,60 @@
import os
import pathlib
import subprocess
import sys
from SCons.Script import *

_BUILT_STAMPFILE = '.spp_built'

Import('env')

def _autotools_project(env: Environment, project_root: str, config_args: 'list[str]' = [], build_args: 'list[str]' = [], install_args: 'list[str]' = [], configure_script_path: str = 'configure', skip_steps = ()) -> dict:
    config = env['BUILD_TYPE']
    build_dir = os.path.join(project_root, f'build_{config}')
    install_dir = os.path.join(project_root, f'install_{config}')
    is_built = os.path.exists(os.path.join(install_dir, _BUILT_STAMPFILE))
    if not is_built or env['UPDATE_REPOSITORIES']:
        print(f'Building {project_root}, config {config}')
        os.makedirs(build_dir, exist_ok=True)
        opt_level = {
            'debug': '-O0',
        }.get(env['BUILD_TYPE'], '-O2')
        debug_symbols = {
            'release': ''
        }.get(env['BUILD_TYPE'], '-g')
        cflags = f'{opt_level} {debug_symbols}'
        jobs = env.GetOption('num_jobs')
        # Copy of the process environment for the child processes; a
        # separate name avoids shadowing the SCons `env` parameter.
        proc_env = os.environ.copy()
        proc_env['CFLAGS'] = cflags
        config_script = os.path.join(project_root, configure_script_path)
        if not os.path.exists(config_script) and os.path.exists(f'{config_script}.ac'):
            subprocess.run(('autoreconf', '--install', '--force'), cwd=project_root)
        if 'configure' not in skip_steps:
            subprocess.run((config_script, f'--prefix={install_dir}', *config_args), cwd=build_dir, env=proc_env, stdout=sys.stdout, stderr=sys.stderr, check=True)
        if 'build' not in skip_steps:
            subprocess.run(('make', f'-j{jobs}', *build_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
        if 'install' not in skip_steps:
            subprocess.run(('make', 'install', *install_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
        else:
            # must still create the install dir for the stamp file
            os.makedirs(install_dir, exist_ok=True)
        pathlib.Path(install_dir, _BUILT_STAMPFILE).touch()
    libpath = []
    for lib_folder in ('lib', 'lib64'):
        full_path = os.path.join(install_dir, lib_folder)
        if os.path.exists(full_path):
            libpath.append(full_path)
    return {
        'build_dir': build_dir,
        'install_dir': install_dir,
        'LIBPATH': libpath,
        'CPPPATH': [os.path.join(install_dir, 'include')]
    }

env.AddMethod(_autotools_project, 'AutotoolsProject')
Return('env')
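The stamp-file idiom used above ("build only if the stamp is missing, touch it afterwards") can be sketched stand-alone; `needs_build` and `mark_built` are hypothetical names:

```python
import pathlib

def needs_build(install_dir: str, stamp_name: str = '.spp_built') -> bool:
    """A build is needed whenever the stamp file is absent."""
    return not pathlib.Path(install_dir, stamp_name).exists()

def mark_built(install_dir: str, stamp_name: str = '.spp_built') -> None:
    """Record a successful build by touching the stamp file."""
    path = pathlib.Path(install_dir)
    path.mkdir(parents=True, exist_ok=True)
    (path / stamp_name).touch()
```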

addons/old/cmake_project.py Normal file
@@ -0,0 +1,116 @@
import json
import os
import pathlib
import shutil
from SCons.Script import *

_BUILT_STAMPFILE = '.spp_built'
_VERSION = 2  # bump if you change how the projects are built, to trigger a clean build

Import('env')

def cmd_quote(s: str) -> str:
    escaped = s.replace('\\', '\\\\')
    return f'"{escaped}"'
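`cmd_quote` escapes backslashes but not embedded double quotes, so a value containing `"` would break the generated command line. A more defensive variant (an assumption, not the addon's code) escapes both:

```python
def cmd_quote_strict(s: str) -> str:
    """Quote a string for a command line, escaping backslashes and quotes."""
    escaped = s.replace('\\', '\\\\').replace('"', '\\"')
    return f'"{escaped}"'
```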
def _generate_cmake_c_flags(env, dependencies: 'list[dict]') -> str:
    parts = env['DEPS_CFLAGS'].copy()
    for dependency in dependencies:
        for path in dependency.get('CPPPATH', []):
            parts.append(f'-I{path}')
    return cmd_quote(' '.join(parts))

def _generate_cmake_cxx_flags(env, dependencies: 'list[dict]') -> str:
    parts = env['DEPS_CXXFLAGS'].copy()
    for dependency in dependencies:
        for path in dependency.get('CPPPATH', []):
            parts.append(f'-I{path}')
    return cmd_quote(' '.join(parts))

def _get_cmake_cxx_standard(env: Environment) -> str:
    return env['CXX_STANDARD'][3:]  # we use "C++XX", CMake just "XX"

def _get_cmake_prefix_path(dependencies: 'list[dict]') -> str:
    parts = []
    for dependency in dependencies:
        for path in dependency.get('CMAKE_PREFIX_PATH', []):
            parts.append(path)
    return cmd_quote(';'.join(parts))

def _generate_cmake_args(env: Environment, dependencies: 'list[dict]') -> 'list[str]':
    args = [f'-DCMAKE_C_FLAGS={_generate_cmake_c_flags(env, dependencies)}',
            f'-DCMAKE_CXX_FLAGS={_generate_cmake_cxx_flags(env, dependencies)}',
            f'-DCMAKE_CXX_STANDARD={_get_cmake_cxx_standard(env)}',
            f'-DCMAKE_PREFIX_PATH={_get_cmake_prefix_path(dependencies)}']
    for dependency in dependencies:
        for name, value in dependency.get('CMAKE_VARS', {}).items():
            args.append(f'-D{name}={cmd_quote(value)}')
    return args

def _calc_version_hash(env, dependencies: 'list[dict]') -> str:
    return json.dumps({
        'version': _VERSION,
        'dependencies': dependencies,
        'cxxflags': env['DEPS_CXXFLAGS']
    })
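Note that `_calc_version_hash` returns a JSON string rather than an actual hash, and `json.dumps` without `sort_keys` is sensitive to dict ordering. A digest-based variant (an alternative sketch, not what the addon ships) would be:

```python
import hashlib
import json

def calc_version_hash(version: int, dependencies: list, cxxflags: list) -> str:
    """Serialize the build-relevant inputs deterministically and digest them."""
    payload = json.dumps(
        {'version': version, 'dependencies': dependencies, 'cxxflags': cxxflags},
        sort_keys=True)  # stable key order, so equal inputs give equal digests
    return hashlib.sha256(payload.encode('utf-8')).hexdigest()
```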
def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str]' = [], build_args: 'list[str]' = [], install_args: 'list[str]' = [], dependencies: 'list[dict]' = []) -> dict:
    config = env['BUILD_TYPE']
    build_dir = os.path.join(project_root, f'build_{config}')
    install_dir = os.path.join(project_root, f'install_{config}')
    version_hash = _calc_version_hash(env, dependencies)
    stamp_file = pathlib.Path(install_dir, _BUILT_STAMPFILE)
    is_built = stamp_file.exists()
    if is_built:
        with stamp_file.open('r') as f:
            build_version = f.read()
        if build_version != version_hash:
            print(f'Rebuilding CMake project at {project_root} as the script version changed.')
            is_built = False
            # ignore_errors: either directory may already be missing
            shutil.rmtree(build_dir, ignore_errors=True)
            shutil.rmtree(install_dir, ignore_errors=True)
    if not is_built or env['UPDATE_REPOSITORIES']:
        print(f'Building {project_root}, config {config}')
        os.makedirs(build_dir, exist_ok=True)
        build_type = {
            'debug': 'Debug',
            'release_debug': 'RelWithDebInfo',
            'release': 'Release',
            'profile': 'RelWithDebInfo'
        }.get(env['BUILD_TYPE'], 'RelWithDebInfo')
        def run_cmd(args):
            if env.Execute(' '.join([str(s) for s in args])):
                Exit(1)
        # TODO: is this a problem?
        # environ = os.environ.copy()
        # environ['CXXFLAGS'] = ' '.join(f'-D{define}' for define in env['CPPDEFINES']) # TODO: who cares about windows?
        run_cmd(['cmake', '-G', 'Ninja', '-B', build_dir, f'-DCMAKE_BUILD_TYPE={build_type}',
                 f'-DCMAKE_INSTALL_PREFIX={cmd_quote(install_dir)}', '-DBUILD_TESTING=OFF',
                 *_generate_cmake_args(env, dependencies), *generate_args, project_root])
        run_cmd(['cmake', '--build', *build_args, cmd_quote(build_dir)])
        run_cmd(['cmake', '--install', *install_args, cmd_quote(build_dir)])
        with pathlib.Path(install_dir, _BUILT_STAMPFILE).open('w') as f:
            f.write(version_hash)
    libpath = []
    for lib_folder in ('lib', 'lib64'):
        full_path = os.path.join(install_dir, lib_folder)
        if os.path.exists(full_path):
            libpath.append(full_path)
    return {
        'build_dir': build_dir,
        'install_dir': install_dir,
        'BINPATH': [os.path.join(install_dir, 'bin')],
        'LIBPATH': libpath,
        'CPPPATH': [os.path.join(install_dir, 'include')]
    }

env.AddMethod(_cmake_project, 'CMakeProject')
Return('env')

@@ -37,19 +37,22 @@ def _download_file(url: str, path: pathlib.Path) -> None:
    urllib.request.urlretrieve(url, dl_path)
    dl_path.rename(path)
-def _extract_file(path: pathlib.Path, output_dir: str, archive_type: ArchiveType, skip_folders: int) -> None:
+def _extract_file(path: pathlib.Path, output_dir: str, archive_type: ArchiveType, skip_folders: int = 0) -> None:
    if archive_type == ArchiveType.TAR_GZ:
        file = tarfile.open(str(path))
+       filter = tarfile.data_filter
        if skip_folders != 0:
-           def skip_filer(member: tarfile.TarInfo, path: str) -> tarfile.TarInfo:
-               name_parts = member.name.split('/')
+           def skip_filter(member: tarfile.TarInfo, path: str) -> tarfile.TarInfo:
+               name_parts = member.name.split('/', skip_folders)
                if len(name_parts) <= skip_folders:
                    return None
                return member.replace(name = '/'.join(name_parts[skip_folders:]))
-           file.extraction_filter = skip_filer
-       file.extractall(output_dir)
+           filter = skip_filter
+       file.extractall(output_dir, filter=filter)
        file.close()
    elif archive_type == ArchiveType.ZIP:
+       if skip_folders != 0:
+           raise Exception('skip_folders option is not yet supported for zip-archives :()')
        file = zipfile.open(str(path))
        file.extractall(output_dir)
        file.close()
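The member-renaming filter from the new side of this hunk can be exercised without touching an archive on disk. This sketch mutates `member.name` in place rather than using `TarInfo.replace()`, so it also runs on Pythons older than 3.12:

```python
import tarfile

def make_skip_filter(skip_folders: int):
    """Build a tarfile extraction filter that strips the first
    `skip_folders` path components from every member."""
    def skip_filter(member, path):
        name_parts = member.name.split('/', skip_folders)
        if len(name_parts) <= skip_folders:
            return None  # member lies entirely within the stripped prefix
        member.name = name_parts[-1]
        return member
    return skip_filter
```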

addons/old/gitbranch.py Normal file
@@ -0,0 +1,176 @@
from git import Repo
from git.exc import GitError
import hashlib
import inspect
import os
import shutil
from SCons.Script import *

Import('env')

def _clone(env: Environment, repo_name: str, remote_url: str):
    repo_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, '_bare')
    try:
        repo = Repo(repo_dir)
        origin = repo.remotes['origin']
    except GitError:
        print(f'Initializing git repository at {repo_dir}.')
        repo = Repo.init(repo_dir, bare=True)
        origin = repo.create_remote('origin', remote_url)
    return repo, origin

def _git_branch(env: Environment, repo_name: str, remote_url: str, git_ref: str = 'main') -> dict:
    repo, origin = _clone(env, repo_name, remote_url)
    old_worktree_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, hashlib.shake_128(git_ref.encode('utf-8')).hexdigest(6))
    worktree_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, git_ref.replace('/', '_'))
    if os.path.exists(old_worktree_dir) and not os.path.islink(old_worktree_dir):
        if not os.path.exists(worktree_dir):
            print(f'Found old Git worktree at {old_worktree_dir}, moving it to {worktree_dir}.')
            try:
                repo.git.worktree('move', old_worktree_dir, worktree_dir)
            except GitError:
                print('Error while moving worktree, manually moving and repairing it instead.')
                shutil.move(old_worktree_dir, worktree_dir)
                try:
                    repo.git.worktree('repair', worktree_dir)
                except GitError:
                    print('Also didn\'t work, removing and redownloading it.')
                    try:
                        repo.git.worktree('remove', '-f', worktree_dir)
                    except GitError: ...
                    try:
                        repo.git.worktree('remove', '-f', old_worktree_dir)
                    except GitError: ...
                    if os.path.exists(worktree_dir):
                        shutil.rmtree(worktree_dir, ignore_errors=True)
                    # this is all we can do, I guess
        else:
            print(f'Found old Git worktree at {old_worktree_dir}, but the new one at {worktree_dir} already exists. Removing the old one.')
            repo.git.worktree('remove', '-f', old_worktree_dir)
        print('Attempting to create a symlink for older S++ versions.')
        try:
            os.symlink(worktree_dir, old_worktree_dir, target_is_directory=True)
        except Exception as e:
            print(f'Failed: {e}')
    update_submodules = False
    if not os.path.exists(worktree_dir):
        print(f'Checking out into {worktree_dir}.')
        origin.fetch(tags=True, force=True)
        os.makedirs(worktree_dir)
        repo.git.worktree('add', worktree_dir, git_ref)
        worktree_repo = Repo(worktree_dir)
        update_submodules = True
    elif env['UPDATE_REPOSITORIES']:
        worktree_repo = Repo(worktree_dir)
        if not worktree_repo.head.is_detached:
            print(f'Updating git repository at {worktree_dir}')
            worktree_origin = worktree_repo.remotes['origin']
            worktree_origin.pull()
            update_submodules = True
        else:
            print(f'Not updating git repository {worktree_dir} as it is not on a branch.')
    else:
        worktree_repo = Repo(worktree_dir)
    if update_submodules:
        for submodule in worktree_repo.submodules:
            submodule.update(init=True)
        for submodule in worktree_repo.submodules:
            if os.listdir(submodule.abspath) == ['.git']:
                print(f'Submodule {submodule.name} seems borked, attempting to fix it.')
                worktree_repo.git.submodule('deinit', '-f', submodule.path)
                worktree_repo.git.submodule('init', submodule.path)
                worktree_repo.git.submodule('update', submodule.path)
    return {
        'checkout_root': worktree_dir,
        'repo': repo,
        'origin': origin
    }
def _git_tags(env: Environment, repo_name: str, remote_url: str, force_fetch: bool = False) -> 'list[str]':
    repo, origin = _clone(env, repo_name, remote_url)
    if force_fetch or env['UPDATE_REPOSITORIES']:
        try:
            origin.fetch(tags=True)
        except GitError:
            env.Warn(f'Error fetching tags from {repo_name} ({remote_url})')
    return [t.name for t in repo.tags]

def _make_callable(val):
    if callable(val):
        return val
    else:
        def _wrapped(*args, **kwargs):
            return val
        return _wrapped
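`_make_callable` lets recipe authors pass either a plain value or a factory function for things like the repository name. Stripped of context it behaves like this (hypothetical `make_callable` name for the sketch):

```python
def make_callable(val):
    """Return val unchanged if it is callable; otherwise wrap it in a
    function that ignores its arguments and returns val."""
    if callable(val):
        return val
    def _wrapped(*args, **kwargs):
        return val
    return _wrapped
```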
def _git_recipe(env: Environment, globals: dict, repo_name, repo_url, cook_fn, versions = None, tag_pattern = None, tag_fn = None, ref_fn = None, dependencies: dict = {}) -> None:
    _repo_name = _make_callable(repo_name)
    _repo_url = _make_callable(repo_url)
    _tag_pattern = _make_callable(tag_pattern)
    versions_cb = versions and _make_callable(versions)
    dependencies_cb = _make_callable(dependencies)

    def _versions(env: Environment, update: bool = False, options: dict = {}):
        if 'ref' in options:
            return [(0, 0, 0)]  # no versions if compiling from a branch
        pattern_signature = inspect.signature(_tag_pattern)
        kwargs = {}
        if 'options' in pattern_signature.parameters:
            kwargs['options'] = options
        pattern = _tag_pattern(env, **kwargs)
        if pattern:
            tags = env.GitTags(repo_name = _repo_name(env), remote_url = _repo_url(env), force_fetch=update)
            result = []
            for tag in tags:
                match = pattern.match(tag)
                if match:
                    result.append(tuple(int(part) for part in match.groups() if part is not None))
            if len(result) == 0 and not update:
                return _versions(env, update=True)
            return result
        elif versions_cb:
            return versions_cb(env)
        else:
            return [(0, 0, 0)]

    def _dependencies(env: Environment, version, options: dict) -> 'dict':
        dependencies_signature = inspect.signature(dependencies_cb)
        kwargs = {}
        if 'options' in dependencies_signature.parameters:
            kwargs['options'] = options
        return dependencies_cb(env, version, **kwargs)

    def _cook(env: Environment, version, options: dict = {}) -> dict:
        if 'ref' in options:
            git_ref = options['ref']
        elif tag_fn:
            tag_signature = inspect.signature(tag_fn)
            kwargs = {}
            if 'options' in tag_signature.parameters:
                kwargs['options'] = options
            git_ref = f'refs/tags/{tag_fn(version, **kwargs)}'
        else:
            assert ref_fn
            git_ref = ref_fn(env, version)
        repo = env.GitBranch(repo_name = _repo_name(env), remote_url = _repo_url(env), git_ref = git_ref)
        cook_signature = inspect.signature(cook_fn)
        kwargs = {}
        if 'options' in cook_signature.parameters:
            kwargs['options'] = options
        return cook_fn(env, repo, **kwargs)

    globals['versions'] = _versions
    globals['dependencies'] = _dependencies
    globals['cook'] = _cook

env.AddMethod(_git_branch, 'GitBranch')
env.AddMethod(_git_tags, 'GitTags')
env.AddMethod(_git_recipe, 'GitRecipe')
Return('env')
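The `inspect.signature` dance above — forwarding `options` only to callbacks that actually declare that parameter — appears three times in `_git_recipe`. As a stand-alone helper (hypothetical name) it reads:

```python
import inspect

def call_with_optional_options(fn, *args, options=None):
    """Forward `options` only when fn's signature declares that parameter,
    so older callbacks without it keep working unchanged."""
    kwargs = {}
    if 'options' in inspect.signature(fn).parameters:
        kwargs['options'] = options
    return fn(*args, **kwargs)
```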

@@ -0,0 +1,11 @@
import os

Import('env')

def _recipe_repository(env, repo_name: str, remote_url: str, git_ref: str = 'master') -> None:
    repo = env.GitBranch(repo_name = os.path.join('recipe_repos', repo_name), remote_url = remote_url, git_ref = git_ref)
    env.Append(SPP_RECIPES_FOLDERS = [os.path.join(repo['checkout_root'], 'recipes')])

env.AddMethod(_recipe_repository, 'RecipeRepo')
Return('env')

@@ -0,0 +1,13 @@
FROM debian:sid-slim
RUN apt-get -y update && \
apt-get -y upgrade && \
apt-get -y install build-essential clang-19 gcc-14 g++-14 python3 python3-pip \
virtualenv python-is-python3 clang-tidy git ninja-build cmake
RUN ln -s /usr/bin/clang-19 /usr/local/bin/clang \
&& ln -s /usr/bin/clang++-19 /usr/local/bin/clang++ \
&& ln -s /usr/bin/clang-tidy-19 /usr/local/bin/clang-tidy \
&& ln -s /usr/bin/run-clang-tidy-19 /usr/local/bin/run-clang-tidy
COPY scripts /opt/scripts
RUN chmod a+x /opt/scripts/*.sh

@@ -0,0 +1,9 @@
#!/bin/bash
set -xe
python -m virtualenv venv
source venv/bin/activate
pip install scons
pip install -r external/scons-plus-plus/requirements.txt
scons -j$(nproc) --build_type=debug --variant=linux_clang_debug --compiler=clang
scons -j$(nproc) --build_type=release_debug --variant=linux_clang_release_debug --compiler=clang
scons -j$(nproc) --build_type=release --variant=linux_clang_release --compiler=clang

@@ -0,0 +1,9 @@
#!/bin/bash
set -xe
python -m virtualenv venv
source venv/bin/activate
pip install scons
pip install -r external/scons-plus-plus/requirements.txt
scons -j$(nproc) --build_type=debug --variant=linux_gcc_debug --compiler=gcc
scons -j$(nproc) --build_type=release_debug --variant=linux_gcc_release_debug --compiler=gcc
scons -j$(nproc) --build_type=release --variant=linux_gcc_release --compiler=gcc

contrib/vs/spp.targets Normal file
@@ -0,0 +1,53 @@
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <SolutionExt>.sln</SolutionExt>
    <Language>C++</Language>
    <DefaultLanguageSourceExtension>.cpp</DefaultLanguageSourceExtension>
  </PropertyGroup>
  <PropertyGroup>
    <TargetFileName Condition="'$(TargetPath)' != ''">$([System.IO.Path]::GetFileName('$(TargetPath)'))</TargetFileName>
    <TargetDir Condition="'$(TargetPath)' != ''">$([System.IO.Path]::GetDirectoryName('$(TargetPath)'))</TargetDir>
    <OutputPath>$(TargetDir)</OutputPath>
    <LocalDebuggerCommand Condition="'$(LocalDebuggerCommand)' == ''">$(TargetPath)</LocalDebuggerCommand>
    <SConsCommandLine Condition="'$(SConsCommandLine)' == ''">scons</SConsCommandLine>
    <SPPNumProcs Condition="'$(SPPNumProcs)' == ''">$([System.Environment]::ProcessorCount)</SPPNumProcs>
    <SPPBuildType Condition="'$(SPPBuildType)' == ''">debug</SPPBuildType>
    <SPPTargetType Condition="'$(SPPTargetType)' == ''">executable</SPPTargetType>
    <OutDir>$(OutputPath)\</OutDir>
    <IntDir>$(SolutionDir)cache\msbuild\</IntDir>
  </PropertyGroup>
  <Import Project="$(MSBuildToolsPath)\Microsoft.Common.targets" />
  <Target Name="Build" Condition="'$(SPPTargetType)' != 'meta'">
    <Exec Command="$(SConsCommandLine) -j$(SPPNumProcs) --build_type=$(SPPBuildType) --unity=disable $(TargetPath)"
          WorkingDirectory="$(SolutionDir)" />
  </Target>
  <!--<Target Name="Build" Condition="'$(SPPTargetType)' == 'meta'">
    <Message Importance="low" Text="Skipping build for meta target $(ProjectName)" />
  </Target>-->
  <Target Name="Clean" Condition="'$(SPPTargetType)' != 'meta'">
    <Exec Command="$(SConsCommandLine) -c -j$(SPPNumProcs) --build_type=$(SPPBuildType) --unity=disable $(TargetPath)"
          WorkingDirectory="$(SolutionDir)" />
  </Target>
  <!--<Target Name="Clean" Condition="'$(SPPTargetType)' == 'meta'">
    <Message Importance="low" Text="Skipping clean for meta target $(ProjectName)" />
  </Target>-->
  <Target Name="Rebuild" Condition="'$(SPPTargetType)' != 'meta'" DependsOnTargets="Clean;Build" />
  <!--<Target Name="Rebuild" Condition="'$(SPPTargetType)' == 'meta'">
    <Message Importance="low" Text="Skipping rebuild for meta target $(ProjectName)" />
  </Target>-->
  <!-- This target is needed just to suppress "warning NU1503: Skipping restore for project '...'. The project file may be invalid or missing targets
       required for restore." -->
  <Target Name="_IsProjectRestoreSupported" Returns="@(_ValidProjectsForRestore)">
    <ItemGroup>
      <_ValidProjectsForRestore Include="$(MSBuildProjectFullPath)" />
    </ItemGroup>
  </Target>
  <Import Condition="'$(_ImportMicrosoftCppDesignTime)' != 'false'" Project="$(VCTargetsPathActual)\Microsoft.Cpp.DesignTime.targets" />
</Project>

lib/spp.py Normal file
@@ -0,0 +1,55 @@
from dataclasses import dataclass
import enum
from typing import TYPE_CHECKING
from SCons.Script import *

if TYPE_CHECKING:
    class SPPEnvironment(Environment):
        def Info(self, message: str): ...
        def Warn(self, message: str): ...
        def Error(self, message: str): ...
else:
    SPPEnvironment = Environment

@dataclass
class Module:
    name: str
    folder: str
    description: str
    cxx_namespace: str

class TargetType(enum.Enum):
    PROGRAM = 0
    STATIC_LIBRARY = 1
    SHARED_LIBRARY = 2
    MISC = 3

class Target:
    name: str
    target_type: TargetType
    builder = None
    args: list = []
    kwargs: dict = {}
    dependencies: list = []
    target = None
    module: Module = None

@dataclass(frozen=True)
class SPPInterface:
    globals: dict

    @property
    def env(self) -> SPPEnvironment:
        return self.globals['env']

    @property
    def targets(self) -> list[Target]:
        return self.env['SPP_TARGETS']

_spp: SPPInterface

def _init_interface(**kwargs) -> None:
    global _spp
    _spp = SPPInterface(**kwargs)

def get_spp() -> SPPInterface:
    return _spp
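A caveat on `Target` above: its class-level `[]` and `{}` defaults are shared between all instances that never assign their own attribute. A dataclass with `default_factory` (a sketch of the safer alternative, not the file's code) gives each instance fresh containers:

```python
from dataclasses import dataclass, field

@dataclass
class Target:
    name: str = ''
    args: list = field(default_factory=list)          # fresh list per instance
    kwargs: dict = field(default_factory=dict)        # fresh dict per instance
    dependencies: list = field(default_factory=list)  # fresh list per instance
```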

@@ -1,21 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = 'master', own_main: bool = False) -> dict:
    repo = env.GitBranch(repo_name = 'catch2', remote_url = 'https://github.com/catchorg/Catch2.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root)
    lib_name = {
        'debug': 'Catch2d'
    }.get(env['BUILD_TYPE'], 'Catch2')
    libs = [lib_name]
    if not own_main:
        libs.append({
            'debug': 'Catch2Maind'
        }.get(env['BUILD_TYPE'], 'Catch2Main'))
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': libs
    }
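Several of these recipes repeat the same debug-suffix lookup for CMake-built libraries. Factored out (hypothetical `lib_name` helper, not part of any recipe) it is just:

```python
def lib_name(base: str, build_type: str) -> str:
    """CMake-built dependencies get a 'd' suffix in debug builds."""
    return {'debug': base + 'd'}.get(build_type, base)
```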

@@ -1,12 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'main') -> dict:
    repo = env.GitBranch(repo_name = 'ImageMagick', remote_url = 'https://github.com/ImageMagick/ImageMagick.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(checkout_root)
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': ['backtrace']  # note: 'backtrace' looks copied from the libbacktrace recipe
    }

@@ -1,27 +0,0 @@
import os
import platform
from SCons.Script import *

def cook(env: Environment, git_ref: str = "main") -> dict:
    repo = env.GitBranch(repo_name = 'SDL', remote_url = 'https://github.com/libsdl-org/SDL.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root, generate_args = ['-DSDL_STATIC=ON', '-DSDL_SHARED=OFF'])
    libs = []
    if platform.system() == 'Windows':
        if env['BUILD_TYPE'] == 'debug':
            libs.append('SDL2-staticd')
        else:
            libs.append('SDL2-static')
        libs.extend(('kernel32', 'user32', 'gdi32', 'winmm', 'imm32', 'ole32', 'oleaut32', 'version', 'uuid', 'advapi32', 'setupapi', 'shell32', 'dinput8'))
    else:
        if env['BUILD_TYPE'] == 'debug':
            libs.append('SDL2d')
        else:
            libs.append('SDL2')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': [os.path.join(build_result['install_dir'], 'include/SDL2')],  # SDL is really weird about include paths ...
        'LIBS': libs
    }

@@ -1,13 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, remote: str = 'github', git_ref: str = 'main') -> dict:
    if remote == 'mewin':
        repo = env.GitBranch(repo_name = 'VulkanHeaders_mewin', remote_url = 'https://git.mewin.de/mewin/vulkan-headers.git', git_ref = git_ref)
    else:
        repo = env.GitBranch(repo_name = 'VulkanHeaders', remote_url = 'https://github.com/KhronosGroup/Vulkan-Headers.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [os.path.join(checkout_root, 'include')]
    }

@@ -1,10 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'argparse', remote_url = 'https://github.com/p-ranav/argparse.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [os.path.join(checkout_root, 'include')]
    }

@@ -1,12 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, version: str = "1.85.0") -> dict:
    # TODO: build binaries?
    url = f'https://archives.boost.io/release/{version}/source/boost_{version.replace(".", "_")}.tar.gz'
    repo = env.DownloadAndExtract(f'boost_{version}', url = url, skip_folders = 1)
    checkout_root = repo['extracted_root']
    return {
        'CPPPATH': [checkout_root]
    }

@@ -1,9 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'cgltf', remote_url = 'https://github.com/jkuhlmann/cgltf.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [checkout_root]
    }

@@ -1,16 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'fmt', remote_url = 'https://github.com/fmtlib/fmt.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root, generate_args = ['-DFMT_TEST=OFF'])
    lib_name = {
        'debug': 'fmtd'
    }.get(env['BUILD_TYPE'], 'fmt')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [lib_name]
    }

@@ -1,13 +0,0 @@
from SCons.Script import *

def cook(env: Environment, remote: str = 'github', git_ref: str = "master") -> dict:
    if remote == 'mewin':
        repo = env.GitBranch(repo_name = 'glm_mewin', remote_url = 'https://git.mewin.de/mewin/glm.git', git_ref = git_ref)
    else:
        repo = env.GitBranch(repo_name = 'glm', remote_url = 'https://github.com/g-truc/glm.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [checkout_root],
    }

@@ -1,83 +0,0 @@
from SCons.Script import *
import glob
import os
import pathlib
import platform
import shutil
import sys

_SCRIPT_STAMPFILE = '.spp_script_run'

def cook(env: Environment, remote: str = 'github', git_ref: str = '') -> dict:
    if remote == 'mewin':
        repo = env.GitBranch(repo_name = 'glslang_mewin', remote_url = 'https://git.mewin.de/mewin/glslang.git', git_ref = git_ref or 'master')
    else:
        repo = env.GitBranch(repo_name = 'glslang', remote_url = 'https://github.com/KhronosGroup/glslang.git', git_ref = git_ref or 'main')
    checkout_root = repo['checkout_root']
    # TODO: windows?
    did_run_script = os.path.exists(os.path.join(repo['checkout_root'], _SCRIPT_STAMPFILE))
    if not did_run_script or env['UPDATE_REPOSITORIES']:
        python_exe = os.path.realpath(sys.executable)
        script_file = os.path.join(repo['checkout_root'], 'update_glslang_sources.py')
        prev_cwd = os.getcwd()
        os.chdir(repo['checkout_root'])
        if env.Execute(f'"{python_exe}" {script_file}'):
            env.Exit(1)
        os.chdir(prev_cwd)
        pathlib.Path(repo['checkout_root'], _SCRIPT_STAMPFILE).touch()
    # generate the build_info.h
    generator_script = os.path.join(repo['checkout_root'], 'build_info.py')
    generator_script_input = os.path.join(repo['checkout_root'], 'build_info.h.tmpl')
    generator_script_output = os.path.join(repo['checkout_root'], 'glslang/build_info.h')
    env.Command(
        target = generator_script_output,
        source = [generator_script, generator_script_input, os.path.join(repo['checkout_root'], 'CHANGES.md')],
        action = f'"$PYTHON" "{generator_script}" "{repo["checkout_root"]}" -i "{generator_script_input}" -o "$TARGET"'
    )
    platform_source_dir = {
        'Linux': 'Unix',
        'Windows': 'Windows',
        'Darwin': 'Unix'
    }.get(platform.system(), 'Unix')
    glslang_source_files = env.RGlob(os.path.join(repo['checkout_root'], 'glslang/GenericCodeGen/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'glslang/MachineIndependent/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'glslang/OGLCompilersDLL/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'glslang/ResourceLimits/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'SPIRV/'), '*.cpp') \
        + [os.path.join(repo['checkout_root'], f'glslang/OSDependent/{platform_source_dir}/ossource.cpp')]
    # disable a few warnings when compiling with clang
    additional_cxx_flags = {
        'clang': ['-Wno-deprecated-copy', '-Wno-missing-field-initializers', '-Wno-gnu-redeclared-enum',
                  '-Wno-unused-but-set-variable', '-Wno-deprecated-enum-enum-conversion']
    }.get(env['COMPILER_FAMILY'], [])
    env.StaticLibrary(
        CCFLAGS = env['CCFLAGS'] + additional_cxx_flags,
        CPPPATH = repo['checkout_root'],
        target = env['LIB_DIR'] + '/glslang_full',
        source = glslang_source_files
    )
    # build the include folder
    include_dir = os.path.join(checkout_root, 'include')
    if not os.path.exists(include_dir) or env['UPDATE_REPOSITORIES']:
        def copy_headers(dst, src):
            os.makedirs(dst, exist_ok=True)
            for file in glob.glob(os.path.join(src, '*.h')):
                shutil.copy(file, dst)
        copy_headers(os.path.join(include_dir, 'glslang/HLSL'), os.path.join(checkout_root, 'glslang/HLSL'))
        copy_headers(os.path.join(include_dir, 'glslang/Include'), os.path.join(checkout_root, 'glslang/Include'))
        copy_headers(os.path.join(include_dir, 'glslang/MachineIndependent'), os.path.join(checkout_root, 'glslang/MachineIndependent'))
        copy_headers(os.path.join(include_dir, 'glslang/Public'), os.path.join(checkout_root, 'glslang/Public'))
        copy_headers(os.path.join(include_dir, 'glslang/SPIRV'), os.path.join(checkout_root, 'SPIRV'))
    return {
        'CPPPATH': [include_dir],
        'LIBS': ['glslang_full']
    }
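The nested `copy_headers` helper from the glslang recipe is self-contained enough to sketch (and test) outside the recipe:

```python
import glob
import os
import shutil

def copy_headers(dst: str, src: str) -> None:
    """Copy every *.h file from src into dst, creating dst as needed."""
    os.makedirs(dst, exist_ok=True)
    for file in glob.glob(os.path.join(src, '*.h')):
        shutil.copy(file, dst)
```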

@@ -1,30 +0,0 @@
from SCons.Script import *
import os

def cook(env: Environment, backends: list = [], git_ref: str = '') -> dict:
    repo = env.GitBranch(repo_name = 'imgui', remote_url = 'https://github.com/ocornut/imgui.git', git_ref = git_ref or 'master')
    imgui_source_files = [
        os.path.join(repo['checkout_root'], 'imgui.cpp'),
        os.path.join(repo['checkout_root'], 'imgui_draw.cpp'),
        os.path.join(repo['checkout_root'], 'imgui_tables.cpp'),
        os.path.join(repo['checkout_root'], 'imgui_widgets.cpp')
    ]
    imgui_add_sources = []
    backend_sources = {
        'vulkan': os.path.join(repo['checkout_root'], 'backends/imgui_impl_vulkan.cpp'),
        'sdl2': os.path.join(repo['checkout_root'], 'backends/imgui_impl_sdl2.cpp')
    }
    for backend in backends:
        imgui_add_sources.append(backend_sources[backend])
    lib_imgui = env.StaticLibrary(
        CPPPATH = [repo['checkout_root']],
        CPPDEFINES = ['IMGUI_IMPL_VULKAN_NO_PROTOTYPES=1'],
        target = env['LIB_DIR'] + '/imgui',
        source = imgui_source_files,
        add_source = imgui_add_sources
    )
    return lib_imgui

@@ -1,8 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'iwa', remote_url = 'https://git.mewin.de/mewin/iwa.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return SConscript(os.path.join(checkout_root, 'LibConf'), exports = ['env'])

@@ -1,14 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'master') -> dict:
    if env['COMPILER_FAMILY'] not in ('gcc', 'clang'):
        env.Error('libbacktrace requires gcc or clang.')
    repo = env.GitBranch(repo_name = 'libbacktrace', remote_url = 'https://github.com/ianlancetaylor/libbacktrace.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(checkout_root)
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': ['backtrace']
    }

@@ -1,11 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'main') -> dict:
    repo = env.GitBranch(repo_name = 'libjpeg-turbo', remote_url = 'https://github.com/libjpeg-turbo/libjpeg-turbo.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(checkout_root)
    return {
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [env.FindLib('jpeg', paths=build_result['LIBPATH'])],
    }

@@ -1,13 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'master') -> dict:
    lib_z = env.Cook('zlib')
    repo = env.GitBranch(repo_name = 'libpng', remote_url = 'https://git.code.sf.net/p/libpng/code.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(checkout_root)
    return {
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [env.FindLib('png16', paths=build_result['LIBPATH'])],
        'DEPENDENCIES': [lib_z]
    }

@@ -1,10 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'magic_enum', remote_url = 'https://github.com/Neargye/magic_enum.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [os.path.join(checkout_root, 'include')]
    }

@@ -1,14 +0,0 @@
from SCons.Script import *
import os


def cook(env: Environment, git_ref = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'mecab', remote_url = 'https://github.com/taku910/mecab.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(os.path.join(checkout_root, 'mecab'))
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': ['mecab']
    }

View File

@ -1,8 +0,0 @@
import os
from SCons.Script import *


def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'mijin', remote_url = 'https://git.mewin.de/mewin/mijin2.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return SConscript(os.path.join(checkout_root, 'LibConf'), exports = ['env'])

View File

@ -1,12 +0,0 @@
import os
from SCons.Script import *


def cook(env: Environment, git_ref: str = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'mikktspace', remote_url = 'https://github.com/mmikk/MikkTSpace.git', git_ref = git_ref)
    lib_mikktspace = env.StaticLibrary(
        CPPPATH = [repo['checkout_root']],
        target = env['LIB_DIR'] + '/mikktspace',
        source = [os.path.join(repo['checkout_root'], 'mikktspace.c')]
    )
    return lib_mikktspace

View File

@ -1,22 +0,0 @@
from SCons.Script import *


def cook(env: Environment, git_ref: str = 'v1.x', use_external_libfmt = False) -> dict:
    repo = env.GitBranch(repo_name = 'spdlog', remote_url = 'https://github.com/gabime/spdlog.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root)
    lib_name = {
        'debug': 'spdlogd'
    }.get(env['BUILD_TYPE'], 'spdlog')
    cppdefines = ['SPDLOG_COMPILE_LIB=1']
    if use_external_libfmt:
        cppdefines.append('SPDLOG_FMT_EXTERNAL=1')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'CPPDEFINES': cppdefines,
        'LIBS': [lib_name]
    }

View File

@ -1,9 +0,0 @@
from SCons.Script import *


def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'stb', remote_url = 'https://github.com/nothings/stb.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [checkout_root]
    }

View File

@ -1,15 +0,0 @@
from SCons.Script import *


def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'yaml-cpp', remote_url = 'https://github.com/jbeder/yaml-cpp', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root)
    lib_name = {
        'debug': 'yaml-cppd'
    }.get(env['BUILD_TYPE'], 'yaml-cpp')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [lib_name]
    }

View File

@ -1,12 +0,0 @@
import os
from SCons.Script import *


def cook(env: Environment, git_ref: str = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'zlib', remote_url = 'https://github.com/madler/zlib.git', git_ref = git_ref)
    extracted_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=extracted_root)
    return {
        'CPPPATH': [os.path.join(build_result['install_dir'], 'install')],
        'LIBS': [env.FindLib('z', paths=build_result['LIBPATH'])]
    }

View File

@ -1,2 +1,6 @@
GitPython~=3.1.45
psutil~=7.0.0
Jinja2
requests
SCons~=4.9.1
cxxheaderparser~=1.5.4

Binary file not shown.

View File

@ -1,6 +0,0 @@
config = {
    'PROJECT_NAME': 'DUMMY'
}
env = SConscript('../SConscript', exports = ['config'])

86
test/codegen/.gitignore vendored Normal file
View File

@ -0,0 +1,86 @@
# Generated Files
*.refl.hpp
*.generated.*
private/**/*.json
# Project
/.idea/
/.vs/
/.vscode/
/vs_project_files/
*.sln
# Executables
/bin
/bin_*
# Libraries
/lib
/lib_*
# Vulkan API dumps
/api_dump*
# Compile commands
compile_commands.json
# whatever this is
.cache
# ImGui config
/imgui.ini
# Environment setup
/.env
# Build Configuration
/config.py
/config_*.py
# Prerequisites
*.d
# Compiled Object files
*.slo
*.lo
*.o
*.obj
# Precompiled Headers
*.gch
*.pch
# Compiled Dynamic libraries
*.so
*.dylib
*.dll
# Fortran module files
*.mod
*.smod
# Compiled Static libraries
*.lai
*.la
*.a
*.lib
# Executables
*.exe
*.out
*.app
# Debug Info
*.pdb
# for projects that use SCons for building: https://www.scons.org/
.sconsign.dblite
/.sconf_temp
/config.log
# Byte-compiled / optimized python files
__pycache__/
*.py[cod]
# Backup files
*.bak

15
test/codegen/SConstruct Normal file
View File

@ -0,0 +1,15 @@
config = {
    'PROJECT_NAME': 'S++ Codegen Test',
    'SPP_TARGET_VERSION': (1, 1, 0)
}
env = SConscript('../../SConscript', exports = ['config'])

# recipe repo
env.RecipeRepo('mewin', 'https://git.mewin.de/mewin/spp_recipes.git', 'stable')

# app
env = env.Module('private/test/SModule')
env.Finalize()

View File

@ -0,0 +1,50 @@
Import('env')

env.ModuleConfig(
    name = 'Test',
    description = 'Test Module',
    cxx_namespace = 'tst'
)

src_files = Split("""
    main.cpp
    test.cpp
    test.generated.cpp
""")

# env.IncludeGen(src_files,
#     template=env.File('#templates/header.jinja'),
#     include_filter=r'.*\.refl.hpp'
# )
# env.CodeGen('GenSource', inputs = [], template=env.File('#templates/source.jinja'), )
# env.CodeGen(
#     target = 'test.generated.cpp',
#     template = env.File('#templates/source.jinja'),
#     inputs = {'source': 'test.cpp'}
# )

ast_json = env.AstJson(
    target = env.File('test.json'),
    source = 'test.hpp'
)
env.Default(ast_json)

ast_hpp = env.AstJinja(
    target = env.File('test.refl.hpp'),
    source = env.File('test.hpp'),
    template = env.File('#templates/header.jinja')
)

prog_app = env.Program(
    name = 'Test',
    target = env['BIN_DIR'] + '/test',
    source = src_files,
    dependencies = {
    }
)
env.Requires(prog_app.target, ast_hpp)
env.Default(prog_app)

Return('env')

View File

@ -0,0 +1,8 @@
#include "./test.hpp"

int main(int, char**)
{
    tst::printHelloWorld(100);
    return 0;
}

View File

@ -0,0 +1,12 @@
#include "./test.hpp"

#include <print>

namespace tst
{
void printHelloWorld(int param) noexcept
{
    std::println("Hello World! Param is {}.", param);
}
}

View File

@ -0,0 +1,33 @@
#pragma once

#include <vector>

#if __has_include("test.refl.hpp")
#include "test.refl.hpp"
#endif

namespace tst
{

static constexpr int kAnnotVal = 17;

class MyClass
{
private:
    std::vector<int> mInts;
public:
    MyClass();

#if defined(__clang__)
    [[clang::annotate("reflect", "yes, please", kAnnotVal)]]
#endif
    int getVal();
    void setVal(int val);

    static constexpr int kVal = 1;
};

}

namespace tst
{
void printHelloWorld(int param) noexcept;
}

View File

@ -0,0 +1,11 @@
#if !defined(SPP_AST_GEN)
{% for class in ast.get_namespace('tst').classes %}
/*
{{ class.name }}
{% for method in class.methods %}
{{ method.return_type }} {{ method.name }} ({% for param in method.params %} {{ param.type }} {{ param.name }} {% endfor %})
{% endfor %}
{% endfor %}
*/
#endif

View File

@ -0,0 +1,3 @@
{% for cls in source.namespace.classes %}
// {{ cls.class_decl.typename.format() }}
{% endfor %}

81
test/v1_0_0/.gitignore vendored Normal file
View File

@ -0,0 +1,81 @@
# Project
/.idea/
/.vs/
/.vscode/
/vs_project_files/
*.sln
# Executables
/bin
/bin_*
# Libraries
/lib
/lib_*
# Vulkan API dumps
/api_dump*
# Compile commands
compile_commands.json
# whatever this is
.cache
# ImGui config
/imgui.ini
# Environment setup
/.env
# Build Configuration
/config.py
/config_*.py
# Prerequisites
*.d
# Compiled Object files
*.slo
*.lo
*.o
*.obj
# Precompiled Headers
*.gch
*.pch
# Compiled Dynamic libraries
*.so
*.dylib
*.dll
# Fortran module files
*.mod
*.smod
# Compiled Static libraries
*.lai
*.la
*.a
*.lib
# Executables
*.exe
*.out
*.app
# Debug Info
*.pdb
# for projects that use SCons for building: https://www.scons.org/
.sconsign.dblite
/.sconf_temp
/config.log
# Byte-compiled / optimized python files
__pycache__/
*.py[cod]
# Backup files
*.bak

10
test/v1_0_0/SConstruct Normal file
View File

@ -0,0 +1,10 @@
config = {
    'PROJECT_NAME': 'S++ 1.0.0 Test'
}
env = SConscript('../../SConscript', exports = ['config'])
env = env.Module('private/test/SModule')
env.Finalize()

View File

@ -0,0 +1,25 @@
Import('env')

env.ModuleConfig(
    name = 'Test',
    description = 'Test Module',
    cxx_namespace = 'tst'
)

src_files = Split("""
    main.cpp
    test.cpp
""")

prog_app = env.UnityProgram(
    name = 'Test',
    target = env['BIN_DIR'] + '/test',
    source = src_files,
    dependencies = {
        'mijin': {}
    }
)
env.Default(prog_app)

Return('env')

View File

@ -0,0 +1,8 @@
#include "./test.hpp"

int main(int, char**)
{
    tst::printHelloWorld(100);
    return 0;
}

View File

@ -0,0 +1,15 @@
#include "./test.hpp"

#include <mijin/debug/assert.hpp>
#include <print>

namespace tst
{
void printHelloWorld(int param) noexcept
{
    MIJIN_ASSERT(param > 0, "param is not >0 :(");
    std::println("Hello World! Param is {}.", param);
}
}

View File

@ -0,0 +1,7 @@
#pragma once
namespace tst
{
void printHelloWorld(int param) noexcept;
}

View File

@ -0,0 +1,133 @@
import os
import math
from SCons.Script import *
from SCons.Node.FS import File
from SCons import Action

"""
SCons Unity Build Generator

Provides several generators for SCons that combine multiple source files into larger
ones to reduce compilation time, so-called "unity builds". This is achieved by generating
unity source files which in turn include the actual source files, and compiling those using
one of the existing SCons builders.

Usage
-----
In order to use this, just place it inside your `site_scons/site_tools` folder, enable it by
adding "unity_build" to the tools when constructing your Environment and replace invocations
of the Program/Library/SharedLibrary/StaticLibrary builders with their Unity... counterparts:

    env = Environment(tools = ['default', 'unity_build'])
    source_files = ...
    env.UnityProgram(
        target = 'my_program',
        source = source_files,
        ...
    )

The tool will generate a number of unity source files and invoke the Program builder on
them, forwarding any other arguments you passed.

Other Options
-------------
You can control the behaviour of the builder using several Environment options:

    env['UNITY_CACHE_DIR'] = '.unity' # Directory where the unity sources are stored.
                                      # Can be either a string or a Dir() node.
    env['UNITY_MAX_SOURCES'] = 15     # Maximum number of source files per unity file.
    env['UNITY_MIN_FILES'] = env.GetOption('num_jobs')
                                      # Minimum number of unity files to generate (if possible).
                                      # Defaults to the number of jobs passed to SCons.
    env['UNITY_DISABLE'] = False      # Set to True to completely disable unity builds. The commands
                                      # will simply pass through their options to the regular builders.

Additionally, any generator can be passed a `cache_dir` to override the value from the Environment.
"""


def exists(env : Environment):
    return True


def generate(env : Environment):
    env.AddMethod(_make_generator(env.Program), 'UnityProgram')
    env.AddMethod(_make_generator(env.Library), 'UnityLibrary')
    env.AddMethod(_make_generator(env.StaticLibrary), 'UnityStaticLibrary')
    env.AddMethod(_make_generator(env.SharedLibrary), 'UnitySharedLibrary')

    # builder for generating the unity source files
    unity_source_builder = env.Builder(
        action = Action.Action(_generate_unity_file, _generate_unity_file_msg)
    )
    env.Append(BUILDERS = {'UnitySource': unity_source_builder})
    env.SetDefault(UNITY_CACHE_DIR = '.unity')
    env.SetDefault(UNITY_MAX_SOURCES = 15)
    env.SetDefault(UNITY_MIN_FILES = env.GetOption('num_jobs'))
    env.SetDefault(UNITY_DISABLE = False)


def _make_generator(base_generator):
    def generator(env, source, target, cache_dir = None, *args, **kwargs):
        if env['UNITY_DISABLE']:
            return base_generator(target = target, source = source, *args, **kwargs)
        unity_source_files = []
        source_files, other_nodes = _flatten_source(source)
        max_sources_per_file = max(1, math.ceil(len(source_files) / env['UNITY_MIN_FILES']))
        sources_per_file = min(max_sources_per_file, env['UNITY_MAX_SOURCES'])
        num_unity_files = math.ceil(len(source_files) / sources_per_file)
        if not cache_dir:
            cache_dir = env['UNITY_CACHE_DIR']
        if not isinstance(cache_dir, str):
            cache_dir = cache_dir.abspath
        os.makedirs(cache_dir, exist_ok=True)
        target_base_name = os.path.basename(target)
        for idx in range(num_unity_files):
            unity_filename = f'{cache_dir}/{target_base_name}_{idx}.cpp'
            unity_source_files.append(unity_filename)
            begin = sources_per_file*idx
            end = sources_per_file*(idx+1)
            env.UnitySource(
                target = unity_filename,
                source = source_files[begin:end]
            )
        if len(other_nodes) > 0:
            print(f'Excluded {len(other_nodes)} node(s) from Unity build.')
        return [base_generator(target = target, source = unity_source_files + other_nodes, *args, **kwargs)]
    return generator


def _flatten_source(source : list):
    source_files = []
    other_nodes = []
    for ele in source:
        if isinstance(ele, list):
            more_sources, more_other = _flatten_source(ele)
            source_files.extend(more_sources)
            other_nodes.extend(more_other)
        elif isinstance(ele, File):
            source_files.append(ele.abspath)
        elif isinstance(ele, str):
            source_files.append(ele)
        else:
            other_nodes.append(ele)
    return source_files, other_nodes


def _generate_unity_file_msg(target, source, env : Environment):
    assert(len(target) == 1)
    return f'Generating {str(target[0])} from {len(source)} source files.'


def _generate_unity_file(target, source, env : Environment):
    assert(len(target) == 1)
    unity_filename = target[0].abspath
    with open(unity_filename, 'w') as f:
        for source_file in source:
            fpath = source_file.abspath.replace("\\", "\\\\")
            f.write(f'#include "{fpath}"\n')
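The chunking arithmetic in `_make_generator` above can be exercised on its own. This is a minimal standalone sketch (the `plan_unity_chunks` helper is hypothetical, not part of the tool) showing how `UNITY_MIN_FILES` and `UNITY_MAX_SOURCES` interact:

```python
import math

# Mirrors the arithmetic in _make_generator (hypothetical helper, for
# illustration only): spread sources over at least min_files unity files,
# but cap each unity file at max_sources source files.
def plan_unity_chunks(num_sources: int, min_files: int, max_sources: int = 15):
    max_per_file = max(1, math.ceil(num_sources / min_files))
    per_file = min(max_per_file, max_sources)
    num_files = math.ceil(num_sources / per_file)
    return per_file, num_files

# 100 sources, 4 jobs: the per-file cap of 15 wins -> 7 unity files
print(plan_unity_chunks(100, 4))   # (15, 7)
# 8 sources, 4 jobs: the min-files target wins -> 2 sources per file, 4 files
print(plan_unity_chunks(8, 4))     # (2, 4)
```

With few sources the tool favours parallelism (one file per job); with many sources the per-file cap keeps each translation unit from growing unboundedly.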

0
util/__init__.py Normal file
View File

View File

@ -0,0 +1,8 @@
# Default ignored files
/shelf/
/workspace.xml
# Editor-based HTTP Client requests
/httpRequests/
# Datasource local storage ignored files
/dataSources/
/dataSources.local.xml

View File

@ -0,0 +1,35 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CLionExternalBuildManager">
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<target id="{{ generate_uuid('target_' + executable.name + '_' + build_type) }}" name="{{ executable.name }} {{ build_type_name }}" defaultType="TOOL">
<configuration id="{{ generate_uuid('configuration_' + executable.name + '_' + build_type) }}" name="{{ executable.name }} {{ build_type_name }}">
<build type="TOOL">
<tool actionId="Tool_External Tools_{{ executable.name }} {{ build_type_name }}" />
</build>
<clean type="TOOL">
<tool actionId="Tool_External Tools_{{ executable.name }} {{ build_type_name }} Clean" />
</clean>
</configuration>
</target>
{% endfor %}
{% endfor %}
{% for library in project.libraries %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<target id="{{ generate_uuid('target_' + library.name + '_' + build_type) }}" name="{{ library.name }} {{ build_type_name }}" defaultType="TOOL">
<configuration id="{{ generate_uuid('configuration_' + library.name + '_' + build_type) }}" name="{{ library.name }} {{ build_type_name }}">
<build type="TOOL">
<tool actionId="Tool_External Tools_{{ library.name }} {{ build_type_name }}" />
</build>
<clean type="TOOL">
<tool actionId="Tool_External Tools_{{ library.name }} {{ build_type_name }} Clean" />
</clean>
</configuration>
</target>
{% endfor %}
{% endfor %}
</component>
</project>

View File

@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CompDBSettings">
<option name="linkedExternalProjectsSettings">
<CompDBProjectSettings>
<option name="externalProjectPath" value="$PROJECT_DIR$" />
<option name="modules">
<set>
<option value="$PROJECT_DIR$" />
</set>
</option>
</CompDBProjectSettings>
</option>
</component>
<component name="CompDBWorkspace" PROJECT_DIR="$PROJECT_DIR$" />
<component name="ExternalStorageConfigurationManager" enabled="true" />
</project>

View File

@ -0,0 +1,40 @@
<toolSet name="External Tools">
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<tool name="{{ executable.name }} {{ build_type_name }}" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="-j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) }} compile_commands.json" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
<tool name="{{ executable.name }} {{ build_type_name }} Clean" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="--build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) }} -c" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
{% endfor %}
{% endfor %}
{% for library in project.libraries %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<tool name="{{ library.name }} {{ build_type_name }}" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="-j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) }} compile_commands.json" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
<tool name="{{ library.name }} {{ build_type_name }} Clean" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="--build_type={{ build_type }} --unity=disable {{ library.filename(build_type) }} -c" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
{% endfor %}
{% endfor %}
</toolSet>

View File

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="" vcs="Git" />
</component>
</project>

View File

@ -0,0 +1,109 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="AutoImportSettings">
<option name="autoReloadType" value="SELECTIVE" />
</component>
<component name="CMakeRunConfigurationManager">
<generated />
</component>
<component name="CMakeSettings">
<configurations>
<configuration PROFILE_NAME="Debug" ENABLED="true" CONFIG_NAME="Debug" />
</configurations>
</component>
<component name="ClangdSettings">
<option name="formatViaClangd" value="false" />
</component>
<component name="CompDBLocalSettings">
<option name="availableProjects">
<map>
<entry>
<key>
<ExternalProjectPojo>
<option name="name" value="{{ project.name }}" />
<option name="path" value="$PROJECT_DIR$" />
</ExternalProjectPojo>
</key>
<value>
<list>
<ExternalProjectPojo>
<option name="name" value="{{ project.name }}" />
<option name="path" value="$PROJECT_DIR$" />
</ExternalProjectPojo>
</list>
</value>
</entry>
</map>
</option>
<option name="projectSyncType">
<map>
<entry key="$PROJECT_DIR$" value="RE_IMPORT" />
</map>
</option>
</component>
<component name="ExternalProjectsData">
<projectState path="$PROJECT_DIR$">
<ProjectState />
</projectState>
</component>
<component name="Git.Settings">
<option name="RECENT_GIT_ROOT_PATH" value="$PROJECT_DIR$" />
</component>
<component name="ProjectColorInfo">{
&quot;associatedIndex&quot;: 5
}</component>
<component name="ProjectViewState">
<option name="hideEmptyMiddlePackages" value="true" />
<option name="showLibraryContents" value="true" />
</component>
<component name="PropertiesComponent"><![CDATA[{
"keyToString": {
{% for executable in project.executables -%}
{% for build_type in project.build_types -%}
{% set build_type_name = build_type | capitalize -%}
"Custom Build Application.{{ executable.name }} {{ build_type_name }}.executor": "Debug",
{% endfor -%}
{% endfor -%}
"RunOnceActivity.RadMigrateCodeStyle": "true",
"RunOnceActivity.ShowReadmeOnStart": "true",
"RunOnceActivity.cidr.known.project.marker": "true",
"RunOnceActivity.readMode.enableVisualFormatting": "true",
"cf.first.check.clang-format": "false",
"cidr.known.project.marker": "true",
"git-widget-placeholder": "master",
"node.js.detected.package.eslint": "true",
"node.js.detected.package.tslint": "true",
"node.js.selected.package.eslint": "(autodetect)",
"node.js.selected.package.tslint": "(autodetect)",
"nodejs_package_manager_path": "npm",
"settings.editor.selected.configurable": "CLionExternalConfigurable",
"vue.rearranger.settings.migration": "true"
}
}]]></component>
<component name="RunManager" selected="Custom Build Application.{% if project.executables|length > 0 %}{{ project.executables[0].name }}{% else %}{{ project.libraries[0].name }}{% endif %} {{ project.build_types[0] }}">
{% for executable in project.executables -%}
{% for build_type in project.build_types -%}
{% set build_type_name = build_type | capitalize -%}
<configuration name="{{ executable.name }} {{ build_type_name }}" type="CLionExternalRunConfiguration" factoryName="Application" REDIRECT_INPUT="false" ELEVATE="false" USE_EXTERNAL_CONSOLE="false" EMULATE_TERMINAL="false" WORKING_DIR="file://$ProjectFileDir$" PASS_PARENT_ENVS_2="true" PROJECT_NAME="{{ project.name }}" TARGET_NAME="{{ executable.name }} {{ build_type_name }}" CONFIG_NAME="{{ executable.name }} {{ build_type_name }}" RUN_PATH="$PROJECT_DIR$/{{ executable.filename(build_type) }}">
<method v="2">
<option name="CLION.EXTERNAL.BUILD" enabled="true" />
</method>
</configuration>
{% endfor -%}
{% endfor -%}
{% for library in project.libraries -%}
{% for build_type in project.build_types -%}
{% set build_type_name = build_type | capitalize -%}
<configuration name="{{ library.name }} {{ build_type_name }}" type="CLionExternalRunConfiguration" factoryName="Application" REDIRECT_INPUT="false" ELEVATE="false" USE_EXTERNAL_CONSOLE="false" EMULATE_TERMINAL="false" PASS_PARENT_ENVS_2="true" PROJECT_NAME="{{ project.name }}" TARGET_NAME="{{ library.name }} {{ build_type_name }}" CONFIG_NAME="{{ library.name }} {{ build_type_name }}">
<method v="2">
<option name="CLION.EXTERNAL.BUILD" enabled="true" />
</method>
</configuration>
{% endfor -%}
{% endfor -%}
</component>
<component name="SpellCheckerSettings" RuntimeDictionaries="0" Folders="0" CustomDictionaries="0" DefaultDictionary="application-level" UseSingleDictionary="true" transferred="true" />
<component name="TypeScriptGeneratedFilesManager">
<option name="version" value="3" />
</component>
</project>

View File

@ -0,0 +1,24 @@
"""
Scons++ Command Line Interface
"""
import argparse
import logging

from .ccjson import make_ccjson_parser

_STDOUT_LOG_FORMAT = '%(message)s'


def run_spp_cmd() -> int:
    parser = argparse.ArgumentParser()
    parser.add_argument('--verbose', '-v', action='store_true')
    subparsers = parser.add_subparsers(required=True)
    make_ccjson_parser(subparsers)
    args = parser.parse_args()
    logging.basicConfig(format=_STDOUT_LOG_FORMAT, level=logging.DEBUG if args.verbose else logging.INFO)
    args.handler(args)
    return 0
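The dispatch pattern above (each subcommand module registers a handler via `set_defaults(handler=...)`, and `run_spp_cmd` just invokes `args.handler(args)`) can be sketched standalone; the `echo` subcommand here is hypothetical, used only for illustration:

```python
import argparse

# Hypothetical subcommand registration, mirroring make_ccjson_parser:
# the subparser stores its handler on the parsed namespace.
def make_echo_parser(subparsers) -> None:
    parser = subparsers.add_parser('echo')
    parser.add_argument('word')
    parser.set_defaults(handler=lambda args: f'echo: {args.word}')

parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers(required=True)
make_echo_parser(subparsers)

# Top-level dispatch: parse, then call whatever handler the chosen
# subcommand registered.
args = parser.parse_args(['echo', 'hello'])
result = args.handler(args)
print(result)  # echo: hello
```

This keeps the top level free of per-command logic; adding a command is just another `make_*_parser(subparsers)` call.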

View File

@ -0,0 +1,18 @@
import argparse

from .common import exec_spp, get_config_cache, require_project_file


def _cmd(args: argparse.Namespace) -> None:
    require_project_file()
    build_type = args.build_type
    if build_type == 'auto':
        cache = get_config_cache()
        build_type = cache.get('build_type', 'debug')
    exec_spp((f'--build_type={build_type}', '--unity=disable', 'compile_commands.json'))


def make_ccjson_parser(subparsers) -> None:
    parser : argparse.ArgumentParser = subparsers.add_parser('ccjson', help='Generate compile_commands.json')
    parser.set_defaults(handler=_cmd)
    parser.add_argument('--build_type', choices=('auto', 'debug', 'release_debug', 'release', 'profile'), default='auto')

View File

@ -0,0 +1,51 @@
import json
import logging
from pathlib import Path
import shlex
import subprocess
import sys
from typing import Sequence

_project_root = Path('.').absolute()


def get_project_root() -> Path:
    return _project_root


def set_project_root(path: Path) -> None:
    global _project_root
    _project_root = path


def get_config_cache() -> dict:
    cache_file = get_project_root() / 'cache' / 'config_cache.json'
    if not cache_file.exists():
        return {}
    try:
        with cache_file.open('r') as f:
            cache = json.load(f)
        if not isinstance(cache, dict):
            logging.warning('Config cache is not a dictionary, ignoring it.')
            return {}
        return cache
    except Exception as e:
        logging.error(f'Error while reading config cache: {e}.')
        return {}


def require_project_file() -> None:
    if not (get_project_root() / 'SConstruct').exists():
        logging.error('This command has to be run inside an existing S++ project folder. Exiting.')
        sys.exit(1)


def exec_checked(args: Sequence[str], **kwargs) -> None:
    logging.debug('exec_checked: "%s"', shlex.join(args))
    subprocess.run(args, stdout=sys.stdout, stderr=sys.stderr, check=True, **kwargs)


def exec_get_output(args: Sequence[str], **kwargs) -> str:
    logging.debug('exec_get_output: "%s"', shlex.join(args))
    return subprocess.run(args, text=True, check=True, capture_output=True, **kwargs).stdout


def exec_spp(args: Sequence[str], **kwargs):
    full_cmd = ('scons', '-s', '--disable_auto_update', *args)
    exec_checked(full_cmd, **kwargs)
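The cache lookup above degrades gracefully: a missing or malformed cache file yields an empty dict instead of an error. A standalone sketch of that behaviour (`read_cache` is a hypothetical stand-in for `get_config_cache` that takes the root explicitly, so it can run against a temporary directory):

```python
import json
import tempfile
from pathlib import Path

# Hypothetical stand-in for get_config_cache: same fallback semantics,
# but the project root is passed in rather than taken from module state.
def read_cache(root: Path) -> dict:
    cache_file = root / 'cache' / 'config_cache.json'
    if not cache_file.exists():
        return {}
    try:
        data = json.loads(cache_file.read_text())
        return data if isinstance(data, dict) else {}
    except Exception:
        return {}

root = Path(tempfile.mkdtemp())
empty = read_cache(root)                  # no cache file yet -> {}
(root / 'cache').mkdir()
(root / 'cache' / 'config_cache.json').write_text('{"build_type": "release"}')
cached = read_cache(root)                 # -> {'build_type': 'release'}
```

This is what lets `ccjson --build_type=auto` fall back to `'debug'` on a fresh checkout while reusing the last configured build type otherwise.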

6
util/run_scons.py Normal file
View File

@ -0,0 +1,6 @@
# use this to start SCons from the IDE for debugging
import sys

from SCons.Script.Main import main

if __name__ == '__main__':
    sys.exit(main())

10
util/spp_cmd.py Executable file
View File

@ -0,0 +1,10 @@
#!/usr/bin/env python3
import os
import sys

if __name__ == '__main__':
    sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), 'python_module'))
    from sppcmd import run_spp_cmd
    sys.exit(run_spp_cmd())

View File

@ -0,0 +1,48 @@
Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.10.35122.118
MinimumVisualStudioVersion = 10.0.40219.1
{%- for executable in project.executables %}
Project("{{ generate_uuid(project.name, True) }}") = "{{ executable.name }}", "vs_project_files\{{ executable.name }}.vcxproj", "{{ generate_uuid('target_' + executable.name, True) }}"
EndProject
{%- endfor %}
{%- for library in project.libraries %}
Project("{{ generate_uuid(project.name, True) }}") = "{{ library.name }}", "vs_project_files\{{ library.name }}.vcxproj", "{{ generate_uuid('target_' + library.name, True) }}"
EndProject
{%- endfor %}
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{{ generate_uuid('solution_items', True) }}"
ProjectSection(SolutionItems) = preProject
SConstruct = SConstruct
EndProjectSection
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
{%- for build_type in project.build_types %}
{%- set build_type_name = build_type | capitalize %}
{{ build_type_name }}|x64 = {{ build_type_name }}|x64
{%- endfor %}
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{%- for executable in project.executables %}
{%- for build_type in project.build_types %}
{%- set build_type_name = build_type | capitalize %}
{{ generate_uuid('target_' + executable.name, True) }}.{{ build_type_name }}|x64.ActiveCfg = {{ build_type_name }}|x64
{{ generate_uuid('target_' + executable.name, True) }}.{{ build_type_name }}|x64.Build.0 = {{ build_type_name }}|x64
{%- endfor %}
{%- endfor %}
{%- for library in project.libraries %}
{%- for build_type in project.build_types %}
{%- set build_type_name = build_type | capitalize %}
{{ generate_uuid('target_' + library.name, True) }}.{{ build_type_name }}|x64.ActiveCfg = {{ build_type_name }}|x64
{{ generate_uuid('target_' + library.name, True) }}.{{ build_type_name }}|x64.Build.0 = {{ build_type_name }}|x64
{%- endfor %}
{%- endfor %}
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {{ generate_uuid("solution", True) }}
EndGlobalSection
EndGlobal

View File

@ -0,0 +1,15 @@
{
"files": {
"solution.sln.jinja": {
"rename_to": "{{ project.name }}.sln"
},
"vs_project_files/target.vcxproj.jinja": {
"one_per": "target",
"rename_to": "vs_project_files/{{ target.name }}.vcxproj"
},
"vs_project_files/target.vcxproj.filters.jinja": {
"one_per": "target",
"rename_to": "vs_project_files/{{ target.name }}.vcxproj.filters"
}
}
}

View File

@ -0,0 +1,73 @@
{%- set source_files = get_sources(target) -%}
{%- set private_headers = get_headers('private\\' + target.module.folder) -%}
{%- set public_headers = get_headers('public\\' + target.module.folder) -%}
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="17.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<Filter Include="Source Files">
<UniqueIdentifier>{{ generate_uuid('filter_sources_' + target.name, True) }}</UniqueIdentifier>
</Filter>
{%- for folder in source_files | folder_list(2) | sort %}
<Filter Include="Source Files\{{ folder }}">
<UniqueIdentifier>{{ generate_uuid('filter_sources_' + target.name + '_' + folder, True) }}</UniqueIdentifier>
</Filter>
{%- endfor %}
{%- if public_headers | length > 0 %}
<Filter Include="Public Header Files">
<UniqueIdentifier>{{ generate_uuid('filter_public_headers_' + target.name, True) }}</UniqueIdentifier>
</Filter>
{%- for folder in public_headers | folder_list(2) | sort %}
<Filter Include="Public Header Files\{{ folder }}">
<UniqueIdentifier>{{ generate_uuid('filter_public_headers_' + target.name + '_' + folder, True) }}</UniqueIdentifier>
</Filter>
{%- endfor %}
{%- endif %}
{%- if private_headers | length > 0 %}
<Filter Include="Private Header Files">
<UniqueIdentifier>{{ generate_uuid('filter_private_headers_' + target.name, True) }}</UniqueIdentifier>
</Filter>
{%- for folder in private_headers | folder_list(2) | sort %}
<Filter Include="Private Header Files\{{ folder }}">
<UniqueIdentifier>{{ generate_uuid('filter_private_headers_' + target.name + '_' + folder, True) }}</UniqueIdentifier>
</Filter>
{%- endfor %}
{%- endif %}
</ItemGroup>
<ItemGroup>
{%- for source_file in source_files %}
<ClCompile Include="$(SolutionDir){{ source_file }}">
{%- set path = source_file | strip_path_prefix(2) | dirname -%}
{%- if path %}
<Filter>Source Files\{{ path }}</Filter>
{%- else %}
<Filter>Source Files</Filter>
{%- endif %}
</ClCompile>
{%- endfor %}
</ItemGroup>
<ItemGroup>
{%- for header_file in public_headers %}
<ClInclude Include="$(SolutionDir){{ header_file }}">
{%- set path = header_file | strip_path_prefix(2) | dirname -%}
{%- if path %}
<Filter>Public Header Files\{{ path }}</Filter>
{%- else %}
<Filter>Public Header Files</Filter>
{%- endif %}
</ClInclude>
{%- endfor %}
{%- for header_file in private_headers %}
<ClInclude Include="$(SolutionDir){{ header_file }}">
{%- set path = header_file | strip_path_prefix(2) | dirname -%}
{%- if path %}
<Filter>Private Header Files\{{ path }}</Filter>
{%- else %}
<Filter>Private Header Files</Filter>
{%- endif %}
</ClInclude>
{%- endfor %}
</ItemGroup>
<ItemGroup>
<Content Include="$(SolutionDir)private\{{ target.module.folder }}\SModule" />
</ItemGroup>
</Project>

View File

@ -0,0 +1,67 @@
{%- set ms_cxx_standard = {
'c++14': 'c++14',
'c++17': 'c++17',
'c++20': 'c++20',
'c++23': 'c++latest',
'c++26': 'c++latest'}[project.cxx_standard] | default('c++14')
-%}
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="17.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
{%- for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<ProjectConfiguration Include="{{ build_type_name }}|x64">
<Configuration>{{ build_type_name }}</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
{%- endfor %}
</ItemGroup>
<PropertyGroup Label="Globals">
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<ProjectGuid>{{ generate_uuid('target_' + target.name, True) }}</ProjectGuid>
<ProjectName>{{ target.name }}</ProjectName>
<SConsCommandLine>{{ scons_exe }}</SConsCommandLine>
</PropertyGroup>
{%- for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<PropertyGroup Condition="'$(Configuration)'=='{{ build_type_name }}'">
<TargetPath>$(SolutionDir){{ target.filename(build_type) }}</TargetPath>
<SPPBuildType>{{ build_type }}</SPPBuildType>
<SPPTargetType>{{ target.type }}</SPPTargetType>
</PropertyGroup>
{%- endfor %}
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Label="Configuration">
<ConfigurationType>Makefile</ConfigurationType>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ItemGroup>
{%- for source_file in get_sources(target) %}
<ClCompile Include="$(SolutionDir){{ source_file }}" />
{%- endfor %}
</ItemGroup>
<ItemGroup>
{%- for header_file in get_headers('private\\' + target.module.folder) %}
<ClInclude Include="$(SolutionDir){{ header_file }}" />
{%- endfor %}
{%- for header_file in get_headers('public\\' + target.module.folder) %}
<ClInclude Include="$(SolutionDir){{ header_file }}" />
{%- endfor %}
</ItemGroup>
<ItemGroup>
<Content Include="$(SolutionDir)private\{{ target.module.folder }}\SModule" />
</ItemGroup>
{%- for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<ItemDefinitionGroup Condition="'$(Configuration)'=='{{ build_type_name }}'">
<ClCompile>
<PreprocessorDefinitions>{{ get_target_property(build_type, target.name, 'CPPDEFINES') | join(';') }};%(PreprocessorDefinitions);</PreprocessorDefinitions>
<GenerateDebugInformation>{{ build_type != 'release' and 'true' or 'false' }}</GenerateDebugInformation>
<AdditionalIncludeDirectories>{{ get_target_property(build_type, target.name, 'CPPPATH') | join(';') }};%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<DisableLanguageExtensions>true</DisableLanguageExtensions>
<AdditionalOptions>/std:{{ ms_cxx_standard }} {{ get_target_property(build_type, target.name, 'CCFLAGS') | join(' ') }}</AdditionalOptions> {# TODO: also append get_target_property(build_type, target.name, 'CXXFLAGS') #}
</ClCompile>
</ItemDefinitionGroup>
{%- endfor %}
<Import Project="$(SolutionDir)external\scons-plus-plus\contrib\vs\spp.targets" />
</Project>
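The `ms_cxx_standard` block at the top of this template collapses newer standards onto `c++latest`, because MSVC only accepts `/std:c++14`, `/std:c++17`, `/std:c++20`, and `/std:c++latest`. The same translation can be sketched on the Python side (helper name hypothetical), with the unknown-value fallback to `c++14` mirroring the template's `default` filter:

```python
# Mirrors the template's ms_cxx_standard mapping: MSVC has no dedicated
# /std: switch for C++23/26 yet, so both map to c++latest.
_MS_CXX_STANDARD = {
    'c++14': 'c++14',
    'c++17': 'c++17',
    'c++20': 'c++20',
    'c++23': 'c++latest',
    'c++26': 'c++latest',
}

def msvc_std_flag(cxx_standard: str) -> str:
    # Unknown or missing values fall back to c++14, just like the
    # Jinja `default('c++14')` filter in the template.
    return '/std:' + _MS_CXX_STANDARD.get(cxx_standard, 'c++14')
```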

@ -0,0 +1,6 @@
{
"recommendations": [
"ms-vscode.cpptools",
"llvm-vs-code-extensions.vscode-clangd"
]
}

@ -0,0 +1,21 @@
{
"configurations": [
{%- for executable in project.executables -%}
{%- for build_type in project.build_types -%}
{%- set build_type_name = build_type | capitalize %}
{
"name": "{{ executable.name }} ({{ build_type | capitalize }})",
"type": "cppvsdbg",
"request": "launch",
"program": "{{ executable.filename(build_type) | escape_path }}",
"args": [],
"stopAtEntry": false,
"cwd": "${workspaceFolder}",
"environment": [],
"console": "integratedTerminal",
"preLaunchTask": "{{ executable.name }} {{ build_type_name }}"
},
{%- endfor %}
{%- endfor %}
]
}
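The `escape_path` filter applied to `program` and `command` above must turn Windows paths (with their backslashes) into strings that are safe inside an already-quoted JSON value. A minimal sketch under that assumption (the project's real filter may differ):

```python
import json

def escape_path(path: str) -> str:
    # json.dumps handles backslashes and embedded quotes correctly;
    # strip the surrounding quotes because the template supplies its own.
    return json.dumps(path)[1:-1]
```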

@ -0,0 +1,69 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"tasks": [
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
{
"label": "{{ executable.name }} {{ build_type_name }}",
"type": "shell",
"command": "{{ scons_exe | escape_path }} -j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) | escape_path }} compile_commands.json",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{
"label": "{{ executable.name }} {{ build_type_name }} Clean",
"type": "shell",
"command": "{{ scons_exe | escape_path }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) | escape_path }} -c",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{% endfor %}
{% endfor %}
{% for library in project.libraries %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
{
"label": "{{ library.name }} {{ build_type_name }}",
"type": "shell",
"command": "{{ scons_exe | escape_path }} -j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) | escape_path }} compile_commands.json",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{
"label": "{{ library.name }} {{ build_type_name }} Clean",
"type": "shell",
"command": "{{ scons_exe | escape_path }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) | escape_path }} -c",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{% endfor %}
{% endfor %}
]
}
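The `command` strings above interpolate `scons_exe`, `nproc`, and the target path, with build tasks also regenerating `compile_commands.json` and clean tasks passing `-c`. A hedged sketch of how such a command line could be assembled on the Python side (the helper name is hypothetical; only the flags visible in the template are assumed):

```python
import os

def build_command(scons_exe: str, build_type: str, target_path: str,
                  clean: bool = False) -> str:
    # Mirrors the tasks.json commands: a parallel build that also refreshes
    # compile_commands.json, or a sequential clean (-c) pass.
    nproc = os.cpu_count() or 1
    if clean:
        return (f"{scons_exe} --build_type={build_type} --unity=disable "
                f"{target_path} -c")
    return (f"{scons_exe} -j{nproc} --build_type={build_type} "
            f"--unity=disable {target_path} compile_commands.json")
```

Unity builds are disabled in both variants so that per-file diagnostics and the generated `compile_commands.json` line up with the sources the editor actually shows.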