Compare commits

...

81 Commits

Author SHA1 Message Date
9436d2c48d Added JINJA_FILE_SEARCHPATH for Jinja file functions to make it easier for library projects to find their files. 2025-03-28 14:52:14 +01:00
2769fd801f Added file_size and file_content_hex Jinja filters. Added check if Jinja module exists and appropriate error message to project generation. 2025-03-28 11:35:59 +01:00
Patrick Wuttke 5f11d64744 Fixed generation of VSCode launch.json. 2025-03-18 09:45:21 +01:00
1401fdea90 Next attempt of adding build type to executable and library names. 2025-03-14 22:02:40 +01:00
Patrick Wuttke 3a3c79d240 Revert "Updated to include build type and variant in binary names, so they don't need to be rebuilt everytime the configuration is changed."
Modifying the suffix variables broke the library file detection, at least on Windows.

This reverts commit e6e7dbe642.
2025-03-14 09:49:21 +01:00
e6e7dbe642 Updated to include build type and variant in binary names, so they don't need to be rebuilt everytime the configuration is changed. 2025-03-13 23:42:58 +01:00
1edf745b39 Clone recipe repositories into a separate subfolder. 2025-03-13 23:01:09 +01:00
9e5fc05f4f Introduced recipe repositories. 2025-03-13 22:58:49 +01:00
d03b7097f7 Fixed SQLite build on Linux. 2025-03-13 18:36:17 +01:00
ba5def01b0 Fixed imgui-node-editor build when using GCC. 2025-03-13 18:36:07 +01:00
Patrick Wuttke 4082f8cb22 Added build type to object suffix so switching between configs is a lot quicker. 2025-03-13 15:00:24 +01:00
Patrick Wuttke 0bec9f7062 Added project template for VSCode (only Windows for now). 2025-03-13 10:42:48 +01:00
283aa0f99c Added way for dependencies to use options and fixed compilation of ImGui with SDL3 backend. 2025-03-13 09:57:39 +01:00
71f8631e48 Added recipe for portable-file-dialogs. 2025-03-12 15:50:16 +01:00
ae739b9946 Fixed Release and Release_Debug build on windows. 2025-03-11 17:46:57 +01:00
2cc3145669 Added recipe for ImGui Node Editor. 2025-03-11 17:46:39 +01:00
5159c80167 Added recipe for SQLite on Windows (still WIP). 2025-03-11 09:35:38 +01:00
748f8f229f (WIP) SQLite recipe. 2025-03-10 09:55:34 +01:00
b034e87aaf Properly disable exceptions in MSVC. 2025-03-07 11:00:17 +01:00
726d206f9a Disabled tests for fmt. 2025-03-07 10:59:41 +01:00
5debc6d334 Added recipe for icon font cpp headers. 2025-03-07 10:48:56 +01:00
5c6a8a1cc6 Fixed issue with dependency injection not working due to the 'add target via name in dependencies' shortcut :/. 2025-03-02 22:20:21 +01:00
9d3d52ffaf Added recipe for RAID. 2025-03-02 22:19:22 +01:00
c994752c32 Added option for ImGui recipe to build from a docking build and to provide backends to compile via options. 2025-03-02 16:00:48 +01:00
42dada23c9 Added CXX_NO_EXCEPTIONS config to disable exceptions. 2025-03-02 16:00:12 +01:00
0e22057fc4 Added recipe for sdlpp. 2025-02-14 22:37:27 +01:00
c883f3c1c7 Added DXC recipe. 2025-01-19 01:17:35 +01:00
bca2398828 Added recipes for some Linux system libraries. 2025-01-12 13:11:43 +01:00
10a5239b7f Fixed default value for enable_asan. 2025-01-09 17:56:28 +01:00
0b8d115907 Added option for address sanitization on MSVC. 2024-12-26 14:48:18 +01:00
134fb106a8 Added cxxflags to cmake version hash calculation. 2024-12-13 23:39:27 +01:00
b546931c09 Added libraries to automatic CLion project generation. 2024-12-13 00:52:10 +01:00
7c4e403747 Fixed up Clang warnings. 2024-12-08 22:38:52 +01:00
c24c864915 Disabled Clang 'missing-field-initializers' warning. 2024-12-06 20:36:37 +01:00
fe95a92cf6 Added GCC build script. 2024-12-06 09:46:54 +01:00
1bdc1ef7a0 Used latest compiler version. 2024-12-05 17:03:44 +01:00
ff219840ea Added chmod to make scripts executable. 2024-12-05 16:59:17 +01:00
ee55878b18 Added build script stuff. 2024-12-05 16:31:59 +01:00
55893909f0 Added requests to requirements. 2024-11-21 00:08:30 +01:00
349a08b084 Added force parameter when fixing git tags so stuff compiles even if tags are updated. Also added the (yet missing) nghttp2 lib as a dependency to CURL if a version >= 8.10.0 is compiled. 2024-11-20 20:23:14 +01:00
5490de84eb Fixed Lua compilation on Windows. 2024-11-14 10:02:14 +01:00
70fc36335a Added GSL recipe. 2024-11-09 21:36:32 +01:00
ca1466469f Fixed lua recipe. 2024-11-08 12:36:59 +01:00
42a433582c Added Lua recipe. 2024-11-05 10:08:42 +01:00
0dcd8b383c Added DX12 recipe. 2024-10-30 11:10:30 +01:00
946cfc57ce Fixed wrong project name used when generating CLion project. 2024-10-27 13:52:00 +01:00
28c1675b87 Fixed typo. 2024-10-25 09:46:09 +02:00
7345305a77 Fixed CURL and Boost compilation on Windows. 2024-10-25 09:42:03 +02:00
651c8770c3 Added recipe for json and updated yaml-cpp recipe. 2024-10-25 08:37:07 +02:00
071cd207a9 Removed some old code. 2024-10-23 23:55:59 +02:00
a479e90335 Fixed compilation with MSVC. 2024-10-23 23:48:46 +02:00
cdbec36b8f Added recipes for curl, libidn2, libpsl and libunistring. 2024-10-23 23:48:46 +02:00
f2dc9872f7 Automatically apply patch when compiling SDL3. 2024-10-13 13:36:01 +02:00
2b05834798 Allow specifying a library from the current project as a dependency. 2024-10-12 12:30:21 +02:00
e3b3fd8f7c Save and reuse UUIDs between project generations. 2024-10-11 16:57:31 +02:00
329278c5f5 Fixed generated CLion project for platforms that use PROGPREFIX or PROGSUFFIX (e.g. Windows). 2024-10-11 10:21:21 +02:00
a0bbb46e51 Cleaned up/updated CLion project generation code and templates. 2024-10-10 23:23:29 +02:00
c6bba0e440 Added generation of CLion projects. 2024-10-10 00:06:58 +02:00
0a29b41639 Fixed (hopefully) the depends functionality. 2024-09-20 00:14:17 +02:00
a7c736de56 Added recipe for rectpack2D. 2024-09-19 14:57:37 +02:00
b45fec7561 Added enable_hlsl option to glslang. 2024-09-16 09:50:38 +02:00
c74fbb8798 Added recipe for tinyobjloader. 2024-09-15 14:45:47 +02:00
fe8f329b38 Fixed library name for debug SDL3 on Windows. 2024-09-12 15:09:06 +02:00
ae870e798d Added SHOW_INCLUDES debug setting (only MSVC for now). 2024-09-12 15:09:06 +02:00
941f94a7b6 Adjusted SDL build script to work with SDL3 (kind of). 2024-09-11 09:00:26 +02:00
1b8d6e175b Added options to version specs and allow specifying a git ref to build instead of a version. 2024-09-11 09:00:03 +02:00
bdf063a16c Added openssl recipe. 2024-09-09 21:52:57 +02:00
6cd076394d Added recipe for winsock2 and target_os to dependency conditions. 2024-09-09 21:52:57 +02:00
0175b812a2 Fixed Catch2 recipe. 2024-09-09 21:52:57 +02:00
eb044b6d2f Fixed compilation with MSVC. 2024-09-09 21:52:56 +02:00
e6adeb3a20 Moved check of SYSTEM_CACHE_DIR accessibility to before it is used. 2024-09-09 21:52:13 +02:00
fa83ff4581 Adjusted error description to make more sense. 2024-09-09 21:51:45 +02:00
29791d2e9c Fixed zlib recipe on linux. 2024-09-09 21:51:45 +02:00
4e579121db Fixed compilation with MSVC. 2024-09-09 21:51:36 +02:00
cd727e5a1d Update to new recipe system (S++ 2.0). 2024-09-09 21:50:47 +02:00
6f83b68788 Some more work on the new dependency resolution system. 2024-09-09 21:50:47 +02:00
17ee9777ed Some tests. 2024-09-09 21:50:47 +02:00
ecb8ea9a74 Allow settings COMPILATIONDB_FILTER_FILES via config. 2024-08-18 11:15:32 +02:00
a3426f1132 Enable experimental library features (jthread) for clang. 2024-08-18 09:58:30 +02:00
e9438455ea Moved check of SYSTEM_CACHE_DIR accessibility to before it is used. 2024-08-18 09:51:04 +02:00
cc5fd2d668 Added CXXFLAGS and CFLAGS to config variables. 2024-08-18 09:47:24 +02:00
42 changed files with 1215 additions and 471 deletions

View File

@@ -1,30 +1,103 @@
import copy
import enum
import glob
import inspect
import json
import multiprocessing
import os
import pathlib
import psutil
import shutil
import sys
import time
import uuid
def _cook(env: Environment, recipe_name: str, *args, **kwargs):
class TargetType(enum.Enum):
PROGRAM = 0
STATIC_LIBRARY = 1
SHARED_LIBRARY = 2
class _VersionSpec:
minimum_version = None
maximum_version = None
options = {}
def __init__(self, minimum_version = None, maximum_version = None, options = {}):
self.minimum_version = minimum_version
self.maximum_version = maximum_version
self.options = options
def __str__(self):
return f'Min: {self.minimum_version}, Max: {self.maximum_version}, Options: {self.options}'
class _Dependency:
name: str = ''
version = None
version_spec: _VersionSpec
recipe = None
depdeps: list = []
cook_result: dict = {}
class _Target:
name: str
target_type: TargetType
builder = None
args: list = []
kwargs: dict = {}
dependencies: list = []
target = None
def _find_recipe(env: Environment, recipe_name: str):
if recipe_name in env['SPP_RECIPES']:
return env['SPP_RECIPES'][recipe_name]
import importlib.util
source_file = None
for folder in env['RECIPES_FOLDERS']:
try_source_file = f'{folder.abspath}/{recipe_name}/recipe.py'
if not env['SPP_RECIPES_FOLDERS']:
env.Error('No recipe repositories set. Add one using env.RecipeRepo(<name>, <url>, <branch>).')
for folder in env['SPP_RECIPES_FOLDERS']:
from SCons import Node
if folder is Node:
folder = folder.abspath
try_source_file = f'{folder}/{recipe_name}/recipe.py'
if os.path.exists(try_source_file):
source_file = try_source_file
break
if not source_file:
raise Exception(f'Could not find recipe {recipe_name}.')
env.Error(f'Could not find recipe for {recipe_name}.')
spec = importlib.util.spec_from_file_location(recipe_name, source_file)
recipe = importlib.util.module_from_spec(spec)
recipe.env = env
spec.loader.exec_module(recipe)
return recipe.cook(env, *args, **kwargs)
env['SPP_RECIPES'][recipe_name] = recipe
return recipe
def _run_cook(dependency: _Dependency):
if not dependency.cook_result:
cook_signature = inspect.signature(dependency.recipe.cook)
kwargs = {}
if 'options' in cook_signature.parameters:
kwargs['options'] = dependency.version_spec.options
dependency.cook_result = dependency.recipe.cook(env, dependency.version, **kwargs)
def _cook(env: Environment, recipe_name: str):
dependency = env['SPP_DEPENDENCIES'].get(recipe_name)
if not dependency:
raise Exception(f'Cannot cook {recipe_name} as it was not listed as a dependency.')
_run_cook(dependency)
return dependency.cook_result
def _module(env: Environment, file: str):
return SConscript(file, exports = 'env', variant_dir = env['VARIANT_DIR'], src_dir = '.')
def _parse_lib_conf(env: Environment, lib_conf: dict) -> None:
env.Append(CPPPATH = lib_conf.get('CPPPATH', []),
CPPDEFINES = lib_conf.get('CPPDEFINES', []),
LIBPATH = lib_conf.get('LIBPATH', []),
LIBS = lib_conf.get('LIBS', []),
LINKFLAGS = lib_conf.get('LINKFLAGS', []),
JINJA_TEMPLATE_SEARCHPATH = lib_conf.get('JINJA_TEMPLATE_SEARCHPATH', []))
def _inject_list(kwargs: dict, dependency: dict, list_name: str) -> None:
@@ -40,11 +113,30 @@ def _inject_dependency(dependency, kwargs: dict, add_sources: bool = True) -> No
_inject_list(kwargs, dependency, 'CPPDEFINES')
_inject_list(kwargs, dependency, 'LIBPATH')
_inject_list(kwargs, dependency, 'LIBS')
_inject_list(kwargs, dependency, 'LINKFLAGS')
if add_sources and 'ADDITIONAL_SOURCES' in dependency and hasattr(kwargs['source'], 'extend'):
kwargs['source'].extend(dependency['ADDITIONAL_SOURCES'])
if 'DEPENDENCIES' in dependency:
for inner_dependency in dependency['DEPENDENCIES']:
_inject_dependency(inner_dependency, kwargs, False)
elif isinstance(dependency, _Dependency):
_run_cook(dependency)
_inject_list(kwargs, dependency.cook_result, 'CPPPATH')
_inject_list(kwargs, dependency.cook_result, 'CPPDEFINES')
_inject_list(kwargs, dependency.cook_result, 'LIBPATH')
_inject_list(kwargs, dependency.cook_result, 'LIBS')
_inject_list(kwargs, dependency.cook_result, 'LINKFLAGS')
for depdep in dependency.depdeps:
_inject_dependency(depdep, kwargs)
elif isinstance(dependency, _Target):
_inject_list(kwargs, dependency.kwargs, 'CPPPATH')
_inject_list(kwargs, dependency.kwargs, 'CPPDEFINES')
_inject_list(kwargs, dependency.kwargs, 'LIBPATH')
_inject_list(kwargs, dependency.kwargs, 'LIBS')
_inject_list(kwargs, {'LIBS': [dependency]}, 'LIBS')
_inject_list(kwargs, dependency.kwargs, 'LINKFLAGS')
for depdep in dependency.dependencies:
_inject_dependency(depdep, kwargs)
def _rglob(env: Environment, root_path: str, pattern: str, **kwargs):
result_nodes = []
@@ -56,6 +148,37 @@ def _rglob(env: Environment, root_path: str, pattern: str, **kwargs):
result_nodes.extend(env.Glob(f'{path}/{pattern}', **kwargs))
return sorted(result_nodes)
def _safe_eval(condition: str, locals={}):
return eval(condition, {
'__builtins__': {
'abs': abs, 'all': all, 'any': any, 'ascii': ascii, 'bin': bin, 'bool': bool, 'chr': chr, 'complex': complex,
'dict': dict, 'divmod': divmod, 'enumerate': enumerate, 'filter': filter, 'float': float, 'format': format,
'hasattr': hasattr, 'hash': hash, 'hex': hex, 'id': id, 'int': int, 'isinstance': isinstance,
'issubclass': issubclass, 'len': len, 'list': list, 'map': map, 'max': max, 'min': min, 'next': next,
'oct': oct, 'ord': ord, 'pow': pow, 'range': range, 'reversed': reversed, 'round': round, 'set': set,
'slice': slice, 'sorted': sorted, 'str': str, 'sum': sum, 'tuple': tuple, 'type': type, 'zip': zip
}
}, locals)
def _deps_from_json(env: Environment, deps: dict) -> dict:
to_remove = []
for key, dep in deps.items():
if 'condition' in dep:
if not _safe_eval(dep['condition'], {
'compiler_family': env['COMPILER_FAMILY'],
'target_os': os.name,
'getenv': lambda name: env.get(name)
}):
to_remove.append(key)
continue
if 'min' in dep and isinstance(dep['min'], list):
dep['min'] = tuple(dep['min'])
if 'max' in dep and isinstance(dep['max'], list):
dep['max'] = tuple(dep['max'])
for key in to_remove:
del deps[key]
return deps
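
As a sketch of the shape this function expects — recipe names mapping to version specs, with conditions evaluated against the locals defined above — a project could feed it something like the following (the recipe names appear in the commit list; the bounds and the option key are only illustrative). The filtered dict has the same shape the wrapped builders below accept as their dependencies argument.

deps = env.DepsFromJson({
    'fmt': {'min': [10, 0, 0]},                      # list bounds are converted to tuples
    'winsock2': {'condition': "target_os == 'nt'"},  # dropped unless building on Windows
    'imgui': {'options': {'docking': True}},         # option key is illustrative; options are passed through to the recipe
})
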
def _make_interface(env: Environment, dependencies: list = []):
kwargs = {}
for dependency in dependencies:
@@ -65,27 +188,204 @@ def _make_interface(env: Environment, dependencies: list = []):
'CPPDEFINES': kwargs.get('CPPDEFINES', [])
}
def _lib_filename(name: str, type: str = 'static') -> str:
# TODO: windows
ext = {
'static': 'a',
'shared': 'so'
}[type]
return f'lib{name}.{ext}'
def _exe_filename(env: Environment, name: str, type: str = 'static') -> str:
if os.name == 'posix':
return name
elif os.name == 'nt':
return f'{name}.exe'
else:
raise Exception('What OS is this?')
def _find_lib(env: Environment, name: str, paths: 'list[str]', type : str = 'static'):
def _find_executable(env: Environment, name: str, paths: 'list[str]', type : str = 'static', allow_fail: bool = False, use_glob: bool = False):
fname = _exe_filename(env, name, type)
for path in paths:
lib_path = os.path.join(path, _lib_filename(name, type))
if os.path.exists(lib_path):
lib_path = os.path.join(path, fname)
if use_glob:
files = glob.glob(lib_path)
if len(files) == 1:
return files[0]
elif len(files) > 1:
raise Exception(f'Multiple candidates found for executable with name {name} in paths: "{", ".join(paths)}" with name: "{", ".join(files)}".')
elif os.path.exists(lib_path):
return lib_path
return None
if allow_fail:
return None
raise Exception(f'Could not find executable with name {name} in paths: "{", ".join(paths)}" filename: "{fname}".')
def _lib_filename(env: Environment, name: str, type: str = 'static') -> str:
if os.name == 'posix':
ext = {
'static': 'a',
'shared': 'so'
}[type]
return f'lib{name}.{ext}'
elif os.name == 'nt':
ext = {
'static': 'lib',
'shared': 'dll'
}[type]
return f'{name}.{ext}'
else:
raise Exception('What OS is this?')
def _find_lib(env: Environment, name: str, paths: 'list[str]', type : str = 'static', allow_fail: bool = False, use_glob: bool = False):
fname = _lib_filename(env, name, type)
for path in paths:
lib_path = os.path.join(path, fname)
if use_glob:
files = glob.glob(lib_path)
if len(files) == 1:
return files[0]
elif len(files) > 1:
raise Exception(f'Multiple candidates found for library with name {name} in paths: "{", ".join(paths)}" with name: "{", ".join(files)}".')
elif os.path.exists(lib_path):
return lib_path
if allow_fail:
return None
raise Exception(f'Could not find library with name {name} in paths: "{", ".join(paths)}" filename: "{fname}".')
def _error(env: Environment, message: str):
print(message, file=sys.stderr)
env.Exit(1)
Exit(1)
def _try_merge_dicts(dictA: dict, dictB: dict) -> 'dict|None':
result = {}
for key, valueA in dictA.items():
if key in dictB:
valueB = dictB[key]
if type(valueA) != type(valueB):
return None
elif type(valueA) == list:
result[key] = valueA + valueB
elif type(valueA) == dict:
mergedValue = _try_merge_dicts(valueA, valueB)
if mergedValue is None:
return None
result[key] = mergedValue
elif valueA != valueB:
return None
else:
result[key] = valueA
for key, valueB in dictB.items():
if key not in result:
result[key] = valueB
return result
def _find_common_dependency_version(name: str, versionA: _VersionSpec, versionB: _VersionSpec) -> _VersionSpec:
options = _try_merge_dicts(versionA.options, versionB.options)
if options is None:
return None
result_version = _VersionSpec(options=options)
if versionA.minimum_version is not None:
if versionB.minimum_version is not None:
result_version.minimum_version = max(versionA.minimum_version, versionB.minimum_version)
else:
result_version.minimum_version = versionA.minimum_version
else:
result_version.minimum_version = versionB.minimum_version
if versionA.maximum_version is not None:
if versionB.maximum_version is not None:
result_version.maximum_version = min(versionA.maximum_version, versionB.maximum_version)
else:
result_version.maximum_version = versionA.maximum_version
else:
result_version.maximum_version = versionB.maximum_version
if result_version.minimum_version is not None and result_version.maximum_version is not None \
and (result_version.minimum_version > result_version.maximum_version):
return None
return result_version
def _parse_version_spec(version_spec: dict) -> _VersionSpec:
return _VersionSpec(version_spec.get('min'), version_spec.get('max'), version_spec.get('options', {}))
def _can_add_dependency(env: Environment, name: str, version_spec: _VersionSpec) -> bool:
if name not in env['SPP_DEPENDENCIES']:
return True
dependency = env['SPP_DEPENDENCIES'][name]
common_version_spec = _find_common_dependency_version(name, dependency.version_spec, version_spec)
return common_version_spec is not None
def _add_dependency(env: Environment, name: str, version_spec: _VersionSpec) -> _Dependency:
if name in env['SPP_DEPENDENCIES']:
dependency = env['SPP_DEPENDENCIES'][name]
common_version_spec = _find_common_dependency_version(name, dependency.version_spec, version_spec)
if common_version_spec is None:
raise Exception(f'Incompatible versions detected for {name}: {dependency.version_spec} and {version_spec}')
if dependency.version_spec != common_version_spec:
env['_SPP_DEPENDENCIES_OKAY'] = False
dependency.version_spec = common_version_spec
return dependency
dependency = _Dependency()
dependency.name = name
dependency.version_spec = version_spec
dependency.recipe = _find_recipe(env, name)
env['SPP_DEPENDENCIES'][name] = dependency
env['_SPP_DEPENDENCIES_OKAY'] = False
return dependency
def _sort_versions(versions: list) -> None:
import functools
def _compare(left, right):
if left < right:
return 1
elif left == right:
return 0
else:
return -1
versions.sort(key=functools.cmp_to_key(_compare))
def _version_matches(version, version_spec: _VersionSpec) -> bool:
if version_spec.minimum_version is not None and version < version_spec.minimum_version:
return False
if version_spec.maximum_version is not None and version > version_spec.maximum_version:
return False
return True
def _find_version(env: Environment, dependency: _Dependency):
for update in (False, True):
versions_signature = inspect.signature(dependency.recipe.versions)
kwargs = {}
if 'options' in versions_signature.parameters:
kwargs['options'] = dependency.version_spec.options
versions = dependency.recipe.versions(env, update=update, **kwargs)
_sort_versions(versions)
for version in versions:
kwargs = {}
dependencies_signature = inspect.signature(dependency.recipe.dependencies)
if 'options' in dependencies_signature.parameters:
kwargs['options'] = dependency.version_spec.options
if _version_matches(version, dependency.version_spec):
canadd = True
for depname, depspec in dependency.recipe.dependencies(env, version, **kwargs).items():
if not _can_add_dependency(env, depname, _parse_version_spec(depspec)):
canadd = False
break
if canadd:
depdeps = []
for depname, depspec in dependency.recipe.dependencies(env, version, **kwargs).items():
depdeps.append(_add_dependency(env, depname, _parse_version_spec(depspec)))
dependency.version = version
dependency.depdeps = depdeps
return
print(f'Available versions: \n{versions}')
print(f'Required version: {dependency.version_spec}')
raise Exception(f'Could not find a suitable version for dependency {dependency.name}.')
def _wrap_builder(builder, target_type: TargetType):
def _wrapped(env, dependencies = {}, *args, **kwargs):
target_dependencies = []
for name, version_spec in dependencies.items():
if version_spec == {} and name not in env['SPP_DEPENDENCIES']: # this is basically a shortcut to adding targets from other modules without having to save them in the env
dep_target = _find_target(env, name)
if dep_target is not None and dep_target.target_type != TargetType.PROGRAM:
target_dependencies.append(dep_target)
# TODO: there might be an issue here with dependencies not being injected this way :/
continue
target_dependencies.append(_add_dependency(env, name, _parse_version_spec(version_spec)))
def _wrap_builder(builder, is_lib: bool = False):
def _wrapped(env, dependencies = [], *args, **kwargs):
if 'CPPPATH' not in kwargs:
kwargs['CPPPATH'] = copy.copy(env['CPPPATH'])
if 'CPPDEFINES' not in kwargs:
@@ -94,32 +394,43 @@ def _wrap_builder(builder, is_lib: bool = False):
kwargs['LIBPATH'] = copy.copy(env['LIBPATH'])
if 'LIBS' not in kwargs and 'LIBS' in env:
kwargs['LIBS'] = copy.copy(env['LIBS'])
for dependency in dependencies:
_inject_dependency(dependency, kwargs)
if 'LIBS' in kwargs:
libs_copy = list(kwargs['LIBS'])
for lib in libs_copy:
if isinstance(lib, str) and os.path.isabs(lib):
kwargs['LIBS'].remove(lib)
kwargs['source'].append(lib)
if 'source' in kwargs:
source = kwargs['source']
if not isinstance(source, list):
source = [source]
new_source = []
for src in source:
if isinstance(src, str):
new_source.append(env.Entry(src))
else:
new_source.append(src)
kwargs['source'] = new_source
result = builder(*args, **kwargs)
if is_lib:
# generate a new libconf
return {
'CPPPATH': kwargs.get('CPPPATH', []),
'CPPDEFINES': kwargs.get('CPPDEFINES', []),
'LIBPATH': kwargs.get('LIBPATH', []),
'LIBS': result + kwargs.get('LIBS', []),
'ADDITIONAL_SOURCES': kwargs.get('add_source', []),
'_target': result
}
return result
target = _Target()
if 'name' in kwargs:
target.name = kwargs['name']
else:
trgt = _target_entry(kwargs.get('target'))
if trgt is not None:
target.name = str(trgt.name)
else:
target.name = 'Unknown target'
target.target_type = target_type
target.builder = builder
target.args = args
target.kwargs = kwargs
target.dependencies = target_dependencies
env.Append(SPP_TARGETS = [target])
if not target.dependencies:
_build_target(target)
return target
return _wrapped
def _wrap_default(default):
def _wrapped(env, arg):
if isinstance(arg, dict) and '_target' in arg:
if isinstance(arg, _Target):
env.Append(SPP_DEFAULT_TARGETS = [arg])
elif isinstance(arg, dict) and '_target' in arg:
default(arg['_target'])
else:
default(arg)
@@ -127,13 +438,80 @@ def _wrap_default(default):
def _wrap_depends(depends):
def _wrapped(env, dependant, dependency):
if isinstance(dependant, dict) and '_target' in dependant:
if isinstance(dependant, _Target) or isinstance(dependency, _Target):
env.Append(SPP_TARGET_DEPENDENCIES = [(dependant, dependency)])
return
elif isinstance(dependant, dict) and '_target' in dependant:
dependant = dependant['_target']
if isinstance(dependency, dict) and '_target' in dependency:
elif isinstance(dependency, dict) and '_target' in dependency:
dependency = dependency['_target']
depends(dependant, dependency)
return _wrapped
def _build_target(target: _Target):
for dependency in target.dependencies:
_inject_dependency(dependency, target.kwargs)
if 'LIBS' in target.kwargs:
libs_copy = list(target.kwargs['LIBS'])
for lib in libs_copy:
if isinstance(lib, str) and os.path.isabs(lib):
target.kwargs['LIBS'].remove(lib)
target.kwargs['LIBS'].append(env.File(lib))
pass
elif isinstance(lib, _Target):
if not lib.target:
_build_target(lib)
target.kwargs['LIBS'].remove(lib)
target.kwargs['LIBS'].append(lib.target)
new_kwargs = target.kwargs.copy()
if 'target' in new_kwargs: # there should always be a target, right?
new_kwargs['target'] = f"{new_kwargs['target']}-{build_type}"
target.target = target.builder(*target.args, **new_kwargs)
def _version_to_string(version) -> str:
return '.'.join([str(v) for v in version])
def _finalize(env: Environment):
if generate_project:
_generate_project(generate_project)
Exit(0)
version_requirements = {dep.name: {
'min': dep.version_spec.minimum_version and _version_to_string(dep.version_spec.minimum_version),
'max': dep.version_spec.maximum_version and _version_to_string(dep.version_spec.maximum_version),
} for dep in env['SPP_DEPENDENCIES'].values()}
env['_SPP_DEPENDENCIES_OKAY'] = False
while not env['_SPP_DEPENDENCIES_OKAY']:
env['_SPP_DEPENDENCIES_OKAY'] = True
for dependency in list(env['SPP_DEPENDENCIES'].values()):
if not dependency.version:
_find_version(env, dependency)
with open('cache/versions.json', 'w') as f:
json.dump({
'requirements': version_requirements,
'selected': {
dep.name: _version_to_string(dep.version) for dep in env['SPP_DEPENDENCIES'].values()
}
}, f)
for target in env['SPP_TARGETS']:
_build_target(target)
for target in env['SPP_DEFAULT_TARGETS']:
env.Default(target.target)
for dependant, dependency in env['SPP_TARGET_DEPENDENCIES']:
if isinstance(dependant, _Target):
dependant = dependant.target
if isinstance(dependency, _Target):
dependency = dependency.target
env.Depends(dependant, dependency)
def _find_target(env: Environment, target_name: str) -> '_Target|None':
for target in env['SPP_TARGETS']:
if target.name == target_name:
return target
return None
def _get_fallback_cache_dir() -> str:
return Dir('#cache').abspath
@@ -149,16 +527,148 @@ def _find_system_cache_dir() -> str:
# fallback
return _get_fallback_cache_dir()
def _target_entry(target_value):
if target_value is None:
return None
if not isinstance(target_value, list):
target_value = [target_value]
if len(target_value) < 1:
return None
if isinstance(target_value[0], str):
target_value[0] = env.Entry(target_value[0])
return target_value[0]
def _generate_project(project_type: str) -> None:
try:
import jinja2
except ImportError:
_error(None, 'Project generation requires the jinja2 package to be installed.')
source_folder, target_folder = {
'clion': (os.path.join(_spp_dir.abspath, 'util', 'clion_project_template'), Dir('#.idea').abspath),
'vscode': (os.path.join(_spp_dir.abspath, 'util', 'vscode_project_template'), Dir('#.vscode').abspath)
}.get(project_type, (None, None))
if not source_folder:
_error(None, 'Invalid project type option.')
uuid_cache_file = pathlib.Path(env['SHARED_CACHE_DIR'], 'uuids.json')
uuid_cache = {}
save_uuid_cache = False
if uuid_cache_file.exists():
try:
with uuid_cache_file.open('r') as f:
uuid_cache = json.load(f)
except Exception as e:
print(f'Error loading UUID cache: {e}')
def _generate_uuid(name: str = '') -> str:
nonlocal save_uuid_cache
if name and name in uuid_cache:
return uuid_cache[name]
new_uuid = str(uuid.uuid4())
if name:
uuid_cache[name] = new_uuid
save_uuid_cache = True
return new_uuid
root_path = pathlib.Path(env.Dir('#').abspath)
def _get_executables() -> list:
result = []
for target in env['SPP_TARGETS']:
if target.target_type == TargetType.PROGRAM:
trgt = _target_entry(target.kwargs['target'])
def _exe_path(build_type) -> str:
exe_path = pathlib.Path(trgt.abspath).relative_to(root_path)
exe_path = exe_path.parent / f'{env.subst("$PROGPREFIX")}{exe_path.name}-{build_type}{env.subst("$PROGSUFFIX")}'
return str(exe_path)
result.append({
'name': target.name,
'filename': _exe_path
})
return result
def _get_libraries() -> list:
result = []
for target in env['SPP_TARGETS']:
if target.target_type == TargetType.STATIC_LIBRARY:
trgt = _target_entry(target.kwargs['target'])
def _lib_path(build_type) -> str:
lib_path = pathlib.Path(trgt.abspath).relative_to(root_path)
lib_path = lib_path.parent / f'{env.subst("$LIBPREFIX")}{lib_path.name}-{build_type}{env.subst("$LIBSUFFIX")}'
return str(lib_path)
result.append({
'name': target.name,
'filename': _lib_path
})
elif target.target_type == TargetType.SHARED_LIBRARY:
trgt = _target_entry(target.kwargs['target'])
def _lib_path(build_type) -> str:
lib_path = pathlib.Path(trgt.abspath).relative_to(root_path)
lib_path = lib_path.parent / f'{env.subst("$SHLIBPREFIX")}{lib_path.name}-{build_type}{env.subst("$SHLIBSUFFIX")}'
return str(lib_path)
result.append({
'name': target.name,
'filename': _lib_path
})
return result
def _escape_path(input: str) -> str:
return input.replace('\\', '\\\\')
jinja_env = jinja2.Environment()
jinja_env.globals['generate_uuid'] = _generate_uuid
jinja_env.globals['project'] = {
'name': env.Dir('#').name,
'executables': _get_executables(),
'libraries': _get_libraries(),
'build_types': ['debug', 'release_debug', 'release', 'profile']
}
jinja_env.globals['scons_exe'] = shutil.which('scons')
jinja_env.globals['nproc'] = multiprocessing.cpu_count()
jinja_env.filters['escape_path'] = _escape_path
source_path = pathlib.Path(source_folder)
target_path = pathlib.Path(target_folder)
for source_file in source_path.rglob('*'):
if source_file.is_file():
target_file = target_path / (source_file.relative_to(source_path))
target_file.parent.mkdir(parents=True, exist_ok=True)
if source_file.suffix != '.jinja':
shutil.copyfile(source_file, target_file)
continue
with source_file.open('r') as f:
templ = jinja_env.from_string(f.read())
target_file = target_file.with_suffix('')
with target_file.open('w') as f:
f.write(templ.render())
if save_uuid_cache:
try:
with uuid_cache_file.open('w') as f:
json.dump(uuid_cache, f)
except Exception as e:
print(f'Error writing uuid cache: {e}')
Import('config')
if not config.get('PROJECT_NAME'):
config['PROJECT_NAME'] = 'PROJECT'
if not config.get('CXX_STANDARD'):
config['CXX_STANDARD'] = 'c++23'
if not config.get('CXX_NO_EXCEPTIONS'):
config['CXX_NO_EXCEPTIONS'] = False
if not config.get('PREPROCESSOR_PREFIX'):
config['PREPROCESSOR_PREFIX'] = config['PROJECT_NAME'].upper() # TODO: may be nicer?
if 'COMPILATIONDB_FILTER_FILES' not in config:
config['COMPILATIONDB_FILTER_FILES'] = True
if 'WINDOWS_DISABLE_DEFAULT_DEFINES' not in config:
config['WINDOWS_DISABLE_DEFAULT_DEFINES'] = False
AddOption(
'--build_type',
dest = 'build_type',
@@ -222,6 +732,17 @@ AddOption(
action = 'store_true'
)
AddOption(
'--generate_project',
dest = 'generate_project',
type = 'choice',
choices = ('clion', 'vscode'),
nargs = 1,
action = 'store'
)
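
Together with the existing options, this makes project generation a one-off command-line invocation, for example (flag spellings as registered here and used by the CI scripts further down; the values are illustrative):

scons --build_type=debug --compiler=clang --generate_project=vscode
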
_spp_dir = Dir('.')
build_type = GetOption('build_type')
unity_mode = GetOption('unity_mode')
variant = GetOption('variant')
@@ -230,6 +751,7 @@ config_file = GetOption('config_file')
compiler = GetOption('compiler')
update_repositories = GetOption('update_repositories')
dump_env = GetOption('dump_env')
generate_project = GetOption('generate_project')
default_CC = {
'gcc': 'gcc',
@@ -250,29 +772,46 @@ vars.Add('CC', 'The C Compiler', default_CC)
vars.Add('CXX', 'The C++ Compiler', default_CXX)
vars.Add('LINK', 'The Linker')
vars.Add('CCFLAGS', 'C/C++ Compiler Flags')
vars.Add('CFLAGS', 'C Compiler Flags')
vars.Add('CXXFLAGS', 'C++ Compiler Flags')
vars.Add('LINKFLAGS', 'Linker Flags')
vars.Add('PYTHON', 'Python Executable', 'python')
vars.Add('COMPILATIONDB_FILTER_FILES', 'Removes source files from the compilation DB that are not from the current project.', True)
vars.Add('COMPILATIONDB_FILTER_FILES', 'Removes source files from the compilation DB that are not from the current'
' project.', config['COMPILATIONDB_FILTER_FILES'])
vars.Add('SHOW_INCLUDES', 'Show include hierarchy (for debugging).', False)
vars.Add('ENABLE_ASAN', 'Enable address sanitization.', bool(enable_asan))
tools = ['default', 'compilation_db', 'unity_build']
if 'TOOLS' in config:
tools.extend(config['TOOLS'])
env = Environment(tools = tools, variables = vars, ENV = os.environ)
env['RECIPES_FOLDERS'] = [Dir('recipes')]
env['SPP_RECIPES_FOLDERS'] = []
env['SYSTEM_CACHE_DIR'] = os.path.join(_find_system_cache_dir(), 'spp_cache')
env['CLONE_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'cloned')
env['DOWNLOAD_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'downloaded')
env['UPDATE_REPOSITORIES'] = update_repositories
env['CXX_STANDARD'] = config['CXX_STANDARD'] # make it available to everyone
env['DEPS_CFLAGS'] = []
env['DEPS_CXXFLAGS'] = []
env['DEPS_LINKFLAGS'] = []
print(f'Detected system cache directory: {env["SYSTEM_CACHE_DIR"]}')
try:
os.makedirs(env['SYSTEM_CACHE_DIR'], exist_ok=True)
except:
env['SYSTEM_CACHE_DIR'] = os.path.join(_get_fallback_cache_dir(), 'spp_cache')
env['CLONE_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'cloned')
print(f'Creating spp cache dir failed, using fallback: {env["SYSTEM_CACHE_DIR"]}.')
os.makedirs(env['SYSTEM_CACHE_DIR'], exist_ok=True) # no more safeguards!
env['CLONE_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'cloned')
env['DOWNLOAD_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'downloaded')
env['UPDATE_REPOSITORIES'] = update_repositories
env['CXX_STANDARD'] = config['CXX_STANDARD'] # make it available to everyone
env['CXX_NO_EXCEPTIONS'] = config['CXX_NO_EXCEPTIONS']
env['DEPS_CFLAGS'] = []
env['DEPS_CXXFLAGS'] = []
env['DEPS_LINKFLAGS'] = []
env['SHARED_CACHE_DIR'] = Dir(f'#cache').abspath
# allow compiling to variant directories (each gets their own bin/lib/cache dirs)
@@ -303,6 +842,18 @@ env.Append(CPPPATH = [])
env.Append(CPPDEFINES = [])
env.Append(LINKFLAGS = [])
# init SPP environment variables
env['SPP_DIR'] = _spp_dir.abspath
env['SPP_TARGETS'] = []
env['SPP_DEFAULT_TARGETS'] = []
env['SPP_TARGET_DEPENDENCIES'] = []
env['SPP_DEPENDENCIES'] = {}
env['SPP_RECIPES'] = {}
env['OBJSUFFIX'] = f".{env['BUILD_TYPE']}{env['OBJSUFFIX']}"
if variant:
env['OBJSUFFIX'] = f".{variant}{env['OBJSUFFIX']}"
# create the cache dir
os.makedirs(env['CACHE_DIR'], exist_ok=True)
cache_gitignore = f'{env["CACHE_DIR"]}/.gitignore'
@@ -357,8 +908,10 @@ elif unity_mode == 'stress': # compile everything in one single file to stress t
# setup compiler specific options
if env['COMPILER_FAMILY'] == 'gcc' or env['COMPILER_FAMILY'] == 'clang':
env.Append(CCFLAGS = ['-Wall', '-Wextra', '-Werror', '-Wstrict-aliasing', '-pedantic'])
env.Append(CCFLAGS = ['-Wall', '-Wextra', '-Werror', '-Wstrict-aliasing', '-pedantic', '-fvisibility=hidden'])
env.Append(CXXFLAGS = [f'-std={config["CXX_STANDARD"]}'])
if env['CXX_NO_EXCEPTIONS']:
env.Append(CXXFLAGS = [f'-fno-exceptions'])
if build_type != 'release':
env.Append(LINKFLAGS = [f'-Wl,-rpath,{env["LIB_DIR"]}'])
env['LINKCOM'] = env['LINKCOM'].replace('$_LIBFLAGS', '-Wl,--start-group $_LIBFLAGS -Wl,--end-group')
@@ -371,12 +924,15 @@ if env['COMPILER_FAMILY'] == 'gcc' or env['COMPILER_FAMILY'] == 'clang':
# -Wtautological-compare triggers in libfmt and doesn't seem too useful anyway
env.Append(CCFLAGS = ['-Wno-missing-field-initializers', '-Wno-maybe-uninitialized'])
env.Append(CXXFLAGS = ['-Wno-subobject-linkage', '-Wno-dangling-reference', '-Wno-init-list-lifetime', '-Wno-tautological-compare'])
else:
else: # clang only
# no-gnu-anonymous-struct - we don't care
env.Append(CCFLAGS = ['-Wno-gnu-anonymous-struct'])
# no-missing-field-initializers - useful in some cases, annoying in most
# no-ambiguous-reversed-operator - should be quite useful, but we get a false positive, apparently?
env.Append(CCFLAGS = ['-Wno-gnu-anonymous-struct', '-Wno-missing-field-initializers', '-Wno-ambiguous-reversed-operator'])
env.Append(CXXFLAGS = ['-fexperimental-library']) # enable std::jthread
if build_type == 'debug':
env.Append(CCFLAGS = ['-g', '-O0'], CPPDEFINES = ['_GLIBCXX_DEBUG'])
env.Append(DEPS_CXXFLAGS = ['-D_GLIBCXX_DEBUG'])
elif build_type == 'release_debug' or build_type == 'profile':
env.Append(CCFLAGS = ['-Wno-unused-variable', '-Wno-unused-parameter', '-Wno-unused-but-set-variable', '-Wno-unused-local-typedef', '-Wno-unused-local-typedefs', '-g', '-O2'], CPPDEFINES = [f'{config["PREPROCESSOR_PREFIX"]}_RELEASE', 'NDEBUG'])
if build_type == 'profile':
@@ -388,48 +944,85 @@ if env['COMPILER_FAMILY'] == 'gcc' or env['COMPILER_FAMILY'] == 'clang':
elif build_type == 'release':
env.Append(CCFLAGS = ['-Wno-unused-variable', '-Wno-unused-parameter', '-Wno-unused-but-set-variable', '-Wno-unused-local-typedef', '-Wno-unused-local-typedefs', '-O2'], CPPDEFINES = [f'{config["PREPROCESSOR_PREFIX"]}_RELEASE', 'NDEBUG'])
if enable_asan:
if env['ENABLE_ASAN']:
env.Append(CCFLAGS = ['-fsanitize=address', '-fno-omit-frame-pointer'])
env.Append(LINKFLAGS = ['-fsanitize=address'])
env.Append(DEPS_CXXFLAGS = ['-fsanitize=address', '-fno-omit-frame-pointer'])
env.Append(DEPS_LINKFLAGS = ['-fsanitize=address'])
elif env['COMPILER_FAMILY'] == 'cl':
cxx_version_name = {
'c++14': 'c++14',
'c++17': 'c++17',
'c++20': 'c++20',
'c++23': 'c++latest',
'c++26': 'c++latest'
}.get(env['CXX_STANDARD'], 'c++14') # default to C++14 for older versions
# C4201: nonstandard extension used : nameless struct/union - I use it and want to continue using it
# C4127: conditional expression is constant - some libs (CRC, format) don't compile with this enabled # TODO: fix?
# C4702: unreachable code, issued after MIJIN_FATAL macro
# C4251: missing dll-interface of some std types, yaml-cpp doesn't compile with this enabled
# C4275: same as above
env.Append(CCFLAGS = ['/W4', '/WX', '/wd4201', '/wd4127', '/wd4702', '/wd4251', '/wd4275', '/bigobj', f'/std:{config["CXX_STANDARD"]}', '/permissive-', '/EHsc', '/FS', '/Zc:char8_t'])
env.Append(CCFLAGS = ['/W4', '/WX', '/wd4201', '/wd4127', '/wd4702', '/wd4251', '/wd4275', '/bigobj', '/vmg',
f'/std:{cxx_version_name}', '/permissive-', '/FS', '/Zc:char8_t', '/utf-8'])
env.Append(CPPDEFINES = ['_CRT_SECURE_NO_WARNINGS']) # I'd like to not use MSVC specific versions of functions because they are "safer" ...
env.Append(DEPS_CXXFLAGS = ['/Zc:char8_t', '/utf-8', '/vmg'])
if env['CXX_NO_EXCEPTIONS']:
env.Append(CPPDEFINES = ['_HAS_EXCEPTIONS=0'])
else:
env.Append(CXXFLAGS = ['/EHsc'])
env.Append(DEPS_CXXFLAGS = ['/EHsc'])
if env['SHOW_INCLUDES']:
env.Append(CCFLAGS = ['/showIncludes'])
if build_type == 'debug':
env.Append(CCFLAGS = ['/Od', '/Zi', '/MDd'], LINKFLAGS = ' /DEBUG')
env.Append(CPPDEFINES = ['_DEBUG', '_ITERATOR_DEBUG_LEVEL=2'])
env.Append(DEPS_CXXFLAGS = ['/MDd', '/Zi', '/D_DEBUG', '/D_ITERATOR_DEBUG_LEVEL=2'])
env.Append(DEPS_LINKFLAGS = ['/DEBUG'])
elif build_type == 'release_debug' or build_type == 'profile':
env.Append(CCFLAGS = ['/O2', '/Zi'], LINKFLAGS = ' /DEBUG')
env.Append(CCFLAGS = ['/O2', '/MD', '/Zi'], LINKFLAGS = ' /DEBUG')
env.Append(DEPS_CXXFLAGS = ['/Zi', '/MD'])
env.Append(DEPS_LINKFLAGS = ['/DEBUG'])
else:
env.Append(CCFLAGS = ['/O2'])
env.Append(CCFLAGS = ['/O2', '/MD'])
env.Append(DEPS_CXXFLAGS = ['/MD'])
if env['ENABLE_ASAN']:
env.Append(CCFLAGS = ['/fsanitize=address'])
if env['COMPILER_FAMILY'] == 'gcc':
env.Append(CXXFLAGS = ['-Wno-volatile'])
elif env['COMPILER_FAMILY'] == 'clang':
env.Append(CCFLAGS = ['-Wno-deprecated-volatile', '-Wno-nested-anon-types', '-Wno-unknown-warning-option'])
# platform specific options
if os.name == 'nt':
if not config['WINDOWS_DISABLE_DEFAULT_DEFINES']:
env.Append(CDEFINES = ['WIN32_LEAN_AND_MEAN', 'NOMINMAX', 'STRICT', 'UNICODE'], CPPDEFINES = ['WIN32_LEAN_AND_MEAN', 'NOMINMAX', 'STRICT', 'UNICODE'])
env.AddMethod(_cook, 'Cook')
env.AddMethod(_parse_lib_conf, 'ParseLibConf')
env.AddMethod(_rglob, 'RGlob')
env.AddMethod(_deps_from_json, 'DepsFromJson')
env.AddMethod(_make_interface, 'MakeInterface')
env.AddMethod(_lib_filename, 'LibFilename')
env.AddMethod(_find_executable, 'FindExecutable')
env.AddMethod(_find_lib, 'FindLib')
env.AddMethod(_error, 'Error')
env.AddMethod(_wrap_builder(env.Library, is_lib = True), 'Library')
env.AddMethod(_wrap_builder(env.StaticLibrary, is_lib = True), 'StaticLibrary')
env.AddMethod(_wrap_builder(env.SharedLibrary, is_lib = True), 'SharedLibrary')
env.AddMethod(_wrap_builder(env.Program), 'Program')
env.AddMethod(_wrap_builder(env.Library, TargetType.STATIC_LIBRARY), 'Library')
env.AddMethod(_wrap_builder(env.StaticLibrary, TargetType.STATIC_LIBRARY), 'StaticLibrary')
env.AddMethod(_wrap_builder(env.SharedLibrary, TargetType.SHARED_LIBRARY), 'SharedLibrary')
env.AddMethod(_wrap_builder(env.Program, TargetType.PROGRAM), 'Program')
env.AddMethod(_wrap_default(env.Default), 'Default')
env.AddMethod(_wrap_depends(env.Depends), 'Depends')
env.AddMethod(_wrap_builder(env.UnityProgram), 'UnityProgram')
env.AddMethod(_wrap_builder(env.UnityLibrary, is_lib = True), 'UnityLibrary')
env.AddMethod(_wrap_builder(env.UnityStaticLibrary, is_lib = True), 'UnityStaticLibrary')
env.AddMethod(_wrap_builder(env.UnitySharedLibrary, is_lib = True), 'UnitySharedLibrary')
env.AddMethod(_wrap_builder(env.UnityProgram, TargetType.PROGRAM), 'UnityProgram')
env.AddMethod(_wrap_builder(env.UnityLibrary, TargetType.STATIC_LIBRARY), 'UnityLibrary')
env.AddMethod(_wrap_builder(env.UnityStaticLibrary, TargetType.STATIC_LIBRARY), 'UnityStaticLibrary')
env.AddMethod(_wrap_builder(env.UnitySharedLibrary, TargetType.SHARED_LIBRARY), 'UnitySharedLibrary')
env.AddMethod(_module, 'Module')
env.AddMethod(_finalize, 'Finalize')
env.AddMethod(_find_target, 'FindTarget')
if hasattr(env, 'Gch'):
env.AddMethod(_wrap_builder(env.Gch), 'Gch')
@@ -441,5 +1034,6 @@ if dump_env:
print('==== Begin Environment Dump =====')
print(env.Dump())
print('==== End Environment Dump =====')
Exit(0)
Return('env')
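
Taken together, the wrapped builders change what a consuming SConscript looks like: dependencies become a dict of recipe names mapped to version specs, targets are recorded as _Target objects, and nothing is actually built until env.Finalize() resolves versions, cooks the recipes and runs the builders. A minimal sketch of a project module follows; the repository URL, recipe names and the 'ref' option value are illustrative, not taken from this changeset.

Import('env')

# Make at least one recipe repository known, otherwise _find_recipe errors out (URL is a placeholder).
env.RecipeRepo('default', 'https://example.com/spp-recipes.git', 'master')

app = env.Program(
    target = 'bin/app',
    source = env.RGlob('src', '*.cpp'),
    dependencies = {
        'fmt': {'min': (10, 0, 0)},            # version bounds, as read by _parse_version_spec
        'sdl3': {'options': {'ref': 'main'}},  # build from a git ref; understood by GitRecipe-based recipes (see the git module below)
    })
env.Default(app)

# Resolve dependency versions, cook the recipes and build all recorded targets.
env.Finalize()
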

View File

@@ -9,7 +9,7 @@ _BUILT_STAMPFILE = '.spp_built'
Import('env')
def _autotools_project(env: Environment, project_root: str, config_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = []) -> dict:
def _autotools_project(env: Environment, project_root: str, config_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = [], configure_script_path: str = 'configure', skip_steps = ()) -> dict:
config = env['BUILD_TYPE']
build_dir = os.path.join(project_root, f'build_{config}')
install_dir = os.path.join(project_root, f'install_{config}')
@@ -27,15 +27,32 @@ def _autotools_project(env: Environment, project_root: str, config_args: 'list[s
jobs = env.GetOption('num_jobs')
env = os.environ.copy()
env['CFLAGS'] = cflags
config_script = os.path.join(project_root, configure_script_path)
if not os.path.exists(config_script) and os.path.exists(f'{config_script}.ac'):
subprocess.run(('autoreconf', '--install', '--force'), cwd=project_root)
subprocess.run((os.path.join(project_root, 'configure'), '--prefix', install_dir, *config_args), cwd=build_dir, env=env, stdout=sys.stdout, stderr=sys.stderr, check=True)
subprocess.run(('make', f'-j{jobs}', *build_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
subprocess.run(('make', 'install', *install_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
if 'configure' not in skip_steps:
subprocess.run((config_script, f'--prefix={install_dir}', *config_args), cwd=build_dir, env=env, stdout=sys.stdout, stderr=sys.stderr, check=True)
if 'build' not in skip_steps:
subprocess.run(('make', f'-j{jobs}', *build_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
if 'install' not in skip_steps:
subprocess.run(('make', 'install', *install_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
else:
# must still create the install dir for the stamp file
os.makedirs(install_dir, exist_ok=True)
pathlib.Path(install_dir, _BUILT_STAMPFILE).touch()
libpath = []
for lib_folder in ('lib', 'lib64'):
full_path = os.path.join(install_dir, lib_folder)
if os.path.exists(full_path):
libpath.append(full_path)
return {
'build_dir': build_dir,
'install_dir': install_dir,
'LIBPATH': [os.path.join(install_dir, 'lib')],
'LIBPATH': libpath,
'CPPPATH': [os.path.join(install_dir, 'include')]
}
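
For context, a recipe calls this through the AutotoolsProject method just as the old ImageMagick recipe further down did; the new configure_script_path and skip_steps parameters cover projects whose configure script is not at the repository root or that only need part of the configure/build/install sequence. A rough sketch, with the flag and script location as placeholders:

build_result = env.AutotoolsProject(
    repo['checkout_root'],
    config_args = ['--disable-shared'],        # illustrative configure flag
    configure_script_path = 'unix/configure',  # hypothetical layout; defaults to 'configure'
    skip_steps = ())                           # any of 'configure', 'build', 'install' can be skipped
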

View File

@@ -1,9 +1,12 @@
import os
import json
import pathlib
import shutil
from SCons.Script import *
_BUILT_STAMPFILE = '.spp_built'
_VERSION = 2 # bump if you change how the projects are built to trigger a clean build
Import('env')
@@ -11,11 +14,58 @@ def cmd_quote(s: str) -> str:
escaped = s.replace('\\', '\\\\')
return f'"{escaped}"'
def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = []) -> dict:
def _generate_cmake_c_flags(env, dependencies: 'list[dict]') -> str:
parts = env['DEPS_CFLAGS'].copy()
for dependency in dependencies:
for path in dependency.get('CPPPATH', []):
parts.append(f'-I{path}')
return cmd_quote(' '.join(parts))
def _generate_cmake_cxx_flags(env, dependencies: 'list[dict]') -> str:
parts = env['DEPS_CXXFLAGS'].copy()
for dependency in dependencies:
for path in dependency.get('CPPPATH', []):
parts.append(f'-I{path}')
return cmd_quote(' '.join(parts))
def _get_cmake_cxx_standard(env: Environment) -> str:
return env['CXX_STANDARD'][3:] # we use "C++XX", CMake just "XX"
def _generate_cmake_args(env: Environment, dependencies: 'list[dict]') -> 'list[str]':
args = [f'-DCMAKE_C_FLAGS={_generate_cmake_c_flags(env, dependencies)}',
f'-DCMAKE_CXX_FLAGS={_generate_cmake_cxx_flags(env, dependencies)}',
f'-DCMAKE_CXX_STANDARD={_get_cmake_cxx_standard(env)}']
for dependency in dependencies:
for name, value in dependency.get('CMAKE_VARS', {}).items():
args.append(f'-D{name}={cmd_quote(value)}')
return args
def _calc_version_hash(env, dependencies: 'list[dict]') -> str:
return json.dumps({
'version': _VERSION,
'dependencies': dependencies,
'cxxflags': env['DEPS_CXXFLAGS']
})
def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = [], dependencies: 'list[dict]' = []) -> dict:
config = env['BUILD_TYPE']
build_dir = os.path.join(project_root, f'build_{config}')
install_dir = os.path.join(project_root, f'install_{config}')
is_built = os.path.exists(os.path.join(install_dir, _BUILT_STAMPFILE))
version_hash = _calc_version_hash(env, dependencies)
stamp_file = pathlib.Path(install_dir, _BUILT_STAMPFILE)
is_built = stamp_file.exists()
if is_built:
with stamp_file.open('r') as f:
build_version = f.read()
if build_version != version_hash:
print(f'Rebuilding CMake project at {project_root} as the script version changed.')
is_built = False
if not is_built:
shutil.rmtree(build_dir)
shutil.rmtree(install_dir)
if not is_built or env['UPDATE_REPOSITORIES']:
print(f'Building {project_root}, config {config}')
os.makedirs(build_dir, exist_ok=True)
@@ -26,14 +76,19 @@ def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str
'profile': 'RelWithDebInfo'
}.get(env['BUILD_TYPE'], 'RelWithDebInfo')
def run_cmd(args):
env.Execute(' '.join([str(s) for s in args]))
if env.Execute(' '.join([str(s) for s in args])):
Exit(1)
# TODO: is this a problem?
# environ = os.environ.copy()
# environ['CXXFLAGS'] = ' '.join(f'-D{define}' for define in env['CPPDEFINES']) # TODO: who cares about windows?
run_cmd(['cmake', '-G', 'Ninja', '-B', build_dir, f'-DCMAKE_BUILD_TYPE={build_type}', f'-DCMAKE_INSTALL_PREFIX={cmd_quote(install_dir)}', '-DBUILD_TESTING=OFF', *generate_args, project_root])
run_cmd(['cmake', '-G', 'Ninja', '-B', build_dir, f'-DCMAKE_BUILD_TYPE={build_type}',
f'-DCMAKE_INSTALL_PREFIX={cmd_quote(install_dir)}', '-DBUILD_TESTING=OFF',
*_generate_cmake_args(env, dependencies), *generate_args, project_root])
run_cmd(['cmake', '--build', *build_args, cmd_quote(build_dir)])
run_cmd(['cmake', '--install', *install_args, cmd_quote(build_dir)])
pathlib.Path(install_dir, _BUILT_STAMPFILE).touch()
with pathlib.Path(install_dir, _BUILT_STAMPFILE).open('w') as f:
f.write(version_hash)
libpath = []
for lib_folder in ('lib', 'lib64'):
@@ -43,6 +98,7 @@ def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str
return {
'install_dir': install_dir,
'BINPATH': [os.path.join(install_dir, 'bin')],
'LIBPATH': libpath,
'CPPPATH': [os.path.join(install_dir, 'include')]
}
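
The new dependencies parameter lets a recipe forward what was cooked for its own dependencies into the CMake build: each dict's CPPPATH is folded into CMAKE_C_FLAGS/CMAKE_CXX_FLAGS as -I flags, its CMAKE_VARS entries become -D definitions, and both feed the version hash so a change triggers a rebuild. A rough sketch from inside a recipe's cook function; the dependency name and flag are placeholders, and the dependency would have to be declared by the recipe's dependencies():

dep_conf = env.Cook('some_dependency')            # placeholder name; its cook result carries CPPPATH etc.
build_result = env.CMakeProject(
    project_root = repo['checkout_root'],
    generate_args = ['-DBUILD_SHARED_LIBS=OFF'],  # illustrative project-specific flag
    dependencies = [dep_conf])
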

View File

@@ -2,12 +2,12 @@
from git import Repo
from git.exc import GitError
import hashlib
import os
import inspect
from SCons.Script import *
Import('env')
def _gitbranch(env: Environment, repo_name: str, remote_url: str, git_ref: str = "main") -> dict:
def _clone(env: Environment, repo_name: str, remote_url: str):
repo_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, '_bare')
try:
repo = Repo(repo_dir)
@@ -16,24 +16,114 @@ def _gitbranch(env: Environment, repo_name: str, remote_url: str, git_ref: str =
print(f'Initializing git repository at {repo_dir}.')
repo = Repo.init(repo_dir, bare=True)
origin = repo.create_remote('origin', remote_url)
return repo, origin
def _git_branch(env: Environment, repo_name: str, remote_url: str, git_ref: str = 'main') -> dict:
repo, origin = _clone(env, repo_name, remote_url)
worktree_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, hashlib.shake_128(git_ref.encode('utf-8')).hexdigest(6)) # TODO: commit hash would be better, right? -> not if it's a branch!
update_submodules = False
if not os.path.exists(worktree_dir):
print(f'Checking out into {worktree_dir}.')
origin.fetch(tags=True)
origin.fetch(tags=True, force=True)
os.makedirs(worktree_dir)
repo.git.worktree('add', worktree_dir, git_ref)
worktree_repo = Repo(worktree_dir)
update_submodules = True
elif env['UPDATE_REPOSITORIES']:
worktree_repo = Repo(worktree_dir)
if not worktree_repo.head.is_detached:
print(f'Updating git repository at {worktree_dir}')
worktree_origin = worktree_repo.remotes['origin']
worktree_origin.pull()
update_submodules = True
else:
print(f'Not updating git repository {worktree_dir} as it is not on a branch.')
if update_submodules:
for submodule in worktree_repo.submodules:
submodule.update(init=True)
return {
'checkout_root': worktree_dir
'checkout_root': worktree_dir,
'repo': repo,
'origin': origin
}
def _git_tags(env: Environment, repo_name: str, remote_url: str, force_fetch: bool = False) -> 'list[str]':
repo, origin = _clone(env, repo_name, remote_url)
if force_fetch or env['UPDATE_REPOSITORIES']:
origin.fetch(tags=True)
return [t.name for t in repo.tags]
env.AddMethod(_gitbranch, 'GitBranch')
def _make_callable(val):
if callable(val):
return val
else:
def _wrapped(*args, **kwargs):
return val
return _wrapped
def _git_recipe(env: Environment, globals: dict, repo_name, repo_url, cook_fn, versions = None, tag_pattern = None, tag_fn = None, ref_fn = None, dependencies: dict = {}) -> None:
_repo_name = _make_callable(repo_name)
_repo_url = _make_callable(repo_url)
_tag_pattern = _make_callable(tag_pattern)
versions_cb = versions and _make_callable(versions)
dependencies_cb = _make_callable(dependencies)
def _versions(env: Environment, update: bool = False, options: dict = {}):
if 'ref' in options:
return [(0, 0, 0)] # no versions if compiling from a branch
pattern_signature = inspect.signature(_tag_pattern)
kwargs = {}
if 'options' in pattern_signature.parameters:
kwargs['options'] = options
pattern = _tag_pattern(env, **kwargs)
if pattern:
tags = env.GitTags(repo_name = _repo_name(env), remote_url = _repo_url(env), force_fetch=update)
result = []
for tag in tags:
match = pattern.match(tag)
if match:
result.append(tuple(int(part) for part in match.groups() if part is not None))
if len(result) == 0 and not update:
return _versions(env, update=True)
return result
elif versions_cb:
return versions_cb(env)
else:
return [(0, 0, 0)]
def _dependencies(env: Environment, version, options: dict) -> 'dict':
dependencies_signature = inspect.signature(dependencies_cb)
kwargs = {}
if 'options' in dependencies_signature.parameters:
kwargs['options'] = options
return dependencies_cb(env, version, **kwargs)
def _cook(env: Environment, version, options: dict = {}) -> dict:
if 'ref' in options:
git_ref = options['ref']
elif tag_fn:
tag_signature = inspect.signature(tag_fn)
kwargs = {}
if 'options' in tag_signature.parameters:
kwargs['options'] = options
git_ref = f'refs/tags/{tag_fn(version, **kwargs)}'
else:
assert ref_fn
git_ref = ref_fn(env, version)
repo = env.GitBranch(repo_name = _repo_name(env), remote_url = _repo_url(env), git_ref = git_ref)
cook_signature = inspect.signature(cook_fn)
kwargs = {}
if 'options' in cook_signature.parameters:
kwargs['options'] = options
return cook_fn(env, repo, **kwargs)
globals['versions'] = _versions
globals['dependencies'] = _dependencies
globals['cook'] = _cook
env.AddMethod(_git_branch, 'GitBranch')
env.AddMethod(_git_tags, 'GitTags')
env.AddMethod(_git_recipe, 'GitRecipe')
Return('env')
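
With these helpers a recipe module shrinks to a cook function plus one call to GitRecipe; versions(), dependencies() and cook() are injected into the module's globals, and _find_recipe in the main SConscript above assigns env to the module before executing it. A minimal recipe.py sketch — the repository, tag scheme and library name are placeholders, not taken from this changeset:

import re

def _cook(env, repo, options = {}):
    build_result = env.CMakeProject(project_root = repo['checkout_root'])
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': ['example'],  # placeholder library name
    }

# `env` is available here because _find_recipe assigns it to the recipe module before executing it.
env.GitRecipe(globals(),
              repo_name = 'example',
              repo_url = 'https://github.com/example/example.git',
              cook_fn = _cook,
              tag_pattern = re.compile(r'v(\d+)\.(\d+)\.(\d+)'),               # tags like v1.2.3 -> version (1, 2, 3)
              tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}')
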

View File

@@ -1,4 +1,5 @@
import os
import pathlib
Import('env')
@@ -31,8 +32,33 @@ def _wrap_jinja(orig_jinja):
return target
return _wrapped
def _find_file(env, fname):
for path in env['JINJA_FILE_SEARCHPATH']:
fullpath = os.path.join(path.abspath, fname)
if os.path.exists(fullpath):
return env.File(fullpath)
return None
def _file_size(env, fname: str) -> int:
file = _find_file(env, fname)
if not file:
env.Error(f'File does not exist: {fname}. Searched in: {[d.abspath for d in env["JINJA_FILE_SEARCHPATH"]]}')
return file.get_size()
def _file_content_hex(env, fname: str) -> str:
file = _find_file(env, fname)
if not file:
env.Error(f'File does not exist: {fname}. Searched in: {[d.abspath for d in env["JINJA_FILE_SEARCHPATH"]]}')
bytes = file.get_contents()
return ','.join([hex(byte) for byte in bytes])
env.AddMethod(_wrap_jinja(env.Jinja), 'Jinja')
env.Append(JINJA_FILTERS = {'load_config': _jinja_load_config})
env.Append(JINJA_GLOBALS = {
'file_size': lambda *args: _file_size(env, *args),
'file_content_hex': lambda *args: _file_content_hex(env, *args)
})
env.Append(JINJA_TEMPLATE_SEARCHPATH = ['data/jinja'])
env['JINJA_CONFIG_SEARCHPATH'] = [env.Dir('#data/config')]
env['JINJA_FILE_SEARCHPATH'] = [env.Dir('#')]
Return('env')
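
A sketch of how a project might use the new helpers to embed a binary file at build time — assuming the pre-existing Jinja builder takes the usual target/source pair, and with placeholder file and template names:

env.Append(JINJA_FILE_SEARCHPATH = [env.Dir('#data')])  # extra search path, e.g. for a library project
env.Jinja(target = 'generated/icon_data.hpp', source = 'templates/icon_data.hpp.jinja')

# where the template could contain a line such as:
#   constexpr unsigned char ICON_DATA[{{ file_size('icon.png') }}] = { {{ file_content_hex('icon.png') }} };
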

View File

@@ -0,0 +1,11 @@
import os
Import('env')
def _recipe_repository(env, repo_name: str, remote_url: str, git_ref: str = 'master') -> None:
repo = env.GitBranch(repo_name = os.path.join('recipe_repos', repo_name), remote_url = remote_url, git_ref = git_ref)
env.Append(SPP_RECIPES_FOLDERS = [os.path.join(repo['checkout_root'], 'recipes')])
env.AddMethod(_recipe_repository, 'RecipeRepo')
Return('env')

View File

@@ -0,0 +1,13 @@
FROM debian:sid-slim
RUN apt-get -y update && \
apt-get -y upgrade && \
apt-get -y install build-essential clang-19 gcc-14 g++-14 python3 python3-pip \
virtualenv python-is-python3 clang-tidy git ninja-build cmake
RUN ln -s /usr/bin/clang-19 /usr/local/bin/clang \
&& ln -s /usr/bin/clang++-19 /usr/local/bin/clang++ \
&& ln -s /usr/bin/clang-tidy-19 /usr/local/bin/clang-tidy \
&& ln -s /usr/bin/run-clang-tidy-19 /usr/local/bin/run-clang-tidy
COPY scripts /opt/scripts
RUN chmod a+x /opt/scripts/*.sh

View File

@@ -0,0 +1,9 @@
#!/bin/bash
set -xe
python -m virtualenv venv
source venv/bin/activate
pip install scons
pip install -r external/scons-plus-plus/requirements.txt
scons -j$(nproc) --build_type=debug --variant=linux_clang_debug --compiler=clang
scons -j$(nproc) --build_type=release_debug --variant=linux_clang_release_debug --compiler=clang
scons -j$(nproc) --build_type=release --variant=linux_clang_release --compiler=clang

View File

@@ -0,0 +1,9 @@
#!/bin/bash
set -xe
python -m virtualenv venv
source venv/bin/activate
pip install scons
pip install -r external/scons-plus-plus/requirements.txt
scons -j$(nproc) --build_type=debug --variant=linux_gcc_debug --compiler=gcc
scons -j$(nproc) --build_type=release_debug --variant=linux_gcc_release_debug --compiler=gcc
scons -j$(nproc) --build_type=release --variant=linux_gcc_release --compiler=gcc

View File

@@ -1,21 +0,0 @@
from SCons.Script import *
def cook(env: Environment, git_ref: str = 'master', own_main: bool = False) -> dict:
repo = env.GitBranch(repo_name = 'catch2', remote_url = 'https://github.com/catchorg/Catch2.git', git_ref = git_ref)
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root)
lib_name = {
'debug': 'Catch2d'
}.get(env['BUILD_TYPE'], 'Catch2')
libs = [lib_name]
if not own_main:
libs.append({
'debug': 'Catch2Maind'
}.get(env['BUILD_TYPE'], 'Catch2Main'))
return {
'LIBPATH': build_result['LIBPATH'],
'CPPPATH': build_result['CPPPATH'],
'LIBS': libs
}

View File

@@ -1,12 +0,0 @@
from SCons.Script import *
def cook(env: Environment, git_ref = 'main') -> dict:
repo = env.GitBranch(repo_name = 'ImageMagick', remote_url = 'https://github.com/ImageMagick/ImageMagick.git', git_ref = git_ref)
checkout_root = repo['checkout_root']
build_result = env.AutotoolsProject(checkout_root)
return {
'LIBPATH': build_result['LIBPATH'],
'CPPPATH': build_result['CPPPATH'],
'LIBS': ['backtrace']
}

View File

@@ -1,27 +0,0 @@
import os
import platform
from SCons.Script import *

def cook(env: Environment, git_ref: str = "main") -> dict:
    repo = env.GitBranch(repo_name = 'SDL', remote_url = 'https://github.com/libsdl-org/SDL.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root, generate_args = ['-DSDL_STATIC=ON', '-DSDL_SHARED=OFF'])
    libs = []
    if platform.system() == 'Windows':
        if env['BUILD_TYPE'] == 'debug':
            libs.append('SDL2-staticd')
        else:
            libs.append('SDL2-static')
        libs.extend(('kernel32', 'user32', 'gdi32', 'winmm', 'imm32', 'ole32', 'oleaut32', 'version', 'uuid', 'advapi32', 'setupapi', 'shell32', 'dinput8'))
    else:
        if env['BUILD_TYPE'] == 'debug':
            libs.append('SDL2d')
        else:
            libs.append('SDL2')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': [os.path.join(build_result['install_dir'], 'include/SDL2')],  # SDL is really weird about include paths ...
        'LIBS': libs
    }


@@ -1,13 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, remote: str = 'github', git_ref: str = 'main') -> dict:
    if remote == 'mewin':
        repo = env.GitBranch(repo_name = 'VulkanHeaders_mewin', remote_url = 'https://git.mewin.de/mewin/vulkan-headers.git', git_ref = git_ref)
    else:
        repo = env.GitBranch(repo_name = 'VulkanHeaders', remote_url = 'https://github.com/KhronosGroup/Vulkan-Headers.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [os.path.join(checkout_root, 'include')]
    }


@@ -1,10 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'argparse', remote_url = 'https://github.com/p-ranav/argparse.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [os.path.join(checkout_root, 'include')]
    }


@@ -1,12 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, version: str = "1.85.0") -> dict:
    # TODO: build binaries?
    url = f'https://archives.boost.io/release/{version}/source/boost_{version.replace(".", "_")}.tar.gz'
    repo = env.DownloadAndExtract(f'boost_{version}', url = url, skip_folders = 1)
    checkout_root = repo['extracted_root']
    return {
        'CPPPATH': [checkout_root]
    }


@@ -1,9 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'cgltf', remote_url = 'https://github.com/jkuhlmann/cgltf.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [checkout_root]
    }


@@ -1,16 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'fmt', remote_url = 'https://github.com/fmtlib/fmt.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root, generate_args = ['-DFMT_TEST=OFF'])
    lib_name = {
        'debug': 'fmtd'
    }.get(env['BUILD_TYPE'], 'fmt')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [lib_name]
    }


@@ -1,13 +0,0 @@
from SCons.Script import *

def cook(env: Environment, remote: str = 'github', git_ref: str = "master") -> dict:
    if remote == 'mewin':
        repo = env.GitBranch(repo_name = 'glm_mewin', remote_url = 'https://git.mewin.de/mewin/glm.git', git_ref = git_ref)
    else:
        repo = env.GitBranch(repo_name = 'glm', remote_url = 'https://github.com/g-truc/glm.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [checkout_root],
    }


@@ -1,83 +0,0 @@
from SCons.Script import *
import glob
import os
import pathlib
import platform
import shutil
import sys

_SCRIPT_STAMPFILE = '.spp_script_run'

def cook(env: Environment, remote: str = 'github', git_ref: str = '') -> dict:
    if remote == 'mewin':
        repo = env.GitBranch(repo_name = 'glslang_mewin', remote_url = 'https://git.mewin.de/mewin/glslang.git', git_ref = git_ref or 'master')
    else:
        repo = env.GitBranch(repo_name = 'glslang', remote_url = 'https://github.com/KhronosGroup/glslang.git', git_ref = git_ref or 'main')
    checkout_root = repo['checkout_root']
    # TODO: windows?
    did_run_script = os.path.exists(os.path.join(repo['checkout_root'], _SCRIPT_STAMPFILE))
    if not did_run_script or env['UPDATE_REPOSITORIES']:
        python_exe = os.path.realpath(sys.executable)
        script_file = os.path.join(repo['checkout_root'], 'update_glslang_sources.py')
        prev_cwd = os.getcwd()
        os.chdir(repo['checkout_root'])
        if env.Execute(f'"{python_exe}" {script_file}'):
            env.Exit(1)
        os.chdir(prev_cwd)
        pathlib.Path(repo['checkout_root'], _SCRIPT_STAMPFILE).touch()
    # generate the build_info.h
    generator_script = os.path.join(repo['checkout_root'], 'build_info.py')
    generator_script_input = os.path.join(repo['checkout_root'], 'build_info.h.tmpl')
    generator_script_output = os.path.join(repo['checkout_root'], 'glslang/build_info.h')
    env.Command(
        target = generator_script_output,
        source = [generator_script, generator_script_input, os.path.join(repo['checkout_root'], 'CHANGES.md')],
        action = f'"$PYTHON" "{generator_script}" "{repo["checkout_root"]}" -i "{generator_script_input}" -o "$TARGET"'
    )
    platform_source_dir = {
        'Linux': 'Unix',
        'Windows': 'Windows',
        'Darwin': 'Unix'
    }.get(platform.system(), 'Unix')
    glslang_source_files = env.RGlob(os.path.join(repo['checkout_root'], 'glslang/GenericCodeGen/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'glslang/MachineIndependent/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'glslang/OGLCompilersDLL/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'glslang/ResourceLimits/'), '*.cpp') \
        + env.RGlob(os.path.join(repo['checkout_root'], 'SPIRV/'), '*.cpp') \
        + [os.path.join(repo['checkout_root'], f'glslang/OSDependent/{platform_source_dir}/ossource.cpp')]
    # disable a few warnings when compiling with clang
    additional_cxx_flags = {
        'clang': ['-Wno-deprecated-copy', '-Wno-missing-field-initializers', '-Wno-gnu-redeclared-enum',
                  '-Wno-unused-but-set-variable', '-Wno-deprecated-enum-enum-conversion']
    }.get(env['COMPILER_FAMILY'], [])
    env.StaticLibrary(
        CCFLAGS = env['CCFLAGS'] + additional_cxx_flags,
        CPPPATH = repo['checkout_root'],
        target = env['LIB_DIR'] + '/glslang_full',
        source = glslang_source_files
    )
    # build the include folder
    include_dir = os.path.join(checkout_root, 'include')
    if not os.path.exists(include_dir) or env['UPDATE_REPOSITORIES']:
        def copy_headers(dst, src):
            os.makedirs(dst, exist_ok=True)
            for file in glob.glob(os.path.join(src, '*.h')):
                shutil.copy(file, dst)
        copy_headers(os.path.join(include_dir, 'glslang/HLSL'), os.path.join(checkout_root, 'glslang/HLSL'))
        copy_headers(os.path.join(include_dir, 'glslang/Include'), os.path.join(checkout_root, 'glslang/Include'))
        copy_headers(os.path.join(include_dir, 'glslang/MachineIndependent'), os.path.join(checkout_root, 'glslang/MachineIndependent'))
        copy_headers(os.path.join(include_dir, 'glslang/Public'), os.path.join(checkout_root, 'glslang/Public'))
        copy_headers(os.path.join(include_dir, 'glslang/SPIRV'), os.path.join(checkout_root, 'SPIRV'))
    return {
        'CPPPATH': [include_dir],
        'LIBS': ['glslang_full']
    }


@@ -1,30 +0,0 @@
from SCons.Script import *
import os

def cook(env: Environment, backends: list = [], git_ref: str = '') -> dict:
    repo = env.GitBranch(repo_name = 'imgui', remote_url = 'https://github.com/ocornut/imgui.git', git_ref = git_ref or 'master')
    imgui_source_files = [
        os.path.join(repo['checkout_root'], 'imgui.cpp'),
        os.path.join(repo['checkout_root'], 'imgui_draw.cpp'),
        os.path.join(repo['checkout_root'], 'imgui_tables.cpp'),
        os.path.join(repo['checkout_root'], 'imgui_widgets.cpp')
    ]
    imgui_add_sources = []
    backend_sources = {
        'vulkan': os.path.join(repo['checkout_root'], 'backends/imgui_impl_vulkan.cpp'),
        'sdl2': os.path.join(repo['checkout_root'], 'backends/imgui_impl_sdl2.cpp')
    }
    for backend in backends:
        imgui_add_sources.append(backend_sources[backend])
    lib_imgui = env.StaticLibrary(
        CPPPATH = [repo['checkout_root']],
        CPPDEFINES = ['IMGUI_IMPL_VULKAN_NO_PROTOTYPES=1'],
        target = env['LIB_DIR'] + '/imgui',
        source = imgui_source_files,
        add_source = imgui_add_sources
    )
    return lib_imgui
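
A short sketch (not part of the diff) of how the backends option might be requested from a consuming project, assuming env.Cook forwards keyword options to cook() in the same way the other recipe options are received; the variable name is made up.

# Hypothetical: build ImGui together with its Vulkan and SDL2 backend sources.
lib_imgui = env.Cook('imgui', backends = ['vulkan', 'sdl2'])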


@@ -1,8 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'iwa', remote_url = 'https://git.mewin.de/mewin/iwa.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return SConscript(os.path.join(checkout_root, 'LibConf'), exports = ['env'])


@@ -1,14 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'master') -> dict:
    if env['COMPILER_FAMILY'] not in ('gcc', 'clang'):
        env.Error('libbacktrace requires gcc or clang.')
    repo = env.GitBranch(repo_name = 'libbacktrace', remote_url = 'https://github.com/ianlancetaylor/libbacktrace.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(checkout_root)
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': ['backtrace']
    }


@@ -1,11 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'main') -> dict:
    repo = env.GitBranch(repo_name = 'libjpeg-turbo', remote_url = 'https://github.com/libjpeg-turbo/libjpeg-turbo.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(checkout_root)
    return {
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [env.FindLib('jpeg', paths=build_result['LIBPATH'])],
    }


@@ -1,13 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref = 'master') -> dict:
    lib_z = env.Cook('zlib')
    repo = env.GitBranch(repo_name = 'libpng', remote_url = 'https://git.code.sf.net/p/libpng/code.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(checkout_root)
    return {
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [env.FindLib('png16', paths=build_result['LIBPATH'])],
        'DEPENDENCIES': [lib_z]
    }


@@ -1,10 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'magic_enum', remote_url = 'https://github.com/Neargye/magic_enum.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [os.path.join(checkout_root, 'include')]
    }


@@ -1,14 +0,0 @@
from SCons.Script import *
import os

def cook(env: Environment, git_ref = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'mecab', remote_url = 'https://github.com/taku910/mecab.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.AutotoolsProject(os.path.join(checkout_root, 'mecab'))
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': ['mecab']
    }


@@ -1,8 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'mijin', remote_url = 'https://git.mewin.de/mewin/mijin2.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return SConscript(os.path.join(checkout_root, 'LibConf'), exports = ['env'])


@@ -1,12 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'mikktspace', remote_url = 'https://github.com/mmikk/MikkTSpace.git', git_ref = git_ref)
    lib_mikktspace = env.StaticLibrary(
        CPPPATH = [repo['checkout_root']],
        target = env['LIB_DIR'] + '/mikktspace',
        source = [os.path.join(repo['checkout_root'], 'mikktspace.c')]
    )
    return lib_mikktspace


@@ -1,22 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = 'v1.x', use_external_libfmt = False) -> dict:
    repo = env.GitBranch(repo_name = 'spdlog', remote_url = 'https://github.com/gabime/spdlog.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root)
    lib_name = {
        'debug': 'spdlogd'
    }.get(env['BUILD_TYPE'], 'spdlog')
    cppdefines = ['SPDLOG_COMPILE_LIB=1']
    if use_external_libfmt:
        cppdefines.append('SPDLOG_FMT_EXTERNAL=1')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'CPPDEFINES': cppdefines,
        'LIBS': [lib_name]
    }


@@ -1,9 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'stb', remote_url = 'https://github.com/nothings/stb.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    return {
        'CPPPATH': [checkout_root]
    }


@@ -1,15 +0,0 @@
from SCons.Script import *

def cook(env: Environment, git_ref: str = "master") -> dict:
    repo = env.GitBranch(repo_name = 'yaml-cpp', remote_url = 'https://github.com/jbeder/yaml-cpp', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root)
    lib_name = {
        'debug': 'yaml-cppd'
    }.get(env['BUILD_TYPE'], 'yaml-cpp')
    return {
        'LIBPATH': build_result['LIBPATH'],
        'CPPPATH': build_result['CPPPATH'],
        'LIBS': [lib_name]
    }


@@ -1,12 +0,0 @@
import os
from SCons.Script import *

def cook(env: Environment, git_ref: str = 'master') -> dict:
    repo = env.GitBranch(repo_name = 'zlib', remote_url = 'https://github.com/madler/zlib.git', git_ref = git_ref)
    checkout_root = repo['checkout_root']
    build_result = env.CMakeProject(project_root=checkout_root)
    return {
        # zlib's CMake install places the headers in <install_dir>/include
        'CPPPATH': [os.path.join(build_result['install_dir'], 'include')],
        'LIBS': [env.FindLib('z', paths=build_result['LIBPATH'])]
    }


@@ -1,2 +1,4 @@
GitPython
psutil
Jinja2
requests


@@ -0,0 +1,8 @@
# Default ignored files
/shelf/
/workspace.xml
# Editor-based HTTP Client requests
/httpRequests/
# Datasource local storage ignored files
/dataSources/
/dataSources.local.xml


@@ -0,0 +1,35 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="CLionExternalBuildManager">
    {% for executable in project.executables %}
    {% for build_type in project.build_types %}
    {% set build_type_name = build_type | capitalize -%}
    <target id="{{ generate_uuid('target_' + executable.name + '_' + build_type) }}" name="{{ executable.name }} {{ build_type_name }}" defaultType="TOOL">
      <configuration id="{{ generate_uuid('configuration_' + executable.name + '_' + build_type) }}" name="{{ executable.name }} {{ build_type_name }}">
        <build type="TOOL">
          <tool actionId="Tool_External Tools_{{ executable.name }} {{ build_type_name }}" />
        </build>
        <clean type="TOOL">
          <tool actionId="Tool_External Tools_{{ executable.name }} {{ build_type_name }} Clean" />
        </clean>
      </configuration>
    </target>
    {% endfor %}
    {% endfor %}
    {% for library in project.libraries %}
    {% for build_type in project.build_types %}
    {% set build_type_name = build_type | capitalize -%}
    <target id="{{ generate_uuid('target_' + library.name + '_' + build_type) }}" name="{{ library.name }} {{ build_type_name }}" defaultType="TOOL">
      <configuration id="{{ generate_uuid('configuration_' + library.name + '_' + build_type) }}" name="{{ library.name }} {{ build_type_name }}">
        <build type="TOOL">
          <tool actionId="Tool_External Tools_{{ library.name }} {{ build_type_name }}" />
        </build>
        <clean type="TOOL">
          <tool actionId="Tool_External Tools_{{ library.name }} {{ build_type_name }} Clean" />
        </clean>
      </configuration>
    </target>
    {% endfor %}
    {% endfor %}
  </component>
</project>


@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="CompDBSettings">
    <option name="linkedExternalProjectsSettings">
      <CompDBProjectSettings>
        <option name="externalProjectPath" value="$PROJECT_DIR$" />
        <option name="modules">
          <set>
            <option value="$PROJECT_DIR$" />
          </set>
        </option>
      </CompDBProjectSettings>
    </option>
  </component>
  <component name="CompDBWorkspace" PROJECT_DIR="$PROJECT_DIR$" />
  <component name="ExternalStorageConfigurationManager" enabled="true" />
</project>


@@ -0,0 +1,40 @@
<toolSet name="External Tools">
  {% for executable in project.executables %}
  {% for build_type in project.build_types %}
  {% set build_type_name = build_type | capitalize -%}
  <tool name="{{ executable.name }} {{ build_type_name }}" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
    <exec>
      <option name="COMMAND" value="{{ scons_exe }}" />
      <option name="PARAMETERS" value="-j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) }} compile_commands.json" />
      <option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
    </exec>
  </tool>
  <tool name="{{ executable.name }} {{ build_type_name }} Clean" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
    <exec>
      <option name="COMMAND" value="{{ scons_exe }}" />
      <option name="PARAMETERS" value="--build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) }} -c" />
      <option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
    </exec>
  </tool>
  {% endfor %}
  {% endfor %}
  {% for library in project.libraries %}
  {% for build_type in project.build_types %}
  {% set build_type_name = build_type | capitalize -%}
  <tool name="{{ library.name }} {{ build_type_name }}" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
    <exec>
      <option name="COMMAND" value="{{ scons_exe }}" />
      <option name="PARAMETERS" value="-j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) }} compile_commands.json" />
      <option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
    </exec>
  </tool>
  <tool name="{{ library.name }} {{ build_type_name }} Clean" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
    <exec>
      <option name="COMMAND" value="{{ scons_exe }}" />
      <option name="PARAMETERS" value="--build_type={{ build_type }} --unity=disable {{ library.filename(build_type) }} -c" />
      <option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
    </exec>
  </tool>
  {% endfor %}
  {% endfor %}
</toolSet>


@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="VcsDirectoryMappings">
    <mapping directory="" vcs="Git" />
    <mapping directory="$PROJECT_DIR$/external/scons-plus-plus" vcs="Git" />
  </component>
</project>


@@ -0,0 +1,109 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="AutoImportSettings">
    <option name="autoReloadType" value="SELECTIVE" />
  </component>
  <component name="CMakeRunConfigurationManager">
    <generated />
  </component>
  <component name="CMakeSettings">
    <configurations>
      <configuration PROFILE_NAME="Debug" ENABLED="true" CONFIG_NAME="Debug" />
    </configurations>
  </component>
  <component name="ClangdSettings">
    <option name="formatViaClangd" value="false" />
  </component>
  <component name="CompDBLocalSettings">
    <option name="availableProjects">
      <map>
        <entry>
          <key>
            <ExternalProjectPojo>
              <option name="name" value="{{ project.name }}" />
              <option name="path" value="$PROJECT_DIR$" />
            </ExternalProjectPojo>
          </key>
          <value>
            <list>
              <ExternalProjectPojo>
                <option name="name" value="{{ project.name }}" />
                <option name="path" value="$PROJECT_DIR$" />
              </ExternalProjectPojo>
            </list>
          </value>
        </entry>
      </map>
    </option>
    <option name="projectSyncType">
      <map>
        <entry key="$PROJECT_DIR$" value="RE_IMPORT" />
      </map>
    </option>
  </component>
  <component name="ExternalProjectsData">
    <projectState path="$PROJECT_DIR$">
      <ProjectState />
    </projectState>
  </component>
  <component name="Git.Settings">
    <option name="RECENT_GIT_ROOT_PATH" value="$PROJECT_DIR$" />
  </component>
  <component name="ProjectColorInfo">{
  &quot;associatedIndex&quot;: 5
}</component>
  <component name="ProjectViewState">
    <option name="hideEmptyMiddlePackages" value="true" />
    <option name="showLibraryContents" value="true" />
  </component>
  <component name="PropertiesComponent"><![CDATA[{
  "keyToString": {
    {% for executable in project.executables -%}
    {% for build_type in project.build_types -%}
    {% set build_type_name = build_type | capitalize -%}
    "Custom Build Application.{{ executable.name }} {{ build_type_name }}.executor": "Debug",
    {% endfor -%}
    {% endfor -%}
    "RunOnceActivity.RadMigrateCodeStyle": "true",
    "RunOnceActivity.ShowReadmeOnStart": "true",
    "RunOnceActivity.cidr.known.project.marker": "true",
    "RunOnceActivity.readMode.enableVisualFormatting": "true",
    "cf.first.check.clang-format": "false",
    "cidr.known.project.marker": "true",
    "git-widget-placeholder": "master",
    "node.js.detected.package.eslint": "true",
    "node.js.detected.package.tslint": "true",
    "node.js.selected.package.eslint": "(autodetect)",
    "node.js.selected.package.tslint": "(autodetect)",
    "nodejs_package_manager_path": "npm",
    "settings.editor.selected.configurable": "CLionExternalConfigurable",
    "vue.rearranger.settings.migration": "true"
  }
}]]></component>
  <component name="RunManager" selected="Custom Build Application.{{ project.executables[0].name }} {{ project.build_types[0] }}">
    {% for executable in project.executables -%}
    {% for build_type in project.build_types -%}
    {% set build_type_name = build_type | capitalize -%}
    <configuration name="{{ executable.name }} {{ build_type_name }}" type="CLionExternalRunConfiguration" factoryName="Application" REDIRECT_INPUT="false" ELEVATE="false" USE_EXTERNAL_CONSOLE="false" EMULATE_TERMINAL="false" WORKING_DIR="file://$ProjectFileDir$" PASS_PARENT_ENVS_2="true" PROJECT_NAME="{{ project.name }}" TARGET_NAME="{{ executable.name }} {{ build_type_name }}" CONFIG_NAME="{{ executable.name }} {{ build_type_name }}" RUN_PATH="$PROJECT_DIR$/{{ executable.filename(build_type) }}">
      <method v="2">
        <option name="CLION.EXTERNAL.BUILD" enabled="true" />
      </method>
    </configuration>
    {% endfor -%}
    {% endfor -%}
    {% for library in project.libraries -%}
    {% for build_type in project.build_types -%}
    {% set build_type_name = build_type | capitalize -%}
    <configuration name="{{ library.name }} {{ build_type_name }}" type="CLionExternalRunConfiguration" factoryName="Application" REDIRECT_INPUT="false" ELEVATE="false" USE_EXTERNAL_CONSOLE="false" EMULATE_TERMINAL="false" PASS_PARENT_ENVS_2="true" PROJECT_NAME="{{ project.name }}" TARGET_NAME="{{ library.name }} {{ build_type_name }}" CONFIG_NAME="{{ library.name }} {{ build_type_name }}">
      <method v="2">
        <option name="CLION.EXTERNAL.BUILD" enabled="true" />
      </method>
    </configuration>
    {% endfor -%}
    {% endfor -%}
  </component>
  <component name="SpellCheckerSettings" RuntimeDictionaries="0" Folders="0" CustomDictionaries="0" DefaultDictionary="application-level" UseSingleDictionary="true" transferred="true" />
  <component name="TypeScriptGeneratedFilesManager">
    <option name="version" value="3" />
  </component>
</project>


@@ -0,0 +1,6 @@
{
  "recommendations": [
    "ms-vscode.cpptools",
    "llvm-vs-code-extensions.vscode-clangd"
  ]
}


@@ -0,0 +1,20 @@
{
  "configurations": [
    {% for executable in project.executables %}
    {% for build_type in project.build_types %}
    {% set build_type_name = build_type | capitalize -%}
    {
      "name": "{{ executable.name }} ({{ build_type_name }})",
      "type": "cppvsdbg",
      "request": "launch",
      "program": "{{ executable.filename(build_type) | escape_path }}",
      "args": [],
      "stopAtEntry": false,
      "cwd": "${workspaceFolder}",
      "environment": [],
      "console": "integratedTerminal"
    },
    {% endfor %}
    {% endfor %}
  ]
}


@@ -0,0 +1,69 @@
{
  // See https://go.microsoft.com/fwlink/?LinkId=733558
  // for the documentation about the tasks.json format
  "version": "2.0.0",
  "tasks": [
    {% for executable in project.executables %}
    {% for build_type in project.build_types %}
    {% set build_type_name = build_type | capitalize -%}
    {
      "label": "{{ executable.name }} {{ build_type_name }}",
      "type": "shell",
      "command": "{{ scons_exe | escape_path }} -j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) | escape_path }} compile_commands.json",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "group": {
        "kind": "build",
        "isDefault": false
      }
    },
    {
      "label": "{{ executable.name }} {{ build_type_name }} Clean",
      "type": "shell",
      "command": "{{ scons_exe | escape_path }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) | escape_path }} -c",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "group": {
        "kind": "build",
        "isDefault": false
      }
    },
    {% endfor %}
    {% endfor %}
    {% for library in project.libraries %}
    {% for build_type in project.build_types %}
    {% set build_type_name = build_type | capitalize -%}
    {
      "label": "{{ library.name }} {{ build_type_name }}",
      "type": "shell",
      "command": "{{ scons_exe | escape_path }} -j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) | escape_path }} compile_commands.json",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "group": {
        "kind": "build",
        "isDefault": false
      }
    },
    {
      "label": "{{ library.name }} {{ build_type_name }} Clean",
      "type": "shell",
      "command": "{{ scons_exe | escape_path }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) | escape_path }} -c",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "group": {
        "kind": "build",
        "isDefault": false
      }
    },
    {% endfor %}
    {% endfor %}
  ]
}