Compare commits

..

81 Commits

Author SHA1 Message Date
9436d2c48d Added JINJA_FILE_SEARCHPATH for Jinja file functions to make it easier for library projects to find their files. 2025-03-28 14:52:14 +01:00
2769fd801f Added file_size and file_content_hex Jinja filters.
Added check if Jinja module exists and appropriate error message to project generation.
2025-03-28 11:35:59 +01:00
Patrick Wuttke 5f11d64744 Fixed generation of VSCode launch.json. 2025-03-18 09:45:21 +01:00
1401fdea90 Next attempt of adding build type to executable and library names. 2025-03-14 22:02:40 +01:00
Patrick Wuttke 3a3c79d240 Revert "Updated to include build type and variant in binary names, so they don't need to be rebuilt everytime the configuration is changed."
Modifying the suffix variables broke the library file detection, at least on Windows.

This reverts commit e6e7dbe642.
2025-03-14 09:49:21 +01:00
e6e7dbe642 Updated to include build type and variant in binary names, so they don't need to be rebuilt everytime the configuration is changed. 2025-03-13 23:42:58 +01:00
1edf745b39 Clone recipe repositories into a separate subfolder. 2025-03-13 23:01:09 +01:00
9e5fc05f4f Introduced recipe repositories. 2025-03-13 22:58:49 +01:00
d03b7097f7 Fixed SQLite build on Linux. 2025-03-13 18:36:17 +01:00
ba5def01b0 Fixed imgui-node-editor build when using GCC. 2025-03-13 18:36:07 +01:00
Patrick Wuttke 4082f8cb22 Added build type to object suffix so switching between configs is a lot quicker. 2025-03-13 15:00:24 +01:00
Patrick Wuttke 0bec9f7062 Added project template for VSCode (only Windows for now). 2025-03-13 10:42:48 +01:00
283aa0f99c Added way for dependencies to use options and fixed compilation of ImGui with SDL3 backend. 2025-03-13 09:57:39 +01:00
71f8631e48 Added recipe for portable-file-dialogs. 2025-03-12 15:50:16 +01:00
ae739b9946 Fixed Release and Release_Debug build on windows. 2025-03-11 17:46:57 +01:00
2cc3145669 Added recipe for ImGui Node Editor. 2025-03-11 17:46:39 +01:00
5159c80167 Added recipe for SQLite on Windows (still WIP). 2025-03-11 09:35:38 +01:00
748f8f229f (WIP) SQLite recipe. 2025-03-10 09:55:34 +01:00
b034e87aaf Properly disable exceptions in MSVC. 2025-03-07 11:00:17 +01:00
726d206f9a Disabled tests for fmt. 2025-03-07 10:59:41 +01:00
5debc6d334 Added recipe for icon font cpp headers. 2025-03-07 10:48:56 +01:00
5c6a8a1cc6 Fixed issue with dependency injection not working due to the 'add target via name in dependencies' shortcut :/. 2025-03-02 22:20:21 +01:00
9d3d52ffaf Added recipe for RAID. 2025-03-02 22:19:22 +01:00
c994752c32 Added option for ImGui recipe to build from a docking build and to provide backends to compile via options. 2025-03-02 16:00:48 +01:00
42dada23c9 Added CXX_NO_EXCEPTIONS config to disable exceptions. 2025-03-02 16:00:12 +01:00
0e22057fc4 Added recipe for sdlpp. 2025-02-14 22:37:27 +01:00
c883f3c1c7 Added DXC recipe. 2025-01-19 01:17:35 +01:00
bca2398828 Added recipes for some Linux system libraries. 2025-01-12 13:11:43 +01:00
10a5239b7f Fixed default value for enable_asan. 2025-01-09 17:56:28 +01:00
0b8d115907 Added option for address sanitization on MSVC. 2024-12-26 14:48:18 +01:00
134fb106a8 Added cxxflags to cmake version hash calculation. 2024-12-13 23:39:27 +01:00
b546931c09 Added libraries to automatic CLion project generation. 2024-12-13 00:52:10 +01:00
7c4e403747 Fixed up Clang warnings. 2024-12-08 22:38:52 +01:00
c24c864915 Disabled Clang 'missing-field-initializers' warning. 2024-12-06 20:36:37 +01:00
fe95a92cf6 Added GCC build script. 2024-12-06 09:46:54 +01:00
1bdc1ef7a0 Used latest compiler version. 2024-12-05 17:03:44 +01:00
ff219840ea Added chmod to make scripts executable. 2024-12-05 16:59:17 +01:00
ee55878b18 Added build script stuff. 2024-12-05 16:31:59 +01:00
55893909f0 Added requests to requirements. 2024-11-21 00:08:30 +01:00
349a08b084 Added force parameter when fixing git tags so stuff compiles even if tags are updated. Also added the (yet missing) nghttp2 lib as a dependency to CURL if a version >= 8.10.0 is compiled. 2024-11-20 20:23:14 +01:00
5490de84eb Fixed Lua compilation on Windows. 2024-11-14 10:02:14 +01:00
70fc36335a Added GSL recipe. 2024-11-09 21:36:32 +01:00
ca1466469f Fixed lua recipe. 2024-11-08 12:36:59 +01:00
42a433582c Added Lua recipe. 2024-11-05 10:08:42 +01:00
0dcd8b383c Added DX12 recipe. 2024-10-30 11:10:30 +01:00
946cfc57ce Fixed wrong project name used when generating CLion project. 2024-10-27 13:52:00 +01:00
28c1675b87 Fixed typo. 2024-10-25 09:46:09 +02:00
7345305a77 Fixed CURL and Boost compilation on Windows. 2024-10-25 09:42:03 +02:00
651c8770c3 Added recipe for json and updated yaml-cpp recipe. 2024-10-25 08:37:07 +02:00
071cd207a9 Removed some old code. 2024-10-23 23:55:59 +02:00
a479e90335 Fixed compilation with MSVC. 2024-10-23 23:48:46 +02:00
cdbec36b8f Added recipes for curl, libidn2, libpsl and libunistring. 2024-10-23 23:48:46 +02:00
f2dc9872f7 Automatically apply patch when compiling SDL3. 2024-10-13 13:36:01 +02:00
2b05834798 Allow specifying a library from the current project as a dependency. 2024-10-12 12:30:21 +02:00
e3b3fd8f7c Save and reuse UUIDs between project generations. 2024-10-11 16:57:31 +02:00
329278c5f5 Fixed generated CLion project for platforms that use PROGPREFIX or PROGSUFFIX (e.g. Windows). 2024-10-11 10:21:21 +02:00
a0bbb46e51 Cleaned up/updated CLion project generation code and templates. 2024-10-10 23:23:29 +02:00
c6bba0e440 Added generation of CLion projects. 2024-10-10 00:06:58 +02:00
0a29b41639 Fixed (hopefully) the depends functionality. 2024-09-20 00:14:17 +02:00
a7c736de56 Added recipe for rectpack2D. 2024-09-19 14:57:37 +02:00
b45fec7561 Added enable_hlsl option to glslang. 2024-09-16 09:50:38 +02:00
c74fbb8798 Added recipe for tinyobjloader. 2024-09-15 14:45:47 +02:00
fe8f329b38 Fixed library name for debug SDL3 on Windows. 2024-09-12 15:09:06 +02:00
ae870e798d Added SHOW_INCLUDES debug setting (only MSVC for now). 2024-09-12 15:09:06 +02:00
941f94a7b6 Adjusted SDL build script to work with SDL3 (kind of). 2024-09-11 09:00:26 +02:00
1b8d6e175b Added options to version specs and allow specifying a git ref to build instead of a version. 2024-09-11 09:00:03 +02:00
bdf063a16c Added openssl recipe. 2024-09-09 21:52:57 +02:00
6cd076394d Added recipe for winsock2 and target_os to dependency conditions. 2024-09-09 21:52:57 +02:00
0175b812a2 Fixed Catch2 recipe. 2024-09-09 21:52:57 +02:00
eb044b6d2f Fixed compilation with MSVC. 2024-09-09 21:52:56 +02:00
e6adeb3a20 Moved check of SYSTEM_CACHE_DIR accessibility to before it is used. 2024-09-09 21:52:13 +02:00
fa83ff4581 Adjusted error description to make more sense. 2024-09-09 21:51:45 +02:00
29791d2e9c Fixed zlib recipe on linux. 2024-09-09 21:51:45 +02:00
4e579121db Fixed compilation with MSVC. 2024-09-09 21:51:36 +02:00
cd727e5a1d Update to new recipe system (S++ 2.0). 2024-09-09 21:50:47 +02:00
6f83b68788 Some more work on the new dependency resolution system. 2024-09-09 21:50:47 +02:00
17ee9777ed Some tests. 2024-09-09 21:50:47 +02:00
ecb8ea9a74 Allow settings COMPILATIONDB_FILTER_FILES via config. 2024-08-18 11:15:32 +02:00
a3426f1132 Enable experimental library features (jthread) for clang. 2024-08-18 09:58:30 +02:00
e9438455ea Moved check of SYSTEM_CACHE_DIR accessibility to before it is used. 2024-08-18 09:51:04 +02:00
cc5fd2d668 Added CXXFLAGS and CFLAGS to config variables. 2024-08-18 09:47:24 +02:00
49 changed files with 786 additions and 1114 deletions

View File

@@ -1,23 +1,36 @@
import copy
import enum
import glob
import inspect
import json
import multiprocessing
import os
import pathlib
import psutil
import shutil
import sys
import time
import uuid
class TargetType(enum.Enum):
PROGRAM = 0
STATIC_LIBRARY = 1
SHARED_LIBRARY = 2
class _VersionSpec:
minimum_version = None
maximum_version = None
options = {}
def __init__(self, minimum_version = None, maximum_version = None):
def __init__(self, minimum_version = None, maximum_version = None, options = {}):
self.minimum_version = minimum_version
self.maximum_version = maximum_version
self.options = options
def __str__(self):
return f'Min: {self.minimum_version}, Max: {self.maximum_version}'
return f'Min: {self.minimum_version}, Max: {self.maximum_version}, Options: {self.options}'
class _Dependency:
name: str = ''
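
The _VersionSpec above now carries an options dict next to the version bounds. As an illustration, a minimal sketch of how a target might request a dependency with options — the recipe name 'imgui', the option keys and the use of version tuples are assumptions, while the 'min'/'max'/'options' spec keys match _parse_version_spec further down:

    env.Program(
        target = 'bin/demo',
        source = ['src/main.cpp'],
        dependencies = {
            # hypothetical recipe name and option keys
            'imgui': {
                'min': (1, 90, 0),
                'options': {'docking': True, 'backends': ['sdl3', 'vulkan']},
            },
        },
    )
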
@@ -28,6 +41,8 @@ class _Dependency:
cook_result: dict = {}
class _Target:
name: str
target_type: TargetType
builder = None
args: list = []
kwargs: dict = {}
@@ -39,13 +54,19 @@ def _find_recipe(env: Environment, recipe_name: str):
return env['SPP_RECIPES'][recipe_name]
import importlib.util
source_file = None
for folder in env['RECIPES_FOLDERS']:
try_source_file = f'{folder.abspath}/{recipe_name}/recipe.py'
if not env['SPP_RECIPES_FOLDERS']:
env.Error('No recipe repositories set. Add one using env.RecipeRepo(<name>, <url>, <branch>).')
for folder in env['SPP_RECIPES_FOLDERS']:
from SCons import Node
if isinstance(folder, Node.Node):
folder = folder.abspath
try_source_file = f'{folder}/{recipe_name}/recipe.py'
if os.path.exists(try_source_file):
source_file = try_source_file
break
if not source_file:
raise Exception(f'Could not find recipe {recipe_name}.')
env.Error(f'Could not find recipe for {recipe_name}.')
spec = importlib.util.spec_from_file_location(recipe_name, source_file)
recipe = importlib.util.module_from_spec(spec)
recipe.env = env
@@ -53,12 +74,19 @@ def _find_recipe(env: Environment, recipe_name: str):
env['SPP_RECIPES'][recipe_name] = recipe
return recipe
def _run_cook(dependency: _Dependency):
if not dependency.cook_result:
cook_signature = inspect.signature(dependency.recipe.cook)
kwargs = {}
if 'options' in cook_signature.parameters:
kwargs['options'] = dependency.version_spec.options
dependency.cook_result = dependency.recipe.cook(env, dependency.version, **kwargs)
def _cook(env: Environment, recipe_name: str):
dependency = env['SPP_DEPENDENCIES'].get(recipe_name)
if not dependency:
raise Exception(f'Cannot cook {recipe_name} as it was not listed as a dependency.')
if not dependency.cook_result:
dependency.cook_result = dependency.recipe.cook(env, dependency.version)
_run_cook(dependency)
return dependency.cook_result
def _module(env: Environment, file: str):
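
The inspect.signature check above keeps older recipes working: options are only forwarded when a recipe's cook() actually declares an options parameter. A standalone sketch of that dispatch (the function names here are illustrative, not part of the build system):

    import inspect

    def cook_legacy(env, version):                       # recipe without options support
        return {'version': version}

    def cook_with_options(env, version, options = {}):   # recipe that opts in
        return {'version': version, 'options': options}

    def run_cook(cook_fn, env, version, options):
        # forward options only if cook_fn declares the parameter
        kwargs = {}
        if 'options' in inspect.signature(cook_fn).parameters:
            kwargs['options'] = options
        return cook_fn(env, version, **kwargs)

    print(run_cook(cook_legacy, None, (1, 0, 0), {'docking': True}))
    print(run_cook(cook_with_options, None, (1, 0, 0), {'docking': True}))
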
@@ -69,6 +97,7 @@ def _parse_lib_conf(env: Environment, lib_conf: dict) -> None:
CPPDEFINES = lib_conf.get('CPPDEFINES', []),
LIBPATH = lib_conf.get('LIBPATH', []),
LIBS = lib_conf.get('LIBS', []),
LINKFLAGS = lib_conf.get('LINKFLAGS', []),
JINJA_TEMPLATE_SEARCHPATH = lib_conf.get('JINJA_TEMPLATE_SEARCHPATH', []))
def _inject_list(kwargs: dict, dependency: dict, list_name: str) -> None:
@@ -84,20 +113,30 @@ def _inject_dependency(dependency, kwargs: dict, add_sources: bool = True) -> No
_inject_list(kwargs, dependency, 'CPPDEFINES')
_inject_list(kwargs, dependency, 'LIBPATH')
_inject_list(kwargs, dependency, 'LIBS')
_inject_list(kwargs, dependency, 'LINKFLAGS')
if add_sources and 'ADDITIONAL_SOURCES' in dependency and hasattr(kwargs['source'], 'extend'):
kwargs['source'].extend(dependency['ADDITIONAL_SOURCES'])
if 'DEPENDENCIES' in dependency:
for inner_dependency in dependency['DEPENDENCIES']:
_inject_dependency(inner_dependency, kwargs, False)
elif isinstance(dependency, _Dependency):
if not dependency.cook_result:
dependency.cook_result = dependency.recipe.cook(env, dependency.version)
_run_cook(dependency)
_inject_list(kwargs, dependency.cook_result, 'CPPPATH')
_inject_list(kwargs, dependency.cook_result, 'CPPDEFINES')
_inject_list(kwargs, dependency.cook_result, 'LIBPATH')
_inject_list(kwargs, dependency.cook_result, 'LIBS')
_inject_list(kwargs, dependency.cook_result, 'LINKFLAGS')
for depdep in dependency.depdeps:
_inject_dependency(depdep, kwargs)
elif isinstance(dependency, _Target):
_inject_list(kwargs, dependency.kwargs, 'CPPPATH')
_inject_list(kwargs, dependency.kwargs, 'CPPDEFINES')
_inject_list(kwargs, dependency.kwargs, 'LIBPATH')
_inject_list(kwargs, dependency.kwargs, 'LIBS')
_inject_list(kwargs, {'LIBS': [dependency]}, 'LIBS')
_inject_list(kwargs, dependency.kwargs, 'LINKFLAGS')
for depdep in dependency.dependencies:
_inject_dependency(depdep, kwargs)
def _rglob(env: Environment, root_path: str, pattern: str, **kwargs):
result_nodes = []
@@ -149,6 +188,30 @@ def _make_interface(env: Environment, dependencies: list = []):
'CPPDEFINES': kwargs.get('CPPDEFINES', [])
}
def _exe_filename(env: Environment, name: str, type: str = 'static') -> str:
if os.name == 'posix':
return name
elif os.name == 'nt':
return f'{name}.exe'
else:
raise Exception('What OS is this?')
def _find_executable(env: Environment, name: str, paths: 'list[str]', type : str = 'static', allow_fail: bool = False, use_glob: bool = False):
fname = _exe_filename(env, name, type)
for path in paths:
lib_path = os.path.join(path, fname)
if use_glob:
files = glob.glob(lib_path)
if len(files) == 1:
return files[0]
elif len(files) > 1:
raise Exception(f'Multiple candidates found for executable with name {name} in paths: "{", ".join(paths)}", candidates: "{", ".join(files)}".')
elif os.path.exists(lib_path):
return lib_path
if allow_fail:
return None
raise Exception(f'Could not find executable with name {name} in paths: "{", ".join(paths)}" filename: "{fname}".')
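
FindExecutable complements FindLib for recipes that build helper tools. A sketch of a recipe cook() using it — the repository name, URL and tool name are hypothetical; 'BINPATH' is the key returned by env.CMakeProject in the cmake module further down:

    def cook(env, version):
        repo = env.GitBranch(repo_name = 'sometool',
                             remote_url = 'https://example.org/sometool.git',
                             git_ref = f'refs/tags/v{version[0]}.{version[1]}.{version[2]}')
        build_result = env.CMakeProject(project_root = repo['checkout_root'])
        # allow_fail=True returns None instead of raising when nothing matches
        exe = env.FindExecutable('sometool', paths = build_result['BINPATH'], allow_fail = True)
        if exe is None:
            env.Error('Could not find the sometool executable after building.')
        return {
            'CPPPATH': build_result['CPPPATH'],
            'SOMETOOL_EXE': exe,  # extra keys are ignored by injection but stay in the cook result
        }
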
def _lib_filename(env: Environment, name: str, type: str = 'static') -> str:
if os.name == 'posix':
ext = {
@@ -183,10 +246,37 @@ def _find_lib(env: Environment, name: str, paths: 'list[str]', type : str = 'sta
def _error(env: Environment, message: str):
print(message, file=sys.stderr)
env.Exit(1)
Exit(1)
def _find_common_depenency_version(name: str, versionA: _VersionSpec, versionB: _VersionSpec) -> _VersionSpec:
result_version = _VersionSpec()
def _try_merge_dicts(dictA: dict, dictB: dict) -> 'dict|None':
result = {}
for key, valueA in dictA.items():
if key in dictB:
valueB = dictB[key]
if type(valueA) != type(valueB):
return None
elif type(valueA) == list:
result[key] = valueA + valueB
elif type(valueA) == dict:
mergedValue = _try_merge_dicts(valueA, valueB)
if mergedValue is None:
return None
result[key] = mergedValue
elif valueA != valueB:
return None
else:
result[key] = valueA
for key, valueB in dictB.items():
if key not in result:
result[key] = valueB
return result
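
To make the merge rules concrete, here is a small self-contained re-statement of _try_merge_dicts with example inputs (the option names are hypothetical): lists concatenate, nested dicts merge recursively, equal scalars are kept, and conflicting scalars make the whole merge fail.

    def try_merge(a: dict, b: dict) -> 'dict|None':
        result = dict(b)
        for key, value_a in a.items():
            if key not in b:
                result[key] = value_a
            elif type(value_a) != type(b[key]):
                return None                       # type mismatch -> incompatible
            elif isinstance(value_a, list):
                result[key] = value_a + b[key]    # lists concatenate
            elif isinstance(value_a, dict):
                merged = try_merge(value_a, b[key])
                if merged is None:
                    return None
                result[key] = merged              # dicts merge recursively
            elif value_a != b[key]:
                return None                       # conflicting scalars -> incompatible
        return result

    print(try_merge({'backends': ['sdl3']}, {'backends': ['vulkan'], 'docking': True}))
    # -> {'backends': ['sdl3', 'vulkan'], 'docking': True}
    print(try_merge({'docking': True}, {'docking': False}))
    # -> None: two consumers requested incompatible options for the same dependency
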
def _find_common_dependency_version(name: str, versionA: _VersionSpec, versionB: _VersionSpec) -> _VersionSpec:
options = _try_merge_dicts(versionA.options, versionB.options)
if options is None:
return None
result_version = _VersionSpec(options=options)
if versionA.minimum_version is not None:
if versionB.minimum_version is not None:
result_version.minimum_version = max(versionA.minimum_version, versionB.minimum_version)
@@ -209,19 +299,19 @@ def _find_common_depenency_version(name: str, versionA: _VersionSpec, versionB:
return result_version
def _parse_version_spec(version_spec: dict) -> _VersionSpec:
return _VersionSpec(version_spec.get('min'), version_spec.get('max'))
return _VersionSpec(version_spec.get('min'), version_spec.get('max'), version_spec.get('options', {}))
def _can_add_dependency(env: Environment, name: str, version_spec: _VersionSpec) -> bool:
if name not in env['SPP_DEPENDENCIES']:
return True
dependency = env['SPP_DEPENDENCIES'][name]
common_version_spec = _find_common_depenency_version(name, dependency.version_spec, version_spec)
common_version_spec = _find_common_dependency_version(name, dependency.version_spec, version_spec)
return common_version_spec is not None
def _add_dependency(env: Environment, name: str, version_spec: _VersionSpec) -> _Dependency:
if name in env['SPP_DEPENDENCIES']:
dependency = env['SPP_DEPENDENCIES'][name]
common_version_spec = _find_common_depenency_version(name, dependency.version_spec, version_spec)
common_version_spec = _find_common_dependency_version(name, dependency.version_spec, version_spec)
if common_version_spec is None:
raise Exception(f'Incompatible versions detected for {name}: {dependency.version_spec} and {version_spec}')
if dependency.version_spec != common_version_spec:
@@ -256,18 +346,26 @@ def _version_matches(version, version_spec: _VersionSpec) -> bool:
def _find_version(env: Environment, dependency: _Dependency):
for update in (False, True):
versions = dependency.recipe.versions(env, update=update)
versions_signature = inspect.signature(dependency.recipe.versions)
kwargs = {}
if 'options' in versions_signature.parameters:
kwargs['options'] = dependency.version_spec.options
versions = dependency.recipe.versions(env, update=update, **kwargs)
_sort_versions(versions)
for version in versions:
kwargs = {}
dependencies_signature = inspect.signature(dependency.recipe.dependencies)
if 'options' in dependencies_signature.parameters:
kwargs['options'] = dependency.version_spec.options
if _version_matches(version, dependency.version_spec):
canadd = True
for depname, depspec in dependency.recipe.dependencies(env, version).items():
for depname, depspec in dependency.recipe.dependencies(env, version, **kwargs).items():
if not _can_add_dependency(env, depname, _parse_version_spec(depspec)):
canadd = False
break
if canadd:
depdeps = []
for depname, depspec in dependency.recipe.dependencies(env, version).items():
for depname, depspec in dependency.recipe.dependencies(env, version, **kwargs).items():
depdeps.append(_add_dependency(env, depname, _parse_version_spec(depspec)))
dependency.version = version
dependency.depdeps = depdeps
@@ -276,10 +374,16 @@ def _find_version(env: Environment, dependency: _Dependency):
print(f'Required version: {dependency.version_spec}')
raise Exception(f'Could not find a suitable version for dependency {dependency.name}.')
def _wrap_builder(builder, is_lib: bool = False):
def _wrap_builder(builder, target_type: TargetType):
def _wrapped(env, dependencies = {}, *args, **kwargs):
target_dependencies = []
for name, version_spec in dependencies.items():
if version_spec == {} and name not in env['SPP_DEPENDENCIES']: # this is basically a shortcut to adding targets from other modules without having to save them in the env
dep_target = _find_target(env, name)
if dep_target is not None and dep_target.target_type != TargetType.PROGRAM:
target_dependencies.append(dep_target)
# TODO: there might be an issue here with dependencies not being injected this way :/
continue
target_dependencies.append(_add_dependency(env, name, _parse_version_spec(version_spec)))
if 'CPPPATH' not in kwargs:
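
The empty-version-spec shortcut above lets a target pull in a library defined elsewhere in the same project by name, resolved through _find_target instead of a recipe. A sketch with hypothetical names and paths (the library has to be declared before the target that references it):

    env.StaticLibrary(
        name = 'core',
        target = 'lib/core',
        source = env.RGlob('src/core', '*.cpp'),
    )

    env.Program(
        name = 'game',
        target = 'bin/game',
        source = ['src/game/main.cpp'],
        dependencies = {'core': {}},  # empty spec + known target name -> project-local library
    )
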
@@ -303,6 +407,15 @@ def _wrap_builder(builder, is_lib: bool = False):
kwargs['source'] = new_source
target = _Target()
if 'name' in kwargs:
target.name = kwargs['name']
else:
trgt = _target_entry(kwargs.get('target'))
if trgt is not None:
target.name = str(trgt.name)
else:
target.name = 'Unknown target'
target.target_type = target_type
target.builder = builder
target.args = args
target.kwargs = kwargs
@@ -327,6 +440,7 @@ def _wrap_depends(depends):
def _wrapped(env, dependant, dependency):
if isinstance(dependant, _Target) or isinstance(dependency, _Target):
env.Append(SPP_TARGET_DEPENDENCIES = [(dependant, dependency)])
return
elif isinstance(dependant, dict) and '_target' in dependant:
dependant = dependant['_target']
elif isinstance(dependency, dict) and '_target' in dependency:
@@ -349,12 +463,19 @@ def _build_target(target: _Target):
_build_target(lib)
target.kwargs['LIBS'].remove(lib)
target.kwargs['LIBS'].append(lib.target)
target.target = target.builder(*target.args, **target.kwargs)
new_kwargs = target.kwargs.copy()
if 'target' in new_kwargs: # there should always be a target, right?
new_kwargs['target'] = f"{new_kwargs['target']}-{build_type}"
target.target = target.builder(*target.args, **new_kwargs)
def _version_to_string(version) -> str:
return '.'.join([str(v) for v in version])
def _finalize(env: Environment):
if generate_project:
_generate_project(generate_project)
Exit(0)
version_requirements = {dep.name: {
'min': dep.version_spec.minimum_version and _version_to_string(dep.version_spec.minimum_version),
'max': dep.version_spec.maximum_version and _version_to_string(dep.version_spec.maximum_version),
@@ -378,6 +499,18 @@ def _finalize(env: Environment):
_build_target(target)
for target in env['SPP_DEFAULT_TARGETS']:
env.Default(target.target)
for dependant, dependency in env['SPP_TARGET_DEPENDENCIES']:
if isinstance(dependant, _Target):
dependant = dependant.target
if isinstance(dependency, _Target):
dependency = dependency.target
env.Depends(dependant, dependency)
def _find_target(env: Environment, target_name: str) -> '_Target|None':
for target in env['SPP_TARGETS']:
if target.name == target_name:
return target
return None
def _get_fallback_cache_dir() -> str:
return Dir('#cache').abspath
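
With the bookkeeping above, Depends() can be called directly on wrapped targets; the pair is recorded in SPP_TARGET_DEPENDENCIES and resolved to the real SCons nodes in _finalize. A sketch with hypothetical names:

    codegen = env.Program(name = 'codegen', target = 'bin/codegen',
                          source = ['tools/codegen.cpp'])
    game = env.Program(name = 'game', target = 'bin/game',
                       source = ['src/main.cpp'])
    env.Depends(game, codegen)  # ensure the code generator is built before the game
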
@@ -394,12 +527,138 @@ def _find_system_cache_dir() -> str:
# fallback
return _get_fallback_cache_dir()
def _target_entry(target_value):
if target_value is None:
return None
if not isinstance(target_value, list):
target_value = [target_value]
if len(target_value) < 1:
return None
if isinstance(target_value[0], str):
target_value[0] = env.Entry(target_value[0])
return target_value[0]
def _generate_project(project_type: str) -> None:
try:
import jinja2
except ImportError:
_error(None, 'Project generation requires the jinja2 package to be installed.')
source_folder, target_folder = {
'clion': (os.path.join(_spp_dir.abspath, 'util', 'clion_project_template'), Dir('#.idea').abspath),
'vscode': (os.path.join(_spp_dir.abspath, 'util', 'vscode_project_template'), Dir('#.vscode').abspath)
}.get(project_type, (None, None))
if not source_folder:
_error(None, 'Invalid project type option.')
uuid_cache_file = pathlib.Path(env['SHARED_CACHE_DIR'], 'uuids.json')
uuid_cache = {}
save_uuid_cache = False
if uuid_cache_file.exists():
try:
with uuid_cache_file.open('r') as f:
uuid_cache = json.load(f)
except Exception as e:
print(f'Error loading UUID cache: {e}')
def _generate_uuid(name: str = '') -> str:
nonlocal save_uuid_cache
if name and name in uuid_cache:
return uuid_cache[name]
new_uuid = str(uuid.uuid4())
if name:
uuid_cache[name] = new_uuid
save_uuid_cache = True
return new_uuid
root_path = pathlib.Path(env.Dir('#').abspath)
def _get_executables() -> list:
result = []
for target in env['SPP_TARGETS']:
if target.target_type == TargetType.PROGRAM:
trgt = _target_entry(target.kwargs['target'])
def _exe_path(build_type) -> str:
exe_path = pathlib.Path(trgt.abspath).relative_to(root_path)
exe_path = exe_path.parent / f'{env.subst("$PROGPREFIX")}{exe_path.name}-{build_type}{env.subst("$PROGSUFFIX")}'
return str(exe_path)
result.append({
'name': target.name,
'filename': _exe_path
})
return result
def _get_libraries() -> list:
result = []
for target in env['SPP_TARGETS']:
if target.target_type == TargetType.STATIC_LIBRARY:
trgt = _target_entry(target.kwargs['target'])
def _lib_path(build_type) -> str:
lib_path = pathlib.Path(trgt.abspath).relative_to(root_path)
lib_path = lib_path.parent / f'{env.subst("$LIBPREFIX")}{lib_path.name}-{build_type}{env.subst("$LIBSUFFIX")}'
return str(lib_path)
result.append({
'name': target.name,
'filename': _lib_path
})
elif target.target_type == TargetType.SHARED_LIBRARY:
trgt = _target_entry(target.kwargs['target'])
def _lib_path(build_type) -> str:
lib_path = pathlib.Path(trgt.abspath).relative_to(root_path)
lib_path = lib_path.parent / f'{env.subst("$SHLIBPREFIX")}{lib_path.name}-{build_type}{env.subst("$SHLIBSUFFIX")}'
return str(lib_path)
result.append({
'name': target.name,
'filename': _lib_path
})
return result
def _escape_path(input: str) -> str:
return input.replace('\\', '\\\\')
jinja_env = jinja2.Environment()
jinja_env.globals['generate_uuid'] = _generate_uuid
jinja_env.globals['project'] = {
'name': env.Dir('#').name,
'executables': _get_executables(),
'libraries': _get_libraries(),
'build_types': ['debug', 'release_debug', 'release', 'profile']
}
jinja_env.globals['scons_exe'] = shutil.which('scons')
jinja_env.globals['nproc'] = multiprocessing.cpu_count()
jinja_env.filters['escape_path'] = _escape_path
source_path = pathlib.Path(source_folder)
target_path = pathlib.Path(target_folder)
for source_file in source_path.rglob('*'):
if source_file.is_file():
target_file = target_path / (source_file.relative_to(source_path))
target_file.parent.mkdir(parents=True, exist_ok=True)
if source_file.suffix != '.jinja':
shutil.copyfile(source_file, target_file)
continue
with source_file.open('r') as f:
templ = jinja_env.from_string(f.read())
target_file = target_file.with_suffix('')
with target_file.open('w') as f:
f.write(templ.render())
if save_uuid_cache:
try:
with uuid_cache_file.open('w') as f:
json.dump(uuid_cache, f)
except Exception as e:
print(f'Error writing uuid cache: {e}')
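
The templating pattern used by _generate_project boils down to exposing project metadata through jinja_env.globals and rendering each *.jinja file with from_string(). A tiny standalone illustration with made-up metadata; with the command-line option added further below, scons --generate_project=vscode (or clion) drives the real routine and then exits:

    import jinja2

    jinja_env = jinja2.Environment()
    jinja_env.globals['project'] = {
        'name': 'demo',
        'build_types': ['debug', 'release_debug', 'release', 'profile'],
    }

    template = jinja_env.from_string(
        '{{ project.name }}: {% for bt in project.build_types %}{{ bt }} {% endfor %}')
    print(template.render())
    # -> demo: debug release_debug release profile
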
Import('config')
if not config.get('PROJECT_NAME'):
config['PROJECT_NAME'] = 'PROJECT'
if not config.get('CXX_STANDARD'):
config['CXX_STANDARD'] = 'c++23'
if not config.get('CXX_NO_EXCEPTIONS'):
config['CXX_NO_EXCEPTIONS'] = False
if not config.get('PREPROCESSOR_PREFIX'):
config['PREPROCESSOR_PREFIX'] = config['PROJECT_NAME'].upper() # TODO: may be nicer?
@@ -473,6 +732,17 @@ AddOption(
action = 'store_true'
)
AddOption(
'--generate_project',
dest = 'generate_project',
type = 'choice',
choices = ('clion', 'vscode'),
nargs = 1,
action = 'store'
)
_spp_dir = Dir('.')
build_type = GetOption('build_type')
unity_mode = GetOption('unity_mode')
variant = GetOption('variant')
@@ -481,6 +751,7 @@ config_file = GetOption('config_file')
compiler = GetOption('compiler')
update_repositories = GetOption('update_repositories')
dump_env = GetOption('dump_env')
generate_project = GetOption('generate_project')
default_CC = {
'gcc': 'gcc',
@@ -507,14 +778,24 @@ vars.Add('LINKFLAGS', 'Linker Flags')
vars.Add('PYTHON', 'Python Executable', 'python')
vars.Add('COMPILATIONDB_FILTER_FILES', 'Removes source files from the compilation DB that are not from the current'
' project.', config['COMPILATIONDB_FILTER_FILES'])
vars.Add('SHOW_INCLUDES', 'Show include hierarchy (for debugging).', False)
vars.Add('ENABLE_ASAN', 'Enable address sanitization.', bool(enable_asan))
tools = ['default', 'compilation_db', 'unity_build']
if 'TOOLS' in config:
tools.extend(config['TOOLS'])
env = Environment(tools = tools, variables = vars, ENV = os.environ)
env['RECIPES_FOLDERS'] = [Dir('recipes')]
env['SPP_RECIPES_FOLDERS'] = []
env['SYSTEM_CACHE_DIR'] = os.path.join(_find_system_cache_dir(), 'spp_cache')
env['CLONE_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'cloned')
env['DOWNLOAD_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'downloaded')
env['UPDATE_REPOSITORIES'] = update_repositories
env['CXX_STANDARD'] = config['CXX_STANDARD'] # make it available to everyone
env['DEPS_CFLAGS'] = []
env['DEPS_CXXFLAGS'] = []
env['DEPS_LINKFLAGS'] = []
print(f'Detected system cache directory: {env["SYSTEM_CACHE_DIR"]}')
try:
os.makedirs(env['SYSTEM_CACHE_DIR'], exist_ok=True)
@@ -525,7 +806,9 @@ except:
env['CLONE_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'cloned')
env['DOWNLOAD_DIR'] = os.path.join(env['SYSTEM_CACHE_DIR'], 'downloaded')
env['UPDATE_REPOSITORIES'] = update_repositories
env['CXX_STANDARD'] = config['CXX_STANDARD'] # make it available to everyone
env['CXX_NO_EXCEPTIONS'] = config['CXX_NO_EXCEPTIONS']
env['DEPS_CFLAGS'] = []
env['DEPS_CXXFLAGS'] = []
env['DEPS_LINKFLAGS'] = []
@@ -560,12 +843,17 @@ env.Append(CPPDEFINES = [])
env.Append(LINKFLAGS = [])
# init SPP environment variables
env['SPP_DIR'] = _spp_dir.abspath
env['SPP_TARGETS'] = []
env['SPP_DEFAULT_TARGETS'] = []
env['SPP_TARGET_DEPENDENCIES'] = []
env['SPP_DEPENDENCIES'] = {}
env['SPP_RECIPES'] = {}
env['OBJSUFFIX'] = f".{env['BUILD_TYPE']}{env['OBJSUFFIX']}"
if variant:
env['OBJSUFFIX'] = f".{variant}{env['OBJSUFFIX']}"
# create the cache dir
os.makedirs(env['CACHE_DIR'], exist_ok=True)
cache_gitignore = f'{env["CACHE_DIR"]}/.gitignore'
@@ -620,8 +908,10 @@ elif unity_mode == 'stress': # compile everything in one single file to stress t
# setup compiler specific options
if env['COMPILER_FAMILY'] == 'gcc' or env['COMPILER_FAMILY'] == 'clang':
env.Append(CCFLAGS = ['-Wall', '-Wextra', '-Werror', '-Wstrict-aliasing', '-pedantic'])
env.Append(CCFLAGS = ['-Wall', '-Wextra', '-Werror', '-Wstrict-aliasing', '-pedantic', '-fvisibility=hidden'])
env.Append(CXXFLAGS = [f'-std={config["CXX_STANDARD"]}'])
if env['CXX_NO_EXCEPTIONS']:
env.Append(CXXFLAGS = [f'-fno-exceptions'])
if build_type != 'release':
env.Append(LINKFLAGS = [f'-Wl,-rpath,{env["LIB_DIR"]}'])
env['LINKCOM'] = env['LINKCOM'].replace('$_LIBFLAGS', '-Wl,--start-group $_LIBFLAGS -Wl,--end-group')
@@ -634,10 +924,11 @@ if env['COMPILER_FAMILY'] == 'gcc' or env['COMPILER_FAMILY'] == 'clang':
# -Wtautological-compare triggers in libfmt and doesn't seem too useful anyway
env.Append(CCFLAGS = ['-Wno-missing-field-initializers', '-Wno-maybe-uninitialized'])
env.Append(CXXFLAGS = ['-Wno-subobject-linkage', '-Wno-dangling-reference', '-Wno-init-list-lifetime', '-Wno-tautological-compare'])
else: # clang only
# no-gnu-anonymous-struct - we don't care
env.Append(CCFLAGS = ['-Wno-gnu-anonymous-struct'])
# no-missing-field-initializers - useful in some cases, annoying in most
# no-ambiguous-reversed-operator - should be quite useful, but we get a false positive, apparently?
env.Append(CCFLAGS = ['-Wno-gnu-anonymous-struct', '-Wno-missing-field-initializers', '-Wno-ambiguous-reversed-operator'])
env.Append(CXXFLAGS = ['-fexperimental-library']) # enable std::jthread
if build_type == 'debug':
env.Append(CCFLAGS = ['-g', '-O0'], CPPDEFINES = ['_GLIBCXX_DEBUG'])
@@ -653,7 +944,7 @@ if env['COMPILER_FAMILY'] == 'gcc' or env['COMPILER_FAMILY'] == 'clang':
elif build_type == 'release':
env.Append(CCFLAGS = ['-Wno-unused-variable', '-Wno-unused-parameter', '-Wno-unused-but-set-variable', '-Wno-unused-local-typedef', '-Wno-unused-local-typedefs', '-O2'], CPPDEFINES = [f'{config["PREPROCESSOR_PREFIX"]}_RELEASE', 'NDEBUG'])
if enable_asan:
if env['ENABLE_ASAN']:
env.Append(CCFLAGS = ['-fsanitize=address', '-fno-omit-frame-pointer'])
env.Append(LINKFLAGS = ['-fsanitize=address'])
env.Append(DEPS_CXXFLAGS = ['-fsanitize=address', '-fno-omit-frame-pointer'])
@@ -673,20 +964,31 @@ elif env['COMPILER_FAMILY'] == 'cl':
# C4251: missing dll-interface of some std types, yaml-cpp doesn't compile with this enabled
# C4275: same as above
env.Append(CCFLAGS = ['/W4', '/WX', '/wd4201', '/wd4127', '/wd4702', '/wd4251', '/wd4275', '/bigobj', '/vmg',
f'/std:{cxx_version_name}', '/permissive-', '/EHsc', '/FS', '/Zc:char8_t', '/utf-8'])
f'/std:{cxx_version_name}', '/permissive-', '/FS', '/Zc:char8_t', '/utf-8'])
env.Append(CPPDEFINES = ['_CRT_SECURE_NO_WARNINGS']) # I'd like to not use MSVC specific versions of functions because they are "safer" ...
env.Append(DEPS_CXXFLAGS = ['/EHsc', '/Zc:char8_t', '/utf-8', '/vmg'])
env.Append(DEPS_CXXFLAGS = ['/Zc:char8_t', '/utf-8', '/vmg'])
if env['CXX_NO_EXCEPTIONS']:
env.Append(CPPDEFINES = ['_HAS_EXCEPTIONS=0'])
else:
env.Append(CXXFLAGS = ['/EHsc'])
env.Append(DEPS_CXXFLAGS = ['/EHsc'])
if env['SHOW_INCLUDES']:
env.Append(CCFLAGS = ['/showIncludes'])
if build_type == 'debug':
env.Append(CCFLAGS = ['/Od', '/Zi', '/MDd'], LINKFLAGS = ' /DEBUG')
env.Append(CPPDEFINES = ['_DEBUG', '_ITERATOR_DEBUG_LEVEL=2'])
env.Append(DEPS_CXXFLAGS = ['/MDd', '/Zi', '/D_DEBUG', '/D_ITERATOR_DEBUG_LEVEL=2'])
env.Append(DEPS_LINKFLAGS = ['/DEBUG'])
elif build_type == 'release_debug' or build_type == 'profile':
env.Append(CCFLAGS = ['/O2', '/Zi'], LINKFLAGS = ' /DEBUG')
env.Append(DEPS_CXXFLAGS = ['/Zi'])
env.Append(CCFLAGS = ['/O2', '/MD', '/Zi'], LINKFLAGS = ' /DEBUG')
env.Append(DEPS_CXXFLAGS = ['/Zi', '/MD'])
env.Append(DEPS_LINKFLAGS = ['/DEBUG'])
else:
env.Append(CCFLAGS = ['/O2'])
env.Append(CCFLAGS = ['/O2', '/MD'])
env.Append(DEPS_CXXFLAGS = ['/MD'])
if env['ENABLE_ASAN']:
env.Append(CCFLAGS = ['/fsanitize=address'])
if env['COMPILER_FAMILY'] == 'gcc':
env.Append(CXXFLAGS = ['-Wno-volatile'])
@@ -704,21 +1006,23 @@ env.AddMethod(_rglob, 'RGlob')
env.AddMethod(_deps_from_json, 'DepsFromJson')
env.AddMethod(_make_interface, 'MakeInterface')
env.AddMethod(_lib_filename, 'LibFilename')
env.AddMethod(_find_executable, 'FindExecutable')
env.AddMethod(_find_lib, 'FindLib')
env.AddMethod(_error, 'Error')
env.AddMethod(_wrap_builder(env.Library, is_lib = True), 'Library')
env.AddMethod(_wrap_builder(env.StaticLibrary, is_lib = True), 'StaticLibrary')
env.AddMethod(_wrap_builder(env.SharedLibrary, is_lib = True), 'SharedLibrary')
env.AddMethod(_wrap_builder(env.Program), 'Program')
env.AddMethod(_wrap_builder(env.Library, TargetType.STATIC_LIBRARY), 'Library')
env.AddMethod(_wrap_builder(env.StaticLibrary, TargetType.STATIC_LIBRARY), 'StaticLibrary')
env.AddMethod(_wrap_builder(env.SharedLibrary, TargetType.SHARED_LIBRARY), 'SharedLibrary')
env.AddMethod(_wrap_builder(env.Program, TargetType.PROGRAM), 'Program')
env.AddMethod(_wrap_default(env.Default), 'Default')
env.AddMethod(_wrap_depends(env.Depends), 'Depends')
env.AddMethod(_wrap_builder(env.UnityProgram), 'UnityProgram')
env.AddMethod(_wrap_builder(env.UnityLibrary, is_lib = True), 'UnityLibrary')
env.AddMethod(_wrap_builder(env.UnityStaticLibrary, is_lib = True), 'UnityStaticLibrary')
env.AddMethod(_wrap_builder(env.UnitySharedLibrary, is_lib = True), 'UnitySharedLibrary')
env.AddMethod(_wrap_builder(env.UnityProgram, TargetType.PROGRAM), 'UnityProgram')
env.AddMethod(_wrap_builder(env.UnityLibrary, TargetType.STATIC_LIBRARY), 'UnityLibrary')
env.AddMethod(_wrap_builder(env.UnityStaticLibrary, TargetType.STATIC_LIBRARY), 'UnityStaticLibrary')
env.AddMethod(_wrap_builder(env.UnitySharedLibrary, TargetType.SHARED_LIBRARY), 'UnitySharedLibrary')
env.AddMethod(_module, 'Module')
env.AddMethod(_finalize, 'Finalize')
env.AddMethod(_find_target, 'FindTarget')
if hasattr(env, 'Gch'):
env.AddMethod(_wrap_builder(env.Gch), 'Gch')
@@ -730,5 +1034,6 @@ if dump_env:
print('==== Begin Environment Dump =====')
print(env.Dump())
print('==== End Environment Dump =====')
Exit(0)
Return('env')

View File

@@ -9,7 +9,7 @@ _BUILT_STAMPFILE = '.spp_built'
Import('env')
def _autotools_project(env: Environment, project_root: str, config_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = [], configure_script_path: str = 'configure') -> dict:
def _autotools_project(env: Environment, project_root: str, config_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = [], configure_script_path: str = 'configure', skip_steps = ()) -> dict:
config = env['BUILD_TYPE']
build_dir = os.path.join(project_root, f'build_{config}')
install_dir = os.path.join(project_root, f'install_{config}')
@@ -32,9 +32,15 @@ def _autotools_project(env: Environment, project_root: str, config_args: 'list[s
if not os.path.exists(config_script) and os.path.exists(f'{config_script}.ac'):
subprocess.run(('autoreconf', '--install', '--force'), cwd=project_root)
subprocess.run((config_script, f'--prefix={install_dir}', *config_args), cwd=build_dir, env=env, stdout=sys.stdout, stderr=sys.stderr, check=True)
subprocess.run(('make', f'-j{jobs}', *build_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
subprocess.run(('make', 'install', *install_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
if 'configure' not in skip_steps:
subprocess.run((config_script, f'--prefix={install_dir}', *config_args), cwd=build_dir, env=env, stdout=sys.stdout, stderr=sys.stderr, check=True)
if 'build' not in skip_steps:
subprocess.run(('make', f'-j{jobs}', *build_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
if 'install' not in skip_steps:
subprocess.run(('make', 'install', *install_args), cwd=build_dir, stdout=sys.stdout, stderr=sys.stderr, check=True)
else:
# must still create the install dir for the stamp file
os.makedirs(install_dir, exist_ok=True)
pathlib.Path(install_dir, _BUILT_STAMPFILE).touch()
libpath = []
@@ -44,6 +50,7 @@ def _autotools_project(env: Environment, project_root: str, config_args: 'list[s
libpath.append(full_path)
return {
'build_dir': build_dir,
'install_dir': install_dir,
'LIBPATH': libpath,
'CPPPATH': [os.path.join(install_dir, 'include')]

View File

@@ -40,10 +40,11 @@ def _generate_cmake_args(env: Environment, dependencies: 'list[dict]') -> 'list[
args.append(f'-D{name}={cmd_quote(value)}')
return args
def _calc_version_hash(dependencies: 'list[dict]') -> str:
def _calc_version_hash(env, dependencies: 'list[dict]') -> str:
return json.dumps({
'version': _VERSION,
'dependencies': dependencies
'dependencies': dependencies,
'cxxflags': env['DEPS_CXXFLAGS']
})
def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str]' = [], build_args : 'list[str]' = [], install_args : 'list[str]' = [], dependencies: 'list[dict]' = []) -> dict:
@@ -51,7 +52,7 @@ def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str
build_dir = os.path.join(project_root, f'build_{config}')
install_dir = os.path.join(project_root, f'install_{config}')
version_hash = _calc_version_hash(dependencies)
version_hash = _calc_version_hash(env, dependencies)
stamp_file = pathlib.Path(install_dir, _BUILT_STAMPFILE)
is_built = stamp_file.exists()
@@ -97,6 +98,7 @@ def _cmake_project(env: Environment, project_root: str, generate_args: 'list[str
return {
'install_dir': install_dir,
'BINPATH': [os.path.join(install_dir, 'bin')],
'LIBPATH': libpath,
'CPPPATH': [os.path.join(install_dir, 'include')]
}

View File

@@ -2,7 +2,7 @@
from git import Repo
from git.exc import GitError
import hashlib
import re
import inspect
from SCons.Script import *
Import('env')
@@ -21,21 +21,30 @@ def _clone(env: Environment, repo_name: str, remote_url: str):
def _git_branch(env: Environment, repo_name: str, remote_url: str, git_ref: str = 'main') -> dict:
repo, origin = _clone(env, repo_name, remote_url)
worktree_dir = os.path.join(env['CLONE_DIR'], 'git', repo_name, hashlib.shake_128(git_ref.encode('utf-8')).hexdigest(6)) # TODO: commit hash would be better, right? -> not if it's a branch!
update_submodules = False
if not os.path.exists(worktree_dir):
print(f'Checking out into {worktree_dir}.')
origin.fetch(tags=True)
origin.fetch(tags=True, force=True)
os.makedirs(worktree_dir)
repo.git.worktree('add', worktree_dir, git_ref)
worktree_repo = Repo(worktree_dir)
update_submodules = True
elif env['UPDATE_REPOSITORIES']:
worktree_repo = Repo(worktree_dir)
if not worktree_repo.head.is_detached:
print(f'Updating git repository at {worktree_dir}')
worktree_origin = worktree_repo.remotes['origin']
worktree_origin.pull()
update_submodules = True
else:
print(f'Not updating git repository {worktree_dir} as it is not on a branch.')
if update_submodules:
for submodule in worktree_repo.submodules:
submodule.update(init=True)
return {
'checkout_root': worktree_dir
'checkout_root': worktree_dir,
'repo': repo,
'origin': origin
}
def _git_tags(env: Environment, repo_name: str, remote_url: str, force_fetch: bool = False) -> 'list[str]':
@@ -48,16 +57,25 @@ def _make_callable(val):
if callable(val):
return val
else:
return lambda env: val
def _wrapped(*args, **kwargs):
return val
return _wrapped
def _git_recipe(env: Environment, globals: dict, repo_name, repo_url, cook_fn, versions = None, tag_pattern = None, tag_fn = None, ref_fn = None, dependencies: dict = {}) -> None:
_repo_name = _make_callable(repo_name)
_repo_url = _make_callable(repo_url)
_tag_pattern = _make_callable(tag_pattern)
versions_cb = versions and _make_callable(versions)
dependencies_cb = _make_callable(dependencies)
def _versions(env: Environment, update: bool = False):
pattern = _tag_pattern(env)
def _versions(env: Environment, update: bool = False, options: dict = {}):
if 'ref' in options:
return [(0, 0, 0)] # no versions if compiling from a branch
pattern_signature = inspect.signature(_tag_pattern)
kwargs = {}
if 'options' in pattern_signature.parameters:
kwargs['options'] = options
pattern = _tag_pattern(env, **kwargs)
if pattern:
tags = env.GitTags(repo_name = _repo_name(env), remote_url = _repo_url(env), force_fetch=update)
result = []
@@ -73,17 +91,33 @@ def _git_recipe(env: Environment, globals: dict, repo_name, repo_url, cook_fn, v
else:
return [(0, 0, 0)]
def _dependencies(env: Environment, version) -> 'dict':
return dependencies
def _dependencies(env: Environment, version, options: dict) -> 'dict':
dependencies_signature = inspect.signature(dependencies_cb)
kwargs = {}
if 'options' in dependencies_signature.parameters:
kwargs['options'] = options
return dependencies_cb(env, version, **kwargs)
def _cook(env: Environment, version) -> dict:
if tag_fn:
git_ref = f'refs/tags/{tag_fn(version)}'
def _cook(env: Environment, version, options: dict = {}) -> dict:
if 'ref' in options:
git_ref = options['ref']
elif tag_fn:
tag_signature = inspect.signature(tag_fn)
kwargs = {}
if 'options' in tag_signature.parameters:
kwargs['options'] = options
git_ref = f'refs/tags/{tag_fn(version, **kwargs)}'
else:
assert ref_fn
git_ref = ref_fn(env, version)
repo = env.GitBranch(repo_name = _repo_name(env), remote_url = _repo_url(env), git_ref = git_ref)
return cook_fn(env, repo)
cook_signature = inspect.signature(cook_fn)
kwargs = {}
if 'options' in cook_signature.parameters:
kwargs['options'] = options
return cook_fn(env, repo, **kwargs)
globals['versions'] = _versions
globals['dependencies'] = _dependencies
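
A recipe written against this wrapper can opt in to options in its cook_fn the same way; the 'with_tests' option key, repository name and URL below are hypothetical, while the GitRecipe keywords match the wrapper above. A project can also pass {'options': {'ref': 'refs/heads/main'}} in its version spec to build straight from a branch, in which case versions() reports a single placeholder version and cook() checks out the given ref.

    import re
    from SCons.Script import *

    def _git_cook(env: Environment, repo: dict, options: dict = {}) -> dict:
        generate_args = []
        if options.get('with_tests'):
            generate_args.append('-DBUILD_TESTING=ON')
        build_result = env.CMakeProject(project_root = repo['checkout_root'],
                                        generate_args = generate_args)
        return {
            'CPPPATH': build_result['CPPPATH'],
            'LIBPATH': build_result['LIBPATH'],
        }

    env.GitRecipe(
        globals = globals(),
        repo_name = 'somelib',
        repo_url = 'https://example.org/somelib.git',
        tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
        tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
        cook_fn = _git_cook
    )
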

View File

@@ -1,4 +1,5 @@
import os
import pathlib
Import('env')
@@ -31,8 +32,33 @@ def _wrap_jinja(orig_jinja):
return target
return _wrapped
def _find_file(env, fname):
for path in env['JINJA_FILE_SEARCHPATH']:
fullpath = os.path.join(path.abspath, fname)
if os.path.exists(fullpath):
return env.File(fullpath)
return None
def _file_size(env, fname: str) -> int:
file = _find_file(env, fname)
if not file:
env.Error(f'File does not exist: {fname}. Searched in: {[d.abspath for d in env["JINJA_FILE_SEARCHPATH"]]}')
return file.get_size()
def _file_content_hex(env, fname: str) -> str:
file = _find_file(env, fname)
if not file:
env.Error(f'File does not exist: {fname}. Searched in: {[d.abspath for d in env["JINJA_FILE_SEARCHPATH"]]}')
bytes = file.get_contents()
return ','.join([hex(byte) for byte in bytes])
env.AddMethod(_wrap_jinja(env.Jinja), 'Jinja')
env.Append(JINJA_FILTERS = {'load_config': _jinja_load_config})
env.Append(JINJA_GLOBALS = {
'file_size': lambda *args: _file_size(env, *args),
'file_content_hex': lambda *args: _file_content_hex(env, *args)
})
env.Append(JINJA_TEMPLATE_SEARCHPATH = ['data/jinja'])
env['JINJA_CONFIG_SEARCHPATH'] = [env.Dir('#data/config')]
env['JINJA_FILE_SEARCHPATH'] = [env.Dir('#')]
Return('env')
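
Library projects can extend the new search path so their own data files are visible to the file_size() and file_content_hex() globals. A sketch — the data directory, file name and embedding snippet are hypothetical:

    env.Append(JINJA_FILE_SEARCHPATH = [env.Dir('data/shaders')])

    # a template could then embed a binary file, e.g.:
    #   static const unsigned char k_blob[] = { {{ file_content_hex('blit.spv') }} };
    #   static const unsigned int  k_blob_size = {{ file_size('blit.spv') }};
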

View File

@@ -0,0 +1,11 @@
import os
Import('env')
def _recipe_repository(env, repo_name: str, remote_url: str, git_ref: str = 'master') -> None:
repo = env.GitBranch(repo_name = os.path.join('recipe_repos', repo_name), remote_url = remote_url, git_ref = git_ref)
env.Append(SPP_RECIPES_FOLDERS = [os.path.join(repo['checkout_root'], 'recipes')])
env.AddMethod(_recipe_repository, 'RecipeRepo')
Return('env')
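
A project registers recipe repositories from its SConstruct before resolving any dependencies; the call shape matches the hint in the _find_recipe error message, while the URL here is hypothetical. Each repository is cloned into a recipe_repos subfolder of the clone cache, and its recipes folder is appended to SPP_RECIPES_FOLDERS, which _find_recipe searches in order.

    env.RecipeRepo('default', 'https://example.org/spp-recipes.git', 'main')
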

View File

@@ -0,0 +1,13 @@
FROM debian:sid-slim
RUN apt-get -y update && \
apt-get -y upgrade && \
apt-get -y install build-essential clang-19 gcc-14 g++-14 python3 python3-pip \
virtualenv python-is-python3 clang-tidy git ninja-build cmake
RUN ln -s /usr/bin/clang-19 /usr/local/bin/clang \
&& ln -s /usr/bin/clang++-19 /usr/local/bin/clang++ \
&& ln -s /usr/bin/clang-tidy-19 /usr/local/bin/clang-tidy \
&& ln -s /usr/bin/run-clang-tidy-19 /usr/local/bin/run-clang-tidy
COPY scripts /opt/scripts
RUN chmod a+x /opt/scripts/*.sh

View File

@@ -0,0 +1,9 @@
#!/bin/bash
set -xe
python -m virtualenv venv
source venv/bin/activate
pip install scons
pip install -r external/scons-plus-plus/requirements.txt
scons -j$(nproc) --build_type=debug --variant=linux_clang_debug --compiler=clang
scons -j$(nproc) --build_type=release_debug --variant=linux_clang_release_debug --compiler=clang
scons -j$(nproc) --build_type=release --variant=linux_clang_release --compiler=clang

View File

@@ -0,0 +1,9 @@
#!/bin/bash
set -xe
python -m virtualenv venv
source venv/bin/activate
pip install scons
pip install -r external/scons-plus-plus/requirements.txt
scons -j$(nproc) --build_type=debug --variant=linux_gcc_debug --compiler=gcc
scons -j$(nproc) --build_type=release_debug --variant=linux_gcc_release_debug --compiler=gcc
scons -j$(nproc) --build_type=release --variant=linux_gcc_release --compiler=gcc

View File

@@ -1,31 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root)
lib_name = {
'debug': 'Catch2d'
}.get(env['BUILD_TYPE'], 'Catch2')
libs = []
if not env.get('CATCH2_OWN_MAIN'):
libs.append({
'debug': 'Catch2Maind'
}.get(env['BUILD_TYPE'], 'Catch2Main'))
libs.append(lib_name)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib(lib, paths=build_result['LIBPATH']) for lib in libs]
}
env.GitRecipe(
globals = globals(),
repo_name = 'Catch2',
repo_url = 'https://github.com/catchorg/Catch2.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,31 +0,0 @@
import re
from SCons.Script import *
_REPO_NAME = 'ImageMagick'
_REPO_URL = 'https://github.com/ImageMagick/ImageMagick.git'
_TAG_PATTERN = re.compile(r'^([0-9]+)\.([0-9]+)\.([0-9]+)-([0-9]+)$')
def versions(env: Environment, update: bool = False):
tags = env.GitTags(repo_name = _REPO_NAME, remote_url = _REPO_URL, force_fetch=update)
result = []
for tag in tags:
match = _TAG_PATTERN.match(tag)
if match:
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2]), int(match.groups()[3])))
return result
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
raise Exception('this still needs to be implemented property :/')
# git_ref = f'refs/tags/{version[0]}.{version[1]}.{version[2]}-{version[3]}'
# repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = git_ref)
# checkout_root = repo['checkout_root']
# build_result = env.AutotoolsProject(checkout_root)
# return {
# 'LIBPATH': build_result['LIBPATH'],
# 'CPPPATH': build_result['CPPPATH'],
# 'LIBS': ['backtrace']
# }

View File

@@ -1,37 +0,0 @@
import platform
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root, generate_args = ['-DSDL_STATIC=ON', '-DSDL_SHARED=OFF'])
libs = []
if platform.system() == 'Windows':
if env['BUILD_TYPE'] == 'debug':
libs.append('SDL2-staticd')
else:
libs.append('SDL2-static')
libs.extend(('kernel32', 'user32', 'gdi32', 'winmm', 'imm32', 'ole32', 'oleaut32', 'version', 'uuid', 'advapi32', 'setupapi', 'shell32', 'dinput8'))
else:
if env['BUILD_TYPE'] == 'debug':
libs.append('SDL2d')
else:
libs.append('SDL2')
return {
'LIBPATH': build_result['LIBPATH'],
'CPPPATH': [os.path.join(build_result['install_dir'], 'include/SDL2')], # SDL is really weird about include paths ...
'LIBS': libs
}
env.GitRecipe(
globals = globals(),
repo_name = 'SDL',
repo_url = 'https://github.com/libsdl-org/SDL.git',
tag_pattern = re.compile(r'^release-([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'release-{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,45 +0,0 @@
import re
from SCons.Script import *
_REPO_NAMES = {
'default': 'VulkanHeaders',
'mewin': 'VulkanHeaders_mewin'
}
_REPO_URLS = {
'default': 'https://github.com/KhronosGroup/Vulkan-Headers.git',
'mewin': 'https://git.mewin.de/mewin/vulkan-headers.git'
}
_TAG_PATTERN = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$')
def _get_repo_name(env: Environment) -> str:
return _REPO_NAMES[env.get('VULKANHEADERS_REMOTE', 'default')]
def _get_repo_url(env: Environment) -> str:
return _REPO_URLS[env.get('VULKANHEADERS_REMOTE', 'default')]
def versions(env: Environment, update: bool = False):
if env.get('VULKANHEADERS_REMOTE') == 'mewin':
return [(0, 0, 0)]
tags = env.GitTags(repo_name = _get_repo_name(env), remote_url = _get_repo_url(env), force_fetch=update)
result = []
for tag in tags:
match = _TAG_PATTERN.match(tag)
if match:
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2])))
return result
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
if env.get('VULKANHEADERS_REMOTE') == 'mewin':
git_ref = 'main'
else:
git_ref = f'refs/tags/v{version[0]}.{version[1]}.{version[2]}'
repo = env.GitBranch(repo_name = _get_repo_name(env), remote_url = _get_repo_url(env), git_ref = git_ref)
checkout_root = repo['checkout_root']
return {
'CPPPATH': [os.path.join(checkout_root, 'include')]
}

View File

@@ -1,18 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
return {
'CPPPATH': [os.path.join(checkout_root, 'include')]
}
env.GitRecipe(
globals = globals(),
repo_name = 'argparse',
repo_url = 'https://github.com/p-ranav/argparse.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}',
cook_fn = _git_cook
)

View File

@@ -1,67 +0,0 @@
import json
import os
import re
import requests
from SCons.Script import *
_VERSIONS_URL = 'https://api.github.com/repos/boostorg/boost/releases'
_VERSION_PATTERN = re.compile(r'^boost-([0-9]+)\.([0-9]+)\.([0-9]+)$')
def versions(env: Environment, update: bool = False):
versions_file = os.path.join(env['DOWNLOAD_DIR'], 'boost_versions.json')
if update or not os.path.exists(versions_file):
req = requests.get(_VERSIONS_URL)
versions_data = json.loads(req.text)
result = []
for version_data in versions_data:
match = _VERSION_PATTERN.match(version_data['name'])
if not match:
continue
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2])))
with open(versions_file, 'w') as f:
json.dump(result, f)
return result
else:
try:
with open(versions_file, 'r') as f:
return [tuple(v) for v in json.load(f)]
except:
print('boost_versions.json is empty or broken, redownloading.')
return versions(env, update=True)
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
if env.get('BOOST_LIBS') is None:
raise Exception('BOOST_LIBS not set. Set to a list of boost libs to link or "*" to link everything.')
if version >= (1, 85, 0):
url = f'https://github.com/boostorg/boost/releases/download/boost-{version[0]}.{version[1]}.{version[2]}/boost-{version[0]}.{version[1]}.{version[2]}-cmake.tar.gz'
else:
url = f'https://github.com/boostorg/boost/releases/download/boost-{version[0]}.{version[1]}.{version[2]}/boost-{version[0]}.{version[1]}.{version[2]}.tar.gz'
repo = env.DownloadAndExtract(f'boost_{version[0]}.{version[1]}.{version[2]}', url = url, skip_folders = 1)
checkout_root = repo['extracted_root']
build_result = env.CMakeProject(checkout_root)
libs = []
if '*' in env['BOOST_LIBS']:
lib_dir = build_result['LIBPATH'][0]
for lib_file in os.listdir(lib_dir):
fname = os.path.join(lib_dir, lib_file)
if not os.path.isfile(fname):
continue
libs.append(fname)
else:
for lib in set(env['BOOST_LIBS']):
if os.name == 'posix':
libs.append(env.FindLib(f'boost_{lib}', paths=build_result['LIBPATH']))
elif os.name == 'nt':
libs.append(env.FindLib(f'libboost_{lib}-*', paths=build_result['LIBPATH'], use_glob=True))
else:
raise Exception('Boost not supported on this platform.')
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': libs
}

View File

@@ -1,19 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo) -> dict:
checkout_root = repo['checkout_root']
return {
'CPPPATH': [checkout_root]
}
env.GitRecipe(
globals = globals(),
repo_name = 'cgltf',
repo_url = 'https://github.com/jkuhlmann/cgltf.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}',
cook_fn = _git_cook
)

View File

@@ -1,39 +0,0 @@
import re
from SCons.Script import *
def _build_lib_name(env: Environment) -> str:
if os.name == 'posix':
return {
'debug': 'curl-d'
}.get(env['BUILD_TYPE'], 'curl')
elif os.name == 'nt':
raise Exception('TODO')
else:
raise Exception('curl is not supported yet on this OS')
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(checkout_root, generate_args=['-DBUILD_CURL_EXE=OFF','-DBUILD_SHARED_LIBS=OFF',
'-DBUILD_STATIC_LIBS=ON', '-DHTTP_ONLY=ON',
'-DCURL_USE_LIBSSH2=OFF'])
lib_name = _build_lib_name(env)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib(lib_name, paths=build_result['LIBPATH'])],
}
env.GitRecipe(
globals = globals(),
repo_name = 'curl',
repo_url = 'https://github.com/curl/curl.git',
tag_pattern = re.compile(r'^curl-([0-9]+)_([0-9]+)_([0-9]+)$'),
tag_fn = lambda version: f'curl-{version[0]}_{version[1]}_{version[2]}',
cook_fn = _git_cook,
dependencies = {
'openssl': {},
'zlib': {},
'psl': {}
}
)

View File

@@ -1,27 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(checkout_root)
lib_name = {
'debug': 'fmtd'
}.get(env['BUILD_TYPE'], 'fmt')
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib(lib_name, paths=build_result['LIBPATH'])]
}
env.GitRecipe(
globals = globals(),
repo_name = 'fmt',
repo_url = 'https://github.com/fmtlib/fmt.git',
tag_pattern = re.compile(r'^([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,52 +0,0 @@
import re
from SCons.Script import *
_REPO_NAMES = {
'default': 'glm',
'mewin': 'glm_mewin'
}
_REPO_URLS = {
'default': 'https://github.com/g-truc/glm.git',
'mewin': 'https://git.mewin.de/mewin/glm.git'
}
_TAG_PATTERN = re.compile(r'^([0-9]+)\.([0-9]+)\.([0-9]+)$')
_TAG_PATTERN_ALT = re.compile(r'^0\.([0-9]+)\.([0-9]+)\.([0-9]+)$')
def _get_repo_name(env: Environment) -> str:
return _REPO_NAMES[env.get('GLM_REMOTE', 'default')]
def _get_repo_url(env: Environment) -> str:
return _REPO_URLS[env.get('GLM_REMOTE', 'default')]
def versions(env: Environment, update: bool = False):
if env.get('GLM_REMOTE') == 'mewin':
return [(0, 0, 0)]
tags = env.GitTags(repo_name = _get_repo_name(env), remote_url = _get_repo_url(env), force_fetch=update)
result = []
for tag in tags:
match = _TAG_PATTERN.match(tag)
if match:
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2])))
else:
match = _TAG_PATTERN_ALT.match(tag)
if match:
result.append((0, int(match.groups()[0]), int(match.groups()[1]) * 10 + int(match.groups()[2])))
return result
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
if env.get('GLM_REMOTE') == 'mewin':
git_ref = 'master'
elif version[0] == 0:
git_ref = f'refs/tags/0.{version[1]}.{int(version[2]/10)}.{version[2]%10}'
else:
git_ref = f'refs/tags/{version[0]}.{version[1]}.{version[2]}'
repo = env.GitBranch(repo_name = _get_repo_name(env), remote_url = _get_repo_url(env), git_ref = git_ref)
checkout_root = repo['checkout_root']
return {
'CPPPATH': [checkout_root],
}

View File

@@ -1,111 +0,0 @@
import glob
import pathlib
import platform
import re
import shutil
from SCons.Script import *
_SCRIPT_STAMPFILE = '.spp_script_run'
def _git_cook(env: Environment, repo) -> dict:
checkout_root = repo['checkout_root']
# TODO: windows?
did_run_script = os.path.exists(os.path.join(repo['checkout_root'], _SCRIPT_STAMPFILE))
if not did_run_script or env['UPDATE_REPOSITORIES']:
python_exe = os.path.realpath(sys.executable)
script_file = os.path.join(repo['checkout_root'], 'update_glslang_sources.py')
prev_cwd = os.getcwd()
os.chdir(repo['checkout_root'])
if env.Execute(f'"{python_exe}" {script_file}'):
env.Exit(1)
os.chdir(prev_cwd)
pathlib.Path(repo['checkout_root'], _SCRIPT_STAMPFILE).touch()
# generate the build_info.h
generator_script = os.path.join(repo['checkout_root'], 'build_info.py')
generator_script_input = os.path.join(repo['checkout_root'], 'build_info.h.tmpl')
generator_script_output = os.path.join(repo['checkout_root'], 'glslang/build_info.h')
env.Command(
target = generator_script_output,
source = [generator_script, generator_script_input, os.path.join(repo['checkout_root'], 'CHANGES.md')],
action = f'"$PYTHON" "{generator_script}" "{repo["checkout_root"]}" -i "{generator_script_input}" -o "$TARGET"'
)
platform_source_dir = {
'Linux': 'Unix',
'Windows': 'Windows',
'Darwin': 'Unix'
}.get(platform.system(), 'Unix')
glslang_source_files = env.RGlob(os.path.join(repo['checkout_root'], 'glslang/GenericCodeGen/'), '*.cpp') \
+ env.RGlob(os.path.join(repo['checkout_root'], 'glslang/MachineIndependent/'), '*.cpp') \
+ env.RGlob(os.path.join(repo['checkout_root'], 'glslang/OGLCompilersDLL/'), '*.cpp') \
+ env.RGlob(os.path.join(repo['checkout_root'], 'glslang/ResourceLimits/'), '*.cpp') \
+ env.RGlob(os.path.join(repo['checkout_root'], 'SPIRV/'), '*.cpp') \
+ [os.path.join(repo['checkout_root'], f'glslang/OSDependent/{platform_source_dir}/ossource.cpp')]
# disable warnings
additional_cxx_flags = {
'clang': ['-w'],
'gcc': ['-w'],
'cl': ['/w']
}.get(env['COMPILER_FAMILY'], [])
env.StaticLibrary(
CCFLAGS = env['CCFLAGS'] + additional_cxx_flags,
CPPPATH = repo['checkout_root'],
target = env['LIB_DIR'] + '/glslang_full',
source = glslang_source_files
)
# build the include folder
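# (mirror the public headers into an install-style include/ tree rather than exposing the whole source tree)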
include_dir = os.path.join(checkout_root, 'include')
if not os.path.exists(include_dir) or env['UPDATE_REPOSITORIES']:
def copy_headers(dst, src):
os.makedirs(dst, exist_ok=True)
for file in glob.glob(os.path.join(src, '*.h')):
shutil.copy(file, dst)
copy_headers(os.path.join(include_dir, 'glslang/HLSL'), os.path.join(checkout_root, 'glslang/HLSL'))
copy_headers(os.path.join(include_dir, 'glslang/Include'), os.path.join(checkout_root, 'glslang/Include'))
copy_headers(os.path.join(include_dir, 'glslang/MachineIndependent'), os.path.join(checkout_root, 'glslang/MachineIndependent'))
copy_headers(os.path.join(include_dir, 'glslang/Public'), os.path.join(checkout_root, 'glslang/Public'))
copy_headers(os.path.join(include_dir, 'glslang/SPIRV'), os.path.join(checkout_root, 'SPIRV'))
return {
'CPPPATH': [include_dir],
'LIBS': [os.path.join(env['LIB_DIR'], env.LibFilename('glslang_full'))]
}
_REPO_NAMES = {
'default': 'glslang',
'mewin': 'glslang_mewin'
}
_REPO_URLS = {
'default': 'https://github.com/KhronosGroup/glslang.git',
'mewin': 'https://git.mewin.de/mewin/glslang.git'
}
_TAG_PATTERNS = {
'default': re.compile(r'^([0-9]+)\.([0-9]+)\.([0-9]+)$'),
'mewin': None
}
def _ref_fn(env: Environment, version) -> str:
remote = env.get('GLSLANG_REMOTE', 'default')
if remote == 'default':
return f'refs/tags/{version[0]}.{version[1]}.{version[2]}'
elif remote == 'mewin':
return 'master'
else:
raise Exception('invalid glslang remote')
env.GitRecipe(
globals = globals(),
repo_name = lambda env: _REPO_NAMES[env.get('GLSLANG_REMOTE', 'default')],
repo_url = lambda env: _REPO_URLS[env.get('GLSLANG_REMOTE', 'default')],
tag_pattern = lambda env: _TAG_PATTERNS[env.get('GLSLANG_REMOTE', 'default')],
cook_fn = _git_cook,
ref_fn = _ref_fn
)

View File

@@ -1,48 +0,0 @@
import json
import os
import re
import requests
from SCons.Script import *
_VERSIONS_URL = 'https://gitlab.com/api/v4/projects/2882658/releases'
_VERSION_PATTERN = re.compile(r'^([0-9]+)\.([0-9]+)\.([0-9]+)$')
def versions(env: Environment, update: bool = False):
versions_file = os.path.join(env['DOWNLOAD_DIR'], 'libidn2_versions.json')
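# cache the fetched release list on disk so repeat builds don't hit the GitLab API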
if update or not os.path.exists(versions_file):
req = requests.get(_VERSIONS_URL)
versions_data = json.loads(req.text)
result = []
for version_data in versions_data:
match = _VERSION_PATTERN.match(version_data['name'])
if not match:
continue
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2])))
with open(versions_file, 'w') as f:
json.dump(result, f)
return result
else:
try:
with open(versions_file, 'r') as f:
return [tuple(v) for v in json.load(f)]
except (ValueError, TypeError):
print('libidn2_versions.json is empty or broken, redownloading.')
return versions(env, update=True)
def dependencies(env: Environment, version) -> 'dict':
return {
'unistring': {}
}
def cook(env: Environment, version) -> dict:
url = f'https://ftp.gnu.org/gnu/libidn/libidn2-{version[0]}.{version[1]}.{version[2]}.tar.gz'
repo = env.DownloadAndExtract(f'libidn2_{version[0]}.{version[1]}.{version[2]}', url = url, skip_folders = 1)
checkout_root = repo['extracted_root']
build_result = env.AutotoolsProject(checkout_root)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib('idn2', paths=build_result['LIBPATH'])]
}

View File

@@ -1,38 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
imgui_source_files = [
os.path.join(repo['checkout_root'], 'imgui.cpp'),
os.path.join(repo['checkout_root'], 'imgui_draw.cpp'),
os.path.join(repo['checkout_root'], 'imgui_tables.cpp'),
os.path.join(repo['checkout_root'], 'imgui_widgets.cpp')
]
imgui_add_sources = []
for backend in env.get('IMGUI_BACKENDS', []):
imgui_add_sources.append(f'backends/imgui_impl_{backend}.cpp')
env.StaticLibrary(
CPPPATH = [repo['checkout_root']],
CPPDEFINES = ['IMGUI_IMPL_VULKAN_NO_PROTOTYPES=1'],
target = env['LIB_DIR'] + '/imgui',
source = imgui_source_files,
add_source = imgui_add_sources
)
return {
'CPPPATH': [repo['checkout_root']],
'LIBS': [os.path.join(env['LIB_DIR'], env.LibFilename('imgui'))]
}
env.GitRecipe(
globals = globals(),
repo_name = 'imgui',
repo_url = 'https://github.com/ocornut/imgui.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,20 +0,0 @@
import json
from SCons.Script import *
_REPO_NAME = 'iwa'
_REPO_URL = 'https://git.mewin.de/mewin/iwa.git'
def versions(env: Environment, update: bool = False):
return [(0, 0, 0)]
def dependencies(env: Environment, version) -> 'dict':
repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = 'master')
checkout_root = repo['checkout_root']
with open(os.path.join(checkout_root, 'dependencies.json'), 'r') as f:
return env.DepsFromJson(json.load(f))
def cook(env: Environment, version) -> dict:
repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = 'master')
checkout_root = repo['checkout_root']
return env.Module(os.path.join(checkout_root, 'SModule'))

View File

@@ -1,22 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root)
return {
'CPPPATH': build_result['CPPPATH']
}
env.GitRecipe(
globals = globals(),
repo_name = 'json',
repo_url = 'https://github.com/nlohmann/json.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,21 +0,0 @@
from SCons.Script import *
def versions(env: Environment, update: bool = False):
return [(1, 0)]
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
if env['COMPILER_FAMILY'] not in ('gcc', 'clang'):
env.Error('libbacktrace requires gcc or clang.')
repo = env.GitBranch(repo_name = 'libbacktrace', remote_url = 'https://github.com/ianlancetaylor/libbacktrace.git', git_ref = 'master')
checkout_root = repo['checkout_root']
build_result = env.AutotoolsProject(checkout_root)
return {
'LIBPATH': build_result['LIBPATH'],
'CPPPATH': build_result['CPPPATH'],
'LIBS': ['backtrace']
}

View File

@@ -1,21 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(checkout_root)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib('jpeg', paths=build_result['LIBPATH'])],
}
env.GitRecipe(
globals = globals(),
repo_name = 'libjpeg-turbo',
repo_url = 'https://github.com/libjpeg-turbo/libjpeg-turbo.git',
tag_pattern = re.compile(r'^([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,39 +0,0 @@
import os
import re
from SCons.Script import *
def _build_lib_name(env: Environment) -> str:
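# libpng's CMake build produces differently named static libraries depending on platform and build type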
if os.name == 'posix':
return {
'debug': 'png16d'
}.get(env['BUILD_TYPE'], 'png16')
elif os.name == 'nt':
return {
'debug': 'libpng16_staticd'
}.get(env['BUILD_TYPE'], 'libpng16_static')
else:
raise Exception('libpng is not supported yet on this OS')
def _git_cook(env: Environment, repo: dict) -> dict:
lib_zlib = env.Cook('zlib')
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(checkout_root, dependencies = [lib_zlib])
lib_name = _build_lib_name(env)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib(lib_name, paths=build_result['LIBPATH'])]
}
env.GitRecipe(
globals = globals(),
repo_name = 'libpng',
repo_url = 'https://git.code.sf.net/p/libpng/code.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook,
dependencies = {
'zlib': {}
}
)

View File

@@ -1,19 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
return {
'CPPPATH': [os.path.join(checkout_root, 'include')]
}
env.GitRecipe(
globals = globals(),
repo_name = 'magic_enum',
repo_url = 'https://github.com/Neargye/magic_enum.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,21 +0,0 @@
from SCons.Script import *
import os
def versions(env: Environment, update: bool = False):
return [(1, 0)]
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
repo = env.GitBranch(repo_name = 'mecab', remote_url = 'https://github.com/taku910/mecab.git', git_ref = 'master')
checkout_root = repo['checkout_root']
build_result = env.AutotoolsProject(os.path.join(checkout_root, 'mecab'))
return {
'LIBPATH': build_result['LIBPATH'],
'CPPPATH': build_result['CPPPATH'],
'LIBS': ['mecab']
}

View File

@@ -1,20 +0,0 @@
import json
from SCons.Script import *
_REPO_NAME = 'mijin'
_REPO_URL = 'https://git.mewin.de/mewin/mijin2.git'
def versions(env: Environment, update: bool = False):
return [(0, 0, 0)]
def dependencies(env: Environment, version) -> 'dict':
repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = 'master')
checkout_root = repo['checkout_root']
with open(os.path.join(checkout_root, 'dependencies.json'), 'r') as f:
return env.DepsFromJson(json.load(f))
def cook(env: Environment, version) -> dict:
repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = 'master')
checkout_root = repo['checkout_root']
return env.Module(os.path.join(checkout_root, 'SModule'))

View File

@@ -1,27 +0,0 @@
from SCons.Script import *
def versions(env: Environment, update: bool = False):
return [(1, 0)]
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
repo = env.GitBranch(repo_name = 'mikktspace', remote_url = 'https://github.com/mmikk/MikkTSpace.git', git_ref = 'master')
checkout_root = repo['checkout_root']
ccflags = env['CCFLAGS'].copy()
if env['COMPILER_FAMILY'] == 'cl':
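# /wd4456 silences MSVC's "declaration hides previous local declaration" warning emitted when compiling mikktspace.c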
ccflags.append('/wd4456')
lib_mikktspace = env.StaticLibrary(
CCFLAGS = ccflags,
CPPPATH = [checkout_root],
target = env['LIB_DIR'] + '/mikktspace',
source = [os.path.join(repo['checkout_root'], 'mikktspace.c')]
)
return {
'CPPPATH': [checkout_root],
'LIBS': [lib_mikktspace]
}

View File

@@ -1,21 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.AutotoolsProject(checkout_root, config_args = ['no-shared', 'no-tests', 'no-docs'], configure_script_path='Configure')
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib(libname, paths=build_result['LIBPATH']) for libname in ('ssl', 'crypto')]
}
env.GitRecipe(
globals = globals(),
repo_name = 'openssl',
repo_url = 'https://github.com/openssl/openssl.git',
tag_pattern = re.compile(r'^openssl-([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'openssl-{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,70 +0,0 @@
import json
import os
import re
import requests
from SCons.Script import *
_VERSIONS_URL = 'https://api.github.com/repos/rockdaboot/libpsl/releases'
_VERSION_PATTERN = re.compile(r'^Release v([0-9]+)\.([0-9]+)\.([0-9]+)$')
def versions(env: Environment, update: bool = False):
versions_file = os.path.join(env['DOWNLOAD_DIR'], 'libpsl_versions.json')
if update or not os.path.exists(versions_file):
req = requests.get(_VERSIONS_URL)
versions_data = json.loads(req.text)
result = []
for version_data in versions_data:
match = _VERSION_PATTERN.match(version_data['name'])
if not match:
continue
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2])))
with open(versions_file, 'w') as f:
json.dump(result, f)
return result
else:
try:
with open(versions_file, 'r') as f:
return [tuple(v) for v in json.load(f)]
except (ValueError, TypeError):
print('libpsl_versions.json is empty or broken, redownloading.')
return versions(env, update=True)
def dependencies(env: Environment, version) -> 'dict':
return {
'idn2': {},
'unistring': {}
}
def cook(env: Environment, version) -> dict:
url = f'https://github.com/rockdaboot/libpsl/releases/download/{version[0]}.{version[1]}.{version[2]}/libpsl-{version[0]}.{version[1]}.{version[2]}.tar.gz'
repo = env.DownloadAndExtract(f'libpsl_{version[0]}.{version[1]}.{version[2]}', url = url, skip_folders = 1)
checkout_root = repo['extracted_root']
build_result = env.AutotoolsProject(checkout_root)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib('psl', paths=build_result['LIBPATH'])]
}
#def _git_cook(env: Environment, repo: dict) -> dict:
# checkout_root = repo['checkout_root']
# subprocess.run((os.path.join(checkout_root, 'autogen.sh'),), cwd=checkout_root)
# build_result = env.AutotoolsProject(checkout_root)
# return {
# 'CPPPATH': build_result['CPPPATH'],
# 'LIBS': [env.FindLib('psl', paths=build_result['LIBPATH'])]
# }
#
#env.GitRecipe(
# globals = globals(),
# repo_name = 'psl',
# repo_url = 'https://github.com/rockdaboot/libpsl.git',
# tag_pattern = re.compile(r'^libpsl-([0-9]+)\.([0-9]+)\.([0-9]+)$'),
# tag_fn = lambda version: f'libpsl-{version[0]}.{version[1]}.{version[2]}',
# cook_fn = _git_cook,
# dependencies = {
# 'idn2': {},
# 'unistring': {}
# }
#)

View File

@@ -1,34 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
lib_fmt = env.Cook('fmt')
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root, dependencies=[lib_fmt])
lib_name = {
'debug': 'spdlogd'
}.get(env['BUILD_TYPE'], 'spdlog')
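# these defines select the compiled (non-header-only) spdlog and make it use the external fmt cooked above (SPDLOG_FMT_EXTERNAL) rather than the bundled copy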
cppdefines = ['SPDLOG_COMPILED_LIB=1', 'SPDLOG_FMT_EXTERNAL=1']
return {
'CPPPATH': build_result['CPPPATH'],
'CPPDEFINES': cppdefines,
'LIBS': [env.FindLib(lib_name, paths=build_result['LIBPATH'])]
}
env.GitRecipe(
globals = globals(),
repo_name = 'spdlog',
repo_url = 'https://github.com/gabime/spdlog.git',
tag_pattern = re.compile(r'^v([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'v{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook,
dependencies = {
'fmt': {}
}
)

View File

@@ -1,19 +0,0 @@
from SCons.Script import *
_REPO_NAME = 'stb'
_REPO_URL = 'https://github.com/nothings/stb.git'
def versions(env: Environment, update: bool = False):
return [(0, 0, 0)]
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = 'master')
checkout_root = repo['checkout_root']
return {
'CPPPATH': [checkout_root]
}

View File

@@ -1,42 +0,0 @@
import json
import os
import re
import requests
from SCons.Script import *
_VERSIONS_URL = 'https://ftp.gnu.org/gnu/libunistring/?F=0'
_VERSION_PATTERN = re.compile(r'href="libunistring-([0-9]+)\.([0-9]+)\.([0-9]+)\.tar\.gz"')
def versions(env: Environment, update: bool = False):
versions_file = os.path.join(env['DOWNLOAD_DIR'], 'libunistring_versions.json')
if update or not os.path.exists(versions_file):
req = requests.get(_VERSIONS_URL)
result = []
for match in _VERSION_PATTERN.finditer(req.text):
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2])))
with open(versions_file, 'w') as f:
json.dump(result, f)
return result
else:
try:
with open(versions_file, 'r') as f:
return [tuple(v) for v in json.load(f)]
except (ValueError, TypeError):
print('libunistring_versions.json is empty or broken, redownloading.')
return versions(env, update=True)
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
url = f'https://ftp.gnu.org/gnu/libunistring/libunistring-{version[0]}.{version[1]}.{version[2]}.tar.gz'
repo = env.DownloadAndExtract(f'libunistring_{version[0]}.{version[1]}.{version[2]}', url = url, skip_folders = 1)
checkout_root = repo['extracted_root']
build_result = env.AutotoolsProject(checkout_root)
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib('unistring', paths=build_result['LIBPATH'])]
}

View File

@@ -1,23 +0,0 @@
import os
from SCons.Script import *
def available(env: Environment):
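# returning a message string (instead of None) presumably marks the recipe as unavailable and explains why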
if os.name != 'nt':
return 'Winsock2 is only available on Windows.'
def versions(env: Environment, update: bool = False):
if os.name == 'nt':
return [(0, 0, 0)]
else:
return []
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
return {
'LIBS': ['Ws2_32']
}

View File

@@ -1,26 +0,0 @@
import re
from SCons.Script import *
def _git_cook(env: Environment, repo: dict) -> dict:
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root)
lib_name = {
'debug': 'yaml-cppd'
}.get(env['BUILD_TYPE'], 'yaml-cpp')
return {
'CPPPATH': build_result['CPPPATH'],
'LIBS': [env.FindLib(lib_name, paths=build_result['LIBPATH'])]
}
env.GitRecipe(
globals = globals(),
repo_name = 'yaml-cpp',
repo_url = 'https://github.com/jbeder/yaml-cpp.git',
tag_pattern = re.compile(r'^yaml-cpp-([0-9]+)\.([0-9]+)\.([0-9]+)$'),
tag_fn = lambda version: f'yaml-cpp-{version[0]}.{version[1]}.{version[2]}',
cook_fn = _git_cook
)

View File

@@ -1,49 +0,0 @@
import os
import re
from SCons.Script import *
_REPO_NAME = 'zlib'
_REPO_URL = 'https://github.com/madler/zlib.git'
_TAG_PATTERN = re.compile(r'^v([0-9]+)\.([0-9]+)(?:\.([0-9]+))?$')
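# the patch component is optional because some zlib tags are just vX.Y; versions() normalizes those to patch level 0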
def _build_lib_name(env: Environment) -> str:
if os.name == 'posix':
return 'z'
elif os.name == 'nt':
return {
'debug': 'zlibstaticd'
}.get(env['BUILD_TYPE'], 'zlibstatic')
else:
raise Exception('zlib is not supported yet on this OS')
def versions(env: Environment, update: bool = False):
tags = env.GitTags(repo_name = _REPO_NAME, remote_url = _REPO_URL, force_fetch=update)
result = []
for tag in tags:
match = _TAG_PATTERN.match(tag)
if match:
result.append((int(match.groups()[0]), int(match.groups()[1]), int(match.groups()[2] or 0)))
return result
def dependencies(env: Environment, version) -> 'dict':
return {}
def cook(env: Environment, version) -> dict:
git_ref = f'refs/tags/v{version[0]}.{version[1]}'
if version[2] != 0:
git_ref = git_ref + f'.{version[2]}'
repo = env.GitBranch(repo_name = _REPO_NAME, remote_url = _REPO_URL, git_ref = git_ref)
checkout_root = repo['checkout_root']
build_result = env.CMakeProject(project_root=checkout_root)
include_dir = os.path.join(build_result['install_dir'], 'include')
lib_name = _build_lib_name(env)
lib_file = env.FindLib(lib_name, paths=build_result['LIBPATH'])
return {
'CPPPATH': [include_dir],
'LIBS': [lib_file],
'CMAKE_VARS': {
'ZLIB_LIBRARY': lib_file,
'ZLIB_INCLUDE_DIR': include_dir
}
}

View File

@@ -1,2 +1,4 @@
GitPython
psutil
Jinja2
requests

View File

@@ -0,0 +1,8 @@
# Default ignored files
/shelf/
/workspace.xml
# Editor-based HTTP Client requests
/httpRequests/
# Datasource local storage ignored files
/dataSources/
/dataSources.local.xml

View File

@@ -0,0 +1,35 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CLionExternalBuildManager">
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<target id="{{ generate_uuid('target_' + executable.name + '_' + build_type) }}" name="{{ executable.name }} {{ build_type_name }}" defaultType="TOOL">
<configuration id="{{ generate_uuid('configuration_' + executable.name + '_' + build_type) }}" name="{{ executable.name }} {{ build_type_name }}">
<build type="TOOL">
<tool actionId="Tool_External Tools_{{ executable.name }} {{ build_type_name }}" />
</build>
<clean type="TOOL">
<tool actionId="Tool_External Tools_{{ executable.name }} {{ build_type_name }} Clean" />
</clean>
</configuration>
</target>
{% endfor %}
{% endfor %}
{% for library in project.libraries %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<target id="{{ generate_uuid('target_' + library.name + '_' + build_type) }}" name="{{ library.name }} {{ build_type_name }}" defaultType="TOOL">
<configuration id="{{ generate_uuid('configuration_' + library.name + '_' + build_type) }}" name="{{ library.name }} {{ build_type_name }}">
<build type="TOOL">
<tool actionId="Tool_External Tools_{{ library.name }} {{ build_type_name }}" />
</build>
<clean type="TOOL">
<tool actionId="Tool_External Tools_{{ library.name }} {{ build_type_name }} Clean" />
</clean>
</configuration>
</target>
{% endfor %}
{% endfor %}
</component>
</project>

View File

@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CompDBSettings">
<option name="linkedExternalProjectsSettings">
<CompDBProjectSettings>
<option name="externalProjectPath" value="$PROJECT_DIR$" />
<option name="modules">
<set>
<option value="$PROJECT_DIR$" />
</set>
</option>
</CompDBProjectSettings>
</option>
</component>
<component name="CompDBWorkspace" PROJECT_DIR="$PROJECT_DIR$" />
<component name="ExternalStorageConfigurationManager" enabled="true" />
</project>

View File

@@ -0,0 +1,40 @@
<toolSet name="External Tools">
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<tool name="{{ executable.name }} {{ build_type_name }}" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="-j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) }} compile_commands.json" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
<tool name="{{ executable.name }} {{ build_type_name }} Clean" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="--build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) }} -c" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
{% endfor %}
{% endfor %}
{% for library in project.libraries %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
<tool name="{{ library.name }} {{ build_type_name }}" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="-j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) }} compile_commands.json" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
<tool name="{{ library.name }} {{ build_type_name }} Clean" showInMainMenu="false" showInEditor="false" showInProject="false" showInSearchPopup="false" disabled="false" useConsole="true" showConsoleOnStdOut="false" showConsoleOnStdErr="false" synchronizeAfterRun="true">
<exec>
<option name="COMMAND" value="{{ scons_exe }}" />
<option name="PARAMETERS" value="--build_type={{ build_type }} --unity=disable {{ library.filename(build_type) }} -c" />
<option name="WORKING_DIRECTORY" value="$ProjectFileDir$" />
</exec>
</tool>
{% endfor %}
{% endfor %}
</toolSet>

View File

@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="" vcs="Git" />
<mapping directory="$PROJECT_DIR$/external/scons-plus-plus" vcs="Git" />
</component>
</project>

View File

@@ -0,0 +1,109 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="AutoImportSettings">
<option name="autoReloadType" value="SELECTIVE" />
</component>
<component name="CMakeRunConfigurationManager">
<generated />
</component>
<component name="CMakeSettings">
<configurations>
<configuration PROFILE_NAME="Debug" ENABLED="true" CONFIG_NAME="Debug" />
</configurations>
</component>
<component name="ClangdSettings">
<option name="formatViaClangd" value="false" />
</component>
<component name="CompDBLocalSettings">
<option name="availableProjects">
<map>
<entry>
<key>
<ExternalProjectPojo>
<option name="name" value="{{ project.name }}" />
<option name="path" value="$PROJECT_DIR$" />
</ExternalProjectPojo>
</key>
<value>
<list>
<ExternalProjectPojo>
<option name="name" value="{{ project.name }}" />
<option name="path" value="$PROJECT_DIR$" />
</ExternalProjectPojo>
</list>
</value>
</entry>
</map>
</option>
<option name="projectSyncType">
<map>
<entry key="$PROJECT_DIR$" value="RE_IMPORT" />
</map>
</option>
</component>
<component name="ExternalProjectsData">
<projectState path="$PROJECT_DIR$">
<ProjectState />
</projectState>
</component>
<component name="Git.Settings">
<option name="RECENT_GIT_ROOT_PATH" value="$PROJECT_DIR$" />
</component>
<component name="ProjectColorInfo">{
&quot;associatedIndex&quot;: 5
}</component>
<component name="ProjectViewState">
<option name="hideEmptyMiddlePackages" value="true" />
<option name="showLibraryContents" value="true" />
</component>
<component name="PropertiesComponent"><![CDATA[{
"keyToString": {
{% for executable in project.executables -%}
{% for build_type in project.build_types -%}
{% set build_type_name = build_type | capitalize -%}
"Custom Build Application.{{ executable.name }} {{ build_type_name }}.executor": "Debug",
{% endfor -%}
{% endfor -%}
"RunOnceActivity.RadMigrateCodeStyle": "true",
"RunOnceActivity.ShowReadmeOnStart": "true",
"RunOnceActivity.cidr.known.project.marker": "true",
"RunOnceActivity.readMode.enableVisualFormatting": "true",
"cf.first.check.clang-format": "false",
"cidr.known.project.marker": "true",
"git-widget-placeholder": "master",
"node.js.detected.package.eslint": "true",
"node.js.detected.package.tslint": "true",
"node.js.selected.package.eslint": "(autodetect)",
"node.js.selected.package.tslint": "(autodetect)",
"nodejs_package_manager_path": "npm",
"settings.editor.selected.configurable": "CLionExternalConfigurable",
"vue.rearranger.settings.migration": "true"
}
}]]></component>
<component name="RunManager" selected="Custom Build Application.{{ project.executables[0].name }} {{ project.build_types[0] }}">
{% for executable in project.executables -%}
{% for build_type in project.build_types -%}
{% set build_type_name = build_type | capitalize -%}
<configuration name="{{ executable.name }} {{ build_type_name }}" type="CLionExternalRunConfiguration" factoryName="Application" REDIRECT_INPUT="false" ELEVATE="false" USE_EXTERNAL_CONSOLE="false" EMULATE_TERMINAL="false" WORKING_DIR="file://$ProjectFileDir$" PASS_PARENT_ENVS_2="true" PROJECT_NAME="{{ project.name }}" TARGET_NAME="{{ executable.name }} {{ build_type_name }}" CONFIG_NAME="{{ executable.name }} {{ build_type_name }}" RUN_PATH="$PROJECT_DIR$/{{ executable.filename(build_type) }}">
<method v="2">
<option name="CLION.EXTERNAL.BUILD" enabled="true" />
</method>
</configuration>
{% endfor -%}
{% endfor -%}
{% for library in project.libraries -%}
{% for build_type in project.build_types -%}
{% set build_type_name = build_type | capitalize -%}
<configuration name="{{ library.name }} {{ build_type_name }}" type="CLionExternalRunConfiguration" factoryName="Application" REDIRECT_INPUT="false" ELEVATE="false" USE_EXTERNAL_CONSOLE="false" EMULATE_TERMINAL="false" PASS_PARENT_ENVS_2="true" PROJECT_NAME="{{ project.name }}" TARGET_NAME="{{ library.name }} {{ build_type_name }}" CONFIG_NAME="{{ library.name }} {{ build_type_name }}">
<method v="2">
<option name="CLION.EXTERNAL.BUILD" enabled="true" />
</method>
</configuration>
{% endfor -%}
{% endfor -%}
</component>
<component name="SpellCheckerSettings" RuntimeDictionaries="0" Folders="0" CustomDictionaries="0" DefaultDictionary="application-level" UseSingleDictionary="true" transferred="true" />
<component name="TypeScriptGeneratedFilesManager">
<option name="version" value="3" />
</component>
</project>

View File

@@ -0,0 +1,6 @@
{
"recommendations": [
"ms-vscode.cpptools",
"llvm-vs-code-extensions.vscode-clangd"
]
}

View File

@@ -0,0 +1,20 @@
{
"configurations": [
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
{
"name": "{{ executable.name }} ({{ build_type | capitalize }})",
"type": "cppvsdbg",
"request": "launch",
"program": "{{ executable.filename(build_type) | escape_path }}",
"args": [],
"stopAtEntry": false,
"cwd": "${workspaceFolder}",
"environment": [],
"console": "integratedTerminal"
},
{% endfor %}
{% endfor %}
]
}

View File

@@ -0,0 +1,69 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"tasks": [
{% for executable in project.executables %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
{
"label": "{{ executable.name }} {{ build_type_name }}",
"type": "shell",
"command": "{{ scons_exe | escape_path }} -j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) | escape_path }} compile_commands.json",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{
"label": "{{ executable.name }} {{ build_type_name }} Clean",
"type": "shell",
"command": "{{ scons_exe | escape_path }} --build_type={{ build_type }} --unity=disable {{ executable.filename(build_type) | escape_path }} -c",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{% endfor %}
{% endfor %}
{% for library in project.libraries %}
{% for build_type in project.build_types %}
{% set build_type_name = build_type | capitalize -%}
{
"label": "{{ library.name }} {{ build_type_name }}",
"type": "shell",
"command": "{{ scons_exe | escape_path }} -j{{ nproc }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) | escape_path }} compile_commands.json",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{
"label": "{{ library.name }} {{ build_type_name }} Clean",
"type": "shell",
"command": "{{ scons_exe | escape_path }} --build_type={{ build_type }} --unity=disable {{ library.filename(build_type) | escape_path }} -c",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": {
"kind": "build",
"isDefault": false
}
},
{% endfor %}
{% endfor %}
]
}