Mirror of https://github.com/Ed94/gencpp.git
synced 2024-12-22 07:44:45 -08:00
Preparing to implement ADT for csv functions.

I'm rewriting it the way I'd like to learn it.
- I want to use CSV parsing heavily with the library, so I'm just going to add it to the scanner.
- The global memory allocator moved to the regular gen header/source, as it's something really just made for the library.
- Some small refactors to macros.
- The parser was updated to support tokenizing preprocessor directives; this is based on the intuition that the scanner will require it.
Parent commit: 2a319ed6db
This commit: 9a784fe92f
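A minimal usage sketch (editorial note, not part of the commit) of how the relocated global allocator is meant to be consumed; it assumes GEN_EXPOSE_BACKEND is defined so GlobalAllocator is visible, and the String calls mirror the ones this commit switches over to it:

using namespace gen;

init();                                                // sets up GlobalAllocator and its first Arena bucket
String scratch = String::make( GlobalAllocator, "" );  // serialization intermediates now draw from the bucket list
scratch.append_fmt( "// generated %d entries", 42 );   // growth appends a new bucket when the current one fills
deinit();                                              // frees every bucket in Global_AllocatorBuckets
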
.vscode/settings.json (vendored): 3 changed lines
@ -23,5 +23,6 @@
"C_Cpp.intelliSenseEngineFallback": "disabled",
"mesonbuild.configureOnOpen": true,
"C_Cpp.errorSquiggles": "enabled",
"godot_tools.scene_file_config": ""
"godot_tools.scene_file_config": "",
"C_Cpp.default.compilerPath": "cl.exe"
}
Readme.md: 29 changed lines
@ -18,14 +18,12 @@ These build up a code AST to then serialize with a file builder.
* [On multithreading](#on-multi-threading)
* [Extending the library](#extending-the-library)
* [TODO](#todo)
* [Thoughts](#thoughts)

## Notes

The project has reached an *alpha* state; all the current functionality works for the test cases, but it will most likely break in many other cases.

Note: Do not try to do any large generations with this (at least not without changing the serialization implementation).
It does not reuse any memory yet for dynamic strings, and thus any significant generation will result in massive memory consumption.

The project has no external dependencies beyond:

* `stdarg.h`
@ -680,11 +678,28 @@ Names or Content fields are interned strings and thus should be cached using `ge
* Implement a context stack for the parsing; it allows accurate scope validation for the AST types.
* Make a more robust test suite.
* Generate a single-header library.
* Improve the allocation strategy for strings in `Builder`, `AST::to_string`, and `Parser::lex`; all three can use some form of slab allocation strategy...
  * Can most likely use a simple slab allocator.
* Convert the global allocation strategy to use the dual-scratch allocator for a contextual scope.
* May be in need of a better name; I found a few repos with this same one...
* Support module and attribute parsing (marked with TODOs for now).
* Suffix specifiers for functions (const, override, final)
* Trailing specifiers (postfix) for functions (const, override, final)
* Implement the Scanner
* Implement the Editor
* Support parsing full enum definitions inside a typedef. (For C patterns)
* Support defining/parsing full definitions inside a typedef. (For C patterns)
* Make the library bootstrap itself? It would let the generated code use fewer macros.
  * Easier to tailor the library for other projects.
  * Most code can be componentized into files and then scanned in.
  * Can offer a more C-like version of the implementation, make namespaces optional, etc. (a good way to stress test it).

# Thoughts

This project came about for a few reasons:

* I've been trying out the "handmade" approach to programming to see what it's like in practice versus what I have to use at work, and what I learned before getting exposed to the community.
* It's very hard to unlearn OOP.
* Not a fan of pure C; maybe I'll succumb to its drawbacks.
* All alternatives to C/C++ are too opinionated instead of providing a lax frontend, or a proper compiler backend with a frontend API to quickly roll your own frontend.
* One of the core issues I've always had with programming is that there has always been a need for metaprogramming, but every single tool has horrible error deduction for the user (the backend is a black box due to codebase size or closed source, and the error logs are a nightmare).
* I spend an obnoxious amount of time trying to express code that cannot be expressed well in templates or macros and still have an adequate editor experience, even with full-blown IDEs.
* I wanted to be able to easily refactor code integrated with other projects with some form of curation, and still not have to maintain a separate fork (IF the scanner gets implemented, that is possible).
* I did not use Metadesk, as it was too esoteric a library for me to use as a dependency when I didn't fully grasp the vision for how this library would end up. (Not much practice doing metaprogramming or code gen/transform development.)
* I have no issue rewriting the library to use it as a backend if it's worthwhile, but it's most likely better to just make an extension for it.

project/gen.cpp: 458 changed lines
@ -11,24 +11,25 @@
|
||||
|
||||
namespace gen
|
||||
{
|
||||
namespace StaticData
|
||||
{
|
||||
global Array< Pool > CodePools = { nullptr };
|
||||
global Array< Arena > StringArenas = { nullptr };
|
||||
#pragma region StaticData
|
||||
// TODO : Convert global allocation strategy to use the dual-scratch allocator for a contextual scope.
|
||||
global AllocatorInfo GlobalAllocator;
|
||||
global Array<Arena> Global_AllocatorBuckets;
|
||||
|
||||
global StringTable StringCache;
|
||||
global Array< Pool > CodePools = { nullptr };
|
||||
global Array< Arena > StringArenas = { nullptr };
|
||||
|
||||
// TODO : Need to implement String memory management for serialization intermediates.
|
||||
global StringTable StringCache;
|
||||
|
||||
global Arena LexArena;
|
||||
global Arena LexArena;
|
||||
|
||||
global AllocatorInfo Allocator_DataArrays = heap();
|
||||
global AllocatorInfo Allocator_CodePool = heap();
|
||||
global AllocatorInfo Allocator_Lexer = heap();
|
||||
global AllocatorInfo Allocator_StringArena = heap();
|
||||
global AllocatorInfo Allocator_StringTable = heap();
|
||||
global AllocatorInfo Allocator_TypeTable = heap();
|
||||
}
|
||||
global AllocatorInfo Allocator_DataArrays = heap();
|
||||
global AllocatorInfo Allocator_CodePool = heap();
|
||||
global AllocatorInfo Allocator_Lexer = heap();
|
||||
global AllocatorInfo Allocator_StringArena = heap();
|
||||
global AllocatorInfo Allocator_StringTable = heap();
|
||||
global AllocatorInfo Allocator_TypeTable = heap();
|
||||
#pragma endregion StaticData
|
||||
|
||||
#pragma region Constants
|
||||
global CodeType t_auto;
|
||||
@ -169,6 +170,7 @@ namespace gen
|
||||
mem_copy( result, this, sizeof( AST ) );
|
||||
result->Parent = nullptr;
|
||||
#else
|
||||
// TODO : Stress test this...
|
||||
switch ( Type )
|
||||
{
|
||||
case Invalid:
|
||||
@ -391,7 +393,7 @@ namespace gen
|
||||
#endif
|
||||
|
||||
// TODO : Need to refactor so that intermediate strings are freed conveniently.
|
||||
String result = String::make( Memory::GlobalAllocator, "" );
|
||||
String result = String::make( GlobalAllocator, "" );
|
||||
|
||||
switch ( Type )
|
||||
{
|
||||
@ -582,7 +584,7 @@ namespace gen
|
||||
{
|
||||
result.append_fmt( "export\n{\n" );
|
||||
|
||||
Code curr = cast<Code>();
|
||||
Code curr = { this };
|
||||
s32 left = NumEntries;
|
||||
while ( left-- )
|
||||
{
|
||||
@ -751,7 +753,7 @@ namespace gen
|
||||
|
||||
if ( NumEntries - 1 > 0)
|
||||
{
|
||||
for ( CodeParam param : Next->cast<CodeParam>() )
|
||||
for ( CodeParam param : (CodeParam){ (AST_Param*)Next } )
|
||||
{
|
||||
result.append_fmt( ", %s", param.to_string() );
|
||||
}
|
||||
@ -849,7 +851,7 @@ namespace gen
|
||||
|
||||
result.append_fmt( "%s %s", UnderlyingType->to_string(), Name );
|
||||
|
||||
if ( UnderlyingType->ArrExpr )
|
||||
if ( UnderlyingType->Type == Typename && UnderlyingType->ArrExpr )
|
||||
{
|
||||
result.append_fmt( "[%s];", UnderlyingType->ArrExpr->to_string() );
|
||||
}
|
||||
@ -1087,54 +1089,70 @@ namespace gen
|
||||
#pragma endregion AST
|
||||
|
||||
#pragma region Gen Interface
|
||||
void init()
|
||||
internal void* Global_Allocator_Proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
using namespace StaticData;
|
||||
Arena& last = Global_AllocatorBuckets.back();
|
||||
|
||||
Memory::setup();
|
||||
|
||||
// Setup the arrays
|
||||
switch ( type )
|
||||
{
|
||||
CodePools = Array<Pool>::init_reserve( Allocator_DataArrays, InitSize_DataArrays );
|
||||
case EAllocation_ALLOC:
|
||||
{
|
||||
if ( last.TotalUsed + size > last.TotalSize )
|
||||
{
|
||||
Arena bucket = Arena::init_from_allocator( heap(), Global_BucketSize );
|
||||
|
||||
if ( CodePools == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the CodePools array" );
|
||||
if ( bucket.PhysicalStart == nullptr )
|
||||
fatal( "Failed to create bucket for Global_AllocatorBuckets");
|
||||
|
||||
StringArenas = Array<Arena>::init_reserve( Allocator_DataArrays, InitSize_DataArrays );
|
||||
if ( ! Global_AllocatorBuckets.append( bucket ) )
|
||||
fatal( "Failed to append bucket to Global_AllocatorBuckets");
|
||||
|
||||
if ( StringArenas == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the StringArenas array" );
|
||||
last = Global_AllocatorBuckets.back();
|
||||
}
|
||||
|
||||
return alloc_align( last, size, alignment );
|
||||
}
|
||||
case EAllocation_FREE:
|
||||
{
|
||||
// Doesn't recycle.
|
||||
}
|
||||
break;
|
||||
case EAllocation_FREE_ALL:
|
||||
{
|
||||
// Memory::cleanup instead.
|
||||
}
|
||||
break;
|
||||
case EAllocation_RESIZE:
|
||||
{
|
||||
if ( last.TotalUsed + size > last.TotalSize )
|
||||
{
|
||||
Arena bucket = Arena::init_from_allocator( heap(), Global_BucketSize );
|
||||
|
||||
if ( bucket.PhysicalStart == nullptr )
|
||||
fatal( "Failed to create bucket for Global_AllocatorBuckets");
|
||||
|
||||
if ( ! Global_AllocatorBuckets.append( bucket ) )
|
||||
fatal( "Failed to append bucket to Global_AllocatorBuckets");
|
||||
|
||||
last = Global_AllocatorBuckets.back();
|
||||
}
|
||||
|
||||
void* result = alloc_align( last.Backing, size, alignment );
|
||||
|
||||
if ( result != nullptr && old_memory != nullptr )
|
||||
{
|
||||
mem_copy( result, old_memory, old_size );
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
}
|
||||
|
||||
// Setup the code pool and code entries arena.
|
||||
{
|
||||
Pool code_pool = Pool::init( Allocator_CodePool, CodePool_NumBlocks, sizeof(AST) );
|
||||
|
||||
if ( code_pool.PhysicalStart == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the code pool" );
|
||||
|
||||
CodePools.append( code_pool );
|
||||
|
||||
#ifdef GEN_FEATURE_PARSING
|
||||
LexArena = Arena::init_from_allocator( Allocator_Lexer, LexAllocator_Size );
|
||||
#endif
|
||||
|
||||
Arena string_arena = Arena::init_from_allocator( Allocator_StringArena, SizePer_StringArena );
|
||||
|
||||
if ( string_arena.PhysicalStart == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the string arena" );
|
||||
|
||||
StringArenas.append( string_arena );
|
||||
}
|
||||
|
||||
// Setup the hash tables
|
||||
{
|
||||
StringCache = StringTable::init( Allocator_StringTable );
|
||||
|
||||
if ( StringCache.Entries == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the StringCache");
|
||||
}
|
||||
return nullptr;
|
||||
}
|
||||
|
||||
internal void define_constants()
|
||||
{
|
||||
Code::Global = make_code();
|
||||
Code::Global->Name = get_cached_string( txt_StrC("Global Code") );
|
||||
Code::Global->Content = Code::Global->Name;
|
||||
@ -1248,10 +1266,73 @@ namespace gen
|
||||
# undef def_constant_spec
|
||||
}
|
||||
|
||||
void init()
|
||||
{
|
||||
// Setup global allocator
|
||||
{
|
||||
GlobalAllocator = AllocatorInfo { & Global_Allocator_Proc, nullptr };
|
||||
|
||||
Global_AllocatorBuckets = Array<Arena>::init_reserve( heap(), 128 );
|
||||
|
||||
if ( Global_AllocatorBuckets == nullptr )
|
||||
fatal( "Failed to reserve memory for Global_AllocatorBuckets");
|
||||
|
||||
Arena bucket = Arena::init_from_allocator( heap(), Global_BucketSize );
|
||||
|
||||
if ( bucket.PhysicalStart == nullptr )
|
||||
fatal( "Failed to create first bucket for Global_AllocatorBuckets");
|
||||
|
||||
Global_AllocatorBuckets.append( bucket );
|
||||
|
||||
}
|
||||
|
||||
// Setup the arrays
|
||||
{
|
||||
CodePools = Array<Pool>::init_reserve( Allocator_DataArrays, InitSize_DataArrays );
|
||||
|
||||
if ( CodePools == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the CodePools array" );
|
||||
|
||||
StringArenas = Array<Arena>::init_reserve( Allocator_DataArrays, InitSize_DataArrays );
|
||||
|
||||
if ( StringArenas == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the StringArenas array" );
|
||||
}
|
||||
|
||||
// Setup the code pool and code entries arena.
|
||||
{
|
||||
Pool code_pool = Pool::init( Allocator_CodePool, CodePool_NumBlocks, sizeof(AST) );
|
||||
|
||||
if ( code_pool.PhysicalStart == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the code pool" );
|
||||
|
||||
CodePools.append( code_pool );
|
||||
|
||||
#ifdef GEN_FEATURE_PARSING
|
||||
LexArena = Arena::init_from_allocator( Allocator_Lexer, LexAllocator_Size );
|
||||
#endif
|
||||
|
||||
Arena string_arena = Arena::init_from_allocator( Allocator_StringArena, SizePer_StringArena );
|
||||
|
||||
if ( string_arena.PhysicalStart == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the string arena" );
|
||||
|
||||
StringArenas.append( string_arena );
|
||||
}
|
||||
|
||||
// Setup the hash tables
|
||||
{
|
||||
StringCache = StringTable::init( Allocator_StringTable );
|
||||
|
||||
if ( StringCache.Entries == nullptr )
|
||||
fatal( "gen::init: Failed to initialize the StringCache");
|
||||
}
|
||||
|
||||
define_constants();
|
||||
}
|
||||
|
||||
void deinit()
|
||||
{
|
||||
using namespace StaticData;
|
||||
|
||||
s32 index = 0;
|
||||
s32 left = CodePools.num();
|
||||
do
|
||||
@ -1281,13 +1362,48 @@ namespace gen
|
||||
LexArena.free();
|
||||
#endif
|
||||
|
||||
Memory::cleanup();
|
||||
index = 0;
|
||||
left = Global_AllocatorBuckets.num();
|
||||
do
|
||||
{
|
||||
Arena* bucket = & Global_AllocatorBuckets[ index ];
|
||||
bucket->free();
|
||||
index++;
|
||||
}
|
||||
while ( left--, left );
|
||||
|
||||
Global_AllocatorBuckets.free();
|
||||
}
|
||||
|
||||
void reset()
|
||||
{
|
||||
s32 index = 0;
|
||||
s32 left = CodePools.num();
|
||||
do
|
||||
{
|
||||
Pool* code_pool = & CodePools[index];
|
||||
code_pool->clear();
|
||||
index++;
|
||||
}
|
||||
while ( left--, left );
|
||||
|
||||
index = 0;
|
||||
left = StringArenas.num();
|
||||
do
|
||||
{
|
||||
Arena* string_arena = & StringArenas[index];
|
||||
string_arena->TotalUsed = 0;;
|
||||
index++;
|
||||
}
|
||||
while ( left--, left );
|
||||
|
||||
StringCache.clear();
|
||||
|
||||
define_constants();
|
||||
}
|
||||
|
||||
AllocatorInfo get_string_allocator( s32 str_length )
|
||||
{
|
||||
using namespace StaticData;
|
||||
|
||||
Arena* last = & StringArenas.back();
|
||||
|
||||
uw size_req = str_length + sizeof(String::Header) + sizeof(char*);
|
||||
@ -1308,8 +1424,6 @@ namespace gen
|
||||
// Will either make or retrieve a code string.
|
||||
StringCached get_cached_string( StrC str )
|
||||
{
|
||||
using namespace StaticData;
|
||||
|
||||
s32 hash_length = str.Len > kilobytes(1) ? kilobytes(1) : str.Len;
|
||||
u64 key = crc32( str.Ptr, hash_length );
|
||||
{
|
||||
@ -1331,8 +1445,6 @@ namespace gen
|
||||
*/
|
||||
Code make_code()
|
||||
{
|
||||
using namespace StaticData;
|
||||
|
||||
Pool* allocator = & CodePools.back();
|
||||
if ( allocator->FreeList == nullptr )
|
||||
{
|
||||
@ -1693,30 +1805,30 @@ namespace gen
|
||||
|
||||
void set_allocator_data_arrays( AllocatorInfo allocator )
|
||||
{
|
||||
StaticData::Allocator_DataArrays = allocator;
|
||||
Allocator_DataArrays = allocator;
|
||||
}
|
||||
|
||||
void set_allocator_code_pool( AllocatorInfo allocator )
|
||||
{
|
||||
StaticData::Allocator_CodePool = allocator;
|
||||
Allocator_CodePool = allocator;
|
||||
}
|
||||
|
||||
void set_allocator_lexer( AllocatorInfo allocator )
|
||||
{
|
||||
StaticData::Allocator_Lexer = allocator;
|
||||
Allocator_Lexer = allocator;
|
||||
}
|
||||
|
||||
void set_allocator_string_arena( AllocatorInfo allocator )
|
||||
{
|
||||
StaticData::Allocator_StringArena = allocator;
|
||||
Allocator_StringArena = allocator;
|
||||
}
|
||||
|
||||
void set_allocator_string_table( AllocatorInfo allocator )
|
||||
{
|
||||
StaticData::Allocator_StringArena = allocator;
|
||||
Allocator_StringArena = allocator;
|
||||
}
|
||||
|
||||
#pragma region Helper Marcos
|
||||
#pragma region Helper Marcojs
|
||||
// This snippet is used in nearly all the functions.
|
||||
# define name_check( Context_, Name_ ) \
|
||||
{ \
|
||||
@ -2420,7 +2532,7 @@ namespace gen
|
||||
return result;
|
||||
}
|
||||
|
||||
CodeTypedef def_typedef( StrC name, CodeType type, CodeAttributes attributes, ModuleFlag mflags )
|
||||
CodeTypedef def_typedef( StrC name, Code type, CodeAttributes attributes, ModuleFlag mflags )
|
||||
{
|
||||
name_check( def_typedef, name );
|
||||
null_check( def_typedef, type );
|
||||
@ -3194,67 +3306,71 @@ namespace gen
|
||||
/*
|
||||
This is a simple lexer that focuses on tokenizing only tokens relevant to the library.
|
||||
It will not be capable of lexing C++ code with unsupported features.
|
||||
|
||||
For the sake of scanning files, it can scan preprocessor directives
|
||||
*/
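A hedged sketch (editorial, not part of the diff) of what the new directive handling below is expected to produce; Parser::lex and its keep_preprocess_directives flag come from this commit, while the input literal is only an illustration:

StrC src = txt_StrC( "#define VALUE 1\nint VALUE;\n" );

// With the flag set, the first token is a single Preprocessor_Directive covering
// "#define VALUE 1"; backslash line continuations are folded into the same token.
Parser::TokArray with_directives = Parser::lex( src, true );

// With the default of false, the directive token is dropped before it reaches the array.
Parser::TokArray without_directives = Parser::lex( src );
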
|
||||
|
||||
# define Define_TokType \
|
||||
Entry( Access_Private, "private" ) \
|
||||
Entry( Access_Protected, "protected" ) \
|
||||
Entry( Access_Public, "public" ) \
|
||||
Entry( Access_MemberSymbol, "." ) \
|
||||
Entry( Access_StaticSymbol, "::") \
|
||||
Entry( Ampersand, "&" ) \
|
||||
Entry( Ampersand_DBL, "&&" ) \
|
||||
Entry( Assign_Classifer, ":" ) \
|
||||
Entry( BraceCurly_Open, "{" ) \
|
||||
Entry( BraceCurly_Close, "}" ) \
|
||||
Entry( BraceSquare_Open, "[" ) \
|
||||
Entry( BraceSquare_Close, "]" ) \
|
||||
Entry( Capture_Start, "(" ) \
|
||||
Entry( Capture_End, ")" ) \
|
||||
Entry( Comment, "__comment__" ) \
|
||||
Entry( Char, "__char__" ) \
|
||||
Entry( Comma, "," ) \
|
||||
Entry( Decl_Class, "class" ) \
|
||||
Entry( Decl_Enum, "enum" ) \
|
||||
Entry( Decl_Extern_Linkage, "extern" ) \
|
||||
Entry( Decl_Friend, "friend" ) \
|
||||
Entry( Decl_Module, "module" ) \
|
||||
Entry( Decl_Namespace, "namespace" ) \
|
||||
Entry( Decl_Operator, "operator" ) \
|
||||
Entry( Decl_Struct, "struct" ) \
|
||||
Entry( Decl_Template, "template" ) \
|
||||
Entry( Decl_Typedef, "typedef" ) \
|
||||
Entry( Decl_Using, "using" ) \
|
||||
Entry( Decl_Union, "union" ) \
|
||||
Entry( Identifier, "__SymID__" ) \
|
||||
Entry( Module_Import, "import" ) \
|
||||
Entry( Module_Export, "export" ) \
|
||||
Entry( Number, "number" ) \
|
||||
Entry( Operator, "operator" ) \
|
||||
Entry( Spec_Alignas, "alignas" ) \
|
||||
Entry( Spec_Const, "const" ) \
|
||||
Entry( Spec_Consteval, "consteval" ) \
|
||||
Entry( Spec_Constexpr, "constexpr" ) \
|
||||
Entry( Spec_Constinit, "constinit" ) \
|
||||
Entry( Spec_Extern, "extern" ) \
|
||||
Entry( Spec_Global, "global" ) \
|
||||
Entry( Spec_Inline, "inline" ) \
|
||||
Entry( Spec_Internal_Linkage, "internal" ) \
|
||||
Entry( Spec_LocalPersist, "local_persist" ) \
|
||||
Entry( Spec_Mutable, "mutable" ) \
|
||||
Entry( Spec_Static, "static" ) \
|
||||
Entry( Spec_ThreadLocal, "thread_local" ) \
|
||||
Entry( Spec_Volatile, "volatile") \
|
||||
Entry( Star, "*" ) \
|
||||
Entry( Statement_End, ";" ) \
|
||||
Entry( String, "__String__" ) \
|
||||
Entry( Type_Unsigned, "unsigned" ) \
|
||||
Entry( Type_Signed, "signed" ) \
|
||||
Entry( Type_Short, "short" ) \
|
||||
Entry( Type_Long, "long" ) \
|
||||
Entry( Type_char, "char" ) \
|
||||
Entry( Type_int, "int" ) \
|
||||
Entry( Type_double, "double" )
|
||||
Entry( Access_Private, "private" ) \
|
||||
Entry( Access_Protected, "protected" ) \
|
||||
Entry( Access_Public, "public" ) \
|
||||
Entry( Access_MemberSymbol, "." ) \
|
||||
Entry( Access_StaticSymbol, "::") \
|
||||
Entry( Ampersand, "&" ) \
|
||||
Entry( Ampersand_DBL, "&&" ) \
|
||||
Entry( Assign_Classifer, ":" ) \
|
||||
Entry( BraceCurly_Open, "{" ) \
|
||||
Entry( BraceCurly_Close, "}" ) \
|
||||
Entry( BraceSquare_Open, "[" ) \
|
||||
Entry( BraceSquare_Close, "]" ) \
|
||||
Entry( Capture_Start, "(" ) \
|
||||
Entry( Capture_End, ")" ) \
|
||||
Entry( Comment, "__comment__" ) \
|
||||
Entry( Char, "__char__" ) \
|
||||
Entry( Comma, "," ) \
|
||||
Entry( Decl_Class, "class" ) \
|
||||
Entry( Decl_Enum, "enum" ) \
|
||||
Entry( Decl_Extern_Linkage, "extern" ) \
|
||||
Entry( Decl_Friend, "friend" ) \
|
||||
Entry( Decl_Module, "module" ) \
|
||||
Entry( Decl_Namespace, "namespace" ) \
|
||||
Entry( Decl_Operator, "operator" ) \
|
||||
Entry( Decl_Struct, "struct" ) \
|
||||
Entry( Decl_Template, "template" ) \
|
||||
Entry( Decl_Typedef, "typedef" ) \
|
||||
Entry( Decl_Using, "using" ) \
|
||||
Entry( Decl_Union, "union" ) \
|
||||
Entry( Identifier, "__identifier__" ) \
|
||||
Entry( Module_Import, "import" ) \
|
||||
Entry( Module_Export, "export" ) \
|
||||
Entry( Number, "number" ) \
|
||||
Entry( Operator, "operator" ) \
|
||||
Entry( Preprocessor_Directive, "#") \
|
||||
Entry( Preprocessor_Include, "include" ) \
|
||||
Entry( Spec_Alignas, "alignas" ) \
|
||||
Entry( Spec_Const, "const" ) \
|
||||
Entry( Spec_Consteval, "consteval" ) \
|
||||
Entry( Spec_Constexpr, "constexpr" ) \
|
||||
Entry( Spec_Constinit, "constinit" ) \
|
||||
Entry( Spec_Extern, "extern" ) \
|
||||
Entry( Spec_Global, "global" ) \
|
||||
Entry( Spec_Inline, "inline" ) \
|
||||
Entry( Spec_Internal_Linkage, "internal" ) \
|
||||
Entry( Spec_LocalPersist, "local_persist" ) \
|
||||
Entry( Spec_Mutable, "mutable" ) \
|
||||
Entry( Spec_Static, "static" ) \
|
||||
Entry( Spec_ThreadLocal, "thread_local" ) \
|
||||
Entry( Spec_Volatile, "volatile") \
|
||||
Entry( Star, "*" ) \
|
||||
Entry( Statement_End, ";" ) \
|
||||
Entry( String, "__string__" ) \
|
||||
Entry( Type_Unsigned, "unsigned" ) \
|
||||
Entry( Type_Signed, "signed" ) \
|
||||
Entry( Type_Short, "short" ) \
|
||||
Entry( Type_Long, "long" ) \
|
||||
Entry( Type_char, "char" ) \
|
||||
Entry( Type_int, "int" ) \
|
||||
Entry( Type_double, "double" )
|
||||
|
||||
enum class TokType : u32
|
||||
{
|
||||
@ -3368,7 +3484,7 @@ namespace gen
|
||||
|
||||
if ( Arr[Idx].Type != type )
|
||||
{
|
||||
String token_str = String::make( Memory::GlobalAllocator, { Arr[Idx].Length, Arr[Idx].Text } );
|
||||
String token_str = String::make( GlobalAllocator, { Arr[Idx].Length, Arr[Idx].Text } );
|
||||
|
||||
log_failure( "gen::%s: expected %s, got %s", context, str_tok_type(type), str_tok_type(Arr[Idx].Type) );
|
||||
|
||||
@ -3395,7 +3511,7 @@ namespace gen
|
||||
}
|
||||
};
|
||||
|
||||
TokArray lex( StrC content )
|
||||
TokArray lex( StrC content, bool keep_preprocess_directives = false )
|
||||
{
|
||||
# define current ( * scanner )
|
||||
|
||||
@ -3441,7 +3557,7 @@ namespace gen
|
||||
Tokens.free();
|
||||
}
|
||||
|
||||
Tokens = Array<Token>::init_reserve( StaticData::LexArena, content.Len / 6 );
|
||||
Tokens = Array<Token>::init_reserve( LexArena, content.Len / 6 );
|
||||
|
||||
while (left )
|
||||
{
|
||||
@ -3453,6 +3569,29 @@ namespace gen
|
||||
|
||||
switch ( current )
|
||||
{
|
||||
case '#':
|
||||
token.Text = scanner;
|
||||
token.Length = 1;
|
||||
token.Type = TokType::Preprocessor_Directive;
|
||||
move_forward();
|
||||
|
||||
while (left && current != '\n' )
|
||||
{
|
||||
if ( current == '\\' )
|
||||
{
|
||||
move_forward();
|
||||
|
||||
if ( current != '\n' && keep_preprocess_directives )
|
||||
{
|
||||
log_failure( "gen::lex: invalid preprocessor directive, will still grab but will not compile %s", token.Text );
|
||||
}
|
||||
}
|
||||
|
||||
move_forward();
|
||||
token.Length++;
|
||||
}
|
||||
goto FoundToken;
|
||||
|
||||
case '.':
|
||||
token.Text = scanner;
|
||||
token.Length = 1;
|
||||
@ -3826,7 +3965,7 @@ namespace gen
|
||||
}
|
||||
else
|
||||
{
|
||||
String context_str = String::fmt_buf( Memory::GlobalAllocator, "%s", scanner, min( 100, left ) );
|
||||
String context_str = String::fmt_buf( GlobalAllocator, "%s", scanner, min( 100, left ) );
|
||||
|
||||
log_failure( "Failed to lex token %s", context_str );
|
||||
|
||||
@ -3841,6 +3980,9 @@ namespace gen
|
||||
|
||||
if ( token.Type != TokType::Invalid )
|
||||
{
|
||||
if ( token.Type == TokType::Preprocessor_Directive && keep_preprocess_directives == false )
|
||||
continue;
|
||||
|
||||
Tokens.append( token );
|
||||
continue;
|
||||
}
|
||||
@ -3848,10 +3990,7 @@ namespace gen
|
||||
TokType type = get_tok_type( token.Text, token.Length );
|
||||
|
||||
if ( type == TokType::Invalid)
|
||||
{
|
||||
// It's most likely an identifier...
|
||||
type = TokType::Identifier;
|
||||
}
|
||||
|
||||
token.Type = type;
|
||||
Tokens.append( token );
|
||||
@ -3893,6 +4032,12 @@ namespace gen
|
||||
# define check( Type_ ) ( left && currtok.Type == Type_ )
|
||||
#pragma endregion Helper Macros
|
||||
|
||||
struct ParseContext
|
||||
{
|
||||
ParseContext* Parent;
|
||||
char const* Fn;
|
||||
};
|
||||
|
||||
internal Code parse_function_body ( Parser::TokArray& toks, char const* context );
|
||||
internal Code parse_global_nspace ( Parser::TokArray& toks, char const* context );
|
||||
|
||||
@ -5908,13 +6053,23 @@ namespace gen
|
||||
{
|
||||
using namespace Parser;
|
||||
|
||||
Token name = { nullptr, 0, TokType::Invalid };
|
||||
Code array_expr = { nullptr };
|
||||
CodeType type = { nullptr };
|
||||
Token name = { nullptr, 0, TokType::Invalid };
|
||||
Code array_expr = { nullptr };
|
||||
Code type = { nullptr };
|
||||
|
||||
eat( TokType::Decl_Typedef );
|
||||
|
||||
type = parse_type( toks, stringize(parse_typedef) );
|
||||
if ( check( TokType::Decl_Enum ) )
|
||||
type = parse_enum( toks, context );
|
||||
|
||||
else if ( check(TokType::Decl_Struct ) )
|
||||
type = parse_enum( toks, context );
|
||||
|
||||
else if ( check(TokType::Decl_Union) )
|
||||
type = parse_union( toks, context );
|
||||
|
||||
else
|
||||
type = parse_type( toks, context );
|
||||
|
||||
if ( ! check( TokType::Identifier ) )
|
||||
{
|
||||
@ -5925,7 +6080,7 @@ namespace gen
|
||||
name = currtok;
|
||||
eat( TokType::Identifier );
|
||||
|
||||
array_expr = parse_array_decl( toks, stringize(parse_typedef) );
|
||||
array_expr = parse_array_decl( toks, context );
|
||||
|
||||
eat( TokType::Statement_End );
|
||||
|
||||
@ -5938,8 +6093,8 @@ namespace gen
|
||||
|
||||
result->UnderlyingType = type;
|
||||
|
||||
if ( array_expr && array_expr->Type != Invalid )
|
||||
type->ArrExpr = array_expr;
|
||||
if ( type->Type == Typename && array_expr && array_expr->Type != Invalid )
|
||||
type.cast<CodeType>()->ArrExpr = array_expr;
|
||||
|
||||
return result;
|
||||
}
|
||||
@ -5985,7 +6140,7 @@ namespace gen
|
||||
|
||||
while ( ! check( TokType::BraceCurly_Close ) )
|
||||
{
|
||||
Code entry = parse_variable( toks, stringize(parse_union) );
|
||||
Code entry = parse_variable( toks, context );
|
||||
|
||||
if ( entry )
|
||||
body.append( entry );
|
||||
@ -6055,10 +6210,10 @@ namespace gen
|
||||
|
||||
eat( TokType::Operator );
|
||||
|
||||
type = parse_type( toks, stringize(parse_typedef) );
|
||||
type = parse_type( toks, context );
|
||||
}
|
||||
|
||||
array_expr = parse_array_decl( toks, stringize(parse_typedef) );
|
||||
array_expr = parse_array_decl( toks, context );
|
||||
|
||||
eat( TokType::Statement_End );
|
||||
|
||||
@ -6152,7 +6307,7 @@ namespace gen
|
||||
specifiers = def_specifiers( num_specifiers, specs_found );
|
||||
}
|
||||
|
||||
CodeType type = parse_type( toks, stringize(parse_variable) );
|
||||
CodeType type = parse_type( toks, context );
|
||||
|
||||
if ( type == Code::Invalid )
|
||||
return CodeInvalid;
|
||||
@ -6160,7 +6315,7 @@ namespace gen
|
||||
name = currtok;
|
||||
eat( TokType::Identifier );
|
||||
|
||||
CodeVar result = parse_variable_after_name( ModuleFlag::None, attributes, specifiers, type, name, toks, stringize(parse_variable) );
|
||||
CodeVar result = parse_variable_after_name( ModuleFlag::None, attributes, specifiers, type, name, toks, context );
|
||||
|
||||
return result;
|
||||
}
|
||||
@ -6365,7 +6520,7 @@ namespace gen
|
||||
return false;
|
||||
}
|
||||
|
||||
Buffer = String::make_reserve( Memory::GlobalAllocator, Builder_StrBufferReserve );
|
||||
Buffer = String::make_reserve( GlobalAllocator, Builder_StrBufferReserve );
|
||||
|
||||
return true;
|
||||
}
|
||||
@ -6394,4 +6549,3 @@ namespace gen
|
||||
}
|
||||
// End: gen_time
|
||||
#endif
|
||||
|
||||
|
@ -533,8 +533,7 @@ namespace gen
|
||||
template< class Type >
|
||||
Type cast()
|
||||
{
|
||||
AST* ast = this;
|
||||
return * rcast( Type*, & ast );
|
||||
return * this;
|
||||
}
|
||||
|
||||
operator Code();
|
||||
@ -669,7 +668,7 @@ namespace gen
|
||||
|
||||
// Used when it's desired to allow omission in a definition.
|
||||
#define NoCode { nullptr }
|
||||
#define CodeInvalid (* Code::Invalid.ast)
|
||||
#define CodeInvalid (* Code::Invalid.ast) // Uses an implicitly overloaded cast from the AST to the desired code type.
|
||||
|
||||
#pragma region Code Types
|
||||
#define Define_CodeType( Typename ) \
|
||||
@ -1243,7 +1242,7 @@ namespace gen
|
||||
{
|
||||
CodeAttributes Attributes;
|
||||
char _PAD_SPECS_ [ sizeof(AST*) ];
|
||||
CodeType UnderlyingType;
|
||||
Code UnderlyingType;
|
||||
char _PAD_PROPERTIES_[ sizeof(AST*) * 2 ];
|
||||
};
|
||||
};
|
||||
@ -1335,6 +1334,10 @@ namespace gen
|
||||
// However on Windows at least, it doesn't need to occur as the OS will clean up after the process.
|
||||
void deinit();
|
||||
|
||||
// Clears the allocations, but doesn't return the memory to the heap, then calls init() again.
|
||||
// Ease of use.
|
||||
void reset();
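A brief usage sketch (editorial, not part of the header) of how these entry points chain together; the generation steps in between are placeholders:

gen::init();   // builds the allocator buckets, code pools, string arenas, and string cache
// ... build and serialize code for one target ...
gen::reset();  // clears the pools and arenas in place so a second pass reuses the same memory
// ... build and serialize code for another target ...
gen::deinit(); // returns the pools, arenas, and allocator buckets to the heap
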
|
||||
|
||||
// Used internally to retrieve or make string allocations.
|
||||
// Strings are stored in a series of string arenas of fixed size (SizePer_StringArena)
|
||||
StringCached get_cached_string( StrC str );
|
||||
@ -1403,7 +1406,7 @@ namespace gen
|
||||
CodeTemplate def_template( CodeParam params, Code definition, ModuleFlag mflags = ModuleFlag::None );
|
||||
|
||||
CodeType def_type ( StrC name, Code arrayexpr = NoCode, CodeSpecifier specifiers = NoCode, CodeAttributes attributes = NoCode );
|
||||
CodeTypedef def_typedef( StrC name, CodeType type, CodeAttributes attributes = NoCode, ModuleFlag mflags = ModuleFlag::None );
|
||||
CodeTypedef def_typedef( StrC name, Code type, CodeAttributes attributes = NoCode, ModuleFlag mflags = ModuleFlag::None );
|
||||
|
||||
CodeUnion def_union( StrC name, Code body, CodeAttributes attributes = NoCode, ModuleFlag mflags = ModuleFlag::None );
|
||||
|
||||
@ -1655,6 +1658,9 @@ namespace gen
|
||||
constexpr s32 InitSize_DataArrays = 16;
|
||||
constexpr s32 InitSize_StringTable = megabytes(4);
|
||||
|
||||
// NOTE: This limits the maximum size of an allocation
|
||||
// If you are generating a string larger than this, increase the size of the bucket here.
|
||||
constexpr uw Global_BucketSize = megabytes(10);
|
||||
constexpr s32 CodePool_NumBlocks = kilobytes(4);
|
||||
constexpr s32 SizePer_StringArena = megabytes(1);
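An illustrative tweak (editorial, not part of the diff) for the note above; the 64 MB figure is arbitrary and only shows which constant to raise when a single generated string can exceed the bucket size:

// Change the existing constant before building, e.g. if one serialized string may exceed 10 MB:
constexpr uw Global_BucketSize = megabytes(64);
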
|
||||
|
||||
@ -1920,7 +1926,6 @@ namespace gen
|
||||
Define_AST_Cast( Var );
|
||||
#undef Define_AST_Cast
|
||||
|
||||
|
||||
#define Define_CodeCast( type ) \
|
||||
Code::operator Code ## type() const \
|
||||
{ \
|
||||
@ -1999,7 +2004,7 @@ namespace gen
|
||||
CodeParam& CodeParam::operator ++()
|
||||
{
|
||||
ast = ast->Next.ast;
|
||||
return *this;
|
||||
return * this;
|
||||
}
|
||||
|
||||
CodeBody def_body( CodeT type )
|
||||
@ -2053,27 +2058,21 @@ namespace gen
|
||||
#ifdef GEN_EXPOSE_BACKEND
|
||||
namespace gen
|
||||
{
|
||||
namespace Memory
|
||||
{
|
||||
extern Array<Arena> Global_AllocatorBuckets;
|
||||
}
|
||||
// Global allocator used for data with process lifetime.
|
||||
extern AllocatorInfo GlobalAllocator;
|
||||
extern Array< Arena > Global_AllocatorBuckets;
|
||||
extern Array< Pool > CodePools;
|
||||
extern Array< Arena > StringArenas;
|
||||
|
||||
namespace StaticData
|
||||
{
|
||||
extern Array< Pool > CodePools;
|
||||
extern Array< Arena > StringArenas;
|
||||
extern StringTable StringCache;
|
||||
|
||||
extern StringTable StringCache;
|
||||
extern Arena LexArena;
|
||||
|
||||
extern Arena LexArena;
|
||||
|
||||
extern AllocatorInfo Allocator_DataArrays;
|
||||
extern AllocatorInfo Allocator_CodePool;
|
||||
extern AllocatorInfo Allocator_Lexer;
|
||||
extern AllocatorInfo Allocator_StringArena;
|
||||
extern AllocatorInfo Allocator_StringTable;
|
||||
extern AllocatorInfo Allocator_TypeTable;
|
||||
}
|
||||
extern AllocatorInfo Allocator_DataArrays;
|
||||
extern AllocatorInfo Allocator_CodePool;
|
||||
extern AllocatorInfo Allocator_Lexer;
|
||||
extern AllocatorInfo Allocator_StringArena;
|
||||
extern AllocatorInfo Allocator_StringTable;
|
||||
extern AllocatorInfo Allocator_TypeTable;
|
||||
}
|
||||
|
||||
#endif
|
||||
#endif
|
||||
|
@ -225,372 +225,6 @@ namespace gen
|
||||
}
|
||||
#pragma endregion String Ops
|
||||
|
||||
#pragma region Memory
|
||||
void* mem_copy( void* dest, void const* source, sw n )
|
||||
{
|
||||
if ( dest == NULL )
|
||||
{
|
||||
return NULL;
|
||||
}
|
||||
|
||||
return memcpy( dest, source, n );
|
||||
}
|
||||
|
||||
void const* mem_find( void const* data, u8 c, sw n )
|
||||
{
|
||||
u8 const* s = zpl_cast( u8 const* ) data;
|
||||
while ( ( zpl_cast( uptr ) s & ( sizeof( uw ) - 1 ) ) && n && *s != c )
|
||||
{
|
||||
s++;
|
||||
n--;
|
||||
}
|
||||
if ( n && *s != c )
|
||||
{
|
||||
sw const* w;
|
||||
sw k = GEN__ONES * c;
|
||||
w = zpl_cast( sw const* ) s;
|
||||
while ( n >= size_of( sw ) && ! GEN__HAS_ZERO( *w ^ k ) )
|
||||
{
|
||||
w++;
|
||||
n -= size_of( sw );
|
||||
}
|
||||
s = zpl_cast( u8 const* ) w;
|
||||
while ( n && *s != c )
|
||||
{
|
||||
s++;
|
||||
n--;
|
||||
}
|
||||
}
|
||||
|
||||
return n ? zpl_cast( void const* ) s : NULL;
|
||||
}
|
||||
|
||||
#define GEN_HEAP_STATS_MAGIC 0xDEADC0DE
|
||||
|
||||
struct _heap_stats
|
||||
{
|
||||
u32 magic;
|
||||
sw used_memory;
|
||||
sw alloc_count;
|
||||
};
|
||||
|
||||
global _heap_stats _heap_stats_info;
|
||||
|
||||
void heap_stats_init( void )
|
||||
{
|
||||
zero_item( &_heap_stats_info );
|
||||
_heap_stats_info.magic = GEN_HEAP_STATS_MAGIC;
|
||||
}
|
||||
|
||||
sw heap_stats_used_memory( void )
|
||||
{
|
||||
GEN_ASSERT_MSG( _heap_stats_info.magic == GEN_HEAP_STATS_MAGIC, "heap_stats is not initialised yet, call heap_stats_init first!" );
|
||||
return _heap_stats_info.used_memory;
|
||||
}
|
||||
|
||||
sw heap_stats_alloc_count( void )
|
||||
{
|
||||
GEN_ASSERT_MSG( _heap_stats_info.magic == GEN_HEAP_STATS_MAGIC, "heap_stats is not initialised yet, call heap_stats_init first!" );
|
||||
return _heap_stats_info.alloc_count;
|
||||
}
|
||||
|
||||
void heap_stats_check( void )
|
||||
{
|
||||
GEN_ASSERT_MSG( _heap_stats_info.magic == GEN_HEAP_STATS_MAGIC, "heap_stats is not initialised yet, call heap_stats_init first!" );
|
||||
GEN_ASSERT( _heap_stats_info.used_memory == 0 );
|
||||
GEN_ASSERT( _heap_stats_info.alloc_count == 0 );
|
||||
}
|
||||
|
||||
struct _heap_alloc_info
|
||||
{
|
||||
sw size;
|
||||
void* physical_start;
|
||||
};
|
||||
|
||||
void* heap_allocator_proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
void* ptr = NULL;
|
||||
// unused( allocator_data );
|
||||
// unused( old_size );
|
||||
if ( ! alignment )
|
||||
alignment = GEN_DEFAULT_MEMORY_ALIGNMENT;
|
||||
|
||||
#ifdef GEN_HEAP_ANALYSIS
|
||||
sw alloc_info_size = size_of( _heap_alloc_info );
|
||||
sw alloc_info_remainder = ( alloc_info_size % alignment );
|
||||
sw track_size = max( alloc_info_size, alignment ) + alloc_info_remainder;
|
||||
switch ( type )
|
||||
{
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
if ( ! old_memory )
|
||||
break;
|
||||
_heap_alloc_info* alloc_info = zpl_cast( _heap_alloc_info* ) old_memory - 1;
|
||||
_heap_stats_info.used_memory -= alloc_info->size;
|
||||
_heap_stats_info.alloc_count--;
|
||||
old_memory = alloc_info->physical_start;
|
||||
}
|
||||
break;
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
size += track_size;
|
||||
}
|
||||
break;
|
||||
default :
|
||||
break;
|
||||
}
|
||||
#endif
|
||||
|
||||
switch ( type )
|
||||
{
|
||||
#if defined( GEN_COMPILER_MSVC ) || ( defined( GEN_COMPILER_GCC ) && defined( GEN_SYSTEM_WINDOWS ) ) || ( defined( GEN_COMPILER_TINYC ) && defined( GEN_SYSTEM_WINDOWS ) )
|
||||
case EAllocation_ALLOC :
|
||||
ptr = _aligned_malloc( size, alignment );
|
||||
if ( flags & ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
zero_size( ptr, size );
|
||||
break;
|
||||
case EAllocation_FREE :
|
||||
_aligned_free( old_memory );
|
||||
break;
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
AllocatorInfo a = heap();
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
|
||||
#elif defined( GEN_SYSTEM_LINUX ) && ! defined( GEN_CPU_ARM ) && ! defined( GEN_COMPILER_TINYC )
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
ptr = aligned_alloc( alignment, ( size + alignment - 1 ) & ~( alignment - 1 ) );
|
||||
|
||||
if ( flags & GEN_ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
{
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
free( old_memory );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
AllocatorInfo a = heap();
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
#else
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
posix_memalign( &ptr, alignment, size );
|
||||
|
||||
if ( flags & GEN_ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
{
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
free( old_memory );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
AllocatorInfo a = heap();
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
#endif
|
||||
|
||||
case EAllocation_FREE_ALL :
|
||||
break;
|
||||
}
|
||||
|
||||
#ifdef GEN_HEAP_ANALYSIS
|
||||
if ( type == EAllocation_ALLOC )
|
||||
{
|
||||
_heap_alloc_info* alloc_info = zpl_cast( _heap_alloc_info* )( zpl_cast( char* ) ptr + alloc_info_remainder );
|
||||
zero_item( alloc_info );
|
||||
alloc_info->size = size - track_size;
|
||||
alloc_info->physical_start = ptr;
|
||||
ptr = zpl_cast( void* )( alloc_info + 1 );
|
||||
_heap_stats_info.used_memory += alloc_info->size;
|
||||
_heap_stats_info.alloc_count++;
|
||||
}
|
||||
#endif
|
||||
|
||||
return ptr;
|
||||
}
|
||||
|
||||
void* Arena::allocator_proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
Arena* arena = rcast(Arena*, allocator_data);
|
||||
void* ptr = NULL;
|
||||
|
||||
// unused( old_size );
|
||||
|
||||
switch ( type )
|
||||
{
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
void* end = pointer_add( arena->PhysicalStart, arena->TotalUsed );
|
||||
sw total_size = align_forward_i64( size, alignment );
|
||||
|
||||
// NOTE: Out of memory
|
||||
if ( arena->TotalUsed + total_size > (sw) arena->TotalSize )
|
||||
{
|
||||
// zpl__printf_err("%s", "Arena out of memory\n");
|
||||
fatal("Arena out of memory! (Possibly could not fit for the largest size Arena!!)");
|
||||
return nullptr;
|
||||
}
|
||||
|
||||
ptr = align_forward( end, alignment );
|
||||
arena->TotalUsed += total_size;
|
||||
|
||||
if ( flags & ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
// NOTE: Free all at once
|
||||
// Use Temp_Arena_Memory if you want to free a block
|
||||
break;
|
||||
|
||||
case EAllocation_FREE_ALL :
|
||||
arena->TotalUsed = 0;
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
// TODO : Check if ptr is on top of stack and just extend
|
||||
AllocatorInfo a = arena->Backing;
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
}
|
||||
return ptr;
|
||||
}
|
||||
|
||||
void* Pool::allocator_proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
Pool* pool = zpl_cast( Pool* ) allocator_data;
|
||||
void* ptr = NULL;
|
||||
|
||||
// unused( old_size );
|
||||
|
||||
switch ( type )
|
||||
{
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
uptr next_free;
|
||||
|
||||
GEN_ASSERT( size == pool->BlockSize );
|
||||
GEN_ASSERT( alignment == pool->BlockAlign );
|
||||
GEN_ASSERT( pool->FreeList != NULL );
|
||||
|
||||
next_free = *zpl_cast( uptr* ) pool->FreeList;
|
||||
ptr = pool->FreeList;
|
||||
pool->FreeList = zpl_cast( void* ) next_free;
|
||||
pool->TotalSize += pool->BlockSize;
|
||||
|
||||
if ( flags & ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
uptr* next;
|
||||
if ( old_memory == NULL )
|
||||
return NULL;
|
||||
|
||||
next = zpl_cast( uptr* ) old_memory;
|
||||
*next = zpl_cast( uptr ) pool->FreeList;
|
||||
pool->FreeList = old_memory;
|
||||
pool->TotalSize -= pool->BlockSize;
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE_ALL :
|
||||
{
|
||||
sw actual_block_size, block_index;
|
||||
void* curr;
|
||||
uptr* end;
|
||||
|
||||
actual_block_size = pool->BlockSize + pool->BlockAlign;
|
||||
pool->TotalSize = 0;
|
||||
|
||||
// NOTE: Init intrusive freelist
|
||||
curr = pool->PhysicalStart;
|
||||
for ( block_index = 0; block_index < pool->NumBlocks - 1; block_index++ )
|
||||
{
|
||||
uptr* next = zpl_cast( uptr* ) curr;
|
||||
*next = zpl_cast( uptr ) curr + actual_block_size;
|
||||
curr = pointer_add( curr, actual_block_size );
|
||||
}
|
||||
|
||||
end = zpl_cast( uptr* ) curr;
|
||||
*end = zpl_cast( uptr ) NULL;
|
||||
pool->FreeList = pool->PhysicalStart;
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
// NOTE: Cannot resize
|
||||
GEN_PANIC( "You cannot resize something allocated with a pool." );
|
||||
break;
|
||||
}
|
||||
|
||||
return ptr;
|
||||
}
|
||||
|
||||
Pool Pool::init_align( AllocatorInfo backing, sw num_blocks, sw block_size, sw block_align )
|
||||
{
|
||||
Pool pool = {};
|
||||
|
||||
sw actual_block_size, pool_size, block_index;
|
||||
void *data, *curr;
|
||||
uptr* end;
|
||||
|
||||
zero_item( &pool );
|
||||
|
||||
pool.Backing = backing;
|
||||
pool.BlockSize = block_size;
|
||||
pool.BlockAlign = block_align;
|
||||
pool.NumBlocks = num_blocks;
|
||||
|
||||
actual_block_size = block_size + block_align;
|
||||
pool_size = num_blocks * actual_block_size;
|
||||
|
||||
data = alloc_align( backing, pool_size, block_align );
|
||||
|
||||
// NOTE: Init intrusive freelist
|
||||
curr = data;
|
||||
for ( block_index = 0; block_index < num_blocks - 1; block_index++ )
|
||||
{
|
||||
uptr* next = ( uptr* ) curr;
|
||||
*next = ( uptr ) curr + actual_block_size;
|
||||
curr = pointer_add( curr, actual_block_size );
|
||||
}
|
||||
|
||||
end = ( uptr* ) curr;
|
||||
*end = ( uptr ) NULL;
|
||||
|
||||
pool.PhysicalStart = data;
|
||||
pool.FreeList = data;
|
||||
|
||||
return pool;
|
||||
}
|
||||
#pragma endregion Memory
|
||||
|
||||
#pragma region Printing
|
||||
enum
|
||||
{
|
||||
@ -1144,6 +778,402 @@ namespace gen
|
||||
}
|
||||
#pragma endregion Printing
|
||||
|
||||
#pragma region Memory
|
||||
void* mem_copy( void* dest, void const* source, sw n )
|
||||
{
|
||||
if ( dest == NULL )
|
||||
{
|
||||
return NULL;
|
||||
}
|
||||
|
||||
return memcpy( dest, source, n );
|
||||
}
|
||||
|
||||
void const* mem_find( void const* data, u8 c, sw n )
|
||||
{
|
||||
u8 const* s = zpl_cast( u8 const* ) data;
|
||||
while ( ( zpl_cast( uptr ) s & ( sizeof( uw ) - 1 ) ) && n && *s != c )
|
||||
{
|
||||
s++;
|
||||
n--;
|
||||
}
|
||||
if ( n && *s != c )
|
||||
{
|
||||
sw const* w;
|
||||
sw k = GEN__ONES * c;
|
||||
w = zpl_cast( sw const* ) s;
|
||||
while ( n >= size_of( sw ) && ! GEN__HAS_ZERO( *w ^ k ) )
|
||||
{
|
||||
w++;
|
||||
n -= size_of( sw );
|
||||
}
|
||||
s = zpl_cast( u8 const* ) w;
|
||||
while ( n && *s != c )
|
||||
{
|
||||
s++;
|
||||
n--;
|
||||
}
|
||||
}
|
||||
|
||||
return n ? zpl_cast( void const* ) s : NULL;
|
||||
}
|
||||
|
||||
#define GEN_HEAP_STATS_MAGIC 0xDEADC0DE
|
||||
|
||||
struct _heap_stats
|
||||
{
|
||||
u32 magic;
|
||||
sw used_memory;
|
||||
sw alloc_count;
|
||||
};
|
||||
|
||||
global _heap_stats _heap_stats_info;
|
||||
|
||||
void heap_stats_init( void )
|
||||
{
|
||||
zero_item( &_heap_stats_info );
|
||||
_heap_stats_info.magic = GEN_HEAP_STATS_MAGIC;
|
||||
}
|
||||
|
||||
sw heap_stats_used_memory( void )
|
||||
{
|
||||
GEN_ASSERT_MSG( _heap_stats_info.magic == GEN_HEAP_STATS_MAGIC, "heap_stats is not initialised yet, call heap_stats_init first!" );
|
||||
return _heap_stats_info.used_memory;
|
||||
}
|
||||
|
||||
sw heap_stats_alloc_count( void )
|
||||
{
|
||||
GEN_ASSERT_MSG( _heap_stats_info.magic == GEN_HEAP_STATS_MAGIC, "heap_stats is not initialised yet, call heap_stats_init first!" );
|
||||
return _heap_stats_info.alloc_count;
|
||||
}
|
||||
|
||||
void heap_stats_check( void )
|
||||
{
|
||||
GEN_ASSERT_MSG( _heap_stats_info.magic == GEN_HEAP_STATS_MAGIC, "heap_stats is not initialised yet, call heap_stats_init first!" );
|
||||
GEN_ASSERT( _heap_stats_info.used_memory == 0 );
|
||||
GEN_ASSERT( _heap_stats_info.alloc_count == 0 );
|
||||
}
|
||||
|
||||
struct _heap_alloc_info
|
||||
{
|
||||
sw size;
|
||||
void* physical_start;
|
||||
};
|
||||
|
||||
void* heap_allocator_proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
void* ptr = NULL;
|
||||
// unused( allocator_data );
|
||||
// unused( old_size );
|
||||
if ( ! alignment )
|
||||
alignment = GEN_DEFAULT_MEMORY_ALIGNMENT;
|
||||
|
||||
#ifdef GEN_HEAP_ANALYSIS
|
||||
sw alloc_info_size = size_of( _heap_alloc_info );
|
||||
sw alloc_info_remainder = ( alloc_info_size % alignment );
|
||||
sw track_size = max( alloc_info_size, alignment ) + alloc_info_remainder;
|
||||
switch ( type )
|
||||
{
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
if ( ! old_memory )
|
||||
break;
|
||||
_heap_alloc_info* alloc_info = zpl_cast( _heap_alloc_info* ) old_memory - 1;
|
||||
_heap_stats_info.used_memory -= alloc_info->size;
|
||||
_heap_stats_info.alloc_count--;
|
||||
old_memory = alloc_info->physical_start;
|
||||
}
|
||||
break;
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
size += track_size;
|
||||
}
|
||||
break;
|
||||
default :
|
||||
break;
|
||||
}
|
||||
#endif
|
||||
|
||||
switch ( type )
|
||||
{
|
||||
#if defined( GEN_COMPILER_MSVC ) || ( defined( GEN_COMPILER_GCC ) && defined( GEN_SYSTEM_WINDOWS ) ) || ( defined( GEN_COMPILER_TINYC ) && defined( GEN_SYSTEM_WINDOWS ) )
|
||||
case EAllocation_ALLOC :
|
||||
ptr = _aligned_malloc( size, alignment );
|
||||
if ( flags & ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
zero_size( ptr, size );
|
||||
break;
|
||||
case EAllocation_FREE :
|
||||
_aligned_free( old_memory );
|
||||
break;
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
AllocatorInfo a = heap();
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
|
||||
#elif defined( GEN_SYSTEM_LINUX ) && ! defined( GEN_CPU_ARM ) && ! defined( GEN_COMPILER_TINYC )
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
ptr = aligned_alloc( alignment, ( size + alignment - 1 ) & ~( alignment - 1 ) );
|
||||
|
||||
if ( flags & GEN_ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
{
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
free( old_memory );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
AllocatorInfo a = heap();
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
#else
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
posix_memalign( &ptr, alignment, size );
|
||||
|
||||
if ( flags & GEN_ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
{
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
free( old_memory );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
AllocatorInfo a = heap();
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
#endif
|
||||
|
||||
case EAllocation_FREE_ALL :
|
||||
break;
|
||||
}
|
||||
|
||||
#ifdef GEN_HEAP_ANALYSIS
|
||||
if ( type == EAllocation_ALLOC )
|
||||
{
|
||||
_heap_alloc_info* alloc_info = zpl_cast( _heap_alloc_info* )( zpl_cast( char* ) ptr + alloc_info_remainder );
|
||||
zero_item( alloc_info );
|
||||
alloc_info->size = size - track_size;
|
||||
alloc_info->physical_start = ptr;
|
||||
ptr = zpl_cast( void* )( alloc_info + 1 );
|
||||
_heap_stats_info.used_memory += alloc_info->size;
|
||||
_heap_stats_info.alloc_count++;
|
||||
}
|
||||
#endif
|
||||
|
||||
return ptr;
|
||||
}
|
||||
|
||||
void* Arena::allocator_proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
Arena* arena = rcast(Arena*, allocator_data);
|
||||
void* ptr = NULL;
|
||||
|
||||
// unused( old_size );
|
||||
|
||||
switch ( type )
|
||||
{
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
void* end = pointer_add( arena->PhysicalStart, arena->TotalUsed );
|
||||
sw total_size = align_forward_i64( size, alignment );
|
||||
|
||||
// NOTE: Out of memory
|
||||
if ( arena->TotalUsed + total_size > (sw) arena->TotalSize )
|
||||
{
|
||||
// zpl__printf_err("%s", "Arena out of memory\n");
|
||||
fatal("Arena out of memory! (Possibly could not fit for the largest size Arena!!)");
|
||||
return nullptr;
|
||||
}
|
||||
|
||||
ptr = align_forward( end, alignment );
|
||||
arena->TotalUsed += total_size;
|
||||
|
||||
if ( flags & ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
// NOTE: Free all at once
|
||||
// Use Temp_Arena_Memory if you want to free a block
|
||||
break;
|
||||
|
||||
case EAllocation_FREE_ALL :
|
||||
arena->TotalUsed = 0;
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
{
|
||||
// TODO : Check if ptr is on top of stack and just extend
|
||||
AllocatorInfo a = arena->Backing;
|
||||
ptr = default_resize_align( a, old_memory, old_size, size, alignment );
|
||||
}
|
||||
break;
|
||||
}
|
||||
return ptr;
|
||||
}
|
||||
|
||||
void* Pool::allocator_proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
|
||||
{
|
||||
Pool* pool = zpl_cast( Pool* ) allocator_data;
|
||||
void* ptr = NULL;
|
||||
|
||||
// unused( old_size );
|
||||
|
||||
switch ( type )
|
||||
{
|
||||
case EAllocation_ALLOC :
|
||||
{
|
||||
uptr next_free;
|
||||
|
||||
GEN_ASSERT( size == pool->BlockSize );
|
||||
GEN_ASSERT( alignment == pool->BlockAlign );
|
||||
GEN_ASSERT( pool->FreeList != NULL );
|
||||
|
||||
next_free = *zpl_cast( uptr* ) pool->FreeList;
|
||||
ptr = pool->FreeList;
|
||||
pool->FreeList = zpl_cast( void* ) next_free;
|
||||
pool->TotalSize += pool->BlockSize;
|
||||
|
||||
if ( flags & ALLOCATOR_FLAG_CLEAR_TO_ZERO )
|
||||
zero_size( ptr, size );
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE :
|
||||
{
|
||||
uptr* next;
|
||||
if ( old_memory == NULL )
|
||||
return NULL;
|
||||
|
||||
next = zpl_cast( uptr* ) old_memory;
|
||||
*next = zpl_cast( uptr ) pool->FreeList;
|
||||
pool->FreeList = old_memory;
|
||||
pool->TotalSize -= pool->BlockSize;
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_FREE_ALL :
|
||||
{
|
||||
sw actual_block_size, block_index;
|
||||
void* curr;
|
||||
uptr* end;
|
||||
|
||||
actual_block_size = pool->BlockSize + pool->BlockAlign;
|
||||
pool->TotalSize = 0;
|
||||
|
||||
// NOTE: Init intrusive freelist
|
||||
curr = pool->PhysicalStart;
|
||||
for ( block_index = 0; block_index < pool->NumBlocks - 1; block_index++ )
|
||||
{
|
||||
uptr* next = zpl_cast( uptr* ) curr;
|
||||
*next = zpl_cast( uptr ) curr + actual_block_size;
|
||||
curr = pointer_add( curr, actual_block_size );
|
||||
}
|
||||
|
||||
end = zpl_cast( uptr* ) curr;
|
||||
*end = zpl_cast( uptr ) NULL;
|
||||
pool->FreeList = pool->PhysicalStart;
|
||||
}
|
||||
break;
|
||||
|
||||
case EAllocation_RESIZE :
|
||||
// NOTE: Cannot resize
|
||||
GEN_PANIC( "You cannot resize something allocated with a pool." );
|
||||
break;
|
||||
}
|
||||
|
||||
return ptr;
|
||||
}
|
||||
|
||||
Pool Pool::init_align( AllocatorInfo backing, sw num_blocks, sw block_size, sw block_align )
|
||||
{
|
||||
Pool pool = {};
|
||||
|
||||
sw actual_block_size, pool_size, block_index;
|
||||
void *data, *curr;
|
||||
uptr* end;
|
||||
|
||||
zero_item( &pool );
|
||||
|
||||
pool.Backing = backing;
|
||||
pool.BlockSize = block_size;
|
||||
pool.BlockAlign = block_align;
|
||||
pool.NumBlocks = num_blocks;
|
||||
|
||||
actual_block_size = block_size + block_align;
|
||||
pool_size = num_blocks * actual_block_size;
|
||||
|
||||
data = alloc_align( backing, pool_size, block_align );
|
||||
|
||||
// NOTE: Init intrusive freelist
|
||||
curr = data;
|
||||
for ( block_index = 0; block_index < num_blocks - 1; block_index++ )
|
||||
{
|
||||
uptr* next = ( uptr* ) curr;
|
||||
*next = ( uptr ) curr + actual_block_size;
|
||||
curr = pointer_add( curr, actual_block_size );
|
||||
}
|
||||
|
||||
end = ( uptr* ) curr;
|
||||
*end = ( uptr ) NULL;
|
||||
|
||||
pool.PhysicalStart = data;
|
||||
pool.FreeList = data;
|
||||
|
||||
return pool;
|
||||
}
|
||||
|
||||
void Pool::clear()
|
||||
{
|
||||
sw actual_block_size, block_index;
|
||||
void* curr;
|
||||
uptr* end;
|
||||
|
||||
actual_block_size = BlockSize + BlockAlign;
|
||||
|
||||
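// NOTE: Re-init the intrusive freelist, the same walk Pool::init_align does at construction.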
curr = PhysicalStart;
|
||||
for ( block_index = 0; block_index < NumBlocks - 1; block_index++ )
|
||||
{
|
||||
uptr* next = ( uptr* ) curr;
|
||||
*next = ( uptr ) curr + actual_block_size;
|
||||
curr = pointer_add( curr, actual_block_size );
|
||||
}
|
||||
|
||||
end = ( uptr* ) curr;
|
||||
*end = ( uptr ) NULL;
|
||||
|
||||
FreeList = PhysicalStart;
|
||||
}
|
||||
#pragma endregion Memory
|
||||
|
||||
#pragma region ADT
|
||||
|
||||
#pragma endregion ADT
|
||||
|
||||
#pragma region CSV
|
||||
|
||||
#pragma endregion CSV
|
||||
|
||||
#pragma region Hashing
|
||||
global u32 const _crc32_table[ 256 ] = {
|
||||
0x00000000, 0x77073096, 0xee0e612c, 0x990951ba, 0x076dc419, 0x706af48f, 0xe963a535, 0x9e6495a3, 0x0edb8832, 0x79dcb8a4, 0xe0d5e91e, 0x97d2d988, 0x09b64c2b, 0x7eb17cbd,
|
||||
@ -1840,107 +1870,6 @@ namespace gen
#pragma endregion Timing
#endif

namespace Memory
{
    global AllocatorInfo GlobalAllocator;
    global Array<Arena> Global_AllocatorBuckets;

    void* Global_Allocator_Proc( void* allocator_data, AllocType type, sw size, sw alignment, void* old_memory, sw old_size, u64 flags )
    {
        Arena& last = Global_AllocatorBuckets.back();

        switch ( type )
        {
            case EAllocation_ALLOC:
            {
                if ( last.TotalUsed + size > last.TotalSize )
                {
                    Arena bucket = Arena::init_from_allocator( heap(), Global_BucketSize );

                    if ( bucket.PhysicalStart == nullptr )
                        fatal( "Failed to create bucket for Global_AllocatorBuckets");

                    if ( ! Global_AllocatorBuckets.append( bucket ) )
                        fatal( "Failed to append bucket to Global_AllocatorBuckets");

                    last = Global_AllocatorBuckets.back();
                }

                return alloc_align( last, size, alignment );
            }
            case EAllocation_FREE:
            {
                // Doesn't recycle.
            }
            break;
            case EAllocation_FREE_ALL:
            {
                // Memory::cleanup instead.
            }
            break;
            case EAllocation_RESIZE:
            {
                if ( last.TotalUsed + size > last.TotalSize )
                {
                    Arena bucket = Arena::init_from_allocator( heap(), Global_BucketSize );

                    if ( bucket.PhysicalStart == nullptr )
                        fatal( "Failed to create bucket for Global_AllocatorBuckets");

                    if ( ! Global_AllocatorBuckets.append( bucket ) )
                        fatal( "Failed to append bucket to Global_AllocatorBuckets");

                    last = Global_AllocatorBuckets.back();
                }

                void* result = alloc_align( last.Backing, size, alignment );

                if ( result != nullptr && old_memory != nullptr )
                {
                    mem_copy( result, old_memory, old_size );
                }

                return result;
            }
        }

        return nullptr;
    }

    void setup()
    {
        GlobalAllocator = AllocatorInfo { & Global_Allocator_Proc, nullptr };

        Global_AllocatorBuckets = Array<Arena>::init_reserve( heap(), 128 );

        if ( Global_AllocatorBuckets == nullptr )
            fatal( "Failed to reserve memory for Global_AllocatorBuckets");

        Arena bucket = Arena::init_from_allocator( heap(), Global_BucketSize );

        if ( bucket.PhysicalStart == nullptr )
            fatal( "Failed to create first bucket for Global_AllocatorBuckets");

        Global_AllocatorBuckets.append( bucket );
    }

    void cleanup()
    {
        s32 index = 0;
        s32 left = Global_AllocatorBuckets.num();
        do
        {
            Arena* bucket = & Global_AllocatorBuckets[ index ];
            bucket->free();
            index++;
        }
        while ( left--, left );

        Global_AllocatorBuckets.free();
    }
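
    // Usage sketch (illustrative): the global allocator is bucket-based and only grows;
    // EAllocation_FREE is a no-op and everything is released at once by cleanup().
    //
    //     Memory::setup();
    //     Array<CodeTypedef> typedefs = Array<CodeTypedef>::init_reserve( Memory::GlobalAllocator, 64 );
    //     String name = String::fmt_buf( Memory::GlobalAllocator, "type_%d", 0 );
    //     /* ... generate and serialize code ... */
    //     Memory::cleanup(); // frees every bucket in one pass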

// namespace Memory
}
// namespace gen
}

@@ -147,14 +147,14 @@ namespace gen

// Bits

#define bit( Value_ ) ( 1 << Value_ )
#define bitfield_is_equal( Type_, Field_, Mask_ ) ( (Type_(Mask_) & Type_(Field_)) == Type_(Mask_) )
#define bit( Value ) ( 1 << Value )
#define bitfield_is_equal( Type, Field, Mask ) ( (Type(Mask) & Type(Field)) == Type(Mask) )

// Casting
#define ccast( Type_, Value_ ) * const_cast< Type_* >( & (Value_) )
#define pcast( Type_, Value_ ) ( * (Type_*)( & (Value_) ) )
#define rcast( Type_, Value_ ) reinterpret_cast< Type_ >( Value_ )
#define scast( Type_, Value_ ) static_cast< Type_ >( Value_ )
#define ccast( Type, Value ) ( * const_cast< Type* >( & (Value) ) )
#define pcast( Type, Value ) ( * reinterpret_cast< Type* >( & ( Value ) ) )
#define rcast( Type, Value ) reinterpret_cast< Type >( Value )
#define scast( Type, Value ) static_cast< Type >( Value )
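
// Usage sketch (illustrative): the renamed macros wrap the underlying C++ casts directly.
//
//     u32 bits   = 0x3f800000;
//     f32 as_f32 = pcast( f32, bits );    // type-pun: reinterpret the storage of `bits` as a f32
//     s64 wide   = scast( s64, bits );    // plain static_cast
//     u8* raw    = rcast( u8*, & bits );  // plain reinterpret_cast
//     b32 is_set = bitfield_is_equal( u32, bits, bit( 23 ) );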

// Num Arguments (Variadics)
#if defined(__GNUC__) || defined(__clang__)
@@ -194,8 +194,8 @@ namespace gen
#endif

// Stringizing
#define stringize_va( ... ) #__VA_ARGS__
#define stringize( ... ) stringize_va( __VA_ARGS__ )
#define stringize_va( ... ) #__VA_ARGS__
#define stringize( ... ) stringize_va( __VA_ARGS__ )

// Function do once

@@ -227,13 +227,21 @@ namespace gen
#define is_between( x, lower, upper ) ( ( ( lower ) <= ( x ) ) && ( ( x ) <= ( upper ) ) )
#define min( a, b ) ( ( a ) < ( b ) ? ( a ) : ( b ) )
#define size_of( x ) ( sw )( sizeof( x ) )
#define swap( Type, a, b ) \
do \
{ \
    Type tmp = ( a ); \
    ( a ) = ( b ); \
    ( b ) = tmp; \
} while ( 0 )
// #define swap( Type, a, b ) \
// do \
// { \
//     Type tmp = ( a ); \
//     ( a ) = ( b ); \
//     ( b ) = tmp; \
// } while ( 0 )

template< class Type >
void swap( Type& a, Type& b )
{
    // The parameters must be references; taking them by value would only swap local copies.
    Type tmp = a;
    a = b;
    b = tmp;
}
#pragma endregion Macros

#pragma region Basic Types
@@ -834,6 +842,8 @@ namespace gen
    static
    Pool init_align( AllocatorInfo backing, sw num_blocks, sw block_size, sw block_align );

    void clear();

    void free()
    {
        if ( Backing.Proc )
@@ -1054,7 +1064,7 @@ namespace gen
    len /= 2;
    while ( len-- )
    {
        swap( char, *a, *b );
        swap( *a, *b );
        a++, b--;
    }
    return str;
@@ -1083,6 +1093,64 @@ namespace gen
}
#pragma endregion String Ops

#pragma region Printing
struct FileInfo;

#ifndef GEN_PRINTF_MAXLEN
# define GEN_PRINTF_MAXLEN 65536
#endif

// NOTE: A locally persisting buffer is used internally
char* str_fmt_buf( char const* fmt, ... );
char* str_fmt_buf_va( char const* fmt, va_list va );
sw str_fmt_va( char* str, sw n, char const* fmt, va_list va );
sw str_fmt_out_va( char const* fmt, va_list va );
sw str_fmt_out_err( char const* fmt, ... );
sw str_fmt_out_err_va( char const* fmt, va_list va );
sw str_fmt_file_va( FileInfo* f, char const* fmt, va_list va );

constexpr
char const* Msg_Invalid_Value = "INVALID VALUE PROVIDED";

inline
sw log_fmt(char const* fmt, ...)
{
    sw res;
    va_list va;

    va_start(va, fmt);
    res = str_fmt_out_va(fmt, va);
    va_end(va);

    return res;
}

inline
sw fatal(char const* fmt, ...)
{
    local_persist thread_local
    char buf[GEN_PRINTF_MAXLEN] = { 0 };

    va_list va;

#if Build_Debug
    va_start(va, fmt);
    str_fmt_va(buf, GEN_PRINTF_MAXLEN, fmt, va);
    va_end(va);

    assert_crash(buf);
    return -1;
#else
    va_start(va, fmt);
    str_fmt_out_err_va( fmt, va);
    va_end(va);

    exit(1);
    return -1;
#endif
}
#pragma endregion Printing
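
// Usage sketch (illustrative): log_fmt forwards to str_fmt_out_va, so it takes printf-style
// format strings; fatal prints (or asserts in debug builds) and does not return normally.
//
//     log_fmt( "Generated %d typedefs\n", count );            // `count` is a stand-in variable
//     if ( file == nullptr )
//         fatal( "Failed to open %s for writing\n", path );   // `file` and `path` are stand-ins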

#pragma region Containers
template<class Type>
struct Array
@@ -1573,8 +1641,10 @@ namespace gen
    return { str_len( str ), str };
}

// Currently used with strings as a parameter to indicate the string should be freed after append.
constexpr sw FreeAfter = 0xF4EEAF7E4;
sw StrC_len( char const* str )
{
    return (sw) ( str - 1 );
}

// Dynamic String
// This is directly based off the ZPL string api.
@@ -1694,14 +1764,69 @@ namespace gen
    return true;
}

bool make_space_for( char const* str, sw add_len );
bool make_space_for( char const* str, sw add_len )
{
    sw available = avail_space();

    // NOTE: Return if there is enough space left
    if ( available >= add_len )
    {
        return true;
    }
    else
    {
        sw new_len, old_size, new_size;

        void* ptr;
        void* new_ptr;

        AllocatorInfo allocator = get_header().Allocator;
        Header* header = nullptr;

        new_len  = grow_formula( length() + add_len );
        ptr      = & get_header();
        old_size = size_of( Header ) + length() + 1;
        new_size = size_of( Header ) + new_len + 1;

        new_ptr = resize( allocator, ptr, old_size, new_size );

        if ( new_ptr == nullptr )
            return false;

        header = zpl_cast( Header* ) new_ptr;
        header->Allocator = allocator;
        header->Capacity  = new_len;

        Data = rcast( char*, header + 1 );

        return true;
    }
}

bool append( char const* str )
{
    return append( str, str_len( str ) );
}

bool append( char const* str, sw length );
bool append( char const* str, sw length )
{
    if ( sptr(str) > 0 )
    {
        sw curr_len = this->length();

        if ( ! make_space_for( str, length ) )
            return false;

        Header& header = get_header();

        mem_copy( Data + curr_len, str, length );

        Data[ curr_len + length ] = '\0';

        header.Length = curr_len + length;
    }
    return true;
}
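
// Usage sketch (illustrative): append reuses existing capacity when it can, and otherwise
// grows the header + data block through make_space_for using the String's stored allocator.
//
//     String name = String::fmt_buf( Memory::GlobalAllocator, "type_%d", 42 ); // fmt_buf as used in the check_sanity test below
//     name.append( "_t" );   // forwards to the ( char const*, sw length ) overload above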

bool append( StrC str)
{
@@ -1961,7 +2086,7 @@ namespace gen

    // Internals
    char** Filenames; // zpl_array
    String Buffer;
    char* Buffer; // zpl_string
};

struct FileInfo
@@ -2178,21 +2303,189 @@ namespace gen
}
#pragma endregion File Handling

#pragma region Printing
#pragma region ADT
enum ADT_Type : u32
{
    EADTTYPE_UNINITIALISED, /* node was not initialised, this is a programming error! */
    EADTTYPE_ARRAY,
    EADTTYPE_OBJECT,
    EADTTYPE_STRING,
    EADTTYPE_MULTISTRING,
    EADTTYPE_INTEGER,
    EADTTYPE_REAL,
};

#ifndef GEN_PRINTF_MAXLEN
# define GEN_PRINTF_MAXLEN 65536
enum ADT_Props : u32
{
    EADTPROPS_NONE,
    EADTPROPS_NAN,
    EADTPROPS_NAN_NEG,
    EADTPROPS_INFINITY,
    EADTPROPS_INFINITY_NEG,
    EADTPROPS_FALSE,
    EADTPROPS_TRUE,
    EADTPROPS_NULL,
    EADTPROPS_IS_EXP,
    EADTPROPS_IS_HEX,

    // Used internally so that people can fill in real numbers they plan to write.
    EADTPROPS_IS_PARSED_REAL,
};

enum ADT_NamingStyle : u32
{
    EADTNAME_STYLE_DOUBLE_QUOTE,
    EADTNAME_STYLE_SINGLE_QUOTE,
    EADTNAME_STYLE_NO_QUOTES,
};

enum ADT_AssignStyle : u32
{
    EADTASSIGN_STYLE_COLON,
    EADTASSIGN_STYLE_EQUALS,
    EADTASSIGN_STYLE_LINE,
};

enum ADT_DelimStyle : u32
{
    EADTDELIM_STYLE_COMMA,
    EADTDELIM_STYLE_LINE,
    EADTDELIM_STYLE_NEWLINE,
};

enum ADT_Error : u32
{
    EADTERROR_NONE,
    EADTERROR_INTERNAL,
    EADTERROR_ALREADY_CONVERTED,
    EADTERROR_INVALID_TYPE,
    EADTERROR_OUT_OF_MEMORY,
};

struct ADT_Node
{
    static ADT_Node* make_branch( AllocatorInfo backing, char const* name, b32 is_array );
    static ADT_Node* make_leaf( AllocatorInfo backing, char const* name, u8 type );

    static ADT_Node* set_arr( char const* name, AllocatorInfo backing );
    static ADT_Node* set_flt( char const* name, f64 value );
    static ADT_Node* set_int( char const* name, s64 value );
    static ADT_Node* set_obj( char const* name, AllocatorInfo backing );
    static ADT_Node* set_str( char const* name, char const* value );

    static void swap( ADT_Node* node, ADT_Node* other );

    ADT_Node* append_arr( char const* name );
    ADT_Node* append_flt( char const* name, f64 value );
    ADT_Node* append_int( char const* name, s64 value );
    ADT_Node* append_obj( char const* name );
    ADT_Node* append_str( char const* name, char const* value );

    ADT_Node* destroy();

    ADT_Node* query( char const* uri );

    ADT_Node* find( char const* name, b32 deep_search );

    ADT_Node* alloc();
    ADT_Node* alloc_at( sw index );

    ADT_Node* move_node( ADT_Node* new_parent );

    ADT_Node* move_node_at( ADT_Node* new_parent, sw index );

    char* parse_number( char* base );

    void remove( ADT_Node* node );

    ADT_Error str_to_number();

    ADT_Error print_number( FileInfo* file );
    ADT_Error print_string( FileInfo* file, char const* escaped_chars, char const* escape_symbol );

    #pragma region Layout
    char const* name;
    ADT_Node* parent;

    /* properties */
    ADT_Type type;
    ADT_Props props;
#ifndef ZPL_PARSER_DISABLE_ANALYSIS
    u8 cfg_mode : 1;
    u8 name_style : 2;
    u8 assign_style : 2;
    u8 delim_style : 2;
    u8 delim_line_width : 4;
    u8 assign_line_width : 4;
#endif

// NOTE: A locally persisting buffer is used internally
char* str_fmt_buf( char const* fmt, ... );
char* str_fmt_buf_va( char const* fmt, va_list va );
sw str_fmt_va( char* str, sw n, char const* fmt, va_list va );
sw str_fmt_out_va( char const* fmt, va_list va );
sw str_fmt_out_err( char const* fmt, ... );
sw str_fmt_out_err_va( char const* fmt, va_list va );
sw str_fmt_file_va( FileInfo* f, char const* fmt, va_list va );
#pragma endregion Printing
    /* adt data */
    union
    {
        char const* string;
        struct ADT_Node* nodes; ///< zpl_array

        struct
        {
            union
            {
                f64 real;
                s64 integer;
            };

#ifndef ZPL_PARSER_DISABLE_ANALYSIS
            /* number analysis */
            s32 base;
            s32 base2;
            u8 base2_offset : 4;
            s8 exp : 4;
            u8 neg_zero : 1;
            u8 lead_digit : 1;
#endif
        };
    };
    #pragma endregion Layout
};

#pragma endregion ADT
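
// Rough usage sketch (illustrative only; the ADT implementation region earlier in this file is
// still an empty stub, so this follows the declarations above and may change):
//
//     ADT_Node* root = ADT_Node::make_branch( Memory::GlobalAllocator, "root", false );
//     root->append_str( "name", "gencpp" );
//     root->append_int( "major_version", 0 );
//     ADT_Node* name = root->find( "name", false );
//     root->destroy();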

#pragma region CSV
enum CSV_Error : u32
{
    ECSV_Error__NONE,
    ECSV_Error__INTERNAL,
    ECSV_Error__UNEXPECTED_END_OF_INPUT,
    ECSV_Error__MISMATCHED_ROWS,
};

typedef ADT_Node CSV_Object;

GEN_DEF_INLINE u8 csv_parse( CSV_Object* root, char* text, AllocatorInfo allocator, b32 has_header );
u8 csv_parse_delimiter( CSV_Object* root, char* text, AllocatorInfo allocator, b32 has_header, char delim );
void csv_free( CSV_Object* obj );

GEN_DEF_INLINE void csv_write( FileInfo* file, CSV_Object* obj );
GEN_DEF_INLINE String csv_write_string( AllocatorInfo a, CSV_Object* obj );
void csv_write_delimiter( FileInfo* file, CSV_Object* obj, char delim );
String csv_write_string_delimiter( AllocatorInfo a, CSV_Object* obj, char delim );

/* inline */

GEN_IMPL_INLINE u8 csv_parse( CSV_Object* root, char* text, AllocatorInfo allocator, b32 has_header )
{
    return csv_parse_delimiter( root, text, allocator, has_header, ',' );
}

GEN_IMPL_INLINE void csv_write( FileInfo* file, CSV_Object* obj )
{
    csv_write_delimiter( file, obj, ',' );
}

GEN_IMPL_INLINE String csv_write_string( AllocatorInfo a, CSV_Object* obj )
{
    return csv_write_string_delimiter( a, obj, ',' );
}
#pragma endregion CSV
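
// Usage sketch (illustrative; csv_parse_delimiter and csv_write_delimiter are only declared
// above and not yet implemented in this commit):
//
//     CSV_Object csv = {};
//     u8 err = csv_parse( & csv, csv_text, Memory::GlobalAllocator, true ); // `csv_text` is a stand-in, mutable buffer
//     if ( err == ECSV_Error__NONE )
//     {
//         String serialized = csv_write_string( Memory::GlobalAllocator, & csv );
//         csv_free( & csv );
//     }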

#ifdef GEN_BENCHMARK
//! Return CPU timestamp.
@@ -2205,124 +2498,6 @@ namespace gen
u64 time_rel_ms( void );
#endif

namespace Memory
{
    // NOTE: This limits the maximum size of an allocation.
    // If you are generating a string larger than this, increase the size of the bucket here.
    constexpr uw Global_BucketSize = megabytes(10);

    // Global allocator used for data with process lifetime.
    extern AllocatorInfo GlobalAllocator;

    // The heap allocator is used for now to rule out memory-related errors while debugging (tech debt until ready to address).
    // #define g_allocator heap()

    void setup();
    void cleanup();
}

constexpr
char const* Msg_Invalid_Value = "INVALID VALUE PROVIDED";

inline
sw log_fmt(char const* fmt, ...)
{
    sw res;
    va_list va;

    va_start(va, fmt);
    res = str_fmt_out_va(fmt, va);
    va_end(va);

    return res;
}

inline
sw fatal(char const* fmt, ...)
{
    local_persist thread_local
    char buf[GEN_PRINTF_MAXLEN] = { 0 };

    va_list va;

#if Build_Debug
    va_start(va, fmt);
    str_fmt_va(buf, GEN_PRINTF_MAXLEN, fmt, va);
    va_end(va);

    assert_crash(buf);
    return -1;
#else
    va_start(va, fmt);
    str_fmt_out_err_va( fmt, va);
    va_end(va);

    exit(1);
    return -1;
#endif
}

bool String::make_space_for( char const* str, sw add_len )
{
    sw available = avail_space();

    // NOTE: Return if there is enough space left
    if ( available >= add_len )
    {
        return true;
    }
    else
    {
        sw new_len, old_size, new_size;

        void* ptr;
        void* new_ptr;

        AllocatorInfo allocator = get_header().Allocator;
        Header* header = nullptr;

        new_len  = grow_formula( length() + add_len );
        ptr      = & get_header();
        old_size = size_of( Header ) + length() + 1;
        new_size = size_of( Header ) + new_len + 1;

        new_ptr = resize( allocator, ptr, old_size, new_size );

        if ( new_ptr == nullptr )
            return false;

        header = zpl_cast( Header* ) new_ptr;
        header->Allocator = allocator;
        header->Capacity  = new_len;

        Data = rcast( char*, header + 1 );

        return str;
    }
}

bool String::append( char const* str, sw length )
{
    u64 time_start = time_rel_ms();
    if ( sptr(str) > 0 )
    {
        sw curr_len = this->length();

        if ( ! make_space_for( str, length ) )
            return false;

        Header& header = get_header();

        mem_copy( Data + curr_len, str, length );

        Data[ curr_len + length ] = '\0';

        header.Length = curr_len + length;
    }
    return str;
}

// gen namespace
}

@@ -41,14 +41,14 @@ void check_sanity()
    constexpr
    s32 num_iterations = 650000;

    Array<CodeTypedef> typedefs = Array<CodeTypedef>::init_reserve( Memory::GlobalAllocator, num_iterations * 2 );
    Array<CodeTypedef> typedefs = Array<CodeTypedef>::init_reserve( GlobalAllocator, num_iterations * 2 );

    s32 idx = num_iterations;
    while( --idx )
    {
        // Stress testing string allocation
        String type_name = String::fmt_buf( Memory::GlobalAllocator, "type_%d", idx );
        String typedef_name = String::fmt_buf( Memory::GlobalAllocator, "typedef_%d", idx );
        String type_name = String::fmt_buf( GlobalAllocator, "type_%ld", idx );
        String typedef_name = String::fmt_buf(GlobalAllocator, "typedef_%ld", idx );

        CodeTypedef type_as_int = def_typedef( type_name, t_int );
        CodeType type = def_type( type_name );
@@ -59,17 +59,17 @@ void check_sanity()
    }

    log_fmt("\nMemory before builder:\n");
    log_fmt("Num Global Arenas : %llu TotalSize: %llu !\n", Memory::Global_AllocatorBuckets.num(), Memory::Global_AllocatorBuckets.num() * Memory::Global_BucketSize);
    log_fmt("Num Code Pools : %llu TotalSize: %llu !\n", StaticData::CodePools.num(), StaticData::CodePools.num() * CodePool_NumBlocks * StaticData::CodePools.back().BlockSize);
    log_fmt("Num String Cache Arenas : %llu TotalSize: %llu !\n", StaticData::StringArenas.num(), StaticData::StringArenas.num() * SizePer_StringArena);
    log_fmt("Num String Cache : %llu\n", StaticData::StringCache.Entries.num(), StaticData::StringCache);
    log_fmt("Num Global Arenas : %llu TotalSize: %llu !\n", Global_AllocatorBuckets.num(), Global_AllocatorBuckets.num() * Global_BucketSize);
    log_fmt("Num Code Pools : %llu TotalSize: %llu !\n", CodePools.num(), CodePools.num() * CodePool_NumBlocks * CodePools.back().BlockSize);
    log_fmt("Num String Cache Arenas : %llu TotalSize: %llu !\n", StringArenas.num(), StringArenas.num() * SizePer_StringArena);
    log_fmt("Num String Cache : %llu\n", StringCache.Entries.num(), StringCache);

    Builder builder;
    builder.open( "sanity.gen.hpp" );

    idx = num_iterations;
    idx = typedefs.num();
#ifdef GEN_BENCHMARK
    u64 time_start = time_rel_ms();
    u64 time_start = time_rel_ms();
#endif
    while( --idx )
    {
@@ -82,10 +82,10 @@ void check_sanity()
#endif

    log_fmt("\nMemory after builder:\n");
    log_fmt("Num Global Arenas : %llu TotalSize: %llu !\n", Memory::Global_AllocatorBuckets.num(), Memory::Global_AllocatorBuckets.num() * Memory::Global_BucketSize);
    log_fmt("Num Code Pools : %llu TotalSize: %llu !\n", StaticData::CodePools.num(), StaticData::CodePools.num() * CodePool_NumBlocks * StaticData::CodePools.back().BlockSize);
    log_fmt("Num String Cache Arenas : %llu TotalSize: %llu !\n", StaticData::StringArenas.num(), StaticData::StringArenas.num() * SizePer_StringArena);
    log_fmt("Num String Cache : %llu\n", StaticData::StringCache.Entries.num(), StaticData::StringCache);
    log_fmt("Num Global Arenas : %llu TotalSize: %llu !\n", Global_AllocatorBuckets.num(), Global_AllocatorBuckets.num() * Global_BucketSize);
    log_fmt("Num Code Pools : %llu TotalSize: %llu !\n", CodePools.num(), CodePools.num() * CodePool_NumBlocks * CodePools.back().BlockSize);
    log_fmt("Num String Cache Arenas : %llu TotalSize: %llu !\n", StringArenas.num(), StringArenas.num() * SizePer_StringArena);
    log_fmt("Num String Cache : %llu\n", StringCache.Entries.num(), StringCache);

    log_fmt("\nSanity passed!\n");
    gen::deinit();