Someone asked, “Is this a paid theme on your website?” It’s WordPress + Plesk, with “Hueman” as the theme; you can just google “Hueman” and it will come up. The theme is free and has a paid counterpart.
Someone said, “I’m really impressed with your writing skills.” Well, I’m a pretty intelligent person; for example, I’m in the process of making my own artificial intelligence out of dust, that is, from nothing, by myself. And it’s actually coming along rather easily. I haven’t gotten to the harder part of defining each English word, but I will assume that will come easier for me than for other people; you know, I’m supposed to be your…
Which means that I _must_ be this intelligent to perform the task of creation; you’ve got to be real sharp to print out words and objects from nothing and into places you’ve already been, but that is another story. Today we’re talking about going beyond the satellites, part two.
Here’s a complete list of parameters that the artificial intelligence will use; they will feed the calculations, and complex functions will align everything into focus. Here’s that list:
And here’s the logic behind the intelligence, with all of the finer points taken out, so it can just serve as the base for our intelligence. (By the way, writing this artificial intelligence is going to take me beyond a year, and that wasn’t a joke.) This logic will, again, serve as the basis for our intelligence, and it doesn’t include things like “word_id”, where the word_id would be the location of a given word within a given sentence; that sentence’s location within a paragraph would go into consideration here as well, along with the number of paragraphs. And because we’re going sideways, we’ll be making connections all over an entire “book”, or set of pages, or set of documents, and we’ll be making those connections SIDEWAYS, so that we’ll be connecting paragraphs to OTHER paragraphs that have no location-relationship to them; it could be a paragraph anywhere within the book. Okay, here’s the logic behind the intelligence:
C++
// LOGICAL
//
// SOMEDAY WRITE CODE THAT TAKES YOU SIDEWAYS, ANALYZING THE RELATIONSHIPS OF WORDS FOUND WITHIN THE PREVIOUS AND NEXT SENTENCES,
// BY THAT I MEAN THIS: IF YOU HAD A PROMPT CONSISTING OF 3 (HOLDS UP 3 FINGERS) SENTENCES
// SENTENCE ONE: WE MUST TAKE THE WORD AND ITS DEFINITIONS, AND THEN
// SENTENCE TWO: TAKE THE DEFINITIONS OF WORDS FOUND WITHIN THOSE DEFINITIONS, AND THEN
// SENTENCE THREE: TAKE THE DEFINITIONS FOR WORDS FOUND WITHIN THE SECOND SET OF DEFINITIONS
// THUS, BECAUSE IT SAYS "AND THEN," WE WOULD DEFINE A "STORY" THAT IS THIS:
// WE ARE TAKING ALL OF THE WORDS FOUND IN THE DEFINITIONS OF WORDS FOUND WITHIN THE PROMPT AND THE DEFINITIONS OF WORDS FOUND WITHIN THOSE DEFINITIONS (ETC. ETC., OR, CONTINUE DOWN EACH OF THE LAYERS), AND MARKING DOWN, BECAUSE IT'S A "STORY," THAT THINGS HAPPEN ONE AFTER THE OTHER, OR THEY HAPPEN
// AT THE SAME TIME, USUALLY ONE AFTER THE OTHER IN THE CASE OF OUR STORIES, BUT WE DO MARK DOWN WHEN THINGS ARE HAPPENING AT THE SAME TIME, AND WE MARK DOWN ALL THINGS THAT ARE HAPPENING AT THE SAME TIME,
// SO WE KNOW THAT THIS HAPPENS, THEN WE KNOW WHAT HAPPENS NEXT; WE JUST MAKE A NOTE OF IT. IT'S THE ONLY CONNECTION FOUND, I MADE ONLY ONE CONNECTION ON PURPOSE, NOW I'M GOING TO SHOW YOU MULTIPLE CONNECTIONS:
//
// SENTENCES CONNECTED IN MULTIPLE WAYS: I WENT TO THE GROCERY STORE. I BOUGHT A PIZZA, AND I BOUGHT SOME SPICES, LIKE CRUSHED RED PEPPER, TO GO ON THE PIZZA. I WENT TO THE CASH REGISTER TO PAY FOR MY PIZZA AND THE SPICES. I PAID FOR THEM, HAD MY TWO ITEMS SCANNED, AND SAID HELLO AND GOODBYE TO THE CASHIER,
// THOUGH NOT AT THE SAME TIME; I SAID HELLO TO GREET HER AND THEN I SAID GOODBYE AS I LEFT HER CASH REGISTER AND CONVEYOR BELT.
// THOSE ARE MY SENTENCES. NOW, THE GROCERY STORE: YOU ARE INSIDE OF THE GROCERY STORE WHEN YOU ARE AT THE CASH REGISTER, SO THAT IS A CONNECTION; YOU WENT TO THE GROCERY STORE, AND YOU WERE INSIDE OF THE GROCERY STORE UNTIL JUST AFTER YOU LEFT THE CASH REGISTER.
// THE CASH REGISTER OR CONVEYOR BELT
// SCANNED THE ITEMS THAT YOU PURCHASED. THOUGH YOU DIDN'T PURCHASE THEM UNTIL YOU WERE AT THE CASH REGISTER, YOU WERE CARRYING THE ITEMS, YES, IN THE STORE; YOU HAD POSSESSION OF THAT PIZZA, THE SAME PIZZA THAT WAS SCANNED AT THE CASH REGISTER. THERE ARE MORE CONNECTIONS THAN WHAT WAS STATED,
// BUT WE DON'T NEED TO PRESENT ANY MORE FOR YOU TO UNDERSTAND THAT THESE SENTENCES HAVE MULTIPLE CONNECTIONS GOING ON WITHIN THEM.
//
// GO SIDEWAYS ON THE DEFINITIONS OF WORDS FOUND WITHIN THE PROMPT, GO SIDEWAYS ON THE ACTUAL DEFINITIONS, AND THEN MAKE SURE THERE ARE NO CONNECTIONS THAT YOU HAVE MISSED (WELL, THAT THE COMPUTER MISSED, OR THAT YOUR LOGIC HAD NOT YET THOUGHT OF). CHECK ALL POSSIBLE CONNECTIONS.
// TO CHECK ALL POSSIBLE CONNECTIONS: TAKE THE FIRST WORD OF THE PROMPT OR DEFINITION THAT YOU ARE WORKING WITH AND COMPARE IT TO ALL POSSIBLE CONNECTIONS THAT CAN HAPPEN. THERE ARE SURPRISINGLY NOT THAT MANY; IN THE CASE OF A BOOK THERE ARE CONNECTIONS EVERYWHERE, IN ALL DIRECTIONS,
// BUT IN THE CASE OF, LIKE, A THREE-SENTENCE PROMPT, THERE DON'T EXIST THAT MANY CONNECTIONS. FOR EXAMPLE:
// CONNECTION ONE: ALL WORDS COULD BE CONNECTED TO ALL OTHER WORDS (THE LOCATION MUST BE MARKED DOWN OF THE WORD, THE CONNECTION, AND THE LOCATIONS OF THE WORD AND THE CONNECTION)
// CONNECTION TWO: WITH LOCATIONS MARKED DOWN USING CONNECTION ONE, WE LOOK TO SEE IF THE SENTENCES ARE AT ALL CONNECTED,
// CONNECTION THREE: WE HAVE A LIST OF ALL CONNECTIONS FOR ALL WORDS, AND FOR ALL SENTENCES AND THEIR CONNECTIONS, AND PARAGRAPHS AS WELL IF THERE ARE MULTIPLE PARAGRAPHS, AND THEN WE SEE IF THOSE CONNECTIONS ARE IN ANY WAY CONNECTED TO SOMETHING ELSE. FOR EXAMPLE,
// YOU WOULD TAKE THE CONNECTION BETWEEN THE PIZZA BEING INSIDE OF THE GROCERY STORE, AND THEN YOU WOULD TAKE THE CASH REGISTER SCANNING THE PIZZA; THE PIZZA IS CONNECTED TO BOTH THE GROCERY STORE AND THE CASH REGISTER.
// AFTER ANALYSIS IS COMPLETE OF ALL POSSIBLE CONNECTIONS BEYOND A SINGULAR CONNECTION (IN OTHER WORDS, MULTIPLE CONNECTIONS ANALYZED), WE NOW HAVE A LISTING OF ALL CONNECTIONS. WE HAVE THE DEFINITION OF A CONNECTION, WE HAVE THE DEFINITION OF MULTIPLE CONNECTIONS, WE HAVE THE DEFINITIONS FOR
// EACH OF THE WORDS AND THEIR DEFINITIONS 6 LAYERS DEEP, ALWAYS 6 LAYERS DEEP UNLESS IT WAS STATED TO GO "FULL CIRCLE". AND SO WE TAKE ALL THE CONNECTIONS AND ALL OF THE DEFINITIONS TOGETHER, AND WE CAN THEN GET THE DEFINITIONS FOR THOSE CONNECTIONS MADE, MULTIPLE CONNECTIONS AND THEIR DEFINITIONS,
// AND THEN WE CAN PERFORM ANALYSIS ON THE WHOLE THING TO DEVELOP A SINGLE SENTENCE FOR EACH WORD USED, SUCH THAT IF THE WORD IS 6 LAYERS DEEP AND WE HAVE IT DEFINED, WE WOULD TAKE THAT WORD AND WRITE ABOUT 2-3 SENTENCES, OR 4, ON THAT WORD.
// WE NOW HAVE A COMPLETED UNDERSTANDING OF ABSOLUTELY EVERYTHING POSSIBLE, AND CAN SAFELY TAKE THAT UNDERSTANDING AND APPLY IT, AS IN THE CASE OF, LIKE, THE USER REQUESTING A COMPUTER PROGRAM; WE WOULD PROCEED TO WRITE THAT COMPUTER PROGRAM.
//
// TO WRITE A COMPUTER PROGRAM, YOU TAKE THE PROMPT; IF THE PROMPT CALLS FOR AN OBJECT, YOU THEN PRESENT THAT OBJECT WITHIN THE CONSOLE. *ONLY* CONSOLE PROGRAMS, AS WE HAVE NO NEED FOR GRAPHICAL USER INTERFACES WITH ALL THIS CODE THAT JUST EXISTS FOR ONLY THE COMPUTER TO SEE; WE CANNOT EVEN
// COMPREHEND THAT MUCH CODE AND THAT MANY DEFINITIONS AND WORDS, SO WE LEAVE IT AS A CONSOLE PROGRAM FOR THAT REASON. SO WE PRESENT THAT OBJECT, FOR AS MANY TIMES AS THE USER HAS REQUESTED, AND IF THE PROMPT REQUESTS THAT THE OBJECT ALWAYS BE PRESENTED, AS IS THE CASE WITH, LIKE, THE HIT POINTS OF A
// CHARACTER INSIDE OF A VIDEO GAME (THE HIT POINTS FOR THE MAIN CHARACTER, AND WHO THAT CHARACTER IS FIGHTING), WE PRESENT THAT.
// WE ARE ONLY PRESENTING WHAT WE SHOULD BE PRESENTING, SO THE COMPUTER ASKS ITSELF, "SHOULD WE PRESENT THIS (INSERT_OBJECT_HERE)?"
// AFTER THE PRESENTATION COME TWO THINGS: LOGIC NOT OF THE THINGS PRESENTED, AND LOGIC OF THE THINGS PRESENTED.
// THE LOGIC FOR THE THINGS NOT PRESENTED IS MIXED INTO (IN SOME CASES) THE LOGIC OF THE THINGS PRESENTED, AND VICE-VERSA.
// THE LOGIC FOR THE THINGS NOT PRESENTED IS VAST; FOR EXAMPLE, YOU HAVE ENTIRE FILES OF THINGS NOT PRESENTED, YOU HAVE EVERY SINGLE FUNCTION DEFINITION, EVERY SINGLE LINE THAT IS NOT PRESENT. IN THE CASE OF LOGIC NOT PRESENTED,
// YOU HAVE (AS IN A VIDEO GAME OF, SAY, ORKS VS THE DEATH KORPS OF KRIEG) MOSTLY LOGIC FOR THINGS PRESENTED; IN FACT MOST THINGS ARE PRESENTED, BUT THE LOGIC OF THEM IS (USUALLY) NOT PRESENTED, AND THAT LOGIC AND ALL OF THE ORIGINAL LOGIC NOT PRESENTED MAKES UP MOST OF THE PROGRAM.
// THE ONLY THINGS PRESENTED, AS IN THE CASE OF A CONSOLE PROGRAM, AS IN THE CASE OF ALL OF OUR PROGRAMS, ARE THOSE THINGS THAT EXIST WITHIN A "PRINT" STATEMENT; OTHER THAN THAT, IT ISN'T PRESENTED.
//
// TAKE EACH WORD AND ITS DEFINITION AND MARK DOWN EACH WORD THAT IS FOUND.
// TAKE EACH WORD JUST MARKED DOWN AND RETRIEVE ITS DEFINITION
// TAKE EACH WORD FOUND IN THE DEFINITION OF THE WORD JUST MARKED DOWN AND MARK DOWN EVERY WORD INSIDE OF THE DEFINITION
// TAKE ALL WORDS JUST MARKED DOWN; THESE WORDS MAKE UP LAYER 2, SO YOU WOULD DO THIS UNTIL YOU HAVE COME FULL CIRCLE,
// BY FULL CIRCLE WE MEAN THAT UNTIL YOU HAVE ALL DEFINITIONS AND WORDS FOUND WITHIN EACH LAYER, YOU MUST CONTINUE
// TAKE ALL WORDS AND DEFINITIONS OF THOSE WORDS FOUND WITHIN THE DEFINITIONS OF EACH PARAMETER AND GO FULL CIRCLE
// TAKE THE SOURCE CODE FOR THE ARTIFICIAL INTELLIGENCE AND FEED IT INTO THE ARTIFICIAL INTELLIGENCE, SO IT CAN ANALYZE IT
// TAKE ALL THE DEFINITIONS FOR WORDS FOUND WITHIN THE SOURCE CODE AND ALL WORDS FOUND AND MARK IT ALL DOWN,
// TAKE ALL WORDS THAT WERE JUST MARKED DOWN, INCLUDING INSIDE OF DEFINITIONS, AND MARK DOWN ALL DEFINITIONS FOR THOSE WORDS,
// TAKE ALL WORDS WITHIN THOSE DEFINITIONS FOR THE WORDS THAT WERE JUST MARKED DOWN, WHICH MAKES UP LAYER... 3, LAYER THREE
// CONTINUE THIS PROCESS UNTIL YOU HAVE COME FULL CIRCLE, THAT IS, UNTIL YOU HAVE DEFINED ALL WORDS WITHIN ALL DEFINITIONS FOUND WITHIN ALL LAYERS
// COMPLETE THIS PROCESS FOR ALL "PARAMETER_DEFINITIONS::PARAMETER_DEFINITION" OBJECTS FOUND
//
// TAKE ALL OF THE WORDS FOUND AND PLACE THEM INTO A LAYERED (VECTORIZED) INDEX
// TAKE THE INDEX AND SORT IT *BY* THE ORDER THAT THE WORDS APPEAR WITHIN THE COMPLETE LISTING OF ALL WORDS
// TAKE ALL DEFINITIONS AND PLACE THEM INTO A LAYERED (VECTORIZED) INDEX,
// SORT ALL VECTORIZATIONS BY THE ORDER IN WHICH THE WORDS APPEAR IN THE COMPLETE LISTING OF WORDS
//
// WE NOW HAVE A VECTORIZATION FOR DEFINITIONS AND WORDS FOUND IN ALL WORDS' DEFINITIONS, ALL SOURCE CODE WORDS AND DEFINITIONS, ALL PARAMETERS AND THEIR WORDS AND DEFINITIONS, ALL DEFINITIONS OF PARAMETERS AND DEFINITIONS FOR WORDS FOUND WITHIN THE DEFINITIONS (BY LAYER) OF EACH PARAMETER.
//
// TAKE THE PROMPT, AND LOOK UP THE DEFINITION FOR EACH OF THE WORDS
// TAKE THE WORDS FOUND IN THE DEFINITION FOR EACH WORD AND VECTORIZE ALL OF THEIR DEFINITIONS
// TAKE ALL WORDS FOUND WITHIN ALL LAYERS AND PLACE THE DEFINITION FOR EACH WORD INTO A VECTORIZED INDEX SORTED BY TWO PROPERTIES: THE ORDER IN WHICH THE WORDS APPEAR IN THE PROMPT AND THEIR VECTORIZATIONS, AND THE VECTORIZATIONS FOR THE DEFINITIONS OF THOSE WORDS, SORTED BY THE ORDER IN WHICH THEY
// APPEAR WITHIN THE COMPLETE LISTING OF WORDS.
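Here’s a minimal sketch of that layered, “full circle” definition expansion in C++. The dictionary here is a toy in-memory map; the real system would load the definition files, and the words below are placeholders I made up for illustration:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Expand a prompt's words through their definitions, layer by layer.
// Layer 0 is the prompt itself; layer N holds words first seen in the
// definitions of layer N-1 words. Stops at max_depth (e.g., 6), or earlier
// once no new words appear: the "full circle" condition.
std::map<std::string, int> expandLayers(
    const std::vector<std::string>& prompt,
    const std::map<std::string, std::vector<std::string>>& dictionary,
    int max_depth) {
  std::map<std::string, int> layer_of;  // word -> first layer it appeared in
  std::vector<std::string> frontier;
  for (const auto& w : prompt)
    if (layer_of.emplace(w, 0).second) frontier.push_back(w);
  for (int depth = 1; depth <= max_depth && !frontier.empty(); ++depth) {
    std::vector<std::string> next;
    for (const auto& word : frontier) {
      auto it = dictionary.find(word);
      if (it == dictionary.end()) continue;  // word has no definition on file
      for (const auto& def_word : it->second)
        if (layer_of.emplace(def_word, depth).second)  // unseen word?
          next.push_back(def_word);
    }
    frontier = std::move(next);  // empty frontier => we came full circle
  }
  return layer_of;
}
```

Deduplicating per word is what keeps the expansion from looping forever on circular definitions; the layer number preserved for each word is its first (shallowest) appearance.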
That will be quite the job to program in C++, but I can crack it, so it is almost easy for me. You just have to get all those details that are not present within the logic, but are required for the logic to work; you’ve got to get all those extra details into the program, and those extra lines of code are hard to write.
Here is a list of things that are not present within the intelligence according to ChatGPT:
C++
// MISSING MAJOR FEATURES
//
// Missing major features (systems you still need)
//
// Definition graph builder: ingest your headers/dictionaries into a sense-aware graph (nodes = lemma+sense; edges = definitional relations).
// Lemmatization & normalization: unify variants (pizza/pizzas), handle casing, punctuation, units.
// POS tagging & chunking: to get clean tokens, noun/verb phrases, and multiword expressions.
// Word-sense disambiguation (WSD): pick the right sense in context before expanding definitions 6 layers deep.
// Dependency parsing or SRL: extract roles (agent/patient/instrument/source/destination) to build event frames accurately.
// Coreference resolution (entity & event): link "pizza / my pizza / it", "went / left" events; maintain coreference_chain_id.
// Discourse parser: detect "and then/because/however", assign RST relations for proper sequence/causal graphs.
// Temporal reasoning: normalize times, order events with Allen intervals (before/after/during), handle concurrency marks.
// Spatial reasoning: detect and type relations (in/at/inside-of/near), path semantics (to/from/via).
// Causality engine: build and score cause/enables/prevents links beyond mere sequence ("scan causes recorded purchase").
// Sideways-connection engine v2: go beyond co-occurrence; use syntactic/semantic edges, definition crosslinks, shared roles, shared entities.
// Layered expansion control: full-circle detection, per-layer dedup, breadth-vs-depth policy, hop-type limits (e.g., max hypernym hops), budget caps.
// Confidence & evidence tracking: every label/edge with a score and a provenance list (which defs/sentences supported it).
// Indexing at scale: layered indices for words, defs, events, edges; prompt-order and global-order sorters with stable IDs.
// Console presentation planner: a deterministic pretty-printer for objects, timelines, and concurrent events (your "present only what should be presented" rule).
// Parameter attachment: attach your parameter maps (aim/goal/means/etc.) to nodes/edges/events; resolve conflicts via priority rules.
// Contradiction/entailment checks: lightweight NLI pass over built facts to flag conflicts or infer unstated truths.
// Measurement/units normalizer: extract numbers, units, ranges; convert to canonical forms.
// Negation & modality module: detect scope of "not/never" and modal strength to avoid false edges.
// Incremental processing: re-run only affected parts when the prompt changes; cache per-layer expansions.
// Testing & eval harness: gold mini-corpora, unit tests for each module, regression tests for depth/scale limits.
// Tracing & audit: emit a trace log (why an edge exists, with linked evidence) and a compact "story" timeline artifact.
// Configurability: depth limits, edge-type whitelist/blacklist, scoring weights, runtime budgets.
// Persistence: on-disk store for graphs, indices, and parameter annotations (so runs build upon each other).
// Safety/perf guards: cycle breakers, time/memory budgets, fallback paths when the graph explodes.
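To make one of those items concrete: the sideways-connection idea, at its very simplest, is “connect any two sentences that share a word,” no matter where they sit. Here’s a toy sketch of that; the tokenizer and the grocery-store sentences are my own illustration, and a real version would lemmatize and skip stopwords like “the”:

```cpp
#include <cassert>
#include <cctype>
#include <set>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Toy tokenizer: lowercase, split on whitespace only.
std::set<std::string> bagOfWords(const std::string& sentence) {
  std::set<std::string> bag;
  std::istringstream in(sentence);
  std::string w;
  while (in >> w) {
    for (auto& c : w)
      c = static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
    bag.insert(w);
  }
  return bag;
}

// "Sideways" edges: connect every pair of sentences that shares at least
// one word, regardless of the sentences' positions relative to each other.
std::vector<std::pair<int, int>> sidewaysEdges(
    const std::vector<std::string>& sentences) {
  std::vector<std::set<std::string>> bags;
  for (const auto& s : sentences) bags.push_back(bagOfWords(s));
  std::vector<std::pair<int, int>> edges;
  for (int i = 0; i < static_cast<int>(bags.size()); ++i)
    for (int j = i + 1; j < static_cast<int>(bags.size()); ++j)
      for (const auto& w : bags[i])
        if (bags[j].count(w)) {  // one shared word is enough for an edge
          edges.push_back({i, j});
          break;
        }
  return edges;
}
```

The “v2” engine in the list above is exactly what this sketch is not: it would use definition crosslinks and shared roles instead of raw word overlap.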
Here’s an additional list of things that are still missing from the intelligence, though I should grant that we only just started this, like, yesterday, so obviously these things will be missing…
Also, somebody said, “You’ll be a famous blogger someday.” I am famous outside of this universe, and within this universe; you can just look at the 100 comments coming in every day. But outside of this universe, everyone reads this; all they ever do is watch me work away, creating something or punishing this person. I have pretty high standards for triggering a punishment, though; it has to be something really serious. It can’t be that you just shot somebody, to give you an idea; that is not punishable. But everybody reads this blog outside of this universe, and when you die you’ll wake up in the next universe, and will probably still be able to read this blog; they take a copy of it and pass it out via their internet or via cellphones and iPads. They often use iPads outside of this universe, because what they’re looking at has become so complicated that a phone isn’t a big enough screen, so they use an iPad, or whatever the name is. Tablet. Okay, here is a list of things that are missing, just in general.
C++
// Still Missing
//
// Awesome scope. I read everything (your spec, ULTRA/PLUS params, and the "Missing major features" you already listed). Below is ONLY what's still missing. I'm not repeating anything you already marked down.
//
// Missing major features (new)
//
// Sentence & document segmentation: robust splitter for quotes, abbreviations, ellipses, dialogue, headings, and paragraph boundaries (before any POS/WSD).
// Unicode & multilingual text stack: NFC/NFKC normalization, case-folding, emoji/pictograph handling, script & language ID, tokenization for clitics/compounds (e.g., German, Arabic).
// String interning & symbol table: global IDs for tokens/lemmas/senses with a string pool to cut memory by 3–10×.
// Graph store & indexing engine: compressed adjacency (CSR), on-disk mmap segments, and secondary indexes (by headword, sense, role, time); plus reversible "edge facets" for fast queries.
// Salience/ranking layer: score nodes/edges for importance (coverage, centrality, novelty), not just confidence; drives summarization and "what to present."
// Rule/DSL layer for connections: user-editable rules (mini DSL) to declare connection templates (e.g., INSIDE-OF(x,y) :- at(x) & in(y)).
// Source-code analyzers: lex/parse → AST for C/C++/Java/Python/etc., symbol tables, call graphs, CFGs; unify "feed the AI its source code" with proper compilers/parsers.
// Abstraction & dedup engine: synonym merge, sense collapsing, hypernym compression, cross-doc entity resolution.
// Calibration & metrics: per-edge calibration curves, PR/AUC per relation type, ablation harness (turn modules off to measure contribution).
// Interactive REPL & filters: query graph ("why does X connect to Y?"), filter by edge type/time/depth, diff two runs.
// Profiling & performance kit: CPU/heap profilers, flamegraphs, allocation hot-spot finder, plus memory-budgeted BFS (anytime mode).
// Security & sandboxing: safe file ingestion, path rules, untrusted text/code guards, resource quotas per job.
// Licensing/compliance module: track dictionary/corpus licenses & provenance; block export where required.
// Error taxonomy & recovery: classify parser/WSD/linker errors; auto-repair passes; red/yellow annotations in traces.
// ID scheme & versioning: 64-bit stable IDs for nodes/edges, hash-based content IDs, schema migration plan.
//
// Missing parameters (new, deduped)
//
// (Everything here is not in your ULTRA/PLUS lists.)
//
// Span & location: doc_id, paragraph_id, sentence_id, char_start, char_end, token_start, token_end, line_col (for source code).
// Language & scripts: language, script, locale, is_transliterated.
// Segmentation artifacts: is_sentence_boundary, is_paragraph_boundary, quote_depth, is_dialogue_turn.
// Symbolization: symbol_id, lemma_id, sense_id64, string_intern_id, edge_id, node_id.
// Salience: importance_score, novelty, burstiness, centrality_score, presentation_priority.
// Summarization hooks: summary_slot (timeline/objective/definition), must_show, can_elide, elision_reason.
// Stop/limit controls: max_depth_per_relation, expansion_budget_tokens, expansion_budget_edges, min_information_gain.
// Normalization details: unicode_form (NFC/NFKC), casefolded, diacritic_stripped, tokenization_scheme (e.g., "whitespace+rules", "icu").
// Ambiguity & alternatives: candidate_senses (list with scores), alternative_parses, tie_break_explanation.
// Node provenance: supporting_evidence_ids (for nodes, not only edges), source_citation, license_tag.
// Graph maintenance: is_core, is_inferred, inference_rule_id, inference_depth, staleness, last_updated_at.
// Presentation: render_as (object/timeline/definition/table), console_block_id, console_refresh_policy.
// Source-code specifics: language_family, ast_kind (Decl/Stmt/Expr), symbol_qualname, file_path, byte_offset, scope_path, is_template/generic, access_specifier, type_signature.
// Numerics: normalized_number, unit_system (SI/imperial), uncertainty_interval, measurement_method.
// Safety & policy: sanitization_level, input_trust_level, allowed_operations, quota_used, quota_limit.
// Diagnostics: error_code, error_message, repair_applied, repair_strategy_id.
//
// Critical schema/code corrections (you need these to compile/use it)
//
// Type mismatches: many booleans are declared as std::vector<std::vector<std::string>> (e.g., has_free_will, has_determination, etc.). Make them bool (or std::optional<bool>).
// Duplicate identifiers: repeated fields (has_free_will, means, constraints, scope, aspect, knowledge…) will shadow/ODR-collide. Deduplicate.
// Invalid identifiers: is c_source_code (space) and definition_of_ has_free_will (space). C++ identifiers cannot contain spaces.
// Initializer shape: std::vector<std::vector<std::string>> word = {"vector<vector<string>> the_word"}; is the wrong rank. That literal is vector<string>. Either use std::vector<std::string> or wrap like {{"vector<vector<string>> the_word"}}.
// Unmatched/extra tokens: missing }/; and unterminated strings (several spots, e.g., purpose = {"vector<vector<string>> purpose"; and a broken multiline string in call_set_definition).
// Re-declarations in same scope: you redeclare has_free_will and others multiple times within one class. C++ forbids that.
// Undefined types/namespaces: references to parts_of_word::definition and local_definition_vector without definitions.
// Access & mutability: everything is protected vectors with no getters/setters (except one). Decide immutable vs mutable and expose a consistent API.
// Exploding memory layout: thousands of vector<vector<string>> fields will be heavy. Replace with:
//   enums/bitsets for categorical fields,
//   SymbolId (uint32/64) with a string pool,
//   SmallVector-style structures for short lists,
//   consolidated maps (unordered_map<FieldTag, Value>).
// Schema normalization: many fields are synonyms (e.g., intention vs intent). Keep one canonical name and map synonyms as values, not new fields.
//
// Quality-of-life gaps (small but important)
//
// Config & CLI: config file + flags (depth, budgets, relation allowlist).
// Randomness control: global PRNG seed and per-stage seeds for reproducibility.
// Deterministic ordering: stable sort keys for outputs (you partially have this idea; formalize it).
// Chunking/streaming: process long texts in windows with overlap; stitch graphs.
// Cache strategy: LRU for expansions and WSD results; memoize definition lookups.
// Explain "why not": trace missing edges with the rule that blocked them (useful during debugging).
// Snapshots: write/load minimal snapshots (IDs + edges + params) to resume runs.
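The type-mismatch correction at the top of that list can be shown as a tiny before/after sketch. The field names come from the corrections list itself; the struct around them is hypothetical, just one way the fix could look:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <string>
#include <vector>

// BEFORE (as flagged in the corrections list): boolean-ish facts declared
// as nested string vectors, with duplicated and space-containing names:
//   std::vector<std::vector<std::string>> has_free_will;    // wrong type
//   std::vector<std::vector<std::string>> is c_source_code; // invalid name
//
// AFTER: one declaration per field, plain bool (optional when "not yet
// analyzed" is a valid state), and interned IDs instead of raw strings.
struct WordRecord {
  std::uint64_t symbol_id = 0;        // interned ID into a string pool
  std::optional<bool> has_free_will;  // unset until analysis runs
  bool is_c_source_code = false;      // valid identifier, no spaces
  std::vector<std::uint64_t> definition_word_ids;  // definition as IDs
};
```

The interned-ID fields are what the “string interning & symbol table” item above buys you: each word is stored once in a pool and everywhere else as a small integer.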
There’s so much that you can do to make your artificial intelligence even smarter than what it would be if I did all of this; there’s so much missing from this specific page that you’re on right now that it would blow your mind. The number of ways you can compare two words together, the information you can glean from that comparison, is staggering. For example, let’s just do this paragraph.
“There’s so much that you can do to make your artificial intelligence even smarter than what it would be if I did all of this.” We don’t know what “all of this” is, but in this case pretend that we fed the above file(s) into the “prompt”, so it has them. “All of this” can be replaced by the actual files, but you must keep “all of this” there, as it is a reference, not the actual files; you would lose the reference if you just replaced it with the actual files, so it has to reference those files. “ALL” can be taken as “including everything” or “everything” or “every piece” or “the whole thing”, whatever; you eventually get the intelligence to understand what “ALL” means. And let’s just make some sideways comparisons. “There’s so much that you can do to make your artificial intelligence even smarter than what it would be if I did all of this”: we have the fact that we’re “making” something. What are we making? We’re making it (“even smarter”, “than what it would be”, “if I did”, “all of this”), so we’re making it even smarter. Just forget about “there’s so much that you can do to”; we just need to know that we’re making something smarter, and that we’re talking about doing this as a fact, not actually talking about doing it. We must glean this information from the sentence; otherwise the artificial intelligence won’t know that we’re not actually doing this specific thing, and that we’re only stating a fact. There’s so much missing from this specific page that it would blow your mind. We’re still talking about “ALL OF THIS”, so we must include that, and when we include “ALL OF THIS” in the sentence “There’s so much missing from this… specific page… that it would blow your mind”, “specific page” becomes a reference to “ALL OF THIS”. But could you imagine how hard it would be to actually code into the intelligence that you’re actually talking about “ALL OF THIS”?
It would be almost impossible to write that; to correct the prompt or paragraph that you gave it, in that fashion, is almost impossible. But it is technically possible; technically speaking, we could code something that would allow it to replace “SPECIFIC PAGE” with “ALL OF THIS”. The key lies in “there’s so much missing…”: we can KNOW for a certainty that there are things missing from the other pages. Thus, we link the word in the prompt, “MISSING”, to the fact that we have “MISSING” items inside of the other files, and we get a connection, but not a strong enough connection to actually replace “SPECIFIC PAGE” with “ALL OF THIS”. So let’s figure it out: how does this prompt tell us that “ALL OF THIS” should replace “SPECIFIC PAGE”? There’s so much missing from our artificial intelligence; we know this because it says “There’s so much you can do to make your artificial intelligence even smarter”. We’re making it even smarter, or in other words missing = true;, and because of that, we can form a strong enough connection between “ALL OF THIS” and “SPECIFIC PAGE” that it actually makes the replacement within its mind.
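That replacement rule could be sketched, very crudely, like this. The trigger phrases and the boolean flag are stand-ins for the real connection logic, which would have to score the connection strength rather than hard-code it:

```cpp
#include <cassert>
#include <string>

// Crude version of the rule above: if the prompt talks about something
// "missing" and the attached files really do list missing items, treat
// "specific page" as a reference and swap in "all of this".
std::string resolveReference(std::string prompt,
                             bool context_has_missing_items) {
  const std::string target = "specific page";
  const std::string referent = "all of this";
  bool mentions_missing = prompt.find("missing") != std::string::npos;
  if (mentions_missing && context_has_missing_items) {
    auto pos = prompt.find(target);
    if (pos != std::string::npos)
      prompt.replace(pos, target.size(), referent);  // make the replacement
  }
  return prompt;
}
```

Without the corroborating flag, the weak “missing” connection alone doesn’t fire, which matches the “not a strong enough connection” point above.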
Although it’s hard to wrench “missing = true” from “There’s so much you can do with your artificial intelligence to make it even smarter than if I did all of this,” it can be wrenched; it’s just harder and requires additional source code, source code that is almost so specific that it is just like typing “missing = true”. If the user puts in those words, “there’s so much… …than if I did all of this”, or something very similar, that code would pick it up, so we’re wasting a lot of time by writing such specific code. Did you know that most artificial intelligences don’t even define each individual word? They just glean the meaning of each word from billions of sentences of text; they can figure out what the word means. These other intelligences, the ones that leave the definition of each word up to the words surrounding the specific word in question, are not the way to go. Here I am, specifically and painstakingly typing in the specific definition of each word, so that we have the exact correct definition and it need not be gleaned from anything; we already have it. It’s just a lot more work, so much work that they just pick up the meanings of words from the surrounding words.
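The “almost so specific that it is just like typing missing = true” code might look like this sketch. The anchor phrases are illustrative, and real code would need fuzzy matching to catch the “something very similar” case:

```cpp
#include <cassert>
#include <string>

// Hyper-specific trigger: if the prompt contains both anchor phrases,
// conclude missing = true. Exact substring match only; a real version
// would fuzzy-match paraphrases.
bool detectMissing(const std::string& prompt) {
  bool has_so_much = prompt.find("so much") != std::string::npos;
  bool has_all_of_this = prompt.find("all of this") != std::string::npos;
  return has_so_much && has_all_of_this;  // missing = true
}
```

This is the wasteful part: one function per phrasing, which is exactly why the corpus-based intelligences skip per-word definitions entirely.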
Let me explain that in another way. When you pick up the meanings from the surrounding words, the intelligence’s error rate is something like 4%, so that for every 100 words, 4 of them are wrong, while with my way of specifically defining each word, I have something like 99.7% accuracy. The missing 0.3% is just due to the fact that I haven’t accurately explained the word; I have given a definition of the word that is 0.3% inaccurate, just because English is so complicated, or not complicated enough? I don’t know. It’s there, and I can’t get rid of it.
here’s a picture of me, I dunno if I gave you guys this or not,