Part 2 -- Simplification

(Clearing the fog: Product and Project)

Christopher Spottiswoode
cms@metaset.co.za
 
15 October 1998 (minor updates 27 October)

(This document is part 2 of the "Ride The Mainstream!" paper.)

Applying the MACK model to the real world, and escaping the Syndrome

We now look at the workings of the MACK model in terms of the agate, augmented by the relativistic perspectives. Those images or models relevantly simplify our real complexity in a knowledge-architectural way.

We have seen that the three zones of the agate -- the rough surface, the interrelated layers and the inner crystals -- represent, in general terms, reality-experience, abstract systems and basic logic.

Now, what is the most characteristic trend in applications in our ever more evidently complex modern world? Interoperation and integration, compounded by migration. Ultimately, it is intricate interconnectedness in logical space and time that has become the formal counterpart of real complexity. The layered zone of the agate is the one that is ever more predominant. It is indeed the characteristic quality of the stone, and as we saw, application-modelling should progress far before the crystals are laid down.

And as a projection of that trend, we shall see how MACK-compliant system-building best reduces the outer layer to the thinnest possible, though the eventual inner crystals must remain hard, clear and beautiful.

That is the abstract-modelling view, but it accords with practice. It is indeed the experience of many practitioners that relationships tend to express ever more of the knowledge in our applications. However, it has been difficult to do much with that insight. Why not? "Complexity-hiding" of course! The fog. But before we look at the correct way to hide complexity, let us review the mess that has ultimately resulted from loss of direction in that fog due to taking the Divine Programmer point of view.

The first result is that our ER diagrams become horrendously intricate. Our state-diagrams too. The Divine Programmer, still in naïve realtime, tries to represent everything at once. We try nesting but still can't see the picture. We try to "hide" the "complexity" in functions or "encapsulate" it in classes, but it keeps squeezing out (Exceptions, transactions that will not remain ACID, asynchronicity, distribution creating ORB complications, ambiguities in namespaces or interface repositories, versions, ...). We try to hide it at higher levels (Façade or Mediator patterns, 3-tier architectures, RM-ODP's viewpoints, ...). But any apparent orthogonal cleavage planes or logical boundaries are fragile. Integration and change sooner or later scratch and chip them, eventually to dust and ashes. (If only they would simply shatter!)

No wonder Booch, so clearly synthesizing the real experiences of our industry, came to label the predominant feature of software systems as "complexity"! No wonder industry demand has resulted in a UML.

In MACK we follow the agate and can make far more of relationships. The big problem becomes a big and real opportunity (we saw in Part 1 how it is already a big but theoretical opportunity).

The real complexity, or as real as we can get it, is at the surface, not inside. The surface is where our Realworld Equivalents or REs are. (It is merely serendipitous that "res" is the Latin for "thing" or "matter". So as to ride on that luck, we shall call abstract things "entities".) The basic REs are the familiar visible things such as documents, text, words and letters, decimal numbers and digits, graphics and pixels. More interesting are things like realtime and i/o events, stored records, bit-represented numbers, mathematical numbers, more-or-less blobs, and source or compiled files. They are the artifacts with which we construct the views that we map to our outer reality, or with whose terms or mechanisms we store or add to or correct the knowledge model or database, or with which we represent or calculate, or into which we bundle output knowledge or interpret vague input. They are the constituents of the rough outer layer of the agate. They have some real properties or behaviour that we choose not to represent in purely abstract logical form. In MACK that measure of reality is cast in "RE-methods", while in Metaset they presently consist of mere C functions or code fragments.

RE-methods reduce dynamic reality from the outer boundary into the terms of the abstract model, creating abstract events which trigger abstract operations in the semantic net of the layered zone. It is the elemental simplifying "seeing as" process in its basic sensory input form.

(As implied in that link, the basic form of a MACK-conformant program or MACK-realization such as Metaset is then of course an event loop driving a finite-state-machine, while all RE-methods execute as mere calls or inline code bound-in. That is quite normal or mainstream. Let us return to the unique features.)
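By way of a hedged illustration, here is a minimal sketch in C of that normal shape: an event loop driving a finite-state machine, with an "RE-method" bound in as a mere call that reduces raw boundary events to abstract ones. All the identifiers (RawEvent, re_method_classify, fsm_step and so on) are my ad hoc inventions for this sketch, not Metaset's actual names.

```c
#include <stdio.h>

typedef struct { int device; int payload; } RawEvent;  /* boundary reality */
typedef enum { EV_INPUT, EV_TIMER, EV_QUIT } AbstractEvent;
typedef enum { ST_IDLE, ST_BUSY, ST_DONE } State;

/* An "RE-method": reduces a raw boundary event to an abstract event. */
static AbstractEvent re_method_classify(const RawEvent *raw) {
    if (raw->device < 0)  return EV_QUIT;
    if (raw->device == 0) return EV_TIMER;
    return EV_INPUT;
}

/* The finite-state machine, operating purely in the abstract model. */
static State fsm_step(State s, AbstractEvent ev) {
    switch (s) {
    case ST_IDLE: return (ev == EV_INPUT) ? ST_BUSY : ST_IDLE;
    case ST_BUSY: return (ev == EV_TIMER) ? ST_DONE : ST_BUSY;
    default:      return ST_DONE;
    }
}

/* Stub standing in for real i/o polling. */
static int next_raw_event(RawEvent *out) {
    static const RawEvent script[] = { {1, 0}, {0, 0}, {-1, 0} };
    static unsigned i = 0;
    if (i >= sizeof script / sizeof *script) return 0;
    *out = script[i++];
    return 1;
}

int main(void) {
    State s = ST_IDLE;
    RawEvent raw;
    while (next_raw_event(&raw)) {            /* the event loop */
        AbstractEvent ev = re_method_classify(&raw);
        if (ev == EV_QUIT) break;
        s = fsm_step(s, ev);                  /* abstract operation */
        printf("state=%d\n", s);
    }
    return 0;
}
```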

That boundary/insides division represents the most basic orthogonality and the most stable dividing-line in our epistemological and knowledge-architectural views: that between reality and formality. However, we shall see how its exact positioning is variable, and under our fine and epistemologically-relevant control.

Most importantly, though, this is where the big pay-offs of automated logic in the layered zones can follow. But it is also where the most insidious mistakes are made. We shall shortly look at both.

However, it is interesting to note first why it has only relatively recently become necessary to make this big distinction. Conventional data-processing, with its punched-card origins, consisted predominantly of the kinds of data and manipulations that are here classed as part of the RE boundary, that is, the REs and RE-methods. It was only later that all the integration started taking place. That would have been partly -- maybe even largely -- where Charlie Bachman's DBMS Copernican data-universe intuition came from. But as we saw in the box following, it is only with MACK's relativistic view of that universe that architecture has finally caught up.

So now we shall have the real pay-offs within the agate! Picking the rich fruits of infinite complexity via its rendering as indefinite formal interconnectedness.

At this point read about the "magic ingredient" in my 1997 paper (where the "typology" there is the "model" here). Evidently, all layer-zone programming is non-procedural, and as the market fills out the basic functionality it will become ever more plug and play, even conversational, in a lively way, interacting with the wider market without really trying, as supply and demand really get down to getting together and simplifying needs in terms of responsive and non-alienating supply. There will be no sharp dividing line between programming and using. The focus will always merely be on the activities of meeting goals that meet needs, coherently viewed in an automatically-reflective way. (The OMG's recent Task and Session CBOs really are too narrowly focused!)

All that takes place via the clearly-layered logical interior, where every abstract thing or "entity" is totally defined by its formal relationships with other entities. Only an entity which is an instance of an "RE-domain" has a real component thing or "RE-value" (or just "RE") in a strict 1-to-1 mapping with it. It is then in coherent RE terms that abstract combinations are given external meaning. Every other aspect of every entity is in the abstract or formal domain and susceptible to indefinite automatic placing in further contexts, synergetically, following strict binary-logic paths.
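A small data-structure sketch may help to fix the idea. The names (Entity, Relationship, re_value) are hypothetical, and the real structures are of course both trade-secret and richer; the point is merely the shape of the distinction:

```c
typedef struct Entity Entity;

typedef struct Relationship {
    const Entity *verb;          /* e.g. an "isInstanceOf" entity      */
    const Entity *object;        /* the related entity                 */
    struct Relationship *next;   /* entities carry many relationships  */
} Relationship;

struct Entity {
    Relationship *relationships; /* the entity's entire formal meaning */
    void *re_value;              /* null unless this entity instantiates
                                    an RE-domain, in which case exactly
                                    one real thing maps 1-to-1 to it    */
};
```

Everything formal about an entity hangs off its relationship list; the re_value pointer is the sole concession to reality, and it is present only in the strict 1-to-1 RE-domain case just described.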

Such potential, in a rich application environment such as our ever more reusing, interoperating and diverse market will increasingly become, simply cannot be overestimated. But is it not a new Charybdis? Potentially, yes, in our confusable minds, but that will not happen unduly in practice, since we shall appropriately simplify it thanks to the market's better-provable simple-product perspectives! And potentially, yes, in the logic-chained model, but not in practice, as this is where the relativistic views come in, and only presently-relevant consequences are followed through in the immediate users' realtime, while the remainder can be carried through when required in the other realtimes. (Activity and user realtime or perspective management is a logically-simple even though very refinable task, and is therefore also handled reflectively hence automatically.)

But while that will be wonderful, let us first remind ourselves of the obvious and ubiquitous downside.

Though it is all very well to "see as", ad infinitum, in creative new contexts, when we subsequently multiply the manipulations we can go terribly wrong. ("… to really foul things up takes a computer"!) Some typical IT application questions show how familiar we are with the built-in possibility of infection with some interpretive confusion: How "first grade" is that apple in the on-line store? When enquiring into a human resources database, is your notion of "can speak French" the same as that of the person who coded the data? How bad -- or good -- is this alleged "bad credit risk"?

That is the familiar world of the input or edit program, with potentially good or bad impacts in all those reports and other derived renderings. It is where the inherent fuzziness of the real world is filtered or focused out, and any eventual nonsense is presumably not due to the good intervening logic, but to the original input reality-mapping, which often, like Procrustes, distorts or mutilates the usual intangibilities of real complexity and potentially leads to great problems (of the kind that "Chaos" journalists love and technophobes exult in...).

But however the fuzziness in the interpretation is resolved or ignored, the consequences are generally quite clear: You either buy or don't buy, employ or don't employ, get credit or don't. Life becomes very binary. And we accept it or otherwise live with it. (Or else, with great justification, we decide to rebel against such an oversimplifying system and do something about it ... but such judgment is quite another kind of "matter of interpretation", higher up in the layers of the market...)

We may note, though, that once any possible "semantic mismatch" between reality and formality is cleared up, if the models are well-defined (according to the relevant rules that are also still trade-secret), then there is no further semantic drift. The agate layers are solidly crystalline (even though of the "micro" kind, and even that is looked after by further models, of the reflective kind).

So it is with every application of a good abstract model. Within its RE-applicability constraints, it has an inner inexorability which works. But caveat emptor outside those not-always-explicit constraints.

The reality-filtering or -interpretation, that is, the "RE-mapping", is not always as simple as the above examples might indicate.

REs are evidently of variable granularity: some can be composed from others. That gives us the first kind of interrelationship between REs. View-composition is carried out by RE-methods. We can also define interrelationships such as ordering (Balance isLessThan CreditLimit) or mathematical ones (Y isSineOf X) using RE-method algorithms that do indeed hide some basic realities, realities we may happily regard as real complexities we do not wish to express further in our information systems. We accept totally ignoring any further complexity that may be lurking there. But the entities themselves (X, Y, isSineOf) typically also have many abstract properties of their own. Those are all in the layered zone.
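A hedged sketch of those two examples in C (the relation entities Balance, CreditLimit, X, Y and isSineOf would live in the layered zone; only these little algorithms are RE-methods, and all names here are invented):

```c
#include <math.h>

typedef double REValue;   /* the real component of an RE-domain instance */

/* RE-method behind the abstract ordering relation "isLessThan" */
static int re_isLessThan(REValue a, REValue b) { return a < b; }

/* RE-method behind "isSineOf": hides floating-point representation,
   libm internals, and whatever further complexity lurks below */
static REValue re_sineOf(REValue x) { return sin(x); }

/* e.g. enforcing the constraint "Balance isLessThan CreditLimit" */
static int credit_ok(REValue balance, REValue creditLimit) {
    return re_isLessThan(balance, creditLimit);
}
```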

RE-methods validly hide complexity. But where there is even a possibility of interconnection, it is better to go first into the layered zone and model the detail there.

Thus RE-methods composing views should be compilable, after composition, from more logical rules in layered-zone models, in terms of application aspects such as relatednesses or relevancies (e.g. unique names are useful for visually verifying codes), and more generic aspects such as styles and widget-natures. On the agate model, once compiled and executable under appropriate circumstances, they then belong to the crystal zone, but their meaning is derived from the particular boundary and related layers which define and model the application or execution-context.
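What "compilable after composition from rules" might look like can be sketched as follows, with the rules as plain layered-zone data and the composed RE-method reduced to walking them. Every name here is hypothetical:

```c
#include <stdio.h>

typedef enum { W_LABEL, W_CODE_WITH_NAME } WidgetNature;

typedef struct {
    const char  *field;    /* application aspect: what is relevant     */
    WidgetNature widget;   /* generic aspect: how it is to be rendered */
} ViewRule;

/* e.g. the rule that a unique name accompanies a code, for visual
   verification, as mentioned above */
static const ViewRule customer_view[] = {
    { "customerCode", W_CODE_WITH_NAME },
    { "balance",      W_LABEL          },
};

/* The composed view RE-method: an executable crystal whose meaning
   derives entirely from the rules handed to it. */
static void render_view(const ViewRule *rules, size_t n) {
    for (size_t i = 0; i < n; i++)
        printf("%s as %s\n", rules[i].field,
               rules[i].widget == W_CODE_WITH_NAME ? "code+name" : "label");
}

/* usage: render_view(customer_view, 2); */
```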

Rather similarly, state-machines handling i/o and other events are almost always best kept in the layered zone, with only the thinnest or most elemental interrupt-handling kept on the rough surface (that is, logically on the rough surface, though bound-in as inner crystals). In that way MACK enables a decomposition of the total i/o function into naïve realtime physical i/o events triggering the application of operations imposing the corresponding relativistic realtime perspectives at the appropriate respective times.
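Sketched in the same hypothetical C terms, that division of labour might look like this: the rough-surface handler does nothing but timestamp and enqueue, while all interpretation is deferred to layered-zone machinery running in its own relativistic realtime.

```c
#include <time.h>

#define RING_SIZE 256

typedef struct { time_t when; int port; int data; } RawIoEvent;

static RawIoEvent io_ring[RING_SIZE];
static volatile unsigned head, tail;

/* Rough-surface code: as thin as possible, in naive realtime.
   (Single producer assumed; a real handler would be leaner still.) */
void on_interrupt(int port, int data) {
    RawIoEvent *slot = &io_ring[head % RING_SIZE];
    slot->when = time(NULL);
    slot->port = port;
    slot->data = data;
    head++;
}

/* Layered-zone side: drains the queue at the appropriate time,
   applying the state-machine interpretation passed in. */
void drain_events(void (*interpret)(const RawIoEvent *)) {
    while (tail != head) {
        interpret(&io_ring[tail % RING_SIZE]);
        tail++;
    }
}
```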

Similarly again, legacy file assimilation into MACK-compliant form, like canonical version migration, is extensively modelled before being assembled into the appropriate "user-meaning handlers" encapsulating user semantics in efficient RE-method-shaped code.

There is also much scope for refinement and elaboration around the outer boundary. For example, what are the relationships between "Balance isLessThan CreditLimit" and "X isLessThan Y"? And how do the storage aspects integrate with those abstract relationships? But now we are getting uncomfortably close to the MACK trade-secret core, so I pass on...

In Metaset there is at present no strict control over the good behaviour of the C-coded RE-methods. That can be remedied in the short term. For example, an untrusted RE-method can be run by a stub in another process, with pre- and post-conditions added automatically. In the medium term we can be sure appropriate alternative coding methods will arise, perhaps analogous to Java's sandbox. There is no great problem here, as RE-method functionality is typically highly refined by the abstraction-out of generic layered-zone functionality. For example, there is no room for any OO here, as all abstraction and refinement is handled via the agate layers: openly and reflectively-accessible.
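For example, a guard of the following shape could be generated automatically around an untrusted RE-method. This is a minimal sketch: in the separate-process variant the middle call would be marshalled to the stub, and all names here are hypothetical.

```c
#include <assert.h>
#include <string.h>

typedef int (*REMethod)(char *buf, size_t len);

static int guarded_call(REMethod untrusted, char *buf, size_t len) {
    /* Precondition: the buffer must be sane before being handed over. */
    assert(buf != NULL && len > 0);
    size_t before = strlen(buf);
    assert(before < len);

    int rc = untrusted(buf, len);  /* would be an IPC stub in practice */

    /* Postcondition: the method must not have overrun its buffer. */
    assert(strlen(buf) < len);
    (void)before;                  /* unused when NDEBUG strips asserts */
    return rc;
}
```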

However we generate the inner crystals that in run-time represent our formal RE meanings, that is, whether manually or by binding-and-code-generation, they always operate within a well-defined topological zone of the layered semantic net that is semantically-adjacent to the outer reality-boundary concerned. That is conventional "thin coupling". The art of model design, including its associated RE- and RE-method-design, is to ensure that such semantic-topological-coupling is maximally-invariant under likely model-manipulations and -transformations. That is the key to the stable reusability of RE-methods and their defining layered-zone models.

That might sound complicated and difficult, but all of it really does have plain and familiar counterparts. It requires the same skill-sets and habits that any OO designer finds natural (though not the structures or tools!). Except that with MACK the realities of constraints and facilities work out manageably, with many familiar problems naturally resolved and further opportunities offered, as we shall now see in some further examples.

But I don't think it is wise of me to try to sketch too much of these applied architectural aspects, considering that the fine-structuring techniques, and most particularly their precise "Beyond OO!" mechanisms, are still MACK trade-secret. I shall merely refer you again to the relevant section of my 1997 paper (and its link re the trade-secret aspect). (Remember once more that the "typology" of my earlier papers is here just "model".)

Also see the 1996 FAQ question 7 reply on how the model is the medium for the application of patterns. There is the important message that the MACK-compliant world will be very dependent on good pattern people for the creation of suitable reflective models of the kinds so often referred to above. It further points out how they will find the MACK environment a very gratifying one in which to hone and ply their talents. Their product-cycle-time in the groupware-assisted open market will often be measurable in hours.

Finally, it is useful to summarize by pointing out that there is no room for any Divine Programmer in such a dispensation. He occupies merely the thinnest boundary zone, while his products exist in and are managed by a strict and transparent framework of far more easily and even reflectively manageable logic representing the relativistic world in which his compositions are to operate. And that big link represents the closure of that theme and our escape from the Divine Programmer Syndrome.

(And while it would be nice to hear some applause penetrating through the fog of my own words, that is an unreasonable expectation, but at least there we have closure of my most immediate invitation to you!)

But all that is still theory (though it does all seem very obvious to my programmer's mind…). Now comes the real practice (where the proof of the pudding is becoming ever more evident, as already stated).

I shall assume for the rest of this Part 2 that all the above, together with Part 1, is at least a consistent and coherent indication that MACK:

I am asking quite a lot of you there, of course, but in the light of the trade-secret requirements, I can merely trust that the whole credibility argument in Part 1, including its link to my own IT biography, will support me enough for you to be able to regard the remainder of this Part 2 with a sufficient degree of "suspension of disbelief"…

The status of Metaset programming

The programming is all in C, done in Microsoft's Visual C++ development environment on Windows NT 4.0, but using the very minimum of the features of the development and runtime environments (in rough terms: define WIN32_LEAN_AND_MEAN and include WinSock). The APIs used, and their evident underlying entities and properties, are easily modelled canonically to facilitate transformation to different windowing systems' object models and porting in the eventual canonical bind-and-code-generate way.
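In concrete terms that minimal footprint amounts to little more than the following (the init_network wrapper is illustrative only; WinSock 1.1 was the norm on NT 4.0):

```c
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <winsock.h>

/* Illustrative: the one piece of runtime ceremony WinSock requires. */
int init_network(void) {
    WSADATA wsa;
    return WSAStartup(MAKEWORD(1, 1), &wsa) == 0;
}
```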

Metaset is at present a program (roughly an internal version 3) which represents a good portion of the inner quartz crystals of a basic "boot" or "seed" agate, though in the form of "manually-bound" RE-methods. That is, they are not all formally organized in the layered zone yet, where they will be logically situated at the outer boundary, though they are structured in anticipation of it.

At the same time, the inner models of the layered zone of the agate are being built up and stored in the integrated database, all very MACK-canonically again, and being retrieved and printed out in various interrelated views.

The design, especially after a rewrite that took place this year, is very tight yet provides for smooth, canonical, distributed-independent-supplier, market-driven growth towards all the dynamic multi-user, distributed, transformable, scalable, etc., qualities that are required for the full groupware and market-infrastructure application environment targeted, with 24x365¼ operation. The runtime efficiency is excellent so far.

All the C is structured with a view to eventual formalization as nice simple RE-methods. As a result the programming has several interesting qualities. There are no non-canonical work fields, hence no problem of hidden conflict or mismatches, and no memory leaks. In fact, all memory allocation is done by Metaset in very big quanta (themselves managed canonically) and then managed internally in a canonical way. As indicated in the "boot product" description and further details in my 1997 paper, as far as the user is concerned, the environment is fileless, with user data-management integrated in application-appropriate ways. That enables very extensive and reliable use of C indirection facilities, partly explaining the speed of processing as well as the virtually complete absence of lost pointers during the programming itself.
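The quantum style of allocation can be sketched as follows. The sizes and names are hypothetical and the real canonical management is far more refined, but the principle is visible: the host allocator is asked only for big blocks, and there are no per-object frees to leak.

```c
#include <stdlib.h>

#define QUANTUM (1u << 20)            /* ask the OS for 1 MiB at a time */

typedef struct Quantum {
    struct Quantum *next;
    size_t used;
    unsigned char bytes[QUANTUM];
} Quantum;

static Quantum *current;

static void *q_alloc(size_t n) {
    if (n > QUANTUM) return NULL;     /* oversized requests handled apart */
    if (!current || current->used + n > QUANTUM) {
        Quantum *q = malloc(sizeof *q);  /* the only malloc in the scheme */
        if (!q) return NULL;
        q->next = current;
        q->used = 0;
        current = q;
    }
    void *p = current->bytes + current->used;
    current->used += n;
    return p;
}

/* Whole quanta are released together, canonically, never piecemeal. */
static void q_release_all(void) {
    while (current) {
        Quantum *next = current->next;
        free(current);
        current = next;
    }
}
```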

The present focus is on building up the inner models mentioned, and it is anticipated that, after not much more work there, the long-anticipated self-booting nature of the MACK model will begin to manifest itself in an increasingly spectacular exponential way.

But that work is quite difficult at this stage, involving painstaking assembly of the kernel models, which are the closest thing in MACK to meta-metamodels, and the very ones that will make that kind of work easy for everyone ever after! So I am reluctant to give estimated launch dates in the continued absence of a synergetic team (of the kind of which I have had many most gratifying experiences) funded and managed by a competent and committed organization. The latter is something which at this stage I myself can neither afford nor manage without removing myself from the programming, which I am still reluctant to do. As a result, as indicated under 1995 in my IT story, I have been working in solo mode, with some signal success as related there, but the time is now ripe for some synergy, considering the increasingly clear nature of the entire design, in itself for me a highly-convincing proof of the pudding.

But before we can consider how the project might be taken further with a view to such synergy, we have to consider how MACK and the MackWeb will transform the software market.

Commercial incentives for developers in the MackWeb

The key commercial opportunities are twofold:

The MACK agate model implies an enormous scope for the latter but, relatively, a shrinking scope for the former. Why is that? The question is key, as the consequences to our industry are evidently going to be very radical.

It all derives from the relative proportions of surface to interior as size grows, as already commented. Interconnectedness starts to predominate ever more.

However, that does not mean that the surface will shrink! The surface is of course where the asset of "mastered complexity" is appropriately hidden in a conventional way in RE-methods. Quite the contrary, it too will explode as total application volume grows exponentially. I/O device drivers, multimedia handlers, mathematical techniques everywhere, standalone external processes being managed by and being fed data by MackWeb applications … all that conventional coding and commerce will still be there.

Furthermore, all that business will even be greatly facilitated by the MackWeb data-management and market infrastructures. The MackWeb itself will in effect become a distributed "application operating system" to a significant degree, much as Old Web browsers are at present tending to become (as especially Microsoft is insisting...). And e-commerce will be implicit in the burgeoning MackWeb market infrastructure with its concomitant supply-side and demand-side stimulation. So that business will be even greater.

The same "surface" argument will apply to services too, as user-contact is part of the reality-interface. Service growth too will be greatly facilitated and further transformed by the new medium.

No, the real significance is in the growth of the interior of the agate, and in its nature, because that is where so-called complexity is no longer hidden. Transparent simplicity will be pervasive.

It will all be revealed or revealable because it will be simpler to manage, see and understand, but more importantly, it is interconnection, and the prime interconnections are between parties in the market. There is even an "existential" quality to that primacy.

That is of course mere cliché from The Mainstream once more: the open market requires transparency for most rational individual behaviour and optimal overall satisfaction and efficiency.

And a more infrastructure-facilitated and automated market requires it even more. It is, after all, already common cause that component reuse requires a full-feature component marketplace.

Despite a possible surfeit of opportunities, though, the user will not be confused, thanks to better view-design and more needle-sharp marketing in general, thanks to the MackWeb's more powerful projection of The Mainstream trends towards personalized marketing, thanks to MACK's more appropriate approach to complexity, mainly thanks to its model-based relativity with less obscuring of relevant details by the fog of bad complexity-hiding! (Does that tie it up for you? I hope the coherence and closure is apparent...)

The "fraud" by software suppliers will simply fade away under natural market forces (That means you and you and you behaving as is clearly the natural MackWeb-enabled way to behave in the market).

But does that mean that software suppliers will only make and sell RE-methods (nicely packaged and presented and smoothly implementable by means of neat model contexts in the extended market infrastructure though they will be)?

Not entirely.

They will also make totally-packaged products for specific situations, where the more precise definitions of those situations will of course be determined by the market, with the help of the better application designers/programmers with their perspicacious pattern-detecting eyes.

What form would those packages take? As indicated often enough, those definitions, precisely market-managed and user-committed, are then "bound-and-code-generated" into a full MACK-conformant program, deliverable with an uncompiled integrated repository containing the remaining flexible models or perspectives within the total model for the target class of user, together with still-layered zones for the dynamic in-context demand-driven training and help.

Quiz question for the student: Given the whole exposition of this paper, what is that package best called?

Answer: An agate! Moreover, one with fewer layers but large and brilliant crystals representing all those carefully-composed bound-in models. And each one unique to its market niche. (Did you pass that test?)

Thus for example we may expect to find such agates representing any point in the spectra from hardwired application to development workbench, from thin-client to thick-client, or from appliance controllers to multi-user servers.

The gem of rich reality is not a diamond, artificially cut and polished in tiered-architectural terms, but a naturally-shaped and niche-unique agate, hugging its own users' complexities.

But is that not just more fraud of the same kind? No, because transparency still pertains, and a new and better market niche can easily and naturally evolve from the original one.

As we dig out their never-ending matrix rock of reality, we will find agates galore, as new ones are being generated all the time, thanks to ever-better facilities for growing them by leveraging that Internet of interconnecting channels.

Agates are much, much cheaper than diamonds, and ultimately far more interesting and representative of what matters to us.

Digging together

At this stage one might simplify the options in terms of three scenarios, and I list them in increasing order of my own preference. All will offer considerable service opportunities to any participant, and more of them for the early birds. The main distinguishing feature will be the asset-selling opportunities, that is, the sale of RE-methods, encapsulated reality as we know it already, nicely-packaged in the various ways indicated, but always obsolescent in a dynamic and open market.

Plan C:

I continue with my solo programming, while you either forget the whole thing or try to concoct your own New Web in the light of any of the above perspectives that might appeal to you. I expect to beat you to deliverable product, with various Metaset agates that I will be able to sell before the competition catches up. It is in the interest of the MackWeb critical mass for me to make such competition happen as soon as possible. A ballpark return for me might be one million copies at US$20 each, with MackWeb-based support. That would also prove the point of easy supportability of a tool and medium intended "to help people simplify complexity"!

The release product is intended to be open-source, in the especially meaningful MACK-canonical way, so the money would amount to a support fee combined with payment for marketplace services. Such confidence! (But is it greed? Or unrealistic? You tell me.)

The confidence is inspired -- and the greed mitigated -- by my desire to have a New Web groupware facility available for my use for the purposes that my own story here has indicated, and as soon as possible. Such use would work better if I did not have to work hard to sell my services at the same time. That way my MackWeb use and personal application could be directed where I thought it would bring optimal returns as measured in various non-financial terms.

If you beat me to the market, then my scope for asset sales is reduced but I expect to be able to sell my services on another New Web too, and use one for my own purposes as indicated.

The winner would "pull the plum out": be the frontrunner in the setting-up of the most fundamental kernel-RE-method marketplace, with its effective responsibility for the administration of the most universal of MACK-compliant Common Knowledge.

While I would be a candidate for that function in the short term (as I would for a while remain the best-qualified person for it), I am keen to withdraw from that internal function and apply myself rather to using the market medium for external purposes. The latter would also include the promotion of the medium on both the supply and demand sides of the market.

Plan B:

Some company buys Metaset as-is -- lock, stock and barrel -- including my contracted services to assist with their completion of a sellable product. Do your own sums and make an offer (But in this plan my price could well be higher, and further expenses would be involved). The same motivations as in Plan C pertain.

Plan A:

Metaset as-is becomes public property as soon as possible and is put up on the Old Web. I improve the documentation in the work-in-progress. (There is already copious documentation despite the lack of colleagues to read it. I have always used documentation as a way of clarifying my thoughts, maybe simulating stimulation by colleagues. And besides, what would happen if I were to be "run over by a bus"? (The answer to that question is that there is already a good chance that someone else would be able to take the whole project further, on the basis of the existing programs and documentation, and I plan to improve those chances in the short term...)) But I would otherwise withdraw from the programming race so that I could start concentrating on preparing for my own further application of an eventual New Web.

There would be a free-for-all in the race to put agates on the market, but I would be available to assist whoever asks me to try to help in their own efforts. (I had earlier been concerned about standards chaos and/or superseded-version legacy problems, under such many-streamed development circumstances, but I have concluded that such threats and costs are not only containable but would be more than offset by the greater resources that are typically applied to a problem in an open-market development mode.)

Most likely this Plan A would produce a New Web soonest (I have seen those young crack programmers at work, and I never was one of them), hence it is my preference.

But in order to guarantee my future independence in its application, and in the first place to be sure that my message has got across to a sufficient degree, I would insist on my work-in-progress asset being paid for, upfront. That would also ensure that I could always fall back on Plan C in the hopefully unlikely event that the market race did not function at all.

My preference would be for a lump-sum of the order of US$2M, thus, in my estimation, adequately ensuring the independence just mentioned.

Hoping, then, that my message does get across somewhere, I would envisage a combination of two motivating forces behind the marshalling of such a payment (or, alternatively, an initiation of a Plan B):

  1. Any one user or a consortium of users, impatient of the performance of the present software market, might think it worthwhile to place a small side bet on what must, if I face facts, look like an outsider in the race to a future standard software architecture.
  2. Individuals in the freeware/shareware developer community, or sympathetic to it, are probably better able to appreciate the "fraudulent" nature of the present market. They should be able to see how MACK and a MackWeb or other New Web might loosen up the market. So they might manage to motivate and mobilize the above kind of sentiment. Certainly, in a MackWeb that community would flourish in ways they do not even dream of now!

Objection! If it is that difficult to make a bit of money out of what must be enormous initial opportunities, then there must be a problem somewhere!

No. Firstly, I have long made my own objectives clear publicly, most particularly as the "Jack" persona in the "OMG finds True Love" play, but it is also apparent from my own computer story already referred to. Those motivations are urgent.

Secondly, it is most appropriate that kernel-MACK, including all models and RE-methods, as required for the minimum MackWeb groupware product that can bootstrap or seed the MACK-compliant market, should be regarded as totally public property, in an open-source way. They effectively represent all the design patterns that are already public property anyway, and -- more crucially -- the less hassle and cost involved in their widest reuse in the open market, the sooner MACK and a New Web will achieve critical mass (and it would not preclude a lot of business being done with more refined or specialized applications).

An easily and widely used groupware-based marketing medium that can really help us all simplify complexity together, and as soon as possible, remains my absolute Number One priority.

Amazingly, though explicably in terms of the fog of misconceived "complexity-hiding" in the Divine Programmer's world, such a medium still does not exist, and it could so easily exist "sooner and better", if only the Metaset/MACK project could get going on a wider front. And it could so easily do that, with your help! The skies over The Mainstream could so easily be blue.

Conclusion

Not wishing to wait any longer, or to grapple with trying to clear my own long-standing fog, I shall pursue Plan C, while hoping that some one or more individuals will take the initiative and make either Plan A or Plan B go ahead.

I shall however still welcome any feedback on any of the above, or suggestions of any alternative plans, and advice as to how I might clear that fog of my own.