
Re: GSL, VSIPL, and the ultimate numerical library


"Robert W. Brewer" wrote:
> 
> It sounds like that is what Gerard was getting at with
> the GSE: an interactive way to simulate, test, and debug an
> algorithm at a very high level, together with additional analysis
> and graphing tools.  And then a migration path to deployment
> on a target, which would be further and further optimized in
> various ways for the specific application.  That would
> be very nice, but it is also very lofty.

As long as we're shooting for the moon, I would like to add
the need for component deployment as well as development.
Not only should people be able to develop "in-house"
applications with a degree of flexibility, but they should
also be able to create components targeted to their problem
domain, for their own re-use and for others' as well.
Scientific computing tends to be "I have an
idea for a CFD solver (or whatever), let me go ahead
and code a giant black-box monolithic application,
reinventing several wheels along the way, creating
an inflexible tool. Maybe I will be able to solve
a few problems with it before it has to be thrown
away." Rather, it could be "Let me develop my solver
as a component to be dropped into a reusable framework,
developed over time by other experts in this and related
fields, and when I shelve the project I won't have to
throw away all the developed infrastructure."
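
To make that concrete (a purely hypothetical sketch, in the
spirit of GSL's own function-pointer "type" objects such as
gsl_rng_type; none of the names below actually exist), a
solver written against a small interface like the following
could be dropped into someone else's driver or test harness
instead of dragging its whole application along with it:

    #include <stddef.h>

    /* Hypothetical plug-in interface for a field solver.  The
       framework owns the driver loop, I/O and diagnostics; the
       expert contributes only this table and its private state. */
    typedef struct {
      const char *name;
      void *(*alloc)   (size_t npoints);          /* create private state  */
      int   (*step)    (void *state, double dt);  /* advance one time step */
      int   (*observe) (const void *state,
                        double *out);             /* export diagnostics    */
      void  (*free)    (void *state);             /* release private state */
    } solver_type;

    /* A CFD solver (or whatever) is then one instance,
       e.g.  extern const solver_type my_cfd_solver;
       which a generic driver can select by name at run time. */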

This is not appropriate for every project, but I see
a number of "research/production" codes which are
doomed from the start by their monolithic nature.
The wasted effort can easily dominate the project.
Maybe people would change if they were
given significantly better tools. Maybe.

> Currently the way
> I would do that, and the way I've mainly seen that done is
> like this:
> ... [outline deleted]
>

Even that outline is often a dream. In high performance
computing I have seen an outline more like:

  - spend years writing giant F90 code, reinventing all aspects,
    including data structures, communication methodology,
    file formats, etc., often at the lowest conceivable
    level of abstraction
  - have a single developer hack at the code until it is
    comprehensible only to him
  - solve a few problems, then hack some more, introducing
    "options" which were found necessary to get good results
    on certain types of problems; make the changes as non-dynamic
    as possible by introducing them into the build process (see
    the sketch after this outline), so the code is no longer a
    single platform for problem solving, but a confusing
    multi-headed gorgon
  - insist that refactoring or re-engineering is not a
    viable option because further investment in "development"
    cannot be justified (it would probably be impossible anyway)
  - repeat until dead
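
To illustrate the build-process "options" problem with a
hypothetical fragment (the limiter names and the flag here are
invented for illustration): once a variant is chosen with a
preprocessor flag, each delivered binary answers only one kind
of question, whereas a run-time selection keeps a single code
path that can be tested and compared as one tool.

    #include <math.h>
    #include <string.h>

    /* Build-time head of the gorgon: the choice is frozen when the
       code is compiled, so each binary solves one class of problem. */
    #ifdef USE_MINMOD_LIMITER
    static double limiter (double r) { return fmax (0.0, fmin (1.0, r)); }
    #else
    static double limiter (double r) { return (r + fabs (r)) / (1.0 + fabs (r)); }
    #endif

    /* Run-time alternative: the same binary carries both variants and
       the problem setup picks one, so the code stays a single platform. */
    typedef double (*limiter_fn) (double);

    static double minmod_limiter  (double r) { return fmax (0.0, fmin (1.0, r)); }
    static double vanleer_limiter (double r) { return (r + fabs (r)) / (1.0 + fabs (r)); }

    static limiter_fn choose_limiter (const char *name)
    {
      return (strcmp (name, "minmod") == 0) ? minmod_limiter : vanleer_limiter;
    }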

At some level, the problem comes from a lack of any
viable tools or models for prototyping and component
deployment, especially on massively parallel platforms
where nothing ever works right and the system changes
on a monthly or weekly basis.

I hope we're not just doomed.


> Now certainly a library or a GSE is no substitute for
> discipline and keeping the analysis and design documentation
> and models updated.  But I think it can help, especially
> when the library has high-level enough functionality that
> the final code is forced to be somewhat self-documenting.

Yes. Systems should be natural, and their naturalness
should flow directly from the design process.
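
For example (a minimal illustration using GSL's adaptive
quadrature interface), a call like gsl_integration_qags already
reads as a statement of the numerical problem being solved,
where an equivalent hand-rolled adaptive loop would bury that
intent:

    #include <stdio.h>
    #include <math.h>
    #include <gsl/gsl_integration.h>

    static double f (double x, void *params)
    {
      (void) params;
      return log (x) / sqrt (x);  /* integrable singularity at x = 0 */
    }

    int main (void)
    {
      gsl_integration_workspace *w = gsl_integration_workspace_alloc (1000);
      gsl_function F = { &f, NULL };
      double result, abserr;

      /* the call itself documents the task: adaptive integration with
         singularity handling on [0,1] to a 1e-7 relative tolerance */
      gsl_integration_qags (&F, 0.0, 1.0, 0.0, 1e-7, 1000, w, &result, &abserr);

      printf ("result = %.10f +/- %.1e\n", result, abserr);
      gsl_integration_workspace_free (w);
      return 0;
    }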

Maybe we are doomed after all.
Oh dear.


-- 
G. Jungman
