next up previous contents
Next: 1. A standard basis Up: Standard bases, syzygies and Previous: Contents

Introduction

The aim of this article is to describe recent advances and improvements on the tangent cone algorithm of T. Mora. The tangent cone algorithm is itself a variant of B. Buchberger's celebrated algorithm for constructing a Gröbner basis of an ideal in a polynomial ring over a field. In the same manner as the knowledge of a Gröbner basis allows the computation of numerous invariants of the coordinate ring of a projective algebraic variety, a standard basis (computed by the tangent cone algorithm) does so for invariants of the local ring of an algebraic variety at a given point. In this paper we describe a generalization which includes Buchberger's and Mora's algorithms as special cases. That is, we prove -- with an appropriate definition of ecart -- that Mora's algorithm terminates for any ordering on the monomials of $K[x_1, \ldots, x_n]$ which is compatible with the natural semigroup structure (a fact found independently by Gräbe [G]); in particular, the variables may have negative, positive or zero weights (cf. §1). This generalization has already had mathematical applications, for example to deformations of projective varieties and to the theory of moduli spaces for singularities, which will be published independently. Moreover, more or less all algorithms using Gröbner bases (such as the computation of syzygies, ideal theoretic operations, etc.) are now available in this general context. Our generalization also provides an easy way to implement standard bases for modules over the Weyl algebra (cf. §6).
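For the reader's convenience, the two notions just used may be sketched as follows (a summary of the standard definitions only; the precise versions are given in §1). A semigroup ordering is a total ordering on the monomials which is compatible with the semigroup structure, and the ecart of a polynomial measures how far it is from being (weighted) homogeneous:
\[
  x^\alpha > x^\beta \;\Longrightarrow\; x^{\alpha+\gamma} > x^{\beta+\gamma}
  \quad \text{for all } \gamma \in {\Bbb N}^n ,
\]
\[
  \mathrm{ecart}_w(f) \;=\; \deg_w(f) \;-\; \deg_w(\mathrm{LM}(f)),
\]
where $\mathrm{LM}(f)$ denotes the leading monomial of $f$ and $w$ a weight vector; this is Mora's definition in the local case, while the appropriate general definition (via homogenization) is the subject of §1.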

The general standard basis algorithm is described in §1, and its implementation in the computer algebra system SINGULAR, which has been developed by the authors over the last few years, in §2. The need for an efficient implementation of the tangent cone algorithm arose from applications in the theory of singularities. First of all there was the success in disproving a conjectured generalization of a theorem of K. Saito (cf. [PS]). Later, the attempt to attack Zariski's multiplicity question was one of the most stimulating points. Indeed, none of the existing computer algebra systems was able to compute a series of possible counterexamples. Therefore, we had to invent special strategies, in particular for zero-dimensional ideals in the local case.

These strategies and their implementation in SINGULAR are described in §2. The most relevant new strategies are the HCtest and the ecartMethod. The HCtest has a dramatic impact on the efficiency of the algorithm for zero-dimensional ideals in the local case. It was first found by Pfister and Schönemann and has been implemented in a forerunner of SINGULAR since 1985. It consists of computing the minimal monomial not contained in the initial ideal (the "highest corner") and discarding all smaller monomials in further computations. The Pfister-Schönemann HCtest also proved fairly successful in an early implementation of the tangent cone algorithm by Zimnol [Zi] (which was based on ideas of H. Brakhage).

The ecartMethod consists of the choice of a weight vector of positive integers such that the weighted ecarts of the input polynomials, with respect to this vector, become as small as possible. Of course, this method is most useful if the system offers an automatic choice of the ecart vector. In SINGULAR the automatic choice is realized by optimizing a certain functional, but the user also has the option to choose his own ecart vectors. The ecartMethod and the functional will be described in more detail in [Po], together with the weighted sugarMethod and another method to compute weights for the variables, which applies in the classical Buchberger case.
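The underlying idea can be sketched as follows (SINGULAR's automatic choice optimizes a certain functional, cf. [Po]; the brute-force search below is only an illustration). Polynomials are represented as lists of exponent vectors, and we take the weighted ecart of a polynomial to be the spread of the weighted degrees of its monomials, as in the local case:

```python
# Toy sketch of the ecartMethod: search small positive integer weight
# vectors and pick one minimizing the sum of weighted ecarts of the
# input polynomials.  The search range max_w is a hypothetical bound.
from itertools import product

def w_deg(monomial, w):
    """Weighted degree of an exponent vector."""
    return sum(e * wi for e, wi in zip(monomial, w))

def w_ecart(monomials, w):
    """Spread of weighted degrees of the monomials of one polynomial."""
    degs = [w_deg(m, w) for m in monomials]
    return max(degs) - min(degs)

def choose_ecart_vector(polys, nvars, max_w=3):
    """Weight vector in {1,...,max_w}^nvars minimizing the total ecart."""
    return min(product(range(1, max_w + 1), repeat=nvars),
               key=lambda w: sum(w_ecart(p, w) for p in polys))

# For the cusp x^2 + y^3 the weights (3, 2) make the polynomial
# quasi-homogeneous, so its weighted ecart drops to 0.
print(choose_ecart_vector([[(2, 0), (0, 3)]], 2))  # -> (3, 2)
```

Small ecarts mean the input is close to (weighted) homogeneous, which keeps the number of reduction steps in Mora's algorithm small.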

Whilst the HCtest applies only to zero-dimensional ideals, the ecartMethod works in any case and is usually superior to other strategies. We also noticed that computing a standard basis with respect to the lexicographical ordering is extremely fast in the local case. However, the meaning of this ordering is less clear (it is not an elimination ordering in this case). Nevertheless, it can be used to compute a lexicographical standard basis first and then to recompute from it a standard basis with respect to the desired ordering. In many cases this is surprisingly fast. §3 contains a comparison of these strategies, giving timings for many examples.

In §4 we prove that Schreyer's method to compute syzygies generalizes to arbitrary semigroup orderings. It seems to be the first algorithmic proof of the fact that the length of a free resolution is bounded by the number of variables which actually occur in the equations (and not by the total number of variables of the ring) in the local and mixed local-global case. It basically follows Schreyer's original proof [S1] but contains some new ideas, since Macaulay's lemma, which is usually applied, does not hold for orderings which are not well-orderings. We describe the implementation in SINGULAR, which is the first implementation even in the classical Buchberger case. We believe that this algorithm, and further improvements, will have theoretical as well as practical advantages in many cases of interest, in particular in cases which are far from a complete intersection. On the other hand, it does not give a minimal resolution, and computing a minimal resolution step by step is sometimes more efficient. We describe some essentials for an efficient implementation and include a table of experiments comparing different strategies.
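Recall the ordering at the heart of Schreyer's method (a standard formulation; the point of §4 is that it remains valid for an arbitrary semigroup ordering $>$): if $g_1, \ldots, g_k$ form a standard basis, the syzygies are computed with respect to the induced ordering on the free module $F = \bigoplus_i R e_i$ defined by
\[
  x^\alpha e_i \;>\; x^\beta e_j
  \;:\Longleftrightarrow\;
  x^\alpha \, \mathrm{LM}(g_i) > x^\beta \, \mathrm{LM}(g_j),
  \ \text{or}\
  x^\alpha \, \mathrm{LM}(g_i) = x^\beta \, \mathrm{LM}(g_j)
  \ \text{and}\ i < j .
\]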

§5 contains a partial positive answer to Zariski's multiplicity conjecture, or question. This question concerns the topological nature of the algebraically defined multiplicity of a hypersurface singularity and is certainly among the most outstanding open problems in singularity theory. Although there are partial positive answers, e.g. by Zariski, Lê, Lipman, Laufer, O'Shea, Yau and the second named author, it has basically resisted all attacks. Our partial result, which supports the conjecture, was prompted by computer experiments with SINGULAR in order to find a counterexample. The proof (given in §5) does not use any computer computation, but the computer experiments were essential in guessing the result. In the last chapter we include, for completeness, a proof that the module of leading terms (with respect to any semigroup ordering) is a flat specialization of the original module. This is the basis of most applications, e.g. for computing Milnor numbers, multiplicities and Hilbert functions of singularities.

SINGULAR is implemented in C and C++. A test version is available via ftp from saturn.mathematik.uni-kl.de. The ground fields for standard basis computations are ${\Bbb Z}/p{\Bbb Z}$ ($p$ a small prime number $< 2^{16}$), ${\Bbb Q}$, ${\Bbb Q}[Z]/(f)$, $({\Bbb Z}/p{\Bbb Z})[Z]/(f)$ ($f$ irreducible), ${\Bbb Q}(Z, Y, X, \ldots)$ and $({\Bbb Z}/p{\Bbb Z})(Z, Y, X, \ldots)$ (finitely many parameters). Of course, computations in algebraic or transcendental extensions are very time consuming. SINGULAR contains, besides the usual functions such as Krull dimension, Hilbert function and ideal theoretic operations, several libraries with useful procedures (such as the degree of projective varieties, Milnor and Tjurina numbers for isolated complete intersection singularities, T1 and T2 for affine varieties and isolated singularities, etc.). Finally, there is a quite capable interface with online help and comfortable programming facilities.

