This is the mail archive of the gdb-patches@sourceware.org mailing list for the GDB project.


Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]
Other format: [Raw text]

Re: [patch] PR python/10807 API for macros.


Matt Rice <ratmice@gmail.com> writes:

> In the gdb.Macro class, I used methods instead of attributes,
> because the values are lazily computed and can return different
> objects if called multiple times, so attributes seemed weird, even
> though I think the Macro methods are fairly attribute-like.  So if
> anyone has an opinion here on attributes vs. methods, please weigh in.

Thanks.

> +typedef struct
> +{
> +  PyObject_HEAD;
> +  const char *name;
> +  struct macro_source_file *src_file;
> +  int src_line;
> +} macro_object;
> +
> +static PyTypeObject macro_object_type;

We've been trying to put comments in typedef/struct definitions to
narrate each field.  Some of the older Python files do not have them,
but the newer ones do.  Just a one line comment on each field is fine.

> +static PyObject *
> +argv_to_py (struct macro_definition *macro)
> +{
> +  PyObject *ret = NULL;
> +
> +  if (macro->kind == macro_function_like)
> +    {
> +      Py_ssize_t i;
> +      PyObject *ret = PyList_New (macro->argc);
> +
> +      if (ret == NULL)
> +	return NULL;
> +
> +      for (i = 0; i < macro->argc; i++)
> +        {
> +	  PyObject *item = PyString_FromString (macro->argv[i]);
> +
> +	  if (!item)
> +	    goto err_ret;
> +
> +	  if (PyList_SetItem (ret, i, item) != 0)
> +            goto err_ret;
> +        }
> +
> +      return ret;
> +    }
> +
> +  Py_RETURN_NONE;
> +
> +  err_ret:
> +    Py_XDECREF (ret);
> +    return NULL;
> +}

Some minor nits.

Some indents look a little weird, but it might just be the mailer.  Also,
the inner `PyObject *ret' declaration shadows the outer one, so the
err_ret path operates on the outer, still-NULL pointer and leaks the
list; drop the inner declaration.  With that fixed you don't need
Py_XDECREF here, as you have already accounted for the PyList being
NULL, so plain Py_DECREF will do.

> +
> +static PyObject *
> +include_trail_to_py(struct macro_definition *macro,
> +		   const char *name,
> +		   struct macro_source_file *src_file,
> +		   int src_line)
> +{
> +  PyObject *tuple = PyTuple_New (2);
> +  PyObject *result = PyList_New (0);
> +  PyObject *tmp;
> +  struct macro_source_file *file;
> +  int line;
> +
> +  if (!tuple || !result)
> +    goto err_exit;

Even though either of these indicates a failure case from which you
cannot continue, you need to check each individual return status right
after instantiation.  This is so you return the correct exception back
to the user.

> +  file = macro_definition_location (src_file, src_line, name, &line);
> +  if (!file)
> +    goto err_exit;

I'm not sure it is right to do this.  macro_definition_location is a GDB
function that can return NULL not out of error, but because there is no
splay tree node for that definition.  Looking at the functions you call,
and those macro_definition_location calls, this can be expected and does
not signal a Python error condition.  While at this point it is right to
exit, because there is nothing more you can do, returning NULL from a
Python function raises the expectation that there is a Python error with
an accompanying exception.  If I remember correctly, Python will
complain about this.  Perhaps it would be better in this case to return
None, and to document the return strategy in the comment at the
function header.

> +
> +  tmp = PyString_FromString (file->filename);
> +  if (!tmp)
> +    goto err_exit;
> +  PyTuple_SetItem (tuple, 0, tmp);
> +
> +  tmp = PyInt_FromLong (line);
> +  if (!tmp)
> +    goto err_exit;
> +  if (PyTuple_SetItem (tuple, 1, tmp) != 0)
> +    goto err_exit;
> +  if (PyList_Append (result, tuple) != 0)
> +    goto err_exit;
> +  Py_DECREF (tuple);

> +  while (file->included_by)
> +    {
> +      tuple = PyTuple_New (2);
> +
> +      if (!tuple)
> +        goto err_exit;
> +
> +      tmp = PyString_FromString (file->included_by->filename);
> +      if (!tmp)
> +        goto err_exit;
> +      if (PyTuple_SetItem (tuple, 0, tmp) != 0)
> +        goto err_exit;
> +
> +      tmp = PyInt_FromLong (file->included_at_line);
> +      if (!tmp)
> +        goto err_exit;
> +      if (PyTuple_SetItem (tuple, 1, tmp) != 0)
> +        goto err_exit;
> +
> +      if (PyList_Append (result, tuple) != 0)
> +	goto err_exit;
> +      Py_DECREF (tuple);
> +
> +      file = file->included_by;
> +    }
> +
> +  return result;
> +
> +  err_exit:
> +    Py_XDECREF (tuple);
> +    Py_DECREF (result);

Either of these could be NULL, so you need to use XDECREF in both cases.

> +    return NULL;
> +}
> +
> +/* Create a new macro (gdb.Macro) object that encapsulates the
> +   macro_definition structure from GDB.  */
> +PyObject *
> +macropy_object_create (struct macro_definition *macro,
> +		       const char *name,
> +		       struct macro_source_file *ms,
> +		       int line)

I normally try to document parameter usage here, and in other publicly
facing functions, too.

> +{
> +  macro_object *macro_obj;
> +
> +  macro_obj = PyObject_New (macro_object, &macro_object_type);
> +  if (macro_obj)
> +    {
> +      /* Save enough to lookup the macro again in the methods.
> +	 Then do a lookup and lazily copy things into PyObjects.
> +         Consecutive lookups should be OK, because of the splay tree.
> +         We cannot save a macro definition due to inconsistent memory
> +	 management.  We rely on the fact that the macro_source_file
> +         is not released until exit.
> +
> +	 It seems we may need to move to using a 'macro_scope'
> +	 if we want a python api for user-defined macros. */


What do you mean by inconsistent memory management?  Can you expand on
this?  Do macro definitions have a life-cycle in GDB?

> +static PyObject *
> +macropy_str (PyObject *self)
> +{
> +  PyObject *result = PyString_FromString ("<gdb.Macro ");
> +  macro_object *me = (macro_object *) self;
> +  struct macro_definition *macro;
> +  PyObject *argv = macropy_argv_method (self, Py_None);
> +  PyObject *definition = macropy_definition_method (self, Py_None);
> +  PyObject *include_trail = macropy_include_trail_method (self, Py_None);
> +  PyObject *is_function_like = macropy_is_function_like_method (self, Py_None);
> +  PyObject *name = PyString_FromString (me->name);
> +
> +  if (!definition || !is_function_like || !argv
> +      || !include_trail || !name || !result)
> +    goto err_ret;


Because all of these are created in the same local block, which
exception will be returned for which failure?  If RESULT fails and then
ARGV also fails, I think the exception set for RESULT will be
overwritten.  I know it is a pain, but I think you need to check each
one individually.


> +static int
> +macropy_compare (PyObject *self, PyObject *o2)
> +{
> +  PyObject *my_str = macropy_str (self);
> +  int result;
> +
> +  if (!my_str)
> +    return -1;
> +
> +  if (PyObject_TypeCheck (o2, &macro_object_type))
> +    {
> +      PyObject *other_str = macropy_str (o2);
> +
> +      if (other_str)
> +        result = PyObject_Compare (my_str, other_str);

You need to cope with errors here.  Use PyErr_Occurred() or
gdbpy_print_stack.


> +      else
> +	result = -1;
> +
> +      Py_DECREF (my_str);
> +      Py_XDECREF (other_str);
> +      return result;
> +    }
> +
> +  result = PyObject_Compare (my_str, o2);
> +

Same as above.

> +  Py_DECREF (my_str);
> +  return result;
> +}
> +
> +static long
> +macropy_hash(PyObject *o)
> +{
> +  long result;
> +  PyObject *my_str = macropy_str (o);
> +
> +  if (!my_str)
> +    return -1;
> +
> +  result = PyObject_Hash (my_str);

Same as above, re failure.  Why do you need a custom hash function?



> +  PyObject *tmp;
> +
> +  if (! PyObject_TypeCheck (mud->list, &PyList_Type))
> +    goto err_exit;

These comments are somewhat complex, and really lead into the
three-paragraph comment below, so take them all as one big comment.

Normally, a function that does not return a Python object has to deal
with stack printing and error detection internally.  Because this is not
a Python function (i.e., one where returning NULL signifies an error),
you have to check and deal with the error here; this is done with
gdbpy_print_stack.  However ....

> +  if (mud->errors != 0)
> +    goto err_exit;
> +
> +  tmp = macropy_object_create (definition, name, source, line);
> +  if (tmp)
> +    {
> +      if (PyList_Append (mud->list, tmp) != 0)
> +	goto err_exit;

...

> +static PyObject *
> +stpy_macros (PyObject *self, PyObject *args)
> +{
> +  struct symtab *st = symtab_object_to_symtab (self);
> +  struct macro_user_data mud;
> +  PyObject *result;
> +
> +  STPY_REQUIRE_VALID (self, st);
> +
> +  result = PyList_New (0);
> +  if (result == NULL)
> +    return NULL;
> +
> +  mud.list = result;
> +  mud.errors = 0;
> +  macro_for_each (st->macro_table, add_macro_to_list, &mud);
> +
> +  if (mud.errors != 0)
> +    {
> +      Py_DECREF (result);
> +      return NULL;

I'm really puzzled what to do here.  I'm assuming the iterator won't
stop because the helper function has encountered errors (at least, the
function-pointer prototype has a void return).  Because macro_for_each
is a GDB iterator function that calls (through a function pointer and
other GDB helpers) add_macro_to_list, which itself calls Python
functions, each iteration can raise a Python error.  If we print and
cope with the error for each iteration in the helper, as suggested
above, the user will see each exception and no exception data is
overwritten.  Also there seems (to me, at least) to be no way to abort
the iterator early.

OTOH we want to make sure that this function returns correctly,
according to how Python functions should.  So we SHOULD return NULL here
if there were errors, but if we do what I suggested above, every
exception will already have been printed and cleared, so returning NULL
here will cause Python to complain.  But I also don't want previous
iterations' exceptions overwritten.  Maybe your way is right, in that we
only report the last iteration's exception.  Or maybe we should
construct our own exception and return that.  I do not know.  I'd
appreciate comments here from the maintainers.


> --- /dev/null
> +++ b/gdb/testsuite/gdb.python/py-macro.c
> @@ -0,0 +1,72 @@
> +#ifdef DEF_MACROS

Needs a copyright/license header.


> +#
> +# You should have received a copy of the GNU General Public License
> +# along with this program.  If not, see <http://www.gnu.org/licenses/>.
> +
> +# This file is part of the GDB testsuite.  It tests the mechanism
> +# exposing macros to Python.
> +
> +if $tracelevel then {
> +    strace $tracelevel
> +}
> +
> +load_lib gdb-python.exp
> +
> +set testfile "py-macro"
> +set srcfile ${testfile}.c
> +set binfile ${objdir}/${subdir}/${testfile}
> +get_compiler_info ${binfile}
> +if [test_compiler_info gcc*] {
> +    lappend options additional_flags=-g3
> +} else {
> +  untested ${testfile}.exp
> +  return -1
> +}
> +
> +if { [prepare_for_testing ${testfile}.exp ${testfile} ${srcfile} $options] } {
> +    untested ${testfile}.exp
> +    return -1
> +}
> +
> +# Start with a fresh gdb.
> +gdb_exit
> +gdb_start
> +gdb_reinitialize_dir $srcdir/$subdir
> +gdb_load ${binfile}
> +
> +# Skip all tests if Python scripting is not enabled.
> +if { [skip_python_tests] } { continue }

This is something we should do for all Python tests, so I am not
singling this one out.  But we should probably perform this check
earlier in the file.

Thanks for writing this patch.  I am really looking forward to it being
checked in when it is approved!

Cheers,

Phil

