Saturday, June 19, 2021

The NEF Header

NEF (NMR Exchange Format1) files have headers (one per file) that record which program wrote the file and its history. However, there are a few things that are not clear.

Here’s the header

save_nef_nmr_meta_data
   _nef_nmr_meta_data.sf_category      nef_nmr_meta_data
   _nef_nmr_meta_data.sf_framecode     nef_nmr_meta_data
   _nef_nmr_meta_data.format_name      nmr_exchange_format
   _nef_nmr_meta_data.format_version   1.1
   _nef_nmr_meta_data.program_name     NEFPipelines
   _nef_nmr_meta_data.program_version  0.0.1
   _nef_nmr_meta_data.creation_date    2021-06-19T17:36:39.073848
   _nef_nmr_meta_data.uuid             NEFPipelines-2021-06-19T17:36:39.073848-9006508160

   loop_
      _nef_run_history.run_number
      _nef_run_history.program_name
      _nef_run_history.program_version
      _nef_run_history.script_name

     1   NEFPipelines   0.0.1   header.py    

   stop_

save_

First, please note the entries sf_category and sf_framecode: these are mandatory for the frame to be recognised.

The first field whose format isn’t clear is _nef_nmr_meta_data.creation_date. However, this appears to be an isoformat date time, and the most reasonable interpretation is that it is a UTC2 date time: there is no time zone information, and UTC is unique worldwide. The simple way to make this in python is

from datetime import datetime
utc_date_time = datetime.utcnow().isoformat()

The second question is: what is the _nef_nmr_meta_data.uuid tag? This is a UUID3 which uniquely identifies this version of the file apart from any other4. It has the form NEFPipelines-2021-06-19T17:36:39.073848-9006508160. The first part is obvious: it’s our program’s name, and the second part is the current time. But what’s the third part, 9006508160? Well, it’s just a 10 digit random number to ensure that the uuid is unique (think of creating the file at the same time on multiple threads… without the random number they would all have the same Universally Unique Identifier!).

from random import randint
from datetime import datetime

utc_date_time = datetime.utcnow().isoformat()
random_value = ''.join(["{}".format(randint(0, 9)) for num in range(10)])
uuid = f'NEFPipelines-{utc_date_time}-{random_value}'

Finally there is the loop

   loop_
      _nef_run_history.run_number
      _nef_run_history.program_name
      _nef_run_history.program_version
      _nef_run_history.script_name

     1   NEFPipelines   0.0.1   header.py

   stop_

This just lists the programs that have edited the file in order… latest to oldest.

So a complete header would be

data_new

save_nef_nmr_meta_data
   _nef_nmr_meta_data.sf_category      nef_nmr_meta_data
   _nef_nmr_meta_data.sf_framecode     nef_nmr_meta_data
   _nef_nmr_meta_data.format_name      nmr_exchange_format
   _nef_nmr_meta_data.format_version   1.1
   _nef_nmr_meta_data.program_name     NEFPipelines
   _nef_nmr_meta_data.program_version  0.0.1
   _nef_nmr_meta_data.creation_date    2021-06-19T17:36:39.073848
   _nef_nmr_meta_data.uuid             NEFPipelines-2021-06-19T17:36:39.073848-9006508160

   loop_
      _nef_run_history.run_number
      _nef_run_history.program_name
      _nef_run_history.program_version
      _nef_run_history.script_name

     1   NEFPipelines   0.0.1   header.py

   stop_
save_
...

Oh and one last thing: this header should be renewed each time a program reads and modifies the file, so a new program name, date, and uuid, plus an extra line in the _nef_run_history loop.
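The renewal rule above can be sketched in python (a minimal illustration, not NEFPipelines’ actual code; the function name and the tuple layout of the history rows are my assumptions):

```python
from datetime import datetime
from random import randint

# Hypothetical sketch of renewing the header on modification: write a fresh
# creation_date and uuid and append a row to the _nef_run_history loop.
# (The function name and history-row layout are illustrative, not a real API.)
def renew_header(history, program_name, program_version, script_name):
    creation_date = datetime.utcnow().isoformat()
    random_value = ''.join(str(randint(0, 9)) for _ in range(10))
    uuid = f'{program_name}-{creation_date}-{random_value}'
    next_run = max((row[0] for row in history), default=0) + 1
    new_history = history + [(next_run, program_name, program_version, script_name)]
    return creation_date, uuid, new_history
```

The caller would then write the three returned values back into the nef_nmr_meta_data frame.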


  1. NMR Exchange Format: a unified and open standard for the representation of NMR restraint data. ↩︎

  2. Coordinated Universal Time: the world’s primary time standard and the effective successor of Greenwich Mean Time (GMT). ↩︎

  3. UUIDs are Universally Unique Identifiers ↩︎

  4. you could use a hash, but then you would need to know what the hash of the file is before the file is complete, and adding the hash to the file would change the value of the file’s hash… ↩︎

Saturday, October 24, 2020

Generative Programming with templates in ARIA


generative
/ˈdʒɛn(ə)rətɪv/
adjective
  1. relating to or capable of production or reproduction.

programming
/ˈprəʊɡramɪŋ/
noun
  1. the process or activity of writing computer programs.

template
/ˈtɛmpleɪt,ˈtɛmplət/
noun
  1. a shaped piece of rigid material used as a pattern for processes such as cutting out, shaping, or drilling.


From two of these definitions we could say a generative program is a program that literally gives birth to another program. So generative programming, at its simplest, is using one computer program to write another program, which you then run. Though this might sound complicated and esoteric, it is quite often a useful technique that allows us to control other programs, decouple one program from another and run complex processes. In fact most programmers should be aware of the process, as it is fundamental to how programs are built. For example, when we write a program in a compiled programming language we use another program, the compiler, to transform text into another file, the executable, which we then run. In the case of python, which is both compiled and interpreted, the interpreter converts a python file <file>.py into a byte code file <file>.pyc which is then read and executed by the python interpreter. This avoids having to recompile each time the script is run if there is no change in the text of the .py file.
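As a concrete illustration of this .py to .pyc step, the standard library’s py_compile module lets us perform the compilation explicitly (a minimal sketch; the file names here are made up):

```python
import os
import py_compile
import tempfile

# write a tiny throwaway script...
tmp_dir = tempfile.mkdtemp()
source_path = os.path.join(tmp_dir, 'hello.py')
with open(source_path, 'w') as file_handle:
    file_handle.write('print("hello")\n')

# ...then compile it to byte code, just as the interpreter does on import;
# py_compile.compile returns the path of the generated .pyc file
byte_code_path = py_compile.compile(source_path)
print(byte_code_path)
```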

The picture below shows the general case for two python interpreters. Python interpreter A takes a template and uses it, along with some programming logic in the script generating_script.py, to produce generated_script.py. The script generated_script.py is then run by interpreter B to produce some output.





So how would we do this in a simple python program, in this case one that produces a csh script? The easiest way is with strings, dictionaries and templates. Consider this python program.

Python-[template.py]:------------------------------------------------

#!/bin/python

template = '''

    #!/bin/csh


    echo the template parameter a was %(a)s

'''

parameters = {'a' : 'wibble'}

text = template % parameters

with open ('my_script.csh','w') as file_handle:

    file_handle.write(text)

---------------------------------------------------------:EndPython

Here we create a string template in a variable called template and a dictionary of replacement text to go into the string template, called parameters. We then combine the two using the % operator, which replaces %(a)s in the template with the value stored under the key "a" in the dictionary parameters (the value "wibble"), giving us the final text, which is then written to my_script.csh.

Csh-[my_script.csh]:-------------------------------------------------

    #!/bin/csh


    echo the template parameter a was wibble

--------------------------------------------------------------:EndCsh

Now if we run my_script.csh by typing

csh my_script.csh

we get the following printed on the terminal

the template parameter a was wibble

Of course the generated program can be quite flexible; for example we could write a program that reads parameters from a file or the command line and does different things depending on what is in them, or even more esoteric things such as the status of a device or web page. Consider the following more complicated case from the real world: a toy implementation of the process used to generate a psf file (protein structure file) in the program ARIA, which generates protein structures from NMR data. In this case we will consider what would happen if ARIA used xplor-nih as its structure generation program (currently it uses CNS or Yasara).





Multiple steps occur (note we are using a simplified version of how things work to make it clearer)

First ARIA reads a project file that kicks the whole process off. Then

1. ARIA writes a template pdb file which contains sequence data but no usable coordinates (all coordinates are 0.000 0.000 0.000). This is provided in this example in test_templates/data/sequence/hrdc.pdb

2. Aria writes run.json; this file contains parameters for all fixed data used by ARIA during multiple stages of calculation (structure generation and multiple rounds of structure calculation). An example would be something like

Json-[run.json]:-----------------------------------------------------

{
    "data": {
        "pdb_or_sequence": "PDB",
        "initial_pdb": "/Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates/data/sequence/hrdc.pdb",
        "initial_seq": "/Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates/data/hrdc.seq",

        "filenames": {
            "project_root": "/Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates",
            "xplor_root": "/Users/gst9/programs/xplor-nih/2.51",
            "file_root": "hrdc"
        }
    }
}

-------------------------------------------------------------:EndJson

Note this contains all the information about where files can be found and where results should be put.

3. Aria generates a driver csh script that's going to run the structure generation engine xplor-nih, using the python script make_generate_template.py

Python-[make_generate_template.py]:----------------------------------

import json


# 1.a assume generate_template is the working directory
with open('run.json') as fh:
    json_data = json.load(fh)


project_root = json_data['data']['filenames']['project_root']
tmp_dir = project_root + '/tmp'
output_dir = tmp_dir + '/generate_template'

py_xplor = json_data['data']['filenames']['xplor_root'] + '/' + 'bin/pyXplor'

data = {
    "py_xplor" : py_xplor,
    "tmp_dir" : tmp_dir,
    "project_root" : project_root,
}


template = """
# SGE facility
#$ -N generate_template
#$ -S /bin/csh

## results will be stored here
setenv NEWIT ./

## project path
setenv RUN %(project_root)s

## individual run.cns is stored here
setenv RUN_CNS %(tmp_dir)s

## CNS working directory
cd ${RUN_CNS}/generate_template

## solves some NFS sync problems
cat %(project_root)s/protocols/generate_template.py > /dev/null

## command line
%(py_xplor)s %(project_root)s/protocols/generate_template.py >! generate_template.out

touch done
""" % data

with open('generate_template.csh', 'w') as out_file:
    out_file.write(template)

-----------------------------------------------------------:EndPython

Again in our case we are going to pull parameters from run.json, though in reality the data is pulled from ARIA's internal data structures (effectively the file run1.xml and data distributed with the program, such as forcefields). The script is parameterised by

  1. the location of the xplor-nih distribution on the users computer (xplor_root)
  2. the location of the project directory (project_root)
  3. the root name for the files produced by the project (file_root)
All other required names and paths are then derived from these.
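For illustration, deriving the rest of the paths from those three parameters might look like this (a sketch following the run.json example above; the helper name is made up):

```python
# Hypothetical helper: derive the remaining paths used by the pipeline from
# the three parameters held in run.json (layout follows the example above).
def derive_paths(project_root, xplor_root, file_root):
    return {
        'py_xplor': xplor_root + '/bin/pyXplor',
        'tmp_dir': project_root + '/tmp',
        'work_dir': project_root + '/tmp/generate_template',
        'psf_file': project_root + '/xplor/begin/' + file_root + '.psf',
    }
```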

4. Aria runs generate_template.csh, which changes to the directory <project_root>/tmp/generate_template, finds the xplor-nih instance on the computer from <data.filenames.xplor_root>, and runs generate_template.py (which now sees os.getcwd() as <project_root>/tmp/generate_template because generate_template.csh has changed to this directory using cd).

Csh-[generate_template.csh]:-----------------------------------------

# SGE facility
#$ -N generate_template
#$ -S /bin/csh

## results will be stored here
setenv NEWIT ./

## project path
setenv RUN /Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates

## individual run.cns is stored here
setenv RUN_CNS /Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates/tmp

## CNS working directory
cd ${RUN_CNS}/generate_template

## solves some NFS sync problems
cat /Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates/protocols/generate_template.py > /dev/null

## command line
/Users/gst9/programs/xplor-nih/2.51/bin/pyXplor /Users/gst9/Dropbox/git/ariaxc/ariaxc/tests/test_templates/protocols/generate_template.py >! generate_template.out

touch done

--------------------------------------------------------------:EndCsh

5. protocols/generate_template.py is run; it reads run.json and makes the following decisions on how to run based on the data read. It knows where to find run.json because the file is in the working directory retrieved using os.getcwd() (remember the working directory was set by generate_template.csh and is inherited from this parent process).

  1. to use a pdb file or a sequence file for structure generation (<data.pdb_or_sequence> in the json file)
  2. where the initial pdb or sequence file is stored  (<data.initial_pdb> in the json file)
  3. where the final data should be stored (<data.tmp> in the json file)
  4. what the root of the output file should be (<data.file_root> in the json file)
  5. where to find the pyXplor program (using <data.xplor_root> in the json file)
generate_template.py creates the psf file by running pdb2psf or seq2psf using the input files defined by run.json and putting it in <filenames.project_root>/xplor/begin/<data.file_root>.psf

 
Python-[generate_template.py]:---------------------------------------

import json
import os
import subprocess

# 0. grab the working directory (set by generate_template.csh)
root_directory = os.getcwd()
print(root_directory)

# 1.a
with open(root_directory + '/' + 'run.json') as fh:
    json_data = json.load(fh)

# 1.b
pdb_or_sequence = json_data['data']['pdb_or_sequence']
xplor_root = json_data['data']['filenames']['xplor_root']
out_dir = json_data['data']['filenames']['project_root'] + '/xplor/begin'
psf_file_name = out_dir + '/' + json_data['data']['filenames']['file_root'] + '.psf'

# 2. use seq2psf or pdb2psf to create a psf file <fileroot>.psf in the begin directory in project_root
if pdb_or_sequence == 'PDB':
    pdb_file = json_data['data']['initial_pdb']

    pdb2psf = xplor_root + '/bin/pdb2psf'

    subprocess.call([pdb2psf, pdb_file, '-outfile', psf_file_name])

elif pdb_or_sequence == 'SEQ':
    sequence_file = json_data['data']['initial_seq']

    seq2psf = xplor_root + '/bin/seq2psf'

    subprocess.call([seq2psf, sequence_file, '-outfile', psf_file_name])

else:
    raise Exception('unexpected choice for sequence source file %(pdb_or_sequence)s' % {'pdb_or_sequence' : pdb_or_sequence})

-----------------------------------------------------------:EndPython













Thursday, April 01, 2010

So nmr isn't entirely safe??

This arrived in the lab today. So maybe I can't describe NMR as absolutely safe any more...


have a moral easter

Tuesday, June 23, 2009

whats good about galileo

Well, it's amazing: the latest eclipse release train has come round the tracks and, to use an old marketing cliché, 'it's good to talk'. So what about the new version, what's good?

well I have been doing an awful lot of emf work and this has improved a lot

  1. much faster conversion of genmodels into java source code. I used to watch this menu bar a lot (tens of minutes)



    It's now much, much faster, which is great when you have really big models...

  2. Also since I tend to keep up with the M[1-7] versions this one looks like an excellent addition. I will have to try soon (as soon as I can get through to the friends of eclipse server or get a good torrent feed)




  3. Links in java doc headers, yeah! But still no search function in the external html viewer, which is hard to set up if you are offline...
  4. rectangular selections; I won't suffer from nedit withdrawal pangs any further




  5. p2, the replacement for the update manager, is much better and much more reliable. However, the UI is still a bit strange



    • why do you type to add to the site selection, rather than filter the selection?
    • why is the menu item in help called 'Install New Software...' when it leads to a dialog to 'Install and Manage Software...'?


Some pain points

when generating code with emf, debugging problems can be truly painful
  • errors appear in projects that are hidden,
  • errors get reported in dialog boxes which refer to line numbers in source text which doesn't have line numbers and has to be cut into other editors for analysis.
  • errors which don't have line numbers or filenames
  • errors in dialog boxes rather than in problem panes
  • it's really hard to rerun a source generation run from a gen model
  • no usable text editor for jet
  • limitations in jet and no sign of jet2
  • etc
also
  • you can't jump to a super class in the ecore editor
  • navigating the ecore editor doesn't have the keyboard shortcuts the navigator has
  • in mint if you go to the ecore file from the genmodel it doesn't take you to the item you right clicked on
  • the ecore sample editor doesn't notice resource changes...
now don't get me wrong, these are all minor niggles compared to the size and breadth of the features implemented this year, especially the introduction of emf databinding....

so all I have to do now for next year is to participate more and look forward to a sunny helios

regards
gary

Friday, July 04, 2008

A problem of layout?

Today there has been a certain amount of debate about the lack of visibility for certain projects on the Eclipse Ganymede download page so here is a 'quick fix' with all the other packages as a pseudo package

Monday, June 23, 2008

Looking for the Wood not the Trees

So what do I like about eclipse Ganymede? It's new, sparkly and has lots of new features (in fact there are so many neat tweaks that I can't keep up with them, and I expect I won't use some just because they are too hard to find).

Diversity

So why the excitement? Well, I don't think it's the bling; it's the fact that there is a faster, more versatile version of the platform that I use for so many things. You see, I used to have a range of strategies for editing and dealing with the heterogeneous data and programming languages that I use.


Can we say nedit, Komodo, IntelliJ IDEA, wing etc.? Most of them were open source, though I must admit I used IntelliJ IDEA (and damn good it was too [eclipse still isn't as polished but is more versatile and open]). However, now I have one tool suite that allows me to deal with most of these at the same time within the same editing environment, so I have fewer things to remember, fewer windows to find on my desktop and more support for what I am doing. And there's more...


Community & Open Source


I have found the eclipse community (and especially the emf mailing list http://www.eclipse.org/newsportal/thread.php?group=eclipse.tools.emf and especially Ed Merks) to be quite wonderful. Dealing with a newbie, who makes quite a few mistakes, and still being friendly, helpful and authoritative all at the same time is a wonderful skill.


Having an open source platform has also been a vital asset on the project I have been working on. For example, how do I replace <%packageA.packageB.Class%> with the correct declaration in a custom jet template? Go look at the source code ;-) (though it would be even nicer if I could go and look at the javadoc!) (see the getBody function). So another one of my answers is: it's the whole, not the trees.

In conclusion: where else could I have a work flow that goes python -> model -> java all in the same environment, with such ease?

Sunday, March 23, 2008

How do ccpn data models get written

Here are my brief results from digging around the ccpn data-model production software (v2.1) from source forge to see how data-model writing gets carried out (note these are my random doodlings and may not represent how it really works, i.e. ymmv)

  1. you run python makeJavaApi.py (which lives in ccpn/python/memops/scripts_v2) from the command line
  2. This then uses XmlModelIo.readModel() to read the pure ccpn metamodel which returns the root meta-package as its result
  3. a thing called the ModelPortal gets constructed from the meta-model. (This seems to provide functions that access the meta-model in particular ways; examples include leafPackagesByImport - leaf packages sorted by import (imported before importing), dataTypesByInheritance - data types sorted by inheritance (supertype before subtype), dataObjTypesAlphabetic - data types sorted alphabetically by name, etc...)
  4. JavaFileModelAdapt.processModel(modelPortal) processes the model-portal (and its owned meta-model) to adapt it to the idiosyncrasies of the requirements of the java language
  5. the class method JavaFileApiGen.writeApi gets called with the model-portal as its first parameter, and information in the other parameters about where to store the model and the version.
  6. finally, inside JavaFileApiGen.writeApi, we do some more setup, create a JavaFileApiGen object and then call processModel() on it; this then calls a general meta-model traverser in the class ModelTraverse.processModel
  7. now comes the interesting bit! This then causes a series of callbacks on the JavaFileApiGen class to be called. Using the following hack I think I can work out most of them...
    grep def ../metamodel/ModelTraverse_py_2_1.py | tr '(' ' ' | awk '{print $2}'
  8. this then visits the following list of apparently interesting methods
    • processBranchPackage
    • initLeafPackage
    • processLeafPackage
    • endLeafPackage
    • processDataType
    • processConstant
    • processException
    • initClass
    • processClass
    • endClass
    • initDataObjType
    • processDataObjType
    • endDataObjType
    • processAttribute
    • processRole
    • initOperation
    • processOperation
    • endOperation
    • processParameter
    which seem to correspond well with the parts of the datamodel... ;-)
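The traverse-and-callback pattern described in steps 6-8 can be sketched like this (a toy python illustration of the idea only; the class and method names mimic the real ones but this is not the ccpn implementation):

```python
# Toy sketch of a meta-model traverser: it walks a nested model and fires
# named callbacks on a handler object, skipping any the handler doesn't
# define (names are illustrative, not the real ccpn API).
class ToyTraverser:
    def process_model(self, packages, handler):
        for package in packages:
            self._call(handler, 'processLeafPackage', package)
            for cls in package.get('classes', []):
                self._call(handler, 'initClass', cls)
                for attr in cls.get('attributes', []):
                    self._call(handler, 'processAttribute', attr)
                self._call(handler, 'endClass', cls)

    def _call(self, handler, name, item):
        # only invoke callbacks the handler actually implements
        callback = getattr(handler, name, None)
        if callback is not None:
            callback(item)

class RecordingHandler:
    # a stand-in for JavaFileApiGen: it just records the callback sequence
    def __init__(self):
        self.events = []
    def initClass(self, cls):
        self.events.append(('initClass', cls['name']))
    def processAttribute(self, attr):
        self.events.append(('processAttribute', attr))
    def endClass(self, cls):
        self.events.append(('endClass', cls['name']))
```

Running the traverser over a one-class model produces the initClass / processAttribute / endClass sequence in model order, which is the shape of the callback list above.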

Don't invite eclipse Europa to the feast!

Eclipse is a wonderful thing and the update manager can make life easy. However, if you want to install something a little out of the way and unsupported, you can so easily get in a mess under Europa (I guess this is why P2 is such a big thing for Ganymede)

Why Europa isn't my quartermaster!
    1. If you have the wrong platform bundles installed, i.e. x86 rather than x86_64, specifically
      • org.eclipse.core.filesystem.linux.x86_64_1.0.100.v20070510.jar
      • org.eclipse.equinox.launcher.gtk.linux.x86_64_1.0.2.R331_v20071019
      • org.eclipse.platform.source.linux.gtk.x86_64_3.3.2.R33x_v20071022-_19UEksF-G8Yc6bUv3Dz
      • org.eclipse.rcp.source.linux.gtk.x86_64_3.3.2.R33x_r20071022-8y8eE9CEV3FspP8HJrY1M2dS
      • org.eclipse.swt.gtk.linux.x86_64_3.3.2.v3347.jar

        the error message you get on startup is very obtuse


      exit code=13 is about as friendly as my favourite error message 'an unknown error occurred at an unknown location'

    2. I can't see where it says I have the wrong platform modules here... isn't this something the launcher should check for?
Then things can get worse ;-)

I tried to swap the x86 bundles for the x86_64 versions I had lying around and then got this...



now if you understand what you have just done this is OK. However, it's still not the most useful message! What should a launcher's companion library be called? (The Charlotte Bartlett.dll)

So how did you get me in this mess Stanley?

Now the way I got into this mess wasn't so obvious either....

I decided I wanted to play with emf and use the Soyatec eUML2 free editor. This all seemed well and fine, so I did what they asked and added the Soyatec website to my update site list and tried to install the eUML2 component. That's when the hell started, as I tried to get the right set of dependencies for the soyatec product to work with the other plugins on the Europa Discovery Site and failed over and over on dependencies on gmf and ocl. So for example I would get

eUML2 (*.*.*.*) requires feature ***.***.gmf (x.x..xxxx) or compatible.

where x.x..xxxx was a lower version number than the one I already had installed....

More heartache

So having got into dependency hell, where did I go next? I remembered that Yoxos/Innoopract has this damn neat Yoxos on demand service and, stuff me, it does what it says on the jar:

  • it sorted my dependencies like a treat
  • it made me my own custom eclipse which I can go back to
  • it was free
  • Yoxos has its own provisioning which is available inside eclipse and again works like a dream. It resolves dependencies and just makes it all work (in actual fact this plug-in appears to be very much related to the online experience, which appears to be a RAP version of the Yoxos provisioning plug-in)
so can you see the problem (remembering the problems I discussed above)? Here is a screen shot of the Yoxos online provisioning web app that might give you a clue:



Yep, I am Linux_x86_64 whereas the button here is effectively Linux_x86. I guess this is something Innoopract/Yoxos need to solve, or at least they need to flag that they don't support x86_64 linux.

Wrapup

So what can we conclude from this trip round the houses?

  • getting the right versions of eclipse plug-ins can still be a hassle even with the update manager and the newest dependency management tools
  • If you have a system which doesn't exactly match the rest of the world (though i might say come on x86_64 isn't that rare ;-) you can easily end up in trouble
  • the best laid plans of mice and men fail, especially when you plan to 'just quickly' install a piece of software (always a bad move)
Random thoughts

  • I succeeded in the end by doing the off-line Soyatec installation, which included all the required dependencies at the right version. Here's the picture to prove it:


  • Why is the update manager in the help menu??? It always seemed strange to me; File maybe, Window maybe, but Help no way!!
  • Why doesn't the eclipse Updates window have a maximize button? Some of the package names are long enough....



    (and yes I know I can expand it it just isn't convenient)

  • why don't I blog more often (no don't ask me to answer that one)


Friday, July 14, 2006

installing scipy and numpy with analysis

scipy and numpy provide sophisticated and fast mathematical routines for python, and installing them with analysis should be a breeze. However, it wasn't quite as simple as it seemed ;-) Specifically, analysis is built with narrow unicode support and scipy/numpy require wide support....

so you have to add --enable-unicode=ucs4 to the python configure command line when compiling python for analysis using the supplied python. The command is at about line 346 in installCode.py for analysis 1.0.10

cmds = []
cmds.append('./configure --enable-unicode=ucs4 --prefix=%s/%s' % (top_dir,python_rel_dir))
cmds.append('make')
cmds.append('make install')
runCmds(cmds)

in the function def compilePython(x11_abs_dir, tcl_abs_dir, tk_abs_dir):
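You can check whether a given python build is narrow or wide by inspecting sys.maxunicode (a quick diagnostic; note that python 3.3+ builds are always effectively wide, so this distinction only bites on older interpreters like the one above):

```python
import sys

# on a narrow (ucs2) build sys.maxunicode is 0xFFFF;
# on a wide (ucs4) build it is 0x10FFFF
if sys.maxunicode > 0xFFFF:
    print('wide unicode build')
else:
    print('narrow unicode build')
```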

Monday, June 26, 2006

what to do if an update fails

ccpn analysis has a very nice update feature. However, if it fails you can get into a state where you can't start analysis to get the updates. So here are two simple cures (one is from Tim and uses a python interface; the second is just how to get to the update site with a web browser to pull down raw files).

To run the update agent without analysis running

  1. initialise your analysis environment so the ccpn root directory is set, e.g. setenv
    CCPN_HOME ... in csh syntax or export CCPN_HOME='...' in sh syntax
  2. if need be, set up your pythonpath to include the ccpn directories
  3. run python $CCPN_HOME/python/ccpnmr/update/UpdatePopup.py and proceed as usual

to use a web browser...

  1. http://www.bio.cam.ac.uk/~ccpn/ccpNmrUpdate
  2. this directory will contain files of the form python__temp_ccpnmr__temp_analysis_<name>.py, where <name> would be Util for Util.py, for example
  3. download save and install

Monday, June 12, 2006

analysis on dapper drake & Edgy Eft

Installing analysis on ubuntu dapper drake (ubuntu 6.06 LTS) is not a clean affair for various murky reasons (none of which are to do with the ccpn chaps, by the way, just life and code rot).

So here goes

Prerequisites

  1. you need gcc and g++ installed (packages gcc and g++-4.0)
  2. you need glut installed (packages freeglut3, freeglut3-dev)
  3. you need the x development headers and extensions installed (packages libxext6, libxext-dev and libx11-dev; more on this later)
  4. you must have build-essential installed if you want to compile your own python
Installation

  1. don't try to use the installation of tcl/tk that comes from ubuntu; they don't play well with the ccpn installation script (the tcl/tk lib and include are not both under the same directory)
  2. don't use the tcl/tk 8.3 files that come with ccpnmr 1.0.10 as they have configure scripts that are broken under the latest versions of bash
  3. get tcl8.4.13-src.tar.gz and tk8.4.13-src.tar.gz from http://www.tcl.tk/software/tcltk/8.4.html
  4. mkdir tk8.4 and tcl8.4 in ccpnmr1.0
  5. place the tcl/tk tar files in their respective directories
  6. gunzip them
  7. remove the -src part of the name e.g. tk8.4.13-src.tar.gz -> tk8.4.13.tar.gz
now edit the installation script installCode.py
  1. line 27 tcltk_version = '8.4' #was tcltk_version = '8.3'
  2. line 28 tcltk_release = '8.4.13' #was tcltk_release = '8.3...'
fiddle with the X extension libraries

  1. /usr/lib/libXext.so doesn't exist
  2. so become root
  3. ln -s libXext.so.6 libXext.so
setup the environment variables changing tcl /tk 8.3 to 8.4

  1. setenv LD_LIBRARY_PATH ${CCPNMR_TOP_DIR}/tcl8.4/lib:${CCPNMR_TOP_DIR}/tk8.4/lib
  2. setenv TCL_LIBRARY ${CCPNMR_TOP_DIR}/tcl8.4/lib/tcl8.4
  3. setenv TK_LIBRARY ${CCPNMR_TOP_DIR}/tk8.4/lib/tk8.4

carry out your normal installation using installCode.py

all done and here is the proof (and no I didn't fake it in photoshop ;-))

notes
  1. all paths entered have to be absolute
  2. the X11 directory is /usr

Saturday, May 20, 2006

Minimise menu bar problem

Under KDE you can minimise the Analysis window. However, it doesn't have an entry in KDE's list of windows (only spectra get this), so you can't maximise it again. Workaround: top.deiconify() in the shell...

Chains and LinkSequentialSpinSystemsPopup

LinkSequentialSpinSystemsPopup didn't report chains for residues if there is more than one chain in the MolSystem. It does now ;-)



However, I have only modified it so chains will show when there are multiple chains present...

Friday, May 12, 2006

A feature with backup

If you have auto backup on in analysis and the directory you back up to is missing, then if you try to switch auto backup off it will ask you if you want to create the directory... If you then say no, it will say the directory can't be created and so it is not setting up automatic backups... If you say yes and you are on the wrong computer, of course it will fail to create the directory and throw an error.

Conclusion: it seems to be doing input validation in the wrong places and assuming all exits require validation.

Strange ones

A couple of strange corners in the ccpn datamodel

  1. Project doesn't have a parent Object field
  2. there are isotopes and isotopecodes
    1. isotopes are objects
    2. isotope codes are fixed length strings of the form 1H 13C etc
  3. you can't get an isotopecode from an isotope and vice versa without coding...

Friday, April 28, 2006

munging readline in analysis

some programs such as IPython and features such as command line hacking require gnu readline. Here is what I did under ubuntu to get it going

  1. all this is using the version of python that you use to run analysis (you may need to set the environment variables TCL_LIBRARY etc to get it to work; see $CCPNMR_TOP_DIR/bin/analysis)
  2. this is using ccpns own downloaded python
  3. change to $CCPNMR_TOP_DIR/python2.4/Python-2.4.2/Modules/
  4. edit Setup and uncomment the line at about line 160 which reads 'readline readline.c -lreadline -ltermcap'
  5. cd to $CCPNMR_TOP_DIR/python2.4/Python-2.4.2
  6. run python setup.py install




Get ipython in analysis

  1. download ipython http://ipython.scipy.org/dist/
  2. extract it in your ccpnmr directory: tar -xzf ipython-0.7.1.fix1.tar.gz
  3. go to ipython-0.7.1
  4. type <python> setup.py install where <python> is the python you used to install analysis
  5. edit $CCPNMR_TOP_DIR/ccpnmr1.0/python/ccpnmr/analysis/AnalysisGui.py
  6. at about line 52 after import Tkinter put import IPython on a new line
  7. at about line 105 add:
    from IPython.Shell import IPShellEmbed

    ipshell = IPShellEmbed()

    ipshell()

Thursday, April 20, 2006

analysis tables widget

Two problems here
  1. the tables widget creates new rows on the fly (I guess), so the widths of the columns are constantly jumping around
  2. if you hide the rows in a table widget you get nice little + signs, but which column is which when you want to re-expand them? Wouldn't a popup be better in general....

analysis spin systems editor

analysis, edit spin system dialog: you can display strips and display cells... So why is it missing a button with a name something like 'display resonances'?

Friday, April 07, 2006

Creation

Let there be light