Reporting
Please prepare a few slides to present (these will be uploaded to our Confluence page).

  1. Developing a customized optimizer: Sammy
  2. Developing a customized loss function: Sammy
  3. Finding algorithmic error:
        Impact from the initialization of weights
        Limitation in experimental, statistical & systematic error: Aaryan
        (How large can the error on F be while still allowing a decent extraction?)
  4. Identifying the optimum number of points of F corresponding to different angles: Pranav
  5. Identifying kinematic limitations in a decent extraction for a particular formalism and a particular model (model for the CFFs)
  6. Cross-validation model comparison: Cole
  7. Comparison between TensorFlow and PyTorch: Cole
  8. Handling different observables such as FUU, FUL, FUT, etc.
  9. Validation approach to identify the error: Surya


Work organization

GitHub page: https://github.com/extraction-tools/ANN
Confluence page: https://confluence.its.virginia.edu/display/twist/ANN+Fitting+Project


Weekly Updates

Date | Title | Presenter | Slides | Summary / Notes
     |       | Cole      |        |
     | Error on F and Consistency Statistics | Aaryan | |
     | Limiting the angles of phi | Pranav | |
     |       | Pranav    |        |
     |       | Pranav    |        |
     |       | Aaryan    |        |
     |       | Pranav    |        |
     |       | Aaryan    |        |
     |       | Cole      |        |
     | Sensitivity of (F-BH) (in BKM02) | Aaryan | |
     | Fits using method 2 with reduced phi (in BKM10) | Pranav | |
     |       | Pranav    |        |
     |       | Pranav    |        |
     |       | Pranav    |        |
     |       | Aaryan    |        |
     |       | Pranav    |        |
     |       | Amanuel   |        |
     |       | Cole      |        |






Weekly Updates

Before  


[Tuesday][A]
Pranav: Generating pseudo data & optimizing local fits (with Liliet)
Aaryan: Exploring & testing Method 2 with initialization of weights, and exploring how to bootstrap effectively

[Tuesday][B]
Surya: Optimizing local fits
Cole: Finding bounds for CFFs using a novel approach


[Friday][A]
Nathan: Finding bounds for CFFs using grid-search
Andrew: Finding bounds for CFFs using Nick's approach & exploring loss-function
[Friday][B]
Sammy: Exploring loss function, implementation of stigmergic loss
Arthur: Fast prototyping with GPUs




Date | Title | Presenter | Slides | Notes
     | Plan  | Ishara    |        | There are 9 working groups reporting updates every other week (please see attached slides).

Tuesday group

Date | Title | Presenter | Slides | Summary / Notes
     | Possible approaches: weight averaging | Aaryan & Pranav | |
     | CFF Bounds Local Fit Experimentation | Cole | |
     |       | Pranav | |
     | Different Loss Function Exploration | Aaryan | | Experimented with different loss functions, including MAE and MSE, and with working from the interference values. Also worked with method 2, which does not reinitialize the weights after each replica, and saw that the CFF predictions improve for later replicas (see the replica-loop sketch after this table).
     | Optimizing local fits | Aaryan | | Explored when to reset the weights, comparing the averages obtained when the weights are reset for each replica with those obtained when they are not.
     | Optimizing local fits | Pranav | |
     |       | Cole | |
     |       | Surya | |
     | Different Methods | Aaryan | | Tried combining Nick's method with the pure bootstrapping method: the averages are similar to the correct results, but the distribution of results remains as narrow as with pure bootstrapping.
     |       | Pranav | | Worked on implementing a K-means clustering algorithm to narrow down networks at intermediate stages using weight averaging. Two methods were tried: using the networks immediately after weight averaging, and training for an additional 100 epochs after weight averaging. With relatively small numbers of replicas and training epochs, the results were very poor for both methods. The next step is to experiment with different numbers of replicas and cluster centers to see whether better results can be obtained (see the clustering sketch after this table).
     | TensorFlow Graph Execution and tf.function | Cole | |
     | Exploring combined-type Method | Aaryan | | With/without weight reset, and random selection of weights for the initial training.
     | PyTorch vs TensorFlow for BKM02 | Cole | |
     | Consistency/Reproducibility of Experimental Method | Aaryan | |
     | K-means algorithm for intermediate weight averaging | Pranav | |
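The notes above on loss-function exploration, weight resetting, and bootstrapping refer to a replica-fitting pattern that can be illustrated by the minimal sketch below. This is not the project's actual code: the arrays `kin`, `F`, and `F_err` and the helper `build_model` are hypothetical stand-ins for the pseudo-data and network used in the fits. The sketch assumes replicas are generated by smearing F within its quoted errors, that the loss can be switched between MSE and MAE, and that the weights are either re-initialized for each replica or carried over from the previous one (the "method 2" behaviour described above).

```python
import numpy as np
import tensorflow as tf

# Hypothetical pseudo-data: kinematics (x_B, t, Q^2, phi), cross section F, and its error.
kin = np.random.rand(45, 4).astype("float32")
F = np.random.rand(45).astype("float32")
F_err = 0.05 * np.ones(45, dtype="float32")

def build_model():
    """Small fully connected network mapping the kinematics to F (placeholder architecture)."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(32, activation="tanh"),
        tf.keras.layers.Dense(32, activation="tanh"),
        tf.keras.layers.Dense(1),
    ])

def fit_replicas(n_replicas=10, loss="mse", reset_weights=True, epochs=200):
    """Fit one network per replica; optionally carry the weights over between replicas."""
    model = build_model()
    model.compile(optimizer=tf.keras.optimizers.Adam(), loss=loss)  # "mse" or "mae"
    initial_weights = model.get_weights()
    predictions = []
    for _ in range(n_replicas):
        if reset_weights:
            # Fresh start for every replica: restore initial weights and a fresh optimizer state.
            model.set_weights(initial_weights)
            model.compile(optimizer=tf.keras.optimizers.Adam(), loss=loss)
        # Bootstrap-style replica: smear F within its quoted errors.
        F_replica = np.random.normal(F, F_err).astype("float32")
        model.fit(kin, F_replica, epochs=epochs, verbose=0)
        predictions.append(model(kin, training=False).numpy().ravel())
    return np.array(predictions)  # replica-by-replica predictions, ready for averaging

preds = fit_replicas(loss="mae", reset_weights=False)
print(preds.mean(axis=0)[:5], preds.std(axis=0)[:5])
```

Comparing `reset_weights=True` against `reset_weights=False`, and `loss="mse"` against `loss="mae"`, reproduces the kind of averages-versus-spread comparison described in the notes above.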

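The K-means entries above (intermediate weight averaging) can be illustrated by a sketch along the following lines. It assumes a list of trained replica networks with identical architectures; the flattened weight vectors are clustered with scikit-learn's KMeans, and each cluster centroid is loaded back into a network as the "averaged" candidate. `build_model` is the same hypothetical constructor used in the previous sketch, and nothing here is the project's actual implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def flatten_weights(model):
    """Concatenate all layer weights of a Keras model into one flat vector."""
    return np.concatenate([w.ravel() for w in model.get_weights()])

def unflatten_weights(vector, template_model):
    """Reshape a flat vector back into the weight shapes of template_model."""
    weights, idx = [], 0
    for w in template_model.get_weights():
        weights.append(vector[idx:idx + w.size].reshape(w.shape))
        idx += w.size
    return weights

def cluster_average(models, n_clusters=3):
    """Cluster replica networks in weight space and return one averaged network per cluster."""
    X = np.stack([flatten_weights(m) for m in models])
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    averaged = []
    for center in km.cluster_centers_:
        m = build_model()                      # same architecture as the replicas (assumed helper)
        m.set_weights(unflatten_weights(center, m))
        averaged.append(m)                     # optionally train each for ~100 more epochs
    return averaged
```

Averaging raw weights is only meaningful for networks that are already close in weight space, which is presumably why the clustering is applied at intermediate stages of training rather than to fully independent fits.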
Friday group

Date | Title | Presenter | Slides | Summary / Notes
     | A model to narrow bounds for ReE | Andrew | | Information on the TensorFlow update is included.
     |       | Andrew | |
     |       | Nathan | Will be uploaded soon |
     | Mathematica Implementation | Annabel | |
     | Custom loss function update | Sammy | |




Prospective projects

  1. Developing a customized optimizer: Sammy
  2. Developing a customized loss function: Sammy
  3. Finding algorithmic error:
        Impact from the initialization of weights
  4. Identifying specific angles (low/high angles critical for good fits)
  5. Limitation in experimental, statistical & systematic error: Aaryan
     (How large can the error on F be while still allowing a decent extraction?)
  6. Identifying the optimum number of points of F corresponding to different angles: Pranav
  7. Identifying kinematic limitations in a decent extraction for a particular formalism and a particular model (model for the CFFs)
  8. Cross-validation model comparison: Cole
  9. Comparison between TensorFlow and PyTorch: Cole
  10. Handling different observables such as FUU, FUL, FUT, etc.
  11. Validation approach to identify the error: Surya



Chapter # | Topic | Assignment
2  | Hyper-parameter optimization for Local Fits |
3  | Hyper-parameter optimization for Global Fits |
4  | Limitation in experimental, statistical & systematic error | Aaryan
5  | Identifying the optimum number of points of F corresponding to different angles | Pranav
6  | Validation approach to identify the error | Surya
7  | Boot-Strapping Value |
8  | Cross-validation model comparison | Cole
9  | Developing a customized optimizer and loss-function | Sammy
10 | Comparison between TensorFlow and PyTorch | Sammy
11 | Finding algorithmic error |
12 | Model Ensemble Study |
13 | Identifying kinematic limitations in a decent extraction for a particular formalism and a particular model (model for the CFFs) |
14 | CFF extraction through a grid-search algorithm | Nathan
15 | Handling different observables such as FUU, FUL, FUT, etc. |
16 | NN fit only to the interference term |

