1. Test Modules
  1. Differential Validation
    1. Feedback Validation
    2. Learning Validation
    3. Total Accuracy
    4. Frozen and Alive Status
  2. Results

Target Description: The type Log activation layer.

Report Description: The type Basic.

Subreport: Logs for com.simiacryptus.ref.lang.ReferenceCountingBase

Test Modules

Using Seed 369509753806209024

Differential Validation

SingleDerivativeTester.java:153 executed in 0.00 seconds (0.000 gc):

        log.info(RefString.format("Inputs: %s", prettyPrint(inputPrototype)));
        log.info(RefString.format("Inputs Statistics: %s", printStats(inputPrototype)));
        assert outputPrototype != null;
        log.info(RefString.format("Output: %s", outputPrototype.prettyPrint()));
        log.info(RefString.format("Outputs Statistics: %s", outputPrototype.getScalarStatistics()));
      },
      outputPrototype.addRef(),
      RefUtil.addRef(inputPrototype)));
Logging
Inputs: [
[ [ 0.08 ], [ -0.128 ], [ -0.608 ] ],
[ [ 0.7 ], [ 0.496 ], [ 1.764 ] ]
]
Inputs Statistics: {meanExponent=-0.403119694464533, negative=2, min=-0.608, max=1.764, mean=0.38399999999999995, count=6, sum=2.304, positive=4, stdDev=0.747821725636086, zeros=0}
Output: [
[ [ -2.5257286443082556 ], [ -2.05572501506252 ], [ -0.49758039701597007 ] ],
[ [ -0.35667494393873245 ], [ -0.7011793522572096 ], [ 0.5675839575845996 ] ]
]
Outputs Statistics: {meanExponent=-0.07260886584603334, negative=5, min=-2.5257286443082556, max=0.5675839575845996, mean=-0.928217399166348, count=6, sum=-5.569304394998088, positive=1, stdDev=1.0507451785605684, zeros=0}
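
The reported outputs are consistent with an elementwise y = log(|x|) (a hedged inference from the numbers above; the layer's exact definition is not shown in this report). A minimal standalone check:

```java
// Illustrative sketch, not the MindSeye API: verifies that the reported
// outputs match log(|x|) applied elementwise to the reported inputs.
public class LogOutputCheck {
    public static void main(String[] args) {
        // input and output values taken from the report above
        double[] inputs = {0.08, -0.128, -0.608, 0.7, 0.496, 1.764};
        double[] reported = {-2.5257286443082556, -2.05572501506252, -0.49758039701597007,
            -0.35667494393873245, -0.7011793522572096, 0.5675839575845996};
        for (int i = 0; i < inputs.length; i++) {
            double y = Math.log(Math.abs(inputs[i]));
            System.out.printf("log(|%6.3f|) = %.16f (reported %.16f)%n",
                inputs[i], y, reported[i]);
        }
    }
}
```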

Feedback Validation

We validate the agreement between the implemented derivative with respect to the inputs and finite-difference estimations:

SingleDerivativeTester.java:169 executed in 0.02 seconds (0.000 gc):

        return testFeedback(
            statistics,
            component.addRef(),
            RefUtil.addRef(inputPrototype),
            outputPrototype.addRef());
      },
      outputPrototype.addRef(),
      RefUtil.addRef(inputPrototype),
      component.addRef()));
Logging
Feedback for input 0
Inputs Values: [
[ [ 0.08 ], [ -0.128 ], [ -0.608 ] ],
[ [ 0.7 ], [ 0.496 ], [ 1.764 ] ]
]
Value Statistics: {meanExponent=-0.403119694464533, negative=2, min=-0.608, max=1.764, mean=0.38399999999999995, count=6, sum=2.304, positive=4, stdDev=0.747821725636086, zeros=0}
Implemented Feedback: [ [ 12.5, 0.0, 0.0, 0.0, 0.0, 0.0 ], [ 0.0, 1.4285714285714286, 0.0, 0.0, 0.0, 0.0 ], [ 0.0, 0.0, -7.8125, 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0, 2.0161290322580645, 0.0, 0.0 ], [ 0.0, 0.0, 0.0, 0.0, -1.6447368421052633, 0.0 ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.5668934240362812 ] ]
Implemented Statistics: {meanExponent=0.403119694464533, negative=2, min=-7.8125, max=12.5, mean=0.19595436229890306, count=36, sum=7.05435704276051, positive=4, stdDev=2.500194127182347, zeros=30}
Measured Feedback: [ [ 12.499999213488877, 0.0, 0.0, 0.0, 0.0, 0.0 ], [ 0.0, 1.4285714278194916, 0.0, 0.0, 0.0, 0.0 ], [ 0.0, 0.0, -7.8125002911377806, 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0, 2.016129019288826, 0.0, 0.0 ], [ 0.0, 0.0, 0.0, 0.0, -1.6447368633709658, 0.0 ], [ 0.0, 0.0, 0.0, 0.0, 0.0, 0.5668934210945054 ] ]
Measured Statistics: {meanExponent=0.4031196926640855, negative=2, min=-7.8125002911377806, max=12.499999213488877, mean=0.1959543313106376, count=36, sum=7.054355927182954, positive=4, stdDev=2.500194045719943, zeros=30}
Feedback Error: [ [ -7.865111228966271E-7, 0.0, 0.0, 0.0, 0.0, 0.0 ], [ 0.0, -7.519369571440393E-10, 0.0, 0.0, 0.0, 0.0 ], [ 0.0, 0.0, -2.9113778055034345E-7, 0.0, 0.0, 0.0 ], [ 0.0, 0.0, 0.0, -1.29692385719693E-8, 0.0, 0.0 ], [ 0.0, 0.0, 0.0, 0.0, -2.1265702532247133E-8, 0.0 ], [ 0.0, 0.0, 0.0, 0.0, 0.0, -2.9417758229399738E-9 ] ]
Error Statistics: {meanExponent=-7.642468557569196, negative=6, min=-7.865111228966271E-7, max=0.0, mean=-3.098826548142419E-8, count=36, sum=-1.115577557331271E-6, positive=0, stdDev=1.3636356540306542E-7, zeros=30}

Returns

    {
      "absoluteTol" : {
        "count" : 36,
        "sum" : 1.115577557331271E-6,
        "min" : 0.0,
        "max" : 7.865111228966271E-7,
        "sumOfSquare" : 7.039906044113998E-13,
        "standardDeviation" : 1.3636356540306542E-7,
        "average" : 3.098826548142419E-8
      },
      "relativeTol" : {
        "count" : 6,
        "sum" : 6.263223243550298E-8,
        "min" : 2.6317793506967634E-10,
        "max" : 3.146044590562471E-8,
        "sumOfSquare" : 1.395881340863869E-15,
        "standardDeviation" : 1.1121165388234003E-8,
        "average" : 1.0438705405917162E-8
      }
    }
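
The feedback check above compares an analytic Jacobian against finite-difference estimates. A minimal sketch of the same idea for this layer, assuming y = log(|x|) with derivative 1/x (which matches the diagonal Implemented Feedback entries above); this is illustrative, not the MindSeye implementation:

```java
// Hypothetical sketch: compares the analytic derivative of y = log(|x|),
// which is 1/x, against a central finite-difference estimate. This is the
// idea the Feedback Validation step above automates.
public class FiniteDifferenceCheck {
    static double f(double x) { return Math.log(Math.abs(x)); }
    static double analyticDerivative(double x) { return 1.0 / x; }
    static double measuredDerivative(double x, double h) {
        // second-order central difference
        return (f(x + h) - f(x - h)) / (2 * h);
    }
    public static void main(String[] args) {
        // input values taken from the report above
        double[] inputs = {0.08, -0.128, -0.608, 0.7, 0.496, 1.764};
        for (double x : inputs) {
            double implemented = analyticDerivative(x);
            double measured = measuredDerivative(x, 1e-6);
            double error = measured - implemented;
            System.out.printf("x=%7.3f implemented=%12.6f measured=%12.6f error=%10.3e%n",
                x, implemented, measured, error);
            if (Math.abs(error) > 1e-4) throw new AssertionError("derivative mismatch at x=" + x);
        }
    }
}
```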

Learning Validation

We validate the agreement between the implemented derivative with respect to the internal weights and finite-difference estimations:

SingleDerivativeTester.java:185 executed in 0.00 seconds (0.000 gc):

        return testLearning(
            statistics,
            component.addRef(),
            RefUtil.addRef(inputPrototype),
            outputPrototype.addRef());
      },
      outputPrototype.addRef(),
      RefUtil.addRef(inputPrototype),
      component.addRef()));

Returns

    {
      "absoluteTol" : {
        "count" : 36,
        "sum" : 1.115577557331271E-6,
        "min" : 0.0,
        "max" : 7.865111228966271E-7,
        "sumOfSquare" : 7.039906044113998E-13,
        "standardDeviation" : 1.3636356540306542E-7,
        "average" : 3.098826548142419E-8
      },
      "relativeTol" : {
        "count" : 6,
        "sum" : 6.263223243550298E-8,
        "min" : 2.6317793506967634E-10,
        "max" : 3.146044590562471E-8,
        "sumOfSquare" : 1.395881340863869E-15,
        "standardDeviation" : 1.1121165388234003E-8,
        "average" : 1.0438705405917162E-8
      }
    }

Total Accuracy

The overall accuracy of the agreement between the implemented derivative and the finite-difference estimations:

SingleDerivativeTester.java:200 executed in 0.00 seconds (0.000 gc):

    //log.info(String.format("Component: %s\nInputs: %s\noutput=%s", component, Arrays.toStream(inputPrototype), outputPrototype));
    log.info(RefString.format("Finite-Difference Derivative Accuracy:"));
    log.info(RefString.format("absoluteTol: %s", statistics.absoluteTol));
    log.info(RefString.format("relativeTol: %s", statistics.relativeTol));
Logging
Finite-Difference Derivative Accuracy:
absoluteTol: 3.0988e-08 +- 1.3636e-07 [0.0000e+00 - 7.8651e-07] (36#)
relativeTol: 1.0439e-08 +- 1.1121e-08 [2.6318e-10 - 3.1460e-08] (6#)
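
The summary statistics above can be reproduced from the feedback error data. A hedged sketch (illustrative names, not the MindSeye API; the relative tolerance is taken here as |a - b| / (|a| + |b|), which reproduces the reported min and max relative errors, though the exact formula is not shown in this report):

```java
// Hedged sketch of aggregating the absolute/relative tolerance summaries
// from the diagonal Jacobian entries reported above.
public class ToleranceSummary {
    static double absTol(double a, double b) { return Math.abs(a - b); }
    static double relTol(double a, double b) {
        return Math.abs(a - b) / (Math.abs(a) + Math.abs(b));
    }
    public static void main(String[] args) {
        // diagonal entries of the implemented and measured Jacobians above
        double[] implemented = {12.5, 1.4285714285714286, -7.8125,
            2.0161290322580645, -1.6447368421052633, 0.5668934240362812};
        double[] measured = {12.499999213488877, 1.4285714278194916, -7.8125002911377806,
            2.016129019288826, -1.6447368633709658, 0.5668934210945054};
        int count = 36; // full 6x6 Jacobian; the 30 off-diagonal entries are exact zeros
        double sumAbs = 0, maxAbs = 0, minRel = Double.MAX_VALUE, maxRel = 0;
        for (int i = 0; i < implemented.length; i++) {
            double abs = absTol(implemented[i], measured[i]);
            double rel = relTol(implemented[i], measured[i]);
            sumAbs += abs;
            maxAbs = Math.max(maxAbs, abs);
            minRel = Math.min(minRel, rel);
            maxRel = Math.max(maxRel, rel);
        }
        System.out.printf("absoluteTol: avg=%.4e max=%.4e%n", sumAbs / count, maxAbs);
        System.out.printf("relativeTol: min=%.4e max=%.4e%n", minRel, maxRel);
    }
}
```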

Frozen and Alive Status

SingleDerivativeTester.java:208 executed in 0.00 seconds (0.000 gc):

    testFrozen(component.addRef(), RefUtil.addRef(inputPrototype));
    testUnFrozen(component.addRef(), RefUtil.addRef(inputPrototype));
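
A hypothetical illustration of what a frozen-status check verifies (this is not the actual testFrozen/testUnFrozen implementation, whose details are not shown in this report): a frozen layer should still produce the same forward output while declining to accumulate weight gradients.

```java
// Illustrative toy layer with a frozen flag: freezing must not change the
// forward pass, but must suppress weight-gradient accumulation.
public class FrozenCheck {
    static class ScaleLayer {
        double weight = 2.0;
        boolean frozen = false;
        double gradient = 0.0;
        double eval(double x) { return weight * x; }
        void backward(double x, double delta) {
            if (!frozen) gradient += x * delta; // frozen layers skip weight gradients
        }
    }
    public static void main(String[] args) {
        ScaleLayer layer = new ScaleLayer();
        double live = layer.eval(3.0);
        layer.backward(3.0, 1.0);
        double liveGrad = layer.gradient;
        layer.frozen = true;
        layer.gradient = 0.0;
        double frozen = layer.eval(3.0);
        layer.backward(3.0, 1.0);
        System.out.printf("forward unchanged: %b, live grad=%.1f, frozen grad=%.1f%n",
            live == frozen, liveGrad, layer.gradient);
    }
}
```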

LayerTests.java:605 executed in 0.00 seconds (0.000 gc):

    throwException(exceptions.addRef());

Results

class: com.simiacryptus.mindseye.test.unit.SingleDerivativeTester
details: ToleranceStatistics{absoluteTol=3.0988e-08 +- 1.3636e-07 [0.0000e+00 - 7.8651e-07] (36#), relativeTol=1.0439e-08 +- 1.1121e-08 [2.6318e-10 - 3.1460e-08] (6#)}
result: OK
  {
    "result": "OK",
    "performance": {
      "execution_time": "0.169",
      "gc_time": "0.112"
    },
    "created_on": 1587005558423,
    "file_name": "derivativeTest",
    "report": {
      "simpleName": "Basic",
      "canonicalName": "com.simiacryptus.mindseye.layers.java.LogActivationLayerTest.Basic",
      "link": "https://github.com/SimiaCryptus/mindseye-java/tree/c9a1867488dc7e77a975f095285b5882c0486db6/src/test/java/com/simiacryptus/mindseye/layers/java/LogActivationLayerTest.java",
      "javaDoc": "The type Basic."
    },
    "archive": "s3://code.simiacrypt.us/tests/com/simiacryptus/mindseye/layers/java/LogActivationLayer/Basic/derivativeTest/202004165238",
    "id": "f6cf7539-7c20-47b5-99f5-f7b400efb980",
    "report_type": "Components",
    "display_name": "Derivative Validation",
    "target": {
      "simpleName": "LogActivationLayer",
      "canonicalName": "com.simiacryptus.mindseye.layers.java.LogActivationLayer",
      "link": "https://github.com/SimiaCryptus/mindseye-java/tree/c9a1867488dc7e77a975f095285b5882c0486db6/src/main/java/com/simiacryptus/mindseye/layers/java/LogActivationLayer.java",
      "javaDoc": "The type Log activation layer."
    }
  }