GaussianMixtures


If you believe there is an error in how your package is being tested or represented, please file an issue against NewPkgEval.jl, making sure to read the FAQ first.

Results with Julia v1.2.0

Testing was successful. The last evaluation took 8 minutes, 21 seconds.

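The evaluation below is the standard Pkg install-and-test workflow. To reproduce it locally, a minimal sketch (assuming Julia v1.2 and access to the General registry; the resolver may pick different dependency versions today than the ones pinned in this run):

```julia
using Pkg

# Install the package; the resolver selects compatible dependency
# versions, as listed in the log below (GaussianMixtures v0.3.0 here).
Pkg.add("GaussianMixtures")

# Run the package's own test suite, which is what NewPkgEval executes.
Pkg.test("GaussianMixtures")
```

Note that exact log output depends on the resolved manifest and the machine's build environment.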

 Resolving package versions...
 Installed Missings ─────────── v0.4.3
 Installed GaussianMixtures ─── v0.3.0
 Installed DataAPI ──────────── v1.1.0
 Installed PDMats ───────────── v0.9.10
 Installed FileIO ───────────── v1.1.0
 Installed NearestNeighbors ─── v0.4.4
 Installed StatsBase ────────── v0.32.0
 Installed BinaryProvider ───── v0.5.8
 Installed ScikitLearnBase ──── v0.5.0
 Installed Blosc ────────────── v0.5.1
 Installed URIParser ────────── v0.4.0
 Installed StatsFuns ────────── v0.9.0
 Installed HDF5 ─────────────── v0.12.5
 Installed Rmath ────────────── v0.5.1
 Installed JLD ──────────────── v0.9.1
 Installed Compat ───────────── v2.2.0
 Installed OrderedCollections ─ v1.1.0
 Installed DataStructures ───── v0.17.6
 Installed Parameters ───────── v0.12.0
 Installed QuadGK ───────────── v2.1.1
 Installed Distributions ────── v0.21.9
 Installed StaticArrays ─────── v0.12.1
 Installed CMake ────────────── v1.1.2
 Installed SortingAlgorithms ── v0.3.1
 Installed CMakeWrapper ─────── v0.2.3
 Installed Distances ────────── v0.8.2
 Installed LegacyStrings ────── v0.4.1
 Installed Clustering ───────── v0.13.3
 Installed BinDeps ──────────── v0.8.10
 Installed Arpack ───────────── v0.3.1
 Installed SpecialFunctions ─── v0.8.0
  Updating `~/.julia/environments/v1.2/Project.toml`
  [cc18c42c] + GaussianMixtures v0.3.0
  Updating `~/.julia/environments/v1.2/Manifest.toml`
  [7d9fca2a] + Arpack v0.3.1
  [9e28174c] + BinDeps v0.8.10
  [b99e7846] + BinaryProvider v0.5.8
  [a74b3585] + Blosc v0.5.1
  [631607c0] + CMake v1.1.2
  [d5fb7624] + CMakeWrapper v0.2.3
  [aaaa29a8] + Clustering v0.13.3
  [34da2185] + Compat v2.2.0
  [9a962f9c] + DataAPI v1.1.0
  [864edb3b] + DataStructures v0.17.6
  [b4f34e82] + Distances v0.8.2
  [31c24e10] + Distributions v0.21.9
  [5789e2e9] + FileIO v1.1.0
  [cc18c42c] + GaussianMixtures v0.3.0
  [f67ccb44] + HDF5 v0.12.5
  [4138dd39] + JLD v0.9.1
  [1b4a561d] + LegacyStrings v0.4.1
  [e1d29d7a] + Missings v0.4.3
  [b8a86587] + NearestNeighbors v0.4.4
  [bac558e1] + OrderedCollections v1.1.0
  [90014a1f] + PDMats v0.9.10
  [d96e819e] + Parameters v0.12.0
  [1fd47b50] + QuadGK v2.1.1
  [79098fc4] + Rmath v0.5.1
  [6e75b9c4] + ScikitLearnBase v0.5.0
  [a2af1166] + SortingAlgorithms v0.3.1
  [276daf66] + SpecialFunctions v0.8.0
  [90137ffa] + StaticArrays v0.12.1
  [2913bbd2] + StatsBase v0.32.0
  [4c63d2b9] + StatsFuns v0.9.0
  [30578b45] + URIParser v0.4.0
  [2a0f44e3] + Base64 
  [ade2ca70] + Dates 
  [8bb1440f] + DelimitedFiles 
  [8ba89e20] + Distributed 
  [b77e0a4c] + InteractiveUtils 
  [76f85450] + LibGit2 
  [8f399da3] + Libdl 
  [37e2e46d] + LinearAlgebra 
  [56ddb016] + Logging 
  [d6f4376e] + Markdown 
  [a63ad114] + Mmap 
  [44cfe95a] + Pkg 
  [de0858da] + Printf 
  [9abbd945] + Profile 
  [3fa0cd96] + REPL 
  [9a3f8284] + Random 
  [ea8e919c] + SHA 
  [9e88b42a] + Serialization 
  [1a1011a3] + SharedArrays 
  [6462fe0b] + Sockets 
  [2f01184e] + SparseArrays 
  [10745b16] + Statistics 
  [4607b0f0] + SuiteSparse 
  [8dfed614] + Test 
  [cf7118a7] + UUIDs 
  [4ec0a83e] + Unicode 
  Building CMake ───────────→ `~/.julia/packages/CMake/nSK2r/deps/build.log`
  Building Blosc ───────────→ `~/.julia/packages/Blosc/lzFr0/deps/build.log`
  Building HDF5 ────────────→ `~/.julia/packages/HDF5/Zh9on/deps/build.log`
  Building Rmath ───────────→ `~/.julia/packages/Rmath/4wt82/deps/build.log`
  Building SpecialFunctions → `~/.julia/packages/SpecialFunctions/ne2iw/deps/build.log`
  Building Arpack ──────────→ `~/.julia/packages/Arpack/cu5By/deps/build.log`
   Testing GaussianMixtures
    Status `/tmp/jl_IbwimU/Manifest.toml`
  [7d9fca2a] Arpack v0.3.1
  [9e28174c] BinDeps v0.8.10
  [b99e7846] BinaryProvider v0.5.8
  [a74b3585] Blosc v0.5.1
  [631607c0] CMake v1.1.2
  [d5fb7624] CMakeWrapper v0.2.3
  [aaaa29a8] Clustering v0.13.3
  [34da2185] Compat v2.2.0
  [9a962f9c] DataAPI v1.1.0
  [864edb3b] DataStructures v0.17.6
  [b4f34e82] Distances v0.8.2
  [31c24e10] Distributions v0.21.9
  [5789e2e9] FileIO v1.1.0
  [cc18c42c] GaussianMixtures v0.3.0
  [f67ccb44] HDF5 v0.12.5
  [4138dd39] JLD v0.9.1
  [1b4a561d] LegacyStrings v0.4.1
  [e1d29d7a] Missings v0.4.3
  [b8a86587] NearestNeighbors v0.4.4
  [bac558e1] OrderedCollections v1.1.0
  [90014a1f] PDMats v0.9.10
  [d96e819e] Parameters v0.12.0
  [1fd47b50] QuadGK v2.1.1
  [79098fc4] Rmath v0.5.1
  [6e75b9c4] ScikitLearnBase v0.5.0
  [a2af1166] SortingAlgorithms v0.3.1
  [276daf66] SpecialFunctions v0.8.0
  [90137ffa] StaticArrays v0.12.1
  [2913bbd2] StatsBase v0.32.0
  [4c63d2b9] StatsFuns v0.9.0
  [30578b45] URIParser v0.4.0
  [2a0f44e3] Base64  [`@stdlib/Base64`]
  [ade2ca70] Dates  [`@stdlib/Dates`]
  [8bb1440f] DelimitedFiles  [`@stdlib/DelimitedFiles`]
  [8ba89e20] Distributed  [`@stdlib/Distributed`]
  [b77e0a4c] InteractiveUtils  [`@stdlib/InteractiveUtils`]
  [76f85450] LibGit2  [`@stdlib/LibGit2`]
  [8f399da3] Libdl  [`@stdlib/Libdl`]
  [37e2e46d] LinearAlgebra  [`@stdlib/LinearAlgebra`]
  [56ddb016] Logging  [`@stdlib/Logging`]
  [d6f4376e] Markdown  [`@stdlib/Markdown`]
  [a63ad114] Mmap  [`@stdlib/Mmap`]
  [44cfe95a] Pkg  [`@stdlib/Pkg`]
  [de0858da] Printf  [`@stdlib/Printf`]
  [9abbd945] Profile  [`@stdlib/Profile`]
  [3fa0cd96] REPL  [`@stdlib/REPL`]
  [9a3f8284] Random  [`@stdlib/Random`]
  [ea8e919c] SHA  [`@stdlib/SHA`]
  [9e88b42a] Serialization  [`@stdlib/Serialization`]
  [1a1011a3] SharedArrays  [`@stdlib/SharedArrays`]
  [6462fe0b] Sockets  [`@stdlib/Sockets`]
  [2f01184e] SparseArrays  [`@stdlib/SparseArrays`]
  [10745b16] Statistics  [`@stdlib/Statistics`]
  [4607b0f0] SuiteSparse  [`@stdlib/SuiteSparse`]
  [8dfed614] Test  [`@stdlib/Test`]
  [cf7118a7] UUIDs  [`@stdlib/UUIDs`]
  [4ec0a83e] Unicode  [`@stdlib/Unicode`]
[ Info: Testing Data
(100000, -2.114878172691752e6, [96223.37099167121, 3776.6290083287804], [1909.2768727377727 7007.681893981676 2057.751876771842; -2297.413238961541 -7148.41306425312 -1925.5434208540764], Array{Float64,2}[[95379.39787914659 -3623.4900105113147 -84.24382146875247; -3623.4900105113156 85013.48872674159 -2699.9411671975677; -84.2438214687525 -2699.9411671975677 96252.5067471903], [4967.194975203896 3437.1217262247465 347.3823538918625; 3437.1217262247465 14918.008311395191 2530.88007238308; 347.38235389186246 2530.88007238308 4159.433108632379]])
┌ Warning: rmprocs: process 1 not removed
└ @ Distributed /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.2/Distributed/src/cluster.jl:1005
[ Info: Initializing GMM, 8 Gaussians diag covariance 2 dimensions using 272 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.053910e+03
      1       9.421860e+02      -1.117237e+02 |        6
      2       9.005112e+02      -4.167474e+01 |        0
      3       9.005112e+02       0.000000e+00 |        0
K-means converged with 3 iterations (objv = 900.5112473160834)
┌ Info: K-means with 272 data points using 3 iterations
└ 11.3 data points per parameter
[ Info: Running 0 iterations EM on full cov GMM with 8 Gaussians in 2 dimensions
┌ Info: EM with 272 data points 0 iterations avll -2.075865
└ 5.8 data points per parameter
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:221
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:221
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:221
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:221
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex_evalf at broadcast.jl:625 [inlined]
└ @ Core ./broadcast.jl:625
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:230
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:230
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex_evalf at broadcast.jl:625 [inlined]
└ @ Core ./broadcast.jl:625
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex_evalf at broadcast.jl:625 [inlined]
└ @ Core ./broadcast.jl:625
[ Info: iteration 1, lowerbound -3.794054
[ Info: iteration 2, lowerbound -3.645538
[ Info: iteration 3, lowerbound -3.484707
[ Info: iteration 4, lowerbound -3.300915
[ Info: iteration 5, lowerbound -3.118718
[ Info: iteration 6, lowerbound -2.970408
[ Info: dropping number of Gaussions to 7
[ Info: iteration 7, lowerbound -2.867358
[ Info: dropping number of Gaussions to 6
[ Info: iteration 8, lowerbound -2.810434
[ Info: dropping number of Gaussions to 5
[ Info: iteration 9, lowerbound -2.797870
[ Info: dropping number of Gaussions to 3
[ Info: iteration 10, lowerbound -2.779378
[ Info: iteration 11, lowerbound -2.763700
[ Info: iteration 12, lowerbound -2.750304
[ Info: iteration 13, lowerbound -2.729211
[ Info: iteration 14, lowerbound -2.697057
[ Info: iteration 15, lowerbound -2.650881
[ Info: iteration 16, lowerbound -2.590663
[ Info: iteration 17, lowerbound -2.522281
[ Info: iteration 18, lowerbound -2.456572
[ Info: iteration 19, lowerbound -2.402338
[ Info: iteration 20, lowerbound -2.361246
[ Info: iteration 21, lowerbound -2.331239
[ Info: iteration 22, lowerbound -2.312529
[ Info: iteration 23, lowerbound -2.307533
[ Info: dropping number of Gaussions to 2
[ Info: iteration 24, lowerbound -2.302923
[ Info: iteration 25, lowerbound -2.299260
[ Info: iteration 26, lowerbound -2.299256
[ Info: iteration 27, lowerbound -2.299254
[ Info: iteration 28, lowerbound -2.299254
[ Info: iteration 29, lowerbound -2.299253
[ Info: iteration 30, lowerbound -2.299253
[ Info: iteration 31, lowerbound -2.299253
[ Info: iteration 32, lowerbound -2.299253
[ Info: iteration 33, lowerbound -2.299253
[ Info: iteration 34, lowerbound -2.299253
[ Info: iteration 35, lowerbound -2.299253
[ Info: iteration 36, lowerbound -2.299253
[ Info: iteration 37, lowerbound -2.299253
[ Info: iteration 38, lowerbound -2.299253
[ Info: iteration 39, lowerbound -2.299253
[ Info: iteration 40, lowerbound -2.299253
[ Info: iteration 41, lowerbound -2.299253
[ Info: iteration 42, lowerbound -2.299253
[ Info: iteration 43, lowerbound -2.299253
[ Info: iteration 44, lowerbound -2.299253
[ Info: iteration 45, lowerbound -2.299253
[ Info: iteration 46, lowerbound -2.299253
[ Info: iteration 47, lowerbound -2.299253
[ Info: iteration 48, lowerbound -2.299253
[ Info: iteration 49, lowerbound -2.299253
[ Info: iteration 50, lowerbound -2.299253
[ Info: 50 variational Bayes EM-like iterations using 272 data points, final lowerbound -2.299253
History[Mon Dec  2 17:37:16 2019: Initializing GMM, 8 Gaussians diag covariance 2 dimensions using 272 data points
, Mon Dec  2 17:37:23 2019: K-means with 272 data points using 3 iterations
11.3 data points per parameter
, Mon Dec  2 17:37:24 2019: EM with 272 data points 0 iterations avll -2.075865
5.8 data points per parameter
, Mon Dec  2 17:37:26 2019: GMM converted to Variational GMM
, Mon Dec  2 17:37:32 2019: iteration 1, lowerbound -3.794054
, Mon Dec  2 17:37:32 2019: iteration 2, lowerbound -3.645538
, Mon Dec  2 17:37:32 2019: iteration 3, lowerbound -3.484707
, Mon Dec  2 17:37:32 2019: iteration 4, lowerbound -3.300915
, Mon Dec  2 17:37:32 2019: iteration 5, lowerbound -3.118718
, Mon Dec  2 17:37:32 2019: iteration 6, lowerbound -2.970408
, Mon Dec  2 17:37:33 2019: dropping number of Gaussions to 7
, Mon Dec  2 17:37:33 2019: iteration 7, lowerbound -2.867358
, Mon Dec  2 17:37:33 2019: dropping number of Gaussions to 6
, Mon Dec  2 17:37:33 2019: iteration 8, lowerbound -2.810434
, Mon Dec  2 17:37:33 2019: dropping number of Gaussions to 5
, Mon Dec  2 17:37:33 2019: iteration 9, lowerbound -2.797870
, Mon Dec  2 17:37:33 2019: dropping number of Gaussions to 3
, Mon Dec  2 17:37:33 2019: iteration 10, lowerbound -2.779378
, Mon Dec  2 17:37:33 2019: iteration 11, lowerbound -2.763700
, Mon Dec  2 17:37:33 2019: iteration 12, lowerbound -2.750304
, Mon Dec  2 17:37:33 2019: iteration 13, lowerbound -2.729211
, Mon Dec  2 17:37:33 2019: iteration 14, lowerbound -2.697057
, Mon Dec  2 17:37:33 2019: iteration 15, lowerbound -2.650881
, Mon Dec  2 17:37:33 2019: iteration 16, lowerbound -2.590663
, Mon Dec  2 17:37:33 2019: iteration 17, lowerbound -2.522281
, Mon Dec  2 17:37:33 2019: iteration 18, lowerbound -2.456572
, Mon Dec  2 17:37:33 2019: iteration 19, lowerbound -2.402338
, Mon Dec  2 17:37:33 2019: iteration 20, lowerbound -2.361246
, Mon Dec  2 17:37:33 2019: iteration 21, lowerbound -2.331239
, Mon Dec  2 17:37:33 2019: iteration 22, lowerbound -2.312529
, Mon Dec  2 17:37:33 2019: iteration 23, lowerbound -2.307533
, Mon Dec  2 17:37:33 2019: dropping number of Gaussions to 2
, Mon Dec  2 17:37:33 2019: iteration 24, lowerbound -2.302923
, Mon Dec  2 17:37:33 2019: iteration 25, lowerbound -2.299260
, Mon Dec  2 17:37:33 2019: iteration 26, lowerbound -2.299256
, Mon Dec  2 17:37:33 2019: iteration 27, lowerbound -2.299254
, Mon Dec  2 17:37:33 2019: iteration 28, lowerbound -2.299254
, Mon Dec  2 17:37:33 2019: iteration 29, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 30, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 31, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 32, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 33, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 34, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 35, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 36, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 37, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 38, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 39, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 40, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 41, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 42, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 43, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 44, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 45, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 46, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 47, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 48, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 49, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: iteration 50, lowerbound -2.299253
, Mon Dec  2 17:37:33 2019: 50 variational Bayes EM-like iterations using 272 data points, final lowerbound -2.299253
]
α = [95.95490777397131, 178.04509222602871]
β = [95.95490777397131, 178.04509222602871]
m = [2.000229257775246 53.85198717246062; 4.250300733269789 79.28686694436004]
ν = [97.95490777397131, 180.04509222602871]
W = LinearAlgebra.UpperTriangular{Float64,Array{Float64,2}}[[0.37587636119504886 -0.008953123827348369; 0.0 0.012748664777409751], [0.1840415554748283 -0.007644049042328595; 0.0 0.008581705166331017]]
Kind: diag, size256
nx: 100000 sum(zeroth order stats): 99999.99999999999
avll from stats: -0.9990036717744462
avll from llpg:  -0.9990036717744452
avll direct:     -0.9990036717744452
sum posterior: 100000.0
Kind: full, size16
nx: 100000 sum(zeroth order stats): 100000.0
avll from stats: -0.9573164636528305
avll from llpg:  -0.957316463652831
avll direct:     -0.957316463652831
sum posterior: 100000.0
32×26 Array{Float64,2}:
  0.0222097    0.176742    -0.0231408    -0.039351     0.215101     0.053445     -0.016615    -0.0392146    -0.114455     0.0340123     0.185065     -0.0329518   -0.0190561   -0.0155323    0.225663    -0.0552715    0.094904     0.113927     0.017948    -0.110629   -0.0316003    0.0826712   -0.0144637  -0.0107549    -0.00195681    0.121887  
 -0.0102852   -0.0861489   -0.0784853     0.0867545    0.0701538    0.0701644    -0.0105751    0.0553872    -0.0316545   -0.0665026    -0.0736675     0.0865245    0.0414358   -0.00430916  -0.00731767   0.0125473    0.159586    -0.00648283  -0.0290379   -0.181819   -0.120316    -0.189294    -0.106103    0.103116     -0.000550234  -0.00330515
  0.210259     0.0516378   -0.0212542    -0.02393     -0.0855034   -0.116948     -0.0638983   -0.210206      0.128015     0.121677     -0.0241252    -0.128016    -0.0408318    0.31654      0.0424094   -0.255963     0.124758    -0.0453534    0.0387677    0.175968    0.0231617   -0.0653336    0.0105226  -0.0521201    -0.046576     -0.0689671 
  0.119503     0.0223266   -0.0485108     0.158633    -0.0272659    0.177086     -0.0282571    0.023781     -0.0562304    0.168692     -0.06817       0.200126    -0.0739513    0.0317089    0.0836056    0.050799    -0.179269     0.0499351   -0.00073226  -0.069721    0.101987    -0.0147747    0.0928411   0.0377047    -0.127327      0.00414756
  0.0796082    0.219068    -0.0854412     0.103737     0.155255     0.000820188   0.198768     0.16737       0.0719577   -0.142188      0.122491     -0.0114656   -0.069444     0.147943    -0.186161    -0.0264207   -0.181996     0.119395     0.0571309   -0.0027217   0.0501695   -0.0162199    0.0689366   0.0817008    -0.0751605    -0.0796727 
 -0.015696    -0.149564    -0.052656     -0.099632     0.137773    -0.101755     -0.00651788   0.0701845     0.092817    -0.00252082   -0.127901      0.12028      0.0578009   -0.0290246    0.0752596   -0.0169988    0.201283     0.018061    -0.0223743    0.0747355  -0.0614807   -0.140416     0.0270851  -0.115974      0.246433     -0.0186691 
  0.064692     0.0963104   -0.0669936     0.123625    -0.0965399    0.0845628    -0.0329131    0.0920518     0.0353875   -0.122661     -0.0660994    -0.00384992   0.111671    -0.159431    -0.0755593   -0.242515    -0.22465     -0.0576846   -0.0400945    0.0442749  -0.0704225   -0.0803433    0.081897    0.117158     -0.0893726    -0.0320291 
 -0.0234764   -0.102635    -0.105611      0.0764303   -0.0127636    0.0503632     0.147752     0.0107797    -0.030951    -0.0681123    -0.0903873    -0.104058     0.226137     0.286718     0.00664695  -0.0277398    0.030615    -0.0119022   -0.08029     -0.113634   -0.01792      0.0543432    0.201538    0.061854     -0.0939748     0.00825803
  0.100383    -0.023946    -0.0199448     0.0943937   -0.0747605    0.0665176    -0.0613324    0.0818778    -0.0920329   -0.0203531    -0.00098559    0.0385797   -0.00152315  -0.0456925   -0.0752642    0.01309      0.182756    -0.0673023    0.133757     0.0339956   0.0028058    0.0115032   -0.0545524  -0.0714531    -0.0591645     0.00645575
  0.117859    -0.0716838   -0.0296914     0.0325418   -0.0404203    0.013899      0.0970878    0.16296       0.052219    -0.0236529    -0.0169793    -0.00893926  -0.123915    -0.0888516    0.0691074   -0.228122    -0.0288173    0.157068     0.083874    -0.0482551   0.0960371   -0.0294223   -0.0278333  -0.0389106     0.14821       0.218291  
  0.0849182    0.157393     0.102114     -0.0500767   -0.0133673   -0.070891     -0.0132182   -0.0466712    -0.00715678  -0.076129     -0.0521871     0.0204229   -0.0350921   -0.115994    -0.122598     0.0125824   -0.272848     0.00538057   0.164035     0.028036   -0.0964269    0.0579629    0.240488   -0.0541       -0.141802     -0.193835  
 -0.121173     0.00720576  -0.156229      0.0309118   -0.00901493   0.0199332     0.0391601    0.00343793   -0.0534503   -0.22524       0.0570044     0.184228     0.126112     0.042122    -0.0188395   -0.132093     0.0314922    0.00551588   0.0524165   -0.0699775  -0.0759579    0.00172804  -0.0257334   0.182738     -0.123333     -0.0201478 
  0.0535962   -0.0933627    0.172898     -0.0272323   -0.0687208   -0.139829     -0.248774    -0.0646791     0.120446     0.0774156     0.0570479     0.202911     0.0252658   -0.143536    -0.163988     0.183927    -0.108276    -0.0387614   -0.0185237   -0.0468873  -0.0511025    0.200308     0.0863146   0.13461      -0.0673134    -0.271045  
  0.0494427    0.0397772    0.121424     -0.0757257   -0.142581    -0.0740529    -0.0756197    0.0238335     0.0361107   -0.0346224     0.0422445     0.00499342   0.176407    -0.0685585   -0.0996214   -0.0539451   -0.00993551   0.111119     0.10589      0.115113    0.0169306    0.0719264   -0.118264    0.13092      -0.00820628   -0.0103846 
  0.00610146  -0.129988    -0.0273662    -0.0249311    0.0901247    0.10068       0.0406753   -0.0483176     0.152669    -0.0325414    -0.0272537    -0.196347     0.0488286    0.131154    -0.112255    -0.030456     0.191861     0.198226    -0.0548744   -0.035932    0.154452    -0.0623246   -0.116862   -0.0200627    -0.100665      0.0314509 
  0.147874    -0.197019     0.0925209    -0.0329023    0.0369839   -0.165006      0.0387756   -0.0714557    -0.0773335   -0.0320854     0.0702629    -0.0913127   -0.020082     0.143133    -0.223217    -0.0236753    0.176052     0.0165188    0.065484    -0.0414635  -0.17766      0.0485011   -0.0463764  -0.00832365    0.0114936     0.244876  
  0.0265216    0.23223     -0.10253       0.161718    -0.107146    -0.0867443    -0.0338805   -0.0601722    -0.171333    -0.000575663   0.0768967    -0.00628956   0.102055     0.0823353   -0.0641972   -0.0214682    0.0376788    0.130583     0.198582     0.0107426   0.101249    -0.129923    -0.0475545  -0.12156       0.0230799    -0.0485312 
 -0.0770597   -0.0291846    0.216291      0.102172    -0.0942685    0.00453586   -0.0585655   -0.000683499   0.052166     0.0333258    -0.226368     -0.0846456    0.103095     0.0135459    0.190892     0.123377    -0.0291639   -0.0298075   -0.0604525   -0.0827     -0.110256    -0.217909     0.0482874  -0.0892359     0.129752     -0.19189   
 -0.0529829    0.0648953    0.0189889    -0.0930767   -0.135949     0.00284567    0.0885369   -0.0504685     0.0769622   -0.0293731     0.0685646     0.121655    -0.0452124   -0.104323    -0.177429     0.0675385    0.048104    -0.0740086    0.0498632    0.158035   -0.100446     0.090954    -0.0816228   0.00501795   -0.172154      0.132932  
  0.023183     0.0357988    0.0348258    -0.188952    -0.102996    -0.0647111    -0.155396    -0.0902789     0.0455435    0.0454895    -0.0886994     0.041398     0.0439369    0.0443977    0.144041    -0.0881071    0.0856944    0.115834     0.045976    -0.0322765  -0.235426     0.0476719   -0.13185    -0.0557694     0.000609982   0.101224  
 -0.00564584  -0.053865    -0.0289672     0.0525274    0.22538      0.161977     -0.033069     0.00253017    0.00381851  -0.177716     -0.0993129     0.161963    -0.1052       0.0923348   -0.0785064   -0.0719027   -0.1835      -0.0957025    0.00649774   0.0320148   0.00285347   0.0738335   -0.113451    0.0944876    -0.163195     -0.0528892 
 -0.0599292    0.00270709  -0.138172      0.114137    -0.115297    -0.0455774     0.150355    -0.0846045    -0.0521967   -0.0911186     0.0813916     0.0697489    0.060314    -0.0356075    0.103682    -0.0540574   -0.0287193    0.0060039   -0.103871    -0.325934    0.0913295    0.0864778   -0.121449    0.0566949     0.163161      0.088983  
  0.0354993    0.179528     0.0814844     0.0422044   -0.0462181   -0.118314      0.0807364    0.161176      0.0124375    0.130271     -0.0109455    -0.113467     0.147878    -0.195453     0.0872559    0.120772     0.0203895    0.0214172    0.0582247    0.109537   -0.0647072    0.0042534    0.116298   -0.135891     -0.0648587     0.0425753 
  0.149447     0.0305382   -0.0260324     0.149099    -0.155173    -0.126095     -0.0921249   -0.127369     -0.0227696    0.0662471     0.25228      -0.104184     0.0521551   -0.0774472   -0.0591458   -0.0163746    0.114206     0.14943     -0.042548    -0.106812   -0.0605524   -0.0566245   -0.166322    0.0271689    -0.194901      0.106371  
  0.195075    -0.0375584    0.0781004    -0.00514496  -0.0997586   -0.093269     -0.00296478  -0.0229808     0.130479     0.03585       0.0984036     0.0266917    0.0604929   -0.062825     0.0575171   -0.0975946   -0.0624631   -0.0547993   -0.0133333    0.0344919  -0.095144    -0.124902    -0.0408598   0.15002      -0.0337472    -0.0185671 
 -0.155419    -0.118121     0.00184268   -0.110584     0.0108835    0.0221302     0.25283     -0.0159275    -0.0417499    0.143607      0.0036374     0.0118472   -0.023772    -0.0427541   -0.0182648    0.128088     0.0713701   -0.0840751    0.0587288   -0.102675   -0.154755    -0.0503586    0.0675112   0.106613     -0.0892131    -0.111939  
 -0.107046    -0.0715789    0.000554391   0.0490423    0.0421613    0.0559648    -0.125724    -0.00018038    0.0698416    0.0527655    -0.000158263   0.0350306   -0.0209922   -0.0570214   -0.0550226    0.0259002   -0.157599    -0.0114701    0.00298455  -0.256175    0.00404165   0.0303636   -0.148282   -0.0218999    -0.153841     -0.0963326 
 -0.143847     0.186713     0.0497352    -0.0583975   -0.0647204    0.0979032     0.0608547   -0.0450038     0.0434281   -0.024315     -0.157534      0.0907487    0.0890454    0.126329    -0.149528     0.0800626    0.113975    -0.205569     0.0280373   -0.056085    0.0403242   -0.0955746    0.0123845   0.0968324     0.0214686    -0.146532  
  0.16015     -0.0746503   -0.113972     -0.0144278   -0.170203    -0.173938     -0.0585456    0.134789     -0.0646417   -0.121454     -0.0920971     0.075831    -0.15344     -0.0118915    0.0164275    0.07181     -0.0113809   -0.0953973    0.00702353  -0.0862027   0.016661     0.210375    -0.047496   -0.114642      0.0510965    -0.0323217 
  0.04662     -0.192991    -0.000775945  -0.0420062    0.0715233    0.115579      0.0239538   -0.102647     -0.0570604   -0.16044      -0.0207042     0.0294442   -0.0376625    0.0314389    0.236716     0.00203644  -0.252459     0.0291861   -0.056588    -0.0455894   0.0342278   -0.085846    -0.0265002  -0.0620294     0.0908761    -0.0321034 
 -0.196684    -0.0224511    0.134921     -0.0962217   -0.00284376  -0.022332      0.0678901   -0.151742     -0.19453      0.161035      0.150356      0.122639     0.0682973    0.12859      0.149177     0.0609064   -0.0431283   -0.193743     0.0300906   -0.0224187   0.0239757    0.0999714    0.0290359  -0.000294335  -0.0513268    -0.0576123 
  0.148532    -0.0702002   -0.0557627    -0.0164084    0.0613394    0.0763719     0.0530511   -0.0533868    -0.210168     0.195094     -0.0481003    -0.0713035   -0.064836     0.103694    -0.00955869  -0.00828504  -0.0622801    0.0107943   -0.056398    -0.0278557   0.0969544    0.0167522    0.0174254   0.0171642    -0.0496808     0.134206
kind diag, method split
┌ Info: 0: avll = 
└   tll[1] = -1.3697428266623226
[ Info: Running 50 iterations EM on diag cov GMM with 2 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.369815
[ Info: iteration 2, average log likelihood -1.369750
[ Info: iteration 3, average log likelihood -1.369354
[ Info: iteration 4, average log likelihood -1.364902
[ Info: iteration 5, average log likelihood -1.350207
[ Info: iteration 6, average log likelihood -1.341441
[ Info: iteration 7, average log likelihood -1.339009
[ Info: iteration 8, average log likelihood -1.337788
[ Info: iteration 9, average log likelihood -1.336996
[ Info: iteration 10, average log likelihood -1.336394
[ Info: iteration 11, average log likelihood -1.335858
[ Info: iteration 12, average log likelihood -1.335328
[ Info: iteration 13, average log likelihood -1.334823
[ Info: iteration 14, average log likelihood -1.334275
[ Info: iteration 15, average log likelihood -1.333660
[ Info: iteration 16, average log likelihood -1.333103
[ Info: iteration 17, average log likelihood -1.332671
[ Info: iteration 18, average log likelihood -1.332341
[ Info: iteration 19, average log likelihood -1.332092
[ Info: iteration 20, average log likelihood -1.331899
[ Info: iteration 21, average log likelihood -1.331743
[ Info: iteration 22, average log likelihood -1.331607
[ Info: iteration 23, average log likelihood -1.331481
[ Info: iteration 24, average log likelihood -1.331360
[ Info: iteration 25, average log likelihood -1.331245
[ Info: iteration 26, average log likelihood -1.331144
[ Info: iteration 27, average log likelihood -1.331059
[ Info: iteration 28, average log likelihood -1.330984
[ Info: iteration 29, average log likelihood -1.330918
[ Info: iteration 30, average log likelihood -1.330858
[ Info: iteration 31, average log likelihood -1.330802
[ Info: iteration 32, average log likelihood -1.330748
[ Info: iteration 33, average log likelihood -1.330693
[ Info: iteration 34, average log likelihood -1.330632
[ Info: iteration 35, average log likelihood -1.330567
[ Info: iteration 36, average log likelihood -1.330501
[ Info: iteration 37, average log likelihood -1.330438
[ Info: iteration 38, average log likelihood -1.330382
[ Info: iteration 39, average log likelihood -1.330338
[ Info: iteration 40, average log likelihood -1.330306
[ Info: iteration 41, average log likelihood -1.330281
[ Info: iteration 42, average log likelihood -1.330261
[ Info: iteration 43, average log likelihood -1.330245
[ Info: iteration 44, average log likelihood -1.330233
[ Info: iteration 45, average log likelihood -1.330224
[ Info: iteration 46, average log likelihood -1.330218
[ Info: iteration 47, average log likelihood -1.330212
[ Info: iteration 48, average log likelihood -1.330208
[ Info: iteration 49, average log likelihood -1.330205
[ Info: iteration 50, average log likelihood -1.330202
┌ Info: EM with 100000 data points 50 iterations avll -1.330202
└ 952.4 data points per parameter
┌ Info: 1
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.369814750081125
│     -1.369750325810214
│      ⋮                
└     -1.330202421360847
[ Info: Running 50 iterations EM on diag cov GMM with 4 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.330310
[ Info: iteration 2, average log likelihood -1.330185
[ Info: iteration 3, average log likelihood -1.329471
[ Info: iteration 4, average log likelihood -1.322619
[ Info: iteration 5, average log likelihood -1.305274
[ Info: iteration 6, average log likelihood -1.296305
[ Info: iteration 7, average log likelihood -1.293928
[ Info: iteration 8, average log likelihood -1.292965
[ Info: iteration 9, average log likelihood -1.292410
[ Info: iteration 10, average log likelihood -1.292031
[ Info: iteration 11, average log likelihood -1.291751
[ Info: iteration 12, average log likelihood -1.291540
[ Info: iteration 13, average log likelihood -1.291378
[ Info: iteration 14, average log likelihood -1.291248
[ Info: iteration 15, average log likelihood -1.291133
[ Info: iteration 16, average log likelihood -1.291023
[ Info: iteration 17, average log likelihood -1.290914
[ Info: iteration 18, average log likelihood -1.290803
[ Info: iteration 19, average log likelihood -1.290696
[ Info: iteration 20, average log likelihood -1.290596
[ Info: iteration 21, average log likelihood -1.290507
[ Info: iteration 22, average log likelihood -1.290429
[ Info: iteration 23, average log likelihood -1.290362
[ Info: iteration 24, average log likelihood -1.290304
[ Info: iteration 25, average log likelihood -1.290254
[ Info: iteration 26, average log likelihood -1.290209
[ Info: iteration 27, average log likelihood -1.290170
[ Info: iteration 28, average log likelihood -1.290134
[ Info: iteration 29, average log likelihood -1.290101
[ Info: iteration 30, average log likelihood -1.290072
[ Info: iteration 31, average log likelihood -1.290045
[ Info: iteration 32, average log likelihood -1.290019
[ Info: iteration 33, average log likelihood -1.289995
[ Info: iteration 34, average log likelihood -1.289971
[ Info: iteration 35, average log likelihood -1.289949
[ Info: iteration 36, average log likelihood -1.289929
[ Info: iteration 37, average log likelihood -1.289912
[ Info: iteration 38, average log likelihood -1.289898
[ Info: iteration 39, average log likelihood -1.289886
[ Info: iteration 40, average log likelihood -1.289875
[ Info: iteration 41, average log likelihood -1.289864
[ Info: iteration 42, average log likelihood -1.289854
[ Info: iteration 43, average log likelihood -1.289844
[ Info: iteration 44, average log likelihood -1.289833
[ Info: iteration 45, average log likelihood -1.289821
[ Info: iteration 46, average log likelihood -1.289809
[ Info: iteration 47, average log likelihood -1.289795
[ Info: iteration 48, average log likelihood -1.289781
[ Info: iteration 49, average log likelihood -1.289766
[ Info: iteration 50, average log likelihood -1.289750
┌ Info: EM with 100000 data points 50 iterations avll -1.289750
└ 473.9 data points per parameter
┌ Info: 2
│   avll =
│    50-element Array{Float64,1}:
│     -1.3303099452325347
│     -1.3301849331043167
│      ⋮                 
└     -1.289750255357867 
[ Info: Running 50 iterations EM on diag cov GMM with 8 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.289890
[ Info: iteration 2, average log likelihood -1.289709
[ Info: iteration 3, average log likelihood -1.289015
[ Info: iteration 4, average log likelihood -1.282907
[ Info: iteration 5, average log likelihood -1.264828
[ Info: iteration 6, average log likelihood -1.250941
[ Info: iteration 7, average log likelihood -1.245756
[ Info: iteration 8, average log likelihood -1.243160
[ Info: iteration 9, average log likelihood -1.241423
[ Info: iteration 10, average log likelihood -1.240009
[ Info: iteration 11, average log likelihood -1.238827
[ Info: iteration 12, average log likelihood -1.237959
[ Info: iteration 13, average log likelihood -1.237342
[ Info: iteration 14, average log likelihood -1.236928
[ Info: iteration 15, average log likelihood -1.236670
[ Info: iteration 16, average log likelihood -1.236507
[ Info: iteration 17, average log likelihood -1.236395
[ Info: iteration 18, average log likelihood -1.236305
[ Info: iteration 19, average log likelihood -1.236226
[ Info: iteration 20, average log likelihood -1.236150
[ Info: iteration 21, average log likelihood -1.236079
[ Info: iteration 22, average log likelihood -1.236014
[ Info: iteration 23, average log likelihood -1.235957
[ Info: iteration 24, average log likelihood -1.235908
[ Info: iteration 25, average log likelihood -1.235867
[ Info: iteration 26, average log likelihood -1.235833
[ Info: iteration 27, average log likelihood -1.235802
[ Info: iteration 28, average log likelihood -1.235772
[ Info: iteration 29, average log likelihood -1.235740
[ Info: iteration 30, average log likelihood -1.235702
[ Info: iteration 31, average log likelihood -1.235653
[ Info: iteration 32, average log likelihood -1.235582
[ Info: iteration 33, average log likelihood -1.235463
[ Info: iteration 34, average log likelihood -1.235252
[ Info: iteration 35, average log likelihood -1.234870
[ Info: iteration 36, average log likelihood -1.234287
[ Info: iteration 37, average log likelihood -1.233679
[ Info: iteration 38, average log likelihood -1.233257
[ Info: iteration 39, average log likelihood -1.233006
[ Info: iteration 40, average log likelihood -1.232825
[ Info: iteration 41, average log likelihood -1.232662
[ Info: iteration 42, average log likelihood -1.232508
[ Info: iteration 43, average log likelihood -1.232371
[ Info: iteration 44, average log likelihood -1.232261
[ Info: iteration 45, average log likelihood -1.232181
[ Info: iteration 46, average log likelihood -1.232128
[ Info: iteration 47, average log likelihood -1.232094
[ Info: iteration 48, average log likelihood -1.232073
[ Info: iteration 49, average log likelihood -1.232058
[ Info: iteration 50, average log likelihood -1.232048
┌ Info: EM with 100000 data points 50 iterations avll -1.232048
└ 236.4 data points per parameter
┌ Info: 3
│   avll =
│    50-element Array{Float64,1}:
│     -1.2898898428295709
│     -1.2897094448309159
│      ⋮                 
└     -1.2320480111190375
[ Info: Running 50 iterations EM on diag cov GMM with 16 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.232252
[ Info: iteration 2, average log likelihood -1.231950
[ Info: iteration 3, average log likelihood -1.229306
[ Info: iteration 4, average log likelihood -1.208860
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     2
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.176978
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.164027
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.153305
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      2
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.140138
[ Info: iteration 9, average log likelihood -1.153671
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.137291
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.141699
[ Info: iteration 12, average log likelihood -1.150169
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.131899
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      2
│      7
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.136921
[ Info: iteration 15, average log likelihood -1.153039
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.133266
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      2
│      7
│     15
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.135456
[ Info: iteration 18, average log likelihood -1.161189
[ Info: iteration 19, average log likelihood -1.145534
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.132248
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.143847
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      5
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.130387
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.144969
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.153338
[ Info: iteration 25, average log likelihood -1.140271
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.126955
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.137527
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      5
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.136294
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -1.142131
[ Info: iteration 30, average log likelihood -1.150022
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.132346
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      2
│      7
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.123137
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.149725
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.143996
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.134665
[ Info: iteration 36, average log likelihood -1.143769
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      2
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.123796
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      5
│      7
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.137870
[ Info: iteration 39, average log likelihood -1.149824
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     2
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.131510
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.133202
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.120947
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.136631
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.147382
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     2
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.135135
[ Info: iteration 46, average log likelihood -1.141712
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      2
│     15
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.122777
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      5
│      7
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.137483
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     2
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.149441
[ Info: iteration 50, average log likelihood -1.141035
┌ Info: EM with 100000 data points 50 iterations avll -1.141035
└ 118.1 data points per parameter
┌ Info: 4
│   avll =
│    50-element Array{Float64,1}:
│     -1.2322516833785015
│     -1.2319496232027665
│      ⋮                 
└     -1.1410352317967183
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      3
│      4
│     13
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 1, average log likelihood -1.126905
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      3
│      4
│     14
│     29
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.118582
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│     13
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -1.119666
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      3
│      4
│     14
│     29
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.106328
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.081155
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.058483
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.063673
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.054660
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.046958
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.049344
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.054170
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.041737
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.053748
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.049597
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.045997
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.049084
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.054081
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.041543
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 19, average log likelihood -1.061097
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.044778
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│     13
│     14
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.050854
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.051757
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.046527
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.046329
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -1.056032
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.049494
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.045829
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.056546
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -1.049046
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.046330
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.056026
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.049497
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.045801
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.056550
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.049013
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.046342
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.055995
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.049499
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.045782
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.056552
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.048985
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.046354
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.055970
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.049502
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.045767
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -1.056555
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.048963
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.046365
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.055949
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      9
│     10
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -1.049506
┌ Info: EM with 100000 data points 50 iterations avll -1.049506
└ 59.0 data points per parameter
┌ Info: 5
│   avll =
│    50-element Array{Float64,1}:
│     -1.12690473570304  
│     -1.118582345256897 
│      ⋮                 
└     -1.0495062978252185
┌ Info: Total log likelihood: 
│   tll =
│    251-element Array{Float64,1}:
│     -1.3697428266623226
│     -1.369814750081125 
│     -1.369750325810214 
│     -1.369353921643935 
│      ⋮                 
│     -1.046364881503956 
│     -1.0559490061798955
└     -1.0495062978252185
32×26 Array{Float64,2}:
 -0.498433     -0.106962   -0.020088    -0.0853475   -0.0282584     0.0148004    0.0861752    -0.0289046    0.0876035     0.146578      0.0549129     0.00624975  -0.0458066  -0.0487825   -0.0175068     0.145935    -0.0326283  -0.0968675   -0.00661744  -0.157223    -0.240695    -0.0628797    0.125064     0.0436711    0.0183301   -0.108392  
  0.0477567    -0.128512    0.029817    -0.101668     0.06241       0.0316144    0.32148      -0.0159446   -0.115427      0.140611     -0.16515       0.00552949  -0.0237107  -0.0352466   -0.0159561     0.137505     0.122575   -0.0773841    0.0906941   -0.0794294   -0.127928    -0.0439647    0.00735016   0.139433    -0.054531    -0.110588  
  0.532768      0.197139   -0.0818359   -0.0341078   -0.0237438     0.154786    -0.0292416     0.0338574   -0.0517878     0.175216      0.199311      0.203484    -0.069737    0.0313865    0.0802043    -0.480174    -0.175345    0.130633    -0.0524346   -0.0480865    0.0911871   -0.00933298   0.136808     0.0475299   -0.133378    -0.0500846 
 -0.148581     -0.170607   -0.00419926   0.340955    -0.0258551     0.275789    -0.0501746     0.0312709   -0.0532751     0.156434     -0.323033      0.196902    -0.074024    0.0312529    0.0781195     0.425629    -0.185571   -0.0598744    0.0612015   -0.0389902    0.104208    -0.0309352    0.0273105    0.0368786   -0.132103     0.0387881 
  0.000948158  -0.0903243  -0.0811517    0.080519     0.0614633     0.077366    -0.0692556     0.0680464   -0.0325166    -0.0502774    -0.0837656     0.0618112    0.0240432  -0.00296733  -0.0558473     0.0130607    0.162407    0.0543063   -0.00187177  -0.187472    -0.124235    -0.19834     -0.107082     0.108042     0.0210116    0.0109771 
  0.0670262     0.207193   -0.098736     0.118869     0.152011     -0.00824294   0.171611      0.158265     0.068471     -0.154488      0.189365     -0.00639065  -0.0671056   0.138322    -0.188146     -0.0278321   -0.175352    0.0537645    0.0876014   -0.00405658   0.0487384   -0.0144864    0.0620421    0.0905351   -0.0644152   -0.0700768 
  0.173158     -0.0537976  -0.0564271   -0.0121153    0.0617349     0.0704696    0.0624343    -0.0435559   -0.18658       0.158224     -0.0366554    -0.0549241   -0.0656324   0.0980732   -0.000471775  -0.0448351   -0.0827798   0.0211412   -0.0463694   -0.0306333    0.110146     0.0160448   -0.0239691    0.034695    -0.0478855    0.144365  
  0.145236      0.0332337  -0.0222386    0.15666     -0.15104      -0.127417    -0.105758     -0.106954    -0.0465614     0.0770549     0.233483     -0.087083     0.0546445  -0.0823074   -0.0668643    -0.0515072    0.113193    0.120383    -0.0313277   -0.103554    -0.0530432   -0.0571167   -0.116088     0.00576539  -0.18903      0.0953557 
 -0.0949352    -0.0661992   0.10656     -0.0335159    0.000609046   0.0619512   -0.111176      0.00418843   0.0661659     0.0129577    -0.000757894   0.036529    -0.0845138  -0.0568593   -0.0562334     0.0644602   -0.24727    -0.0238123    0.0594475   -0.177596     0.0413536    0.0765036   -0.205477    -0.072611    -0.575648    -0.0566812 
 -0.0983079    -0.0648251  -0.235307     0.175865     0.100351      0.0561333   -0.109551     -0.0258122    0.0640163     0.13176      -0.00119409    0.0372364    0.153396   -0.0543545   -0.0557314    -0.16565      0.109075    0.0156565   -0.15233     -0.298357    -0.0179237   -0.0479027   -0.044356     0.0274613    0.771407    -0.159701  
 -0.0315498     0.043384   -0.108756     0.0752043   -0.055697      0.0662485    0.0246958     0.0508731   -0.00340814   -0.173227     -0.000944437   0.0990435    0.103761   -0.0759136   -0.0396627    -0.176949    -0.077716   -0.0270668    0.00148142  -0.0111669   -0.0705769   -0.0305485    0.00876697   0.145985    -0.0959209   -0.0178515 
  0.153245     -0.0174261  -0.022861     0.0015973   -0.056788     -0.0543765    0.000275622  -0.0406675    0.0890531     0.0555368    -0.0259428    -0.0680272   -0.0850631   0.138942     0.0698539    -0.242592     0.0461382   0.054937     0.0639779    0.0539748    0.0763644   -0.0507347   -0.0364718   -0.0455474    0.0526956    0.065448  
  0.19466       0.0360111  -0.124149    -0.0245237   -1.3232       -0.220148     0.113636     -0.0310059    0.0827018     0.184894      0.481834      0.0183907    0.0418902  -0.0814416    0.0592241    -0.0979888   -0.0376921  -0.0552137   -0.0941409    0.0669984   -0.0993337   -0.460192     0.0671371    0.119488     0.00707324  -0.00876681
  0.195286     -0.0456314   0.0877009   -0.00905908   0.164332     -0.047694    -0.0260865    -0.0253357    0.147224      0.0135879     0.0405687     0.0230355    0.0683234  -0.0403685    0.0568462    -0.09834     -0.0712604  -0.0545491    0.00308539   0.00978843  -0.0973712   -0.0603058   -0.0811042    0.144169    -0.0297749   -0.0143966 
  0.0588895    -0.208465   -0.0204018   -0.170036     0.0611871     0.177805     0.0253691    -0.0795076   -0.147281     -0.137625     -0.031442      0.0201342    0.0850402  -0.275587     0.285333     -0.0357014   -0.248907    0.0317045   -0.170251    -0.0756407    0.0673494   -0.095952    -0.0267112   -0.194508     0.0987867   -0.0487129 
  0.0299116    -0.165834    0.0209252    0.0745759    0.0839631     0.0015062    0.0230332    -0.144673     0.000195797  -0.16836      -0.00393732    0.0366149   -0.139986    0.283848     0.233643      0.00868025  -0.254305    0.0262637    0.051285    -0.0162355   -0.00121234  -0.00426186  -0.0269439    0.0123699    0.0822019   -0.00888441
 -0.154621      0.202767    0.0389096   -0.0551703   -0.0771912     0.102838     0.0564518    -0.024456     0.0433035    -0.0211066    -0.157291      0.0840982    0.0885647   0.143502    -0.158185      0.0514275    0.10952    -0.206505     0.048216    -0.0546248    0.0451085   -0.0957862   -0.00826171   0.0978853    0.00541964  -0.14213   
 -0.00219841   -0.0858685  -0.0141942   -0.0140367    0.0860132     0.0622579    0.00906805   -0.0110417    0.137127     -0.0289792    -0.0512593    -0.118738     0.0479182   0.111143    -0.0808762    -0.0413733    0.158139    0.190038    -0.0154096   -0.0792104    0.1332      -0.0631326   -0.108891    -0.0232207   -0.0930141    0.0015948 
  0.0617355    -0.0986145   0.163021    -0.0281089   -0.0546665    -0.155021    -0.247921     -0.0737957    0.133157      0.0464098     0.0891379     0.195454     0.022205   -0.197321    -0.149596      0.174687    -0.10751    -0.0626369   -0.0358506   -0.0521508   -0.0633999    0.182049     0.0779236    0.134206    -0.0445491   -0.275548  
  0.0256558     0.0149001  -0.0647861    0.0779016   -0.019692      0.029814     0.0405919    -0.0237034   -0.0865748    -0.0260106     0.0651607     0.0310239    0.018526   -0.0392336    0.0912591    -0.0310458    0.0846877   0.0182268    0.00605218  -0.139833     0.0213877    0.0331951   -0.0741742   -0.00765784   0.0393194    0.0711089 
  0.0542616     0.0661768   0.0422298   -0.00837417   0.100911      0.0389659   -0.0217718    -0.0329218   -0.0152984    -0.102507     -0.0457852     0.0592042   -0.0637451   0.0173461   -0.0633779    -0.0424182   -0.192995   -0.0350416    0.0908849    0.00828922  -0.0322679    0.0635643    0.0331118    0.0272119   -0.140109    -0.0810395 
 -0.0441745     0.0375852   0.10548     -0.0763753   -0.108981     -0.0169374   -0.0121474    -0.0374946    0.0558705     0.00408494   -0.0643704     0.0324743    0.0184542  -0.025478     0.0211384     0.0522484    0.0253932   0.0159501    0.0227389    0.0230506   -0.14929     -0.0256975   -0.0251351   -0.0459275   -0.0455351   -0.00265833
 -0.0652272     0.0314852   0.126216    -0.0969671   -0.0564826    -0.0486594   -0.0194977    -0.0756265   -0.0600042     0.0440837     0.0631381     0.0688532    0.117895    0.0281839    0.0357002    -0.00769035  -0.0342713  -0.037238     0.0768594    0.0701394   -0.002014     0.047068    -0.0346455    0.0403883   -0.0237651   -0.0169457 
  0.0433634     0.247976   -0.122113     0.160662    -0.108176     -0.0898536   -0.0315076    -0.058033    -0.169797     -0.00280097    0.0532811     0.00707717   0.1021      0.0784596   -0.0626989    -0.00973479   0.0252899   0.134163     0.197663     0.0118348    0.101676    -0.128705    -0.0455529   -0.120372     0.0237033   -0.0250785 
  0.199454     -0.226113    0.104341    -0.0333492    0.0443764    -0.161578     0.0256738    -0.0970429   -0.0876844    -0.0336515     0.0693317    -0.0820073   -0.0282839   0.126322    -0.207523     -0.0177079    0.176876    0.0184545    0.0555497   -0.0302999   -0.138053     0.0442708   -0.029398    -0.0419358    0.00609621   0.251509  
  0.0531068    -0.109812   -0.0926771    0.0822793   -0.00617784    0.0609741    0.153279      0.00322829  -0.0301634    -0.0656772    -0.0697383    -0.0293801    0.226255    0.284386     0.0165664    -0.0239051    0.0399742  -0.0168811   -0.0820791   -0.11148     -0.0180778    0.0111826    0.220133     0.074831    -0.0848964    0.00932999
 -0.0422155    -0.0812833  -0.0435563   -0.0996937    0.134319     -0.0940636   -0.00318772    0.0204924   -0.114536     -0.000400085  -0.218106     -0.0792465    0.0391336   0.00733006   0.0641796    -0.107783     0.196132    0.0262729   -0.0231914    0.0689998   -0.134492    -0.186851    -0.0307967   -0.0769454    0.128628     0.0243831 
  0.0117585    -0.188355   -0.0601155   -0.105585     0.155692     -0.104331     0.0228082     0.125474     0.32577      -0.00217497   -0.0381744     0.240267     0.115263   -0.0662555    0.046954      0.00333695   0.204472    0.00977192  -0.0194599    0.0776496    0.0907478   -0.110808     0.0429737   -0.17686      0.393832    -0.0379994 
 -0.347874      0.362404    0.100626     0.0422608    0.0849221    -0.117267    -0.0446892     0.161095     0.0541932     0.148329     -0.0296801    -0.185107     0.0988324  -0.200475     0.0786247     0.123003     0.0219946   0.0216397   -0.0315188    0.0593645    0.0486775   -0.643764     0.119981    -0.142253    -0.0636188    0.0694036 
  0.468941      0.0337246   0.0655176    0.0424254   -0.143137     -0.11614      0.225464      0.161169    -0.0337716     0.0519002     0.00443581   -0.0696101    0.170608   -0.196263     0.0960638     0.109626     0.0275693   0.0212471    0.15492      0.159795    -0.168558     0.596002     0.0800106   -0.127434    -0.0662848    0.00876151
  0.0739481    -0.0917565  -0.111474    -0.335884    -0.253097     -0.173704    -0.0709069     0.0680379   -0.142768     -0.209516     -0.19682       0.050026    -0.128142   -0.0517894    0.0201477     0.348768     0.0346083  -0.102624    -0.0537643   -0.0616583    0.285112     0.029877    -0.00147943  -0.117548     0.115985    -0.024293  
  0.196737     -0.0328039  -0.112967     0.459781    -0.0965858    -0.173032    -0.0405498    0.206569     0.0627302    -0.0512402     0.0249005     0.0803581   -0.180786    0.0123994    0.0164364    -0.205116    -0.0476837  -0.0738375    0.0963563   -0.124495    -0.376261     0.334062    -0.0198764   -0.0991814   -0.0103331   -0.0452335
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
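The warning above lists the Gaussian components whose variances collapsed during the M-step and were clipped to a floor. A generic sketch of what such a flooring step does (the floor value, the field name `Σ`, and the per-row clipping are assumptions for illustration, not GaussianMixtures' actual `train.jl` code):

```julia
# Hypothetical variance flooring for a diagonal-covariance GMM.
# Σ is assumed to be an nGaussians × nDims matrix of per-dimension variances.
varfloor = 1e-3                                   # assumed floor value
Σ = abs.(randn(32, 26)) .* 0.01                   # stand-in for updated variances

# Components with any variance below the floor — these indices would be
# the ones reported in the "Variances had to be floored" warning.
ind = findall(vec(any(Σ .< varfloor, dims=2)))

# Clip the offending variances up to the floor so the EM update stays valid.
Σ .= max.(Σ, varfloor)
```

Flooring prevents a component from shrinking onto a few points and driving the log likelihood to infinity, at the cost of the non-monotone likelihood visible in the iteration lines below.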
[ Info: iteration 1, average log likelihood -1.045753
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.041270
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -1.045742
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.041256
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.045739
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.041257
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.045736
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.041259
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      3
│      4
│      6
│     13
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.045733
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      3
│      4
│      6
│      9
│      ⋮
│     30
│     31
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.041261
┌ Info: EM with 100000 data points 10 iterations avll -1.041261
└ 59.0 data points per parameter
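The run summarized above is EM training of a 32-component, diagonal-covariance GMM on 100000 points in 26 dimensions. A minimal sketch of how such a run might be reproduced with GaussianMixtures.jl (the data matrix `x` is synthetic here, and the keyword choices are assumptions matched to the log lines, not taken from the test script):

```julia
using GaussianMixtures

# Hypothetical data: 100000 points in 26 dimensions, as in the log above.
x = randn(100_000, 26)

# Train a 32-Gaussian diagonal-covariance GMM with 10 EM iterations;
# kind and nIter are chosen to match the "diag cov GMM ... 10 iterations" lines.
gmm = GMM(32, x; kind=:diag, nIter=10)

# Average log likelihood per data point, the quantity the
# "iteration N, average log likelihood ..." lines report.
a = avll(gmm, x)
```

The "data points per parameter" line divides the 100000 training points by the number of free parameters (32 weights plus 32×26 means and 32×26 diagonal variances), a rough gauge of how well-determined the fit is.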
[ Info: Initializing GMM, 32 Gaussians diag covariance 26 dimensions using 100000 data points
kind diag, method kmeans
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       7.821785e+05
      1       6.368946e+05      -1.452839e+05 |       32
      2       6.086117e+05      -2.828289e+04 |       32
      3       5.926905e+05      -1.592118e+04 |       32
      4       5.820642e+05      -1.062635e+04 |       32
      5       5.757068e+05      -6.357370e+03 |       32
      6       5.719461e+05      -3.760750e+03 |       32
      7       5.698008e+05      -2.145326e+03 |       32
      8       5.683587e+05      -1.442083e+03 |       32
      9       5.671791e+05      -1.179532e+03 |       32
     10       5.660698e+05      -1.109302e+03 |       32
     11       5.649560e+05      -1.113865e+03 |       32
     12       5.639417e+05      -1.014256e+03 |       32
     13       5.630938e+05      -8.479370e+02 |       32
     14       5.624368e+05      -6.569258e+02 |       32
     15       5.619870e+05      -4.498779e+02 |       32
     16       5.616506e+05      -3.363292e+02 |       32
     17       5.613596e+05      -2.910175e+02 |       32
     18       5.611671e+05      -1.925403e+02 |       32
     19       5.610575e+05      -1.095652e+02 |       32
     20       5.609909e+05      -6.665375e+01 |       31
     21       5.609546e+05      -3.630351e+01 |       32
     22       5.609290e+05      -2.555296e+01 |       32
     23       5.609009e+05      -2.810115e+01 |       32
     24       5.608741e+05      -2.681049e+01 |       31
     25       5.608524e+05      -2.167026e+01 |       28
     26       5.608335e+05      -1.890256e+01 |       32
     27       5.608152e+05      -1.835893e+01 |       30
     28       5.607956e+05      -1.958717e+01 |       32
     29       5.607736e+05      -2.201754e+01 |       32
     30       5.607473e+05      -2.630413e+01 |       30
     31       5.607203e+05      -2.694454e+01 |       32
     32       5.606864e+05      -3.389546e+01 |       32
     33       5.606542e+05      -3.217294e+01 |       32
     34       5.606109e+05      -4.330537e+01 |       31
     35       5.605721e+05      -3.888017e+01 |       31
     36       5.605382e+05      -3.381426e+01 |       31
     37       5.604930e+05      -4.522240e+01 |       32
     38       5.604563e+05      -3.671652e+01 |       32
     39       5.604198e+05      -3.654633e+01 |       32
     40       5.603921e+05      -2.767309e+01 |       27
     41       5.603728e+05      -1.925337e+01 |       30
     42       5.603538e+05      -1.900520e+01 |       32
     43       5.603421e+05      -1.174536e+01 |       25
     44       5.603310e+05      -1.107228e+01 |       21
     45       5.603232e+05      -7.767585e+00 |       28
     46       5.603126e+05      -1.065350e+01 |       26
     47       5.602994e+05      -1.316181e+01 |       28
     48       5.602883e+05      -1.114307e+01 |       29
     49       5.602717e+05      -1.653785e+01 |       25
     50       5.602518e+05      -1.995165e+01 |       28
K-means terminated without convergence after 50 iterations (objv = 560251.7944513058)
┌ Info: K-means with 32000 data points using 50 iterations
└ 37.0 data points per parameter
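The iteration table above is the k-means trace used to initialize the GMM means; GaussianMixtures.jl delegates this to Clustering.jl and, per the summary line, runs it on a 32000-point subsample. A sketch of the underlying call (the subsampling and variable names are assumptions; Clustering.jl's `kmeans` expects data as a d×n matrix and `display=:iter` produces exactly this kind of objective table):

```julia
using Clustering

x = randn(100_000, 26)                       # stand-in for the training data
sub = x[rand(1:size(x, 1), 32_000), :]       # assumed 32000-point subsample

# k-means with 32 clusters, capped at 50 iterations as in the trace above.
res = kmeans(permutedims(sub), 32; maxiter=50, display=:iter)

μ0 = permutedims(res.centers)                # 32×26 matrix of initial means
```

"Terminated without convergence after 50 iterations" simply means the `maxiter` cap was hit before the objective change fell below tolerance; for seeding EM that is usually acceptable, since EM refines the means anyway.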
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.285711
[ Info: iteration 2, average log likelihood -1.257906
[ Info: iteration 3, average log likelihood -1.231533
[ Info: iteration 4, average log likelihood -1.198311
[ Info: iteration 5, average log likelihood -1.158973
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      4
│     15
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.106426
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     10
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.098903
[ Info: iteration 8, average log likelihood -1.061625
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      7
│     15
│     16
│     17
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.013097
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      4
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.050162
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     10
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.043920
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     17
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.034042
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     13
│     15
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.022091
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      4
│     10
│     16
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.028552
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.064349
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.047197
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     13
│     15
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.017536
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│     10
│     26
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.035892
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 19, average log likelihood -1.053342
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.037564
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│     13
│     15
│     16
│     17
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -0.989351
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│      7
│     10
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.035440
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.068727
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.023292
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      7
│     13
│     15
│     16
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -0.985099
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│     10
│     17
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.038632
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.058329
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.019189
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│     10
│     13
│     15
│     16
│     25
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -0.981092
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      4
│     17
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.055054
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.047106
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.015345
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      4
│     10
│     13
│     15
│     16
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -0.973182
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      7
│     17
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.043605
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     22
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.044221
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.012435
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      4
│      7
│     10
│     13
│     15
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -0.969629
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     17
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.057752
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.027331
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      7
│     10
│     13
│     15
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -0.986797
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     25
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.027928
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      4
│     17
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.029827
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      7
│     10
│     16
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -0.998566
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│     13
│     15
│     25
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.006139
[ Info: iteration 45, average log likelihood -1.056317
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│      7
│     17
│     22
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -0.992914
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│     10
│     13
│     15
│     16
│     25
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -0.993960
[ Info: iteration 48, average log likelihood -1.075488
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.015232
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      4
│     10
│     15
│     16
│      ⋮
│     25
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -0.972380
┌ Info: EM with 100000 data points 50 iterations avll -0.972380
└ 59.0 data points per parameter
32×26 Array{Float64,2}:
  0.147553     0.039354    -0.0227592    0.161081    -0.147144    -0.136728    -0.101189    -0.11541      -0.0495617    0.0758645    0.251771     -0.0846308    0.0534671    -0.079008    -0.0772997    -0.0511473    0.116906      0.129041    -0.0242149   -0.102026    -0.0481306   -0.0521476   -0.119514     0.00998791  -0.211519     0.0986936 
  0.0471248    0.0870358    0.125135    -0.0806344   -0.115497    -0.0811637   -0.071012     0.0269507     0.0698729   -0.0670484    0.0197501     0.00885193   0.182256     -0.0674272   -0.100245     -0.0627633   -0.000596044   0.0927598    0.106156     0.129593     0.00589238   0.0279198   -0.114889     0.13153     -0.012365    -0.00211467
  0.04325      0.247856    -0.122566     0.160762    -0.108074    -0.089916    -0.0317961   -0.0580339    -0.170226    -0.0029451    0.0540763     0.00665984   0.101818      0.0777387   -0.0628053    -0.00952529   0.0263467     0.132981     0.197612     0.0109674    0.10171     -0.128374    -0.0458552   -0.120863     0.0238932   -0.0262124 
  0.128415    -0.0725611   -0.114473     0.0639929   -0.175785    -0.173396    -0.0518561    0.148081     -0.0486286   -0.144543    -0.0924028     0.063782    -0.165627     -0.0239705    0.0156445     0.0905896   -0.00760078   -0.0915979    0.0211247   -0.0884056   -0.0228984    0.181664    -0.0124927   -0.111781     0.0555083   -0.0368093 
  0.0638123   -0.0986153    0.162167    -0.0281043   -0.0555765   -0.156342    -0.247888    -0.0737965     0.133811     0.0475822    0.0907273     0.197457     0.0219554    -0.198754    -0.152286      0.175407    -0.107927     -0.063109    -0.0350781   -0.0512796   -0.0648834    0.18166      0.0783892    0.134359    -0.0459058   -0.276626  
 -0.0828182   -0.0318445    0.232135     0.10127     -0.089007     0.00565519  -0.044767     0.000764818   0.04523      0.0318456   -0.22559      -0.103821     0.0919828     0.0281644    0.187968      0.131453    -0.0229363     0.0373829   -0.0605752   -0.101473    -0.110366    -0.217734     0.0536641   -0.0911914    0.120138    -0.214801  
  0.0743616    0.213166    -0.112815     0.114166     0.147424    -0.00651158   0.169772     0.157564      0.0752795   -0.149734     0.194812     -0.0117629   -0.0733765     0.147316    -0.183544     -0.0242664   -0.176974      0.0568222    0.0814334    0.00301999   0.0500543   -0.0129413    0.0628834    0.104112    -0.0536606   -0.0658328 
  0.0453732   -0.189819    -0.00077471  -0.0539015    0.0725524    0.0930546    0.0241115   -0.111371     -0.0758377   -0.153654    -0.0183924     0.0281846   -0.0223967    -0.00483253   0.261555     -0.0137039   -0.251514      0.0289176   -0.0638713   -0.0464442    0.034383    -0.0535436   -0.0268978   -0.0942678    0.0902075   -0.0299586 
 -0.0130371   -0.143407    -0.0538707   -0.103356     0.147175    -0.100201     0.0125759    0.0821775     0.126059    -0.00200933  -0.12035       0.10297      0.0821212    -0.0304693    0.0532544    -0.0432728    0.199098      0.014956    -0.0213086    0.0748151   -0.00875681  -0.149136     0.0140321   -0.133995     0.277509    -0.0115473 
  0.0656676    0.198595     0.0830752    0.0424669   -0.0269691   -0.116515     0.0918082    0.160418      0.00760179   0.103578    -0.0109665    -0.129233     0.137932     -0.198845     0.0872982     0.117056     0.0253276     0.0214025    0.0630273    0.110867    -0.0599942   -0.0173305    0.10394     -0.136164    -0.0656135    0.0373277 
  0.0419695   -0.0518834   -0.00171307   0.0523882    0.218236     0.166946    -0.0537946    0.00565787   -0.0025991   -0.168846    -0.130685      0.149139    -0.110151      0.121315    -0.0512883    -0.0796247   -0.208223     -0.0940814    0.0326192    0.088002     0.0325478    0.0605148   -0.108586     0.0996608   -0.173107    -0.0418882 
 -0.00602536   0.176788    -0.00267996  -0.0423991    0.185988     0.0792459    0.00315096  -0.0801116    -0.095311     0.0368575    0.17655      -0.0389309   -0.0277962    -0.00255994   0.224046     -0.0374248    0.0842031     0.0915615    0.00518258  -0.11776     -0.0589008    0.0355666   -0.0148472   -0.0100525   -0.00633902   0.124271  
  0.123544    -0.0895915   -0.0285692    0.0231894   -0.0156389    0.0108249    0.0955434    0.16847       0.0512029   -0.00509184  -0.0259226    -0.00895488  -0.129976     -0.0840938    0.080254     -0.226063    -0.0217435     0.15238      0.0864022   -0.0470526    0.094037    -0.0290547   -0.0456681   -0.0370239    0.15182      0.218823  
 -0.151239     0.196868     0.0399045   -0.0562048   -0.0701421    0.103783     0.0573022   -0.0204497     0.0443257   -0.0215387   -0.156622      0.0828317    0.0885626     0.142768    -0.159403      0.0534641    0.110909     -0.202826     0.0476205   -0.0572288    0.0449722   -0.0947461   -0.00709397   0.0972149    0.00472579  -0.144621  
 -0.0959721   -0.0682447    0.00256006   0.0311072    0.035249     0.0610724   -0.112612    -0.00375047    0.066188     0.0489688   -0.000150323   0.0362471   -0.0144469    -0.0564543   -0.058053      0.00061413  -0.13929      -0.0119406   -0.00194777  -0.214283     0.0264385    0.0371881   -0.160295    -0.0418887   -0.177525    -0.0883835 
  0.0550257   -0.110277    -0.0953964    0.0911421   -0.00616651   0.0612245    0.159975     0.00450148   -0.0301352   -0.0669179   -0.0711624    -0.0321535    0.226341      0.286767     0.014914     -0.0231759    0.0386973    -0.0177538   -0.0859636   -0.112844    -0.0157319    0.0104546    0.222353     0.0763083   -0.087701     0.00931174
 -0.120701    -0.00613549  -0.151687     0.0122029   -0.00558342   0.0313312    0.0600173    0.0262026    -0.0525977   -0.217932     0.0785728     0.198319     0.0951884     0.046873    -0.0289778    -0.122364     0.0648748     0.00533679   0.0529213   -0.0561532   -0.0715172   -0.00234943  -0.0349964    0.189477    -0.108575    -0.020418  
  0.199065    -0.227818     0.104975    -0.0334901    0.0445319   -0.162504     0.0259488   -0.0974918    -0.0885916   -0.0347575    0.0695974    -0.0818828   -0.0285709     0.126188    -0.208811     -0.0168763    0.176964      0.0183616    0.0557636   -0.0292732   -0.141633     0.0437809   -0.0285181   -0.0415953    0.00616615   0.251044  
  0.00678815  -0.0838947   -0.080099     0.0833358    0.0651112    0.0744366   -0.0594923    0.069906     -0.0281138   -0.0541705   -0.0825217     0.0605215    0.0191715    -0.00167302  -0.0661769     0.013233     0.156185      0.0475485   -0.00390585  -0.18214     -0.117155    -0.19291     -0.106539     0.106561     0.0177559    0.0080276 
  0.0618447    0.101797    -0.0725541    0.128584    -0.096103     0.0994345   -0.0162893    0.0858215     0.0543628   -0.131003    -0.0650145     0.00307987   0.103225     -0.184582    -0.0566822    -0.237365    -0.217475     -0.0554989   -0.0267822    0.0212884   -0.0764229   -0.0618031    0.0614888    0.104129    -0.101688    -0.0164011 
  0.124078    -0.025042     0.00284512   0.134801    -0.0622819    0.058728    -0.0325624    0.0319477    -0.102913    -0.0496284   -0.00824823   -0.0108128   -0.000696425  -0.099345    -0.0697419     0.0521701    0.175873     -0.0705617    0.389689     0.0402778   -0.00376368   0.0181141   -0.0926303   -0.0581025   -0.0237144    0.00962692
 -0.189934    -0.0203111    0.137215    -0.0985675   -0.00567546  -0.0234282    0.0613789   -0.175923     -0.198787     0.146891     0.140828      0.13801      0.0659285     0.121202     0.149458      0.0569886   -0.0742911    -0.191266     0.0573087    0.0237394    0.0192227    0.0872321    0.0679296   -0.0276151   -0.0437502   -0.0575138 
  0.00527023   0.112212     0.0705735   -0.080761    -0.0899362   -0.0623874    0.0512168   -0.05709       0.0297278   -0.0538133    0.0249295     0.0780642   -0.0381447    -0.0993146   -0.183242      0.0385132   -0.100521     -0.0317567    0.120964     0.0784823   -0.0962243    0.0738896    0.0729183   -0.0204059   -0.171145    -0.0274678 
  0.15372     -0.0310916   -0.0854744    0.00510152  -0.126872     0.0424372   -0.0852308    0.207976     -0.179804     0.0845457    0.0141099     0.1956       0.0252438     0.0175923   -0.0981193    -0.0658423    0.211865     -0.0547521   -0.549916     0.00542882   0.009247     0.00957934   0.00118541  -0.100882    -0.165576     0.0048553 
  0.00454566  -0.0953188   -0.0285841   -0.0147958    0.12583      0.0817756    0.0301156   -0.00875007    0.148901    -0.0326669   -0.0434678    -0.144336     0.0454769     0.124026    -0.110473     -0.0327387    0.173816      0.195305    -0.0134252   -0.0871807    0.172416    -0.0625039   -0.122385    -0.0220172   -0.112278     0.00384848
  0.202331     0.0155488   -0.0431142    0.151054    -0.0240651    0.218386    -0.0381926    0.0327138    -0.0532141    0.166552    -0.0561543     0.201314    -0.0719687     0.0313094    0.0792425    -0.0342992   -0.181589      0.0358387    0.00633381  -0.0389729    0.0987481   -0.0196558    0.0859325    0.042682    -0.133634    -0.00844894
 -0.0609474   -0.0433345   -0.136142     0.131276    -0.0982224   -0.0369761    0.146423    -0.0773183    -0.0465772   -0.0962374    0.0731993     0.0491515    0.0609269    -0.0524228    0.156003     -0.0858059   -0.0211479     0.0243048   -0.100606    -0.320939     0.0885884    0.05562     -0.127289     0.0585151    0.160184     0.0924529 
  0.176121    -0.0508722   -0.0573032   -0.0129576    0.0623634    0.0716824    0.061401    -0.0466492    -0.190374     0.158822    -0.0381697    -0.0576716   -0.0649896     0.0997313   -0.000254423  -0.0429438   -0.08651       0.0179545   -0.0490863   -0.0287322    0.110331     0.0172119   -0.0266021    0.0383785   -0.0505853    0.141622  
  0.195385    -0.0293443    0.0530212   -0.0114476   -0.0824844   -0.0777381   -0.00443892  -0.0262661     0.138816     0.0406969    0.115965      0.0239514    0.0656832    -0.0487908    0.0558388    -0.0980853   -0.0651801    -0.054941    -0.0133048    0.0229408   -0.0977641   -0.128515    -0.0580866    0.14338     -0.026727    -0.013707  
  0.0176852    0.0335231    0.0338885   -0.234593    -0.119435    -0.0513798   -0.154238    -0.0846543     0.0615524    0.0423949   -0.10081       0.0429108    0.0451987     0.0448169    0.141297     -0.0960283    0.0371444     0.135455     0.0495932   -0.0285901   -0.248098     0.0510709   -0.105098    -0.0577263    0.00766186   0.103312  
  0.196855     0.061426    -0.0197464   -0.0151634   -0.0897534   -0.116156    -0.0879368   -0.24666       0.123105     0.116343    -0.0247591    -0.128844    -0.0394631     0.355895     0.0361313    -0.257884     0.113235     -0.0496899    0.0383261    0.145798     0.0755743   -0.0758033   -0.0337305   -0.0594745   -0.0500887   -0.0830792 
 -0.184959    -0.120207     0.00705868  -0.0974773    0.026383     0.0252101    0.228807    -0.0209913    -0.0311683    0.143253    -0.0700654    0.00616272  -0.0343086    -0.0412294   -0.01749       0.139531     0.057374     -0.0843126    0.0473321   -0.10962     -0.177627    -0.0500325    0.0586914    0.101581    -0.0295009   -0.109451
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 1, average log likelihood -1.073725
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.025701
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -0.984348
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      4
│      7
│     10
│     13
│      ⋮
│     25
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -0.939563
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     17
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.063045
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      7
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.023609
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     13
│     15
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -0.976148
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      4
│      7
│     10
│     13
│     16
│     22
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -0.972217
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     17
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.039429
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      7
│     13
│     15
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.008283
┌ Info: EM with 100000 data points 10 iterations avll -1.008283
└ 59.0 data points per parameter
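The repeated "Variances had to be floored" warnings above come from a safeguard in EM for diagonal-covariance GMMs: when a component's variance collapses toward zero in the M-step, it is clamped to a minimum value and the indices of the affected Gaussians (e.g. `ind = [13]`) are reported. A minimal sketch of that idea, in Python rather than GaussianMixtures.jl's actual `train.jl` code, with the floor value chosen arbitrarily for illustration:

```python
import numpy as np

def floor_variances(variances, floor=1e-3):
    """Clamp diagonal GMM variances to a minimum value.

    variances: (n_components, n_dims) array of per-component variances.
    Returns the floored variances and the indices of components that
    had at least one variance below the floor.
    """
    too_small = (variances < floor).any(axis=1)   # which components collapsed
    floored = np.maximum(variances, floor)        # clamp element-wise
    return floored, np.flatnonzero(too_small)

# Component 1 has a near-zero variance and gets floored.
v = np.array([[0.5, 0.2], [1e-6, 0.3], [0.4, 0.9]])
fv, ind = floor_variances(v, floor=1e-3)
```

Flooring keeps the log-likelihood finite when a Gaussian latches onto very few points, which is why the warning can recur on different components across iterations, as in the log.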
32×26 Array{Float64,2}:
 -0.0166053    0.110563    -0.183208     0.0531379   -0.107064     0.0875804     0.0174845    0.0734103   -0.187315    -0.0867017     0.00663728   0.1305       0.0432916   -0.064135    -0.226168    -0.0772661    0.126304    -0.189283     0.0374428    0.143371    0.0710716    0.0592686  -0.00877961  -0.118197    -0.15155      0.00284188 
  0.0866441   -0.0124058    0.148401    -0.113142     0.0319453   -0.0442155    -0.0993497   -0.0552196   -0.0620519   -0.0847001     0.024268    -0.030692    -0.164241     0.165859     0.0736268    0.220004    -0.0133107    0.173356     0.00502596  -0.073486    0.0344046   -0.0907989   0.117935    -0.0833327    0.0160368    0.0661584  
 -0.106399    -0.0287169   -0.0878646    0.0786117    0.0526573    0.142258     -0.0434908   -0.196218    -0.00689858  -0.0059386     0.178388    -0.117749     0.117365    -0.147133    -0.0581622   -0.0733494   -0.00745703   0.140618    -0.0174298   -0.0409102   0.0236262    0.0571079   0.208139     0.186412     0.0116205    0.129076   
 -0.0176571   -0.0740013    0.0171362    0.00781413  -0.02292     -0.179712     -0.150145    -0.0150136   -0.0564729    0.0354077     0.0488137    0.0411339   -0.0417387    0.0313722   -0.122224     0.071503     0.0469055   -0.128566    -0.0306176    0.0311032  -0.164489    -0.140368   -0.0625282   -0.0406304    0.02132     -0.0316563  
 -0.0129363    0.0360591    0.0240092   -0.0110622   -0.0616043   -0.0361149     0.110282    -0.265499     0.1027      -0.0836457    -0.0471626    0.104634     0.00562882  -0.00698903  -0.0821488    0.0384022    0.0182561    0.13941      0.00873086   0.104592    0.158685     0.0130107   0.0382201    0.155485    -0.101366    -0.100936   
 -0.0214672   -0.0740233   -0.142981    -0.019342     0.224293    -0.09558       0.0771341    0.00197775  -0.0442212    0.000617424  -0.0833348    0.316135     0.0763279    0.0181609    0.0739827   -0.0617531   -0.0182141   -0.137596     0.0414144   -0.119197    0.0782597    0.0252814   0.13092      0.060469    -0.0502538    0.0326444  
 -0.191866    -0.0915219    0.0497031   -0.167042     0.00167871  -0.000844049  -0.0594837    0.163756     0.111947    -0.154315      0.121383    -0.00977406   0.078644     0.0147612   -0.057126     0.049441    -0.143602    -0.188966    -0.00791762   0.0265393   0.0611237   -0.102733   -0.0357652   -0.0272824   -0.00164753  -0.0289301  
  0.136453     0.0569433   -0.0143076   -0.0591554   -0.0220214   -0.0941974    -0.112754     0.0700449   -0.0661974   -0.056591      0.0614222    0.0475208   -0.104556     0.0955749    0.0127707    0.187895    -0.103002     0.0444651    0.00693975  -0.0833447  -0.181299     0.190087    0.125992     0.0953917    0.173824    -0.0911125  
 -0.134571    -0.0973093    0.041576    -0.14517      0.0919723    0.0899908    -0.0597728    0.179818    -0.0119241    0.167684     -0.085521    -0.142999     0.0776714    0.127891    -0.00607621  -0.0690067    0.038187     0.0236098   -0.0596023   -0.0357483   0.0144795    0.0427027   0.102675    -0.0351698    0.145105     0.000839031
  0.20957     -0.148288    -0.0232652    0.0580789   -0.0543663    0.137957      0.0740284    0.0513188   -0.0976995   -0.0349398     0.0205178    0.104499    -0.0179864    0.0710753   -0.115359    -0.193035    -0.0140457    0.0465309   -0.0106125    0.0188688  -0.00102193   0.132604   -0.154756    -0.0720027   -0.0123214   -0.0109225  
  0.0231542    0.105368     0.00989596  -0.10307      0.166838    -0.0238156    -0.144015    -0.082554     0.01862     -0.00933639    0.0223017    0.0258366    0.074297     0.0854452   -0.0508798    0.0349599    0.0964815   -0.17116      0.139312    -0.109879   -0.0138113   -0.105009    0.0107802    0.0763824    0.0379973    0.210176   
 -0.105635    -0.0430787   -0.0474494   -0.0784093   -0.128391     0.228521     -0.178558     0.0380452   -0.0655319    0.177462      0.0751915    0.124687     0.017326     0.203229     0.0132805   -0.00363862  -0.137619    -0.0660284    0.0615307   -0.129765    0.0749198   -0.0309976   0.0933521    0.0897899    0.0553001   -0.0268522  
 -0.0390345    0.132784     0.0369965   -0.0645956    0.0876376    0.057035     -0.0503627   -0.018768     0.191314     0.0764728     0.0884287   -0.0647158    0.0203878   -0.196343     0.111906     0.100442    -0.0346567    0.106076    -0.172014     0.0236839  -0.0550392    0.022169    0.0321407   -0.053996     0.0758497    0.151823   
 -0.0464855   -0.0474119   -0.0219125    0.00506045   0.145485     0.13365       0.0348496    0.0131854   -0.0641044   -0.118598     -0.186041    -0.0507061    0.0178008   -0.0140233    0.215832     0.123784    -0.0626555    0.0165298    0.0389631    0.130574    0.142499     0.0531042  -0.108878     0.265509    -0.10157     -0.0205082  
  0.00627643  -0.0702743    0.0815723    0.0086431   -0.0590052    0.189423      0.0161854   -0.193467    -0.0552885   -0.131153     -0.0942828    0.134988    -0.0853096    0.0586891   -0.0847791   -0.112076     0.0739149   -0.107936     0.0338742   -0.150189    0.144895     0.154156   -0.260109     0.0166955   -0.266554     0.00860972 
 -0.078177    -0.0784807    0.033158    -0.0739009    0.120891    -0.146561      0.0607101    0.0711641    0.139252     0.0633357     0.0660971    0.0646615    0.0185769   -0.044019    -0.0422151   -0.109149     0.133284    -0.00253921  -0.222656     0.283187    0.0403829   -0.0627766   0.0735082   -0.0998898   -0.0601356   -0.0113962  
 -0.09862     -0.103704     0.115699     0.0056879    0.09685     -0.0653046     0.010481    -0.0568703   -0.0494353    0.0042131     0.0918798   -0.0640056   -0.0937269   -0.0300051   -0.186661    -0.161158     0.0454047   -0.060499     0.0984647   -0.0955979  -0.0265041    0.125497   -0.0503099   -0.020935     0.140331     0.0225489  
 -0.081198    -0.0323119   -0.0356441   -0.0576732   -0.00135165  -0.119915      0.128433     0.00141455  -0.0986945    0.0614939    -0.0639538    0.0308676   -0.00182417  -0.170467    -0.0102998   -0.205581    -0.120352    -0.119224     0.181886    -0.217001    0.00801867  -0.0668751   0.0653878   -0.0431357    0.0982978   -0.169359   
 -0.112389     0.0105514   -0.0809683   -0.0266189    0.0100353    0.00323761    0.138912     0.00378762   0.0613482   -0.0567506     0.0692836   -0.116034    -0.05762      0.173472     0.134238     0.218633     0.0057995    0.031525     0.0863854   -0.117014    0.0220388    0.0932664   0.0536842   -0.0456212   -0.0126967   -0.0851244  
  0.170435    -0.0584165    0.102573    -0.00826485   0.0099219    0.0102814    -0.128517     0.0464894    0.0561165    0.0645227     0.0758341    0.109645    -0.0210094    0.0649709    0.141185    -0.0959658    0.102667     0.0965156    0.0542015    0.051631    0.207878     0.0407526   0.039191    -0.0750487   -0.0118499    0.188018   
 -0.0516641   -0.0794306   -0.184804    -0.0111966    0.0346325    0.0340253    -0.208461     0.0947194   -0.0336754   -0.0903492     0.123238    -0.018403     0.00189139  -0.132276     0.0723263    0.00479195   0.0298619    0.257705     0.120604     0.176823   -0.0604911    0.0712964   0.0752251   -0.0875501   -0.0602354    0.0137943  
  0.070173    -0.157144    -0.0606082    0.0167657   -0.167287    -0.129657     -0.0678799    0.00264824  -0.0250262    0.223033      0.00163684   0.12381      0.0392431   -0.210471     0.0587202   -0.0587197    0.0433744    0.0550197    0.0903382    0.0392032   0.0375674    0.0314138   0.0636824    0.106307    -0.0150172   -0.106873   
 -0.0132165   -0.0301602   -0.136806    -0.00709883   0.0272455   -0.0174273     0.184029     0.00394289  -0.0399201   -0.0395982    -0.0714181    0.202499    -0.0339105    0.00858009  -0.176208     0.00959546  -0.035157     0.0926649   -0.134443    -0.18327    -0.0572414   -0.0315018  -0.0483049   -0.0839783    0.0770866   -0.209311   
 -0.134698     0.125809    -0.0397577   -0.15921     -0.108139     0.0754011     0.0949287   -0.129934     0.0466325   -0.0638618    -0.00721124   0.0380835    0.0490442    0.135678    -0.134327    -0.101333    -0.169836    -0.0401033   -0.07307     -0.0528094  -0.0846364   -0.0730748  -0.193177    -0.0500143    0.00103127  -0.0528283  
  0.134638    -0.00221488   0.165488     0.00612338  -0.0366256    0.0077341     0.0689058   -0.132011    -0.0870728    0.0717198    -0.104664    -0.105746     0.180054    -0.00731562  -0.0456502   -0.121574    -0.04762     -0.0428203    0.0334944   -0.153409    0.0306087    0.055948    0.0815278    0.0174708    0.192126    -0.11626    
  0.0874546   -0.145213     0.162402    -0.079825    -0.0997162   -0.0296873    -0.16958      0.0603697    0.0184377   -0.125484      0.138633     0.00780917   0.128583     0.0862409   -0.124384    -0.0374915    0.00777008  -0.0126508    0.0724924    0.03195    -0.121572    -0.127151    0.0830894   -0.157031     0.193927    -0.0521432  
 -0.0650741    0.034562    -0.0896526    0.00712406  -0.0216786    0.0972731    -0.0946612    0.0515433    0.0388062    0.0516121    -0.142086     0.0584521    0.195911     0.0462126   -0.0582176    0.0401771    0.0625585    0.139987     0.00175878   0.0602285  -0.104716    -0.1187      0.136736    -0.0067276    0.0378927    0.0219839  
  0.0367133   -0.0345596   -0.103838     0.123649     0.177655     0.238886     -0.00377861  -0.198812    -0.0216598   -0.135918     -0.113743    -0.0295356    0.0714999   -0.0780567   -0.21188      0.214362     0.0724817   -0.0468891   -0.0700941    0.0889517   0.1209      -0.117681   -0.0559094    0.0567378   -0.0128873    0.0561133  
 -0.0500498    0.00234824   0.08609      0.056347    -0.047615    -0.0209122     0.069851    -0.0676556    0.0306691    0.0306343     0.0816705   -0.0671477    0.0546575   -0.179896     0.130734    -0.0819829    0.107287    -0.032454    -0.0894662   -0.164943   -0.128776    -0.0655849   0.0374987   -0.102393    -0.130545     0.0182284  
  0.00941461   0.105067     0.138569     0.0173381    0.0472794   -0.112611      0.252625    -0.0191546    0.0500095    0.109562      0.0636414   -0.0369295    0.0284634   -0.0296224    0.143161    -0.124215    -0.101331    -0.0562247   -0.164825    -0.148603   -0.0139978    0.153239    0.304355    -0.0291654    0.0641022    0.0972843  
 -0.0383383   -0.0487447   -0.0850408    0.0722634    0.0151722   -0.00406727    0.0334349    0.0602093    0.170915     0.080406     -0.164305     0.151336     0.00520804  -0.0029604   -0.0517673   -0.250616    -0.0981408    0.0498077    0.0666835    0.115486    0.182992    -0.0194626   0.0470159    0.00338595   0.0304496   -0.0124415  
  0.217809    -0.11017      0.0248037    0.113771     0.0111945   -0.0251462     0.00739596  -0.0652484   -0.0504971    0.0695666    -0.121258    -0.0975096    0.155945    -0.0601015   -0.0571722   -0.116266    -0.0597713   -0.112051    -0.0155478    0.105314    0.144641    -0.0477481  -0.0242431   -0.117837    -0.0135788    0.012339
kind full, method split
┌ Info: 0: avll = 
└   tll[1] = -1.4228581805290048
[ Info: Running 50 iterations EM on diag cov GMM with 2 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.422877
[ Info: iteration 2, average log likelihood -1.422812
[ Info: iteration 3, average log likelihood -1.422760
[ Info: iteration 4, average log likelihood -1.422693
[ Info: iteration 5, average log likelihood -1.422602
[ Info: iteration 6, average log likelihood -1.422474
[ Info: iteration 7, average log likelihood -1.422279
[ Info: iteration 8, average log likelihood -1.421954
[ Info: iteration 9, average log likelihood -1.421394
[ Info: iteration 10, average log likelihood -1.420522
[ Info: iteration 11, average log likelihood -1.419451
[ Info: iteration 12, average log likelihood -1.418510
[ Info: iteration 13, average log likelihood -1.417922
[ Info: iteration 14, average log likelihood -1.417633
[ Info: iteration 15, average log likelihood -1.417507
[ Info: iteration 16, average log likelihood -1.417453
[ Info: iteration 17, average log likelihood -1.417429
[ Info: iteration 18, average log likelihood -1.417419
[ Info: iteration 19, average log likelihood -1.417414
[ Info: iteration 20, average log likelihood -1.417412
[ Info: iteration 21, average log likelihood -1.417411
[ Info: iteration 22, average log likelihood -1.417410
[ Info: iteration 23, average log likelihood -1.417410
[ Info: iteration 24, average log likelihood -1.417409
[ Info: iteration 25, average log likelihood -1.417409
[ Info: iteration 26, average log likelihood -1.417408
[ Info: iteration 27, average log likelihood -1.417408
[ Info: iteration 28, average log likelihood -1.417408
[ Info: iteration 29, average log likelihood -1.417408
[ Info: iteration 30, average log likelihood -1.417408
[ Info: iteration 31, average log likelihood -1.417407
[ Info: iteration 32, average log likelihood -1.417407
[ Info: iteration 33, average log likelihood -1.417407
[ Info: iteration 34, average log likelihood -1.417407
[ Info: iteration 35, average log likelihood -1.417407
[ Info: iteration 36, average log likelihood -1.417407
[ Info: iteration 37, average log likelihood -1.417407
[ Info: iteration 38, average log likelihood -1.417407
[ Info: iteration 39, average log likelihood -1.417407
[ Info: iteration 40, average log likelihood -1.417406
[ Info: iteration 41, average log likelihood -1.417406
[ Info: iteration 42, average log likelihood -1.417406
[ Info: iteration 43, average log likelihood -1.417406
[ Info: iteration 44, average log likelihood -1.417406
[ Info: iteration 45, average log likelihood -1.417406
[ Info: iteration 46, average log likelihood -1.417406
[ Info: iteration 47, average log likelihood -1.417406
[ Info: iteration 48, average log likelihood -1.417406
[ Info: iteration 49, average log likelihood -1.417406
[ Info: iteration 50, average log likelihood -1.417406
┌ Info: EM with 100000 data points 50 iterations avll -1.417406
└ 952.4 data points per parameter
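The "data points per parameter" figures in these summaries are consistent with a diagonal-covariance GMM having n·2d mean and variance parameters plus n−1 free weights (the assumed formula below is an inference from the logged numbers, not taken from the package source). A quick check against the values reported for 2 through 32 Gaussians in 26 dimensions with 100000 points:

```python
def points_per_param(n_points, n_gauss, dim):
    """Data points per free parameter of a diagonal-covariance GMM."""
    # n_gauss means + n_gauss variances (each of length dim),
    # plus n_gauss - 1 free mixture weights (they sum to one).
    n_params = n_gauss * 2 * dim + (n_gauss - 1)
    return n_points / n_params

for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} Gaussians: {points_per_param(100_000, n, 26):.1f}")
```

This reproduces 952.4, 473.9, 236.4, 118.1, and 59.0, matching each "data points per parameter" line in the log.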
┌ Info: 1
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4228771432273186
│     -1.4228122112138295
│      ⋮                 
└     -1.4174061036192254
[ Info: Running 50 iterations EM on diag cov GMM with 4 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.417421
[ Info: iteration 2, average log likelihood -1.417356
[ Info: iteration 3, average log likelihood -1.417298
[ Info: iteration 4, average log likelihood -1.417226
[ Info: iteration 5, average log likelihood -1.417132
[ Info: iteration 6, average log likelihood -1.417018
[ Info: iteration 7, average log likelihood -1.416891
[ Info: iteration 8, average log likelihood -1.416765
[ Info: iteration 9, average log likelihood -1.416653
[ Info: iteration 10, average log likelihood -1.416561
[ Info: iteration 11, average log likelihood -1.416490
[ Info: iteration 12, average log likelihood -1.416439
[ Info: iteration 13, average log likelihood -1.416402
[ Info: iteration 14, average log likelihood -1.416376
[ Info: iteration 15, average log likelihood -1.416357
[ Info: iteration 16, average log likelihood -1.416343
[ Info: iteration 17, average log likelihood -1.416332
[ Info: iteration 18, average log likelihood -1.416322
[ Info: iteration 19, average log likelihood -1.416314
[ Info: iteration 20, average log likelihood -1.416306
[ Info: iteration 21, average log likelihood -1.416298
[ Info: iteration 22, average log likelihood -1.416291
[ Info: iteration 23, average log likelihood -1.416283
[ Info: iteration 24, average log likelihood -1.416276
[ Info: iteration 25, average log likelihood -1.416269
[ Info: iteration 26, average log likelihood -1.416262
[ Info: iteration 27, average log likelihood -1.416256
[ Info: iteration 28, average log likelihood -1.416249
[ Info: iteration 29, average log likelihood -1.416243
[ Info: iteration 30, average log likelihood -1.416237
[ Info: iteration 31, average log likelihood -1.416231
[ Info: iteration 32, average log likelihood -1.416225
[ Info: iteration 33, average log likelihood -1.416220
[ Info: iteration 34, average log likelihood -1.416215
[ Info: iteration 35, average log likelihood -1.416211
[ Info: iteration 36, average log likelihood -1.416207
[ Info: iteration 37, average log likelihood -1.416203
[ Info: iteration 38, average log likelihood -1.416199
[ Info: iteration 39, average log likelihood -1.416196
[ Info: iteration 40, average log likelihood -1.416193
[ Info: iteration 41, average log likelihood -1.416190
[ Info: iteration 42, average log likelihood -1.416188
[ Info: iteration 43, average log likelihood -1.416185
[ Info: iteration 44, average log likelihood -1.416183
[ Info: iteration 45, average log likelihood -1.416181
[ Info: iteration 46, average log likelihood -1.416180
[ Info: iteration 47, average log likelihood -1.416178
[ Info: iteration 48, average log likelihood -1.416177
[ Info: iteration 49, average log likelihood -1.416176
[ Info: iteration 50, average log likelihood -1.416175
┌ Info: EM with 100000 data points 50 iterations avll -1.416175
└ 473.9 data points per parameter
┌ Info: 2
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4174212726599202
│     -1.4173556354141132
│      ⋮                 
└     -1.416174856319443 
[ Info: Running 50 iterations EM on diag cov GMM with 8 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.416184
[ Info: iteration 2, average log likelihood -1.416136
[ Info: iteration 3, average log likelihood -1.416097
[ Info: iteration 4, average log likelihood -1.416052
[ Info: iteration 5, average log likelihood -1.415997
[ Info: iteration 6, average log likelihood -1.415929
[ Info: iteration 7, average log likelihood -1.415847
[ Info: iteration 8, average log likelihood -1.415754
[ Info: iteration 9, average log likelihood -1.415654
[ Info: iteration 10, average log likelihood -1.415552
[ Info: iteration 11, average log likelihood -1.415453
[ Info: iteration 12, average log likelihood -1.415363
[ Info: iteration 13, average log likelihood -1.415282
[ Info: iteration 14, average log likelihood -1.415213
[ Info: iteration 15, average log likelihood -1.415154
[ Info: iteration 16, average log likelihood -1.415106
[ Info: iteration 17, average log likelihood -1.415068
[ Info: iteration 18, average log likelihood -1.415037
[ Info: iteration 19, average log likelihood -1.415012
[ Info: iteration 20, average log likelihood -1.414991
[ Info: iteration 21, average log likelihood -1.414975
[ Info: iteration 22, average log likelihood -1.414961
[ Info: iteration 23, average log likelihood -1.414948
[ Info: iteration 24, average log likelihood -1.414938
[ Info: iteration 25, average log likelihood -1.414928
[ Info: iteration 26, average log likelihood -1.414919
[ Info: iteration 27, average log likelihood -1.414910
[ Info: iteration 28, average log likelihood -1.414902
[ Info: iteration 29, average log likelihood -1.414894
[ Info: iteration 30, average log likelihood -1.414886
[ Info: iteration 31, average log likelihood -1.414878
[ Info: iteration 32, average log likelihood -1.414870
[ Info: iteration 33, average log likelihood -1.414862
[ Info: iteration 34, average log likelihood -1.414854
[ Info: iteration 35, average log likelihood -1.414846
[ Info: iteration 36, average log likelihood -1.414838
[ Info: iteration 37, average log likelihood -1.414829
[ Info: iteration 38, average log likelihood -1.414821
[ Info: iteration 39, average log likelihood -1.414812
[ Info: iteration 40, average log likelihood -1.414803
[ Info: iteration 41, average log likelihood -1.414795
[ Info: iteration 42, average log likelihood -1.414785
[ Info: iteration 43, average log likelihood -1.414776
[ Info: iteration 44, average log likelihood -1.414767
[ Info: iteration 45, average log likelihood -1.414758
[ Info: iteration 46, average log likelihood -1.414748
[ Info: iteration 47, average log likelihood -1.414739
[ Info: iteration 48, average log likelihood -1.414729
[ Info: iteration 49, average log likelihood -1.414720
[ Info: iteration 50, average log likelihood -1.414710
┌ Info: EM with 100000 data points 50 iterations avll -1.414710
└ 236.4 data points per parameter
┌ Info: 3
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4161835782526506
│     -1.416136376992469 
│      ⋮                 
└     -1.4147099841116832
[ Info: Running 50 iterations EM on diag cov GMM with 16 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.414709
[ Info: iteration 2, average log likelihood -1.414640
[ Info: iteration 3, average log likelihood -1.414573
[ Info: iteration 4, average log likelihood -1.414495
[ Info: iteration 5, average log likelihood -1.414399
[ Info: iteration 6, average log likelihood -1.414285
[ Info: iteration 7, average log likelihood -1.414157
[ Info: iteration 8, average log likelihood -1.414020
[ Info: iteration 9, average log likelihood -1.413885
[ Info: iteration 10, average log likelihood -1.413755
[ Info: iteration 11, average log likelihood -1.413636
[ Info: iteration 12, average log likelihood -1.413530
[ Info: iteration 13, average log likelihood -1.413435
[ Info: iteration 14, average log likelihood -1.413353
[ Info: iteration 15, average log likelihood -1.413283
[ Info: iteration 16, average log likelihood -1.413222
[ Info: iteration 17, average log likelihood -1.413170
[ Info: iteration 18, average log likelihood -1.413125
[ Info: iteration 19, average log likelihood -1.413086
[ Info: iteration 20, average log likelihood -1.413052
[ Info: iteration 21, average log likelihood -1.413022
[ Info: iteration 22, average log likelihood -1.412994
[ Info: iteration 23, average log likelihood -1.412969
[ Info: iteration 24, average log likelihood -1.412946
[ Info: iteration 25, average log likelihood -1.412925
[ Info: iteration 26, average log likelihood -1.412905
[ Info: iteration 27, average log likelihood -1.412887
[ Info: iteration 28, average log likelihood -1.412869
[ Info: iteration 29, average log likelihood -1.412853
[ Info: iteration 30, average log likelihood -1.412837
[ Info: iteration 31, average log likelihood -1.412823
[ Info: iteration 32, average log likelihood -1.412808
[ Info: iteration 33, average log likelihood -1.412795
[ Info: iteration 34, average log likelihood -1.412782
[ Info: iteration 35, average log likelihood -1.412770
[ Info: iteration 36, average log likelihood -1.412758
[ Info: iteration 37, average log likelihood -1.412746
[ Info: iteration 38, average log likelihood -1.412735
[ Info: iteration 39, average log likelihood -1.412725
[ Info: iteration 40, average log likelihood -1.412715
[ Info: iteration 41, average log likelihood -1.412705
[ Info: iteration 42, average log likelihood -1.412695
[ Info: iteration 43, average log likelihood -1.412686
[ Info: iteration 44, average log likelihood -1.412677
[ Info: iteration 45, average log likelihood -1.412669
[ Info: iteration 46, average log likelihood -1.412660
[ Info: iteration 47, average log likelihood -1.412652
[ Info: iteration 48, average log likelihood -1.412645
[ Info: iteration 49, average log likelihood -1.412637
[ Info: iteration 50, average log likelihood -1.412630
┌ Info: EM with 100000 data points 50 iterations avll -1.412630
└ 118.1 data points per parameter
┌ Info: 4
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4147094830649485
│     -1.4146398813876038
│      ⋮                 
└     -1.4126298090301337
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.412631
[ Info: iteration 2, average log likelihood -1.412563
[ Info: iteration 3, average log likelihood -1.412498
[ Info: iteration 4, average log likelihood -1.412423
[ Info: iteration 5, average log likelihood -1.412330
[ Info: iteration 6, average log likelihood -1.412214
[ Info: iteration 7, average log likelihood -1.412075
[ Info: iteration 8, average log likelihood -1.411916
[ Info: iteration 9, average log likelihood -1.411747
[ Info: iteration 10, average log likelihood -1.411577
[ Info: iteration 11, average log likelihood -1.411415
[ Info: iteration 12, average log likelihood -1.411264
[ Info: iteration 13, average log likelihood -1.411128
[ Info: iteration 14, average log likelihood -1.411006
[ Info: iteration 15, average log likelihood -1.410899
[ Info: iteration 16, average log likelihood -1.410806
[ Info: iteration 17, average log likelihood -1.410724
[ Info: iteration 18, average log likelihood -1.410652
[ Info: iteration 19, average log likelihood -1.410589
[ Info: iteration 20, average log likelihood -1.410532
[ Info: iteration 21, average log likelihood -1.410482
[ Info: iteration 22, average log likelihood -1.410437
[ Info: iteration 23, average log likelihood -1.410396
[ Info: iteration 24, average log likelihood -1.410359
[ Info: iteration 25, average log likelihood -1.410324
[ Info: iteration 26, average log likelihood -1.410293
[ Info: iteration 27, average log likelihood -1.410263
[ Info: iteration 28, average log likelihood -1.410236
[ Info: iteration 29, average log likelihood -1.410210
[ Info: iteration 30, average log likelihood -1.410186
[ Info: iteration 31, average log likelihood -1.410163
[ Info: iteration 32, average log likelihood -1.410141
[ Info: iteration 33, average log likelihood -1.410120
[ Info: iteration 34, average log likelihood -1.410100
[ Info: iteration 35, average log likelihood -1.410081
[ Info: iteration 36, average log likelihood -1.410063
[ Info: iteration 37, average log likelihood -1.410045
[ Info: iteration 38, average log likelihood -1.410028
[ Info: iteration 39, average log likelihood -1.410011
[ Info: iteration 40, average log likelihood -1.409995
[ Info: iteration 41, average log likelihood -1.409980
[ Info: iteration 42, average log likelihood -1.409964
[ Info: iteration 43, average log likelihood -1.409950
[ Info: iteration 44, average log likelihood -1.409935
[ Info: iteration 45, average log likelihood -1.409921
[ Info: iteration 46, average log likelihood -1.409908
[ Info: iteration 47, average log likelihood -1.409894
[ Info: iteration 48, average log likelihood -1.409881
[ Info: iteration 49, average log likelihood -1.409868
[ Info: iteration 50, average log likelihood -1.409856
┌ Info: EM with 100000 data points 50 iterations avll -1.409856
└ 59.0 data points per parameter
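The "59.0 data points per parameter" figure appears consistent with counting, for a diagonal-covariance GMM, one mean and one variance entry per dimension per Gaussian plus the free mixture weights. A minimal sketch (the formula is an assumption inferred from the reported numbers, not taken from the package source):

```julia
# Hedged sketch: assumed free-parameter count of a diagonal-covariance GMM:
# ng Gaussians × (d means + d variances) plus ng-1 free mixture weights
# (the weights sum to one, so one is determined by the others).
nparams_diag(ng, d) = ng * 2d + (ng - 1)

ng, d, n = 32, 26, 100_000
println(round(n / nparams_diag(ng, d), digits=1))   # ≈ the reported 59.0
```

With 32 Gaussians in 26 dimensions that gives 1695 free parameters, and 100000 / 1695 rounds to 59.0, matching the log.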
┌ Info: 5
│   avll =
│    50-element Array{Float64,1}:
│     -1.4126313860696968
│     -1.4125627107909937
│      ⋮                 
└     -1.409856094077938 
┌ Info: Total log likelihood: 
│   tll =
│    251-element Array{Float64,1}:
│     -1.4228581805290048
│     -1.4228771432273186
│     -1.4228122112138295
│     -1.4227596367746478
│      ⋮                 
│     -1.4098811549005763
│     -1.409868464102259 
└     -1.409856094077938 
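The "average log likelihood" values near -1.41 are far too small in magnitude for an unnormalized 26-dimensional log density, so they appear to be normalized per data point and per dimension (an assumption inferred from the magnitudes, not from the package source). A self-contained sketch of that quantity for a diagonal-covariance mixture, using a numerically stable log-sum-exp over components:

```julia
# Hedged sketch: average log-likelihood of a diagonal-covariance GMM,
# assumed to be normalized by both the number of points and the dimension.
function avll_diag(x, w, μ, σ²)
    n, d = size(x)
    total = 0.0
    for i in 1:n
        # log p(x_i) = logsumexp_k [ log w_k + log N(x_i; μ_k, diag(σ²_k)) ]
        lps = [log(w[k]) - 0.5 * sum(log.(2π .* σ²[k, :]) .+
               (x[i, :] .- μ[k, :]).^2 ./ σ²[k, :]) for k in axes(w, 1)]
        m = maximum(lps)
        total += m + log(sum(exp.(lps .- m)))
    end
    return total / (n * d)
end

# Single standard-normal component on standard-normal data: the expected
# per-point, per-dimension value is -0.5*(1 + log(2π)) ≈ -1.419, close to
# the ≈ -1.41 values in the log above.
x = randn(5000, 2)
println(avll_diag(x, [1.0], zeros(1, 2), ones(1, 2)))
```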
32×26 Array{Float64,2}:
  0.441228   -0.272838     0.0796862  -0.0229342  -0.394662     0.00819622   0.131763    0.181003     0.126813   -0.345309    -0.16209      0.016548    0.152587     0.0810581  -0.48004      0.215345   -0.0780401    -0.135078     0.0672647  -0.300837     0.310834     0.00418339   0.471778   -0.00430801   0.197198    -0.0155901
  0.794313    0.330556     0.199918    0.208015    0.0853269    0.149569    -0.336467   -0.0355373    0.385639    0.662656     0.409408     0.186165    0.226732    -0.186405    0.520545    -0.647929   -0.0820761     0.0366115   -0.0691201   0.0486865    0.237588     0.260288     0.849429    0.195131     0.0830657   -0.288288 
  0.389642    0.329546    -0.424639    0.261078   -0.171869    -0.199976     0.179754   -0.151141    -0.209583   -0.22456      0.365306    -0.218161    0.353752    -0.158881    0.0766385    0.376841   -0.0761789     1.04598     -0.0260167   0.156196    -0.531061    -0.221938     0.101773   -0.482927     0.373948     0.473671 
  0.103943    0.644677     0.654545   -0.334218   -0.233267     0.153911    -0.922607    0.177692     0.213065   -0.210091    -0.0287765   -0.268186    0.0105422    0.14274     0.202102    -0.451887   -0.272895     -0.0636895   -0.340292   -0.30687     -0.26544      0.0405398    0.178546   -0.417982     0.00120749   1.11894  
 -0.197828   -0.426428     0.0125243  -0.0551011  -0.459021     0.511331     0.0523172  -0.419759    -0.513559   -0.707996    -0.250351    -0.11595    -0.380418     0.680179    0.473147     0.524178   -0.138345     -0.732509     0.0328357  -0.1542      -0.14507      0.492463    -0.438626   -0.130408    -0.883997     0.442793 
 -0.200076   -0.189446     0.158308   -0.475395   -0.105608     0.0554051   -0.161435   -0.0700609    0.139692    0.464671    -0.118333    -0.295346   -0.00747581   0.158931    0.0377704   -0.183522    0.0177944    -0.518179     0.138547   -0.629193     0.476996    -0.160596    -0.12066     0.588774    -0.775492    -0.239128 
 -0.0710875  -0.598335     0.192451   -0.675147   -0.340007    -0.918606    -0.0238825  -0.00149546   0.151444    0.191547    -0.117176    -0.116008    0.00751063  -0.164975   -0.826634     0.486727   -0.409619     -0.292705    -0.0135105  -0.0144372    0.0542094   -0.471111     0.126467   -0.0680045    0.0841596    0.121401 
 -0.379386    0.461829     0.24851    -0.125834    0.204149     0.706416     0.430148    0.0427891    0.633575   -0.129943    -0.229168     0.0842242  -0.00136897   0.0518246  -1.09406      0.517069    0.103254      0.0220896   -0.266862   -0.44339      0.0297165   -0.434917    -0.387737    0.672047     0.178685    -0.0377387
 -0.535911    0.140203     0.276022   -0.0964732   0.329458    -0.0548203   -0.181642   -0.402136     0.240989    0.311116     0.0761052    0.482769   -0.0698943   -0.0689223   0.114706    -0.155816    0.197479     -0.100554    -0.369624    0.250345    -0.88619      0.0607614   -0.566004   -0.230802     0.0507242    0.0666626
 -0.514107    0.0563566   -0.545614    0.173697    0.379736    -0.157201     0.212376    0.163374    -0.555638   -0.0268919   -0.0420235   -0.24562    -0.152717    -0.460289    0.148706    -0.191606    0.0292371     0.0947055    0.247195    0.323732     0.461025     0.0392844   -0.415898    0.195728     0.00549815  -0.366086 
 -0.176825   -0.0567808   -0.117822    0.580206    0.160073     0.353401    -0.128351    0.593906    -0.544709   -0.612103    -0.754083     0.493947    0.0206741    0.918256    0.00701345   0.0930291   0.485673      0.0305964   -0.299924   -0.324001    -0.0465685   -0.12806     -0.0844413   0.496716    -0.378664    -0.180624 
  0.108434   -0.21125     -0.792984    0.404377    0.655938     0.0355151    0.285404   -0.730322     0.494823   -0.00866143  -0.278781     0.193461    0.654294     0.279673   -0.0829109    0.182404    0.608351      0.0491936   -0.023341   -0.608699    -0.430629     0.685281     0.250557    0.40841     -0.0021111    0.0571324
 -0.318222   -0.00759703   0.791216   -0.709451   -0.209177     0.0261883    0.0975242   0.380187    -0.210266   -0.0151796    0.125299    -0.447088   -0.5296       0.0948146   0.183642    -0.223999   -0.698249     -0.0284928    0.167811    0.388474     0.622331    -0.466348    -0.202174   -0.198987     0.292344    -0.114426 
 -0.648596   -0.15773     -0.751096    0.381518    0.424438     0.123426    -0.130199    0.779534    -0.22687    -0.196736     0.284192    -0.759354   -0.239646    -0.149067   -0.0372801    0.102396    0.241662     -0.0722375    0.625994    0.354142     0.00525292  -0.12813     -0.310244   -0.0966805    0.0473502   -0.27234  
 -0.0919342  -0.820829     0.0625824  -0.0281925  -0.0125085    0.260773    -0.365373    0.160979     0.434523    0.167752     0.321752     0.935504   -0.210051     0.196181    0.0232261   -0.150726    0.0752304    -0.256549     0.220296    0.464046    -0.0289456    0.213425    -0.2223      0.194868     0.0210916   -0.456407 
 -0.317227    0.0440996    0.48204     0.400381   -0.197476     0.0247417   -0.689154    0.539016    -0.373474    0.266572    -0.00679703   0.795294   -0.3326      -0.370409   -0.323484    -0.512767   -0.0907671     0.27143     -0.0936298   0.793843     0.448224    -0.254627    -0.40345    -0.123372     0.0405335    0.050764 
 -0.0663812   0.24496     -0.211423    0.233447    0.352366    -0.426532     0.476767   -0.205556    -0.984327   -0.119776    -0.0835671   -0.385639   -0.164505    -0.307388   -0.428844    -0.110692    0.118849      0.118715    -0.266759   -0.346998     0.528026    -0.0971309    0.158188   -0.258175     0.0775184    0.268614 
 -0.0701719   0.430916    -0.153024   -0.0473318   0.289073    -0.447126     0.497577   -0.0527995    0.428845    0.559455     0.431728    -0.363429    0.0193529   -1.07231    -0.612343    -0.609093    0.0161245     0.482417     0.614299   -0.314029     0.182422    -0.113384     0.143973   -0.386383     0.585557    -0.415337 
 -0.598296    0.0592178   -0.386543   -0.253037    0.196271    -0.262742     0.454668   -0.220729    -0.089939   -0.0941496   -0.235738     0.0811016  -0.152545    -0.0593558  -0.0741726    0.12461     0.351017      0.238683     0.232184    0.332716    -0.80258     -0.108038    -0.677644   -0.0277993    0.172724    -0.289504 
 -0.086223   -0.369356    -0.0693115   0.0241982   0.20812     -0.135269     0.63925    -0.0551637   -0.292838   -0.259113    -0.157468    -0.0359982   0.0804952   -0.0452512  -0.252131     0.344785    0.148201      0.0203398   -0.0624609   0.461672     0.751605    -0.0810785   -0.550364    0.0426132    0.223429     0.0233163
 -0.0402553  -0.0894167   -0.0136889   0.0189062  -0.0953503    0.119215    -0.0940662   0.306061    -0.113197   -0.0249359   -0.0941734   -0.245371   -0.00608292  -0.0536296  -0.165749     0.140478   -0.000624626  -0.196631     0.119138   -0.131031     0.334892    -0.0414926    0.156512    0.100402    -0.0672937   -0.115994 
  0.11925    -0.0258724   -0.0109448  -0.0446458   0.14182     -0.0438083    0.0945991  -0.219876     0.0280409   0.0684298    0.103713     0.223354    0.0318516    0.0897066   0.100986    -0.22534     0.0747826     0.139558    -0.0623995   0.11753     -0.170178     0.0674808    0.0132175   0.0217998    0.00056404  -0.0190709
 -0.526739    0.155465     0.089061   -0.153954    0.00705902  -0.479846    -0.224668   -0.228493     0.249986   -0.122033    -0.193761     0.0228479   0.332675    -0.0356911  -0.340967     0.22362    -0.240794     -0.335815    -0.168794   -0.129476    -0.362765    -0.286152    -0.171955   -0.360592    -0.323558     0.199903 
 -0.165991    0.395927     0.209533    0.333026   -0.184603     0.699517    -0.613262    0.057493     0.0122881  -0.373347    -0.043092    -0.205054    0.17457      0.115214    0.123301     0.292877   -0.0490711     0.170804    -0.168047    0.152567    -0.188482     0.0487772   -0.16884    -0.0634246    0.193498     0.409336 
 -0.18967     0.0566953   -0.144963    0.161601    0.359388     0.00520491  -0.724978    0.134567    -0.154165    0.249113     0.157669    -0.0736647   0.0149136    0.0671468   0.683192    -0.724623    0.104098     -0.18105      0.160942    0.0429961   -0.338701     0.256927     0.129541   -0.163438    -0.347487     0.0197295
 -0.277542   -0.195515     0.134522   -0.264292   -0.2033       0.402675    -0.224517   -0.10687     -0.178707    0.496848     0.120228    -0.496768   -0.30473     -0.186403    0.447642     0.166915   -0.0726692    -0.00451919  -0.14054    -0.173876    -0.278552     0.746696     0.329977    0.46874      0.129714    -0.0819844
  0.737096   -0.52292     -0.291241    0.171047   -0.211801     0.167903    -0.140772    0.361825    -0.873955   -0.0547286    0.223937     0.195296   -0.0610534    0.109303    0.486977    -0.259412   -0.2089        0.0612716    0.300687    0.345206     0.282945     0.270862     0.339888    0.0679111    0.431375    -0.268943 
  0.74602    -0.346874    -0.500489   -0.0252858   0.0207727    0.25018      0.223199    0.371594     0.384496   -0.441248     0.0817344   -0.108311   -0.0892228    0.87226    -0.0406793   -0.0571993   0.141751     -0.00159719   0.643092   -0.280759     0.135445    -0.247301     0.485423   -0.283484     0.0444978   -0.26723  
  0.222484    0.0102348   -0.19152     0.289938    0.0672412   -0.171861     0.0280123  -0.155351     0.371011   -0.383413    -0.133128     0.402818    0.281062    -0.477916   -0.357409    -0.0199522   0.216953     -0.423735    -0.344212   -0.42172     -0.202439     0.0704508    0.781826   -0.690406     0.272371     0.0606588
  0.679345    0.0401709    0.316395   -0.147616   -0.324044    -0.109692     0.158358   -0.0829903    0.422634    0.0479471   -0.0364368    0.337111    0.628579     0.018245   -0.169154     0.137137   -0.261429      0.139314    -0.263775   -0.170067     0.21511      0.0717346    0.610152    0.194192     0.00989169   0.17286  
  0.572066   -0.0758014    0.52399    -0.473607   -0.460226    -0.0367961    0.340559   -1.02992      0.0412344   0.0587586   -0.229808     0.948477   -0.0113884    0.238511    0.218888    -0.427432   -0.458546      0.174342    -0.431942   -0.330892     0.0266685    0.14009      0.367875    0.0619313   -0.179856     0.123143 
  0.127456    0.583988     0.0343554   0.314995    0.33153      0.342346     0.528399   -0.620359    -0.0580136   0.262087     0.192868     0.381354    0.136336    -0.0948278   0.817912    -0.245325    0.462022      0.107996    -0.386117    0.00487736   0.024338    -0.0466896   -0.434039   -0.119787    -0.151585    -0.310258
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.409844
[ Info: iteration 2, average log likelihood -1.409832
[ Info: iteration 3, average log likelihood -1.409821
[ Info: iteration 4, average log likelihood -1.409810
[ Info: iteration 5, average log likelihood -1.409799
[ Info: iteration 6, average log likelihood -1.409788
[ Info: iteration 7, average log likelihood -1.409778
[ Info: iteration 8, average log likelihood -1.409768
[ Info: iteration 9, average log likelihood -1.409758
kind full, method kmeans
[ Info: iteration 10, average log likelihood -1.409749
┌ Info: EM with 100000 data points 10 iterations avll -1.409749
└ 59.0 data points per parameter
[ Info: Initializing GMM, 32 Gaussians diag covariance 26 dimensions using 100000 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       9.778891e+05
      1       7.058346e+05      -2.720546e+05 |       32
      2       6.933066e+05      -1.252792e+04 |       32
      3       6.882476e+05      -5.059015e+03 |       32
      4       6.855857e+05      -2.661924e+03 |       32
      5       6.837962e+05      -1.789526e+03 |       32
      6       6.824679e+05      -1.328256e+03 |       32
      7       6.814295e+05      -1.038382e+03 |       32
      8       6.805557e+05      -8.738057e+02 |       32
      9       6.797801e+05      -7.756667e+02 |       32
     10       6.791563e+05      -6.237701e+02 |       32
     11       6.786508e+05      -5.054859e+02 |       32
     12       6.782197e+05      -4.311323e+02 |       32
     13       6.778474e+05      -3.722878e+02 |       32
     14       6.775223e+05      -3.250886e+02 |       32
     15       6.772281e+05      -2.942247e+02 |       32
     16       6.769817e+05      -2.463616e+02 |       32
     17       6.767626e+05      -2.190790e+02 |       32
     18       6.765674e+05      -1.952304e+02 |       32
     19       6.764029e+05      -1.644823e+02 |       32
     20       6.762509e+05      -1.519964e+02 |       32
     21       6.761124e+05      -1.385427e+02 |       32
     22       6.759791e+05      -1.333198e+02 |       32
     23       6.758498e+05      -1.293071e+02 |       32
     24       6.757332e+05      -1.165800e+02 |       32
     25       6.756217e+05      -1.114263e+02 |       32
     26       6.755230e+05      -9.877762e+01 |       32
     27       6.754272e+05      -9.576538e+01 |       32
     28       6.753227e+05      -1.045166e+02 |       32
     29       6.752264e+05      -9.632382e+01 |       32
     30       6.751255e+05      -1.008349e+02 |       32
     31       6.750275e+05      -9.805879e+01 |       32
     32       6.749302e+05      -9.723471e+01 |       32
     33       6.748224e+05      -1.078030e+02 |       32
     34       6.747141e+05      -1.083108e+02 |       32
     35       6.746181e+05      -9.598959e+01 |       32
     36       6.745368e+05      -8.132468e+01 |       32
     37       6.744710e+05      -6.579681e+01 |       32
     38       6.744170e+05      -5.405384e+01 |       32
     39       6.743618e+05      -5.515528e+01 |       32
     40       6.743077e+05      -5.411319e+01 |       32
     41       6.742540e+05      -5.368987e+01 |       32
     42       6.741956e+05      -5.835543e+01 |       32
     43       6.741341e+05      -6.150538e+01 |       32
     44       6.740697e+05      -6.438739e+01 |       32
     45       6.740092e+05      -6.058855e+01 |       32
     46       6.739528e+05      -5.632333e+01 |       32
     47       6.738981e+05      -5.472760e+01 |       32
     48       6.738448e+05      -5.332296e+01 |       32
     49       6.737934e+05      -5.138437e+01 |       32
     50       6.737464e+05      -4.698303e+01 |       32
K-means terminated without convergence after 50 iterations (objv = 673746.4189568642)
┌ Info: K-means with 32000 data points using 50 iterations
└ 37.0 data points per parameter
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.421672
[ Info: iteration 2, average log likelihood -1.416605
[ Info: iteration 3, average log likelihood -1.415118
[ Info: iteration 4, average log likelihood -1.413890
[ Info: iteration 5, average log likelihood -1.412635
[ Info: iteration 6, average log likelihood -1.411656
[ Info: iteration 7, average log likelihood -1.411107
[ Info: iteration 8, average log likelihood -1.410832
[ Info: iteration 9, average log likelihood -1.410677
[ Info: iteration 10, average log likelihood -1.410572
[ Info: iteration 11, average log likelihood -1.410491
[ Info: iteration 12, average log likelihood -1.410425
[ Info: iteration 13, average log likelihood -1.410368
[ Info: iteration 14, average log likelihood -1.410319
[ Info: iteration 15, average log likelihood -1.410274
[ Info: iteration 16, average log likelihood -1.410233
[ Info: iteration 17, average log likelihood -1.410195
[ Info: iteration 18, average log likelihood -1.410160
[ Info: iteration 19, average log likelihood -1.410127
[ Info: iteration 20, average log likelihood -1.410097
[ Info: iteration 21, average log likelihood -1.410068
[ Info: iteration 22, average log likelihood -1.410040
[ Info: iteration 23, average log likelihood -1.410014
[ Info: iteration 24, average log likelihood -1.409989
[ Info: iteration 25, average log likelihood -1.409965
[ Info: iteration 26, average log likelihood -1.409942
[ Info: iteration 27, average log likelihood -1.409921
[ Info: iteration 28, average log likelihood -1.409900
[ Info: iteration 29, average log likelihood -1.409880
[ Info: iteration 30, average log likelihood -1.409861
[ Info: iteration 31, average log likelihood -1.409843
[ Info: iteration 32, average log likelihood -1.409826
[ Info: iteration 33, average log likelihood -1.409809
[ Info: iteration 34, average log likelihood -1.409793
[ Info: iteration 35, average log likelihood -1.409777
[ Info: iteration 36, average log likelihood -1.409763
[ Info: iteration 37, average log likelihood -1.409748
[ Info: iteration 38, average log likelihood -1.409734
[ Info: iteration 39, average log likelihood -1.409721
[ Info: iteration 40, average log likelihood -1.409708
[ Info: iteration 41, average log likelihood -1.409696
[ Info: iteration 42, average log likelihood -1.409684
[ Info: iteration 43, average log likelihood -1.409672
[ Info: iteration 44, average log likelihood -1.409661
[ Info: iteration 45, average log likelihood -1.409649
[ Info: iteration 46, average log likelihood -1.409639
[ Info: iteration 47, average log likelihood -1.409628
[ Info: iteration 48, average log likelihood -1.409617
[ Info: iteration 49, average log likelihood -1.409607
[ Info: iteration 50, average log likelihood -1.409597
┌ Info: EM with 100000 data points 50 iterations avll -1.409597
└ 59.0 data points per parameter
32×26 Array{Float64,2}:
 -0.022205    0.322157     0.289527    0.00648285   0.0148847   0.623632    -0.987735    0.00446466    0.441722    -0.405641     0.0971016   -0.0820963   0.170743     0.580103    0.432535    -0.124695   -0.0753165    0.160299    -0.403482    -0.0703845   -0.600261    0.12209    -0.0984033   -0.303089      0.00660742   1.05021   
  0.30904    -0.427966     0.0686122   0.235581     0.196277   -0.39562      0.510966   -0.0762371    -0.0786304   -0.321303    -0.0663084    0.478357    0.0971364   -0.0277925  -0.5443       0.0968212   0.00431295  -0.0617586   -0.22318      0.419799     0.51183    -0.278792   -0.0800787   -0.548394      0.292131     0.120821  
 -0.0795694   0.217019    -0.12042    -0.0733821    0.35901    -0.660724     0.505967   -0.216565      0.00337464   0.490365     0.352984    -0.420342   -0.138018    -1.08895    -0.846074    -0.459116   -0.0864803    0.379463     0.555238    -0.399927     0.279458   -0.128602    0.156837    -0.339471      0.519994    -0.306649  
 -0.527314    0.196834     0.0498817   0.137017     0.614184   -0.0847569   -0.0614911  -0.0393042    -0.26481      0.211069    -0.549389    -0.454033    0.643603    -0.0446769   0.078701     0.283651    0.00029495  -0.101713    -0.233841     0.0284389    0.317531   -0.149527   -0.508472     0.158612     -0.661115     0.233283  
  0.0717964  -0.00655952   0.21644    -0.147417    -0.395587    0.501052     0.550474   -0.748082     -0.604111    -0.177574    -0.207053     0.0295069  -0.531879     0.409704    0.805257     0.167636    0.0221518   -0.47439      0.0175798   -0.540666     0.136746    0.270572   -0.675529    -0.0680925    -1.07872     -0.0276701 
 -0.595826   -0.284088    -0.147781   -0.486276    -0.42219     0.336126    -0.225825    0.0027283    -0.276476     0.512229     0.133888    -0.786588   -0.346971    -0.517209    0.667006     0.133679    0.00860494   0.502215    -0.00181034  -0.17811     -0.339192    0.512997    0.220034     0.97719      -0.101054     0.0426564 
 -0.228392    0.0436184    0.805942   -0.385235    -0.123442    0.406825    -0.324536   -0.000425316   0.0241365    0.251699     0.127978    -0.148685   -0.42294      0.0388924   0.142701    -0.210683   -0.454445    -0.336091    -0.197131    -0.0740298    0.0724677   0.229677   -0.00369233  -0.00191377    0.125942    -0.00492157
  0.594281   -0.606669    -0.224906   -0.0927571   -0.435166    0.134172    -0.130346    0.0873727    -0.373054    -0.283764    -0.00748415   0.365624   -0.19661      0.57426     0.723068    -0.277213   -0.192216    -0.319035     0.549481     0.437929    -0.104803    0.427629    0.584386    -0.098993      0.324042    -0.106102  
  0.362405   -0.092741    -0.995518    0.23671      0.487817   -0.01678      0.574596   -0.555515      0.291826    -0.231477    -0.218128     0.0191737   0.601246     0.255513    2.18634e-5   0.266779    0.504388     0.154465     0.174148    -0.87273     -0.353557    0.307488    0.212012     0.129952      0.0416706    0.05654   
  0.114352   -0.0478561   -0.0442196   0.00791036   0.0556338   0.0876975    0.0105427  -0.0137349    -0.0607076    0.0282727    0.00835789   0.0190348   0.0359643    0.0109179   0.0704466   -0.0905404   0.0956413   -0.00261078  -0.0193371    0.0209966    0.0639789   0.102565    0.0931862    0.0930243    -0.0201682   -0.0366427 
 -0.176387    0.146901     0.132988   -0.358321    -0.0713065  -0.62195     -0.0823144  -0.236587      0.278669     0.0603468    0.157316     0.352633    0.337256    -0.142982   -0.0717782   -0.1267     -0.107936    -0.0599449   -0.0419737    0.125883    -0.498924   -0.287135   -0.167242    -0.440132     -0.138081     0.106319  
  0.773962    0.089331     0.501145   -0.231174    -0.447491   -0.00240826   0.212212   -0.491781      0.275804     0.070606    -0.102972     0.741225    0.482639    -0.0011771  -0.0512935   -0.0875547  -0.332319     0.278375    -0.588533    -0.284349     0.213185    0.179336    0.665039     0.155822     -0.019708     0.22393   
 -0.187554   -0.492551    -0.0508733   0.0278179   -0.175845    0.0739554   -0.153464    0.356443     -0.124308    -0.596754    -0.440373    -0.0730404  -0.137096     0.464156   -0.441469     0.409526    0.179326    -0.243775     0.0431634   -0.246635     0.15102    -0.182497    0.036042     0.23354      -0.276738     0.0089671 
 -1.09745    -0.120292    -0.28767    -0.459269     0.237612   -0.216149     0.0738353  -0.507399      0.0721696   -0.0486913   -0.442483     0.236016   -0.104303    -0.206898   -0.317151     0.193755    0.0151977   -0.507364     0.0139819    0.493995    -0.503783    0.18402    -0.72376      0.271401     -0.0529029   -0.166935  
 -0.125157   -0.226701     0.120762   -0.0063884   -0.032702   -0.142754    -0.0978813  -0.0528416    -0.0866615   -0.105474    -0.128657     0.149212   -0.00168835   0.120178   -0.254217     0.12476    -0.0847975   -0.117268    -0.0836626   -0.00722454  -0.115226    0.0449683  -0.0662909   -0.0426356    -0.00354449   0.12848   
 -0.771198   -0.442733    -0.325335   -0.105711     0.930944    0.267785    -0.243604    0.570341     -0.0339919    0.28701      0.40878     -0.287092   -0.566961     0.426285    0.657607    -0.513155    0.420009    -0.486944     0.348665     0.157764    -0.451139   -0.397564   -0.358747    -0.550119     -0.141931    -0.521778  
  0.400555   -0.0714201   -0.396267   -0.034415    -0.430634    0.0331611   -0.359094    0.116634      0.481414     0.228412     0.104371     0.0878321  -0.12944      0.267435    0.0465171   -0.304676   -0.15561     -0.0961593    0.538615    -0.524153    -0.114102   -0.374923    0.781098    -0.238416     -0.0854949   -0.314867  
 -0.0113838   0.369774    -0.563067    0.337899     0.289511    0.0376233    0.357556    0.215076      0.192092    -0.13852      0.591848    -0.0108784   0.451704    -0.56642    -0.0460304    0.1289      0.0713037    0.766852    -0.130627     0.691262    -0.313752   -0.241634   -0.0642982   -0.420033      0.630635    -0.0625317 
  0.541775   -0.518414    -0.272008    0.0462082    0.103726    0.392149     0.48144     0.544325     -0.579852    -0.109395     0.187833    -0.177113   -0.14052      0.275604    0.0242339    0.0893978  -0.0476958    0.197583     0.232905     0.187278     0.743382   -0.0354806  -0.0970485    0.265902      0.446838    -0.610647  
  0.213689   -0.428516    -0.164272   -0.0727578    0.384405   -0.0225173    0.199534   -0.330917      0.529062     0.437941     0.390684     0.431346    0.24385      0.282651    0.216094    -0.369863    0.133118     0.0384968    0.107409     0.172252    -0.0919079   0.437132    0.178398     0.552853     -0.280555    -0.433849  
  0.357697    0.671573     0.185033    0.21929      0.205366    0.219968     0.0202843  -0.148933      0.275618     0.538844     0.216655     0.303713   -0.098394    -0.299417    0.731551    -1.08422     0.174887     0.268634    -0.0734383    0.0251148    0.314027    0.120608    0.233434     0.0996342     0.226408    -0.505289  
  0.207716    0.259738     0.0168752  -0.196299    -0.792911   -0.473216    -0.103246    0.219422     -0.654036    -0.388147    -0.092117    -0.808888    0.060909    -0.247651    0.0236153    0.172729   -0.285955     0.221428    -0.00792848   0.0250053    0.399641   -0.182532    0.225038    -0.486448     -0.0537754    0.588432  
 -0.122502    0.366728    -0.0121974   0.285912     0.129721   -0.0806301    0.214522   -0.405704     -0.542941     0.0439414    0.0237555    0.236132   -0.160186     0.162378    0.349953    -0.198574    0.370251     0.680387    -0.189982     0.223835    -0.470359   -0.0292346  -0.325539    -0.249853      0.22576      0.145221  
  0.104338   -0.502156     0.108919   -0.724147    -0.369555   -0.329292    -0.233459    0.194451      0.407725     0.24665      0.144436    -0.158067    0.0333343   -0.249045   -0.431797     0.177657   -0.186719    -0.900912    -0.0896696   -0.55879      0.455948   -0.283006    0.360424     0.298667     -0.484558    -0.302097  
  0.30368    -0.107148     0.4181      0.308582     0.0900382   1.24591     -0.0625449  -0.201873      1.12379     -0.0190343   -0.15494      0.87624     0.137404    -0.0726552   0.0399403    0.592237    0.970529    -0.140636    -0.0330316    0.136969    -0.512319   -0.231007   -0.675903     0.348615      0.344294    -0.116714  
 -0.493363   -0.286884     0.31152     0.217266    -0.252115    0.0228376   -0.683304    0.658329     -0.283427     0.00965869   0.0571038    0.385376   -0.698531    -0.196902   -0.176085    -0.314136   -0.142082     0.150976     0.218538     0.939531     0.411148   -0.221155   -0.561894    -0.0150734    -0.0875594   -0.117643  
 -0.0479854   0.237415     0.0323209   0.189457     0.235713   -0.228313    -0.203423   -0.338012      0.289379    -0.218299    -0.333687     0.0339476   0.237155    -0.244425   -0.290243     0.0073088   0.237612    -0.640278    -0.611272    -0.815776    -0.464299    0.272127    0.716017    -0.648156     -0.0146002    0.214327  
  0.141737    0.179823    -0.0361396   0.617282     0.193288    0.228794    -0.927226    0.360849     -0.432621     0.24297      0.262895     0.0402799   0.264788    -0.238683    0.30461     -0.486293    0.0182206   -0.140386     0.141214     0.0251201    0.0502269   0.418182    0.271378     0.0195747    -0.185873     0.127977  
 -0.440902    0.0755199   -0.722629    0.291318     0.459708    0.0153707    0.349011    0.183029     -0.378705    -0.226371     0.0247278   -0.54913    -0.278401    -0.319223   -0.110642     0.0334789   0.350542    -0.106258     0.318196     0.114272     0.224389    0.0398051  -0.325579     0.000718942   0.139021    -0.178206  
 -0.462908    0.733968     0.180355   -0.0517872    0.197389    0.604929     0.589637    0.165783      0.480367    -0.122494    -0.283607     0.0230803  -0.152671     0.137094   -1.21422      0.461818    0.24233      0.241495    -0.421172    -0.657918     0.136032   -0.488601   -0.377414     0.671332     -0.00346627  -0.144758  
 -0.0224551   0.198381    -0.0471393   0.229364    -0.242806    0.108548     0.0105255   0.062284      0.0905566   -0.137843     0.0953308   -0.196013    0.440959    -0.101833   -0.498322     0.713881   -0.377435     0.289817     0.067659    -0.179775    -0.14792    -0.0665043   0.197001    -0.0882855     0.24575      0.166117  
 -0.161627   -0.0449995    0.635993   -0.901874    -0.194603   -0.418839     0.407286    0.0752666     0.336898     0.176438    -0.327337    -0.210649    0.0279047    0.250185   -0.119387    -0.0491902  -0.559147    -0.0151527    0.259547     0.208526     0.61277    -0.670189   -0.00916694   0.235631      0.676275    -0.0190885
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.409587
[ Info: iteration 2, average log likelihood -1.409577
[ Info: iteration 3, average log likelihood -1.409568
[ Info: iteration 4, average log likelihood -1.409558
[ Info: iteration 5, average log likelihood -1.409549
[ Info: iteration 6, average log likelihood -1.409540
[ Info: iteration 7, average log likelihood -1.409531
[ Info: iteration 8, average log likelihood -1.409521
[ Info: iteration 9, average log likelihood -1.409513
[ Info: iteration 10, average log likelihood -1.409504
┌ Info: EM with 100000 data points 10 iterations avll -1.409504
└ 59.0 data points per parameter
[ Info: Initializing GMM, 2 Gaussians diag covariance 2 dimensions using 900 data points
┌ Info: K-means with 900 data points using 3 iterations
└ 150.0 data points per parameter
[ Info: Running 10 iterations EM on full cov GMM with 2 Gaussians in 2 dimensions
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.678561e+05
      1       2.230230e+04      -1.455538e+05 |        2
      2       7.823675e+03      -1.447862e+04 |        0
      3       7.823675e+03       0.000000e+00 |        0
K-means converged with 3 iterations (objv = 7823.67549422947)
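The `objv` column in the k-means traces above is presumably the within-cluster sum of squared distances, with convergence declared once the per-iteration change reaches zero (an assumption based on standard k-means, not on the Clustering.jl source). A small sketch of that objective:

```julia
# Hedged sketch: assumed k-means objective — total squared Euclidean
# distance from each point to its nearest centroid.
function kmeans_objv(x, centroids)
    total = 0.0
    for i in axes(x, 1)
        total += minimum(sum(abs2, x[i, :] .- centroids[c, :])
                         for c in axes(centroids, 1))
    end
    return total
end

x = [0.0 0.0; 0.1 0.0; 5.0 5.0; 5.1 5.0]   # two tight clusters
c = [0.05 0.0; 5.05 5.0]                   # centroids at the cluster means
println(kmeans_objv(x, c))                 # ≈ 0.01: four residuals of 0.05, squared
```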
[ Info: iteration 1, average log likelihood -2.043155
[ Info: iteration 2, average log likelihood -2.043154
[ Info: iteration 3, average log likelihood -2.043154
[ Info: iteration 4, average log likelihood -2.043154
[ Info: iteration 5, average log likelihood -2.043154
[ Info: iteration 6, average log likelihood -2.043154
[ Info: iteration 7, average log likelihood -2.043154
[ Info: iteration 8, average log likelihood -2.043154
[ Info: iteration 9, average log likelihood -2.043154
[ Info: iteration 10, average log likelihood -2.043154
┌ Info: EM with 900 data points 10 iterations avll -2.043154
└ 81.8 data points per parameter
   Testing GaussianMixtures tests passed 

Results with Julia v1.3.0

Testing was successful. The last evaluation took 7 minutes, 58 seconds.


 Resolving package versions...
 Installed URIParser ────────── v0.4.0
 Installed GaussianMixtures ─── v0.3.0
 Installed SortingAlgorithms ── v0.3.1
 Installed JLD ──────────────── v0.9.1
 Installed FileIO ───────────── v1.1.0
 Installed Arpack ───────────── v0.3.1
 Installed DataStructures ───── v0.17.6
 Installed StaticArrays ─────── v0.12.1
 Installed QuadGK ───────────── v2.1.1
 Installed Compat ───────────── v2.2.0
 Installed StatsFuns ────────── v0.9.0
 Installed BinaryProvider ───── v0.5.8
 Installed Rmath ────────────── v0.5.1
 Installed Missings ─────────── v0.4.3
 Installed NearestNeighbors ─── v0.4.4
 Installed Distributions ────── v0.21.9
 Installed LegacyStrings ────── v0.4.1
 Installed CMakeWrapper ─────── v0.2.3
 Installed OrderedCollections ─ v1.1.0
 Installed Parameters ───────── v0.12.0
 Installed SpecialFunctions ─── v0.8.0
 Installed Distances ────────── v0.8.2
 Installed BinDeps ──────────── v0.8.10
 Installed DataAPI ──────────── v1.1.0
 Installed Blosc ────────────── v0.5.1
 Installed ScikitLearnBase ──── v0.5.0
 Installed CMake ────────────── v1.1.2
 Installed PDMats ───────────── v0.9.10
 Installed StatsBase ────────── v0.32.0
 Installed Clustering ───────── v0.13.3
 Installed HDF5 ─────────────── v0.12.5
  Updating `~/.julia/environments/v1.3/Project.toml`
  [cc18c42c] + GaussianMixtures v0.3.0
  Updating `~/.julia/environments/v1.3/Manifest.toml`
  [7d9fca2a] + Arpack v0.3.1
  [9e28174c] + BinDeps v0.8.10
  [b99e7846] + BinaryProvider v0.5.8
  [a74b3585] + Blosc v0.5.1
  [631607c0] + CMake v1.1.2
  [d5fb7624] + CMakeWrapper v0.2.3
  [aaaa29a8] + Clustering v0.13.3
  [34da2185] + Compat v2.2.0
  [9a962f9c] + DataAPI v1.1.0
  [864edb3b] + DataStructures v0.17.6
  [b4f34e82] + Distances v0.8.2
  [31c24e10] + Distributions v0.21.9
  [5789e2e9] + FileIO v1.1.0
  [cc18c42c] + GaussianMixtures v0.3.0
  [f67ccb44] + HDF5 v0.12.5
  [4138dd39] + JLD v0.9.1
  [1b4a561d] + LegacyStrings v0.4.1
  [e1d29d7a] + Missings v0.4.3
  [b8a86587] + NearestNeighbors v0.4.4
  [bac558e1] + OrderedCollections v1.1.0
  [90014a1f] + PDMats v0.9.10
  [d96e819e] + Parameters v0.12.0
  [1fd47b50] + QuadGK v2.1.1
  [79098fc4] + Rmath v0.5.1
  [6e75b9c4] + ScikitLearnBase v0.5.0
  [a2af1166] + SortingAlgorithms v0.3.1
  [276daf66] + SpecialFunctions v0.8.0
  [90137ffa] + StaticArrays v0.12.1
  [2913bbd2] + StatsBase v0.32.0
  [4c63d2b9] + StatsFuns v0.9.0
  [30578b45] + URIParser v0.4.0
  [2a0f44e3] + Base64 
  [ade2ca70] + Dates 
  [8bb1440f] + DelimitedFiles 
  [8ba89e20] + Distributed 
  [b77e0a4c] + InteractiveUtils 
  [76f85450] + LibGit2 
  [8f399da3] + Libdl 
  [37e2e46d] + LinearAlgebra 
  [56ddb016] + Logging 
  [d6f4376e] + Markdown 
  [a63ad114] + Mmap 
  [44cfe95a] + Pkg 
  [de0858da] + Printf 
  [9abbd945] + Profile 
  [3fa0cd96] + REPL 
  [9a3f8284] + Random 
  [ea8e919c] + SHA 
  [9e88b42a] + Serialization 
  [1a1011a3] + SharedArrays 
  [6462fe0b] + Sockets 
  [2f01184e] + SparseArrays 
  [10745b16] + Statistics 
  [4607b0f0] + SuiteSparse 
  [8dfed614] + Test 
  [cf7118a7] + UUIDs 
  [4ec0a83e] + Unicode 
  Building Arpack ──────────→ `~/.julia/packages/Arpack/cu5By/deps/build.log`
  Building Rmath ───────────→ `~/.julia/packages/Rmath/4wt82/deps/build.log`
  Building SpecialFunctions → `~/.julia/packages/SpecialFunctions/ne2iw/deps/build.log`
  Building CMake ───────────→ `~/.julia/packages/CMake/nSK2r/deps/build.log`
  Building Blosc ───────────→ `~/.julia/packages/Blosc/lzFr0/deps/build.log`
  Building HDF5 ────────────→ `~/.julia/packages/HDF5/Zh9on/deps/build.log`
   Testing GaussianMixtures
    Status `/tmp/jl_oLIhsP/Manifest.toml`
  [7d9fca2a] Arpack v0.3.1
  [9e28174c] BinDeps v0.8.10
  [b99e7846] BinaryProvider v0.5.8
  [a74b3585] Blosc v0.5.1
  [631607c0] CMake v1.1.2
  [d5fb7624] CMakeWrapper v0.2.3
  [aaaa29a8] Clustering v0.13.3
  [34da2185] Compat v2.2.0
  [9a962f9c] DataAPI v1.1.0
  [864edb3b] DataStructures v0.17.6
  [b4f34e82] Distances v0.8.2
  [31c24e10] Distributions v0.21.9
  [5789e2e9] FileIO v1.1.0
  [cc18c42c] GaussianMixtures v0.3.0
  [f67ccb44] HDF5 v0.12.5
  [4138dd39] JLD v0.9.1
  [1b4a561d] LegacyStrings v0.4.1
  [e1d29d7a] Missings v0.4.3
  [b8a86587] NearestNeighbors v0.4.4
  [bac558e1] OrderedCollections v1.1.0
  [90014a1f] PDMats v0.9.10
  [d96e819e] Parameters v0.12.0
  [1fd47b50] QuadGK v2.1.1
  [79098fc4] Rmath v0.5.1
  [6e75b9c4] ScikitLearnBase v0.5.0
  [a2af1166] SortingAlgorithms v0.3.1
  [276daf66] SpecialFunctions v0.8.0
  [90137ffa] StaticArrays v0.12.1
  [2913bbd2] StatsBase v0.32.0
  [4c63d2b9] StatsFuns v0.9.0
  [30578b45] URIParser v0.4.0
  [2a0f44e3] Base64  [`@stdlib/Base64`]
  [ade2ca70] Dates  [`@stdlib/Dates`]
  [8bb1440f] DelimitedFiles  [`@stdlib/DelimitedFiles`]
  [8ba89e20] Distributed  [`@stdlib/Distributed`]
  [b77e0a4c] InteractiveUtils  [`@stdlib/InteractiveUtils`]
  [76f85450] LibGit2  [`@stdlib/LibGit2`]
  [8f399da3] Libdl  [`@stdlib/Libdl`]
  [37e2e46d] LinearAlgebra  [`@stdlib/LinearAlgebra`]
  [56ddb016] Logging  [`@stdlib/Logging`]
  [d6f4376e] Markdown  [`@stdlib/Markdown`]
  [a63ad114] Mmap  [`@stdlib/Mmap`]
  [44cfe95a] Pkg  [`@stdlib/Pkg`]
  [de0858da] Printf  [`@stdlib/Printf`]
  [9abbd945] Profile  [`@stdlib/Profile`]
  [3fa0cd96] REPL  [`@stdlib/REPL`]
  [9a3f8284] Random  [`@stdlib/Random`]
  [ea8e919c] SHA  [`@stdlib/SHA`]
  [9e88b42a] Serialization  [`@stdlib/Serialization`]
  [1a1011a3] SharedArrays  [`@stdlib/SharedArrays`]
  [6462fe0b] Sockets  [`@stdlib/Sockets`]
  [2f01184e] SparseArrays  [`@stdlib/SparseArrays`]
  [10745b16] Statistics  [`@stdlib/Statistics`]
  [4607b0f0] SuiteSparse  [`@stdlib/SuiteSparse`]
  [8dfed614] Test  [`@stdlib/Test`]
  [cf7118a7] UUIDs  [`@stdlib/UUIDs`]
  [4ec0a83e] Unicode  [`@stdlib/Unicode`]
[ Info: Testing Data
(100000, -1.3296494081103204e7, [94882.51209294474, 5117.487907055261], [-6763.862331130266 1053.5716545757464 -784.977133017846; 7441.029179691554 -1076.8770493687725 951.3450014159525], Array{Float64,2}[[87518.67333606655 408.6664351429119 596.1063722847459; 408.66643514291195 94880.2530762904 -158.09981021565625; 596.1063722847459 -158.0998102156562 94898.32243667853], [12634.607741102463 -749.0687387994843 -1056.835667108604; -749.0687387994843 4817.415018387235 351.03807470307066; -1056.835667108604 351.03807470307066 4566.936035357224]])
┌ Warning: rmprocs: process 1 not removed
└ @ Distributed /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.3/Distributed/src/cluster.jl:1015
[ Info: Initializing GMM, 8 Gaussians diag covariance 2 dimensions using 272 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.836751e+03
      1       1.282650e+03      -5.541010e+02 |        7
      2       1.168711e+03      -1.139389e+02 |        6
      3       1.143076e+03      -2.563493e+01 |        0
      4       1.143076e+03       0.000000e+00 |        0
K-means converged with 4 iterations (objv = 1143.0757529455818)
┌ Info: K-means with 272 data points using 4 iterations
└ 11.3 data points per parameter
[ Info: Running 0 iterations EM on full cov GMM with 8 Gaussians in 2 dimensions
┌ Info: EM with 272 data points 0 iterations avll -2.057410
└ 5.8 data points per parameter
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:221
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:221
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:221
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:221
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex at broadcast.jl:630 [inlined]
└ @ Core ./broadcast.jl:630
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:230
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:230
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex at broadcast.jl:630 [inlined]
└ @ Core ./broadcast.jl:630
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex_evalf at broadcast.jl:630 [inlined]
└ @ Core ./broadcast.jl:630
[ Info: iteration 1, lowerbound -3.699687
[ Info: iteration 2, lowerbound -3.592318
[ Info: iteration 3, lowerbound -3.471854
[ Info: iteration 4, lowerbound -3.322291
[ Info: iteration 5, lowerbound -3.146924
[ Info: iteration 6, lowerbound -2.964973
[ Info: iteration 7, lowerbound -2.808308
[ Info: dropping number of Gaussians to 7
[ Info: iteration 8, lowerbound -2.693055
[ Info: dropping number of Gaussians to 6
[ Info: iteration 9, lowerbound -2.611779
[ Info: dropping number of Gaussians to 5
[ Info: iteration 10, lowerbound -2.547222
[ Info: dropping number of Gaussians to 4
[ Info: iteration 11, lowerbound -2.496321
[ Info: dropping number of Gaussians to 3
[ Info: iteration 12, lowerbound -2.444720
[ Info: iteration 13, lowerbound -2.396533
[ Info: iteration 14, lowerbound -2.357996
[ Info: iteration 15, lowerbound -2.328416
[ Info: iteration 16, lowerbound -2.310896
[ Info: iteration 17, lowerbound -2.308144
[ Info: dropping number of Gaussians to 2
[ Info: iteration 18, lowerbound -2.302918
[ Info: iteration 19, lowerbound -2.299260
[ Info: iteration 20, lowerbound -2.299256
[ Info: iteration 21, lowerbound -2.299254
[ Info: iteration 22, lowerbound -2.299254
[ Info: iteration 23, lowerbound -2.299253
[ Info: iteration 24, lowerbound -2.299253
[ Info: iteration 25, lowerbound -2.299253
[ Info: iteration 26, lowerbound -2.299253
[ Info: iteration 27, lowerbound -2.299253
[ Info: iteration 28, lowerbound -2.299253
[ Info: iteration 29, lowerbound -2.299253
[ Info: iteration 30, lowerbound -2.299253
[ Info: iteration 31, lowerbound -2.299253
[ Info: iteration 32, lowerbound -2.299253
[ Info: iteration 33, lowerbound -2.299253
[ Info: iteration 34, lowerbound -2.299253
[ Info: iteration 35, lowerbound -2.299253
[ Info: iteration 36, lowerbound -2.299253
[ Info: iteration 37, lowerbound -2.299253
[ Info: iteration 38, lowerbound -2.299253
[ Info: iteration 39, lowerbound -2.299253
[ Info: iteration 40, lowerbound -2.299253
[ Info: iteration 41, lowerbound -2.299253
[ Info: iteration 42, lowerbound -2.299253
[ Info: iteration 43, lowerbound -2.299253
[ Info: iteration 44, lowerbound -2.299253
[ Info: iteration 45, lowerbound -2.299253
[ Info: iteration 46, lowerbound -2.299253
[ Info: iteration 47, lowerbound -2.299253
[ Info: iteration 48, lowerbound -2.299253
[ Info: iteration 49, lowerbound -2.299253
[ Info: 50 variational Bayes EM-like iterations using 272 data points, final lowerbound -2.299253
History[Tue Dec  3 01:03:41 2019: Initializing GMM, 8 Gaussians diag covariance 2 dimensions using 272 data points
, Tue Dec  3 01:03:48 2019: K-means with 272 data points using 4 iterations
11.3 data points per parameter
, Tue Dec  3 01:03:50 2019: EM with 272 data points 0 iterations avll -2.057410
5.8 data points per parameter
, Tue Dec  3 01:03:51 2019: GMM converted to Variational GMM
, Tue Dec  3 01:04:00 2019: iteration 1, lowerbound -3.699687
, Tue Dec  3 01:04:00 2019: iteration 2, lowerbound -3.592318
, Tue Dec  3 01:04:00 2019: iteration 3, lowerbound -3.471854
, Tue Dec  3 01:04:00 2019: iteration 4, lowerbound -3.322291
, Tue Dec  3 01:04:00 2019: iteration 5, lowerbound -3.146924
, Tue Dec  3 01:04:00 2019: iteration 6, lowerbound -2.964973
, Tue Dec  3 01:04:00 2019: iteration 7, lowerbound -2.808308
, Tue Dec  3 01:04:00 2019: dropping number of Gaussians to 7
, Tue Dec  3 01:04:00 2019: iteration 8, lowerbound -2.693055
, Tue Dec  3 01:04:00 2019: dropping number of Gaussians to 6
, Tue Dec  3 01:04:00 2019: iteration 9, lowerbound -2.611779
, Tue Dec  3 01:04:00 2019: dropping number of Gaussians to 5
, Tue Dec  3 01:04:00 2019: iteration 10, lowerbound -2.547222
, Tue Dec  3 01:04:00 2019: dropping number of Gaussians to 4
, Tue Dec  3 01:04:00 2019: iteration 11, lowerbound -2.496321
, Tue Dec  3 01:04:00 2019: dropping number of Gaussians to 3
, Tue Dec  3 01:04:00 2019: iteration 12, lowerbound -2.444720
, Tue Dec  3 01:04:00 2019: iteration 13, lowerbound -2.396533
, Tue Dec  3 01:04:00 2019: iteration 14, lowerbound -2.357996
, Tue Dec  3 01:04:00 2019: iteration 15, lowerbound -2.328416
, Tue Dec  3 01:04:00 2019: iteration 16, lowerbound -2.310896
, Tue Dec  3 01:04:00 2019: iteration 17, lowerbound -2.308144
, Tue Dec  3 01:04:01 2019: dropping number of Gaussians to 2
, Tue Dec  3 01:04:01 2019: iteration 18, lowerbound -2.302918
, Tue Dec  3 01:04:01 2019: iteration 19, lowerbound -2.299260
, Tue Dec  3 01:04:01 2019: iteration 20, lowerbound -2.299256
, Tue Dec  3 01:04:01 2019: iteration 21, lowerbound -2.299254
, Tue Dec  3 01:04:01 2019: iteration 22, lowerbound -2.299254
, Tue Dec  3 01:04:01 2019: iteration 23, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 24, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 25, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 26, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 27, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 28, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 29, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 30, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 31, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 32, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 33, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 34, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 35, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 36, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 37, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 38, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 39, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 40, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 41, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 42, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 43, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 44, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 45, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 46, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 47, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 48, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: iteration 49, lowerbound -2.299253
, Tue Dec  3 01:04:01 2019: 50 variational Bayes EM-like iterations using 272 data points, final lowerbound -2.299253
]
α = [178.04509222601396, 95.9549077739861]
β = [178.04509222601396, 95.9549077739861]
m = [4.250300733269908 79.2868669443618; 2.0002292577753695 53.851987172461286]
ν = [180.04509222601396, 97.9549077739861]
W = LinearAlgebra.UpperTriangular{Float64,Array{Float64,2}}[[0.18404155547484777 -0.007644049042327639; 0.0 0.008581705166333407], [0.3758763611948421 -0.008953123827346077; 0.0 0.012748664777409385]]
Kind: diag, size256
nx: 100000 sum(zeroth order stats): 100000.00000000003
avll from stats: -1.00544630893504
avll from llpg:  -1.0054463089350327
avll direct:     -1.0054463089350327
sum posterior: 100000.0
Kind: full, size16
nx: 100000 sum(zeroth order stats): 100000.00000000001
avll from stats: -0.9905420970932665
avll from llpg:  -0.9905420970932662
avll direct:     -0.9905420970932663
sum posterior: 100000.0
32×26 Array{Float64,2}:
 -0.152659     0.00439253   0.000241539  -0.0816215    -0.104755   -0.0221816   -0.0413605    0.0872062     0.0273489     0.10762      0.129313    -0.025841     -0.0178274  -0.0833378    0.00452705   0.0512716    0.0606327    0.0595554   -0.0428865    -0.116316    -0.0531986   -0.0340321    0.0873993    -0.106243      0.146806     0.131565  
 -0.0149707   -0.0153529   -0.135754      0.0836164    -0.0342599  -0.14428     -0.129805     0.0488695    -0.0427169     0.161122    -0.104661     0.0980383    -0.131954    0.00493453  -0.0737146    0.0102849   -0.119852    -0.0375968    0.0882875     0.0171269   -0.0740471   -0.0428904    0.116211     -0.144699      0.0870922    0.125322  
  0.1108      -0.0206513    0.0290548    -0.0395373    -0.0293992   0.0749304   -0.0411219    0.0688091    -0.0702349     0.0347231   -0.0852341    0.0259724     0.0138403   0.0276083   -0.248469    -0.0378758    0.0131771   -0.100797     0.0616902     0.136592     0.111803    -0.00588301  -0.0396437    -0.327886      0.0712896    0.0190839 
  0.0777575   -0.0291842   -0.0428668     0.0400109     0.0365867  -0.0954109    0.238294     0.119845      0.106801     -0.00388745   0.0426974   -0.0290454    -0.0674359  -0.165733    -0.0767208   -0.0666769   -0.00304635   0.215026    -0.106668     -0.11544      0.0551159    0.107446     0.0625489     0.184396     -0.00467557   0.0971435 
  0.205331     0.0794547    0.0141687    -0.135519     -0.179689    0.0827715   -0.0728437   -0.143899      0.155417      0.106049     0.163663     0.00636286    0.0385648  -0.0970418   -0.0623077   -0.0678276   -0.0337929    0.0186237   -0.12093      -0.261701    -0.0154392    0.0332679   -0.061669      0.154197     -0.0630567   -0.0295974 
  0.10507      0.0742627   -0.00477559   -0.00237557    0.0349247  -0.164336    -0.00307309   0.0386294     0.00933851    0.0164671    0.129826    -0.0492659     0.0376855  -0.047077    -0.102457    -0.0383447    0.0148563    0.126884    -0.177439      0.126175    -0.109075     0.181621    -0.155165     -0.0486545     0.222159     0.143994  
 -0.0544791    0.0339252   -0.0759404    -0.096476     -0.207354   -0.134511    -0.167358    -0.0559336    -0.107517     -0.0632108    0.0324882    0.174706     -0.123576    0.0185851    0.0762009    0.01962      0.0939776   -0.102992    -0.0453643     0.0880756    0.0615182    0.143085     0.0485895    -0.0498621    -0.0650702   -0.0695773 
  0.00100678  -0.128619     0.138621     -0.0707335     0.117837   -0.0909435   -0.0354221   -0.118388     -0.000568273   0.122712    -0.0143827    0.00233943   -0.0141943   0.0256698   -0.0683514   -0.0661935   -0.0509598    0.0451573   -0.223932      0.0864174   -0.164533    -0.113474     0.00674178   -0.145455     -0.133218    -0.0508521 
  0.123649     0.0553286   -0.10281       0.134945     -0.142881    0.0687002    0.153947     0.0683534    -0.0728069    -0.0711019    0.116341    -0.0276095     0.146094   -0.196938    -0.0558007    0.0396061   -0.0439488    0.0130207    0.14528       0.0937941    0.00447777   0.127479    -0.198667     -0.23362       0.0604155   -0.0890252 
  0.108136     0.00914472   0.0535239    -0.166386     -0.234922   -0.125369    -0.167305    -0.00150059    0.107965      0.1765      -0.0875809    0.0176995     0.202122   -0.102781     0.107828     0.0368611    0.171413    -0.00799339   0.0605192    -0.064608    -0.0968993   -0.0929949    0.10651      -0.105853      0.174697     0.076612  
 -0.151402    -0.130721     0.00793345    0.16297      -0.139331   -0.0467655    0.138125    -0.077515     -0.124133      0.174316    -0.00937939  -0.0292624    -0.0758332   0.122954     0.024297    -0.0554412   -0.126232    -0.00125974   0.190028     -0.178175     0.111945    -0.147221     0.0314858    -0.010665     -0.160043     0.0022084 
  0.0937624    0.109203    -0.0281049    -0.110497      0.0130419   0.122955    -0.11349     -0.0159245     0.107773      0.208871     0.0721468    0.0688467    -0.0475725  -0.0494374    0.0362621   -0.0141423   -0.0534007    0.15408      0.035746     -0.0729587   -0.0430123    0.167607    -0.00648343   -0.140108     -0.045392    -0.20819   
  0.0186622    0.150373     0.0633537     0.0449463     0.0104643  -0.0324811    0.203668    -0.0242865     0.0478424     0.0742807    0.147767    -0.160778     -0.124484   -0.0721955   -0.00679462   0.0995031   -0.0769496    0.0634919    0.105572      0.0486048    0.0732332    0.049056    -0.00138136   -0.0613193    -0.104567     0.087426  
  0.0764191   -0.154783     0.121106      0.124992     -0.0200483   0.0455082    0.0143587   -0.12243      -0.106373      0.0686251    0.0303157   -0.0804222     0.106093    0.0436903    0.0798136   -0.0718366   -0.226406     0.0713671   -0.132127     -0.183716     0.00873345   0.0227537   -0.108877     -0.149547     -0.31786      0.0894153 
  0.0567839    0.0734242   -0.0924114    -0.0682188    -0.0691368  -0.00399477  -0.103408     0.126089      0.00679135    0.0754514    0.028433     0.0460434    -0.142353    0.139946    -0.0747422   -0.205868     0.0977639    0.199206     0.00169683   -0.157588    -0.148622    -0.0678258   -0.100847     -0.0167306     0.0582222   -0.112497  
 -0.111526     0.132558     0.0266241     0.0200872     0.0049743  -0.0271866   -0.00224578   0.0621235    -0.0605794    -0.0290631   -0.137841    -0.137017      0.171642    0.0583168   -0.232357     0.0791642    0.110561     0.199736     0.142433     -0.0615633    0.0326181    0.0408748    0.0855942    -0.0610045    -0.0267435   -0.0747375 
  0.0178053   -0.0917876   -0.129762      0.000783782  -0.0438167  -0.0339316   -0.0260606   -0.0844192     0.00546911   -0.00268186  -0.0424303   -0.0441117    -0.0271657  -0.11105     -0.0133487    0.0509619    0.0685227    0.158767     0.0981313     0.0414997   -0.14514      0.042359    -0.122698     -0.00491782   -0.130646    -0.113657  
  0.138297     0.148915     0.0774568    -0.0124234     0.0723327   0.064976     0.00883236   0.118199      0.0253902    -0.0803131    0.0173877    0.027876     -0.0992395   0.0534362    0.00211012  -0.00346782   0.130304     0.156497    -0.068841     -0.106236    -0.158923    -0.0189519   -0.00739456   -0.107358      0.00756631   0.119557  
  0.00648205  -0.0313886    0.138562      0.196005      0.0633347   0.0106744    0.0637226    0.0859662     0.164324     -0.00759688  -0.0564266   -0.0457504    -0.0329412  -0.0130263    0.0750041    0.0982654    0.0114571    0.00273341   0.057521     -0.166908     0.0962276    0.134462    -0.0982449    -0.101048      0.0107581   -0.105132  
  0.0911834    0.00373292   0.0177513     0.0498073     0.0216733  -0.200767    -0.071342    -0.0793642    -0.0760936    -0.175427     0.236138     0.00847487    0.015872    0.0958157    0.00269756  -0.00323444  -0.0140812    0.0085368    0.116635      0.09797      0.030384     0.154151     0.126558     -0.104944     -0.0978538   -0.0882454 
  0.151126    -0.172355     0.0285651     0.071897      0.083267    0.117746     0.00705252  -0.11034      -0.0756733     0.00895524   0.214297    -0.0306292     0.0674422   0.0878528   -0.0830014   -0.0686739    0.02375      0.0765303   -0.135069     -0.00501101   0.261166     0.162471     0.093456     -0.169068     -0.0402224    0.0459777 
  0.0625686    0.136512     0.106689     -0.173133     -0.0736036   0.021918    -0.120092    -0.0639804     0.0323963    -0.0456406    0.00829834   0.08295       0.0800571   0.00376231  -0.17189      0.0120009    0.115137     0.0862608   -0.112685      0.00803753   0.144158    -0.125867    -0.116199      0.0303684    -0.0948395   -0.110393  
 -0.1441      -0.0287921   -0.0386116    -0.0105098     0.0402261  -0.112222     0.174103    -0.103923      0.144764      0.0314892   -0.0220968    0.0936219    -0.089839   -0.192408     0.0172157   -0.0809898   -0.0372996    0.0939475   -0.237463      0.106553     0.374372     0.0886392   -0.126505      0.00609526   -0.0029287    0.034683  
  0.0386822   -0.091738    -0.00230903    0.0307661     0.202196    0.0771546   -0.033789     0.000969712   0.197427      0.133724     0.0205919    0.0632419    -0.217039    0.0166988    0.0704959    0.0717247   -0.143324    -0.0540198    0.121653     -0.0228525    0.0720353   -0.0961465    0.0607642     0.0550624     0.343689     0.00791962
  0.00993606   0.0277825    0.0882274    -0.122829     -0.119521    0.177937     0.0262258   -0.0364049     0.0722844     0.0818276   -0.177954     0.0725957     0.0890349   0.0317991    0.107582     0.0845455   -0.0627313    0.099363     0.0711702     0.157339     0.0473489   -0.103675     0.00866306    0.105371      0.150118     0.318187  
  0.186853    -0.0134597   -0.0858809     0.170867     -0.0369819  -0.135855    -0.0898916    0.141454     -0.0307965    -0.0501665    0.00811143   0.159782      0.154438   -0.0130654    0.040718    -0.1047      -0.0244834   -0.015263     0.0125087    -0.161943    -0.0313978    0.0175417    0.000857891  -0.0468515     0.128709    -0.202779  
 -0.132815     0.00337554   0.0671035     0.0644217     0.0290366  -0.144433     0.064068     0.0525688     0.0672853    -0.109012    -0.138529     0.0681315    -0.0585953   0.107842    -0.00155064   0.0140205    0.050223    -0.0208672    0.00915865   -0.158485    -0.00272167  -0.140655     0.160044     -0.146268     -0.116319    -0.0268941 
  0.0914822   -0.0556896    0.0276569    -0.0289316     0.0885642   0.0076677    0.0111499   -0.0215928    -0.0639953     0.0641381   -0.104106     0.00907921   -0.0487171  -0.106496     0.0965857    0.110602     0.138486    -0.0457737   -0.0010164    -0.128657     0.240802     0.0133764    0.0802715     0.0253935    -0.0718786   -0.149608  
  0.139853     0.13911     -0.0701426     0.0380018    -0.190606    0.039485     0.0151476    0.114769     -0.0293761     0.0377149   -0.0025952    0.0174727    -0.0325925  -0.126706    -0.0774615   -0.132321    -0.175971    -0.0702939   -0.0158684     0.011208    -0.261693    -0.133995     0.0833502     0.0625707     0.190417     0.0765008 
  0.0102288   -0.266087    -0.244016     -0.0275152     0.0900285  -0.0689679    0.124551     0.10644       0.0411886     0.225579     0.190086     0.12461      -0.0878042   0.0191646    0.0941818    0.0145861   -0.17007     -0.110665    -0.130307     -0.00605957  -0.0711008   -0.207987    -0.0977138     0.0811794     0.0894477    0.0426859 
 -0.0383466    0.103848     0.101137     -0.0387692     0.040001    0.0226064   -0.198142     0.104986     -0.000393984  -0.188436    -0.0926992   -0.000357096  -0.09586     0.02681      0.0451985    0.108505     0.0378757   -0.00674902   0.0878562     0.0107687   -0.100043     0.0803726    0.0542244    -0.0136984    -0.0567741   -0.00293511
 -0.0688459   -0.0526684    0.0775315     0.0678664     0.10896      0.12327     -0.180657     0.00772293    0.17572       0.0848228   -0.0630842    0.00811555   -0.0326786  -0.0367944   -0.0409644   -0.0930666   -0.0157493    0.171481    -0.000923382   0.0616348   -0.00888036   0.0160192    0.0978523    -0.000372719   0.134524     0.0799563
kind diag, method split
┌ Info: 0: avll = 
└   tll[1] = -1.3766914174710592
[ Info: Running 50 iterations EM on diag cov GMM with 2 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.376790
[ Info: iteration 2, average log likelihood -1.376678
[ Info: iteration 3, average log likelihood -1.375473
[ Info: iteration 4, average log likelihood -1.366140
[ Info: iteration 5, average log likelihood -1.350320
[ Info: iteration 6, average log likelihood -1.343975
[ Info: iteration 7, average log likelihood -1.342087
[ Info: iteration 8, average log likelihood -1.341306
[ Info: iteration 9, average log likelihood -1.340935
[ Info: iteration 10, average log likelihood -1.340698
[ Info: iteration 11, average log likelihood -1.340493
[ Info: iteration 12, average log likelihood -1.340308
[ Info: iteration 13, average log likelihood -1.340156
[ Info: iteration 14, average log likelihood -1.340026
[ Info: iteration 15, average log likelihood -1.339910
[ Info: iteration 16, average log likelihood -1.339801
[ Info: iteration 17, average log likelihood -1.339692
[ Info: iteration 18, average log likelihood -1.339577
[ Info: iteration 19, average log likelihood -1.339457
[ Info: iteration 20, average log likelihood -1.339327
[ Info: iteration 21, average log likelihood -1.339187
[ Info: iteration 22, average log likelihood -1.339033
[ Info: iteration 23, average log likelihood -1.338855
[ Info: iteration 24, average log likelihood -1.338649
[ Info: iteration 25, average log likelihood -1.338412
[ Info: iteration 26, average log likelihood -1.338137
[ Info: iteration 27, average log likelihood -1.337805
[ Info: iteration 28, average log likelihood -1.337449
[ Info: iteration 29, average log likelihood -1.337127
[ Info: iteration 30, average log likelihood -1.336868
[ Info: iteration 31, average log likelihood -1.336663
[ Info: iteration 32, average log likelihood -1.336499
[ Info: iteration 33, average log likelihood -1.336367
[ Info: iteration 34, average log likelihood -1.336263
[ Info: iteration 35, average log likelihood -1.336183
[ Info: iteration 36, average log likelihood -1.336120
[ Info: iteration 37, average log likelihood -1.336067
[ Info: iteration 38, average log likelihood -1.336020
[ Info: iteration 39, average log likelihood -1.335980
[ Info: iteration 40, average log likelihood -1.335949
[ Info: iteration 41, average log likelihood -1.335927
[ Info: iteration 42, average log likelihood -1.335911
[ Info: iteration 43, average log likelihood -1.335901
[ Info: iteration 44, average log likelihood -1.335893
[ Info: iteration 45, average log likelihood -1.335888
[ Info: iteration 46, average log likelihood -1.335885
[ Info: iteration 47, average log likelihood -1.335883
[ Info: iteration 48, average log likelihood -1.335881
[ Info: iteration 49, average log likelihood -1.335880
[ Info: iteration 50, average log likelihood -1.335880
┌ Info: EM with 100000 data points 50 iterations avll -1.335880
└ 952.4 data points per parameter
┌ Info: 1
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.3767897895556815
│     -1.376677788877024 
│      ⋮                 
└     -1.3358797372654245
[ Info: Running 50 iterations EM on diag cov GMM with 4 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.335996
[ Info: iteration 2, average log likelihood -1.335880
[ Info: iteration 3, average log likelihood -1.335128
[ Info: iteration 4, average log likelihood -1.329360
[ Info: iteration 5, average log likelihood -1.314013
[ Info: iteration 6, average log likelihood -1.300967
[ Info: iteration 7, average log likelihood -1.295775
[ Info: iteration 8, average log likelihood -1.293508
[ Info: iteration 9, average log likelihood -1.292221
[ Info: iteration 10, average log likelihood -1.291356
[ Info: iteration 11, average log likelihood -1.290733
[ Info: iteration 12, average log likelihood -1.290269
[ Info: iteration 13, average log likelihood -1.289899
[ Info: iteration 14, average log likelihood -1.289588
[ Info: iteration 15, average log likelihood -1.289321
[ Info: iteration 16, average log likelihood -1.289090
[ Info: iteration 17, average log likelihood -1.288884
[ Info: iteration 18, average log likelihood -1.288689
[ Info: iteration 19, average log likelihood -1.288489
[ Info: iteration 20, average log likelihood -1.288275
[ Info: iteration 21, average log likelihood -1.288043
[ Info: iteration 22, average log likelihood -1.287797
[ Info: iteration 23, average log likelihood -1.287547
[ Info: iteration 24, average log likelihood -1.287309
[ Info: iteration 25, average log likelihood -1.287092
[ Info: iteration 26, average log likelihood -1.286897
[ Info: iteration 27, average log likelihood -1.286725
[ Info: iteration 28, average log likelihood -1.286583
[ Info: iteration 29, average log likelihood -1.286473
[ Info: iteration 30, average log likelihood -1.286391
[ Info: iteration 31, average log likelihood -1.286334
[ Info: iteration 32, average log likelihood -1.286293
[ Info: iteration 33, average log likelihood -1.286264
[ Info: iteration 34, average log likelihood -1.286243
[ Info: iteration 35, average log likelihood -1.286226
[ Info: iteration 36, average log likelihood -1.286214
[ Info: iteration 37, average log likelihood -1.286204
[ Info: iteration 38, average log likelihood -1.286197
[ Info: iteration 39, average log likelihood -1.286191
[ Info: iteration 40, average log likelihood -1.286186
[ Info: iteration 41, average log likelihood -1.286183
[ Info: iteration 42, average log likelihood -1.286180
[ Info: iteration 43, average log likelihood -1.286177
[ Info: iteration 44, average log likelihood -1.286175
[ Info: iteration 45, average log likelihood -1.286173
[ Info: iteration 46, average log likelihood -1.286172
[ Info: iteration 47, average log likelihood -1.286171
[ Info: iteration 48, average log likelihood -1.286170
[ Info: iteration 49, average log likelihood -1.286169
[ Info: iteration 50, average log likelihood -1.286169
┌ Info: EM with 100000 data points 50 iterations avll -1.286169
└ 473.9 data points per parameter
┌ Info: 2
│   avll =
│    50-element Array{Float64,1}:
│     -1.3359960480778206
│     -1.3358799995625312
│      ⋮                 
└     -1.2861686186159613
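The run below doubles the mixture to 8 diagonal-covariance Gaussians and re-runs EM. A sketch of how such a run might be reproduced with GaussianMixtures.jl — the data here is synthetic, and the keyword names and `em!` return value are recalled from the package's API, so treat them as assumptions:

```julia
using GaussianMixtures

x   = randn(100_000, 26)        # synthetic stand-in for the test data, one row per observation
gmm = GMM(8, x; kind=:diag)     # initialize (k-means by default) and train a diag-cov GMM
ll  = em!(gmm, x; nIter=50)     # 50 EM iterations; returns the per-iteration average log likelihood
```

The `avll` vectors logged after each stage are consistent with the test capturing exactly such a per-iteration trace.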
[ Info: Running 50 iterations EM on diag cov GMM with 8 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.286369
[ Info: iteration 2, average log likelihood -1.286167
[ Info: iteration 3, average log likelihood -1.285489
[ Info: iteration 4, average log likelihood -1.279749
[ Info: iteration 5, average log likelihood -1.262154
[ Info: iteration 6, average log likelihood -1.249327
[ Info: iteration 7, average log likelihood -1.245420
[ Info: iteration 8, average log likelihood -1.243603
[ Info: iteration 9, average log likelihood -1.241915
[ Info: iteration 10, average log likelihood -1.240025
[ Info: iteration 11, average log likelihood -1.238673
[ Info: iteration 12, average log likelihood -1.237865
[ Info: iteration 13, average log likelihood -1.237328
[ Info: iteration 14, average log likelihood -1.236959
[ Info: iteration 15, average log likelihood -1.236716
[ Info: iteration 16, average log likelihood -1.236548
[ Info: iteration 17, average log likelihood -1.236405
[ Info: iteration 18, average log likelihood -1.236245
[ Info: iteration 19, average log likelihood -1.236044
[ Info: iteration 20, average log likelihood -1.235802
[ Info: iteration 21, average log likelihood -1.235551
[ Info: iteration 22, average log likelihood -1.235343
[ Info: iteration 23, average log likelihood -1.235202
[ Info: iteration 24, average log likelihood -1.235113
[ Info: iteration 25, average log likelihood -1.235055
[ Info: iteration 26, average log likelihood -1.235015
[ Info: iteration 27, average log likelihood -1.234987
[ Info: iteration 28, average log likelihood -1.234964
[ Info: iteration 29, average log likelihood -1.234945
[ Info: iteration 30, average log likelihood -1.234928
[ Info: iteration 31, average log likelihood -1.234911
[ Info: iteration 32, average log likelihood -1.234893
[ Info: iteration 33, average log likelihood -1.234873
[ Info: iteration 34, average log likelihood -1.234850
[ Info: iteration 35, average log likelihood -1.234822
[ Info: iteration 36, average log likelihood -1.234787
[ Info: iteration 37, average log likelihood -1.234745
[ Info: iteration 38, average log likelihood -1.234694
[ Info: iteration 39, average log likelihood -1.234638
[ Info: iteration 40, average log likelihood -1.234583
[ Info: iteration 41, average log likelihood -1.234531
[ Info: iteration 42, average log likelihood -1.234488
[ Info: iteration 43, average log likelihood -1.234453
[ Info: iteration 44, average log likelihood -1.234425
[ Info: iteration 45, average log likelihood -1.234402
[ Info: iteration 46, average log likelihood -1.234383
[ Info: iteration 47, average log likelihood -1.234364
[ Info: iteration 48, average log likelihood -1.234344
[ Info: iteration 49, average log likelihood -1.234322
[ Info: iteration 50, average log likelihood -1.234297
┌ Info: EM with 100000 data points 50 iterations avll -1.234297
└ 236.4 data points per parameter
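The "data points per parameter" figure follows from the free-parameter count of a diagonal-covariance GMM: each of the n Gaussians contributes d means, d variances, and one weight, and one weight is fixed by the sum-to-one constraint.

```julia
n, d, N = 8, 26, 100_000
nparams = n * (2d + 1) - 1    # 423 free parameters
N / nparams                   # ≈ 236.4, matching the summary above
```

The same formula reproduces the 473.9 (n = 4), 118.1 (n = 16), and 59.0 (n = 32) figures elsewhere in this log.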
┌ Info: 3
│   avll =
│    50-element Array{Float64,1}:
│     -1.2863689323883747
│     -1.2861669895164447
│      ⋮                 
└     -1.2342970141721927
[ Info: Running 50 iterations EM on diag cov GMM with 16 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.234484
[ Info: iteration 2, average log likelihood -1.234146
[ Info: iteration 3, average log likelihood -1.232373
[ Info: iteration 4, average log likelihood -1.216354
[ Info: iteration 5, average log likelihood -1.179919
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
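The "Variances had to be floored" warnings are not failures: when a component's variance collapses toward zero during the M-step, the trainer clamps it to a floor so the Gaussian stays numerically well-behaved, and `ind` lists the affected components. A minimal illustration of the idea (not the package's internal code; the floor value is an assumption):

```julia
varfloor = 1e-3
Σ = [0.5, 2e-4, 1.2]           # one component's per-dimension variances
toolow = Σ .< varfloor         # dimensions that collapsed
Σ[toolow] .= varfloor          # clamp them to the floor
```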
[ Info: iteration 6, average log likelihood -1.157193
[ Info: iteration 7, average log likelihood -1.160416
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.148936
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.149409
[ Info: iteration 10, average log likelihood -1.154787
[ Info: iteration 11, average log likelihood -1.145949
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.139590
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.149315
[ Info: iteration 14, average log likelihood -1.150207
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.140737
[ Info: iteration 16, average log likelihood -1.149958
[ Info: iteration 17, average log likelihood -1.142428
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      5
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.136643
[ Info: iteration 19, average log likelihood -1.155186
[ Info: iteration 20, average log likelihood -1.143808
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      5
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.135884
[ Info: iteration 22, average log likelihood -1.158997
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.144417
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.145056
[ Info: iteration 25, average log likelihood -1.151124
[ Info: iteration 26, average log likelihood -1.143312
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.137568
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.147727
[ Info: iteration 29, average log likelihood -1.149358
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.139813
[ Info: iteration 31, average log likelihood -1.148181
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.140104
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      5
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.147536
[ Info: iteration 34, average log likelihood -1.158008
[ Info: iteration 35, average log likelihood -1.145357
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.138686
[ Info: iteration 37, average log likelihood -1.148327
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.141814
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.144872
[ Info: iteration 40, average log likelihood -1.150726
[ Info: iteration 41, average log likelihood -1.143068
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.137346
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.146695
[ Info: iteration 44, average log likelihood -1.147457
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      5
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.136869
[ Info: iteration 46, average log likelihood -1.159562
[ Info: iteration 47, average log likelihood -1.145020
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     5
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.137600
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.147172
[ Info: iteration 50, average log likelihood -1.149373
┌ Info: EM with 100000 data points 50 iterations avll -1.149373
└ 118.1 data points per parameter
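Note that the average log likelihood oscillates in the 16-Gaussian run above (e.g. -1.157 at iteration 6 vs -1.160 at iteration 7). That is expected once flooring kicks in: plain EM guarantees a non-decreasing likelihood, but clamping variances perturbs the M-step and voids that guarantee. A quick monotonicity check on an avll trace (the values here are copied from the first iterations of the 8-Gaussian run, which had no flooring):

```julia
ll = [-1.286369, -1.286167, -1.285489]
issorted(ll)    # true: EM improved monotonically before any variance was floored
```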
┌ Info: 4
│   avll =
│    50-element Array{Float64,1}:
│     -1.234484458513118 
│     -1.2341456820710413
│      ⋮                 
└     -1.1493734218233285
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      9
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 1, average log likelihood -1.140353
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      9
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.138056
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     10
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -1.133289
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     10
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.107692
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      5
│      6
│      9
│     10
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.055836
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.056514
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      9
│     10
│     26
│     29
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.053906
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│     10
│     28
│     30
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.027048
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      8
│      9
│     10
│     24
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.050185
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     10
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.048896
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.010996
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      8
│      9
│     10
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.048120
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      2
│      9
│     10
│     24
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.021552
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      5
│      6
│      8
│      9
│     10
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.022800
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      9
│     10
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.042138
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      8
│      9
│     10
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.026441
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     28
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -0.996071
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      8
│      9
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.059352
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      9
│     10
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 19, average log likelihood -1.024389
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      8
│      ⋮
│     26
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.001069
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      9
│     10
│     24
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.046403
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.024765
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      5
│      6
│      9
│     10
│     28
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.013343
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.039275
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      9
│     10
│     24
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -1.028209
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      8
│      ⋮
│     26
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.011012
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     10
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.053304
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      8
│      9
│     10
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.016092
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     28
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -0.995764
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      8
│      9
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.059349
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      9
│     10
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.024302
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      5
│      6
│      8
│      9
│     10
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.012407
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      9
│     10
│     24
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.035250
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      8
│      9
│     10
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.035232
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     28
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.002510
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      8
│      9
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.050263
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      9
│     10
│     24
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.017532
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      8
│      ⋮
│     26
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.010285
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     10
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.053279
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.016006
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      5
│      6
│      9
│     10
│     24
│     28
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.007023
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.048321
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      9
│     10
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.034963
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      8
│      ⋮
│     26
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.001841
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      9
│     10
│     24
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.046432
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      8
│      9
│     10
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -1.024804
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     28
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.002169
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      8
│      9
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.050306
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      2
│      9
│     10
│     24
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.017501
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      5
│      6
│      8
│      9
│     10
│     28
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -1.021595
┌ Info: EM with 100000 data points 50 iterations avll -1.021595
└ 59.0 data points per parameter
┌ Info: 5
│   avll =
│    50-element Array{Float64,1}:
│     -1.140353179092833 
│     -1.1380561355322671
│      ⋮                 
└     -1.0215949071697195
┌ Info: Total log likelihood: 
│   tll =
│    251-element Array{Float64,1}:
│     -1.3766914174710592
│     -1.3767897895556815
│     -1.376677788877024 
│     -1.3754732951560151
│      ⋮                 
│     -1.0503060758402902
│     -1.0175014888899832
└     -1.0215949071697195
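The 32×26 matrix printed below is presumably a parameter dump of the final model: one row per Gaussian, one column per feature dimension. If it is the means, the same values could be read off a trained model directly — the field names below are from GaussianMixtures.jl's `GMM` type as I recall it, so treat them as assumptions:

```julia
gmm.w          # component weights (sum to 1)
size(gmm.μ)    # (32, 26): one mean vector per component
size(gmm.Σ)    # (32, 26): per-dimension variances for kind = :diag
```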
32×26 Array{Float64,2}:
  0.141914     0.144927      0.0747891   -0.0300132    0.107108     0.0752315   -0.000977016   0.0997136    0.0484466   -0.0792834    0.00356983   0.0208273    -0.103267     0.0889003     0.00440681  -0.000598239   0.137575     0.133779     -0.0493892   -0.169687    -0.128845    -0.0144263    0.000727796  -0.104266      7.66349e-5   0.0941798 
  0.178103     0.127642     -0.0571447    0.0316738   -0.180183     0.0360902   -0.0157105     0.11249     -0.0401292    0.0353262    0.00947054   0.0184906    -0.019038    -0.116835     -0.0854419   -0.127028     -0.165736    -0.0745379    -0.016701     0.00336301  -0.230875    -0.110078     0.0757505     0.0502429     0.145209     0.0589236 
  0.0701599    0.0856756     0.0586028   -0.0549292   -0.0699533   -0.0800784    0.0116682    -0.0159841    0.0887381    0.118304     0.0154758   -0.0659276     0.00599081  -0.0875925     0.057478     0.0746075     0.0564454    0.0448897     0.104525    -0.0135372   -0.016469    -0.0111804    0.0483085    -0.0912283     0.0345184    0.0561288 
  0.0837011   -0.0512774    -0.0092246    0.0303853    0.0957179   -0.00587256   0.0861279     0.0287979    0.0798633    0.0606368   -0.00223628   0.00790894   -0.140128    -0.0812684     0.0270638    0.0310557    -0.00420182   0.0362592    -0.0154856   -0.0927338    0.0987986    0.0217169    0.0682759     0.0952357     0.0836746   -0.0159672 
  0.153276    -0.176725      0.0309654    0.106412     0.0732669    0.092665     0.0130636    -0.0927913   -0.0722429    0.00893835   0.214106    -0.0252142     0.067526     0.101119     -0.0752752   -0.0473073     0.0226185    0.100808     -0.105316     0.00963325   0.261359     0.143576     0.10029      -0.164552     -0.0279827    0.0173571 
 -0.0506505    0.142134      0.0893664   -0.0442255    0.0319947    0.0146261   -0.205741      0.124671     0.0243941   -0.187124    -0.0939395   -0.00575123   -0.101649     0.0246726     0.0405234    0.10698       0.0112698   -0.033683      0.0891308    0.0115273   -0.0999173    0.0924309    0.071288     -0.000261247  -0.0553871   -0.00502278
  0.112373     0.0623242    -0.00569539   0.00835563   0.0336078   -0.125282    -0.0245484     0.0479585    0.00574683   0.0191116    0.128298    -0.0265228     0.0370266   -0.0595136    -0.102182    -0.0379549     0.0173296    0.124026     -0.167741     0.0604256   -0.104366     0.175684    -0.15112      -0.042728      0.218348     0.135627  
  0.237718     0.0844182     0.019113    -0.129318    -0.183147     0.0379336   -0.0520821    -0.136169     0.154484     0.0935835    0.160991     0.000842462   0.0367757   -0.0928848    -0.0400025   -0.0848951    -0.00140479   0.0212093    -0.124492    -0.304715    -0.0162181    0.0330549   -0.0604011     0.154824     -0.0569195   -0.0152754 
 -0.131923     0.0132048     0.0674366    0.0102732   -0.250362    -0.154866     0.0640618     0.134636     0.0713162   -0.102365    -0.0874137    0.0853967    -0.056102     0.108334     -0.0151496    0.0157266     0.060902     0.114785     -0.0364346   -0.146947    -0.00758811  -0.200719     0.14054      -0.146874     -0.113321    -0.0784967 
 -0.123534    -0.0356479     0.0665502    0.0346243    0.285938    -0.123995     0.0629649     0.00106947   0.0654563   -0.106554    -0.25786      0.046475     -0.0553206    0.108343     -0.00388737   0.018174      0.0517352   -0.173866      0.0312087   -0.146564    -0.0186892   -0.0734775    0.177034     -0.146792     -0.114338    -0.0163302 
  0.0609365    0.190629      0.126639    -0.167002     0.0214214   -0.131504    -0.0610828    -0.0917006    0.048794     0.0326703   -0.383249    -0.0117389     0.0829652    0.0131398    -0.180024    -0.0416433     0.27088     -0.0347978    -0.141756    -0.104334     0.105978    -0.194583    -0.0922977     0.0706567    -0.0437609   -0.17266   
  0.05913      0.0976489     0.0651927   -0.170995    -0.14743      0.14495     -0.253594     -0.0132605    0.0132687   -0.124692     0.183852     0.168213      0.0770745   -0.00612685   -0.156455     0.029363      0.0519382    0.182868     -0.0631098    0.0852725    0.173322    -0.00247389  -0.120057     -0.00924121   -0.09588     -0.0960915 
 -0.154023     0.00430102   -0.025951    -0.0712899   -0.05186     -0.0308421   -0.0284172     0.0881561    0.0264313    0.0968915    0.126363    -0.0268503    -0.0168797   -0.0612764     0.0332858    0.0519749     0.0893497    0.0582159    -0.0255401   -0.114923    -0.0480506   -0.0374588    0.0891481    -0.107984      0.147737     0.131685  
 -0.021073    -0.083706      0.108204    -0.0164701    0.106974     0.0351365   -0.111649     -0.0827023    0.0798777    0.131882    -0.0356318    0.00992952   -0.0265651   -0.0183339    -0.0551884   -0.0665132    -0.0394118    0.105746     -0.0967012    0.0655629   -0.0939975   -0.0448411    0.0532273    -0.0731277     0.00633779  -0.00055684
 -0.0507538    0.0408713    -0.0895502   -0.098971    -0.195031    -0.132485    -0.152025     -0.0566484   -0.105109    -0.0720286    0.0449391    0.168957     -0.133663     0.0187213     0.0600565    0.0191663     0.0893514   -0.148297     -0.0291323    0.0725393    0.0457431    0.141272     0.0488692    -0.0547203    -0.0631477   -0.0346695 
  0.121815     0.0529874    -0.102833     0.13457     -0.13415      0.0456844    0.126821      0.082572    -0.0672119   -0.0891891    0.134567    -0.029954      0.131401    -0.172243     -0.0713965    0.0320508    -0.0342152    0.030093      0.14398      0.093216     0.0132385    0.133671    -0.19935      -0.239159      0.059658    -0.0828405 
 -0.0429815   -0.033198      0.110816     0.197024    -0.0215383    0.0504009   -0.0535007    -0.117823    -0.1752       0.0589947    0.0754439   -0.148873      0.119047    -0.180212      0.304861    -0.0780062    -0.218805     0.0336122    -0.145329    -0.144972    -0.51648      0.0591607   -0.12023      -0.123255     -0.329241     0.121483  
  0.240575    -0.248942      0.120557     0.0785937   -0.0213738    0.0601058    0.110201     -0.126327    -0.0530769    0.0578832    0.0242606   -0.106307      0.149823     0.130116     -0.172671    -0.0671661    -0.232731     0.105678     -0.124367    -0.170698     0.552176    -0.00891309  -0.0795492    -0.168814     -0.287002     0.0529615 
 -0.0899213   -0.0332796    -0.0386773   -0.126606     0.0320664   -0.170388     0.137359     -0.123935     0.11372      0.0403412   -0.00453017   0.00865823   -0.115234    -0.185822     -0.0572998   -0.0854901    -0.0213719    0.0562932    -0.182869     0.0750863    0.163433     0.0419763   -0.118697     -0.0156026    -0.00456687   0.00694846
 -0.165593    -0.00646559   -0.03728      0.0146402    0.0433133   -0.0901837    0.196547     -0.101246     0.171814     0.0263595   -0.0149568    0.113395     -0.0622981   -0.230952      0.0837376   -0.0365534    -0.0431531    0.11921      -0.264985     0.123924     0.404708     0.10638     -0.118103      0.0151182    -0.0030361    0.0480417 
 -0.00388386  -0.0310398     0.125593     0.178131     0.0702964    0.00775512   0.0689851     0.0849176    0.249746     0.0227381   -0.0615518    0.0658259    -0.0718426   -0.0191919     0.070068     0.0945185     0.0128062    0.0399682     0.0331625   -0.644458     0.0963936    0.152708    -0.0473744    -0.0959381    -0.0656409   -0.101319  
 -0.0456992   -0.0267959     0.158301     0.209945     0.0577093    0.0110227    0.0753835     0.0925545    0.0427351   -0.0406119   -0.0495071   -0.161936      0.0326899   -0.0207115     0.0807948    0.13282       0.0121784   -0.00246413    0.0719541    0.344069     0.0955366    0.124368    -0.180963     -0.11138       0.122816    -0.120184  
  0.0838505    0.0508363    -0.0255031   -0.0614446   -0.00835527  -0.114068    -0.0916294    -0.0136575   -0.035205    -0.0688404    0.156383     0.0595286    -0.0584169    0.127854     -0.0246384   -0.0672601     0.0469884    0.0951056     0.0861109    0.00039254  -0.0488825    0.0513267    0.0453584    -0.0736009    -0.0433873   -0.0853888 
  0.176187     0.0166974    -0.085567     0.142161    -0.0616481   -0.119512    -0.0883623     0.152159    -0.0309592   -0.0425422    0.00639768   0.154008      0.118474    -0.000706032   0.0170725   -0.120091     -0.0146135   -0.00665359   -0.00389559  -0.16112     -0.0536421    0.0112877    0.00968432   -0.0450114     0.114979    -0.207984  
 -0.138894    -0.128724     -0.00288934   0.172261    -0.136401    -0.05015      0.137169     -0.0718203   -0.127289     0.182251    -0.0373488   -0.0245366    -0.0729169    0.117441      0.0465543   -0.0633465    -0.144193     0.000152225   0.188634    -0.161748     0.109725    -0.145595     0.0355611    -0.0164255    -0.158822     0.0072571 
 -0.0199828   -0.25451      -0.228487    -0.0284986    0.0836262   -0.06894      0.0831981     0.117569    -0.00683111   0.202067     0.193552     0.118698     -0.0917065    0.0265187     0.131182     0.0525887    -0.170176    -0.107711     -0.129677    -0.0190324   -0.0581738   -0.207872    -0.096822      0.0907019     0.0903069    0.023225  
  0.0186284   -0.102351     -0.0984365    0.0122093   -0.0547539    0.00626387  -0.0410139    -0.0862266    0.00447871   0.0417248   -0.0424763   -0.0413244    -0.0952954   -0.118093     -0.00577039   0.0400589     0.0764504    0.150212      0.127579     0.052385    -0.159389     0.0433713   -0.119968     -0.00465344   -0.13438     -0.108195  
  0.0564682    0.110977     -0.0322963   -0.1012       0.0159186    0.119359    -0.11657      -0.0157516    0.111495     0.20952      0.0729138    0.0709276    -0.0644565   -0.035393      0.0364543   -0.00785054   -0.0821396    0.153391      0.0712066   -0.0781595   -0.0437265    0.168413    -0.0117218    -0.155389     -0.0618925   -0.188238  
 -0.110855     0.0758264     0.0157954    0.021199     0.0105014   -0.0320887   -0.0409727     0.0702985   -0.0496601   -0.0258468   -0.12497     -0.179745      0.147156     0.0591578    -0.228596     0.0717054     0.0819506    0.197231      0.148665    -0.0957757    0.0360997    0.0439438    0.0800455    -0.0625836    -0.0173948   -0.0104972 
  0.11148     -0.00303705    0.0345374   -0.040024    -0.0321226    0.0753986   -0.0435069     0.0750151   -0.0632652    0.00139389  -0.0831247    0.025604      0.0189056    0.0461604    -0.246439    -0.0454177     0.0128652   -0.0965168     0.0413081    0.149972     0.0714701   -0.00855091  -0.0516729    -0.326546      0.0113757    0.0600239 
  0.0134826    0.000604056   0.0851955   -0.122095    -0.131325     0.145362     0.0359987    -0.0347084    0.0716071    0.0795      -0.164822     0.0817974     0.0728177    0.0479927     0.102796     0.0546123    -0.0553159    0.0884979     0.0791276    0.107779     0.0473769   -0.0851153    0.0085588     0.109736      0.131299     0.309689  
 -0.0162782   -0.0121213    -0.135383     0.0888747   -0.0390296   -0.125822    -0.110908      0.036342    -0.034242     0.162088    -0.106975     0.0902955    -0.126173     0.0140759    -0.0490532   -0.00175418   -0.120752    -0.0353159     0.158185     0.0328858   -0.067366    -0.0380004    0.104852     -0.140501      0.105421     0.120997
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      9
│     10
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
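Each of these warnings comes from the M-step: when a component's estimated variance in some dimension falls below a floor, it is clamped so the Gaussian cannot collapse onto a handful of points, and the indices of the affected components are reported. A minimal sketch of the idea (illustrative only, not the package's actual `train.jl` code; the helper name is hypothetical):

```julia
# Illustrative variance flooring for a diagonal-covariance GMM.
# Σ is an n×d matrix of per-component variances; varfloor is a
# length-d vector of minimum allowed variances.
function floor_variances!(Σ::AbstractMatrix, varfloor::AbstractVector)
    ind = findall(vec(any(Σ .< varfloor', dims=2)))  # affected components
    if !isempty(ind)
        Σ[ind, :] .= max.(Σ[ind, :], varfloor')      # clamp to the floor
        @warn "Variances had to be floored" ind
    end
    return Σ
end
```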
[ Info: iteration 1, average log likelihood -1.042134
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.006282
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     29
│     30
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -0.992800
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.034512
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      2
│      9
│     10
│     26
│     29
│     30
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.012494
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      2
│      5
│      6
│      8
│      ⋮
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -0.995200
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      9
│     10
│     26
│     29
│     30
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.033708
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.013223
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      2
│      5
│      6
│      9
│      ⋮
│     29
│     30
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -0.994134
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      8
│      9
│     10
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.034415
┌ Info: EM with 100000 data points 10 iterations avll -1.034415
└ 59.0 data points per parameter
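The "59.0 data points per parameter" figure is consistent with a 32-component, diagonal-covariance GMM in 26 dimensions, counting a mean, a variance, and a weight per component (subtracting one for the weight-sum constraint barely changes the ratio):

```julia
# Sanity check of "59.0 data points per parameter" for this run.
n, d, N = 32, 26, 100_000
nparams = n * (2d + 1)        # means + variances + weights = 1696
round(N / nparams, digits=1)  # ≈ 59.0
```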
kind diag, method kmeans
[ Info: Initializing GMM, 32 Gaussians diag covariance 26 dimensions using 100000 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       7.643327e+05
      1       6.349898e+05      -1.293429e+05 |       32
      2       6.085645e+05      -2.642531e+04 |       32
      3       5.963521e+05      -1.221231e+04 |       32
      4       5.880497e+05      -8.302482e+03 |       32
      5       5.818977e+05      -6.151997e+03 |       32
      6       5.782540e+05      -3.643678e+03 |       32
      7       5.761436e+05      -2.110390e+03 |       32
      8       5.748331e+05      -1.310488e+03 |       32
      9       5.737848e+05      -1.048347e+03 |       32
     10       5.728577e+05      -9.270266e+02 |       32
     11       5.720778e+05      -7.799724e+02 |       32
     12       5.713609e+05      -7.168417e+02 |       32
     13       5.706217e+05      -7.391992e+02 |       32
     14       5.698844e+05      -7.373064e+02 |       32
     15       5.689303e+05      -9.541012e+02 |       32
     16       5.678352e+05      -1.095145e+03 |       32
     17       5.668385e+05      -9.966670e+02 |       32
     18       5.659910e+05      -8.475055e+02 |       32
     19       5.654279e+05      -5.630800e+02 |       32
     20       5.651585e+05      -2.694602e+02 |       32
     21       5.650539e+05      -1.045604e+02 |       32
     22       5.650172e+05      -3.666532e+01 |       31
     23       5.649994e+05      -1.782776e+01 |       32
     24       5.649901e+05      -9.290169e+00 |       27
     25       5.649849e+05      -5.230347e+00 |       26
     26       5.649816e+05      -3.297444e+00 |       25
     27       5.649795e+05      -2.052584e+00 |       24
     28       5.649781e+05      -1.418179e+00 |       16
     29       5.649771e+05      -1.029868e+00 |       16
     30       5.649765e+05      -5.880788e-01 |       19
     31       5.649757e+05      -7.594005e-01 |       12
     32       5.649753e+05      -4.649768e-01 |       18
     33       5.649746e+05      -7.032001e-01 |       11
     34       5.649742e+05      -3.727495e-01 |       12
     35       5.649734e+05      -7.905714e-01 |       11
     36       5.649729e+05      -5.538075e-01 |       11
     37       5.649724e+05      -4.404562e-01 |        8
     38       5.649719e+05      -4.905889e-01 |        9
     39       5.649713e+05      -6.382396e-01 |       14
     40       5.649705e+05      -7.490822e-01 |       16
     41       5.649700e+05      -4.917147e-01 |       12
     42       5.649696e+05      -4.121678e-01 |        9
     43       5.649692e+05      -4.366672e-01 |       11
     44       5.649688e+05      -3.545265e-01 |        5
     45       5.649688e+05      -8.163655e-02 |        3
     46       5.649687e+05      -4.610355e-02 |        5
     47       5.649685e+05      -1.761953e-01 |        9
     48       5.649683e+05      -2.608942e-01 |       10
     49       5.649680e+05      -3.203133e-01 |        6
     50       5.649678e+05      -1.133350e-01 |        5
K-means terminated without convergence after 50 iterations (objv = 564967.8443567804)
┌ Info: K-means with 32000 data points using 50 iterations
└ 37.0 data points per parameter
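The sequence logged here (k-means initialization on a subset of the data, then full EM) corresponds to the package's `GMM` constructor. A sketch assuming random stand-in data, since the test's actual input is not shown in the log; the keyword names follow the GaussianMixtures.jl README and should be treated as assumptions:

```julia
using GaussianMixtures

# Stand-in for the test's 100_000 × 26 input, which the log does not show.
x = randn(100_000, 26)

# 32 diagonal-covariance Gaussians: k-means initialization followed by EM,
# as in the log above.
g = GMM(32, x; method=:kmeans, kind=:diag, nIter=50)

avll(g, x)  # average log likelihood, as reported at each EM iteration
```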
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.280479
[ Info: iteration 2, average log likelihood -1.245430
[ Info: iteration 3, average log likelihood -1.214836
[ Info: iteration 4, average log likelihood -1.176504
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     11
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.122014
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│     21
│     23
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.095831
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     15
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.093009
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     4
│     9
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.075266
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     17
│     26
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.055867
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      8
│     18
│     21
│     23
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.033008
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      3
│     12
│     15
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.038106
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      4
│      9
│     16
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.045772
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     14
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.058680
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      3
│     15
│     17
│     21
│     23
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.012520
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     18
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.040694
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│      8
│     16
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.029093
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      3
│     14
│     15
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.036071
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      9
│     12
│     21
│     23
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.043424
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     17
│     18
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 19, average log likelihood -1.027126
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      3
│      4
│     15
│     16
│     27
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.007588
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      2
│     14
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.064105
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     12
│     21
│     23
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.036473
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      4
│     15
│     17
│     18
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.002890
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      8
│      9
│     16
│     29
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.027820
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      3
│     14
│     21
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -1.057196
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│     12
│     15
│     23
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.025401
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     17
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.056218
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      3
│     18
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.028909
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      8
│     14
│     15
│     16
│     26
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -1.001164
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     12
│     21
│     23
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.050766
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      2
│      3
│      4
│     17
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.023268
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      9
│     15
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.042694
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     18
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.037067
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     12
│     14
│     23
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.025825
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      3
│      8
│      9
│     17
│     21
│     26
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.003754
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      2
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.056125
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│     15
│     18
│     23
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.041394
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.055727
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│     17
│     21
│     26
│     27
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.003444
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      4
│      9
│     12
│     15
│     23
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.036982
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      3
│     18
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.053213
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      2
│      8
│     14
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.018803
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      9
│     12
│     21
│     27
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.022430
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      3
│     17
│     23
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.042841
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     15
│     18
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.044473
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│     14
│     27
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -1.025443
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      3
│      8
│     12
│     21
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.020134
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      2
│      9
│     17
│     23
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.023195
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     15
│     18
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.044317
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      3
│     14
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -1.030117
┌ Info: EM with 100000 data points 50 iterations avll -1.030117
└ 59.0 data points per parameter
32×26 Array{Float64,2}:
  0.00032608  -0.122544     0.148789     -0.0764437    0.10576     -0.0562767   -0.0335326   -0.184063    -0.00604365   0.129861    -0.00921363   0.00197687  -0.0116991    0.0342154   -0.0687814    -0.0462902   -0.0518295    0.0218814    -0.197739      0.0885818   -0.188587    -0.117178     0.00682891  -0.144138     -0.110354    -0.0800895 
  0.104207    -0.00405564   0.045138     -0.117584    -0.15692     -0.103955    -0.155162    -0.00322744   0.178064     0.161963    -0.049946     0.0292773    0.138035    -0.096188     0.10415       0.0567627    0.0903749   -0.0131274     0.0994942    -0.0499685   -0.0804352   -0.0892638    0.102008    -0.0864976     0.21414      0.0463676 
  0.0573279    0.110749    -0.0331134    -0.101137     0.0152878    0.119678    -0.113348    -0.0156644    0.113693     0.209227     0.0713932    0.0707893   -0.0626659   -0.0352421    0.0363895    -0.00740207  -0.0807698    0.152752      0.0705034    -0.0786915   -0.0438926    0.167894    -0.0118787   -0.15429      -0.0610526   -0.186544  
  0.109386     0.077       -0.0822439    -0.637589    -0.0655712    0.0406796   -0.156539    -0.0123658   -0.0499306    0.125345     0.0552556    0.0583042   -0.090331     0.0355483   -0.0406782    -0.119575    -0.014914     0.151328      0.000960676  -0.0718575   -0.120882    -0.0227788   -0.249837     0.00214743    0.146444    -0.0984399 
  0.1208       0.0544936   -0.101953      0.13407     -0.135171     0.0468233    0.126231     0.0822826   -0.0667118   -0.0886549    0.133744    -0.0300482    0.129384    -0.171802    -0.0725423     0.0324399   -0.0348465    0.0295896     0.145429      0.0949782    0.0161267    0.132766    -0.199289    -0.237883      0.0596822   -0.0819421 
  0.018909    -0.101296    -0.0975285     0.0117306   -0.0537675    0.0102278   -0.0407252   -0.0887625    0.00418108   0.0388202   -0.0426094   -0.0413982   -0.10565     -0.119785    -0.00532395    0.0399074    0.0759766    0.151312      0.12889       0.055232    -0.158433     0.0416698   -0.12179     -0.00445713   -0.13361     -0.106575  
 -0.0429997    0.0326342    0.0485306    -0.048932    -0.0657457    0.0622999   -0.00233812   0.0164364    0.00893467   0.0291104   -0.150976    -0.0458522    0.102257     0.0599496   -0.060041      0.068006     0.00398544   0.139809      0.11437       0.0177002    0.0450555   -0.0232868    0.0447132    0.020979      0.0668278    0.166394  
 -0.122609    -0.00871147   0.0654278     0.0226939    0.0175517   -0.142977     0.0621127    0.0662096    0.067269    -0.0985857   -0.159973     0.0623009   -0.0512348    0.106005    -0.0093012     0.0169905    0.0560663   -0.0264297    -0.00550173   -0.141156    -0.0126068   -0.127274     0.157881    -0.140587     -0.10266     -0.0399873 
  0.0566522    0.116825    -0.0503648    -0.20075     -0.0956942   -0.0384154   -0.0250199    0.104001    -0.00670383   0.0243524    0.0638941    0.0173899   -0.118387     0.0567607   -0.0735829    -0.15118      0.0485486    0.138733     -0.0391683    -0.148582    -0.0661969    0.00146939   0.0124674    0.010752      0.095971    -0.07443   
  0.0596512    0.14692      0.0981898    -0.172353    -0.0685385    0.0112726   -0.162497    -0.0501517    0.0301362   -0.0466036   -0.0888525    0.0832085    0.0778144    0.00263591  -0.171059     -0.00363163   0.156136     0.0805481    -0.0999437    -0.00343852   0.143626    -0.0966902   -0.110406     0.0283721    -0.079514    -0.131671  
 -0.01419     -0.0161579   -0.132778      0.0869035   -0.0333477   -0.122012    -0.106674     0.0358638   -0.0338414    0.158867    -0.103859     0.0850475   -0.128267     0.0129846   -0.0527234     0.0011157   -0.120103    -0.0337817     0.161599      0.034313    -0.0682205   -0.0382241    0.109559    -0.137288      0.110728     0.118133  
  0.072034     0.0510242   -0.0147668     0.0615874    0.00308618  -0.19269     -0.0806469   -0.0294303    0.00363781  -0.0725369    0.123295    -0.0374028   -0.00651952   0.062055    -0.0526949    -0.00979019   0.0100266    0.0367255    -0.0204398    -0.0763215    0.00228989   0.0972116    0.066851    -0.158647      0.0428062   -0.0921135 
 -0.141429    -0.128441    -0.000945438   0.17376     -0.134241    -0.0511375    0.137624    -0.072153    -0.118446     0.18663     -0.0351762   -0.0261429   -0.0734751    0.116539     0.0440644    -0.0650454   -0.142898     0.000273367   0.186613     -0.158488     0.109484    -0.146841     0.044259    -0.0173159    -0.158489     0.0101939 
  0.0295396    0.15235      0.0694081     0.0613227    0.0397316   -0.00836682   0.205613    -0.0328054    0.0477602    0.0774189    0.1346      -0.158466    -0.21252     -0.0781237    0.000143155   0.0936654   -0.0599395    0.0983071     0.13935       0.0400288    0.0605455    0.0638663   -0.0101147   -0.0691482    -0.111265     0.0740371 
  0.0446583    0.0634582   -0.0950845    -0.258041    -0.0677162   -0.00382588  -0.105787     0.150194    -0.0281533    0.0911956    0.0163106    0.0650423   -0.141657     0.161676    -0.0633681    -0.205607     0.098476     0.152811      0.00147137   -0.167791    -0.14853     -0.0657192   -0.150277    -0.0471456     0.0568304   -0.105185  
  0.0449843   -0.13105     -0.103137     -0.0340171    0.0274139   -0.0021342    0.0196631    0.0995581   -0.0331954    0.100787     0.0555337    0.0753194   -0.038527     0.0355284   -0.0462624     0.00836282  -0.0827198   -0.098366     -0.0410232     0.0560316    0.0123531   -0.117777    -0.0767964   -0.108052      0.0629935    0.0409726 
  0.145601    -0.208599     0.0254391     0.133045     0.0735576    0.13264      0.00585692  -0.0957874   -0.090077     0.0136909    0.199728    -0.0376287    0.0576533    0.11637     -0.0786222    -0.0615919    0.0272428    0.111349     -0.121276      0.0141651    0.257431     0.154427     0.0954239   -0.161187     -0.031641     0.0182901 
  0.0405061   -0.0658113   -0.0329263     0.0132592    0.298079     0.12882     -0.0214125   -0.00290625   0.354228     0.126477     0.0767648    0.0675163   -0.356739     0.084624     0.0724175     0.0230677   -0.0889786   -0.0163929     0.12089      -0.0189581    0.0768505   -0.0879372    0.0248721    0.060482      0.353595    -0.027105  
 -0.0504319    0.0409837   -0.0898886    -0.0991398   -0.193101    -0.133223    -0.152063    -0.055114    -0.104952    -0.0730398    0.0431784    0.16727     -0.133507     0.0191321    0.0589569     0.0194717    0.0890481   -0.150211     -0.0315307     0.0730699    0.0479652    0.140887     0.0494626   -0.0549276    -0.0637436   -0.035763  
 -0.0161597   -0.0798705    0.0327628     0.039113     0.0067023   -0.0297517    0.0965736   -0.112781     0.0258336    0.045222     0.0205525   -0.0255621    0.0214002   -0.113658     0.0449333    -0.066271    -0.13181      0.0875105    -0.181464     -0.0298412    0.170697     0.0523332   -0.111774    -0.0753006    -0.150284     0.0596548 
  0.180493     0.00306496  -0.0843323     0.163521    -0.046178    -0.123145    -0.0900763    0.141735    -0.0368123   -0.0488997    0.00415529   0.156703     0.167977    -0.0196174    0.0394076    -0.0932046   -0.0282583   -0.0261181    -0.00250045   -0.15907     -0.0353168   -0.00182233   0.00537518  -0.010603      0.12847     -0.219422  
  0.108504     0.0667969   -0.00611863    0.00175792   0.0306853   -0.118292    -0.0313178    0.0450027    0.00686515   0.0243458    0.128545    -0.0274675    0.0338681   -0.0545039   -0.105556     -0.0383566    0.0230535    0.125069     -0.165667      0.0586892   -0.107782     0.169741    -0.158881    -0.054213      0.220348     0.132531  
  0.0797119   -0.0197481   -0.0341508     0.0743793    0.104875    -0.0831172    0.227738     0.0979544    0.0956272    0.00289147   0.0387849   -0.0256165   -0.0875271   -0.114534    -0.0688329    -0.0632731   -0.00588875   0.174459     -0.0999701    -0.118171     0.0463854    0.0992726    0.0655825    0.18354       0.0109867    0.094369  
 -0.0429472   -0.0519198    0.0866128     0.029529     0.108496     0.12317     -0.178917    -0.00173543   0.176505     0.148571    -0.0644955    0.0165103   -0.043231    -0.0625554   -0.0421478    -0.091692    -0.0278002    0.177139     -0.00317262    0.0521224   -0.0129365    0.0251414    0.0980146   -0.000908895   0.10801      0.0810906 
  0.145889     0.143478     0.0738142    -0.0305591    0.0870588    0.0743343   -0.00437983   0.0977113    0.0440051   -0.0794541    0.00563207   0.024575    -0.0950493    0.0837209    0.000437941  -0.00294071   0.126019     0.121214     -0.0493918    -0.157655    -0.129231    -0.0109116   -0.00183061  -0.110466      0.00474722   0.0962049 
 -0.0328191    0.149727     0.0710101    -0.043109     0.0141938    0.0226678   -0.202261     0.114752     0.0119134   -0.168429    -0.0769226   -0.00491338  -0.0941226    0.0228609    0.0291871     0.0808028   -0.00452439  -0.0361274     0.0944875     0.0135186   -0.110437     0.0817035    0.0731193   -0.00860333   -0.05399      0.00459047
  0.174597     0.123405    -0.0788863     0.0426157   -0.175297     0.0447178    0.0194583    0.136456    -0.0562783    0.0582376    0.0102163    0.0395818   -0.0297859   -0.126553    -0.0867508    -0.144759    -0.143266    -0.0249165    -0.0372262     0.0047239   -0.261646    -0.142782     0.074693     0.0908031     0.204228     0.0441274
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
  0.102352    -0.050943     0.0404071    -0.0257112    0.0673829    0.0126466    0.0103373   -0.0330572   -0.0751348    0.063418    -0.108753     0.00518643  -0.0947047   -0.120909     0.101229      0.11234      0.13743     -0.0507422    -0.0330972    -0.13303      0.220291     0.0286255    0.0872158    0.017388     -0.0840677   -0.149306  
  0.107113     0.0201819    0.00994874    0.0475837    0.0158196   -0.210246    -0.0698214   -0.0763164   -0.0552062   -0.153672     0.222971     0.0982642   -9.90151e-5   0.10989      0.0109952     0.00949068   0.00424048   0.0193521     0.140593      0.0963827    0.0206476    0.129693     0.12737     -0.0997932    -0.12203     -0.0859676 
 -0.153973     0.00434505  -0.0263323    -0.0713893   -0.051576    -0.0309981   -0.0282685    0.0880608    0.0260455    0.0956863    0.126269    -0.0267211   -0.0169221   -0.0607859    0.0337132     0.0518473    0.0892173    0.0583146    -0.0259875    -0.115211    -0.0479479   -0.037276     0.0891235   -0.107971      0.146827     0.13173   
 -0.0238351   -0.0292419    0.142281      0.193315     0.0645299    0.00926925   0.0722027    0.0882527    0.151501    -0.00897596  -0.0559217   -0.0444604   -0.0210928   -0.0192755    0.0753622     0.114418     0.0124162    0.0183458     0.0515842    -0.170251     0.0959809    0.139069    -0.112274    -0.104296      0.0254739   -0.110367  
  0.236837     0.0895418    0.0205046    -0.135278    -0.184483     0.0400695   -0.0544114   -0.142615     0.155304     0.0966577    0.162112     0.00132318   0.0382435   -0.0937203   -0.0387851    -0.0866864   -0.00127152   0.0196063    -0.124356     -0.309137    -0.0182554    0.033193    -0.0618122    0.160174     -0.0580583   -0.016537
┌ Warning: Variances had to be floored
│   ind =
│    3-element Array{Int64,1}:
│      8
│     21
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
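The "Variances had to be floored" warnings above arise when the EM update drives some diagonal-covariance entries toward zero, so the trainer clamps them to a minimum and reports which components were affected (the `ind` list). A minimal sketch of such a clamping step, in Python for illustration — the floor value and function name are assumptions, not GaussianMixtures' actual code:

```python
import numpy as np

def floor_variances(variances, floor=1e-3):
    """Clamp too-small diagonal-covariance entries and report which
    components were affected (1-based, matching the log's `ind`).

    variances: (n_components, n_dims) array of per-dimension variances.
    """
    too_small = variances < floor
    ind = (np.where(too_small.any(axis=1))[0] + 1).tolist()
    floored = np.maximum(variances, floor)
    return floored, ind

# Components 1 and 3 each have an entry below the floor:
v = np.array([[0.5, 2e-4], [0.9, 0.8], [1e-5, 0.3]])
fv, ind = floor_variances(v)
print(ind)  # [1, 3]
```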
[ Info: iteration 1, average log likelihood -1.031017
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      8
│     14
│     15
│     17
│      ⋮
│     26
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -0.977287
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      3
│      8
│     18
│     21
│     26
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -0.983148
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      2
│      4
│      8
│     17
│      ⋮
│     26
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -0.996713
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      8
│      9
│     14
│     15
│     21
│     23
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -0.979393
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      2
│      3
│      8
│     17
│      ⋮
│     27
│     29
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -0.969689
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      8
│     15
│     21
│     23
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.010354
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      4
│      8
│     14
│     17
│      ⋮
│     26
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -0.974127
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      2
│      3
│      8
│     18
│     21
│     26
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -0.974731
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      8
│     15
│     17
│     21
│     23
│     26
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -0.998765
┌ Info: EM with 100000 data points 10 iterations avll -0.998765
└ 59.0 data points per parameter
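The "data points per parameter" figure is consistent with counting, for a diagonal-covariance GMM with n components in d dimensions, d means plus d variances per component plus n−1 free mixture weights: n(2d+1)−1 parameters. For n=32, d=26 that is 1695, and 100000/1695 ≈ 59.0 as logged. A short Python check (the exact count GaussianMixtures uses is inferred from the logged ratios, not taken from its source):

```python
def diag_gmm_nparams(n, d):
    # d means + d variances per component, plus n-1 free mixture weights
    return n * (2 * d + 1) - 1

# Reproduces the ratios logged for each stage of this test run:
for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} Gaussians: {100_000 / diag_gmm_nparams(n, 26):.1f}")
# 952.4, 473.9, 236.4, 118.1, 59.0
```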
32×26 Array{Float64,2}:
  0.083464     0.162444     0.0699137    0.138463     0.0575144    0.0153214    0.0157244     0.273157    -0.0959355    -0.0757195    -0.0625707    0.0178804    -0.0195698    -0.0705568   -0.0217656    0.0778239   -0.0265108   0.0164775    0.0517858    0.01822    -0.102557     0.0281777    0.080363      0.0241921    0.0308048     0.090509  
  0.150794    -0.163709    -0.0117893   -0.108955    -0.131465     0.0865307   -0.309706     -0.00738797  -0.00891309    0.135433      0.048678     0.0438155    -0.0834195    -0.0581703    0.0879666   -0.0929597   -0.0738521  -0.0215241   -0.0531668    0.0709656   0.179991     0.117383     0.15915       0.0725258   -0.0320088    -0.0311353 
  0.0405352   -0.089235    -0.140238    -0.108579     0.0622361    0.0413541    0.0457817    -0.0832913   -0.0422077    -0.0173846     0.0405833   -0.0209369     0.0458265    -0.0177792   -0.0821308    0.0443959   -0.120336   -0.0218806    0.110349     0.175585    0.131982    -0.0896301    0.0900052     0.00637289  -0.0556379    -0.224138  
  0.10862     -0.00105436  -0.114866    -0.122779    -0.133266     0.00327486   0.122682      0.0195677    0.0845943    -0.127429      0.188292     0.135105     -0.167015     -0.0519993    0.141531     0.00995022  -0.164504    0.168165     0.0729992    0.0721686   0.0344117    0.129392    -0.125283     -0.00615511  -0.000720434   0.0140286 
 -0.0116102    0.0662682    0.0472244   -0.143183     0.029402     0.0932848   -0.0481479     0.116041     0.0649339     0.0582758     0.105661     0.114076     -0.0134553    -0.0299954    0.142672    -0.00901717   0.0130272   0.228693    -0.076334    -0.0556513   0.0767356    0.0710777   -0.00544869   -0.067364     0.0735231    -0.00632576
 -0.0520994   -0.100633    -0.0198552   -0.0127878   -0.0324293   -0.0639752    0.130142      0.131362    -0.0832798    -0.221195      0.00383027  -0.0752322    -0.129195      0.0438754   -0.137634    -0.0950994    0.0732763  -0.0458667    0.108512     0.133056   -0.040291     0.173378     0.000770513  -0.0254199   -0.00999824   -0.106904  
  0.199099    -0.0702021    0.166618    -0.119561     0.106273    -0.0908145    0.0132068    -0.0229791    0.121007     -0.0817218    -0.186023     0.100963      0.118088      0.181972     0.00940424   0.135973     0.104354    0.267839     0.154607     0.0794445   0.0503589    0.0157698    0.148384      0.193539    -0.0827033     0.312085  
 -0.122729     0.154956     0.0996927   -0.0488816   -0.0640559    0.0875096   -0.0918393     0.058699     0.0674837    -0.0983255     0.117165     0.0138713     0.0886624    -0.131485     0.048349    -0.034986     0.0623193  -0.0484093    0.0984293    0.0701814  -0.0281025    0.10825     -0.0401668     0.112424    -0.00646502   -0.177572  
  0.00251816   0.242998    -0.0824034   -0.0322747    0.0841073    0.0124246    0.00559029    0.0111856   -0.108023      0.0367265    -0.0478504    0.0842606    -0.0939766     0.127708    -0.0478201   -0.0182898    0.0430136  -0.0853679    0.151263     0.108785    0.00158381  -0.0305076   -0.156715      0.075258    -0.190769     -0.204209  
  0.0341271   -0.129746     0.132593     0.107954     0.0147313   -0.108379    -0.0197499    -0.0868666   -0.163957      0.0112544     0.018578     0.138899     -0.0405765    -0.0580979   -0.0458656    0.0812241    0.0638781   0.112626     0.0390839   -0.166307   -0.214243    -0.0749587    0.133547      0.167026     0.130565     -0.0648088 
 -0.0869802   -0.115784     0.101406    -0.148081     0.141884     0.187504     0.10152      -0.0607318    0.0235012    -0.00387699   -0.0585723    0.0348344    -0.104956     -0.252855    -0.0231004    0.00476923  -0.0798833  -0.0584683   -0.0715018    0.0187778  -0.141111    -0.0185451   -0.219246      0.0781322    0.127536     -0.0129679 
 -0.0617923    0.21695     -0.180588    -0.0128497    0.165051    -0.0554027   -0.0772125    -0.0113514    0.0596096    -0.0234011    -0.0242498    0.0274046    -0.0472487    -0.0270971    0.0856038    0.0575741    0.0711895   0.0678362   -0.145112    -0.0449308  -0.0554194    0.0269071   -0.0169963     0.0501071    0.036706      0.116525  
  0.0281714   -0.138247    -0.0514392   -0.0257002    0.100762    -0.00183111  -0.000699585  -0.0278523    0.0414776    -0.0262346     0.0715267   -0.0312254    -0.0745007     0.0790385    0.0766818    0.114873     0.130144    0.167012     0.325969    -0.0240029   0.142435    -0.134659     0.154249      0.120366     0.00157409    0.106001  
 -0.0136155    0.079948    -0.119864    -0.113187     0.057033     0.0290744    0.138514      0.0123574    0.105219     -0.00675382   -0.0317799    0.125869      0.0832167     0.155261    -0.079274     0.0229472    0.03147     0.0180218    0.0536114    0.0254955  -0.0770572    0.0349917    0.117505     -0.143681    -0.134677      0.0624101 
 -0.0737162    0.045643    -0.00187484  -0.0208636    0.0931977    0.0135521    0.0997706     0.100226     0.00914753    0.147119      0.196647     0.199077      0.0246846    -0.0694729    0.0531417   -0.0339489   -0.0306633  -0.174976     0.206693     0.0834361   0.035214     0.0557381    0.162333     -0.0267987   -0.0350822     0.0807929 
 -0.0667658    0.151537    -0.132394    -0.244324     0.029662     0.0784121    0.11591      -0.150921     0.0745818     0.0191991     0.0222649    0.100265      0.189038     -0.053257    -0.00418321  -0.175536     0.0454107  -0.0953482    0.11339     -0.0836626  -0.0299341   -0.0146859   -0.0395569     0.0975988    0.120628     -0.0568471 
 -0.0584789    0.0944288   -0.103648     0.114053    -0.0214216    0.0882416    0.0460948    -0.0307212   -0.0434259    -0.0193875     0.0882341   -0.123375     -0.032524     -0.0860924   -0.0493385    0.118809     0.0692871   0.0522217    0.151343    -0.0455351   0.0180762    0.0289582   -0.0265596     0.0662868   -0.184283      0.0918153 
  0.151498    -0.100976    -0.0806579    0.0586159   -0.0170367    0.030824     0.0671381     0.0544736    0.0520888    -0.0419924    -0.0948794    0.0596277     0.000530196  -0.0933462    0.0183151    0.112804     0.0531787  -0.0212034    0.0435525   -0.102228   -0.122826    -0.178136    -0.139535      0.0850148    0.0200544     0.12261   
  0.0634491   -0.0215054   -0.0299156   -0.00374971  -0.0750037   -0.015334    -0.0660797     0.00133229   0.0271114     0.158581     -0.132932     0.0806148    -0.0533277     0.0821565   -0.0399459    0.210068    -0.01596    -0.116764     0.0695838   -0.0506292  -0.0274764   -0.00667298  -0.0547452    -0.0944462    0.188101     -0.0404572 
  0.148943     0.131875     0.0182468    0.0541035   -0.0653473   -0.00444623  -0.0622097    -0.0549537    0.0560404    -0.0835107    -0.00356502   0.000205648  -0.0446483    -0.0542743   -0.142852     0.0967461    0.0290126  -0.0902583    0.133026     0.0445391  -0.00010662  -0.00342727   0.00783571   -0.0832074   -0.0240021     0.0215735 
 -0.0648417   -0.0258743    0.0939791   -0.175859    -0.0864582   -0.0723552   -0.0628418     0.0296257    0.0999538     0.0397134    -0.222824    -0.107431     -0.0715391    -0.128309     0.103757     0.0112017    0.0175063  -0.0182455   -0.146151     0.0748589  -0.119271    -0.149026    -0.195466     -0.0469369    0.0832662    -0.0528636 
  0.046449     0.246296     0.00305049   0.0468048    0.0996465    0.0340392   -0.0067838    -0.191041     0.0477703    -0.112588      0.00995943  -0.0333757     0.00394723   -0.119993     0.0223585    0.213001    -0.105589    0.0626051   -0.150354     0.118739   -0.221601    -0.0541642   -0.0782957    -0.0600922    0.014101      0.0329024 
 -0.0498571    0.0549628   -0.0732454   -0.0198949    0.0869048   -0.187993     0.0786841     0.0205452    0.199476      0.0674429     0.0587214    0.132386     -0.183069     -0.00250749  -0.0264932   -0.0717157   -0.0522459   0.0568798    0.0663323    0.106921    0.0653199    0.0857165    0.0674274     0.0417338   -0.105724      0.0647898 
 -0.0367965   -0.0550635   -0.192387     0.121342    -0.054397    -0.110152     0.167271      0.142035    -0.0263415    -0.0198855     0.0109679   -0.122142      0.102781      0.193545     0.038703     0.0121604    0.120086   -0.213137     0.0148614    0.0705179  -0.0688152   -0.0604443   -0.118323      0.0358597   -0.00779315    0.122018  
 -0.0498771   -0.0453623    0.186613     0.00696929  -0.00203634   0.00316487  -0.0242328    -0.00437263  -0.015646      0.0313335     0.103842     0.14679       0.093726      0.213556     0.0665475   -0.0746969    0.186065   -0.00364038  -0.104196     0.0791206   0.00606819   0.0907493    0.00629904   -0.0492161   -0.0597244    -0.273005  
 -0.0890505   -0.0689047    0.0407078   -0.0851543    0.0492383    0.149577    -0.0877433     0.0324143    0.174006      0.0518355     0.0985556    0.00971924    0.308027     -0.0754425    0.0431594   -0.0362068   -0.214811   -0.0676003    0.236559    -0.0216364   0.0787101    0.0671592   -0.138635      0.100109    -0.0421792     0.00478289
  0.161304     0.0267798    0.0113676    0.236178     0.00676074   0.0270075   -0.212293     -0.13721      0.0767577     0.130357      0.0639232   -0.0549165     0.0549203     0.125372    -0.0814673   -0.0171559    0.17571    -0.0918659    0.00296359   0.106939    0.127996     0.220827     0.0115184     0.0737591    0.0352041     0.204624  
  0.182636     0.0846836    0.211439    -0.0358971    0.112339    -0.11454      0.0610618    -0.0758182    0.123367     -0.000361988  -0.0852482    0.246891      0.0294113     0.0556433   -0.0596593    0.0298162    0.0464746  -0.0264076   -0.0457111    0.135761    0.0183087    0.157019     0.108776     -0.0236494    0.0206431    -0.0230893 
 -0.121493    -0.0291264    0.158104    -0.0428787    0.00799102   0.156695     0.152996      0.0340328    0.000577572   0.00549784   -0.00189646   0.154266     -0.219087     -0.100959    -0.0654498   -0.147139    -0.0139908   0.0738628    0.0240458    0.078378   -0.0947098   -0.0863192    0.0510631    -0.0963267    0.0396545     0.237955  
  0.0656424    0.025386    -0.0327781    0.131853     0.108374     0.0242399   -0.159831     -0.140226    -0.0351789     0.134166     -0.154974     0.157948      0.0219379    -0.116739     0.017436     0.184359    -0.0409514   0.0548861   -0.0111504   -0.157337   -0.117117     0.0801298   -0.0996376    -0.0833404    0.154631     -0.0193188 
 -0.16582      0.115237     0.0256507    0.00613059  -0.00793514   0.0232949   -0.0401057     0.101573    -0.0926465     0.0415843    -0.257662    -0.0457759     0.178822      0.00238534  -0.0342671   -0.214552    -0.0109335   0.16318     -0.0455268    0.125208   -0.0755188    0.00348887  -0.116656      0.117548     0.00173403   -0.0208508 
 -0.0884033    0.0504409    0.196228     0.0466031   -0.123031     0.214542     0.00802521    0.0104415    0.187785     -0.0770438     0.118502     0.0972176     0.0878752     0.0342786   -0.158809     0.0807457    0.125293   -0.0501847    0.0339569    0.0239635  -0.11608     -0.155141     0.136667      0.0772604    0.143522      0.167861
kind full, method split

┌ Info: 0: avll = 
└   tll[1] = -1.4238831715345055
[ Info: Running 50 iterations EM on diag cov GMM with 2 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.423902
[ Info: iteration 2, average log likelihood -1.423836
[ Info: iteration 3, average log likelihood -1.423781
[ Info: iteration 4, average log likelihood -1.423714
[ Info: iteration 5, average log likelihood -1.423630
[ Info: iteration 6, average log likelihood -1.423531
[ Info: iteration 7, average log likelihood -1.423418
[ Info: iteration 8, average log likelihood -1.423288
[ Info: iteration 9, average log likelihood -1.423113
[ Info: iteration 10, average log likelihood -1.422832
[ Info: iteration 11, average log likelihood -1.422340
[ Info: iteration 12, average log likelihood -1.421549
[ Info: iteration 13, average log likelihood -1.420530
[ Info: iteration 14, average log likelihood -1.419589
[ Info: iteration 15, average log likelihood -1.418980
[ Info: iteration 16, average log likelihood -1.418682
[ Info: iteration 17, average log likelihood -1.418555
[ Info: iteration 18, average log likelihood -1.418504
[ Info: iteration 19, average log likelihood -1.418483
[ Info: iteration 20, average log likelihood -1.418475
[ Info: iteration 21, average log likelihood -1.418471
[ Info: iteration 22, average log likelihood -1.418469
[ Info: iteration 23, average log likelihood -1.418469
[ Info: iteration 24, average log likelihood -1.418468
[ Info: iteration 25, average log likelihood -1.418468
[ Info: iteration 26, average log likelihood -1.418468
[ Info: iteration 27, average log likelihood -1.418467
[ Info: iteration 28, average log likelihood -1.418467
[ Info: iteration 29, average log likelihood -1.418467
[ Info: iteration 30, average log likelihood -1.418467
[ Info: iteration 31, average log likelihood -1.418467
[ Info: iteration 32, average log likelihood -1.418467
[ Info: iteration 33, average log likelihood -1.418466
[ Info: iteration 34, average log likelihood -1.418466
[ Info: iteration 35, average log likelihood -1.418466
[ Info: iteration 36, average log likelihood -1.418466
[ Info: iteration 37, average log likelihood -1.418466
[ Info: iteration 38, average log likelihood -1.418466
[ Info: iteration 39, average log likelihood -1.418466
[ Info: iteration 40, average log likelihood -1.418466
[ Info: iteration 41, average log likelihood -1.418466
[ Info: iteration 42, average log likelihood -1.418466
[ Info: iteration 43, average log likelihood -1.418466
[ Info: iteration 44, average log likelihood -1.418466
[ Info: iteration 45, average log likelihood -1.418466
[ Info: iteration 46, average log likelihood -1.418466
[ Info: iteration 47, average log likelihood -1.418466
[ Info: iteration 48, average log likelihood -1.418466
[ Info: iteration 49, average log likelihood -1.418466
[ Info: iteration 50, average log likelihood -1.418466
┌ Info: EM with 100000 data points 50 iterations avll -1.418466
└ 952.4 data points per parameter
┌ Info: 1
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4239024618318172
│     -1.4238355412401373
│      ⋮                 
└     -1.4184656325373775
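The "average log likelihood" (avll) reported each iteration is the mean per-frame log density of the data under the current mixture. A sketch of that quantity for a diagonal-covariance GMM, in Python for illustration (assuming avll is the mean over data points of log Σₖ wₖ N(x; μₖ, diag σₖ²); this is the standard definition, not code from GaussianMixtures):

```python
import numpy as np

def avll_diag_gmm(x, w, mu, var):
    """Mean log-likelihood per data point of a diagonal-covariance GMM.

    x: (N, d) data; w: (n,) mixture weights; mu, var: (n, d).
    """
    diff = x[:, None, :] - mu[None, :, :]                      # (N, n, d)
    log_norm = -0.5 * np.log(2 * np.pi * var).sum(axis=1)      # (n,)
    log_px_k = log_norm[None, :] - 0.5 * (diff**2 / var[None, :, :]).sum(axis=2)
    # stable log-sum-exp over components, weighted by w
    a = log_px_k + np.log(w)[None, :]
    m = a.max(axis=1, keepdims=True)
    log_px = m[:, 0] + np.log(np.exp(a - m).sum(axis=1))
    return log_px.mean()
```

For a single standard-normal component in 1-D evaluated at x=0, this gives −½·log(2π) ≈ −0.9189, a useful sanity check.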
[ Info: Running 50 iterations EM on diag cov GMM with 4 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.418485
[ Info: iteration 2, average log likelihood -1.418414
[ Info: iteration 3, average log likelihood -1.418356
[ Info: iteration 4, average log likelihood -1.418284
[ Info: iteration 5, average log likelihood -1.418195
[ Info: iteration 6, average log likelihood -1.418097
[ Info: iteration 7, average log likelihood -1.417999
[ Info: iteration 8, average log likelihood -1.417916
[ Info: iteration 9, average log likelihood -1.417854
[ Info: iteration 10, average log likelihood -1.417809
[ Info: iteration 11, average log likelihood -1.417778
[ Info: iteration 12, average log likelihood -1.417756
[ Info: iteration 13, average log likelihood -1.417738
[ Info: iteration 14, average log likelihood -1.417722
[ Info: iteration 15, average log likelihood -1.417708
[ Info: iteration 16, average log likelihood -1.417696
[ Info: iteration 17, average log likelihood -1.417684
[ Info: iteration 18, average log likelihood -1.417672
[ Info: iteration 19, average log likelihood -1.417661
[ Info: iteration 20, average log likelihood -1.417649
[ Info: iteration 21, average log likelihood -1.417638
[ Info: iteration 22, average log likelihood -1.417626
[ Info: iteration 23, average log likelihood -1.417615
[ Info: iteration 24, average log likelihood -1.417603
[ Info: iteration 25, average log likelihood -1.417592
[ Info: iteration 26, average log likelihood -1.417580
[ Info: iteration 27, average log likelihood -1.417569
[ Info: iteration 28, average log likelihood -1.417559
[ Info: iteration 29, average log likelihood -1.417549
[ Info: iteration 30, average log likelihood -1.417539
[ Info: iteration 31, average log likelihood -1.417530
[ Info: iteration 32, average log likelihood -1.417522
[ Info: iteration 33, average log likelihood -1.417514
[ Info: iteration 34, average log likelihood -1.417507
[ Info: iteration 35, average log likelihood -1.417501
[ Info: iteration 36, average log likelihood -1.417495
[ Info: iteration 37, average log likelihood -1.417489
[ Info: iteration 38, average log likelihood -1.417484
[ Info: iteration 39, average log likelihood -1.417479
[ Info: iteration 40, average log likelihood -1.417475
[ Info: iteration 41, average log likelihood -1.417471
[ Info: iteration 42, average log likelihood -1.417467
[ Info: iteration 43, average log likelihood -1.417463
[ Info: iteration 44, average log likelihood -1.417460
[ Info: iteration 45, average log likelihood -1.417457
[ Info: iteration 46, average log likelihood -1.417454
[ Info: iteration 47, average log likelihood -1.417451
[ Info: iteration 48, average log likelihood -1.417449
[ Info: iteration 49, average log likelihood -1.417446
[ Info: iteration 50, average log likelihood -1.417444
┌ Info: EM with 100000 data points 50 iterations avll -1.417444
└ 473.9 data points per parameter
┌ Info: 2
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4184846738719863
│     -1.4184144970926815
│      ⋮                 
└     -1.4174438711725235
[ Info: Running 50 iterations EM on diag cov GMM with 8 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.417452
[ Info: iteration 2, average log likelihood -1.417392
[ Info: iteration 3, average log likelihood -1.417336
[ Info: iteration 4, average log likelihood -1.417268
[ Info: iteration 5, average log likelihood -1.417182
[ Info: iteration 6, average log likelihood -1.417074
[ Info: iteration 7, average log likelihood -1.416945
[ Info: iteration 8, average log likelihood -1.416803
[ Info: iteration 9, average log likelihood -1.416658
[ Info: iteration 10, average log likelihood -1.416525
[ Info: iteration 11, average log likelihood -1.416413
[ Info: iteration 12, average log likelihood -1.416325
[ Info: iteration 13, average log likelihood -1.416261
[ Info: iteration 14, average log likelihood -1.416214
[ Info: iteration 15, average log likelihood -1.416178
[ Info: iteration 16, average log likelihood -1.416151
[ Info: iteration 17, average log likelihood -1.416128
[ Info: iteration 18, average log likelihood -1.416108
[ Info: iteration 19, average log likelihood -1.416090
[ Info: iteration 20, average log likelihood -1.416074
[ Info: iteration 21, average log likelihood -1.416059
[ Info: iteration 22, average log likelihood -1.416045
[ Info: iteration 23, average log likelihood -1.416032
[ Info: iteration 24, average log likelihood -1.416019
[ Info: iteration 25, average log likelihood -1.416008
[ Info: iteration 26, average log likelihood -1.415997
[ Info: iteration 27, average log likelihood -1.415986
[ Info: iteration 28, average log likelihood -1.415977
[ Info: iteration 29, average log likelihood -1.415968
[ Info: iteration 30, average log likelihood -1.415959
[ Info: iteration 31, average log likelihood -1.415951
[ Info: iteration 32, average log likelihood -1.415943
[ Info: iteration 33, average log likelihood -1.415936
[ Info: iteration 34, average log likelihood -1.415929
[ Info: iteration 35, average log likelihood -1.415923
[ Info: iteration 36, average log likelihood -1.415917
[ Info: iteration 37, average log likelihood -1.415911
[ Info: iteration 38, average log likelihood -1.415905
[ Info: iteration 39, average log likelihood -1.415900
[ Info: iteration 40, average log likelihood -1.415895
[ Info: iteration 41, average log likelihood -1.415890
[ Info: iteration 42, average log likelihood -1.415886
[ Info: iteration 43, average log likelihood -1.415881
[ Info: iteration 44, average log likelihood -1.415877
[ Info: iteration 45, average log likelihood -1.415873
[ Info: iteration 46, average log likelihood -1.415869
[ Info: iteration 47, average log likelihood -1.415865
[ Info: iteration 48, average log likelihood -1.415861
[ Info: iteration 49, average log likelihood -1.415857
[ Info: iteration 50, average log likelihood -1.415854
┌ Info: EM with 100000 data points 50 iterations avll -1.415854
└ 236.4 data points per parameter
┌ Info: 3
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4174515802751309
│     -1.4173917873887416
│      ⋮                 
└     -1.4158535188834591
[ Info: Running 50 iterations EM on diag cov GMM with 16 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.415858
[ Info: iteration 2, average log likelihood -1.415791
[ Info: iteration 3, average log likelihood -1.415727
[ Info: iteration 4, average log likelihood -1.415650
[ Info: iteration 5, average log likelihood -1.415553
[ Info: iteration 6, average log likelihood -1.415435
[ Info: iteration 7, average log likelihood -1.415301
[ Info: iteration 8, average log likelihood -1.415160
[ Info: iteration 9, average log likelihood -1.415021
[ Info: iteration 10, average log likelihood -1.414892
[ Info: iteration 11, average log likelihood -1.414774
[ Info: iteration 12, average log likelihood -1.414670
[ Info: iteration 13, average log likelihood -1.414578
[ Info: iteration 14, average log likelihood -1.414498
[ Info: iteration 15, average log likelihood -1.414427
[ Info: iteration 16, average log likelihood -1.414366
[ Info: iteration 17, average log likelihood -1.414311
[ Info: iteration 18, average log likelihood -1.414263
[ Info: iteration 19, average log likelihood -1.414219
[ Info: iteration 20, average log likelihood -1.414181
[ Info: iteration 21, average log likelihood -1.414146
[ Info: iteration 22, average log likelihood -1.414114
[ Info: iteration 23, average log likelihood -1.414085
[ Info: iteration 24, average log likelihood -1.414059
[ Info: iteration 25, average log likelihood -1.414035
[ Info: iteration 26, average log likelihood -1.414012
[ Info: iteration 27, average log likelihood -1.413992
[ Info: iteration 28, average log likelihood -1.413974
[ Info: iteration 29, average log likelihood -1.413956
[ Info: iteration 30, average log likelihood -1.413941
[ Info: iteration 31, average log likelihood -1.413926
[ Info: iteration 32, average log likelihood -1.413912
[ Info: iteration 33, average log likelihood -1.413899
[ Info: iteration 34, average log likelihood -1.413888
[ Info: iteration 35, average log likelihood -1.413876
[ Info: iteration 36, average log likelihood -1.413866
[ Info: iteration 37, average log likelihood -1.413855
[ Info: iteration 38, average log likelihood -1.413846
[ Info: iteration 39, average log likelihood -1.413836
[ Info: iteration 40, average log likelihood -1.413828
[ Info: iteration 41, average log likelihood -1.413819
[ Info: iteration 42, average log likelihood -1.413811
[ Info: iteration 43, average log likelihood -1.413802
[ Info: iteration 44, average log likelihood -1.413795
[ Info: iteration 45, average log likelihood -1.413787
[ Info: iteration 46, average log likelihood -1.413779
[ Info: iteration 47, average log likelihood -1.413772
[ Info: iteration 48, average log likelihood -1.413764
[ Info: iteration 49, average log likelihood -1.413757
[ Info: iteration 50, average log likelihood -1.413750
┌ Info: EM with 100000 data points 50 iterations avll -1.413750
└ 118.1 data points per parameter
┌ Info: 4
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4158583008838521
│     -1.4157911939805077
│      ⋮                 
└     -1.413750008028605 
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.413752
[ Info: iteration 2, average log likelihood -1.413681
[ Info: iteration 3, average log likelihood -1.413612
[ Info: iteration 4, average log likelihood -1.413529
[ Info: iteration 5, average log likelihood -1.413421
[ Info: iteration 6, average log likelihood -1.413286
[ Info: iteration 7, average log likelihood -1.413127
[ Info: iteration 8, average log likelihood -1.412953
[ Info: iteration 9, average log likelihood -1.412775
[ Info: iteration 10, average log likelihood -1.412602
[ Info: iteration 11, average log likelihood -1.412440
[ Info: iteration 12, average log likelihood -1.412293
[ Info: iteration 13, average log likelihood -1.412159
[ Info: iteration 14, average log likelihood -1.412039
[ Info: iteration 15, average log likelihood -1.411932
[ Info: iteration 16, average log likelihood -1.411836
[ Info: iteration 17, average log likelihood -1.411751
[ Info: iteration 18, average log likelihood -1.411676
[ Info: iteration 19, average log likelihood -1.411609
[ Info: iteration 20, average log likelihood -1.411550
[ Info: iteration 21, average log likelihood -1.411497
[ Info: iteration 22, average log likelihood -1.411451
[ Info: iteration 23, average log likelihood -1.411410
[ Info: iteration 24, average log likelihood -1.411372
[ Info: iteration 25, average log likelihood -1.411339
[ Info: iteration 26, average log likelihood -1.411308
[ Info: iteration 27, average log likelihood -1.411280
[ Info: iteration 28, average log likelihood -1.411254
[ Info: iteration 29, average log likelihood -1.411230
[ Info: iteration 30, average log likelihood -1.411207
[ Info: iteration 31, average log likelihood -1.411185
[ Info: iteration 32, average log likelihood -1.411164
[ Info: iteration 33, average log likelihood -1.411145
[ Info: iteration 34, average log likelihood -1.411126
[ Info: iteration 35, average log likelihood -1.411108
[ Info: iteration 36, average log likelihood -1.411090
[ Info: iteration 37, average log likelihood -1.411073
[ Info: iteration 38, average log likelihood -1.411057
[ Info: iteration 39, average log likelihood -1.411041
[ Info: iteration 40, average log likelihood -1.411026
[ Info: iteration 41, average log likelihood -1.411010
[ Info: iteration 42, average log likelihood -1.410996
[ Info: iteration 43, average log likelihood -1.410981
[ Info: iteration 44, average log likelihood -1.410966
[ Info: iteration 45, average log likelihood -1.410952
[ Info: iteration 46, average log likelihood -1.410938
[ Info: iteration 47, average log likelihood -1.410924
[ Info: iteration 48, average log likelihood -1.410910
[ Info: iteration 49, average log likelihood -1.410896
[ Info: iteration 50, average log likelihood -1.410882
┌ Info: EM with 100000 data points 50 iterations avll -1.410882
└ 59.0 data points per parameter
┌ Info: 5
│   : avll =  = ": avll = "
│   avll =
│    50-element Array{Float64,1}:
│     -1.4137521819091907
│     -1.4136813900470042
│      ⋮                 
└     -1.4108823694227164
┌ Info: Total log likelihood: 
│   tll =
│    251-element Array{Float64,1}:
│     -1.4238831715345055
│     -1.4239024618318172
│     -1.4238355412401373
│     -1.4237808510824543
│      ⋮                 
│     -1.4109101026428537
│     -1.4108962055024425
└     -1.4108823694227164
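The `method split` run above doubles the mixture from 2 to 32 Gaussians, running 50 EM iterations at each stage; the 251-element total-log-likelihood array is the initial value plus 5 stages × 50 iterations. The schedule arithmetic can be sketched as follows (illustrative names; not GaussianMixtures' code):

```python
def split_schedule(target_n, start_n=2, iters_per_stage=50):
    """Component counts visited by split-doubling, and the number of
    logged likelihood values (one initial entry plus one per iteration)."""
    stages = [start_n]
    while stages[-1] < target_n:
        stages.append(stages[-1] * 2)   # each split doubles the mixture size
    total_tll = 1 + len(stages) * iters_per_stage
    return stages, total_tll

stages, total = split_schedule(32)
print(stages, total)  # [2, 4, 8, 16, 32] 251
```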
32×26 Array{Float64,2}:
 -0.553984     0.22446     0.307891     0.749996    -0.506771    0.270779     0.0589894   0.234063   -0.117905    -0.0650462   0.376062   -0.130712    0.083545   -0.573585   -0.349229    -0.172751   -0.241845     0.0794482   -0.35029      0.307334     0.212853     0.195629   -0.146018   -0.246876     -0.492512    0.166469 
 -0.190652     0.247904    0.212829     0.56118     -0.445695   -0.0851792   -0.258313    0.124281    0.158512    -0.148089    0.435015    0.179141   -0.149008    0.603381   -0.370773    -0.47816    -0.0275157   -0.237638     0.21717     -0.0158913   -0.00464738  -0.547123   -0.0996559  -0.00772091   -0.0600922  -0.024727 
 -0.00934553  -0.128251   -0.418182     0.16152     -0.111596    0.34577      0.0829477   0.0305031  -0.0764512    0.191657   -0.28071    -0.0251153  -0.512323   -0.557371   -0.170412    -0.0920771  -0.260742    -0.602872     0.455444     0.356949     0.0692826    0.108983   -0.222576    0.438048     -0.119943   -0.357926 
 -0.0277845   -0.529875    0.568722     0.0418424   -0.333912    0.067524    -0.0181221  -0.337174    0.218697    -0.131251    0.376301    0.0804913   0.472218   -0.250977   -0.0660723   -0.514735   -0.257573    -0.426579     0.15857      0.166873    -0.210588    -0.0998601   0.326787    0.44739       0.0203724  -0.0935825
  0.195717    -0.684919   -0.403422    -0.299992     0.751869   -0.339014    -0.193269   -0.100535    0.151268     0.196401   -0.208217    0.337848    0.167397    0.313852   -0.0167023    0.378556    0.270246    -0.172709     0.242029    -0.0831016   -0.845433     0.276625   -0.213512    0.287682     -0.0362846  -0.464403 
 -0.241425     0.136341   -0.559046     0.00760764   0.429551   -0.350168     0.220544   -0.45788     0.508196    -0.0117865  -0.32068    -0.0249189  -0.417321    0.39972    -0.200864    -0.0699775   0.449638    -0.242809     0.770001     0.0767994   -0.15761     -0.248313    0.0605867  -0.0851447     0.443382   -0.353697 
  0.00786407   0.590221   -0.00299337   0.0340515   -0.371203   -0.0127042    0.0271585   0.716538   -0.301146     0.286234   -0.480384   -0.32919    -0.901366    0.317471   -0.334288    -0.0220635   0.0028411    0.368885     0.267687     0.120891     0.0727473   -0.130018   -0.644888   -0.150221     -0.262795   -0.137028 
  0.183336     0.525339   -0.365803     0.0435217    0.784547    0.0416424   -0.0358611   0.447918   -0.00145536   0.446738    0.233988   -0.144668   -0.409354    0.36721     0.262679     0.208015    0.616412     0.212571     0.063347    -0.287143     0.149287     0.18082    -0.314776   -0.291771     -0.203149   -0.202105 
 -0.864914     0.0948946  -0.618243    -0.891282     0.162044    0.174194    -0.317481   -0.567016   -0.250568    -0.341629   -0.166593    0.516      -0.253648   -0.439629   -0.279118     0.230005    0.148052     0.159424    -0.139311    -0.0245377    0.0618777    0.291332   -0.415648    0.125156     -0.0227572   0.4601   
 -0.158334    -0.185799   -0.0907807   -0.219135     0.0726378   0.296661     0.232548   -0.463535   -0.00994249   0.0881685  -0.210725    0.126207    0.151182   -0.500504   -0.0980981    0.298632   -0.637453     0.377499    -0.02144      0.123632     0.113111    -0.0676782  -0.148128   -0.321323      0.460527    0.0521532
  0.189283    -0.262346    0.0480651    0.195951    -0.263141   -0.136773    -0.714357    0.284798    0.460998     0.215301   -0.710154   -0.191323    0.291937   -0.280062    0.417419     0.629838   -0.595425     0.188056     0.275899     0.201113     0.133238    -0.246457    0.233275    0.0865887     0.311694   -0.258836 
  0.0683009   -0.241259   -0.267327    -0.22583     -0.143133    0.270173    -0.0907697   0.422857    0.492336     0.429562   -0.417079    0.448533    0.304361   -0.251702    0.564623     0.202822    0.694466     0.125624    -0.114282    -0.17598     -0.15467      0.157992    0.376609    0.158849      0.267665    0.29286  
  0.253173     0.15445     0.122432     0.0676081   -0.0177418  -0.238966    -0.368018    0.175887   -0.0709944    0.0347837   0.347614    0.0162546   0.263144   -0.02777     0.182391     0.173777    0.0749511   -0.00784902  -0.0239132    0.256897     0.193111     0.28125     0.0574872  -0.0140509    -0.415478   -0.0983669
 -0.0120481    0.139711   -0.0362922   -0.104767    -0.0332919  -0.0615415   -0.139626    0.016435    0.0357722   -0.151481   -0.108553    0.0141516  -0.0804902   0.191684   -0.12336      0.109653   -0.00687407  -0.0358907   -0.00778728  -0.0732009   -0.0783578   -0.0610743  -0.031403   -0.0209979     0.124936   -0.0481491
 -0.0643453   -0.294445    0.00107326   0.285238     0.0649214   0.0678202    0.609156    0.0319043   0.049219     0.0730021   0.0563022  -0.147163   -0.0497954  -0.140469    0.00893829  -0.288289    0.101406     0.281081     0.152826    -0.0388318   -0.157119    -0.0598017  -0.0217003   0.0790038    -0.0712317   0.185438 
  0.0638029    0.612748    0.250209    -0.253452    -0.0414713   0.225698     0.613829   -0.0765463  -0.136776    -0.134259    0.146593    0.249744   -0.210752    0.191703    0.0541103   -0.352199    0.141446    -0.0828631   -0.146924    -0.177479     0.174152     0.0749451   0.298948   -0.0878891     0.129381    0.621272 
 -0.158314    -0.442306    0.174772    -0.212903    -0.0578335   0.199434     0.110314   -0.97801     0.048439    -0.131493    0.0255383  -0.393764   -0.39024    -0.0733475  -0.52676     -0.0928385  -0.709149    -0.578571     0.505991     0.123298    -0.200221    -0.188747   -0.247956    0.206156      0.299881   -0.860089 
  0.260535     0.4764      0.328148    -0.892801     0.16913    -0.172298     0.15247    -0.339399   -0.531199    -0.459388   -0.0224522  -0.369069    0.0601415   0.284272    0.188904    -0.259265    0.493994    -0.0444044    0.0381523    0.6552       0.0217813    0.446123   -0.53337     0.126016     -0.265999   -0.604257 
  0.0944908    0.131534    0.345397    -0.0499203   -0.0831502  -0.400456    -0.227669   -0.726039    0.169265    -0.968685    0.0125606   0.449422    0.19302    -0.230469   -0.0831897    0.0132364  -0.548037    -0.260017    -0.223657     0.101191    -0.722429    -0.162864    0.32538     0.233339      0.336943    0.481907 
  0.346982     0.208716    0.268968    -0.963649     0.142691    0.229471     0.096269   -0.768236    0.156375    -0.194812   -0.244894    0.307718   -0.172946    0.439099    0.407202     0.202773   -0.00967906  -0.185332    -0.354428    -0.300609     0.416808    -0.043629    0.381796    0.000175954   0.407162    0.176406 
 -0.319275    -0.271141   -0.0916594   -0.0934053   -0.231082   -0.056832    -1.21978    -0.179975    0.0594293    0.124423    0.67167     0.211425   -0.122122    0.241591   -0.316096     0.575735    0.0167738   -0.413983     0.074519    -0.343173     0.604247    -0.341176    0.191445   -0.279418     -0.216315   -0.274642 
 -0.623454    -0.135768   -0.584012     0.0305285    0.66167     0.0967846   -0.183002   -0.105317   -0.021334     0.252379    0.0160198   0.490782    0.200922   -0.442656   -0.380368     0.406268   -0.471207     0.752287    -0.0966547    0.00804234   0.173998    -0.281033   -0.26637    -0.958699      0.784086   -0.045222 
 -0.429643    -0.496946    0.216964     0.370692     0.283772    0.2632       0.674894   -0.215712    0.251919    -0.0223373  -0.372513    0.16001    -0.559946    0.28405    -0.26483      0.336816    0.307844    -0.38651     -1.05487      0.395591     0.0204349   -0.823927    0.234409    0.081933     -0.488683    0.175261 
 -0.370925    -0.271044   -0.00137704   0.283739    -0.0795371   0.58781      1.27333    -0.313951    0.0888195    0.141718   -0.141625   -0.116865   -0.0851466  -0.315256    0.114152    -0.303717    0.248617     0.0455978    0.258565    -0.143524    -0.0310993   -0.144106    0.17335     0.259233      0.163075    0.519128 
  0.400004    -0.615391    0.0048349   -0.554955    -0.0791971   0.560414    -0.170389    0.50732    -0.719607     0.100085    0.188298   -0.256199    0.75215    -0.352848   -0.251015    -0.0428335  -0.532506    -0.394643    -0.537288    -0.363079     0.172392     0.370988    0.221591    0.340281     -0.161904    0.322225 
  0.203885     0.774666    0.136058     0.14939     -0.0641312  -0.35554     -0.483036    0.668676   -0.347673    -0.044227    0.0107949   0.587392    0.331549   -0.0596098  -0.0402523    0.456071    0.334757     0.152095    -0.637353     0.0998303    0.119597     0.156503    0.237922    0.0109875    -0.381633    0.497184 
  0.46668      0.145585    0.0706786   -0.0245689    0.194683    0.150791     0.844401    0.320407   -0.181043    -0.168161   -0.177856   -0.302524    0.319728   -0.398316    0.552432    -0.248445   -0.0279912    0.901195    -0.0100677    0.245767    -0.172473     0.683123   -0.312113    0.122507      0.157416    0.336673 
  0.234287     0.28543     0.10513     -0.554415    -0.166075   -0.184855    -0.60768     0.1975     -0.246204    -0.0715261   0.35864     0.048365    0.700252   -0.204457    0.125709    -0.300848   -0.192718     0.647305     0.523453    -0.506272    -0.253286     0.437615    0.0853814  -0.105168      0.533111   -0.0504219
  0.0964268    0.238628    0.00581067  -0.33677      0.159341   -0.0823109   -0.150858   -0.0480864  -0.0325094    0.060909    0.0432009   0.0789057  -0.0859432   0.192532    0.10789      0.132495    0.158933     0.0161898   -0.0357033   -0.0495546    0.0129288    0.0942389  -0.0197253  -0.049208     -0.0274956  -0.100568 
  0.185469    -0.161374   -0.0796741    0.871186    -0.245409   -0.231652    -0.381659    0.662634    0.561258     0.254774    0.301977    0.0517313   0.0117337  -0.0547185   0.0370724   -0.19915     0.169309    -0.185979     0.528462     0.279928    -0.245844    -0.095277   -0.154181    0.245884     -0.350864   -0.171289 
  0.804702    -0.165152    0.516178     0.759611     0.331792   -0.00319524   1.00482     0.549184    0.300555     0.764778    0.628879   -0.189559    0.348138    0.713991    0.0921804   -0.16347    -0.0240873   -0.629042     0.0919628    0.23751      0.304981     0.204303    0.880169   -0.498271      0.141893   -0.325372 
  0.893778    -0.0937053   0.901549    -0.179074    -0.151939   -0.260099    -0.0180602   0.265683    0.405333    -0.429413   -0.487273   -0.764567   -0.154992    0.988064    0.476117    -0.0917891   0.52134     -0.225399     0.11327      0.418983    -0.0129152    0.113833    0.324728    1.06905      -1.0814      0.304531
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.410869
[ Info: iteration 2, average log likelihood -1.410855
[ Info: iteration 3, average log likelihood -1.410841
[ Info: iteration 4, average log likelihood -1.410828
[ Info: iteration 5, average log likelihood -1.410815
[ Info: iteration 6, average log likelihood -1.410801
[ Info: iteration 7, average log likelihood -1.410789
[ Info: iteration 8, average log likelihood -1.410776
[ Info: iteration 9, average log likelihood -1.410764
[ Info: iteration 10, average log likelihood -1.410752
┌ Info: EM with 100000 data points 10 iterations avll -1.410752
└ 59.0 data points per parameter
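The "59.0 data points per parameter" figure above is consistent with counting the free parameters of a diagonal-covariance GMM as (K−1) mixture weights plus K·D means plus K·D variances. A minimal sketch of that arithmetic (the exact counting convention used by GaussianMixtures.jl is an assumption here, but it reproduces the logged value):

```python
def diag_gmm_free_params(K: int, D: int) -> int:
    # (K - 1) mixture weights (they sum to 1), K*D means, K*D variances
    return (K - 1) + K * D + K * D

K, D, n = 32, 26, 100_000
params = diag_gmm_free_params(K, D)   # 31 + 832 + 832 = 1695
print(params, round(n / params, 1))   # -> 1695 59.0, matching the log
```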
kind full, method kmeans
[ Info: Initializing GMM, 32 Gaussians diag covariance 26 dimensions using 100000 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       8.934372e+05
      1       7.076575e+05      -1.857797e+05 |       32
      2       6.906369e+05      -1.702054e+04 |       32
      3       6.850506e+05      -5.586327e+03 |       32
      4       6.824238e+05      -2.626767e+03 |       32
      5       6.808218e+05      -1.602018e+03 |       32
      6       6.797506e+05      -1.071212e+03 |       32
      7       6.788977e+05      -8.529079e+02 |       32
      8       6.782006e+05      -6.970840e+02 |       32
      9       6.776427e+05      -5.579261e+02 |       32
     10       6.771819e+05      -4.607936e+02 |       32
     11       6.768123e+05      -3.695541e+02 |       32
     12       6.765042e+05      -3.081581e+02 |       32
     13       6.762436e+05      -2.605784e+02 |       32
     14       6.760438e+05      -1.997895e+02 |       32
     15       6.758707e+05      -1.730912e+02 |       32
     16       6.757224e+05      -1.483742e+02 |       32
     17       6.755740e+05      -1.483939e+02 |       32
     18       6.754322e+05      -1.417710e+02 |       32
     19       6.752952e+05      -1.370130e+02 |       32
     20       6.751707e+05      -1.244922e+02 |       32
     21       6.750537e+05      -1.170376e+02 |       32
     22       6.749423e+05      -1.113305e+02 |       32
     23       6.748432e+05      -9.914293e+01 |       32
     24       6.747563e+05      -8.691177e+01 |       32
     25       6.746727e+05      -8.359410e+01 |       32
     26       6.745933e+05      -7.935121e+01 |       32
     27       6.745119e+05      -8.137017e+01 |       32
     28       6.744299e+05      -8.209260e+01 |       32
     29       6.743459e+05      -8.396473e+01 |       32
     30       6.742609e+05      -8.500216e+01 |       32
     31       6.741757e+05      -8.517629e+01 |       32
     32       6.740866e+05      -8.908127e+01 |       32
     33       6.740059e+05      -8.077904e+01 |       32
     34       6.739321e+05      -7.378912e+01 |       32
     35       6.738557e+05      -7.639585e+01 |       32
     36       6.737763e+05      -7.939497e+01 |       32
     37       6.736874e+05      -8.885867e+01 |       32
     38       6.735972e+05      -9.025032e+01 |       32
     39       6.735115e+05      -8.563602e+01 |       32
     40       6.734217e+05      -8.987480e+01 |       32
     41       6.733362e+05      -8.541669e+01 |       32
     42       6.732579e+05      -7.830275e+01 |       32
     43       6.731830e+05      -7.492126e+01 |       32
     44       6.731114e+05      -7.163212e+01 |       32
     45       6.730477e+05      -6.367326e+01 |       32
     46       6.729856e+05      -6.210934e+01 |       32
     47       6.729212e+05      -6.441503e+01 |       32
     48       6.728634e+05      -5.773408e+01 |       32
     49       6.728136e+05      -4.980996e+01 |       32
     50       6.727684e+05      -4.525456e+01 |       32
K-means terminated without convergence after 50 iterations (objv = 672768.3845149358)
┌ Info: K-means with 32000 data points using 50 iterations
└ 37.0 data points per parameter
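The table above is Clustering.jl's k-means trace: the total within-cluster cost (`objv`), its change per iteration, and the number of affected clusters. The cost is non-increasing because each Lloyd iteration first reassigns points to their nearest center and then recomputes centers as cluster means, and neither step can raise the cost. A plain NumPy sketch of that iteration (an independent illustration, not the Clustering.jl implementation):

```python
import numpy as np

def lloyd_step(X, centers):
    """One Lloyd iteration: assign to nearest center, then recompute centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # squared distances
    labels = d2.argmin(1)
    objv = d2[np.arange(len(X)), labels].sum()                  # cost before the update
    new_centers = np.vstack([
        X[labels == k].mean(0) if (labels == k).any() else centers[k]
        for k in range(len(centers))
    ])
    return new_centers, objv

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
centers = X[rng.choice(1000, 4, replace=False)]
history = []
for _ in range(20):
    centers, objv = lloyd_step(X, centers)
    history.append(objv)
# both half-steps can only lower the cost, so the trace is non-increasing
assert all(b <= a + 1e-9 for a, b in zip(history, history[1:]))
```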
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.422954
[ Info: iteration 2, average log likelihood -1.417835
[ Info: iteration 3, average log likelihood -1.416450
[ Info: iteration 4, average log likelihood -1.415439
[ Info: iteration 5, average log likelihood -1.414438
[ Info: iteration 6, average log likelihood -1.413559
[ Info: iteration 7, average log likelihood -1.412941
[ Info: iteration 8, average log likelihood -1.412563
[ Info: iteration 9, average log likelihood -1.412329
[ Info: iteration 10, average log likelihood -1.412168
[ Info: iteration 11, average log likelihood -1.412046
[ Info: iteration 12, average log likelihood -1.411946
[ Info: iteration 13, average log likelihood -1.411861
[ Info: iteration 14, average log likelihood -1.411787
[ Info: iteration 15, average log likelihood -1.411721
[ Info: iteration 16, average log likelihood -1.411663
[ Info: iteration 17, average log likelihood -1.411610
[ Info: iteration 18, average log likelihood -1.411562
[ Info: iteration 19, average log likelihood -1.411518
[ Info: iteration 20, average log likelihood -1.411478
[ Info: iteration 21, average log likelihood -1.411441
[ Info: iteration 22, average log likelihood -1.411407
[ Info: iteration 23, average log likelihood -1.411375
[ Info: iteration 24, average log likelihood -1.411345
[ Info: iteration 25, average log likelihood -1.411318
[ Info: iteration 26, average log likelihood -1.411291
[ Info: iteration 27, average log likelihood -1.411267
[ Info: iteration 28, average log likelihood -1.411243
[ Info: iteration 29, average log likelihood -1.411221
[ Info: iteration 30, average log likelihood -1.411199
[ Info: iteration 31, average log likelihood -1.411178
[ Info: iteration 32, average log likelihood -1.411158
[ Info: iteration 33, average log likelihood -1.411138
[ Info: iteration 34, average log likelihood -1.411119
[ Info: iteration 35, average log likelihood -1.411101
[ Info: iteration 36, average log likelihood -1.411082
[ Info: iteration 37, average log likelihood -1.411065
[ Info: iteration 38, average log likelihood -1.411047
[ Info: iteration 39, average log likelihood -1.411030
[ Info: iteration 40, average log likelihood -1.411013
[ Info: iteration 41, average log likelihood -1.410997
[ Info: iteration 42, average log likelihood -1.410981
[ Info: iteration 43, average log likelihood -1.410965
[ Info: iteration 44, average log likelihood -1.410950
[ Info: iteration 45, average log likelihood -1.410935
[ Info: iteration 46, average log likelihood -1.410920
[ Info: iteration 47, average log likelihood -1.410906
[ Info: iteration 48, average log likelihood -1.410893
[ Info: iteration 49, average log likelihood -1.410879
[ Info: iteration 50, average log likelihood -1.410866
┌ Info: EM with 100000 data points 50 iterations avll -1.410866
└ 59.0 data points per parameter
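Note that the average log-likelihood in the trace rises at every iteration; that monotone improvement is the standard EM guarantee, and a violation would indicate a bug. A compact diagonal-covariance E-step/M-step in NumPy that reproduces the behaviour (an independent sketch under textbook EM updates, not the package's code):

```python
import numpy as np

def em_step(X, w, mu, var):
    """One EM iteration for a diag-covariance GMM; returns new params and avll."""
    # E-step: per-component log densities plus log weights
    logp = (-0.5 * (np.log(2 * np.pi * var)[None, :, :]
                    + (X[:, None, :] - mu[None, :, :]) ** 2 / var[None, :, :]).sum(-1)
            + np.log(w)[None, :])
    m = logp.max(1, keepdims=True)
    lse = m[:, 0] + np.log(np.exp(logp - m).sum(1))   # stable log-sum-exp
    r = np.exp(logp - lse[:, None])                    # responsibilities
    # M-step: weighted counts, means, variances
    Nk = r.sum(0)
    w = Nk / len(X)
    mu = (r[:, :, None] * X[:, None, :]).sum(0) / Nk[:, None]
    var = (r[:, :, None] * (X[:, None, :] - mu[None, :, :]) ** 2).sum(0) / Nk[:, None]
    return w, mu, np.maximum(var, 1e-6), lse.mean()    # avll of the incoming params

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
K, D = 4, 3
w, mu, var = np.full(K, 1 / K), X[rng.choice(len(X), K, replace=False)], np.ones((K, D))
avll = []
for _ in range(10):
    w, mu, var, a = em_step(X, w, mu, var)
    avll.append(a)
# EM guarantees the average log likelihood never decreases
assert all(b >= a - 1e-9 for a, b in zip(avll, avll[1:]))
```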
32×26 Array{Float64,2}:
 -0.613082    -0.103327   -0.513628    -0.702669     0.499942     0.314947    0.0901797  -0.71684    -0.184452    -0.159354    -0.162303     0.360137   -0.0415063  -0.370816   -0.18413      0.380413    -0.233695      0.354027   -0.222807     0.0994505   0.302528     0.188107   -0.295966   -0.323406     0.441471    0.143985 
  0.334078    -0.252569    0.0503901    0.723895    -0.0585464   -0.416083   -0.359282    0.609245    0.602099    -0.114818     0.319703     0.386735   -0.387522    0.458248    0.0735178   -0.0180704    0.28044      -0.110784    0.331745    -0.107566   -0.326405    -0.572368    0.119188    0.711862    -0.217892    0.232179 
  0.257097    -0.0333044  -0.501857     0.334447    -0.27387      0.44821     0.165516    0.340401    0.0659099    0.383151    -0.326357    -0.119775   -0.36539    -0.618645    0.149116    -0.0998119   -0.0750454    -0.394514    0.502086     0.582796    0.124724     0.124799   -0.495726    0.447387    -0.297969   -0.201768 
 -0.0515135   -0.199043    0.308433     0.185201    -0.289469     0.0686456  -0.0383474  -0.234878    0.263113    -0.202182     0.108514    -0.0143507  -0.0782849   0.228628   -0.0800061   -0.0581319   -0.143258     -0.52445     0.107263     0.109624   -0.0241196   -0.19221     0.109482    0.348487    -0.069532   -0.126096 
  0.396803    -0.568088   -0.00660363   0.0547828    0.64744     -0.81574    -0.185096    0.162821    0.0806759   -0.0866874    0.116848    -0.732335    0.367449    0.333579   -0.232987     0.257343     0.0227074     0.206582    0.197699     0.780253   -0.41648      0.51838    -0.260673   -0.0311738   -0.574408   -0.414943 
  0.422633     0.707817   -0.188827    -0.461711     0.150425    -0.650251   -0.845556    0.136707   -0.429813    -0.160962     0.146495     0.58988     0.35028     0.128732   -0.18558      0.116719    -0.054992      0.30502     0.147441    -0.324797   -0.299427     0.206175    0.110513   -0.102787     0.225104   -0.0287601
  0.0224709   -0.397359   -0.507218     0.0620953    0.561835    -0.0304082   0.339519   -0.269337   -0.0942411   -0.0283431    0.226886     0.77949     0.366875   -0.339466    0.0450851    0.0348405    0.000727142   0.204208    0.230116    -0.265026   -0.839661     0.0277647  -0.288699    0.142844     0.16077     0.164876 
 -0.476953    -0.047659    0.252398     0.109147    -0.227826     0.189019   -0.0748042  -0.413348    0.195599    -0.519651    -0.312061     0.466741    0.0306054  -0.397625   -0.333933     0.261921    -0.302973     -0.600259   -0.758826     0.647078   -0.412194    -0.227431    0.418412    0.183471    -0.124884    0.391313 
 -0.0259248   -0.491728   -0.674369     0.45199     -0.338116     0.34991    -0.342418    0.521559    0.515668     0.23732     -0.113314     0.192111    0.850976   -0.578015    0.34583      0.168946    -0.0677183     0.252959    0.129221    -0.598859    0.0288707    0.361362    0.433932    0.107882     0.69204     0.258113 
 -0.557798    -0.40013     0.0192803    0.400618    -0.0563672   -0.148082   -0.694366    0.0606316   0.333891     0.482311    -0.426381     0.120122    0.276783   -0.460988   -0.178912     0.611347    -0.7675        0.504822    0.194766     0.122668    0.209101    -0.705845    0.062578   -0.283115     0.255425   -0.197304 
  0.27208     -0.423498   -0.196882    -0.626646     0.518021    -0.0835401  -0.222446   -0.174382    0.382096     0.260605    -0.212716     0.655348    0.253368    0.374022    0.470163     0.218428     0.681547     -0.435094    0.0318081   -0.0174114  -0.37198      0.0946975   0.131004    0.495114     0.100441   -0.274426 
  0.0419876    0.233237   -0.314145     0.0869355    0.554843     0.0686471   0.191761    0.502279    0.025127     0.689273     0.0574321   -0.197462   -0.273926    0.0067172   0.362792     0.196559     0.602872      0.357655    0.146054    -0.164493    0.150938     0.0959651  -0.282524   -0.214317    -0.300299   -0.182545 
  0.00820706   0.0300666   0.0510119   -0.00113629  -0.0732952   -0.0311108   0.0119669  -0.0605635   0.0814398   -0.0937089    0.00919535   0.0881225   0.053767    0.0252015  -0.0724961   -0.0277235   -0.0270807    -0.0675365   0.0137625    0.0476473  -0.0275862   -0.06316     0.0215673   0.0464826    0.034743    0.0372903
 -0.490139    -0.0805278   0.274375     0.501921    -0.555181     0.0847884  -0.0404222   0.355699   -0.153266    -0.0692613    0.399171    -0.0342861   0.15039    -0.271989   -0.731536    -0.984104    -0.34558       0.12693     0.461482     0.132426   -0.337383     0.0808566  -0.375521   -0.113025    -0.12302    -0.0747787
  0.228523     0.478484   -0.337321    -0.243836     0.574281    -0.144668    0.201152    0.130796   -0.137285     0.0585899    0.10665     -0.137873   -0.254142    0.553796    0.0495272    0.11445      0.497069      0.274347    0.184322    -0.234632    0.0052801    0.213723   -0.0799384  -0.148803     0.0352874   0.0475052
  0.491313     0.09602     0.892656     0.387524    -0.355945    -0.206189    0.227472    0.0450992   0.0242604   -0.0562403    0.363638    -0.282544    0.331022    0.192379    0.41474     -0.320401    -0.12299      -0.0832533   0.100285    -0.0886337  -0.00752938   0.0119617   0.451155    0.169458    -0.269065    0.119933 
 -0.384983    -0.39054     0.18786      0.511034     0.00500331   0.311471    1.20141    -0.38132     0.249786    -0.0658895   -0.220661    -0.278396   -0.397083    0.0172025  -0.0464185   -0.218766     0.397188     -0.0968606  -0.0985717    0.100731   -0.0534532   -0.438166    0.138672    0.234262    -0.19963     0.379945 
  0.0196748    0.732974    0.115916     0.426692    -0.136819     0.0576004   0.0893736   0.49753    -0.823121    -0.0784782    0.309398     0.0632793   0.404012   -0.531048   -0.0147022    0.277146     0.0202271     0.494229   -0.365867     0.35447     0.352574     0.308784   -0.100893    0.00616429  -0.296654    0.585621 
  0.806071    -0.0524753   0.137985    -0.331713    -0.261758    -0.0558232  -0.493931    0.21338     0.321043     0.105971    -0.969503    -0.359367    0.0885286   0.0140346   0.701992     0.81336     -0.297991      0.12052     0.188988     0.333218    0.191219    -0.0162284   0.38335     0.187226     0.260039   -0.0160056
  0.354438    -0.209011    0.149427     0.704509     0.454647    -0.0284102   0.728853    0.424202    0.327028     0.814624     0.417556     0.26613     0.0965146   0.276238   -0.00412379  -0.100054     0.121612     -0.504654   -0.0475547    0.275527    0.437766     0.0306934   0.802438   -0.421139     0.0611919  -0.0768758
 -0.195624    -0.411644    0.0188767   -0.111551    -0.142145     0.69235     0.0129829   0.340858    0.00879218   0.524259     0.0553922    0.042094   -0.416576    0.390533   -0.504935     0.589325     0.201875     -0.0142269  -0.424834    -0.60856     0.160042    -0.718058   -0.235519   -0.815119    -0.0394219   0.238144 
 -0.00144641  -0.054576   -0.0043048   -0.285025     0.0669237    0.235242    0.403842   -0.0602374  -0.137116     0.0156563   -0.2001      -0.171309    0.0446659  -0.392397    0.177558    -0.00749387  -0.230188      0.574746   -0.0138515    0.139854    0.0352033    0.302417   -0.164406   -0.114014     0.238914    0.0688556
  0.120807    -0.0794015   0.892426    -0.686703     0.219239    -0.205418   -0.33145    -0.756391    0.48489     -0.3171       0.236385    -0.336131    0.113644   -0.131832    0.29905     -0.260053    -0.608585      0.271644    0.011469    -0.788794   -0.64861     -0.0944365   0.166046   -0.183526     1.06034     0.0434442
  0.368071    -0.693373    0.17206     -0.750377    -0.00367436   0.527977    0.143732    0.410243   -0.846913     0.139516     0.0957971   -0.350731    0.623962   -0.304504   -0.148369    -0.17579     -0.455812     -0.406212   -0.339927    -0.484037   -0.051551     0.476546    0.24425     0.507339    -0.0755182   0.229634 
 -0.295403    -0.166012   -0.481455    -0.185961     0.290489    -0.180997    0.051595   -0.527551    0.373294     0.00752357  -0.526842    -0.0650036  -0.505143    0.246584   -0.245786     0.0532703    0.0584421    -0.343163    0.604974    -0.0729462  -0.407981    -0.152501   -0.0965786   0.00439873   0.447005   -0.600313 
 -0.0910405   -0.553848    0.0472491   -0.0180938   -0.136494     0.159582   -0.313561   -0.644691   -0.0926053   -0.176307     0.663084    -0.0752782   0.0163297  -0.12831    -0.55431     -0.00549735  -0.465135     -0.633698    0.273379     0.0917397   0.357309    -0.243157    0.0904489   0.21934     -0.056063   -0.4706   
  0.0387234    0.120427    0.160889    -0.330758    -0.290896    -0.211635   -0.901078    0.269644    0.0712789   -0.00894491   0.332939     0.283194    0.392008   -0.0219568   0.166723     0.287181     0.278645      0.018604   -0.25566     -0.0672782   0.27283      0.288705    0.267101   -0.0720179   -0.373133    0.0481894
 -0.162706     0.698167    0.0799768   -0.214041    -0.393777     0.577808    0.469289   -0.110151    0.0421668   -0.116373     0.0520705    0.543693   -0.259044   -0.131505    0.0771823   -0.572769    -0.0558014    -0.10013    -0.0828528   -0.392111    0.61087     -0.266571    0.484691   -0.216878     0.442742    0.777907 
  0.330382     0.721705    0.556596    -0.988548    -0.0468779   -0.0818173   0.252775   -0.327781   -0.434814    -0.404035    -0.0645718   -0.170342   -0.265065    0.419707    0.162483    -0.31543      0.531256     -0.206539   -0.00205755   0.362003    0.109583     0.325132   -0.271043    0.345776    -0.321912   -0.157306 
 -0.342767     0.429532   -0.0891022    0.41106     -0.0803252   -0.0462622  -0.328522    0.616739   -0.287513     0.250764    -0.101581    -0.119792   -0.585735    0.210232   -0.243727     0.111335     0.125827     -0.153375    0.0602257   -0.138495   -0.0781774   -0.010263   -0.0788611  -0.0529263   -0.355552   -0.270865 
  0.493244     0.254929    0.215722    -0.0569194   -0.0431264    0.147325    0.826232    0.491499    0.335307    -0.169039    -0.578855    -0.126617    0.23097    -0.220805    0.507128    -0.626401     0.205153      0.878621   -0.029247     0.30339    -0.489736     0.599573   -0.138377    0.351096     0.199657    0.450123 
 -0.0456156    0.637224   -0.0885662    0.458675    -0.170301    -0.174741   -0.555301    0.0129448   0.463088    -0.160851     0.126522     0.130063   -0.375052    0.36416    -0.10212      0.0858657    0.0589636    -0.0326376   0.0587566    0.553001    0.303962    -0.286574   -0.618419   -0.46624     -0.211201   -0.287794
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.410854
[ Info: iteration 2, average log likelihood -1.410842
[ Info: iteration 3, average log likelihood -1.410830
[ Info: iteration 4, average log likelihood -1.410819
[ Info: iteration 5, average log likelihood -1.410808
[ Info: iteration 6, average log likelihood -1.410797
[ Info: iteration 7, average log likelihood -1.410786
[ Info: iteration 8, average log likelihood -1.410776
[ Info: iteration 9, average log likelihood -1.410766
[ Info: iteration 10, average log likelihood -1.410757
┌ Info: EM with 100000 data points 10 iterations avll -1.410757
└ 59.0 data points per parameter
[ Info: Initializing GMM, 2 Gaussians diag covariance 2 dimensions using 900 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.678561e+05
      1       2.230230e+04      -1.455538e+05 |        2
      2       7.823675e+03      -1.447862e+04 |        0
      3       7.823675e+03       0.000000e+00 |        0
K-means converged with 3 iterations (objv = 7823.67549422947)
┌ Info: K-means with 900 data points using 3 iterations
└ 150.0 data points per parameter
[ Info: Running 10 iterations EM on full cov GMM with 2 Gaussians in 2 dimensions
[ Info: iteration 1, average log likelihood -2.043155
[ Info: iteration 2, average log likelihood -2.043154
[ Info: iteration 3, average log likelihood -2.043154
[ Info: iteration 4, average log likelihood -2.043154
[ Info: iteration 5, average log likelihood -2.043154
[ Info: iteration 6, average log likelihood -2.043154
[ Info: iteration 7, average log likelihood -2.043154
[ Info: iteration 8, average log likelihood -2.043154
[ Info: iteration 9, average log likelihood -2.043154
[ Info: iteration 10, average log likelihood -2.043154
┌ Info: EM with 900 data points 10 iterations avll -2.043154
└ 81.8 data points per parameter
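The "81.8 data points per parameter" for this full-covariance run follows the same accounting: (K−1) weights, K·D means, and K symmetric D×D covariances with D(D+1)/2 free entries each. For K = 2 and D = 2 that gives 1 + 4 + 6 = 11 free parameters, and 900/11 ≈ 81.8. A sketch of the count (counting convention assumed, as above):

```python
def full_gmm_free_params(K: int, D: int) -> int:
    # (K-1) weights + K*D means + K symmetric DxD covariances (D*(D+1)/2 entries each)
    return (K - 1) + K * D + K * D * (D + 1) // 2

print(full_gmm_free_params(2, 2))                   # -> 11
print(round(900 / full_gmm_free_params(2, 2), 1))   # -> 81.8, matching the log
```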
   Testing GaussianMixtures tests passed 

Results with Julia v1.3.1-pre-7704df0a5a

Testing was successful. The last evaluation took 10 minutes, 2 seconds.

 Resolving package versions...
 Installed GaussianMixtures ─── v0.3.0
 Installed LegacyStrings ────── v0.4.1
 Installed Compat ───────────── v2.2.0
 Installed CMake ────────────── v1.1.2
 Installed HDF5 ─────────────── v0.12.5
 Installed QuadGK ───────────── v2.1.1
 Installed DataStructures ───── v0.17.6
 Installed SpecialFunctions ─── v0.8.0
 Installed Blosc ────────────── v0.5.1
 Installed BinDeps ──────────── v0.8.10
 Installed URIParser ────────── v0.4.0
 Installed StaticArrays ─────── v0.12.1
 Installed CMakeWrapper ─────── v0.2.3
 Installed FileIO ───────────── v1.1.0
 Installed Arpack ───────────── v0.3.1
 Installed StatsBase ────────── v0.32.0
 Installed JLD ──────────────── v0.9.1
 Installed StatsFuns ────────── v0.9.0
 Installed Parameters ───────── v0.12.0
 Installed Missings ─────────── v0.4.3
 Installed BinaryProvider ───── v0.5.8
 Installed ScikitLearnBase ──── v0.5.0
 Installed Distances ────────── v0.8.2
 Installed Rmath ────────────── v0.5.1
 Installed DataAPI ──────────── v1.1.0
 Installed Distributions ────── v0.21.9
 Installed SortingAlgorithms ── v0.3.1
 Installed OrderedCollections ─ v1.1.0
 Installed PDMats ───────────── v0.9.10
 Installed NearestNeighbors ─── v0.4.4
 Installed Clustering ───────── v0.13.3
  Updating `~/.julia/environments/v1.3/Project.toml`
  [cc18c42c] + GaussianMixtures v0.3.0
  Updating `~/.julia/environments/v1.3/Manifest.toml`
  [7d9fca2a] + Arpack v0.3.1
  [9e28174c] + BinDeps v0.8.10
  [b99e7846] + BinaryProvider v0.5.8
  [a74b3585] + Blosc v0.5.1
  [631607c0] + CMake v1.1.2
  [d5fb7624] + CMakeWrapper v0.2.3
  [aaaa29a8] + Clustering v0.13.3
  [34da2185] + Compat v2.2.0
  [9a962f9c] + DataAPI v1.1.0
  [864edb3b] + DataStructures v0.17.6
  [b4f34e82] + Distances v0.8.2
  [31c24e10] + Distributions v0.21.9
  [5789e2e9] + FileIO v1.1.0
  [cc18c42c] + GaussianMixtures v0.3.0
  [f67ccb44] + HDF5 v0.12.5
  [4138dd39] + JLD v0.9.1
  [1b4a561d] + LegacyStrings v0.4.1
  [e1d29d7a] + Missings v0.4.3
  [b8a86587] + NearestNeighbors v0.4.4
  [bac558e1] + OrderedCollections v1.1.0
  [90014a1f] + PDMats v0.9.10
  [d96e819e] + Parameters v0.12.0
  [1fd47b50] + QuadGK v2.1.1
  [79098fc4] + Rmath v0.5.1
  [6e75b9c4] + ScikitLearnBase v0.5.0
  [a2af1166] + SortingAlgorithms v0.3.1
  [276daf66] + SpecialFunctions v0.8.0
  [90137ffa] + StaticArrays v0.12.1
  [2913bbd2] + StatsBase v0.32.0
  [4c63d2b9] + StatsFuns v0.9.0
  [30578b45] + URIParser v0.4.0
  [2a0f44e3] + Base64 
  [ade2ca70] + Dates 
  [8bb1440f] + DelimitedFiles 
  [8ba89e20] + Distributed 
  [b77e0a4c] + InteractiveUtils 
  [76f85450] + LibGit2 
  [8f399da3] + Libdl 
  [37e2e46d] + LinearAlgebra 
  [56ddb016] + Logging 
  [d6f4376e] + Markdown 
  [a63ad114] + Mmap 
  [44cfe95a] + Pkg 
  [de0858da] + Printf 
  [9abbd945] + Profile 
  [3fa0cd96] + REPL 
  [9a3f8284] + Random 
  [ea8e919c] + SHA 
  [9e88b42a] + Serialization 
  [1a1011a3] + SharedArrays 
  [6462fe0b] + Sockets 
  [2f01184e] + SparseArrays 
  [10745b16] + Statistics 
  [4607b0f0] + SuiteSparse 
  [8dfed614] + Test 
  [cf7118a7] + UUIDs 
  [4ec0a83e] + Unicode 
  Building CMake ───────────→ `~/.julia/packages/CMake/nSK2r/deps/build.log`
  Building Blosc ───────────→ `~/.julia/packages/Blosc/lzFr0/deps/build.log`
  Building HDF5 ────────────→ `~/.julia/packages/HDF5/Zh9on/deps/build.log`
  Building SpecialFunctions → `~/.julia/packages/SpecialFunctions/ne2iw/deps/build.log`
  Building Arpack ──────────→ `~/.julia/packages/Arpack/cu5By/deps/build.log`
  Building Rmath ───────────→ `~/.julia/packages/Rmath/4wt82/deps/build.log`
   Testing GaussianMixtures
    Status `/tmp/jl_4FwmOW/Manifest.toml`
  [7d9fca2a] Arpack v0.3.1
  [9e28174c] BinDeps v0.8.10
  [b99e7846] BinaryProvider v0.5.8
  [a74b3585] Blosc v0.5.1
  [631607c0] CMake v1.1.2
  [d5fb7624] CMakeWrapper v0.2.3
  [aaaa29a8] Clustering v0.13.3
  [34da2185] Compat v2.2.0
  [9a962f9c] DataAPI v1.1.0
  [864edb3b] DataStructures v0.17.6
  [b4f34e82] Distances v0.8.2
  [31c24e10] Distributions v0.21.9
  [5789e2e9] FileIO v1.1.0
  [cc18c42c] GaussianMixtures v0.3.0
  [f67ccb44] HDF5 v0.12.5
  [4138dd39] JLD v0.9.1
  [1b4a561d] LegacyStrings v0.4.1
  [e1d29d7a] Missings v0.4.3
  [b8a86587] NearestNeighbors v0.4.4
  [bac558e1] OrderedCollections v1.1.0
  [90014a1f] PDMats v0.9.10
  [d96e819e] Parameters v0.12.0
  [1fd47b50] QuadGK v2.1.1
  [79098fc4] Rmath v0.5.1
  [6e75b9c4] ScikitLearnBase v0.5.0
  [a2af1166] SortingAlgorithms v0.3.1
  [276daf66] SpecialFunctions v0.8.0
  [90137ffa] StaticArrays v0.12.1
  [2913bbd2] StatsBase v0.32.0
  [4c63d2b9] StatsFuns v0.9.0
  [30578b45] URIParser v0.4.0
  [2a0f44e3] Base64  [`@stdlib/Base64`]
  [ade2ca70] Dates  [`@stdlib/Dates`]
  [8bb1440f] DelimitedFiles  [`@stdlib/DelimitedFiles`]
  [8ba89e20] Distributed  [`@stdlib/Distributed`]
  [b77e0a4c] InteractiveUtils  [`@stdlib/InteractiveUtils`]
  [76f85450] LibGit2  [`@stdlib/LibGit2`]
  [8f399da3] Libdl  [`@stdlib/Libdl`]
  [37e2e46d] LinearAlgebra  [`@stdlib/LinearAlgebra`]
  [56ddb016] Logging  [`@stdlib/Logging`]
  [d6f4376e] Markdown  [`@stdlib/Markdown`]
  [a63ad114] Mmap  [`@stdlib/Mmap`]
  [44cfe95a] Pkg  [`@stdlib/Pkg`]
  [de0858da] Printf  [`@stdlib/Printf`]
  [9abbd945] Profile  [`@stdlib/Profile`]
  [3fa0cd96] REPL  [`@stdlib/REPL`]
  [9a3f8284] Random  [`@stdlib/Random`]
  [ea8e919c] SHA  [`@stdlib/SHA`]
  [9e88b42a] Serialization  [`@stdlib/Serialization`]
  [1a1011a3] SharedArrays  [`@stdlib/SharedArrays`]
  [6462fe0b] Sockets  [`@stdlib/Sockets`]
  [2f01184e] SparseArrays  [`@stdlib/SparseArrays`]
  [10745b16] Statistics  [`@stdlib/Statistics`]
  [4607b0f0] SuiteSparse  [`@stdlib/SuiteSparse`]
  [8dfed614] Test  [`@stdlib/Test`]
  [cf7118a7] UUIDs  [`@stdlib/UUIDs`]
  [4ec0a83e] Unicode  [`@stdlib/Unicode`]
[ Info: Testing Data
(100000, -5.970719954888905e6, [99994.99999999997, 5.0000000000339755], [73.82271095773883 -305.70576188444784 -75.27582508717067; -10.194109054918508 14.270863806457285 -9.464526679160944], Array{Float64,2}[[99768.48326272935 909.617767462877 -236.20319685724775; 909.617767462877 99396.46789822711 -91.55082067146917; -236.20319685724775 -91.55082067146917 100553.89942937375], [25.904192640286105 -26.397757452243066 17.153498475987465; -26.397757452243066 42.30014826785995 -27.995817504143634; 17.153498475987465 -27.995817504143634 19.17860852065914]])
┌ Warning: rmprocs: process 1 not removed
└ @ Distributed /workspace/srcdir/julia/usr/share/julia/stdlib/v1.3/Distributed/src/cluster.jl:1015
[ Info: Initializing GMM, 8 Gaussians diag covariance 2 dimensions using 272 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.471549e+03
      1       9.764717e+02      -4.950776e+02 |        8
      2       9.166793e+02      -5.979231e+01 |        2
      3       9.148495e+02      -1.829846e+00 |        0
      4       9.148495e+02       0.000000e+00 |        0
K-means converged with 4 iterations (objv = 914.8494978374374)
┌ Info: K-means with 272 data points using 4 iterations
└ 11.3 data points per parameter
[ Info: Running 0 iterations EM on full cov GMM with 8 Gaussians in 2 dimensions
┌ Info: EM with 272 data points 0 iterations avll -2.076381
└ 5.8 data points per parameter
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:221
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:221
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:221
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:221
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex at broadcast.jl:630 [inlined]
└ @ Core ./broadcast.jl:630
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = lowerbound(::VGMM{Float64}, ::Array{Float64,1}, ::Array{Float64,2}, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64) at bayes.jl:230
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/bayes.jl:230
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex at broadcast.jl:630 [inlined]
└ @ Core ./broadcast.jl:630
┌ Warning: `lgamma(x::Real)` is deprecated, use `(logabsgamma(x))[1]` instead.
│   caller = _broadcast_getindex_evalf at broadcast.jl:630 [inlined]
└ @ Core ./broadcast.jl:630
[ Info: iteration 1, lowerbound -3.775676
[ Info: iteration 2, lowerbound -3.604697
[ Info: iteration 3, lowerbound -3.416887
[ Info: iteration 4, lowerbound -3.214269
[ Info: iteration 5, lowerbound -3.035162
[ Info: dropping number of Gaussians to 7
[ Info: iteration 6, lowerbound -2.908865
[ Info: iteration 7, lowerbound -2.840379
[ Info: dropping number of Gaussians to 5
[ Info: iteration 8, lowerbound -2.809155
[ Info: dropping number of Gaussians to 4
[ Info: iteration 9, lowerbound -2.790394
[ Info: dropping number of Gaussians to 3
[ Info: iteration 10, lowerbound -2.780716
[ Info: iteration 11, lowerbound -2.773586
[ Info: iteration 12, lowerbound -2.768563
[ Info: iteration 13, lowerbound -2.761301
[ Info: iteration 14, lowerbound -2.750919
[ Info: iteration 15, lowerbound -2.736395
[ Info: iteration 16, lowerbound -2.716696
[ Info: iteration 17, lowerbound -2.691011
[ Info: iteration 18, lowerbound -2.659049
[ Info: iteration 19, lowerbound -2.621316
[ Info: iteration 20, lowerbound -2.579256
[ Info: iteration 21, lowerbound -2.535166
[ Info: iteration 22, lowerbound -2.491713
[ Info: iteration 23, lowerbound -2.451058
[ Info: iteration 24, lowerbound -2.414083
[ Info: iteration 25, lowerbound -2.380453
[ Info: iteration 26, lowerbound -2.349961
[ Info: iteration 27, lowerbound -2.324738
[ Info: iteration 28, lowerbound -2.309763
[ Info: iteration 29, lowerbound -2.308474
[ Info: dropping number of Gaussians to 2
[ Info: iteration 30, lowerbound -2.302914
[ Info: iteration 31, lowerbound -2.299258
[ Info: iteration 32, lowerbound -2.299255
[ Info: iteration 33, lowerbound -2.299254
[ Info: iteration 34, lowerbound -2.299254
[ Info: iteration 35, lowerbound -2.299253
[ Info: iteration 36, lowerbound -2.299253
[ Info: iteration 37, lowerbound -2.299253
[ Info: iteration 38, lowerbound -2.299253
[ Info: iteration 39, lowerbound -2.299253
[ Info: iteration 40, lowerbound -2.299253
[ Info: iteration 41, lowerbound -2.299253
[ Info: iteration 42, lowerbound -2.299253
[ Info: iteration 43, lowerbound -2.299253
[ Info: iteration 44, lowerbound -2.299253
[ Info: iteration 45, lowerbound -2.299253
[ Info: iteration 46, lowerbound -2.299253
[ Info: iteration 47, lowerbound -2.299253
[ Info: iteration 48, lowerbound -2.299253
[ Info: iteration 49, lowerbound -2.299253
[ Info: iteration 50, lowerbound -2.299253
[ Info: 50 variational Bayes EM-like iterations using 272 data points, final lowerbound -2.299253
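For orientation, the variational-Bayes run logged above (k-means initialization of an 8-Gaussian diagonal-covariance GMM, conversion to a variational GMM, then 50 VB iterations that prune components) corresponds roughly to the following GaussianMixtures.jl calls. This is a sketch based on the package README; the prior hyperparameters `0.1` and `1.0` and the random data are illustrative assumptions, not the values used by this test:

```julia
using GaussianMixtures, Random

Random.seed!(1)
x = randn(272, 2)                          # stand-in for the 272×2 test data

g = GMM(8, x; kind=:diag, method=:kmeans)  # k-means init + EM, as in the log
prior = GMMprior(g.d, 0.1, 1.0)            # conjugate prior; hyperparameters assumed
v = VGMM(g, prior)                         # "GMM converted to Variational GMM"
em!(v, x; nIter=50)                        # 50 VB EM-like iterations; may drop Gaussians
```

If the constructor keywords differ in your version of the package, consult its README; the pruning of Gaussians during VB training is the behavior the "dropping number of Gaussians" lines above report.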
History[Tue Dec  3 04:04:56 2019: Initializing GMM, 8 Gaussians diag covariance 2 dimensions using 272 data points
, Tue Dec  3 04:05:05 2019: K-means with 272 data points using 4 iterations
11.3 data points per parameter
, Tue Dec  3 04:05:07 2019: EM with 272 data points 0 iterations avll -2.076381
5.8 data points per parameter
, Tue Dec  3 04:05:08 2019: GMM converted to Variational GMM
, Tue Dec  3 04:05:17 2019: iteration 1, lowerbound -3.775676
, Tue Dec  3 04:05:17 2019: iteration 2, lowerbound -3.604697
, Tue Dec  3 04:05:17 2019: iteration 3, lowerbound -3.416887
, Tue Dec  3 04:05:17 2019: iteration 4, lowerbound -3.214269
, Tue Dec  3 04:05:17 2019: iteration 5, lowerbound -3.035162
, Tue Dec  3 04:05:18 2019: dropping number of Gaussians to 7
, Tue Dec  3 04:05:18 2019: iteration 6, lowerbound -2.908865
, Tue Dec  3 04:05:18 2019: iteration 7, lowerbound -2.840379
, Tue Dec  3 04:05:18 2019: dropping number of Gaussians to 5
, Tue Dec  3 04:05:18 2019: iteration 8, lowerbound -2.809155
, Tue Dec  3 04:05:18 2019: dropping number of Gaussians to 4
, Tue Dec  3 04:05:18 2019: iteration 9, lowerbound -2.790394
, Tue Dec  3 04:05:18 2019: dropping number of Gaussians to 3
, Tue Dec  3 04:05:18 2019: iteration 10, lowerbound -2.780716
, Tue Dec  3 04:05:18 2019: iteration 11, lowerbound -2.773586
, Tue Dec  3 04:05:18 2019: iteration 12, lowerbound -2.768563
, Tue Dec  3 04:05:18 2019: iteration 13, lowerbound -2.761301
, Tue Dec  3 04:05:18 2019: iteration 14, lowerbound -2.750919
, Tue Dec  3 04:05:18 2019: iteration 15, lowerbound -2.736395
, Tue Dec  3 04:05:18 2019: iteration 16, lowerbound -2.716696
, Tue Dec  3 04:05:18 2019: iteration 17, lowerbound -2.691011
, Tue Dec  3 04:05:18 2019: iteration 18, lowerbound -2.659049
, Tue Dec  3 04:05:18 2019: iteration 19, lowerbound -2.621316
, Tue Dec  3 04:05:18 2019: iteration 20, lowerbound -2.579256
, Tue Dec  3 04:05:18 2019: iteration 21, lowerbound -2.535166
, Tue Dec  3 04:05:18 2019: iteration 22, lowerbound -2.491713
, Tue Dec  3 04:05:18 2019: iteration 23, lowerbound -2.451058
, Tue Dec  3 04:05:18 2019: iteration 24, lowerbound -2.414083
, Tue Dec  3 04:05:18 2019: iteration 25, lowerbound -2.380453
, Tue Dec  3 04:05:18 2019: iteration 26, lowerbound -2.349961
, Tue Dec  3 04:05:18 2019: iteration 27, lowerbound -2.324738
, Tue Dec  3 04:05:18 2019: iteration 28, lowerbound -2.309763
, Tue Dec  3 04:05:18 2019: iteration 29, lowerbound -2.308474
, Tue Dec  3 04:05:18 2019: dropping number of Gaussians to 2
, Tue Dec  3 04:05:18 2019: iteration 30, lowerbound -2.302914
, Tue Dec  3 04:05:18 2019: iteration 31, lowerbound -2.299258
, Tue Dec  3 04:05:18 2019: iteration 32, lowerbound -2.299255
, Tue Dec  3 04:05:18 2019: iteration 33, lowerbound -2.299254
, Tue Dec  3 04:05:18 2019: iteration 34, lowerbound -2.299254
, Tue Dec  3 04:05:18 2019: iteration 35, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 36, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 37, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 38, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 39, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 40, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 41, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 42, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 43, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 44, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 45, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 46, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 47, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 48, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 49, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: iteration 50, lowerbound -2.299253
, Tue Dec  3 04:05:18 2019: 50 variational Bayes EM-like iterations using 272 data points, final lowerbound -2.299253
]
α = [178.04509222738287, 95.95490777261709]
β = [178.04509222738287, 95.95490777261709]
m = [4.250300733258808 79.28686694419856; 2.0002292577638707 53.85198717240141]
ν = [180.04509222738287, 97.95490777261709]
W = LinearAlgebra.UpperTriangular{Float64,Array{Float64,2}}[[0.1840415554733207 -0.007644049042473372; 0.0 0.008581705166128297], [0.3758763612139578 -0.008953123827573172; 0.0 0.012748664777467464]]
Kind: diag, size 256
nx: 100000 sum(zeroth order stats): 99999.99999999994
avll from stats: -0.9844978570835627
avll from llpg:  -0.9844978570835633
avll direct:     -0.9844978570835632
sum posterior: 100000.0
Kind: full, size 16
nx: 100000 sum(zeroth order stats): 100000.00000000001
avll from stats: -0.9919551691976011
avll from llpg:  -0.9919551691976009
avll direct:     -0.9919551691976009
sum posterior: 100000.0
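The sanity checks above (the same average log likelihood computed from sufficient statistics, from per-Gaussian log likelihoods, and directly, plus posteriors summing to the number of data points) can be reproduced along these lines. This is a sketch using function names from the GaussianMixtures.jl README (`gmmposterior`, `avll`); the data here is an illustrative stand-in, not the test's 100000-point set:

```julia
using GaussianMixtures

x = randn(1000, 2)             # illustrative data, not the test's 100000×26 set
g = GMM(4, x; kind=:diag)

p, ll = gmmposterior(g, x)     # per-point posteriors and per-Gaussian log likelihoods
@assert isapprox(sum(p), size(x, 1); atol=1e-6)   # "sum posterior" equals nx
println(avll(g, x))            # average log likelihood per data point per dimension
```

The agreement of the three `avll` figures to ~15 significant digits in the log is exactly this kind of consistency check.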
32×26 Array{Float64,2}:
  0.0401681   -0.142157    -0.165244     0.110308     0.0336544    0.0858292    0.0516284   -0.132939    -0.10643      0.0957587    0.0300196   -0.096476     0.0368905   -0.0186694    0.0111516   -0.104486    -0.00366264   0.00563595   0.073871     -0.102605     0.124157    -0.0912458    0.13064     -0.00173078  -0.0736968    -0.0438177  
 -0.00155536  -0.136659     0.125738     0.0251005   -0.0644274    0.0235155   -0.00342454  -0.0290318    0.0609071   -0.00726957   0.032705    -0.0802528    0.0241323    0.0625239    0.193832     0.0790377   -0.0522682   -0.0488984   -0.0820275     0.0444124   -0.0398465    0.0606131    0.129483    -0.309875    -0.0204381     0.145774   
  0.149757    -0.0989619    0.0637248    0.169114    -0.0685978    0.0489583    0.00655248  -0.0483229   -0.0854892   -0.0890764    0.00591515  -0.124071     0.0157091    0.0725566   -0.104082     0.135833    -0.207479     0.00835343  -0.0614676     0.180163    -0.0375059    0.0529017    0.141842     0.0986753    0.0314435     0.171136   
 -0.193124     0.086746     0.0335739   -0.0208062    0.0143121   -0.133093     0.00620721  -0.0880891   -0.093029     0.0394623    0.0612095    0.131731     0.0284315    0.00829289   0.0129573    0.0518173   -0.0492147   -0.231818     0.0103111     0.0356251    0.0216162    0.0351488   -0.0685031   -0.106798     0.100634     -0.12137    
 -0.180799    -0.0144132    0.0528651   -0.0792735   -0.0473698    0.0714514    0.0106879    0.143615     0.0297407    0.057001    -0.0434224   -0.0582095    0.0767828    0.007287    -0.115606    -0.0680456   -0.24463     -0.0602764   -0.107219     -0.152171     0.0606618    0.049658     0.0285018   -0.0563478   -0.0579864     0.20082    
  0.076334     0.111343     0.0591105   -0.128116     0.043725     0.0394147   -0.0192577    0.00331028  -0.0445828   -0.192729    -0.0560842   -0.037575     0.0783753    0.00476516  -0.1806       0.073984    -0.049358    -0.0499001   -0.13062       0.163354    -0.00176797   0.140069    -0.036766     0.0993881   -0.0857011     0.00143131 
  0.049198    -0.306672     0.0362058   -0.0242767    0.154805    -0.104024     0.0530583    0.0473367   -0.0230647   -0.0881734   -0.0661527   -0.103897     0.0467438    0.0814999   -0.0205856   -0.0810996   -0.278306     0.0281774   -0.0156972     0.0334241   -0.0733897   -0.127056    -0.0969258    0.126754    -0.0152641     0.0594101  
 -0.0188069    0.022173     0.0465184    0.0890395    0.161061     0.194651     0.0245392   -0.0734321    0.0765056    0.00286347   0.176604    -0.0194705   -0.0701075    0.105142    -0.0791648    0.0743367   -0.0287762    0.115777    -0.0821075    -0.0114703   -0.0192565    0.176245    -0.0995678   -0.120871    -0.0215642    -0.0166174  
  0.0149268   -0.115267    -0.0208115   -0.0142627    0.026889    -0.113055     0.0318324   -0.210897    -0.0711093   -0.0468605    0.119675    -0.0808727   -0.0571923   -0.056484     0.161542    -0.021661     0.00476568   0.106886    -0.236815     -0.0536965   -0.157061    -0.137622     0.116631     0.0570489    0.0909001    -0.0430312  
  0.0610778   -0.0672422   -0.0321444    0.0307094   -0.198301     0.0604722    0.034715     0.153479     0.14869      0.0280918   -0.0313619   -0.0677866    0.0875641   -0.0417956   -0.114881     0.00374717   0.0269687   -0.0800305   -0.13589       0.0526186   -0.128769     0.137253     0.0546663   -0.179674     0.0796905     0.0442193  
 -0.0981668    0.0761335    0.100227    -0.130627    -0.147232    -0.076812    -0.0751781    0.138528     0.226999    -0.0114285    0.00714095  -0.0814914   -0.0621135   -0.0129483    0.0140155   -0.0488984   -0.00734185  -0.034982     0.0441878     0.00706813  -0.0365539   -0.095939     0.0148132   -0.0759732   -0.183063     -0.0173053  
  0.110644    -0.119386    -0.0805614   -0.0110052    0.00530001   0.0576802   -0.0535239   -0.0119327    0.0127084   -0.0423788   -0.0662038   -0.066121     0.04039      0.170221     0.0154803   -0.183334    -0.105435    -0.10032      0.0757125    -0.0173328    0.0406544    0.0378534    0.0343031   -0.0979138   -0.000381235  -0.0350256  
  0.136902     0.0611273    0.0183151   -0.193047     0.0741262    0.055589    -0.0673299   -0.0503871    0.0798437    0.0547707    0.00825121  -0.00161368  -0.0508834    0.129321     0.185121     0.0449175    0.132015    -0.0209892    0.204803      0.105227    -0.0796918    0.131733    -0.0651504    0.104768    -0.0889521    -0.0545097  
 -0.144391    -0.155335    -0.120013     0.0180999    0.0911079   -0.187104     0.0618932    0.103122    -0.109958    -0.0372017    0.182618    -0.0626262   -0.069423    -0.0400204   -0.0096421   -0.0484194   -0.0945462    0.0241143    0.0706444     0.0359817   -0.0917025   -0.14481      0.120936    -0.0398213    0.229484      0.043625   
 -0.00158379   0.0181997    0.0744811   -0.0536644   -0.0622989   -0.0128139    0.103268     0.0226275   -0.132957     0.0186092   -0.224431    -0.0178378   -0.0318412   -0.0409764   -0.0491595    0.100444     0.0648687    0.261142    -0.00439331    0.0166999   -0.182954    -0.0803408    0.121993     0.11387     -0.115091     -0.0205566  
 -0.0251248   -0.185483     0.0500468   -0.212751    -0.0333486    0.15344     -0.129092    -0.073943     0.0105694    0.0695281    0.0515419    0.0413279   -0.0245326   -0.147095     0.0991307   -0.0815832    0.0439166    0.0264408    0.000747163   0.270339     0.0883922    0.00580615   0.0952695    0.131369    -0.011262      0.049575   
  0.136706     0.099552    -0.0958756   -0.17552     -0.0927527   -0.0418364    0.0124186   -0.00758594   0.25915     -0.0467911   -0.107454    -0.238943     0.00951154  -0.00606447   0.0185125    0.047366     0.101493    -0.120967     0.122011     -0.0335992   -0.0955589    0.0533677   -0.00947546   0.145488    -0.202309     -0.0888935  
  0.0260712    0.148846    -0.0688362    0.0277725    0.204451     0.163552     0.130827     0.0880678    0.135398    -0.0376501   -0.135495    -0.199971     0.155273     0.0404535    0.104477     0.086286    -0.0473718    0.0246413    0.0174585     0.141186     0.0238227   -0.0366976   -0.144304     0.0245568   -0.00134947   -0.0578128  
  0.0390766    0.177545     0.00691069   0.0609433   -0.00143553   0.114084    -0.118306     0.0514687    0.00332628  -0.176566    -0.0745408    0.0892653    0.207189     0.106873     0.00664001  -0.174612    -0.0839131    0.0625777    0.00320227    0.0476375    0.0180044    0.0253405   -0.172598     0.107765     0.0612276    -0.051078   
  0.0882217   -0.0586789    0.094771    -0.0597275    0.0701274   -0.00095389  -0.142243     0.0352959    0.109665    -0.0658925    0.145287     0.167986     0.00151717   0.201214     0.0263995    0.0824008   -0.0190125   -0.0259461   -0.046218      0.0766248    0.0363755   -0.238262    -0.00379776   0.0245633    0.0830537    -0.0670119  
  0.29645     -0.0726284   -0.0446135    0.046203     0.0539788   -0.0628599   -0.16386     -0.132033    -0.00119498   0.0951066   -0.102786    -0.0314005   -0.0578883   -0.0486272    0.00765634  -0.0440233    0.0339534    0.0321591   -0.0371919     0.0495886   -0.100357     0.0341203    0.0243681   -0.0680407    0.134315     -0.0173043  
  0.0101451    0.164649    -0.0398943   -0.00312673  -0.177279    -0.0627119   -0.00419405   0.0347572   -0.0568637    0.181995    -0.0985063   -0.109149     0.032635     0.0286685    0.100798     0.001616     0.110164    -0.0867253   -0.0488685    -0.00414336  -0.0402632    0.180006    -0.0299768    0.0116772    0.0538116     0.084745   
 -0.00969742  -0.154143    -0.0536923    0.100437    -0.042879    -0.0732417    0.0987409    0.0514522   -0.153077     0.104255     0.0515144   -0.174051     0.17172     -0.0205082    0.103056     0.124292     0.1006      -0.0637551    0.123607     -0.11237     -0.0626607    0.10095     -0.0254954   -0.156713    -0.128191      0.188311   
 -0.234241    -0.0303729   -0.11807      0.114027     0.10051     -0.0106531    0.0772728    0.110375     0.0366831   -0.0311529    0.0777938   -0.0670209    0.17209     -0.0551384    0.0725076    0.0722786   -0.0705286   -0.113823    -0.0985188     0.101736     0.0902259   -0.246227    -0.0808345    0.101568     0.0849896    -0.168123   
  0.0439658    0.00632998  -0.167129    -0.0727304    0.00959414   0.0748401    0.0660813   -0.0118195   -0.139012    -0.157734    -0.192161    -0.0374417    0.0958423    0.0145454   -0.05279     -0.0105496    0.171546     0.0146169   -0.0726966     0.0260507   -0.0928631   -0.0443981    0.0115945   -0.124303    -0.137955     -0.000627315
  0.121815    -0.1806      -0.0256023   -0.191861    -0.314095    -0.0440446    0.0244324    0.00801728   0.0098476   -0.0266724   -0.0944872    0.0492038   -0.0684004    0.111881     0.066162     0.0654148    0.01767      0.0205153   -0.0134129     0.0164789   -0.186055     0.0247111   -0.0778523   -0.259268    -0.00274108    0.195974   
  0.277199    -0.114951     0.0243181    0.0698104    0.0871288    0.0763358   -0.0544468    0.194922     0.028894    -0.0263337   -0.0456215   -0.153068     0.120419     0.0494176    0.0977727   -0.00615099   0.0152128    0.172743     0.0697227     0.0451108    0.0960361    0.0345536    0.047346     0.0281117   -0.00863617    0.196761   
  0.0255344    0.0866191    0.0639781    0.11314     -0.0712622   -0.043278    -0.191418    -0.0857547   -0.0565558   -0.0375944   -0.00648278   0.0463083   -0.0586902   -0.142965    -0.0938569   -0.131738     0.106256     0.00694993   0.0436041     0.0229313   -0.0296121    0.0126414   -0.0588736   -0.252874     0.299177      0.0347029  
 -0.121361    -0.0395638   -0.0192214    0.0850204   -0.149764     0.0523481    0.0357082    0.10621      0.0257468    0.0934016    0.0092514   -0.1699      -0.032999    -0.0510472   -0.0732543   -0.0837233    0.116746    -0.0704393   -0.00681486   -0.0509841    0.0414033    0.0407719   -0.0119199    0.0254369    0.0362223     0.0837653  
 -0.163448     0.0245663   -0.123994    -0.0844763    0.0146593    0.0519263   -0.122198    -0.0477849    0.0134166    0.172378     0.208423    -0.194274     0.0987968   -0.0213956   -0.0596304   -0.145111    -0.00468066  -0.0957186    0.0158702     0.121074     0.0926532    0.0230698   -0.0347769   -0.0006848    0.108911      0.194039   
 -0.16031      0.0122995   -0.0176206    0.0482598   -0.0747356   -0.210418     0.118941     0.0863376    0.272807     0.102867    -0.047248     0.0407097    0.0369108   -0.016381     0.0157019   -0.0575999   -0.0241631    0.0177212    0.0976913    -0.166516     0.0900719    0.00932491  -0.0106109   -0.0640423    0.0978533     0.131167   
  0.245321     0.0998751   -0.0680446    0.0381963    0.00633168  -0.28785     -0.0148128   -0.0665932   -0.0166107    0.0782369    0.0862163   -0.0167719    0.0503212   -0.229174    -0.0503592   -0.0133253    0.0998986   -0.0137927    0.0918432    -0.126407     0.105432    -0.092363     0.103592     0.0419802   -0.119566     -0.0283992
kind diag, method split
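The `kind diag, method split` runs that follow (2, 4, 8, then 16 Gaussians, each trained with 50 EM iterations) correspond to binary-split training, where the number of Gaussians is doubled by splitting and each stage is re-trained with EM. A rough sketch per the GaussianMixtures.jl README (keyword defaults here are assumptions):

```julia
using GaussianMixtures

x = randn(100_000, 26)   # stand-in for the 100000×26 test data

# Grow 1 → 2 → 4 → 8 → 16 Gaussians by splitting, with 50 EM iterations per stage.
g = GMM(16, x; kind=:diag, method=:split, nIter=50)
```

The "Variances had to be floored" warnings later in the log are the package guarding against components collapsing onto too few points during these splits.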
┌ Info: 0: avll = 
└   tll[1] = -1.4282238221885204
[ Info: Running 50 iterations EM on diag cov GMM with 2 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.428307
[ Info: iteration 2, average log likelihood -1.428228
[ Info: iteration 3, average log likelihood -1.427595
[ Info: iteration 4, average log likelihood -1.420081
[ Info: iteration 5, average log likelihood -1.401128
[ Info: iteration 6, average log likelihood -1.393019
[ Info: iteration 7, average log likelihood -1.391440
[ Info: iteration 8, average log likelihood -1.390459
[ Info: iteration 9, average log likelihood -1.389784
[ Info: iteration 10, average log likelihood -1.389404
[ Info: iteration 11, average log likelihood -1.389196
[ Info: iteration 12, average log likelihood -1.389080
[ Info: iteration 13, average log likelihood -1.389012
[ Info: iteration 14, average log likelihood -1.388971
[ Info: iteration 15, average log likelihood -1.388944
[ Info: iteration 16, average log likelihood -1.388926
[ Info: iteration 17, average log likelihood -1.388914
[ Info: iteration 18, average log likelihood -1.388906
[ Info: iteration 19, average log likelihood -1.388900
[ Info: iteration 20, average log likelihood -1.388896
[ Info: iteration 21, average log likelihood -1.388893
[ Info: iteration 22, average log likelihood -1.388891
[ Info: iteration 23, average log likelihood -1.388890
[ Info: iteration 24, average log likelihood -1.388889
[ Info: iteration 25, average log likelihood -1.388888
[ Info: iteration 26, average log likelihood -1.388887
[ Info: iteration 27, average log likelihood -1.388887
[ Info: iteration 28, average log likelihood -1.388887
[ Info: iteration 29, average log likelihood -1.388886
[ Info: iteration 30, average log likelihood -1.388886
[ Info: iteration 31, average log likelihood -1.388886
[ Info: iteration 32, average log likelihood -1.388886
[ Info: iteration 33, average log likelihood -1.388886
[ Info: iteration 34, average log likelihood -1.388886
[ Info: iteration 35, average log likelihood -1.388886
[ Info: iteration 36, average log likelihood -1.388886
[ Info: iteration 37, average log likelihood -1.388886
[ Info: iteration 38, average log likelihood -1.388886
[ Info: iteration 39, average log likelihood -1.388886
[ Info: iteration 40, average log likelihood -1.388886
[ Info: iteration 41, average log likelihood -1.388886
[ Info: iteration 42, average log likelihood -1.388886
[ Info: iteration 43, average log likelihood -1.388886
[ Info: iteration 44, average log likelihood -1.388886
[ Info: iteration 45, average log likelihood -1.388886
[ Info: iteration 46, average log likelihood -1.388886
[ Info: iteration 47, average log likelihood -1.388886
[ Info: iteration 48, average log likelihood -1.388886
[ Info: iteration 49, average log likelihood -1.388886
[ Info: iteration 50, average log likelihood -1.388886
┌ Info: EM with 100000 data points 50 iterations avll -1.388886
└ 952.4 data points per parameter
┌ Info: 1: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.428307250421799 
│     -1.428227900281631 
│      ⋮                 
└     -1.3888859022471014
[ Info: Running 50 iterations EM on diag cov GMM with 4 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.389011
[ Info: iteration 2, average log likelihood -1.388913
[ Info: iteration 3, average log likelihood -1.388714
[ Info: iteration 4, average log likelihood -1.386834
[ Info: iteration 5, average log likelihood -1.376960
[ Info: iteration 6, average log likelihood -1.362444
[ Info: iteration 7, average log likelihood -1.355247
[ Info: iteration 8, average log likelihood -1.351007
[ Info: iteration 9, average log likelihood -1.348022
[ Info: iteration 10, average log likelihood -1.346361
[ Info: iteration 11, average log likelihood -1.345519
[ Info: iteration 12, average log likelihood -1.345085
[ Info: iteration 13, average log likelihood -1.344838
[ Info: iteration 14, average log likelihood -1.344682
[ Info: iteration 15, average log likelihood -1.344578
[ Info: iteration 16, average log likelihood -1.344506
[ Info: iteration 17, average log likelihood -1.344453
[ Info: iteration 18, average log likelihood -1.344414
[ Info: iteration 19, average log likelihood -1.344385
[ Info: iteration 20, average log likelihood -1.344362
[ Info: iteration 21, average log likelihood -1.344344
[ Info: iteration 22, average log likelihood -1.344331
[ Info: iteration 23, average log likelihood -1.344320
[ Info: iteration 24, average log likelihood -1.344313
[ Info: iteration 25, average log likelihood -1.344307
[ Info: iteration 26, average log likelihood -1.344302
[ Info: iteration 27, average log likelihood -1.344298
[ Info: iteration 28, average log likelihood -1.344296
[ Info: iteration 29, average log likelihood -1.344294
[ Info: iteration 30, average log likelihood -1.344292
[ Info: iteration 31, average log likelihood -1.344290
[ Info: iteration 32, average log likelihood -1.344289
[ Info: iteration 33, average log likelihood -1.344288
[ Info: iteration 34, average log likelihood -1.344287
[ Info: iteration 35, average log likelihood -1.344287
[ Info: iteration 36, average log likelihood -1.344286
[ Info: iteration 37, average log likelihood -1.344285
[ Info: iteration 38, average log likelihood -1.344285
[ Info: iteration 39, average log likelihood -1.344284
[ Info: iteration 40, average log likelihood -1.344284
[ Info: iteration 41, average log likelihood -1.344284
[ Info: iteration 42, average log likelihood -1.344283
[ Info: iteration 43, average log likelihood -1.344283
[ Info: iteration 44, average log likelihood -1.344283
[ Info: iteration 45, average log likelihood -1.344282
[ Info: iteration 46, average log likelihood -1.344282
[ Info: iteration 47, average log likelihood -1.344282
[ Info: iteration 48, average log likelihood -1.344282
[ Info: iteration 49, average log likelihood -1.344281
[ Info: iteration 50, average log likelihood -1.344281
┌ Info: EM with 100000 data points 50 iterations avll -1.344281
└ 473.9 data points per parameter
┌ Info: 2: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.3890114154843485
│     -1.3889128289743886
│      ⋮                 
└     -1.3442811999159112
[ Info: Running 50 iterations EM on diag cov GMM with 8 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.344447
[ Info: iteration 2, average log likelihood -1.344262
[ Info: iteration 3, average log likelihood -1.343606
[ Info: iteration 4, average log likelihood -1.337651
[ Info: iteration 5, average log likelihood -1.321277
[ Info: iteration 6, average log likelihood -1.310307
[ Info: iteration 7, average log likelihood -1.305821
[ Info: iteration 8, average log likelihood -1.303241
[ Info: iteration 9, average log likelihood -1.301237
[ Info: iteration 10, average log likelihood -1.299604
[ Info: iteration 11, average log likelihood -1.298222
[ Info: iteration 12, average log likelihood -1.297000
[ Info: iteration 13, average log likelihood -1.295909
[ Info: iteration 14, average log likelihood -1.294959
[ Info: iteration 15, average log likelihood -1.294166
[ Info: iteration 16, average log likelihood -1.293543
[ Info: iteration 17, average log likelihood -1.293081
[ Info: iteration 18, average log likelihood -1.292744
[ Info: iteration 19, average log likelihood -1.292482
[ Info: iteration 20, average log likelihood -1.292250
[ Info: iteration 21, average log likelihood -1.292009
[ Info: iteration 22, average log likelihood -1.291712
[ Info: iteration 23, average log likelihood -1.291307
[ Info: iteration 24, average log likelihood -1.290761
[ Info: iteration 25, average log likelihood -1.290152
[ Info: iteration 26, average log likelihood -1.289625
[ Info: iteration 27, average log likelihood -1.289284
[ Info: iteration 28, average log likelihood -1.289103
[ Info: iteration 29, average log likelihood -1.289021
[ Info: iteration 30, average log likelihood -1.288983
[ Info: iteration 31, average log likelihood -1.288965
[ Info: iteration 32, average log likelihood -1.288954
[ Info: iteration 33, average log likelihood -1.288946
[ Info: iteration 34, average log likelihood -1.288941
[ Info: iteration 35, average log likelihood -1.288937
[ Info: iteration 36, average log likelihood -1.288934
[ Info: iteration 37, average log likelihood -1.288931
[ Info: iteration 38, average log likelihood -1.288928
[ Info: iteration 39, average log likelihood -1.288926
[ Info: iteration 40, average log likelihood -1.288924
[ Info: iteration 41, average log likelihood -1.288922
[ Info: iteration 42, average log likelihood -1.288920
[ Info: iteration 43, average log likelihood -1.288919
[ Info: iteration 44, average log likelihood -1.288917
[ Info: iteration 45, average log likelihood -1.288916
[ Info: iteration 46, average log likelihood -1.288914
[ Info: iteration 47, average log likelihood -1.288913
[ Info: iteration 48, average log likelihood -1.288911
[ Info: iteration 49, average log likelihood -1.288910
[ Info: iteration 50, average log likelihood -1.288908
┌ Info: EM with 100000 data points 50 iterations avll -1.288908
└ 236.4 data points per parameter
┌ Info: 3: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.3444470609855683
│     -1.3442620649624728
│      ⋮                 
└     -1.2889079017490883
[ Info: Running 50 iterations EM on diag cov GMM with 16 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.289101
[ Info: iteration 2, average log likelihood -1.288866
[ Info: iteration 3, average log likelihood -1.287825
[ Info: iteration 4, average log likelihood -1.277055
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
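The "Variances had to be floored" warnings arise when a Gaussian collapses onto too few points during the M-step, driving some variances toward zero; the trainer clamps them to a minimum value and reports the indices of the affected components. A minimal illustrative sketch of the idea (not the package's actual implementation at train.jl:255):

```julia
# Illustrative variance flooring (hypothetical helper, not the
# package's code): clamp each component's variances to a minimum
# and report which component rows were affected, as in the warning.
function floor_variances!(Σ::Matrix{Float64}, vfloor::Float64)
    ind = findall(i -> any(Σ[i, :] .< vfloor), 1:size(Σ, 1))
    Σ[Σ .< vfloor] .= vfloor
    isempty(ind) || @warn "Variances had to be floored " ind
    return ind
end
```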
[ Info: iteration 5, average log likelihood -1.244432
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.225582
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.215642
[ Info: iteration 8, average log likelihood -1.221535
[ Info: iteration 9, average log likelihood -1.208152
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      4
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.191178
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     7
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.205001
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.211748
[ Info: iteration 13, average log likelihood -1.213887
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.201262
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.203337
[ Info: iteration 16, average log likelihood -1.205237
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.190254
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.196012
[ Info: iteration 19, average log likelihood -1.202839
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.190991
[ Info: iteration 21, average log likelihood -1.188769
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      3
│      4
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.176581
[ Info: iteration 23, average log likelihood -1.203476
[ Info: iteration 24, average log likelihood -1.198892
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -1.184246
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.183296
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.189101
[ Info: iteration 28, average log likelihood -1.202245
[ Info: iteration 29, average log likelihood -1.192781
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      4
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.178832
[ Info: iteration 31, average log likelihood -1.195762
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.187964
[ Info: iteration 33, average log likelihood -1.196182
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.187463
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.191225
[ Info: iteration 36, average log likelihood -1.194636
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.181946
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.190863
[ Info: iteration 39, average log likelihood -1.199803
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.190165
[ Info: iteration 41, average log likelihood -1.188619
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      3
│      4
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.176657
[ Info: iteration 43, average log likelihood -1.203220
[ Info: iteration 44, average log likelihood -1.198726
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.184175
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -1.183346
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     3
│     4
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.188948
[ Info: iteration 48, average log likelihood -1.202138
[ Info: iteration 49, average log likelihood -1.192736
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      4
│     13
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -1.178874
┌ Info: EM with 100000 data points 50 iterations avll -1.178874
└ 118.1 data points per parameter
┌ Info: 4
│   avll =
│    50-element Array{Float64,1}:
│     -1.2891006464443098
│     -1.288866068883419 
│      ⋮                 
└     -1.1788742441728242
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.195967
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│     5
│     6
│     7
│     8
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.187842
[ Info: iteration 3, average log likelihood -1.187887
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      5
│      6
│      7
│      8
│     25
│     26
│     27
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.165609
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     6
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.142508
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      4
│      5
│      7
│      8
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.099503
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      6
│     13
│     14
│     26
│     27
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.104767
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      5
│      7
│      8
│     25
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.109059
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      4
│      6
│     16
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.103648
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      5
│      7
│      8
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.112288
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      6
│     13
│     25
│     27
│     28
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.091299
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      4
│      5
│      7
│      8
│     14
│     26
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.098236
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      6
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.117630
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      5
│      7
│      8
│     13
│     16
│     25
│     26
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.074097
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      1
│      4
│      6
│     14
│     27
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.095216
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│     5
│     7
│     8
│     9
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.110390
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      6
│     13
│     25
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.087988
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      4
│      5
│      7
│      8
│     16
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.092493
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      1
│      6
│     14
│     26
│     28
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 19, average log likelihood -1.109142
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      5
│      7
│      8
│     25
│     27
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.098837
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      4
│      6
│     13
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.090343
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      5
│      7
│      8
│      9
│     14
│     16
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.091173
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      4
│      6
│     25
│     26
│     27
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.099100
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     5
│     7
│     8
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.115103
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     13
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -1.096348
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      1
│      4
│      5
│      7
│      ⋮
│     27
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.079592
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      6
│     16
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.112078
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      5
│      7
│      8
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.104330
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      4
│      6
│      7
│     25
│     26
│     27
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -1.088487
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      5
│      8
│      9
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.094049
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      1
│      6
│      7
│     13
│     16
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.091637
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      4
│      5
│      8
│     25
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.106043
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.112623
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      5
│      7
│      8
│     14
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 34, average log likelihood -1.085733
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      1
│      4
│      6
│      7
│      ⋮
│     27
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.078957
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      5
│      8
│     16
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.113130
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      6
│     14
│     25
│     26
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.095936
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      4
│      5
│      8
│     13
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.084813
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      6
│      7
│      9
│     16
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.100915
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      5
│      8
│     14
│     25
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.103600
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│      4
│      6
│     13
│     26
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.093589
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│      5
│      7
│      8
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.104405
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      6
│     14
│     25
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.090128
┌ Warning: Variances had to be floored 
│   ind =
│    8-element Array{Int64,1}:
│      4
│      5
│      7
│      8
│     16
│     26
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.079021
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│      6
│     13
│     25
│     27
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.111173
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      5
│      7
│      8
│     14
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -1.090977
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      4
│      6
│      7
│      9
│      ⋮
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.067505
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      5
│      8
│     13
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.120874
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     14
│     26
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.112140
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      4
│      5
│      7
│      8
│     27
│     28
│     29
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -1.090482
┌ Info: EM with 100000 data points 50 iterations avll -1.090482
└ 59.0 data points per parameter
┌ Info: 5
│   avll =
│    50-element Array{Float64,1}:
│     -1.1959674460267538
│     -1.1878423780942702
│      ⋮                 
└     -1.090482138891555 
┌ Info: Total log likelihood: 
│   tll =
│    251-element Array{Float64,1}:
│     -1.4282238221885204
│     -1.428307250421799 
│     -1.428227900281631 
│     -1.4275946850168753
│      ⋮                 
│     -1.120873692257318 
│     -1.1121404894677431
└     -1.090482138891555 
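The training runs summarized above (50 EM iterations at each mixture size) can presumably be reproduced with the package's high-level constructor; a hedged sketch, with keyword names taken from GaussianMixtures.jl v0.3.0 and stand-in data:

```julia
using GaussianMixtures  # v0.3.0, as installed in this run

x = randn(100_000, 26)       # stand-in for the test data
for n in (8, 16, 32)
    # Train a diagonal-covariance GMM with 50 EM iterations;
    # emits "iteration k, average log likelihood ..." lines as above.
    gmm = GMM(n, x; kind=:diag, nIter=50)
    @info "avll" avll(gmm, x)  # average log likelihood per data point
end
```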
32×26 Array{Float64,2}:
  0.0935887    0.0096613     0.0471572    0.0758701    0.17098       0.167956    0.0900172   -0.0925154    0.0437667    0.000194932   0.234457     0.0499366    -0.125867     0.0438763   -0.0648049    0.0647418    0.01461       0.121272    -0.090428    -0.0168502   -0.0392905    0.19719     -0.14648      -0.148253     0.0604917   -0.0575704 
  0.0112427    0.0687238     0.0662274    0.114691    -0.0735755    -0.0527795  -0.230632    -0.061156    -0.0486781   -0.0595635    -0.00723772   0.0392023    -0.0556051   -0.145193    -0.0977665   -0.12892      0.133407     -0.00395249   0.0334018    0.0243476   -0.0273458    0.00278187  -0.00754118   -0.250214     0.299697     0.0139961 
  0.255135     0.0976892    -0.0240604    0.0385197    0.00599376   -0.35474    -0.030868    -0.0694439   -0.0198738    0.118937      0.142665    -0.0112281     0.0548539   -0.235307    -0.0427943    0.0145011    0.078914     -0.00775336   0.0933589   -0.171833     0.096839    -0.0926029    0.111244      0.0488209   -0.107415    -0.0211551 
  0.0431529   -0.141205     -0.15377      0.134578     0.042474      0.0927835   0.0445471   -0.134541    -0.107612     0.0874148     0.0247225   -0.0960996     0.0196917   -0.0211564    0.0089157   -0.118895    -0.00574533   -0.00761152   0.098366    -0.0552501    0.117266    -0.100831     0.129863      0.0143881   -0.0687505   -0.0433055 
  0.0313107    0.0437098    -0.130416    -0.0754274    0.160136      0.122921    0.111096     0.169401    -0.149647    -0.320433     -0.256016    -0.0510409     0.136172     0.00320246   0.00981362  -0.050867     0.041645      0.0152195    0.0194197   -0.0142238   -0.0943205   -0.0981818   -0.374569     -0.132875    -0.148454    -0.00184858
  0.0522029   -0.0485169    -0.260032    -0.107807    -0.141832      0.0403298   0.0562458   -0.333712    -0.134554    -0.105293     -0.0952571   -0.0235933     0.0758987    0.0211991   -0.105876     0.106186     0.36145       0.0160323   -0.200699     0.0620118   -0.0928702   -0.0288021    0.54507      -0.112902    -0.120887     0.004085  
  0.619457     0.108517     -0.147206    -0.0755477   -0.0987385    -1.19445    -0.0557293   -0.0632069    0.00191474   0.0818713    -0.0498412   -0.0644182     0.172122     0.185833     0.0342633    0.357493    -0.284548      1.08797      0.0823971   -0.0297545   -0.325313     0.160431     0.0478295    -0.121161    -0.00654302  -0.00473364
  0.0508296   -0.164435     -0.0719347    0.0202097   -0.0672529     0.123273   -0.0521934   -0.0112784    0.0123542   -0.00106655   -0.0694328   -0.0659941     0.0336102    0.182488     0.0330748   -0.282191    -0.0798456    -0.157989     0.0743738   -0.0144641    0.0599996    0.00107988   0.0358303    -0.100327     0.00889458  -0.0370926 
 -0.0620006   -0.0350821    -0.0169906    0.127202     0.0841684     0.0484862   0.0210366   -0.0578798    0.00601483  -0.0353778     0.166058     0.000299065  -0.0139711    0.076273    -0.0457769    0.0265198    0.000552361   0.105661    -0.034187    -0.011706    -0.074187     0.152442     0.0156945    -0.0506663    0.0489383    0.00371815
 -0.162287    -0.173859     -0.13534     -0.0304463    0.0621659    -0.222596    0.0684721    0.131585    -0.0919789   -0.0016287     0.181805    -0.0653511    -0.0364316   -0.0580835   -0.0032914   -0.0752016   -0.101694      0.00536596   0.0967731    0.0444459   -0.0902317   -0.181861     0.145002     -0.0259581    0.247637     0.0620727 
  0.150055    -0.0818245     0.0530786    0.163752    -0.0550196     0.0660223  -0.00397245  -0.0415942   -0.0728204   -0.0905936     0.0346586   -0.180665      0.0165275    0.0457221   -0.127561     0.143092    -0.210191      0.0230087   -0.0619954    0.160922    -0.0642502   -0.0342097    0.150719      0.0853127    0.0376349    0.148917  
  0.0420092   -0.0114307     0.0934167   -0.0568543   -0.0106561     0.0613528  -0.0118814   -0.00223235   0.0216281   -0.0928068    -0.00310339  -0.0386965     0.0452518    0.044955    -0.00545212   0.0638944   -0.0523392    -0.0345205   -0.10783      0.127198    -0.0213529    0.0859088    0.0374763    -0.124432    -0.0532919    0.0672285 
 -0.0376943   -0.184313      0.047561    -0.181657    -0.0351501     0.12734    -0.1244      -0.0726728    0.00168491   0.0766803     0.0517789    0.0567417    -0.0184513   -0.146787     0.0978384   -0.0894827    0.0258533     0.0233131    0.00139924   0.280118     0.112632     0.0111189    0.0988288     0.142906    -0.0138908    0.0492659 
 -0.152645     0.0658274     0.0229224    0.00267343   0.000547934  -0.127776    0.0243676   -0.0904017   -0.0828487    0.0429811     0.0717892    0.0801113     0.0185621   -0.0286459   -0.001589    -0.00348914  -0.0519841    -0.224325     0.00945656   0.0388786    0.00762961   0.0338639   -0.0455992    -0.059178     0.0875999   -0.119856  
  0.0140412   -0.012208      0.0934781   -0.0501678   -0.0693747    -0.0375133   0.10893      0.0355651   -0.127463     0.0247058    -0.220838    -0.0389943     0.00147571  -0.0273838   -0.0295497    0.110119     0.0676454     0.262956    -0.0343392    0.0123552   -0.182845    -0.0738401    0.107603      0.0828579   -0.102329    -0.0193924 
 -0.138433     0.000623485   0.0437968   -0.025818    -0.00658039    0.0762761   0.0208853    0.0838729    0.0231965    0.0511195    -0.00172873  -0.0441313     0.0283544    0.0151502   -0.104687    -0.0205073   -0.188247     -0.00373885  -0.100022    -0.108474     0.0351514    0.0750257   -0.0125444    -0.0680491   -0.0680431    0.147171  
  0.0359701   -0.142704     -0.044028     0.00666516   0.029989     -0.127743    0.0371221   -0.23204     -0.069714    -0.044047      0.118861    -0.0651277    -0.0525157   -0.057992     0.155339    -0.0250588   -0.0489099     0.103804    -0.227091    -0.0544436   -0.119484    -0.093765     0.0981462     0.0448523    0.108914    -0.0558237 
  0.104015    -0.210094     -0.0162479   -0.0443639   -0.0546863    -0.0497789   0.0214768   -0.00724975  -0.0493159   -0.00242789   -0.0557161   -0.0531286     0.0372799    0.049684     0.0635949    0.0293245   -0.0396607    -0.00928059   0.0194783   -0.00734436  -0.122515    -0.019314    -0.0568362    -0.0867919   -0.0218652    0.125177  
  0.0451758    0.191292     -0.155964     0.088036     0.203624      0.164451    0.133432    -0.484326     0.069536    -0.0384025    -0.0974899   -0.240258      0.196342     0.0469348    0.161043     0.0397421   -0.0300308    -0.0472079    0.0207125    0.141341    -0.0246016   -0.00958346  -0.14513      -0.00194723  -0.00050874  -0.042593  
  0.00495125   0.146372      0.0131521   -0.0325219    0.206553      0.154127    0.127883     0.674217     0.215438    -0.0403247    -0.186277    -0.159307      0.149991     0.0338625    0.059409     0.152535    -0.0430863     0.0743399    0.0172644    0.139046     0.0520867   -0.0583078   -0.141846      0.033324    -0.00022271  -0.0611079 
 -0.116315     0.0352568    -0.12616     -0.0817433   -0.00254898    0.0455696  -0.123467    -0.0405158    0.00970987   0.196213      0.188824    -0.202351      0.105137    -0.0217913   -0.0558953   -0.149355    -0.00720124   -0.117341     0.00396471   0.118708     0.120158     0.00628897  -0.0212755    -0.00388375   0.010829     0.191394  
  0.0664639   -0.0427368    -0.0111       0.0595565   -0.181182      0.0548786   0.0362632    0.0287166    0.156026     0.0381319    -0.0300408   -0.0350494     0.09535     -0.0412327   -0.119269     0.0407118    0.0300392    -0.0563447   -0.155869     0.0342728   -0.134258     0.159936     0.00496751   -0.185649     0.0905668    0.0962407 
  0.139193     0.0526403     0.0128063   -0.191781     0.0729559     0.0447255  -0.0333265   -0.0417694    0.11737      0.0547831     0.0146894   -0.00655414   -0.044137     0.13245      0.19161      0.0669612    0.0428944    -0.00169285   0.178511     0.124602    -0.0913601    0.160262    -0.0611161     0.107008    -0.0841788   -0.0790234 
  0.0389208    0.194076     -0.00588889   0.0663444   -0.00231804    0.114282   -0.113214     0.0435368    0.0159935   -0.176742     -0.074732     0.0986991     0.205571     0.128931     0.0148701   -0.179878    -0.0592989     0.0376131   -0.00936217   0.048775     0.032527     0.0251112   -0.174189      0.108026     0.0508208   -0.0704338 
 -0.547755     0.0409679    -0.171178     0.158379    -0.150857      0.0594994   0.0172468    0.116126     0.0251258    0.054154      0.01246     -0.0889519    -0.0522306   -0.0529052   -0.113227    -0.0838343    0.151338     -0.0983415   -0.0316687    0.0865188    0.136132     0.0489902   -0.0104842     0.179032     0.0311013    0.0713657 
  0.276512    -0.0444139    -0.0690309    0.0454528    0.0217151    -0.0444145  -0.221348    -0.0822664    0.00817787   0.0766443    -0.103014    -0.0467212    -0.0505553   -0.135454    -0.0391867   -0.0548648    0.0126438     0.0440266   -0.0204114    0.025703    -0.0975989    0.0318566    0.00890207   -0.0136774    0.0914025   -0.00657811
  0.0402229   -0.0905579     0.0044872    0.0606163   -0.149054      0.0539601   0.0988195    0.0687602    0.0222199    0.0924054     0.00980122  -0.20031      -0.0392417   -0.0169607   -0.076527    -0.0750682    0.107864     -0.0761908   -0.0372573   -0.0850451    0.0118615    0.0302198   -0.00500427   -0.048435     0.0568227    0.0450862 
 -0.230733    -0.0202409    -0.11211      0.11324      0.113287     -0.0154608   0.0773826    0.105643     0.0342484   -0.0379665     0.0660745   -0.0655964     0.169867    -0.0534554    0.0754881    0.0606065   -0.00997373   -0.100233    -0.0728204    0.108987     0.061911    -0.23606     -0.072986      0.105026     0.058321    -0.132469  
  0.147817     0.0870071    -0.068963    -0.175481    -0.0810596    -0.0502986   0.0109692   -0.0103719    0.249386    -0.0495963    -0.115391    -0.232654      0.00130511  -0.0105617    0.0294529    0.0476696    0.0930539    -0.136232     0.127695    -0.0204597   -0.0897393    0.0490508   -0.00952562    0.153784    -0.182251    -0.0723265 
  0.00821747  -0.00591802    0.1055      -0.0593748   -0.0204068    -0.0273963  -0.127505     0.0803305    0.174123    -0.0297758     0.0782863    0.0267953    -0.0318936    0.0616711    0.0159756    0.0156193   -0.0128629    -0.0302604   -0.00116341   0.0360947    0.00655866  -0.176436     0.000543974  -0.0103185   -0.0177002   -0.0373112 
 -0.153783     0.0151941    -0.0263542    0.0446434   -0.0765289    -0.203794    0.11867      0.0862431    0.253746     0.132482     -0.0762155    0.0384675     0.0437179   -0.0137567    0.0254686   -0.0582872    0.000692542  -0.0114491    0.104476    -0.167571     0.0864104    0.0398719   -0.00559504   -0.095039     0.113569     0.134639  
  0.142439     0.0160134    -0.00707784   0.0724836   -0.0471103     0.0158652  -0.0342688    0.114504    -0.0205116    0.0758213    -0.0654649   -0.128766      0.109897     0.0356689    0.0976765   -0.00140076   0.0575556    0.0659125    0.0157964    0.0308822    0.0269695    0.0911692    0.0118085     0.00501678   0.0252888    0.150034  
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
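The 32×26 matrix printed above is presumably the trained component means: one row per Gaussian, one column per feature dimension. Assuming a trained `gmm::GMM` as in the runs above, they can be inspected directly:

```julia
# The means of a GaussianMixtures.jl model are stored in the μ field,
# one row per component, one column per dimension (assumption based
# on the GMM struct layout in v0.3.0):
size(gmm.μ)   # (32, 26) for the 32-Gaussian model trained above
```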
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      1
│      6
│      7
│     13
│     25
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 1, average log likelihood -1.097263
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      1
│      5
│      6
│      8
│      ⋮
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.064573
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      1
│      4
│      6
│      7
│      ⋮
│     26
│     27
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -1.079961
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      1
│      5
│      6
│      8
│      ⋮
│     25
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.074967
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      1
│      4
│      6
│      7
│      ⋮
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.067566
┌ Warning: Variances had to be floored 
│   ind =
│    12-element Array{Int64,1}:
│      1
│      5
│      6
│      7
│      ⋮
│     25
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.074838
┌ Warning: Variances had to be floored 
│   ind =
│    9-element Array{Int64,1}:
│      1
│      6
│     13
│     25
│      ⋮
│     28
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.078984
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      1
│      4
│      6
│      7
│      ⋮
│     25
│     26
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.071714
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      1
│      5
│      6
│      7
│      ⋮
│     27
│     28
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.080610
┌ Warning: Variances had to be floored 
│   ind =
│    13-element Array{Int64,1}:
│      1
│      4
│      6
│      7
│      ⋮
│     26
│     29
│     31
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.061954
┌ Info: EM with 100000 data points 10 iterations avll -1.061954
└ 59.0 data points per parameter
kind diag, method kmeans
[ Info: Initializing GMM, 32 Gaussians diag covariance 26 dimensions using 100000 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       9.250572e+05
      1       7.074572e+05      -2.176000e+05 |       32
      2       6.816787e+05      -2.577854e+04 |       32
      3       6.666914e+05      -1.498727e+04 |       32
      4       6.546977e+05      -1.199371e+04 |       32
      5       6.456823e+05      -9.015345e+03 |       32
      6       6.404132e+05      -5.269108e+03 |       32
      7       6.368676e+05      -3.545630e+03 |       32
      8       6.342247e+05      -2.642854e+03 |       32
      9       6.325831e+05      -1.641651e+03 |       32
     10       6.317468e+05      -8.363213e+02 |       32
     11       6.312605e+05      -4.862358e+02 |       32
     12       6.307293e+05      -5.312613e+02 |       32
     13       6.297279e+05      -1.001374e+03 |       32
     14       6.285553e+05      -1.172634e+03 |       32
     15       6.281439e+05      -4.113590e+02 |       32
     16       6.279238e+05      -2.200780e+02 |       32
     17       6.277624e+05      -1.614301e+02 |       32
     18       6.276286e+05      -1.337501e+02 |       32
     19       6.274908e+05      -1.378688e+02 |       31
     20       6.273642e+05      -1.265437e+02 |       31
     21       6.272788e+05      -8.542139e+01 |       32
     22       6.272123e+05      -6.655595e+01 |       32
     23       6.271734e+05      -3.884878e+01 |       32
     24       6.271526e+05      -2.079772e+01 |       31
     25       6.271402e+05      -1.241801e+01 |       31
     26       6.271307e+05      -9.438870e+00 |       27
     27       6.271249e+05      -5.799365e+00 |       28
     28       6.271207e+05      -4.239965e+00 |       24
     29       6.271178e+05      -2.942089e+00 |       23
     30       6.271161e+05      -1.699672e+00 |       22
     31       6.271145e+05      -1.587639e+00 |       20
     32       6.271129e+05      -1.541910e+00 |       20
     33       6.271114e+05      -1.510324e+00 |       21
     34       6.271103e+05      -1.151808e+00 |       18
     35       6.271093e+05      -9.691479e-01 |       20
     36       6.271080e+05      -1.305303e+00 |       17
     37       6.271073e+05      -6.539141e-01 |       12
     38       6.271068e+05      -5.069148e-01 |       10
     39       6.271064e+05      -4.042021e-01 |        5
     40       6.271063e+05      -1.425464e-01 |        7
     41       6.271062e+05      -1.267578e-01 |        0
     42       6.271062e+05       0.000000e+00 |        0
K-means converged with 42 iterations (objv = 627106.1666352667)
┌ Info: K-means with 32000 data points using 42 iterations
└ 37.0 data points per parameter
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.341226
[ Info: iteration 2, average log likelihood -1.313420
[ Info: iteration 3, average log likelihood -1.287073
[ Info: iteration 4, average log likelihood -1.259385
[ Info: iteration 5, average log likelihood -1.220993
[ Info: iteration 6, average log likelihood -1.173161
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     10
│     18
│     19
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.127728
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     20
│     21
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.135730
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.107755
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     12
│     14
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.087473
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│      1
│     19
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 11, average log likelihood -1.092458
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     10
│     24
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 12, average log likelihood -1.080294
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│     20
│     21
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 13, average log likelihood -1.081801
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     18
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 14, average log likelihood -1.116663
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     12
│     16
│     19
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 15, average log likelihood -1.078730
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│     10
│     14
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 16, average log likelihood -1.080520
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     17
│     21
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 17, average log likelihood -1.104372
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      6
│     18
│     20
│     24
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 18, average log likelihood -1.074358
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     12
│     19
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 19, average log likelihood -1.105279
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      1
│     14
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 20, average log likelihood -1.076530
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│     10
│     16
│     21
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 21, average log likelihood -1.063138
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     17
│     20
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 22, average log likelihood -1.079305
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     18
│     19
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 23, average log likelihood -1.076684
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│     12
│     14
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 24, average log likelihood -1.074709
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     10
│     21
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 25, average log likelihood -1.102929
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     16
│     20
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 26, average log likelihood -1.080835
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     19
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 27, average log likelihood -1.069105
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│     12
│     14
│     18
│     21
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 28, average log likelihood -1.067507
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      1
│     10
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 29, average log likelihood -1.096147
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     20
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 30, average log likelihood -1.092871
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      6
│     16
│     17
│     19
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 31, average log likelihood -1.057182
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│     12
│     14
│     21
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 32, average log likelihood -1.077928
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│     10
│     18
│     20
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 33, average log likelihood -1.100534
[ Info: iteration 34, average log likelihood -1.128266
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     19
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 35, average log likelihood -1.064774
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│     11
│     12
│     14
│     20
│     21
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 36, average log likelihood -1.056492
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│     10
│     16
│     18
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 37, average log likelihood -1.108259
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     1
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 38, average log likelihood -1.100392
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      6
│     17
│     19
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 39, average log likelihood -1.057978
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│     12
│     20
│     21
│     25
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 40, average log likelihood -1.059412
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│     10
│     14
│     16
│     18
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 41, average log likelihood -1.089518
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     11
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 42, average log likelihood -1.113744
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     6
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 43, average log likelihood -1.080199
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│     12
│     19
│     20
│     21
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 44, average log likelihood -1.032787
┌ Warning: Variances had to be floored 
│   ind =
│    5-element Array{Int64,1}:
│      1
│     10
│     16
│     17
│     18
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 45, average log likelihood -1.091916
┌ Warning: Variances had to be floored 
│   ind =
│    2-element Array{Int64,1}:
│     14
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 46, average log likelihood -1.108529
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     6
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 47, average log likelihood -1.080713
┌ Warning: Variances had to be floored 
│   ind =
│    6-element Array{Int64,1}:
│     11
│     12
│     19
│     20
│     21
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 48, average log likelihood -1.046653
┌ Warning: Variances had to be floored 
│   ind =
│    1-element Array{Int64,1}:
│     10
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 49, average log likelihood -1.111293
┌ Warning: Variances had to be floored 
│   ind =
│    4-element Array{Int64,1}:
│      1
│     14
│     18
│     28
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 50, average log likelihood -1.062169
┌ Info: EM with 100000 data points 50 iterations avll -1.062169
└ 59.0 data points per parameter
32×26 Array{Float64,2}:
  0.143171     0.0933665   -0.0740133   -0.184389    -0.0868318   -0.0556015    0.0113917   -0.00835084    0.25595      -0.0498483    -0.122969    -0.239264   -0.000839829  -0.0136144    0.0308922     0.0478345    0.0962334    -0.148988     0.130198    -0.0241709    -0.0916639    0.0535138   -0.0094156     0.161443     -0.190143     -0.0775312  
  0.151143    -0.0585199   -0.00965023  -0.196227    -0.109675     0.00709695  -0.006487    -0.0321858     0.0656598     0.0113866    -0.0377027    0.0242077  -0.0662477     0.122921     0.169527      0.0634856    0.0521498    -0.00369835   0.116768     0.0846217    -0.14011      0.0746536   -0.0722529    -0.0602327    -0.0525734     0.0529786  
  0.0110031    0.161667    -0.0390674    0.071147    -0.175466    -0.0627596   -0.00640563   0.020658     -0.0597882     0.192871     -0.0981465   -0.0881786   0.0497441     0.0159439    0.101558      0.00580412   0.104422     -0.076852    -0.0419811    0.00708809   -0.040684     0.173509    -0.0280381     0.013311      0.0520051     0.0922894  
  0.0981159   -0.0658789    0.139142    -0.0351046    0.0738715    0.0225759   -0.151109     0.0309497     0.156308     -0.0708934     0.148972     0.138474    0.00577513    0.133108     0.0183948     0.0828512   -0.0149537    -0.0112191   -0.026105     0.073907      0.0421447   -0.250516    -0.00460137    0.0395861     0.0954716    -0.062983   
 -0.150613    -0.158335    -0.118044    -0.0148016    0.0694669   -0.188339     0.0615325    0.111532     -0.06844      -0.0200745     0.183238    -0.0579423  -0.0406696    -0.0463846   -0.010597     -0.0597081   -0.0976138     0.0163007    0.0763015    0.0407809    -0.0861273   -0.140272     0.131767     -0.0307595     0.228982      0.0550163  
  0.279061    -0.0752601   -0.0486875    0.0458368    0.0539926   -0.0246845   -0.168357    -0.128719     -0.000694869   0.0924112    -0.0967364   -0.0279862  -0.0283334    -0.0204983   -0.00221697   -0.0429485   -0.00551159    0.0701078   -0.0924841    0.0186162    -0.0992973    0.0228859    0.0119267    -0.0362763     0.133014     -0.0152288  
  0.0390857    0.195409    -0.00528362   0.0650942   -0.00148783   0.115143    -0.112418     0.0431441     0.0158725    -0.176951     -0.074846     0.0975199   0.206227      0.129768     0.0147526    -0.18038     -0.0613836     0.0397467   -0.0126422    0.0488192     0.0331378    0.0247691   -0.174039      0.108693      0.0503361    -0.0717266  
  0.150034    -0.0836066    0.0522115    0.164061    -0.0551134    0.0647808   -0.00362452  -0.0416055    -0.0747274    -0.0905059     0.0349918   -0.178055    0.0164122     0.0451695   -0.128635      0.143658    -0.210748      0.0205269   -0.0620027    0.163129     -0.0641338   -0.0290006    0.15039       0.0866048     0.0381945     0.147908   
  0.251406     0.0996166   -0.0278533    0.0383163    0.00598828  -0.351364    -0.0267181   -0.0686144    -0.0190321     0.120958      0.139492    -0.01283     0.0535479    -0.233103    -0.0417404     0.013396     0.0802529    -0.00817568   0.0939045   -0.169431      0.0951685   -0.0938067    0.111895      0.050594     -0.110136     -0.0221412  
 -0.159629     0.00609794  -0.0290634    0.0257784   -0.0890163   -0.184636     0.113173     0.0779385     0.260262      0.14496      -0.0721537    0.0425505   0.0307615    -0.00437675   0.0169258    -0.0523641    0.00579293    0.00304879   0.108877    -0.177331      0.0708657    0.0370471   -0.00933662   -0.0953068     0.118904      0.136536   
  0.0561938   -0.0632147   -0.158701     0.0452185    0.0594326    0.0660994    0.0248706   -0.0764638    -0.120664      0.109811      0.051546    -0.0481936   0.0335739     0.0287402   -0.000244686  -0.104057     0.0764919     0.0245334    0.0414911   -0.054331     -0.012241    -0.140134     0.104669     -0.0737054    -0.109919     -0.00742334 
  0.102583     0.108985     0.0592213   -0.137242     0.0366141    0.0271992   -0.0101924    0.00838815   -0.0143826    -0.204314     -0.0468178   -0.0162914   0.0775971     0.00622915  -0.20683       0.0534938   -0.045366     -0.0370283   -0.147402     0.213334      0.00158978   0.125536    -0.039875      0.03955      -0.0837442     0.0082819  
  0.0247705    0.168925    -0.0706203    0.0266034    0.20514      0.159156     0.130608     0.101416      0.143311     -0.0393507    -0.141852    -0.199214    0.172971      0.0403289    0.109959      0.0968289   -0.036553      0.0141237    0.0189231    0.140189      0.0143731   -0.0341485   -0.143554      0.0159901    -0.000225748  -0.0519734  
 -0.0865906   -0.0538678   -0.0406913    0.0832451   -0.136476     0.0471403    0.0271358    0.0795226     0.0234559     0.0731264     0.00239516  -0.153998   -0.0378207    -0.0478478   -0.0783886    -0.0773267    0.0946869    -0.101189    -0.0379326   -0.0336627     0.0391023    0.0293308   -0.0114207     0.0117687     0.0476135     0.0586754  
  0.0132928   -0.00934275   0.0980743   -0.0652253   -0.069463    -0.0347293    0.110851     0.033965     -0.127627      0.0219157    -0.219674    -0.0333121  -0.00617919   -0.0283009   -0.0346612     0.110379     0.061783      0.264594    -0.0429307    0.0100042    -0.185723    -0.0795265    0.111162      0.0933946    -0.107326     -0.0261312  
 -0.166205    -0.00850295   0.0439135   -0.0688675   -0.0272429    0.0509829    0.0259929    0.126942      0.0136324     0.0560508    -0.0305727   -0.0654405   0.0576167    -0.007496    -0.110781     -0.0519471   -0.229913     -0.0384131   -0.106064    -0.133913      0.0525687    0.0549796    0.000185149  -0.054754     -0.0656053     0.179219   
  0.00495208   0.0493181    0.0462538    0.09662      0.142688     0.202563     0.0138746   -0.0915319     0.0827636     0.00166309    0.20248      0.0311746  -0.0893824     0.110342    -0.0724167     0.0678473   -0.0206989     0.151867    -0.0809876   -0.00127301   -0.0367852    0.177527    -0.0972347    -0.101423     -0.0386152    -0.0294174  
 -0.0388299   -0.184165     0.0484109   -0.186194    -0.0346149    0.126428    -0.123305    -0.0725895     0.00166224    0.076606      0.0519335    0.0590993  -0.0187465    -0.147486     0.0974266    -0.089704     0.0254573     0.0219224    0.00165179   0.27958       0.112821     0.0107374    0.10043       0.145697     -0.0139515     0.0492608  
  0.0394351    0.0107179   -0.176871    -0.115269     0.0223479    0.0987431    0.086466    -0.0660439    -0.135469     -0.2976       -0.246915    -0.0367578   0.116593      0.0209335   -0.0463467     0.0245145    0.191839      0.0128788   -0.100522     0.0323409    -0.0888559   -0.0491193   -0.0101533    -0.126544     -0.128043      0.000116002
 -0.158406     0.0656559    0.0273862    0.0017776   -0.00305348  -0.128053     0.0247777   -0.0862857    -0.0899274     0.0434417     0.0730946    0.0835623   0.0197783    -0.041116     0.00149998   -0.0112811   -0.05308      -0.226331     0.0125099    0.0330394     0.0115214    0.0332254   -0.0387001    -0.0564813     0.0815622    -0.115408   
 -0.229663    -0.00619927  -0.108809     0.112743     0.153128    -0.0215522    0.0788331    0.109111      0.0363216    -0.0525107     0.0764318   -0.0639231   0.173473     -0.0641394    0.0776332     0.0875315    0.0631887    -0.0717817   -0.080898     0.110393      0.0600685   -0.254686    -0.0946388     0.120406      0.0498566    -0.163826   
  0.0228405    0.0655648    0.0648893    0.11346     -0.069484    -0.051334    -0.220741    -0.0605231    -0.0495535    -0.05728      -0.00421736   0.0416236  -0.0549515    -0.142309    -0.0971391    -0.124523     0.133928     -0.00212495   0.0293166    0.0238942    -0.0291497    0.00754068  -0.0194646    -0.251284      0.297367      0.00889314 
  0.028934    -0.142779    -0.043269     0.00511813   0.0258256   -0.129011     0.0369281   -0.227665     -0.0690159    -0.044136      0.117768    -0.0628605  -0.0543528    -0.055755     0.150257     -0.020351    -0.0421862     0.105671    -0.227414    -0.0565901    -0.120848    -0.0931699    0.0957957     0.0370497     0.106098     -0.0570156  
  0.276069    -0.108571     0.024492     0.0631671    0.0629326    0.0762663   -0.059685     0.194252      0.0288793    -0.028569     -0.0482349   -0.153909    0.141599      0.0585426    0.0958359    -0.0116315    0.00191144    0.173064     0.0645616    0.0446279     0.096794     0.0317463    0.0453747     0.00786393    0.0175964     0.199326   
 -0.0112072   -0.165579    -0.0533643    0.0869588   -0.0414866   -0.0448619    0.0950243    0.0665176    -0.151826      0.0947539     0.0365813   -0.159606    0.190643     -0.00665842   0.120982      0.163906     0.101095     -0.0678441    0.134235    -0.0972115    -0.0602044    0.10279     -0.0331439    -0.157228     -0.11844       0.188514   
 -0.0181384   -0.145684     0.130825     0.0272435   -0.0643799    0.0753649   -0.00111213   0.000512766   0.0630627    -0.00145365    0.0382654   -0.0499025   0.0258806     0.0776693    0.170417      0.0861686   -0.0557639    -0.0316464   -0.0766678    0.0422348    -0.0453484    0.0313238    0.130582     -0.306572     -0.0201847     0.145591   
  0.041729    -0.309394     0.0453607   -0.0304084    0.153564    -0.111902     0.0649949   -0.000826309  -0.036995     -0.0867085    -0.0637286   -0.093288    0.0418493     0.0562169   -0.0173075    -0.102563    -0.25303       0.0117752   -0.00895419   0.0543788    -0.116848    -0.136556    -0.0896939     0.130837     -0.0123872     0.0508972  
  0.0404311   -0.141074    -0.151714     0.136931     0.03479      0.0897087    0.0451776   -0.133722     -0.101948      0.0857098     0.0228937   -0.0983691   0.0204327    -0.0200925    0.00914278   -0.100553    -0.0059029    -0.00708894   0.103275    -0.062024      0.107818    -0.0857233    0.129131      0.0157957    -0.0681113    -0.0431912  
 -0.0984316    0.0585849    0.0678993   -0.0899103   -0.14443     -0.0763639   -0.0779013    0.138281      0.214126      0.000989318  -0.00703916  -0.0968017  -0.0624998    -0.00862245   0.0163983    -0.0525553   -0.000767921  -0.0414221    0.0389027   -0.00358146   -0.0237891   -0.106447     0.0112586    -0.06539      -0.155604     -0.01515    
 -0.123521     0.0345351   -0.13599     -0.0879938    0.00148357   0.0471076   -0.124422    -0.0365434     0.00969422    0.205235      0.191979    -0.202018    0.109833     -0.0255918   -0.0569028    -0.158869    -0.00690585   -0.112384    -9.9469e-5    0.12332       0.128571    -0.00250869  -0.025175     -0.000935983   0.0214647     0.196146   
  0.0643029   -0.0429821   -0.0112558    0.0602468   -0.181859     0.0556891    0.0364639    0.03138       0.156317      0.0374101    -0.030575    -0.0352694   0.0944736    -0.0412716   -0.118872      0.0435283    0.0298695    -0.0583011   -0.154803     0.0349012    -0.134551     0.156423     0.00484811   -0.186419      0.0904472     0.0945928  
  0.0278275   -0.130041    -0.0849508    0.0351244   -0.0602126    0.0377488   -0.0213508    0.0170728     0.0154605    -0.0047695    -0.0466833   -0.0664281   0.0700103     0.14274      0.0447378    -0.214016    -0.147878     -0.0963537    0.0630301   -0.000334999   0.0444575   -0.0266737    0.0262041    -0.071027      0.0134745    -0.0448079  
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     16
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 1, average log likelihood -1.070685
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      6
│     11
│     12
│     16
│      ⋮
│     21
│     25
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 2, average log likelihood -1.020850
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      6
│     10
│     12
│     16
│     21
│     25
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 3, average log likelihood -1.021838
┌ Warning: Variances had to be floored 
│   ind =
│    14-element Array{Int64,1}:
│      1
│      6
│     11
│     12
│      ⋮
│     25
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 4, average log likelihood -1.004903
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     16
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 5, average log likelihood -1.065642
┌ Warning: Variances had to be floored 
│   ind =
│    11-element Array{Int64,1}:
│      6
│     10
│     11
│     12
│      ⋮
│     21
│     25
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 6, average log likelihood -1.017419
┌ Warning: Variances had to be floored 
│   ind =
│    7-element Array{Int64,1}:
│      6
│     12
│     16
│     21
│     25
│     28
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 7, average log likelihood -1.026108
┌ Warning: Variances had to be floored 
│   ind =
│    14-element Array{Int64,1}:
│      1
│      6
│     10
│     11
│      ⋮
│     21
│     25
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 8, average log likelihood -1.006346
┌ Warning: Variances had to be floored 
│   ind =
│    3-element Array{Int64,1}:
│      6
│     16
│     25
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 9, average log likelihood -1.070101
┌ Warning: Variances had to be floored 
│   ind =
│    10-element Array{Int64,1}:
│      6
│     11
│     12
│     16
│      ⋮
│     21
│     25
│     32
└ @ GaussianMixtures ~/.julia/packages/GaussianMixtures/RGtTJ/src/train.jl:255
[ Info: iteration 10, average log likelihood -1.021320
┌ Info: EM with 100000 data points 10 iterations avll -1.021320
└ 59.0 data points per parameter
32×26 Array{Float64,2}:
 -0.101981     -0.139924    -0.0147637    -0.0575284   -0.0672938    0.000708168  -0.053624    -0.00733809  -0.115464     -0.0251965     0.134273     0.0138385   -0.142555    -0.144584    -0.0439331   -0.161359    -0.103304     0.0661239   -0.153971    -0.0910677   -0.0123652   -0.0667176   -0.0544731   -0.0121765  -0.251042     0.14649   
  0.144268      0.0440032    0.0256317     0.14737      0.0601981    0.0754644    -0.167729    -0.0420771   -0.0286954     0.107933     -0.103196     0.139149     0.0822638   -0.0296125   -0.0883451   -0.0489403   -0.133943    -0.00529243  -0.0384083    0.0187477    0.0622672   -0.0281678    0.128504    -0.026662   -0.151253    -0.0966717 
  0.0667824     0.120366     0.049957      0.0752106   -0.00290967  -0.172934      0.0669324   -0.0932462   -0.191367     -0.097981      0.095433    -0.206686    -0.146171     0.157132    -0.129081     0.0704421   -0.0474567   -0.0426085    0.0538379    0.0589958    0.0414194    0.0672518   -0.0104216   -0.142331    0.107436    -0.0873605 
 -0.0232666    -0.0686812   -0.105403      0.167791    -0.117885     0.164326     -0.10601     -0.116821    -0.179027      0.0063742     0.126068     0.020528     0.0656724   -0.00392491  -0.0178656   -0.0461144   -0.00289586   0.0747468   -0.0672703    0.0765375    0.0883175   -0.181212    -0.145673     0.145745   -0.00663864  -0.128505  
  0.12773       0.026806    -0.0344065    -0.0328938    0.120618     0.108307      0.0139577    0.00609422  -0.0192322     0.148664     -0.031748     0.0897045    0.0702665    0.0702735   -0.064953    -0.0142878    0.0668699   -0.103894     0.192783    -0.0815948   -0.00225684   0.0207827    0.171375     0.170519    0.0402881    0.156836  
 -0.0184703    -0.22267      0.102996     -0.00816421  -0.103205    -0.283822     -0.0343723    0.0635839   -0.00670225    0.0539611     0.0276661   -0.0411039   -0.0523456   -2.71739e-5  -0.0827799    0.125857     0.0660308    0.0671224    0.151522     0.0594604    0.104441     0.0799453   -0.0424493    0.0619261   0.139955     0.0426832 
 -0.0345421    -0.0996991   -0.00347998   -0.0708078   -0.0817438    0.103923      0.0774592   -0.106068     0.0379221     0.179594      0.0145857   -0.0489604   -0.0442476   -0.0730267   -0.0760034    0.0479768    0.110422     0.0302693    0.0907198    0.0104129    0.0652075   -0.0277502   -0.277623     0.0361527  -0.117965     0.145548  
 -0.00526425   -0.00890723   0.0609257     0.0176383    0.0036122   -0.0872108     0.113746    -0.00123927  -0.237247     -0.026766      0.00995448  -0.0466602    0.111262    -0.105368    -0.0580055    0.0376365   -0.0434742    0.0430591    0.0336996    0.0325048    0.105874    -0.0184361    0.12553     -0.146014    0.105314    -0.0505001 
 -0.107096     -0.158132    -0.0343394     0.109923    -0.0613121   -0.0743394    -0.0516616    0.0940322   -0.0675895     0.0119553     0.066769    -0.0709363    0.0851012   -0.097087    -0.0833889    0.015651     0.0212797    0.127748     0.132746     0.00207844   0.126281    -0.0239067    0.127986    -0.0787768  -0.202512     0.136283  
 -0.0445427     0.037226     0.153976     -0.0108246   -0.14632      0.111626      0.10346      0.138582     0.00416554   -0.0245899    -0.145578    -0.0222683   -0.0277145    0.0631214    0.0652997   -0.209983    -0.00214709   0.0258822    0.00920217  -0.100491    -0.113111    -0.0136726    0.0144247    0.0794487   0.0830016   -0.0175805 
 -0.0746788    -0.0740086    0.122795     -0.104347    -0.103605    -0.128701      0.0248884   -0.135283    -0.00229979    0.089256      0.280651    -0.123243     0.0465364   -0.214781    -0.159865     0.0129824    0.051142     0.0160793    0.0447365    0.0564036   -0.0664428    0.0688929   -0.0418086    0.0901133   0.102561     0.103862  
 -0.095078      0.0369499    0.0408389    -0.141954     0.227849    -0.176726      0.0987126    0.120661     0.0542907     0.0492113    -0.0800077    0.0467023    0.210028     0.253646    -0.10013      0.0694669    0.1019       0.139598    -0.112136     0.097139     0.0369256   -0.00373463   0.103712    -0.0124014  -0.0285361   -0.239349  
 -0.000290402  -0.0836596   -0.0234091    -0.00947603   0.114064     0.0110738    -0.0103141   -0.105305    -0.0608133    -0.0280388    -0.0259296    0.154008    -0.0332952   -0.0438448    0.032013     0.00278197   0.0689249   -0.0592875    0.00466224  -0.147415    -0.25487      0.211569    -0.14311      0.099846   -0.0349369   -0.144994  
 -0.052167      0.0292955   -0.16051       0.0504807    0.0512636   -0.1162       -0.0178519   -0.0240521    0.194877      0.144701      0.0373686    0.00172098  -0.00568062   0.0597726    0.096429    -0.193561    -0.202822    -0.0437345    0.0987629   -0.0586433    0.187383    -0.0369482   -0.104353     0.024165   -0.0685206    0.166814  
  0.0766967     0.14593      0.03405      -0.049335     0.165788     0.00655774   -0.0663889    0.0670693    0.0390735    -0.128501     -0.0916587    0.118622    -0.02246      0.0116482   -0.0956909    0.0271128    0.100532    -0.0109541   -0.136413     0.137312    -0.0507417    0.110454    -0.00846021   0.115205    0.0560164   -0.209041  
 -0.100811     -0.0469469   -0.0985642    -0.0661619    0.160384     0.200829     -0.0148269   -0.00208999  -0.0772033     0.0531565    -0.0576449   -0.0188938    0.0134666   -0.0870041   -0.145697     0.0395728    0.169789     0.047822     0.00618118   0.154163     0.0056083    0.0340771   -0.0102604    0.0814454   0.0318696    0.0429718 
 -0.0503504    -0.0945471    0.0453703    -0.0434226   -0.0261912    0.205166      0.122702    -0.0305762    0.037709      0.165052     -0.127422     0.174725    -0.122179     0.10595     -0.15061     -0.0797967    0.0660722   -0.0153202   -0.0378368   -0.0771082    0.0800475    0.0420787   -0.0631837    0.0222839   0.156996    -0.127279  
  0.166809     -0.215974    -0.0245925     0.210204    -0.0851447    0.149926     -0.16104      0.148908    -0.0920112    -0.0103576     0.0080353   -0.0728056   -0.108868    -0.024539    -0.0257626   -0.0638834    0.020552    -0.0125998    0.0163341   -0.0588587   -0.0488258    0.00777118  -0.0467847    0.0099773   0.163462    -0.140892  
 -0.050669     -0.0147       0.118165     -0.132266    -0.153467    -0.0812352    -0.00964187  -0.152404    -0.082458      0.0326579     0.0192201    0.0523398    0.0522011    0.155356    -0.314504    -0.0545812    0.0791187    0.0497003    0.0840831   -0.182627    -0.0357633   -0.147777    -0.10765     -0.0201957   0.044219    -0.122522  
 -0.0165756    -0.103346     0.0102208     0.0534703    0.0652154   -0.196641     -0.0991252    0.0267418    0.0815814     0.0483982    -0.0100291   -0.0876193    0.123886    -0.00494652  -0.163619     0.0215467   -0.0880306   -0.00494809  -0.0867539   -0.139851     0.00212706   0.00294252  -0.202593     0.0540194   0.0130371   -0.00792331
  0.129393     -0.167091    -0.0689479     0.0818063    0.0557163    0.0267053    -0.101635     0.0979462   -0.154122      0.156017     -0.0766029   -0.0218138   -0.183576    -0.142585     0.0991345   -0.0465845    0.100674     0.151396    -0.0814054    0.0668248   -0.180229    -0.0344792    0.171324     0.0407325   0.0170478   -0.0704965 
 -0.141588     -0.0492952   -0.0980489    -0.0302793   -0.193901     0.0763774    -0.0127956   -0.146389    -0.105857     -0.000881895   0.0313543   -0.196678     0.0667959   -0.0982037   -0.11629      0.0390436    0.0637166    0.337076     0.128567    -0.0452586    0.0599625    0.00749858  -0.0340363   -0.0749977  -0.0459839    0.0411169 
 -0.0452921    -0.0627218    0.0235651    -0.0384493   -0.0436809    0.0107745     0.0427019   -0.0351441    0.0793134     0.0434724    -0.0989972   -0.108496     0.16776      0.0162717   -0.00517557  -0.00955945  -0.0441038   -0.0569552   -0.0512816   -0.133028     0.146446    -0.0819598   -0.181671     0.227589   -0.115444    -0.0229202 
  0.00905316   -0.142016     0.114431     -0.0897602    0.112206    -0.0921294    -0.0372736   -0.20778      0.0498223    -0.0491767    -0.0945398    0.18465      0.135844     0.0378514    0.015096     0.0420473    0.13465      0.0153763    0.0293192   -0.129936    -0.107596    -0.134131     0.0262168    0.0189297  -0.139638     0.0538881 
  0.00442491   -0.0841515    0.0088933    -0.0556086   -0.22331      0.0784791    -0.0082178   -0.025062    -0.168125      0.0835345    -0.0992479   -0.0193923   -0.112622    -0.165501    -0.0953687   -0.00368619  -0.0449461   -0.119712    -0.0740872   -0.272175    -0.0202673   -0.0234772   -0.0201851   -0.100123   -0.010896    -0.133491  
 -0.0510854     0.0653889    0.000602249   0.0474303    0.00192272  -0.140563      0.00639289  -0.0451821    0.000412918  -0.0410914     0.0499609   -0.175185     0.110179    -0.0640717    0.0483158   -0.0401551   -0.0759242   -0.0191763    0.14997      0.156708     0.135932     0.118412    -0.028119     0.0568912  -0.0307554    0.0743363 
  0.197782      0.0924345    0.0493848     0.0612183   -0.084625     0.0384003     0.0182845    0.0189139   -0.0180792     0.107363      0.17257     -0.0718633    0.116842     0.0800475   -0.0107605   -0.133819     0.044585    -0.12613      0.0597967   -0.12935     -0.0274153    0.0620678   -0.0436201    0.123333    0.0650811   -0.0316005 
 -0.106855      0.00314558   0.039102      0.0836758   -0.0451008   -0.111979     -0.0556817    0.0591002    0.0843733     0.0520546    -0.0661508   -0.0986712    0.212807    -0.00130657  -0.0259523    0.0708971   -0.0779561   -0.00897296  -0.0984461    0.0932103   -0.0208572    0.131322    -0.043813     0.207454   -0.077093     0.014093  
 -0.144066     -0.0990649    0.093234     -0.104168     0.0799919   -0.017736     -0.089898    -0.1088      -0.070524     -0.0666028     0.050233    -0.130542    -0.0931789   -0.187394     0.244445     0.0197798   -0.0670337   -0.12079     -0.128595    -0.0524639   -0.0233553   -0.0433993    0.152667     0.0270579   0.0627029   -0.0267708 
  0.114289     -0.0546515    0.0750101     0.0635498    0.00473003  -0.102602     -0.0395925   -0.01532      0.0306031     0.0225875     0.102549     0.210177    -0.117598     0.213087     0.0674532   -0.173621     0.0322789    0.075254    -0.0354197    0.0183175   -0.0397242    0.0615104    0.12678     -0.103513   -0.0855066   -0.00567505
 -0.0567507    -0.0200557   -0.159726      0.0448762    0.0144968    0.0018435     0.11342     -0.0761147    0.159226     -0.0730318    -0.138881    -0.0558674   -0.10907      0.0204649   -0.135118    -0.0631976    0.0308114    0.0807088    0.0950313    0.00456123  -0.0685664    0.04428      0.103933    -0.135744   -0.0406341    0.0195798 
  0.091382    -0.201548     0.0427969    0.111463     0.070161    -0.117246      0.157541    -0.200459    -0.093475      0.107533     -0.100712     0.1243       0.0355704   -0.1428      -0.0398259    0.10161     -0.0468989   -0.240055    -0.0372865    0.0243237    0.0252749    0.0395427    0.0309651    0.0684364   0.125149     0.0333584
kind full, method split
┌ Info: 0: avll = 
└   tll[1] = -1.4306630159785543
[ Info: Running 50 iterations EM on diag cov GMM with 2 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.430683
[ Info: iteration 2, average log likelihood -1.430615
[ Info: iteration 3, average log likelihood -1.430563
[ Info: iteration 4, average log likelihood -1.430499
[ Info: iteration 5, average log likelihood -1.430418
[ Info: iteration 6, average log likelihood -1.430310
[ Info: iteration 7, average log likelihood -1.430165
[ Info: iteration 8, average log likelihood -1.429955
[ Info: iteration 9, average log likelihood -1.429621
[ Info: iteration 10, average log likelihood -1.429070
[ Info: iteration 11, average log likelihood -1.428241
[ Info: iteration 12, average log likelihood -1.427240
[ Info: iteration 13, average log likelihood -1.426362
[ Info: iteration 14, average log likelihood -1.425803
[ Info: iteration 15, average log likelihood -1.425521
[ Info: iteration 16, average log likelihood -1.425392
[ Info: iteration 17, average log likelihood -1.425336
[ Info: iteration 18, average log likelihood -1.425311
[ Info: iteration 19, average log likelihood -1.425299
[ Info: iteration 20, average log likelihood -1.425294
[ Info: iteration 21, average log likelihood -1.425292
[ Info: iteration 22, average log likelihood -1.425291
[ Info: iteration 23, average log likelihood -1.425290
[ Info: iteration 24, average log likelihood -1.425289
[ Info: iteration 25, average log likelihood -1.425289
[ Info: iteration 26, average log likelihood -1.425289
[ Info: iteration 27, average log likelihood -1.425288
[ Info: iteration 28, average log likelihood -1.425288
[ Info: iteration 29, average log likelihood -1.425288
[ Info: iteration 30, average log likelihood -1.425288
[ Info: iteration 31, average log likelihood -1.425288
[ Info: iteration 32, average log likelihood -1.425288
[ Info: iteration 33, average log likelihood -1.425288
[ Info: iteration 34, average log likelihood -1.425287
[ Info: iteration 35, average log likelihood -1.425287
[ Info: iteration 36, average log likelihood -1.425287
[ Info: iteration 37, average log likelihood -1.425287
[ Info: iteration 38, average log likelihood -1.425287
[ Info: iteration 39, average log likelihood -1.425287
[ Info: iteration 40, average log likelihood -1.425287
[ Info: iteration 41, average log likelihood -1.425287
[ Info: iteration 42, average log likelihood -1.425287
[ Info: iteration 43, average log likelihood -1.425287
[ Info: iteration 44, average log likelihood -1.425287
[ Info: iteration 45, average log likelihood -1.425287
[ Info: iteration 46, average log likelihood -1.425287
[ Info: iteration 47, average log likelihood -1.425287
[ Info: iteration 48, average log likelihood -1.425287
[ Info: iteration 49, average log likelihood -1.425287
[ Info: iteration 50, average log likelihood -1.425287
┌ Info: EM with 100000 data points 50 iterations avll -1.425287
└ 952.4 data points per parameter
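The "data points per parameter" figures in these summaries are consistent with a diagonal-covariance GMM parameter count of n·d means plus n·d variances plus (n − 1) free weights; this sketch reproduces the logged values under that assumption (the helper name is hypothetical, not a GaussianMixtures.jl function).

```python
# Reproduce the "data points per parameter" figures from the log, assuming
# a diag-covariance GMM with n components in d dimensions has
# 2*n*d mean/variance parameters plus (n - 1) free mixture weights.
def points_per_param(n_points, n, d):
    n_params = 2 * n * d + (n - 1)
    return n_points / n_params

print(round(points_per_param(100_000, 2, 26), 1))   # matches 952.4 in the log
print(round(points_per_param(100_000, 32, 26), 1))  # matches 59.0 in the log
```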
┌ Info: 1: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.4306831660090746
│     -1.4306151506930622
│      ⋮                 
└     -1.4252866806625486
[ Info: Running 50 iterations EM on diag cov GMM with 4 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.425307
[ Info: iteration 2, average log likelihood -1.425236
[ Info: iteration 3, average log likelihood -1.425181
[ Info: iteration 4, average log likelihood -1.425115
[ Info: iteration 5, average log likelihood -1.425032
[ Info: iteration 6, average log likelihood -1.424933
[ Info: iteration 7, average log likelihood -1.424822
[ Info: iteration 8, average log likelihood -1.424712
[ Info: iteration 9, average log likelihood -1.424611
[ Info: iteration 10, average log likelihood -1.424524
[ Info: iteration 11, average log likelihood -1.424450
[ Info: iteration 12, average log likelihood -1.424385
[ Info: iteration 13, average log likelihood -1.424325
[ Info: iteration 14, average log likelihood -1.424269
[ Info: iteration 15, average log likelihood -1.424213
[ Info: iteration 16, average log likelihood -1.424159
[ Info: iteration 17, average log likelihood -1.424107
[ Info: iteration 18, average log likelihood -1.424059
[ Info: iteration 19, average log likelihood -1.424015
[ Info: iteration 20, average log likelihood -1.423976
[ Info: iteration 21, average log likelihood -1.423943
[ Info: iteration 22, average log likelihood -1.423916
[ Info: iteration 23, average log likelihood -1.423895
[ Info: iteration 24, average log likelihood -1.423879
[ Info: iteration 25, average log likelihood -1.423866
[ Info: iteration 26, average log likelihood -1.423856
[ Info: iteration 27, average log likelihood -1.423849
[ Info: iteration 28, average log likelihood -1.423843
[ Info: iteration 29, average log likelihood -1.423838
[ Info: iteration 30, average log likelihood -1.423834
[ Info: iteration 31, average log likelihood -1.423831
[ Info: iteration 32, average log likelihood -1.423829
[ Info: iteration 33, average log likelihood -1.423826
[ Info: iteration 34, average log likelihood -1.423825
[ Info: iteration 35, average log likelihood -1.423823
[ Info: iteration 36, average log likelihood -1.423821
[ Info: iteration 37, average log likelihood -1.423820
[ Info: iteration 38, average log likelihood -1.423819
[ Info: iteration 39, average log likelihood -1.423818
[ Info: iteration 40, average log likelihood -1.423817
[ Info: iteration 41, average log likelihood -1.423816
[ Info: iteration 42, average log likelihood -1.423815
[ Info: iteration 43, average log likelihood -1.423814
[ Info: iteration 44, average log likelihood -1.423813
[ Info: iteration 45, average log likelihood -1.423813
[ Info: iteration 46, average log likelihood -1.423812
[ Info: iteration 47, average log likelihood -1.423812
[ Info: iteration 48, average log likelihood -1.423811
[ Info: iteration 49, average log likelihood -1.423811
[ Info: iteration 50, average log likelihood -1.423810
┌ Info: EM with 100000 data points 50 iterations avll -1.423810
└ 473.9 data points per parameter
┌ Info: 2: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.425306581944389 
│     -1.4252359257826868
│      ⋮                 
└     -1.42381029590753  
[ Info: Running 50 iterations EM on diag cov GMM with 8 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.423821
[ Info: iteration 2, average log likelihood -1.423769
[ Info: iteration 3, average log likelihood -1.423726
[ Info: iteration 4, average log likelihood -1.423679
[ Info: iteration 5, average log likelihood -1.423623
[ Info: iteration 6, average log likelihood -1.423555
[ Info: iteration 7, average log likelihood -1.423476
[ Info: iteration 8, average log likelihood -1.423388
[ Info: iteration 9, average log likelihood -1.423295
[ Info: iteration 10, average log likelihood -1.423202
[ Info: iteration 11, average log likelihood -1.423114
[ Info: iteration 12, average log likelihood -1.423035
[ Info: iteration 13, average log likelihood -1.422966
[ Info: iteration 14, average log likelihood -1.422909
[ Info: iteration 15, average log likelihood -1.422861
[ Info: iteration 16, average log likelihood -1.422823
[ Info: iteration 17, average log likelihood -1.422791
[ Info: iteration 18, average log likelihood -1.422764
[ Info: iteration 19, average log likelihood -1.422741
[ Info: iteration 20, average log likelihood -1.422721
[ Info: iteration 21, average log likelihood -1.422702
[ Info: iteration 22, average log likelihood -1.422684
[ Info: iteration 23, average log likelihood -1.422668
[ Info: iteration 24, average log likelihood -1.422651
[ Info: iteration 25, average log likelihood -1.422636
[ Info: iteration 26, average log likelihood -1.422620
[ Info: iteration 27, average log likelihood -1.422605
[ Info: iteration 28, average log likelihood -1.422590
[ Info: iteration 29, average log likelihood -1.422576
[ Info: iteration 30, average log likelihood -1.422562
[ Info: iteration 31, average log likelihood -1.422548
[ Info: iteration 32, average log likelihood -1.422535
[ Info: iteration 33, average log likelihood -1.422523
[ Info: iteration 34, average log likelihood -1.422511
[ Info: iteration 35, average log likelihood -1.422500
[ Info: iteration 36, average log likelihood -1.422489
[ Info: iteration 37, average log likelihood -1.422479
[ Info: iteration 38, average log likelihood -1.422469
[ Info: iteration 39, average log likelihood -1.422460
[ Info: iteration 40, average log likelihood -1.422452
[ Info: iteration 41, average log likelihood -1.422444
[ Info: iteration 42, average log likelihood -1.422436
[ Info: iteration 43, average log likelihood -1.422429
[ Info: iteration 44, average log likelihood -1.422422
[ Info: iteration 45, average log likelihood -1.422416
[ Info: iteration 46, average log likelihood -1.422410
[ Info: iteration 47, average log likelihood -1.422404
[ Info: iteration 48, average log likelihood -1.422398
[ Info: iteration 49, average log likelihood -1.422392
[ Info: iteration 50, average log likelihood -1.422387
┌ Info: EM with 100000 data points 50 iterations avll -1.422387
└ 236.4 data points per parameter
┌ Info: 3: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.423821260693816 
│     -1.423769159493944 
│      ⋮                 
└     -1.4223870713948619
[ Info: Running 50 iterations EM on diag cov GMM with 16 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.422391
[ Info: iteration 2, average log likelihood -1.422338
[ Info: iteration 3, average log likelihood -1.422290
[ Info: iteration 4, average log likelihood -1.422237
[ Info: iteration 5, average log likelihood -1.422175
[ Info: iteration 6, average log likelihood -1.422102
[ Info: iteration 7, average log likelihood -1.422018
[ Info: iteration 8, average log likelihood -1.421922
[ Info: iteration 9, average log likelihood -1.421819
[ Info: iteration 10, average log likelihood -1.421713
[ Info: iteration 11, average log likelihood -1.421606
[ Info: iteration 12, average log likelihood -1.421502
[ Info: iteration 13, average log likelihood -1.421405
[ Info: iteration 14, average log likelihood -1.421316
[ Info: iteration 15, average log likelihood -1.421238
[ Info: iteration 16, average log likelihood -1.421170
[ Info: iteration 17, average log likelihood -1.421113
[ Info: iteration 18, average log likelihood -1.421065
[ Info: iteration 19, average log likelihood -1.421024
[ Info: iteration 20, average log likelihood -1.420989
[ Info: iteration 21, average log likelihood -1.420959
[ Info: iteration 22, average log likelihood -1.420933
[ Info: iteration 23, average log likelihood -1.420910
[ Info: iteration 24, average log likelihood -1.420889
[ Info: iteration 25, average log likelihood -1.420870
[ Info: iteration 26, average log likelihood -1.420852
[ Info: iteration 27, average log likelihood -1.420835
[ Info: iteration 28, average log likelihood -1.420819
[ Info: iteration 29, average log likelihood -1.420804
[ Info: iteration 30, average log likelihood -1.420790
[ Info: iteration 31, average log likelihood -1.420776
[ Info: iteration 32, average log likelihood -1.420763
[ Info: iteration 33, average log likelihood -1.420750
[ Info: iteration 34, average log likelihood -1.420738
[ Info: iteration 35, average log likelihood -1.420726
[ Info: iteration 36, average log likelihood -1.420715
[ Info: iteration 37, average log likelihood -1.420704
[ Info: iteration 38, average log likelihood -1.420694
[ Info: iteration 39, average log likelihood -1.420684
[ Info: iteration 40, average log likelihood -1.420674
[ Info: iteration 41, average log likelihood -1.420665
[ Info: iteration 42, average log likelihood -1.420656
[ Info: iteration 43, average log likelihood -1.420648
[ Info: iteration 44, average log likelihood -1.420640
[ Info: iteration 45, average log likelihood -1.420632
[ Info: iteration 46, average log likelihood -1.420624
[ Info: iteration 47, average log likelihood -1.420617
[ Info: iteration 48, average log likelihood -1.420610
[ Info: iteration 49, average log likelihood -1.420604
[ Info: iteration 50, average log likelihood -1.420598
┌ Info: EM with 100000 data points 50 iterations avll -1.420598
└ 118.1 data points per parameter
┌ Info: 4: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.422391260690002 
│     -1.4223375631247595
│      ⋮                 
└     -1.4205975638959212
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.420600
[ Info: iteration 2, average log likelihood -1.420540
[ Info: iteration 3, average log likelihood -1.420486
[ Info: iteration 4, average log likelihood -1.420423
[ Info: iteration 5, average log likelihood -1.420347
[ Info: iteration 6, average log likelihood -1.420251
[ Info: iteration 7, average log likelihood -1.420136
[ Info: iteration 8, average log likelihood -1.420002
[ Info: iteration 9, average log likelihood -1.419855
[ Info: iteration 10, average log likelihood -1.419704
[ Info: iteration 11, average log likelihood -1.419556
[ Info: iteration 12, average log likelihood -1.419418
[ Info: iteration 13, average log likelihood -1.419293
[ Info: iteration 14, average log likelihood -1.419182
[ Info: iteration 15, average log likelihood -1.419086
[ Info: iteration 16, average log likelihood -1.419002
[ Info: iteration 17, average log likelihood -1.418927
[ Info: iteration 18, average log likelihood -1.418862
[ Info: iteration 19, average log likelihood -1.418804
[ Info: iteration 20, average log likelihood -1.418752
[ Info: iteration 21, average log likelihood -1.418705
[ Info: iteration 22, average log likelihood -1.418663
[ Info: iteration 23, average log likelihood -1.418624
[ Info: iteration 24, average log likelihood -1.418589
[ Info: iteration 25, average log likelihood -1.418557
[ Info: iteration 26, average log likelihood -1.418527
[ Info: iteration 27, average log likelihood -1.418499
[ Info: iteration 28, average log likelihood -1.418473
[ Info: iteration 29, average log likelihood -1.418448
[ Info: iteration 30, average log likelihood -1.418425
[ Info: iteration 31, average log likelihood -1.418403
[ Info: iteration 32, average log likelihood -1.418383
[ Info: iteration 33, average log likelihood -1.418363
[ Info: iteration 34, average log likelihood -1.418344
[ Info: iteration 35, average log likelihood -1.418327
[ Info: iteration 36, average log likelihood -1.418310
[ Info: iteration 37, average log likelihood -1.418293
[ Info: iteration 38, average log likelihood -1.418278
[ Info: iteration 39, average log likelihood -1.418263
[ Info: iteration 40, average log likelihood -1.418248
[ Info: iteration 41, average log likelihood -1.418234
[ Info: iteration 42, average log likelihood -1.418221
[ Info: iteration 43, average log likelihood -1.418208
[ Info: iteration 44, average log likelihood -1.418195
[ Info: iteration 45, average log likelihood -1.418183
[ Info: iteration 46, average log likelihood -1.418170
[ Info: iteration 47, average log likelihood -1.418159
[ Info: iteration 48, average log likelihood -1.418147
[ Info: iteration 49, average log likelihood -1.418136
[ Info: iteration 50, average log likelihood -1.418124
┌ Info: EM with 100000 data points 50 iterations avll -1.418124
└ 59.0 data points per parameter
┌ Info: 5: avll =
│   avll =
│    50-element Array{Float64,1}:
│     -1.420599947917332 
│     -1.4205404875267773
│      ⋮                 
└     -1.4181244446971544
┌ Info: Total log likelihood: 
│   tll =
│    251-element Array{Float64,1}:
│     -1.4306630159785543
│     -1.4306831660090746
│     -1.4306151506930622
│     -1.4305629227447   
│      ⋮                 
│     -1.418146906175143 
│     -1.4181355581634971
└     -1.4181244446971544
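The 251-element `tll` vector above matches the split-training schedule visible in the log: one initial average log-likelihood, then 50 EM iterations at each component count reached by binary splitting (2, 4, 8, 16, 32 Gaussians). A small sketch of that bookkeeping, under the stated assumption about how the entries are accumulated:

```python
# The tll vector has 251 entries: one initial avll plus 50 EM iterations
# at each of the 5 component counts reached by binary splitting.
n_iter = 50
schedule = [2 ** k for k in range(1, 6)]   # [2, 4, 8, 16, 32]
total_entries = 1 + len(schedule) * n_iter
print(schedule, total_entries)
```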
32×26 Array{Float64,2}:
  0.176803    0.459202     -0.225534    0.248889    -0.390635    -0.0542349    -1.04925     0.172164    -0.0915596   0.206061   -0.388236    -0.0151245   -0.153981   -0.275756     0.13719      0.237755   -0.212165    -0.867003     0.27714     -0.0823151   -0.289999     0.221736    0.304426    -0.291272    -0.288065   -0.306922 
  0.222145    0.140431     -0.359759   -0.392482    -0.628127    -0.121562      0.464939   -0.102455     0.276919    0.341689   -0.133961     0.0533401   -0.243887    0.0168762    0.260544     0.142051   -0.0475632   -0.701849     0.12964     -0.165757    -0.46024     -0.118772    0.0918808    0.142911    -0.0599526   0.129543 
 -0.156916    0.0217758    -0.499665   -0.769991    -0.557102     0.075957     -0.158969    0.0937732   -0.343299   -0.0412322   0.267124    -0.352151    -0.107941   -0.0562884   -0.687714     0.129521   -0.0910253   -0.672927     0.0471499    0.321784    -0.404482    -0.329345    0.246599     0.225794    -0.483054   -0.347171 
 -0.238273    0.124342      0.39366    -0.463865    -0.196695     0.069092     -0.468911   -0.0450144   -0.510865    0.0326231   0.597111    -0.394455     0.136836    0.171449    -0.003716     0.291637   -0.359557    -0.456357    -0.287119     0.0196799    0.031128    -0.144618   -0.301991     0.300436     0.0495015   0.279525 
 -0.296366    0.122269     -0.160042    0.228338    -0.204418    -0.418094      0.0621958  -0.0696043   -0.145726    0.157441    0.00162187  -0.120695     0.213744    0.0148714    0.146077    -0.0470249  -0.00186814  -0.270721    -0.361304    -0.275305     0.0342199   -0.403728   -1.00337     -0.0750569   -0.224572   -0.115136 
 -0.171355    0.32892       0.0190283   0.838985     0.395264    -0.0223108     0.147605    0.0284628    0.244221   -0.300849   -0.182441     0.18116     -0.0615932   0.21177      0.440112    -0.0396523  -0.526363    -0.0643544    0.229727    -0.414865    -0.0380946    0.161662   -0.38239     -0.0198526    0.0105655   0.0182467
  0.0190695   0.0781904     0.0491968  -0.071967    -0.224868    -0.0666909    -0.153681    0.0517946   -0.0410723   0.0844483   0.032743    -0.113528     0.066304    0.00352657  -0.0942114    0.0959285   0.0267787   -0.136779    -0.0420539    0.00252718  -0.0388643    0.0552019  -0.00277683   0.0149071    0.0143518  -0.068443 
  0.100028   -0.268241      0.100916   -0.00724168   0.464035     0.000375231   0.232527   -0.203009    -0.0577291   0.0109932  -0.0509378    0.180792    -0.052657   -0.0917001   -0.120387    -0.312426    0.0476759    0.313197     0.00995449   0.0224344    0.11708     -0.139455    0.160846     0.13519      0.0153712   0.11214  
  0.400703    0.110982      0.46674     0.634091     0.0270471   -0.115272     -0.327698   -0.0714377   -0.0697822   0.371523   -0.3038       0.0881606    0.0278942  -0.47188     -0.278635    -0.280314    0.801037     0.276508    -0.43131     -0.415942     0.0375403    0.14465    -0.0288253   -0.213994     0.440795   -0.335054 
 -0.264372    0.110166      0.630268    0.127705     0.214653     0.0231696     0.119567    0.00602268  -0.348915   -0.104773    0.883392    -0.0465617   -0.100052   -0.417116     0.456202    -0.110936    0.805426     0.702226    -0.399596     0.171809    -0.30285      0.473722   -0.100583    -0.175933     0.388759    0.053653 
  0.0744451  -0.730239      0.172041   -0.725468    -0.368057     0.132481      0.0962818   0.15291      0.0379135   0.284288    0.0312134   -0.156888     0.351291    0.105924    -0.392192     0.413942    0.521734     0.4153      -1.17907      0.145302    -0.0586008    0.201326   -0.0683591    0.465831     0.752882   -0.473569 
 -0.41678    -0.399506      0.0539485   0.134589    -0.128146     0.102272     -0.167699    0.212639    -0.180615    0.25899    -0.333789     0.261717     0.221812   -0.25043     -0.00892259   0.370052    0.257131     0.19069      0.412909     0.0852041   -0.201363     0.152438    0.482606     0.72454      1.00057     0.26153  
 -0.451288   -0.2859       -0.171085   -0.0284535   -0.225319     0.169503     -0.0523349   0.0520683    0.139142   -0.627628    0.0387755   -0.101447     0.146491    0.155945    -0.113036     0.468969    0.429333     0.397139     0.359093    -0.204704    -0.268414     0.328532    0.116763    -0.228593    -0.0266188  -0.406495 
 -0.0338115  -0.742062     -0.591908    0.034397    -0.110436     0.28188      -0.303476    0.161977     0.490455    0.416466   -0.28106      0.2225      -0.121815    0.303473     0.153064    -0.0852153  -0.124035     0.419441     0.152811    -0.0256574    0.135813     0.256531    0.00842036   0.154434     0.148743   -0.0961892
 -0.46646     0.0345048    -0.140148   -0.447913     0.136482     0.294002      0.414924   -0.178072    -0.447445   -0.240212   -0.104253    -0.436889    -0.358809   -0.362228    -0.613567     0.160152   -0.0542501    0.559688     0.335199     0.43635      0.372184     0.0011751   0.0624153    0.191725     0.0390111   0.108645 
  0.0929954  -0.321414     -0.0775946  -0.2627      -0.00174003   0.105097      0.558992   -0.278143     0.0311836   0.100791    0.113249     0.750703     0.236239   -0.56383     -0.364916    -0.284873   -0.0199918    0.307044     0.699828     0.313545     0.77971      0.587819    0.110362    -0.165689    -0.745218   -0.152504 
 -0.468403   -0.0705398    -0.163173   -0.139362    -0.161684    -0.271355     -0.0273425   0.187012    -0.0454441  -0.325863    0.373566    -0.0114182   -0.224892    0.235683    -0.0930042    0.134804   -0.0115612   -0.078297     0.0982805   -0.585686     0.00084255   0.0307877  -0.851458     0.0304091   -0.431603   -0.394597 
 -0.425467   -0.0431272    -0.6441     -0.342241    -0.525356     0.288106     -0.111862   -0.194494     0.314866    0.103252    0.250363    -0.00310173   0.30331     0.308118     0.702854     0.628047   -0.452995    -0.00258673   0.257471     0.485312     0.118562     0.0120234  -0.332634     0.200879    -0.299119    0.261498 
 -0.0175342  -0.423084      0.0227022  -0.331703    -0.0579178    0.235506     -0.0815269   0.0408747   -0.325797   -0.109312   -0.235303     0.11875      0.0538645  -0.424523    -0.394281    -0.11642     0.473786     0.242495     0.309607     0.25187     -0.117353     0.184905    0.354244    -0.0537985    0.0928699  -0.301471 
 -0.172819    0.399799      0.0475645   0.267987    -0.236813    -0.0806297    -0.192989    0.195481    -0.138112    0.0516114   0.0404916   -0.59199     -0.161361    0.193705     0.171162     0.288588    0.108741    -0.440763    -0.385527    -0.216685    -0.611789    -0.391766   -0.0253561    0.0875275    0.440951    0.0154534
  0.220875    0.620156      0.739806    0.0480584   -0.477843    -0.855942      0.705119   -0.165319    -0.313794   -0.265791   -0.0322984    0.16878     -0.821038   -0.105907    -0.106209    -0.370962    0.113807    -0.0773302    0.486177    -0.0216453   -0.374594     0.123347   -0.406713    -0.0726421   -0.691246    0.13191  
  0.621272    0.42857       0.517979   -0.34541      0.266647    -0.189275      0.16893    -0.357716    -0.09123    -0.389946    0.34915     -0.460874    -0.483514    0.0239774    0.510128     0.419623   -0.100974    -0.505448    -0.264774    -0.141866    -0.145151    -0.175526   -0.346581    -0.370446    -0.374917    0.0590786
 -0.243589    0.512287      0.270969   -0.502378    -0.00664037  -0.015572     -0.362734   -0.628056     0.361947    0.397042    0.417706    -0.116825     0.494181    0.103485    -0.246596     0.347391    0.0724152    0.193067     0.329208     0.274691     0.228501     0.615483   -0.325478    -0.487712    -0.20513     0.302605 
  0.136973    0.457792      0.880334   -0.392261     0.168213    -0.282695      0.691411   -0.166724    -0.347566   -0.357554    0.169233    -0.197138     0.865157   -0.358219    -0.497427    -0.137432    0.65978     -0.270829    -0.319531     0.355795     0.165502    -0.209172    0.484744    -0.418909    -0.173559    0.115955 
  0.318755   -0.000725616  -0.190124    0.8984       0.752475    -0.00911092    0.590279   -0.64134      0.968649    0.351766   -0.0215153    0.131041    -0.0163988   0.381322    -0.111986    -0.219557    0.155884     0.486662    -0.169678    -0.323897     0.261437     0.228752    0.300513    -0.108067    -0.0113979   0.330237 
 -0.17361    -0.773533      0.629722    0.36254      0.868395    -0.271054     -0.0845716  -0.429275    -0.302702   -0.141922   -0.532603     0.427807     0.192239    0.0260127    0.39969     -0.3428     -0.147347     0.519936     0.206186    -0.246351     0.703618     0.182817   -0.425307    -0.0971029    0.125272    0.757275 
  1.14936     0.0179955    -0.226807   -0.420365    -0.338228     0.013918      0.162146   -0.145165    -0.263823    0.490862   -0.0730365    0.430686     0.327316   -0.315448    -0.0588874    0.0485277  -0.490735    -0.324885     0.0626969    0.00687684   0.578423    -0.0597072   0.0123023   -0.00662611   0.0883684   0.720625 
  0.894602   -0.224552      0.320379    0.23725      0.278041    -0.168237     -0.0844479   0.101649     0.340196    0.47195    -0.0334984    0.144602     0.0363192   0.50705      0.472324    -0.172233   -0.0840769   -0.0834858   -0.364924     0.191569     0.234573    -0.292821    0.350804    -0.279116     0.0633605   0.226402 
 -0.178083   -0.0553835    -0.136238    0.669905     0.531748     0.125492     -0.592425    0.0749655    0.268862   -0.0855529   0.12536     -0.123362     0.4505      0.225774     0.184794    -0.747048   -0.692548     0.0737011   -0.181488     0.336458     0.288722    -0.207173    0.48571     -0.203089     0.0251807  -0.164206 
 -0.575389    0.0355628     0.0567941   0.772984     0.331586    -0.228381     -0.15812     0.295859     0.0651193   0.760767    0.104422    -0.458735     0.315076   -0.283618    -0.456476    -0.529277    0.249117     0.225771     0.3736       0.234034     0.105404     0.617043    0.243118     0.685097     0.179908    0.0704874
 -0.0139958  -0.474481     -0.031834    0.425367     0.187652     0.10363       0.0395369   0.427835    -0.677045   -0.327472   -0.445825     0.053885    -0.321528    0.0540926   -0.246883    -0.689532   -0.0879879    0.0184557   -0.18781     -0.24495     -0.253773    -0.678517    0.333089     0.397992     0.277885   -0.354688 
  0.673846    0.0170556    -0.259704    0.307561     0.58581      0.178499      0.21258     -0.129237     0.532686    -0.110508    -0.600586     0.195756    -0.590868   -0.00713218  -0.726674    -0.723206    -0.777036    -0.0827696   -0.196573    -0.569785     0.122156    -0.0995672   0.319477    -0.156997    -0.443267   -0.456942
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.418114
[ Info: iteration 2, average log likelihood -1.418103
[ Info: iteration 3, average log likelihood -1.418092
[ Info: iteration 4, average log likelihood -1.418082
[ Info: iteration 5, average log likelihood -1.418072
[ Info: iteration 6, average log likelihood -1.418062
[ Info: iteration 7, average log likelihood -1.418052
[ Info: iteration 8, average log likelihood -1.418043
[ Info: iteration 9, average log likelihood -1.418033
[ Info: iteration 10, average log likelihood -1.418024
┌ Info: EM with 100000 data points 10 iterations avll -1.418024
└ 59.0 data points per parameter
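The "59.0 data points per parameter" figure above can be reproduced by counting the free parameters of a diagonal-covariance GMM: per-component means and variances plus the n−1 free mixture weights. A small sketch of that arithmetic (the exact counting convention is an assumption inferred from the reported ratios, which it matches):

```julia
n, d, N = 32, 26, 100_000
nparams = n * (d + d) + (n - 1)   # 32 mean vectors + 32 variance vectors + 31 free weights = 1695
round(N / nparams; digits = 1)    # ≈ 59.0, matching the log line above
```

The same convention reproduces the full-covariance figure later in the log: 2 × (2 means + 3 covariance terms) + 1 free weight = 11 parameters, and 900 / 11 ≈ 81.8.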
kind full, method kmeans
[ Info: Initializing GMM, 32 Gaussians diag covariance 26 dimensions using 100000 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       9.239531e+05
      1       7.140519e+05      -2.099012e+05 |       32
      2       7.024738e+05      -1.157818e+04 |       32
      3       6.978570e+05      -4.616719e+03 |       32
      4       6.952014e+05      -2.655639e+03 |       32
      5       6.934409e+05      -1.760478e+03 |       32
      6       6.921476e+05      -1.293367e+03 |       32
      7       6.911120e+05      -1.035567e+03 |       32
      8       6.902545e+05      -8.575032e+02 |       32
      9       6.895304e+05      -7.240769e+02 |       32
     10       6.889266e+05      -6.038487e+02 |       32
     11       6.884274e+05      -4.992028e+02 |       32
     12       6.880098e+05      -4.175156e+02 |       32
     13       6.876529e+05      -3.569716e+02 |       32
     14       6.873567e+05      -2.961349e+02 |       32
     15       6.870989e+05      -2.578882e+02 |       32
     16       6.868712e+05      -2.276737e+02 |       32
     17       6.866426e+05      -2.286092e+02 |       32
     18       6.864316e+05      -2.109415e+02 |       32
     19       6.862282e+05      -2.034651e+02 |       32
     20       6.860304e+05      -1.977910e+02 |       32
     21       6.858705e+05      -1.598606e+02 |       32
     22       6.857233e+05      -1.471649e+02 |       32
     23       6.855898e+05      -1.335458e+02 |       32
     24       6.854744e+05      -1.154185e+02 |       32
     25       6.853709e+05      -1.034884e+02 |       32
     26       6.852694e+05      -1.015059e+02 |       32
     27       6.851757e+05      -9.364132e+01 |       32
     28       6.850870e+05      -8.871261e+01 |       32
     29       6.849998e+05      -8.720706e+01 |       32
     30       6.849134e+05      -8.640256e+01 |       32
     31       6.848395e+05      -7.389791e+01 |       32
     32       6.847737e+05      -6.580777e+01 |       32
     33       6.847125e+05      -6.120471e+01 |       32
     34       6.846538e+05      -5.870168e+01 |       32
     35       6.846004e+05      -5.340094e+01 |       32
     36       6.845425e+05      -5.794227e+01 |       32
     37       6.844864e+05      -5.605828e+01 |       32
     38       6.844320e+05      -5.438416e+01 |       32
     39       6.843784e+05      -5.361520e+01 |       32
     40       6.843339e+05      -4.451139e+01 |       32
     41       6.842910e+05      -4.285459e+01 |       32
     42       6.842530e+05      -3.801141e+01 |       32
     43       6.842219e+05      -3.116286e+01 |       32
     44       6.841917e+05      -3.018329e+01 |       32
     45       6.841591e+05      -3.261570e+01 |       32
     46       6.841236e+05      -3.546322e+01 |       32
     47       6.840880e+05      -3.563064e+01 |       32
     48       6.840589e+05      -2.904433e+01 |       32
     49       6.840290e+05      -2.994957e+01 |       32
     50       6.839997e+05      -2.927499e+01 |       32
K-means terminated without convergence after 50 iterations (objv = 683999.7083942282)
┌ Info: K-means with 32000 data points using 50 iterations
└ 37.0 data points per parameter
[ Info: Running 50 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.429737
[ Info: iteration 2, average log likelihood -1.424787
[ Info: iteration 3, average log likelihood -1.423512
[ Info: iteration 4, average log likelihood -1.422598
[ Info: iteration 5, average log likelihood -1.421598
[ Info: iteration 6, average log likelihood -1.420581
[ Info: iteration 7, average log likelihood -1.419793
[ Info: iteration 8, average log likelihood -1.419308
[ Info: iteration 9, average log likelihood -1.419024
[ Info: iteration 10, average log likelihood -1.418841
[ Info: iteration 11, average log likelihood -1.418710
[ Info: iteration 12, average log likelihood -1.418608
[ Info: iteration 13, average log likelihood -1.418525
[ Info: iteration 14, average log likelihood -1.418457
[ Info: iteration 15, average log likelihood -1.418400
[ Info: iteration 16, average log likelihood -1.418351
[ Info: iteration 17, average log likelihood -1.418308
[ Info: iteration 18, average log likelihood -1.418271
[ Info: iteration 19, average log likelihood -1.418238
[ Info: iteration 20, average log likelihood -1.418208
[ Info: iteration 21, average log likelihood -1.418182
[ Info: iteration 22, average log likelihood -1.418158
[ Info: iteration 23, average log likelihood -1.418136
[ Info: iteration 24, average log likelihood -1.418116
[ Info: iteration 25, average log likelihood -1.418097
[ Info: iteration 26, average log likelihood -1.418080
[ Info: iteration 27, average log likelihood -1.418064
[ Info: iteration 28, average log likelihood -1.418049
[ Info: iteration 29, average log likelihood -1.418035
[ Info: iteration 30, average log likelihood -1.418021
[ Info: iteration 31, average log likelihood -1.418009
[ Info: iteration 32, average log likelihood -1.417997
[ Info: iteration 33, average log likelihood -1.417986
[ Info: iteration 34, average log likelihood -1.417975
[ Info: iteration 35, average log likelihood -1.417964
[ Info: iteration 36, average log likelihood -1.417954
[ Info: iteration 37, average log likelihood -1.417945
[ Info: iteration 38, average log likelihood -1.417935
[ Info: iteration 39, average log likelihood -1.417926
[ Info: iteration 40, average log likelihood -1.417916
[ Info: iteration 41, average log likelihood -1.417907
[ Info: iteration 42, average log likelihood -1.417898
[ Info: iteration 43, average log likelihood -1.417889
[ Info: iteration 44, average log likelihood -1.417880
[ Info: iteration 45, average log likelihood -1.417871
[ Info: iteration 46, average log likelihood -1.417862
[ Info: iteration 47, average log likelihood -1.417852
[ Info: iteration 48, average log likelihood -1.417843
[ Info: iteration 49, average log likelihood -1.417834
[ Info: iteration 50, average log likelihood -1.417825
┌ Info: EM with 100000 data points 50 iterations avll -1.417825
└ 59.0 data points per parameter
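The pattern in this part of the log (a k-means initialization pass followed by EM on a 32-Gaussian diagonal-covariance model) matches the main GaussianMixtures.jl training entry point. A minimal sketch, assuming the `GMM` constructor keywords and the `em!`/`avll` functions as documented in the v0.3.0 README:

```julia
using GaussianMixtures

x = randn(100_000, 26)   # rows are observations, columns are feature dimensions
# k-means initialization, then EM training on the initialized model:
g = GMM(32, x; method = :kmeans, kind = :diag, nInit = 50, nIter = 50)
em!(g, x; nIter = 10)    # run a few additional EM iterations in place
avll(g, x)               # average log likelihood per data point, as reported above
```

Each EM iteration logs the average log likelihood, which should increase monotonically, as it does throughout this run.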
32×26 Array{Float64,2}:
 -0.835275   -0.396054   -0.201187   -0.145023   -0.0479659    0.0222965  -0.139711   -0.104897    -0.463252    -0.198047    0.414291   -0.00727464   0.349115    0.159093    0.138311      0.163146   -0.290124    -0.218185   -0.0956738   -0.37388      0.379038    -0.566987    -1.17565     0.366948    -0.289357     0.302347 
  0.547957   -0.0567826   0.458555    0.056632    0.17306     -0.46855     0.157277   -0.298667     0.425143     0.427939    0.350547   -0.0561229    0.512984    0.571593    0.217819     -0.109869    0.127462    -0.23847    -0.80594      0.281004     0.408271    -0.294615    -0.119137   -0.300058    -0.407464     0.193164 
  0.578195   -1.05094    -0.384444   -0.0332381  -0.222642     0.418344   -0.0346212   0.369151    -0.104702     0.240667   -0.628827    0.334354    -0.356368    0.615788   -0.000587403  -0.301378   -0.152985    -0.0292554  -0.235604    -0.280096    -0.0150104   -0.697693     0.130783    0.314873     0.50611      0.0216384
  0.113941   -0.311241    0.557248    0.335295    0.333835     0.115169    0.0910933  -0.0771403   -0.101607    -0.0372755   0.249684    0.35932      0.20314    -0.424805    0.388491     -0.287382    0.608426     0.912627   -0.379845    -0.0193282    0.180286     0.345234    -0.157755   -0.0358963    0.496697     0.0604787
 -0.0379875  -0.106578   -0.0627435  -0.0339651   0.00318106   0.017706    0.0723195  -0.00869402  -0.00107715   0.0282508  -0.0316444   0.039582     0.0268991  -0.0233319  -0.0862544    -0.0459873   0.00124525   0.0862492   0.0661855    0.00989598   0.0331848   -0.0425286    0.0609926   0.1691       0.041221    -0.0236036
  0.193513   -0.0981009   0.194437    0.872568    1.06699      0.0595033   0.218694   -0.488968     0.349692     0.0035378  -0.347314    0.21221     -0.342766    0.197264   -0.0442708    -0.462218   -0.374218     0.368924   -0.00241609  -0.537679     0.337288     0.136879     0.0747505  -0.0409767    0.00183043   0.341336 
 -0.479113    0.0296114  -0.65758    -0.460448   -0.0934362    0.425423    0.449024   -0.124019     0.108195    -0.37177     0.0542736  -0.499877    -0.670641    0.152691   -0.143155      0.191727   -0.628765     0.133375    0.379095     0.285041    -0.101182    -0.240573     0.22037     0.735338    -0.303186    -0.259477 
  0.609338    0.0653729  -0.391586   -0.802148   -0.493088    -0.0443484   0.353876   -0.160777     0.0115579    0.717512    0.0791484  -0.0231194    0.611624   -0.304495   -0.0531875     0.420514   -0.278734    -0.539892    0.559774     0.141445     0.475794     0.00645212   0.0341207   0.0998298    0.217442     0.754695 
 -0.249991   -0.274877    0.271478    0.0138977   0.414444     0.17231     0.296504   -0.0335577   -0.513887    -0.435173   -0.033079   -0.1765       0.0488901  -0.259548   -0.433656     -0.569309    0.228105    -0.111297   -0.135789     0.149526    -0.477955    -0.954914     0.808792   -0.00783571  -0.214765    -0.268798 
  0.374989    0.195627   -0.449145    0.539476    0.00104357   0.0054638   0.401416   -0.305797     0.925574     0.559872   -0.0857728  -0.0298127   -0.0610654   0.521377   -0.156507      0.154193    0.813964     0.294489    0.206386     0.134605    -0.553659     0.267915     0.884688   -0.642523    -0.0897908   -0.191866 
 -0.260418   -0.734044   -0.0299131   0.394673    0.186927     0.140045   -0.525021    0.325529    -0.27336     -0.0252707  -0.574376    0.352376    -0.110566   -0.249321   -0.405514     -0.130903    0.243127     0.431309    0.630176    -0.169212    -0.0663832    0.58502      0.735059    0.536085     0.393168    -0.162707 
 -0.303859   -0.249169   -0.71724     0.771756   -0.0289958   -0.36421     0.103006    0.405898     0.155567     0.139444   -0.329435    0.417854     0.830046   -0.0716986  -0.536933     -0.533658   -0.24764      0.177179    0.111315     0.0838219    0.872104    -0.266379     0.183211    0.0335733    0.00153887  -0.251848 
  0.30044     0.0453139  -0.141907    0.586815    0.454097     0.158509   -0.494167    0.0786461    0.268442     0.0717433  -0.130447    0.117194     0.0881869   0.185151    0.255662     -0.559318   -0.824857    -0.0709745   0.0367484    0.149527     0.240227    -0.0768928    0.428595   -0.248946    -0.0309901    0.0182929
 -0.168982    0.39954    -0.29356    -0.405925   -0.710885    -0.033945   -0.602535    0.228509    -0.352116     0.143239    0.235044   -0.319861     0.28714     0.0120291  -0.19898       0.318985   -0.226667    -0.633897    0.0148225    0.327944    -0.338919    -0.177107    -0.0106099   0.19107     -0.312071    -0.185696 
 -0.0574378  -0.259077    0.266001   -0.667035    0.222048     0.160583    0.130471   -0.266057    -0.0238334    0.0428368  -0.0884133   0.077924     0.139575   -0.136066   -0.331954      0.056102    0.193332     0.640851    0.249266     0.539968     0.361814     0.170305     0.337917   -0.166181     0.0690766    0.521229 
 -0.227963    0.18399     0.353896    0.59372     0.0212404   -0.0508991  -0.264057    0.121997    -0.302384     0.293313   -0.326555   -0.401955     0.0421846  -0.219367    0.272963     -0.0259209   0.255961     0.0372472  -0.363661     0.109382    -0.563505    -0.415415     0.353683    0.372773     1.09476      0.228427 
 -0.191114   -0.740328   -0.0989261  -0.634678   -0.41058      0.3003     -0.0199864   0.0811099    0.0940503    0.269393    0.0355121  -0.190464     0.468019    0.0860983  -0.318248      0.508857    0.432143     0.371493   -0.764255     0.139176    -0.0717683    0.35623      0.0973421   0.49781      0.672268    -0.426205 
  0.0296342   0.320582   -0.16174     0.624317   -0.595956    -0.327485    0.208892    0.403071     0.234885     0.0216029   0.0886445   0.276882    -0.205305    0.0794546   0.425999     -0.224303   -0.0331971   -0.57758    -0.140798    -0.9589      -0.455797    -0.415357    -0.201882    0.379667    -0.257322    -0.565653 
  0.595578    0.0835502   0.148912    0.216503    0.231317    -0.383642    0.150031    0.00879989  -0.254403     0.292653   -0.358099    0.0518879   -0.219139   -0.438708   -0.861752     -0.613551    0.376691     0.188839   -0.417655    -0.460668     0.291266    -0.125168    -0.260221    0.0711509    0.00746725  -0.52772  
 -0.329813   -0.0986625  -0.530675   -0.011201   -0.353952     0.0968194  -0.416187   -0.155193     0.742525     0.248779   -0.013516    0.286796     0.0366437   0.46825     0.64636       0.556988   -0.335185     0.125295    0.391478     0.212536    -0.0779294    0.44727     -0.363302    0.167571     0.0845629    0.326682 
  0.810533    0.764112    0.550674   -0.118739    0.26381      0.279506    0.236221   -0.100152    -0.44001     -0.36405     0.0310362  -0.236409    -0.0550192  -0.217307    0.0450609     0.178731   -0.474143    -0.607249   -0.40178      0.0793486    0.465848    -0.134607     0.124681    0.0605795    0.20956      0.420397 
 -0.57294     0.0223739  -0.11235     0.354043   -0.0123666   -0.222269   -0.507239    0.449231     0.14927     -0.18503     0.309612   -1.16711     -0.0449486   0.597337   -0.107533      0.422291    0.373985     0.058537   -0.0742247   -0.232157    -0.21217      0.0818848   -0.309778   -0.320495     0.204362    -0.375318 
 -0.395795    0.0102254   0.715452   -0.704622   -0.1296      -0.0418258   0.322781   -0.0759441   -0.496997    -0.246783    0.78483    -0.223875    -0.680056   -0.135692   -0.196665      0.371613    0.478078     0.0578943  -0.0532327   -0.159898    -0.34635      0.281783    -0.287746    0.0745894    0.107196    -0.137515 
 -0.599537    0.25388     0.198314    0.431816    0.501593    -0.11624     0.0923016  -0.0726777   -0.109386     0.460397    0.391639   -0.396209     0.45679    -0.560312   -0.362861     -0.446813    0.412607     0.0787568   0.531629     0.446606     0.0771744    0.903004     0.237788    0.335625    -0.0977867    0.141752 
  0.422029   -0.424554   -0.1201     -0.322095    0.0547622    0.0749615   0.407743   -0.429156    -0.0676894    0.0496185   0.112911    0.839466     0.171704   -0.300051   -0.118242     -0.170559   -0.297117     0.241252    0.460859     0.00217709   0.862923     0.543566    -0.135616   -0.278879    -0.721089     0.0573498
 -0.0181202  -0.015391   -0.0433725  -0.455823   -0.355855     0.0380893  -0.0996998   0.123972    -0.333935    -0.0980678  -0.012069    0.0233823   -0.0721533  -0.274648   -0.107773      0.118005    0.176615    -0.160563    0.0185813    0.143822    -0.227193    -0.0184808   -0.0157701   0.00820062   0.0237001   -0.189538 
  0.277888    0.349974    0.281042   -0.160515   -0.150695    -0.494847    0.407765   -0.424914    -0.0780521    0.25531     0.0449432  -0.0645835   -0.263589    0.0856795   0.297997     -0.0696243  -0.139034    -0.568374   -0.12173     -0.101392    -0.401767    -0.200263    -0.351038   -0.0583192   -0.220931     0.426575 
 -0.442411   -0.0949013   0.0134368   0.791726    0.460045    -0.106991    0.129031   -0.0953696    0.213062    -0.258117   -0.248546    0.0274251   -0.0556954   0.236829    0.268813     -0.067356   -0.124702     0.190597    0.0150071   -0.456247    -0.107744     0.0922344   -0.568512   -0.147146    -0.0501601   -0.18776  
 -0.571059    0.282766   -0.0479701  -0.316736   -0.51928      0.225849   -0.0149706  -0.523949    -0.0865061   -0.484462    0.0199882   0.0709306    0.509111   -0.179538   -0.497731      0.340336    0.344971     0.31016     0.563279    -0.307757    -0.241691     0.477531    -0.280147   -0.476285    -0.368378    -0.350284 
  0.0603309   0.916563    0.405278    0.0775303  -0.16978     -0.757067    0.335563    0.142214    -0.187465    -0.373872    0.0543332  -0.19617     -0.555335   -0.259293   -0.0562508    -0.201297   -0.0124009    0.0238893   0.578919     0.23754      0.0608045    0.0523201   -0.733632   -0.57989     -0.842221    -0.0871921
  0.467482   -0.108993   -0.357449   -0.569962   -0.232468     0.112044   -0.240636   -0.0388786   -0.0376128   -0.156552   -0.29353     0.0273573   -0.856163   -0.160386    0.0484554     0.1274     -0.188596    -0.678341    0.0818432   -0.137962    -0.27686      0.189554     0.176012   -0.263227    -0.473307    -0.383638 
   0.154324    0.0518264   0.268467    0.123943   -0.197086    -0.227373   -0.529617    0.101213     0.0152783    0.226024    0.227196   -0.237176     0.22592     0.052449    0.0555874     0.254214    0.252956    -0.227248   -0.327535    -0.219994     0.00191115   0.245768    -0.254258   -0.334079     0.105201    -0.0810407
[ Info: Running 10 iterations EM on diag cov GMM with 32 Gaussians in 26 dimensions
[ Info: iteration 1, average log likelihood -1.417816
[ Info: iteration 2, average log likelihood -1.417807
[ Info: iteration 3, average log likelihood -1.417798
[ Info: iteration 4, average log likelihood -1.417789
[ Info: iteration 5, average log likelihood -1.417780
[ Info: iteration 6, average log likelihood -1.417771
[ Info: iteration 7, average log likelihood -1.417763
[ Info: iteration 8, average log likelihood -1.417754
[ Info: iteration 9, average log likelihood -1.417746
[ Info: iteration 10, average log likelihood -1.417738
┌ Info: EM with 100000 data points 10 iterations avll -1.417738
└ 59.0 data points per parameter
[ Info: Initializing GMM, 2 Gaussians diag covariance 2 dimensions using 900 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.678561e+05
      1       2.230230e+04      -1.455538e+05 |        2
      2       7.823675e+03      -1.447862e+04 |        0
      3       7.823675e+03       0.000000e+00 |        0
K-means converged with 3 iterations (objv = 7823.67549422947)
┌ Info: K-means with 900 data points using 3 iterations
└ 150.0 data points per parameter
[ Info: Running 10 iterations EM on full cov GMM with 2 Gaussians in 2 dimensions
[ Info: iteration 1, average log likelihood -2.043155
[ Info: iteration 2, average log likelihood -2.043154
[ Info: iteration 3, average log likelihood -2.043154
[ Info: iteration 4, average log likelihood -2.043154
[ Info: iteration 5, average log likelihood -2.043154
[ Info: iteration 6, average log likelihood -2.043154
[ Info: iteration 7, average log likelihood -2.043154
[ Info: iteration 8, average log likelihood -2.043154
[ Info: iteration 9, average log likelihood -2.043154
[ Info: iteration 10, average log likelihood -2.043154
┌ Info: EM with 900 data points 10 iterations avll -2.043154
└ 81.8 data points per parameter
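This final test trains a 2-component full-covariance GMM on 2-dimensional data (900 points). A comparable call, again a sketch assuming the documented v0.3.0 API rather than the package's actual test code:

```julia
using GaussianMixtures

y = randn(900, 2)                                   # 900 observations in 2 dimensions
g2 = GMM(2, y; method = :kmeans, kind = :full, nIter = 10)
g2.μ                                                # 2×2 matrix of component means
avll(g2, y)                                         # average log likelihood per data point
```

With `kind = :full`, each component carries a full 2×2 covariance matrix, hence the lower data-points-per-parameter ratio (81.8) reported above.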
   Testing GaussianMixtures tests passed