pyFTS.probabilistic package

Module contents

Probability Distribution objects

Submodules

pyFTS.probabilistic.ProbabilityDistribution module

class pyFTS.probabilistic.ProbabilityDistribution.ProbabilityDistribution(type='KDE', **kwargs)[source]

Bases: object

Represents a discrete or continuous probability distribution. If type is histogram, the PDF is discrete; if type is KDE, the PDF is continuous.
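
A minimal construction sketch; the uod keyword argument (the universe of discourse) is an assumption suggested by the uod attribute below, so check the constructor source for the exact supported kwargs:

  >>> from pyFTS.probabilistic.ProbabilityDistribution import ProbabilityDistribution
  >>> # 'uod' is an assumed kwarg; the values are hypothetical, for illustration only
  >>> dist = ProbabilityDistribution(type='histogram', uod=[0, 10])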

append(values)[source]

Increment the frequency count for the values

Parameters:values – A list of values whose frequency counts will be incremented
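
Continuing the construction sketch above, observations are accumulated one list at a time:

  >>> dist.append([1, 2, 2, 3, 3, 3])   # each occurrence increments its bin count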
append_interval(intervals)[source]

Increment the frequency count for all values inside an interval

Parameters:intervals – A list of intervals to increment the frequency
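
A sketch, assuming each interval is given as a (lower, upper) tuple; confirm the expected format in the source:

  >>> dist.append_interval([(2.0, 4.0)])   # increments every bin falling inside [2.0, 4.0]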
averageloglikelihood(data)[source]

Average log likelihood of the probability distribution with respect to data

Parameters:data – A list of observed values
Returns:The average log likelihood of the distribution with respect to the data
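
Continuing the sketch above:

  >>> dist.averageloglikelihood([1, 2, 3])   # mean of log P(x) over the given data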
bins = None

Number of bins on a discrete PDF

build_cdf_qtl()[source]

Build the auxiliary cumulative distribution (CDF) and quantile structures
crossentropy(q)[source]

Cross entropy between the actual probability distribution and a given one, H(P,Q) = - ∑ P(x) log ( Q(x) )

Parameters:q – a probabilistic.ProbabilityDistribution object
Returns:Cross entropy between this probability distribution and the given distribution
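
For example, comparing the sketch distribution against a second one built over the same universe of discourse:

  >>> other = ProbabilityDistribution(type='histogram', uod=[0, 10])
  >>> other.append([1, 1, 2, 3])
  >>> dist.crossentropy(other)   # H(P, Q)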
cumulative(values)[source]

Return the cumulative probabilities for the input values, such that F(x) = P(X <= x)

Parameters:values – A list of input values
Returns:The cumulative probabilities for the input values
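
For instance, evaluating F at a few points of the sketch distribution:

  >>> dist.cumulative([2.5, 5.0])   # returns [F(2.5), F(5.0)]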
density(values)[source]

Return the probability densities for the input values

Parameters:values – List of values to return the densities
Returns:List of probability densities for the input values
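
Point densities can be queried the same way:

  >>> dist.density([2.0, 3.0])   # returns [P(2.0), P(3.0)]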
differential_offset(value)[source]

Auxiliary function for probability distributions of differentiated data

Parameters:value
Returns:
empiricalloglikelihood()[source]

Empirical Log Likelihood of the probability distribution, L(P) = ∑ log( P(x) )

Returns:The empirical log likelihood value
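
Continuing the sketch:

  >>> dist.empiricalloglikelihood()   # sum of log P(x) over the known points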
entropy()[source]

Return the entropy of the probability distribution, H(P) = E[ -ln P(X) ] = - ∑ P(x) log ( P(x) )

Returns:The entropy of the probability distribution
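
For the sketch distribution above:

  >>> dist.entropy()   # H(P) = - sum of P(x) * log(P(x))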

expected_value()[source]

Return the expected value of the distribution, as E[X] = ∑ x * P(x)

Returns:The expected value of the distribution
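
For example:

  >>> dist.expected_value()   # E[X] = sum of x * P(x) over the bins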
kullbackleiblerdivergence(q)[source]

Kullback-Leibler divergence between the actual probability distribution and a given one, DKL(P || Q) = ∑ P(x) log( P(x) / Q(x) )

Parameters:q – a probabilistic.ProbabilityDistribution object
Returns:Kullback-Leibler divergence
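
Reusing the two sketch distributions from the crossentropy example:

  >>> dist.kullbackleiblerdivergence(other)   # DKL(P || Q)

Note that the divergence is asymmetric: DKL(P || Q) differs from DKL(Q || P) in general.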
labels = None

Bin labels on a discrete PDF

plot(axis=None, color='black', tam=[10, 6], title=None)[source]

Plot the probability distribution
pseudologlikelihood(data)[source]

Pseudo log likelihood of the probability distribution with respect to data

Parameters:data – A list of observed values
Returns:The pseudo log likelihood value
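
Continuing the sketch above:

  >>> dist.pseudologlikelihood([1, 2, 3])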
quantile(values)[source]

Return the Universe of Discourse values corresponding to the given quantile values, such that Q(tau) = min( {x | F(x) >= tau })

Parameters:values – input values
Returns:The list of the quantile values for the input values
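
For example, the median and a symmetric 90% coverage interval of the sketch distribution (depending on the implementation, build_cdf_qtl() may need to be called beforehand):

  >>> dist.build_cdf_qtl()   # may be required to populate the quantile table
  >>> dist.quantile([0.05, 0.5, 0.95])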
set(value, density)[source]

Assign a probability density ‘density’ to a given value ‘value’, such that P(value) = density

Parameters:
  • value – A value in the universe of discourse from the distribution
  • density – The probability density to assign to the value
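
A sketch, directly overriding the density at one point:

  >>> dist.set(4.0, 0.125)   # forces P(4.0) = 0.125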
type = None

If type is histogram, the PDF is discrete; if type is KDE, the PDF is continuous.

uod = None

Universe of discourse

pyFTS.probabilistic.kde module

Kernel Density Estimation

class pyFTS.probabilistic.kde.KernelSmoothing(h, kernel='epanechnikov')[source]

Bases: object

Kernel Density Estimation

h = None

Bandwidth (smoothing) parameter of the kernel

kernel = None

Kernel function

kernel_function(u)[source]

Apply the kernel function to the scaled input

Parameters:u – The scaled distance (x - xi) / h between a query point and a sample
Returns:The kernel weight K(u)
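
The standard Epanechnikov kernel is K(u) = 0.75 * (1 - u^2) for |u| <= 1 and 0 otherwise; whether this class uses that exact normalization should be confirmed in the source. A quick sketch:

  >>> from pyFTS.probabilistic.kde import KernelSmoothing
  >>> ks = KernelSmoothing(h=0.5, kernel='epanechnikov')
  >>> ks.kernel_function(0.0)   # kernel weight at the center, where it is maximal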
probability(x, data)[source]

Estimate the probability density at the point x, given the sample data

Parameters:
  • x – The point at which to estimate the density
  • data – The sample of observations
Returns:The estimated probability density at x
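
Putting it together, a minimal sketch of a density estimate at a query point, assuming data is a plain list of observations:

  >>> from pyFTS.probabilistic.kde import KernelSmoothing
  >>> data = [1.2, 1.9, 2.1, 2.4, 3.0, 3.3]
  >>> ks = KernelSmoothing(h=0.5, kernel='epanechnikov')
  >>> ks.probability(2.0, data)   # estimated density at x = 2.0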