pyFTS.probabilistic package

Module contents

Probability Distribution objects

Submodules

pyFTS.probabilistic.ProbabilityDistribution module

class pyFTS.probabilistic.ProbabilityDistribution.ProbabilityDistribution(type='KDE', **kwargs)

Bases: object

Represents a discrete or continuous probability distribution. If type is histogram the PDF is discrete; if type is KDE the PDF is continuous.
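
A minimal construction sketch. The keyword arguments below (data, uod, num_bins) are assumptions about what **kwargs accepts, inferred from common pyFTS usage, and may differ in your version:

    import numpy as np
    from pyFTS.probabilistic.ProbabilityDistribution import ProbabilityDistribution

    np.random.seed(1)
    sample = np.random.normal(0, 1, 1000).tolist()

    # Discrete PDF backed by a histogram ('data', 'uod' and 'num_bins' are assumed kwargs)
    hist = ProbabilityDistribution(type='histogram', data=sample,
                                   uod=[min(sample), max(sample)], num_bins=50)

    # Continuous PDF backed by Kernel Density Estimation
    cont = ProbabilityDistribution(type='KDE', data=sample,
                                   uod=[min(sample), max(sample)])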

append(values)

Increment the frequency count for the values

Parameters:values – A list of values whose frequency counts will be incremented
append_interval(intervals)

Increment the frequency count for all values inside an interval

Parameters:intervals – A list of intervals whose contained values will have their frequency counts incremented
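
A short counting sketch for both methods, reusing the histogram construction pattern above; the constructor kwargs and the (lo, hi) tuple representation of an interval are assumptions:

    dist = ProbabilityDistribution(type='histogram', data=[1, 2, 2, 3], uod=[0, 10])
    dist.append([2, 3, 3])           # increment the counts of these exact values
    dist.append_interval([(1, 4)])   # increment the counts of all values inside [1, 4]
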
averageloglikelihood(data)

Average log likelihood of the probability distribution with respect to data

Parameters:data – a list of observed values
Returns:The average log likelihood of the distribution with respect to the data
build_cdf_qtl()

Build the cumulative distribution function (CDF) and the quantile function for this distribution

crossentropy(q)

Cross entropy between this probability distribution and a given one, H(P,Q) = - ∑ P(x) log( Q(x) )

Parameters:q – a probabilistic.ProbabilityDistribution object
Returns:Cross entropy between this probability distribution and the given distribution
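
To make the formula concrete, here is the same computation in plain Python over two small hand-built discrete distributions, independent of the class:

    import math

    P = {0: 0.5, 1: 0.5}   # this distribution
    Q = {0: 0.9, 1: 0.1}   # the given distribution

    # H(P,Q) = - sum_x P(x) * log(Q(x))
    h_pq = -sum(P[x] * math.log(Q[x]) for x in P)
    print(h_pq)  # ~1.204 nats, larger than H(P) = log(2) ~ 0.693 because Q != P
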
cumulative(values)

Return the cumulative probability for the input values, such that F(x) = P(X <= x)

Parameters:values – A list of input values
Returns:The cumulative probabilities for the input values
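
A usage sketch, reusing the hist object built in the earlier example; depending on the implementation it may be necessary to call build_cdf_qtl() first:

    # F(x) = P(X <= x) for each requested point
    cdf = hist.cumulative([-1.0, 0.0, 1.0])
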
density(values)

Return the probability densities for the input values

Parameters:values – A list of values at which to evaluate the densities
Returns:List of probability densities for the input values
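
A usage sketch for point densities, continuing the same example; for the KDE type the result is a smooth estimate:

    pdf = cont.density([-1.0, 0.0, 1.0])
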
differential_offset(value)

Auxiliary function for probability distributions of differentiated data

Parameters:value
Returns:
empiricalloglikelihood()

Empirical Log Likelihood of the probability distribution, L(P) = ∑ log( P(x) )

Returns:The empirical log likelihood of the distribution
entropy()

Return the entropy of the probability distribution, H(P) = E[ -ln P(X) ] = - ∑ P(x) log ( P(x) )

Returns:The entropy of the probability distribution
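
As a worked instance of the formula, in plain Python: a fair coin attains the two-outcome maximum H(P) = log 2.

    import math

    P = {'heads': 0.5, 'tails': 0.5}
    h = -sum(p * math.log(p) for p in P.values())
    print(h)  # 0.6931... = log(2)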

expected_value()

Return the expected value of the distribution, as E[X] = ∑ x * P(x)

Returns:The expected value of the distribution
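
A worked instance in plain Python: for a fair six-sided die, E[X] = ∑ x * P(x) = (1 + 2 + ... + 6) / 6 = 3.5.

    P = {x: 1 / 6 for x in range(1, 7)}    # fair six-sided die
    ev = sum(x * p for x, p in P.items())
    print(ev)  # 3.5
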
kullbackleiblerdivergence(q)

Kullback-Leibler divergence between this probability distribution and a given one, DKL(P || Q) = ∑ P(x) log( P(x) / Q(x) )

Parameters:q – a probabilistic.ProbabilityDistribution object
Returns:Kullback-Leibler divergence
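
The same divergence in plain Python, using the distributions from the cross entropy example:

    import math

    P = {0: 0.5, 1: 0.5}
    Q = {0: 0.9, 1: 0.1}

    # DKL(P||Q) = sum_x P(x) * log(P(x) / Q(x)); always >= 0, zero iff P == Q
    dkl = sum(P[x] * math.log(P[x] / Q[x]) for x in P)
    print(dkl)  # ~0.511 nats
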
plot(axis=None, color='black', tam=[10, 6], title=None)

Plot the probability distribution

pseudologlikelihood(data)

Pseudo log likelihood of the probability distribution with respect to data

Parameters:data – a list of observed values
Returns:The pseudo log likelihood of the distribution with respect to the data
quantile(values)

Return the Universe of Discourse values corresponding to the given quantiles, such that Q(tau) = min( {x | F(x) >= tau} )

Parameters:values – input values
Returns:The list of the quantile values for the input values
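
A usage sketch, continuing the earlier example; calling build_cdf_qtl() beforehand is an assumption based on that method's name:

    hist.build_cdf_qtl()                     # build the CDF/quantile tables
    median, p90 = hist.quantile([0.5, 0.9])
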
set(value, density)

Assign the probability density ‘density’ to the value ‘value’, such that P(value) = density

Parameters:
  • value – A value in the universe of discourse from the distribution
  • density – The probability density to assign to the value
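
A sketch of building a distribution by hand, assuming an empty histogram-type distribution can be created over a universe of discourse (the uod kwarg is an assumption):

    die = ProbabilityDistribution(type='histogram', uod=[1, 6])
    for face in range(1, 7):
        die.set(face, 1 / 6)   # P(face) = 1/6, a fair die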

pyFTS.probabilistic.kde module

Kernel Density Estimation

class pyFTS.probabilistic.kde.KernelSmoothing(**kwargs)

Bases: object

Kernel Density Estimation

kernel_function(u)

Apply the kernel function to the point u

Parameters:u – the input point
Returns:The kernel value at u
probability(x, **kwargs)

Estimate the probability density of the point x over the given data

Parameters:
  • x – the point whose probability density will be estimated
  • data – the data sample from which the density is estimated
Returns:The estimated probability density at x
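
A usage sketch. The h (bandwidth) and kernel keyword arguments are assumptions about **kwargs; the classic KDE estimate this technique produces is f(x) = (1/nh) ∑ K((x - xi)/h).

    import numpy as np
    from pyFTS.probabilistic.kde import KernelSmoothing

    np.random.seed(1)
    data = np.random.normal(0, 1, 500).tolist()

    # 'h' and 'kernel' are assumed constructor kwargs
    ks = KernelSmoothing(h=0.5, kernel='gaussian')
    p = ks.probability(0.0, data=data)  # estimated density at x = 0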