Is it possible, using this stack, to visualize not only log data but any other information as well, for example, sales statistics (or anything else)? Right now this is done on the backend through Highcharts, and I'd rather not grow a zoo of technologies. It seems to me that you can feed statistics to Elasticsearch, not just logs? Has anyone tried this? Please share your experience.
  • Absolutely. How convenient such a bundle is depends, of course, on the specific tasks, but I think the range of uses is quite wide. Moreover, statistics are essentially logs. – Chinese5 Jul 15 '16 at 18:14
  • Chinese5: have you already implemented something like this? Do the logs have to be in a specific format, or can logs in any format be parsed? – Angry20 Jul 20 '16 at 16:56

1 Answer

Good day,
Yes, you can. If you look at the Beats line of shippers, collecting metric data is implemented through it.
Plus there is a rich arsenal of plugins for Logstash (LS): netflow, sflow, graphite.
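As a rough illustration of the Logstash side (the port, hosts, and index pattern here are assumptions for the sketch, not values from this thread; check the plugin documentation for your versions), a minimal pipeline that receives netflow and writes it to Elasticsearch might look like:

```conf
input {
  udp {
    port  => 2055          # common netflow export port (assumption)
    codec => netflow       # requires the netflow codec plugin
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "netflow-%{+YYYY.MM.dd}"   # one index per day
  }
}
```

The daily index pattern makes retention cheap: old data is dropped by deleting whole indices rather than individual documents.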

We use it to visualize netflow statistics, and that is enough for us.

But for visualizing data from different systems, for internal and external customers, we use the combination Graphite => Grafana <= ES, and Zabbix => Grafana.

Much of the tool choice depends on your requirements: what you want to see, and what mathematical or statistical processing the data must go through.
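To make the "statistics are essentially logs" idea concrete: business events such as sales can be shipped to Elasticsearch as ordinary documents via the bulk API. Below is a minimal sketch that builds the NDJSON payload for the `_bulk` endpoint; the field names (`category`, `order_total`, `active_baskets`) and the index prefix are made-up illustrations, and actually POSTing the payload to your cluster is left out.

```python
import json
from datetime import datetime, timezone

def bulk_lines(events, index_prefix="sales-stats"):
    """Build Elasticsearch _bulk NDJSON lines for arbitrary business events.

    The index name is time-based (one index per day), so old data can be
    dropped cheaply later by deleting whole indices.
    """
    lines = []
    for event in events:
        ts = event["@timestamp"]  # expected to be a datetime
        index = f"{index_prefix}-{ts:%Y.%m.%d}"
        lines.append(json.dumps({"index": {"_index": index}}))
        doc = dict(event)
        doc["@timestamp"] = ts.isoformat()
        lines.append(json.dumps(doc))
    # the bulk API requires a trailing newline
    return "\n".join(lines) + "\n"

events = [{
    "@timestamp": datetime(2016, 8, 16, 12, 0, tzinfo=timezone.utc),
    "category": "electronics",     # hypothetical business fields
    "order_total": 129.90,
    "active_baskets": 42,
}]
payload = bulk_lines(events)
```

Once documents like these are indexed, Kibana or Grafana can aggregate and chart them just like log data.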
  • Thanks! – Angry20 Jul 23 '16 at 14:37
  • Uptight62: we want to get graphical information not only about load but also about business statistics, from site traffic to sales: who buys what, when, and in which purchasing categories; in general, everything that can be useful after analyzing the information.
    How does the Beats line work: ELK + Beats, or is Beats enough out of the box? There are also a lot of questions about implementing this bundle. Is it worth moving statistics collection to a separate server (and how is that done properly)? How do we monitor in real time? How do we get details of particular functionality, for example, how many active baskets there are, and so on? We want to know as much as possible, in real time; unfortunately, the details have not yet arrived from the marketers, but in general terms that is the picture. Thank you.
    – Angry20 Aug 15 '16 at 15:06
  • Angry20: good day. Beats collect data and send it either to LS for additional processing or straight to ES.
    Right here (https://www there is a picture explaining the concept.
    That is, ES acts as the data warehouse. Visualization can be done either with Kibana or with Grafana, for example.
    Advising something specific without additional data is difficult.
    In general terms, you need memory and fast disks on the processing server (and possibly a separate server for the stack, depending on the amount of data), plus a Redis or RabbitMQ broker can help prevent data loss during receiving and initial processing.
    From my (not very large) experience, I can say that good visualization depends directly on correct data preparation.
    For ES, for example, that means the indices.
    In your case, it is a question of how the data looks in its "raw" form and how it will have to be processed before being handed to the visualization system. Such projects take time (although perhaps I "...just don't know how to cook them").

    The question is: why don't you want to use tools built for visualizing metric data, such as Graphite + Grafana, for example?
    – Uptight62 Aug 16 '16 at 15:10
  • Uptight62: Thank you for the comprehensive answer. Why? We plan to store metric data as well. The desire to draw and display statistics not only from metric data but also from marketing data, say, comes down to not wanting extra tools: if these statistics can be drawn on the ELK stack plus something else (Beats, etc.), then why not? That's why. And yes, metrics will naturally be stored there too. – Angry20 Aug 16 '16 at 15:27
  • Uptight62: and one more question: does it make sense to put all of this on a separate server? What is the best way to organize the collection, processing, and drawing of all the statistics? Thanks. – Angry20 Aug 16 '16 at 15:30
  • Angry20: Pavel, it depends on the amount of data, the retention period, and how many people will work with it.
    The main peculiarity of Apache Lucene (the engine behind ELK) is its excess consumption of disk space compared with traditional metric storage.
    I can advise you to monitor the event load and the indexing speed, and if those indicators grow, move to a separate server, either in a cluster or just standalone.
    How many documents per second arrive at your server?
    – Uptight62 Aug 16 '16 at 22:21
  • Uptight62: I'm not yet sure how many documents will go into the stack (ELK + Beats); there is no clear understanding of how the logging process will work, we are only forming the requirements for introducing the stack. We will be working with this data, and technically it is possible to allocate a separate server for monitoring, which we probably will do. – Angry20 Aug 17 '16 at 10:33
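As a footnote to the broker idea raised in the comments: splitting the pipeline into a shipper and an indexer with Redis in between is one common buffering pattern. This is a hedged sketch, not a recommendation from the thread; the host, list name, and topology are assumptions, and RabbitMQ plugins follow the same shape.

```conf
# Shipper side: Beats/Logstash push raw events into a Redis list.
output {
  redis {
    host      => "localhost"
    data_type => "list"
    key       => "logstash-buffer"   # hypothetical queue name
  }
}

# Indexer side: a second Logstash instance drains the list and
# writes to Elasticsearch, so ingest spikes don't drop events.
input {
  redis {
    host      => "localhost"
    data_type => "list"
    key       => "logstash-buffer"
  }
}
```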