Marvel not showing nodes stats

Marvel not showing nodes stats

prometheus
Hi,

We have a development/test cluster of 8 nodes running ES 0.90.10. We also have one monitoring node, separate from the dev cluster, for Marvel.

We installed the latest Marvel (1.1.1) on all nodes, pointing the agent at the monitoring node's IP in the ES config, and told the monitoring node not to collect its own data.
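The relevant bits of elasticsearch.yml look roughly like this (the host below is a placeholder, not our real address):

    # on the 8 dev/test nodes: ship Marvel data to the monitoring node
    marvel.agent.exporter.es.hosts: ["MONITORING_NODE_IP:9200"]

    # on the monitoring node only: do not collect stats about itself
    marvel.agent.enabled: false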

We restarted all nodes.

We waited for hours...

All we have is:

CLUSTER SUMMARY

  • Name: indictoYeni
  • Status: green
  • Nodes: 8
  • Indices: 15
  • Shards: 75
  • Data: 1.47 TB
  • CPU: 1311%
  • Memory: 111.68 GB / 235.20 GB
  • Up time: 19.9 h
  • Version: 0.90.10
But all other information is unavailable. The dashboard panels below say:

Oops! FacetPhaseExecutionException[Facet [timestamp]: failed to find mapping for node.ip_port.raw]


and


Oops! FacetPhaseExecutionException[Facet [timestamp]: failed to find mapping for index.raw]


I couldn't find a fix.

Is there any solution to this?


Thanks

PS






Re: Marvel not showing nodes stats

prometheus
Error log on the monitoring node:


******************

[2014-05-28 17:28:20,009][DEBUG][action.search.type       ] [MarvelSE02] [.marvel-2014.05.28][2], node[44R9_5vfRVmEJDgApyBRrQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@6c4e0d81] lastShard [true]
org.elasticsearch.search.SearchParseException: [.marvel-2014.05.28][2]: query[ConstantScore(BooleanFilter(+cache(@timestamp:[1401283762615 TO 1401287300008]) +cache(_type:node_stats) +cache(@timestamp:[1401286680000 TO 1401287340000])))],from[-1],size[10]: Parse Failure [Failed to parse source [{"size":10,"query":{"filtered":{"query":{"match_all":{}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"from":1401283762615,"to":"now"}}},{"term":{"_type":"node_stats"}},{"range":{"@timestamp":{"from":"now-10m/m","to":"now/m"}}}]}}}},"facets":{"timestamp":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"@timestamp","order":"term","size":2000}},"master_periods":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"@timestamp","order":"term","size":2000},"facet_filter":{"term":{"node.master":"true"}}},"os.cpu.usage":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"os.cpu.usage","order":"term","size":2000}},"os.load_average.1m":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"os.load_average.1m","order":"term","size":2000}},"jvm.mem.heap_used_percent":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"jvm.mem.heap_used_percent","order":"term","size":2000}},"fs.total.available_in_bytes":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"fs.total.available_in_bytes","order":"term","size":2000}},"fs.total.disk_io_op":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"fs.total.disk_io_op","order":"term","size":2000}}}}]]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:581)
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:484)
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:469)
        at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:462)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:234)
        at org.elasticsearch.search.action.SearchServiceTransportAction.sendExecuteQuery(SearchServiceTransportAction.java:202)
        at org.elasticsearch.action.search.type.TransportSearchQueryThenFetchAction$AsyncAction.sendExecuteFirstPhase(TransportSearchQueryThenFetchAction.java:80)
        at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.performFirstPhase(TransportSearchTypeAction.java:216)
        at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.performFirstPhase(TransportSearchTypeAction.java:203)
        at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$2.run(TransportSearchTypeAction.java:186)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
Caused by: org.elasticsearch.search.facet.FacetPhaseExecutionException: Facet [timestamp]: failed to find mapping for node.ip_port.raw
        at org.elasticsearch.search.facet.termsstats.TermsStatsFacetParser.parse(TermsStatsFacetParser.java:119)
        at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:94)
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:569)
        ... 12 more
******************
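For reference, the missing mapping can be checked directly against the daily Marvel index on the monitoring node with something like this (the host is a placeholder):

    curl -s 'http://MONITORING_NODE_IP:9200/.marvel-2014.05.28/_mapping?pretty'
    curl -s 'http://MONITORING_NODE_IP:9200/.marvel-2014.05.28/node_stats/_count?pretty'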


Re: Marvel not showing nodes stats

ElasticSearch Users mailing list
Are there any Marvel errors on any of the data nodes that Marvel is collecting data from? Usually your error means Marvel could not ship the data over to the monitoring cluster.
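A quick way to check on each production node, assuming the default log location (adjust the path to your install):

    grep -i "marvel" /var/log/elasticsearch/*.log | grep -iE "error|exception|failed"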

Re: Marvel not showing nodes stats

jeffbyrnes
In reply to this post by prometheus
I'm experiencing a similar issue to this. We have two clusters:
  • 2-node monitoring cluster (1 master/data node and 1 data-only node)
  • 5-node production cluster (2 data, 3 masters)
The output below is from the non-master data node of the Marvel monitoring cluster. There are no errors being reported by any of the production nodes.

[2014-08-26 21:10:51,503][DEBUG][action.search.type       ] [stage-search-marvel-1c] [.marvel-2014.08.26][2], node[iGRH8Gc2QO698RMlWy8rgQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@355e93ff]
org.elasticsearch.transport.RemoteTransportException: [stage-search-marvel-1b][inet[/10.99.111.122:9300]][search/phase/query]
Caused by: org.elasticsearch.search.SearchParseException: [.marvel-2014.08.26][2]: query[ConstantScore(BooleanFilter(+*:* +cache(_type:index_stats) +cache(@timestamp:[1409086800000 TO 1409087460000])))],from[-1],size[10]: Parse Failure [Failed to parse source [{"size":10,"query":{"filtered":{"query":{"match_all":{}},"filter":{"bool":{"must":[{"match_all":{}},{"term":{"_type":"index_stats"}},{"range":{"@timestamp":{"from":"now-10m/m","to":"now/m"}}}]}}}},"facets":{"timestamp":{"terms_stats":{"key_field":"index.raw","value_field":"@timestamp","order":"term","size":2000}},"primaries.docs.count":{"terms_stats":{"key_field":"index.raw","value_field":"primaries.docs.count","order":"term","size":2000}},"primaries.indexing.index_total":{"terms_stats":{"key_field":"index.raw","value_field":"primaries.indexing.index_total","order":"term","size":2000}},"total.search.query_total":{"terms_stats":{"key_field":"index.raw","value_field":"total.search.query_total","order":"term","size":2000}},"total.merges.total_size_in_bytes":{"terms_stats":{"key_field":"index.raw","value_field":"total.merges.total_size_in_bytes","order":"term","size":2000}},"total.fielddata.memory_size_in_bytes":{"terms_stats":{"key_field":"index.raw","value_field":"total.fielddata.memory_size_in_bytes","order":"term","size":2000}}}}]]
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:664)
    at org.elasticsearch.search.SearchService.createContext(SearchService.java:515)
    at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:487)
    at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:256)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:688)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:677)
    at org.elasticsearch.transport.netty.MessageChannelHandler$RequestHandler.run(MessageChannelHandler.java:275)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.search.facet.FacetPhaseExecutionException: Facet [timestamp]: failed to find mapping for index.raw
    at org.elasticsearch.search.facet.termsstats.TermsStatsFacetParser.parse(TermsStatsFacetParser.java:126)
    at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:93)
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:648)
    ... 9 more
[2014-08-26 21:10:51,503][DEBUG][action.search.type       ] [stage-search-marvel-1c] [.marvel-2014.08.26][2], node[iGRH8Gc2QO698RMlWy8rgQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@32f235e9]
org.elasticsearch.transport.RemoteTransportException: [stage-search-marvel-1b][inet[/10.99.111.122:9300]][search/phase/query]
Caused by: org.elasticsearch.search.SearchParseException: [.marvel-2014.08.26][2]: query[ConstantScore(BooleanFilter(+*:* +cache(_type:node_stats) +cache(@timestamp:[1409086800000 TO 1409087460000])))],from[-1],size[10]: Parse Failure [Failed to parse source [{"size":10,"query":{"filtered":{"query":{"match_all":{}},"filter":{"bool":{"must":[{"match_all":{}},{"term":{"_type":"node_stats"}},{"range":{"@timestamp":{"from":"now-10m/m","to":"now/m"}}}]}}}},"facets":{"timestamp":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"@timestamp","order":"term","size":2000}},"master_nodes":{"terms":{"field":"node.ip_port.raw","size":2000},"facet_filter":{"term":{"node.master":"true"}}},"os.cpu.usage":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"os.cpu.usage","order":"term","size":2000}},"os.load_average.1m":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"os.load_average.1m","order":"term","size":2000}},"jvm.mem.heap_used_percent":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"jvm.mem.heap_used_percent","order":"term","size":2000}},"fs.total.available_in_bytes":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"fs.total.available_in_bytes","order":"term","size":2000}},"fs.total.disk_io_op":{"terms_stats":{"key_field":"node.ip_port.raw","value_field":"fs.total.disk_io_op","order":"term","size":2000}}}}]]
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:664)
    at org.elasticsearch.search.SearchService.createContext(SearchService.java:515)
    at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:487)
    at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:256)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:688)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryTransportHandler.messageReceived(SearchServiceTransportAction.java:677)
    at org.elasticsearch.transport.netty.MessageChannelHandler$RequestHandler.run(MessageChannelHandler.java:275)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.search.facet.FacetPhaseExecutionException: Facet [timestamp]: failed to find mapping for node.ip_port.raw
    at org.elasticsearch.search.facet.termsstats.TermsStatsFacetParser.parse(TermsStatsFacetParser.java:126)
    at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:93)
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:648)
    ... 9 more
[2014-08-26 21:10:51,504][DEBUG][action.search.type       ] [stage-search-marvel-1c] All shards failed for phase: [query]


--
-Jeff
@thejeffbyrnes
Re: Marvel not showing nodes stats

Boaz Leskes
Jeff, 

Two things can be at play:

1) Either you have no data (easily checked with a direct call to the monitoring cluster, from Sense or curl).
2) Or something is wrong with the mapping - typically caused by a missing index template. Can you check whether your monitoring cluster has a response for GET _template/marvel?
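For example, roughly (MONITORING_HOST is a placeholder for your monitoring node):

    # 1) is there any Marvel data at all?
    curl -s 'http://MONITORING_HOST:9200/.marvel-*/_count?pretty'

    # 2) is the Marvel index template registered?
    curl -s 'http://MONITORING_HOST:9200/_template/marvel?pretty'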

Cheers,
Boaz

Re: Marvel not showing nodes stats

jeffbyrnes
Boaz,

Thanks for the response; I checked on it late yesterday, and it seems to have righted itself somehow. I brought up another cluster this morning, and it immediately received all the node & index data properly, so I must have gotten something stuck on my first try.

Thanks again!

-- 
Jeff Byrnes
@berkleebassist
Operations Engineer
704.516.4628

Re: Marvel not showing nodes stats

bsarkar
In reply to this post by Boaz Leskes
Hi Boaz,

We are also experiencing similar problems on the Marvel dashboard, and the problem has not rectified itself even after 36 hours. Regarding your suggestion below:

2) Or something is wrong with the mapping - typically caused by a missing index template. Can you check whether your monitoring cluster has a response for GET _template/marvel?

I executed GET _template/marvel on our Marvel cluster and got an empty response. What could this mean? How can I fix it?